Sample records for linear model-based expert

  1. Sparse distributed memory: understanding the speed and robustness of expert memory

    PubMed Central

    Brogliato, Marcelo S.; Chada, Daniel M.; Linhares, Alexandre

    2014-01-01

    How can experts, sometimes in exacting detail, almost immediately and very precisely recall memory items from a vast repertoire? The problem we address concerns models of theoretical neuroscience that could explain the speed and robustness of an expert's recollection. The approach is based on Sparse Distributed Memory, which has been shown to be plausible, both neuroscientifically and psychologically, in a number of ways. A crucial characteristic concerns the limits of human recollection, the “tip-of-the-tongue” memory event, which is found at a non-linearity in the model. We expand the theoretical framework, deriving an optimization formula to solve this non-linearity. Numerical results demonstrate how a higher frequency of rehearsal, through work or study, immediately increases the robustness and speed associated with expert memory. PMID:24808842
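    The recall mechanism the abstract builds on can be illustrated with a toy Kanerva-style sparse distributed memory. The word length, number of hard locations, activation radius, and the autoassociative write/read below are illustrative choices for a minimal sketch, not the paper's actual model or parameters.

```python
import random

random.seed(0)
N, M, RADIUS = 64, 500, 26   # word length, hard locations, activation radius (toy sizes)

hard_addresses = [[random.randint(0, 1) for _ in range(N)] for _ in range(M)]
counters = [[0] * N for _ in range(M)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def write(address, word):
    # Add the word (as +1/-1 votes) to the counters of every location within RADIUS.
    for loc, ctr in zip(hard_addresses, counters):
        if hamming(address, loc) <= RADIUS:
            for i, bit in enumerate(word):
                ctr[i] += 1 if bit else -1

def read(address):
    # Pool the counters of all activated locations and threshold at zero.
    sums = [0] * N
    for loc, ctr in zip(hard_addresses, counters):
        if hamming(address, loc) <= RADIUS:
            for i in range(N):
                sums[i] += ctr[i]
    return [1 if s > 0 else 0 for s in sums]

pattern = [random.randint(0, 1) for _ in range(N)]
write(pattern, pattern)              # autoassociative storage
noisy = pattern[:]                   # flip a few bits to simulate a degraded cue
for i in random.sample(range(N), 5):
    noisy[i] ^= 1
recovered = read(noisy)
```

    A single pooled read from a degraded cue typically lands closer to the stored pattern than the cue itself, which is the single-shot speed and robustness the abstract refers to.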

  2. Developing CORE model-based worksheet with recitation task to facilitate students’ mathematical communication skills in linear algebra course

    NASA Astrophysics Data System (ADS)

    Risnawati; Khairinnisa, S.; Darwis, A. H.

    2018-01-01

    The purpose of this study was to develop a CORE model-based worksheet with recitation tasks that was valid and practical and could facilitate students’ mathematical communication skills in a Linear Algebra course. The study was conducted in the mathematics education department of a public university in Riau, Indonesia. Participants were media and subject-matter experts, who served as validators, as well as students from the mathematics education department. The objects of the study were the students’ worksheet and the students’ mathematical communication skills. The results showed that: (1) based on the experts’ validation, the developed worksheet was valid and could be applied in Linear Algebra courses; (2) based on the group trials, the practicality percentage was 92.14% in the small group and 90.19% in the large group, so the worksheet was very practical and could attract students to learn; and (3) based on the post-test, the average percentage of ideals was 87.83%. In addition, the results showed that the worksheet was able to facilitate students’ mathematical communication skills in the linear algebra course.

  3. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using a think-aloud technique and video recording, we captured their on-screen modeling activities and thinking processes. We also interviewed them on the day following their modeling sessions to further probe the rationale for their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found that the experts' modeling processes followed the linear sequence built into the modeling program, with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered and represented by specialized technical terms. Based on these findings, we made suggestions for improving model-based science teaching and learning using Model-It.

  4. Heuristic Model Of The Composite Quality Index Of Environmental Assessment

    NASA Astrophysics Data System (ADS)

    Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.

    2017-01-01

    The goal of the paper is to present a heuristic model of the composite environmental quality index based on the integrated application of elements of utility theory, multidimensional scaling, expert evaluation and decision-making. The composite index is synthesized in linear-quadratic form, which provides greater adequacy in representing the assessment preferences of experts and decision-makers.

  5. Proceedings: USACERL/ASCE First Joint Conference on Expert Systems, 29-30 June 1988

    DTIC Science & Technology

    1989-01-01

    KNOWLEDGE-BASED GRAPHIC DIALOGUES ... MAD, AN EXPERT ... methodology of inductive shallow modeling was developed. Inductive systems may become powerful shallow modeling tools applicable to a large class of ... analysis was conducted using a statistical package, Trajectories. Four different types of relationships were analyzed: linear, logarithmic, power, and ...

  6. Log-Linear Modeling of Agreement among Expert Exposure Assessors

    PubMed Central

    Hunt, Phillip R.; Friesen, Melissa C.; Sama, Susan; Ryan, Louise; Milton, Donald

    2015-01-01

    Background: Evaluation of expert assessment of exposure depends, in the absence of a validation measurement, upon measures of agreement among the expert raters. Agreement is typically measured using Cohen’s Kappa statistic; however, there are some well-known limitations to this approach. We demonstrate an alternate method that uses log-linear models designed to model agreement. These models contain parameters that distinguish between exact agreement (diagonals of the agreement matrix) and non-exact associations (off-diagonals). In addition, they can incorporate covariates to examine whether agreement differs across strata. Methods: We applied these models to evaluate agreement among expert ratings of exposure to sensitizers (none, likely, high) in a study of occupational asthma. Results: Traditional analyses using weighted kappa suggested potential differences in agreement by blue/white collar jobs and office/non-office jobs, but not case/control status. However, the evaluation of the covariates and their interaction terms in log-linear models found no differences in agreement with these covariates and provided evidence that the differences observed using kappa were the result of marginal differences in the distribution of ratings rather than differences in agreement. Differences in agreement were predicted across the exposure scale, with the likely moderately exposed category more difficult for the experts to differentiate from the highly exposed category than from the unexposed category. Conclusions: The log-linear models provided valuable information about patterns of agreement and the structure of the data that were not revealed in analyses using kappa. The models’ lack of dependence on marginal distributions and the ease of evaluating covariates allow reliable detection of observational bias in exposure data. PMID:25748517
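    The marginal-dependence limitation of kappa that motivates the log-linear approach is easy to reproduce. The sketch below (illustrative counts, not the study's data) shows two 2x2 rater tables with identical 85% raw agreement that nonetheless yield different kappas purely because their marginal rating distributions differ; a log-linear agreement model avoids this by fitting a separate diagonal (exact-agreement) term on top of the row and column margins.

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rows: rater 1, cols: rater 2)."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / n          # observed agreement
    rows = [sum(row) for row in table]
    cols = [sum(col) for col in zip(*table)]
    pe = sum(r * c for r, c in zip(rows, cols)) / n ** 2          # chance agreement
    return (po - pe) / (1 - pe)

# Same observed agreement (85 of 100 on the diagonal), different marginals:
k1 = cohens_kappa([[20, 5], [10, 65]])
k2 = cohens_kappa([[45, 5], [10, 40]])
```

    Here k1 = 0.625 and k2 = 0.7 despite identical exact agreement, which is precisely the artifact the authors attribute to marginal differences rather than to agreement itself.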

  7. Testing a Web-Based, Trained-Peer Model to Build Capacity for Evidence-Based Practices in Community Mental Health Systems.

    PubMed

    German, Ramaris E; Adler, Abby; Frankel, Sarah A; Stirman, Shannon Wiltsey; Pinedo, Paola; Evans, Arthur C; Beck, Aaron T; Creed, Torrey A

    2018-03-01

    Use of expert-led workshops plus consultation has been established as an effective strategy for training community mental health (CMH) clinicians in evidence-based practices (EBPs). Because of high rates of staff turnover, this strategy inadequately addresses the need to maintain capacity to deliver EBPs. This study examined knowledge, competency, and retention outcomes of a two-phase model developed to build capacity for an EBP in CMH programs. In the first phase, an initial training cohort in each CMH program participated in in-person workshops followed by expert-led consultation (in-person, expert-led [IPEL] phase) (N=214 clinicians). After this cohort completed training, new staff members participated in Web-based training (in place of in-person workshops), followed by peer-led consultation with the initial cohort (Web-based, trained-peer [WBTP] phase) (N=148). Tests of noninferiority assessed whether WBTP was not inferior to IPEL at increasing clinician cognitive-behavioral therapy (CBT) competency, as measured by the Cognitive Therapy Rating Scale. WBTP was not inferior to IPEL at developing clinician competency. Hierarchical linear models showed no significant differences in CBT knowledge acquisition between the two phases. Survival analyses indicated that WBTP trainees were less likely than IPEL trainees to complete training. In terms of time required from experts, WBTP required 8% of the resources of IPEL. After an initial investment to build in-house CBT expertise, CMH programs were able to use a WBTP model to broaden their own capacity for high-fidelity CBT. IPEL followed by WBTP offers an effective alternative to build EBP capacity in CMH programs, rather than reliance on external experts.

  8. An automatic and accurate method of full heart segmentation from CT image based on linear gradient model

    NASA Astrophysics Data System (ADS)

    Yang, Zili

    2017-07-01

    Heart segmentation is an important auxiliary method in the diagnosis of many heart diseases, such as coronary heart disease and atrial fibrillation, and in the planning of tumor radiotherapy. Most existing methods for full heart segmentation treat the heart as a whole and cannot accurately extract the bottom of the heart. In this paper, we propose a new method based on a linear gradient model to segment the whole heart from CT images automatically and accurately. Twelve cases were used to test this method; accurate segmentation results were achieved and confirmed by clinical experts. The results can provide reliable clinical support.

  9. The Journey "Is" the Destination: Reconsidering the Expert Sports Coach

    ERIC Educational Resources Information Center

    Turner, David; Nelson, Lee; Potrac, Paul

    2012-01-01

    This article seeks to critically consider the traditional linear staged model of expertise development commonly employed in the sports coaching literature, which has been principally based upon the accumulation of threshold amounts of hours of experience. Here, we draw upon recent developments in the broader expertise literature, which is starting…

  10. Can linear regression modeling help clinicians in the interpretation of genotypic resistance data? An application to derive a lopinavir-score.

    PubMed

    Cozzi-Lepri, Alessandro; Prosperi, Mattia C F; Kjær, Jesper; Dunn, David; Paredes, Roger; Sabin, Caroline A; Lundgren, Jens D; Phillips, Andrew N; Pillay, Deenan

    2011-01-01

    The question of whether a score for a specific antiretroviral (e.g. lopinavir/r in this analysis) that improves on the prediction of viral load response given by existing expert-based interpretation systems (IS) could be derived by analyzing the correlation between genotypic data and virological response with statistical methods remains largely unanswered. We used the data of the patients from the UK Collaborative HIV Cohort (UK CHIC) Study for whom genotypic data were stored in the UK HIV Drug Resistance Database (UK HDRD) to construct a training/validation dataset of treatment change episodes (TCE). We used the average square error (ASE) on a 10-fold cross-validation and on a test dataset (the EuroSIDA TCE database) to compare the performance of a newly derived lopinavir/r score with that of the 3 most widely used expert-based interpretation rules (ANRS, HIVDB and Rega). Our analysis identified mutations V82A, I54V, K20I and I62V, which were associated with reduced viral response, and mutations I15V and V91S, which conferred lopinavir/r hypersensitivity. All models performed equally well (ASE on test ranging between 1.1 and 1.3, p = 0.34). We fully explored the potential of linear regression to construct a simple predictive model for lopinavir/r-based TCE. Although the performance of our proposed score was similar to that of already existing IS, previously unrecognized lopinavir/r-associated mutations were identified. The analysis illustrates an approach to validation of expert-based IS that could be used in the future for other antiretrovirals and in other settings outside HIV research.
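    The core regression step, deriving mutation weights from genotype-response pairs, can be sketched with ordinary least squares on binary mutation indicators. The data below are synthetic and noise-free, and the two indicator columns (V82A, I54V) merely borrow the abstract's mutation names for illustration; the real study used a much larger covariate set and cross-validation.

```python
# Each record: (V82A present, I54V present, log10 viral-load reduction) -- synthetic toy data
data = [
    (0, 0, 1.0), (1, 0, 0.2), (0, 1, 0.5), (1, 1, -0.3), (0, 0, 1.0), (1, 0, 0.2),
]

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Ordinary least squares via the normal equations X'X w = X'y
X = [[1, v82a, i54v] for v82a, i54v, _ in data]   # intercept + mutation indicators
y = [r for _, _, r in data]
XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
Xty = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(3)]
intercept, w_v82a, w_i54v = solve(XtX, Xty)
```

    The fitted negative weights play the role of the mutation penalties in such a score: summing the weights of the mutations present in a genotype predicts the expected response.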

  11. An Expert System for the Evaluation of Cost Models

    DTIC Science & Technology

    1990-09-01

    contrast to the condition of equal error variance, called homoscedasticity. (Reference: Applied Linear Regression Models by John Neter, page 423) ... normal. (Reference: Applied Linear Regression Models by John Neter, page 125) ... over time. Error terms correlated over time are said to be autocorrelated or serially correlated. (Reference: Applied Linear Regression Models by John Neter)

  12. Four Common Simplifications of Multi-Criteria Decision Analysis do not hold for River Rehabilitation

    PubMed Central

    2016-01-01

    River rehabilitation aims at alleviating negative effects of human impacts such as loss of biodiversity and reduction of ecosystem services. Such interventions entail difficult trade-offs between different ecological and often socio-economic objectives. Multi-Criteria Decision Analysis (MCDA) is a very suitable approach that helps assess the current ecological state and prioritize river rehabilitation measures in a standardized way, based on stakeholder or expert preferences. Applications of MCDA in river rehabilitation projects are often simplified, i.e. using a limited number of objectives and indicators, assuming linear value functions, aggregating individual indicator assessments additively, and/or assuming risk neutrality of experts. Here, we demonstrate an implementation of MCDA expert preference assessments for river rehabilitation and provide ample material for other applications. To test whether the above simplifications reflect common expert opinion, we carried out very detailed interviews with five river ecologists and a hydraulic engineer. We defined essential objectives and measurable quality indicators (attributes), elicited the experts' preferences for objectives on a standardized scale (value functions) and their risk attitude, and identified suitable aggregation methods. The experts recommended an extensive objectives hierarchy including between 54 and 93 essential objectives and between 37 and 61 essential attributes. For 81% of these, they defined non-linear value functions, and in 76% they recommended multiplicative aggregation. The experts were risk averse or risk prone (but never risk neutral), depending on the current ecological state of the river and the experts' personal importance of objectives. We conclude that the four commonly applied simplifications clearly do not reflect the opinion of river rehabilitation experts. The optimal level of model complexity, however, remains highly case-study specific, depending on data and resource availability, the context, and the complexity of the decision problem. PMID:26954353
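    Two of the rejected simplifications, linear value functions and additive aggregation, are easy to contrast in code. The exponential value function and the weights below are generic MCDA ingredients chosen for illustration, not the elicited functions from the interviews; the curvature parameter c encodes risk attitude (c > 0 risk averse, c < 0 risk prone, c near 0 linear).

```python
import math

def value(x, x_min, x_max, c):
    """Exponential single-attribute value function on [0, 1]."""
    t = (x - x_min) / (x_max - x_min)
    if abs(c) < 1e-12:
        return t                                  # linear (risk-neutral) limit
    return (1 - math.exp(-c * t)) / (1 - math.exp(-c))

def additive(values, weights):
    return sum(w * v for v, w in zip(values, weights))

def multiplicative(values, weights):
    return math.prod(v ** w for v, w in zip(values, weights))

# Three attributes, one of them completely failed (value 0):
vals = [value(0.8, 0, 1, 2.0), value(0.0, 0, 1, 2.0), value(0.5, 0, 1, -1.0)]
weights = [0.5, 0.3, 0.2]
```

    Additive aggregation lets good attributes compensate for the failed one, whereas the multiplicative form drives the overall index to zero, one practical reason the interviewed experts preferred it for ecological objectives where a collapsed attribute should not be averaged away.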

  13. Adaptive neural network/expert system that learns fault diagnosis for different structures

    NASA Astrophysics Data System (ADS)

    Simon, Solomon H.

    1992-08-01

    Corporations need better real-time monitoring and control systems to improve productivity by monitoring quality and increasing production flexibility. The innovative technology to achieve this goal is evolving in the form of artificial intelligence and neural networks applied to sensor processing, fusion, and interpretation. By using these advanced AI techniques, we can leverage existing systems and add value to conventional techniques. Neural networks and knowledge-based expert systems can be combined into intelligent sensor systems which provide real-time monitoring, control, evaluation, and fault diagnosis for production systems. Neural network-based intelligent sensor systems are more reliable because they can provide continuous, non-destructive monitoring and inspection. Use of neural networks can result in sensor fusion and the ability to model highly non-linear systems. Improved models can provide a foundation for more accurate performance parameters and predictions. We discuss a research software/hardware prototype which integrates neural networks, expert systems, and sensor technologies and which can adapt across a variety of structures to perform fault diagnosis. The flexibility and adaptability of the prototype in learning two structures is presented. Potential applications are discussed.

  14. Medical image segmentation based on SLIC superpixels model

    NASA Astrophysics Data System (ADS)

    Chen, Xiang-ting; Zhang, Fan; Zhang, Ruo-ya

    2017-01-01

    Medical imaging has been widely used in clinical practice and is an important basis for medical experts to diagnose disease. However, medical images involve many unstable factors: the imaging mechanism is complex, displacement of the target can cause reconstruction defects, and the partial volume effect and equipment wear introduce error, all of which greatly increase the complexity of subsequent image processing. A segmentation algorithm based on SLIC (Simple Linear Iterative Clustering) superpixels is used in the preprocessing stage to eliminate the influence of reconstruction defects and noise by means of feature similarity. At the same time, the excellent clustering effect greatly reduces the complexity of the algorithm, which provides an effective basis for rapid diagnosis by experts.
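    SLIC itself is a localized k-means in a combined intensity-position space. The sketch below is a deliberately simplified grayscale version (real implementations work in CIELAB color and add connectivity enforcement); the image, cluster count, and compactness value are toy choices, not the paper's settings.

```python
import math

def slic(image, k, m=10.0, iters=5):
    """Simplified SLIC on a 2-D grayscale image (list of lists of intensities)."""
    h, w = len(image), len(image[0])
    s = max(1, int(math.sqrt(h * w / k)))        # grid interval between initial centers
    centers = [[y, x, image[y][x]] for y in range(s // 2, h, s) for x in range(s // 2, w, s)]
    labels = [[0] * w for _ in range(h)]
    for _ in range(iters):
        dist = [[float("inf")] * w for _ in range(h)]
        # assign each pixel to the best center within a 2s x 2s search window
        for ci, (cy, cx, cg) in enumerate(centers):
            for y in range(max(0, int(cy) - 2 * s), min(h, int(cy) + 2 * s)):
                for x in range(max(0, int(cx) - 2 * s), min(w, int(cx) + 2 * s)):
                    dc = image[y][x] - cg                    # intensity distance
                    ds = math.hypot(y - cy, x - cx)          # spatial distance
                    d = math.hypot(dc, ds / s * m)           # combined distance
                    if d < dist[y][x]:
                        dist[y][x], labels[y][x] = d, ci
        # move each center to the mean of its assigned pixels
        acc = [[0.0, 0.0, 0.0, 0] for _ in centers]
        for y in range(h):
            for x in range(w):
                a = acc[labels[y][x]]
                a[0] += y; a[1] += x; a[2] += image[y][x]; a[3] += 1
        for ci, (sy, sx, sg, n) in enumerate(acc):
            if n:
                centers[ci] = [sy / n, sx / n, sg / n]
    return labels

# Toy "tissue" image: dark left half, bright right half.
img = [[20 if x < 8 else 200 for x in range(16)] for _ in range(16)]
labels = slic(img, k=4)
```

    Because the combined distance penalizes intensity differences, superpixels snap to the dark/bright boundary rather than straddling it, which is what makes them a useful noise-suppressing preprocessing unit for later segmentation.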

  15. Consideration in selecting crops for the human-rated life support system: a Linear Programming model

    NASA Technical Reports Server (NTRS)

    Wheeler, E. F.; Kossowski, J.; Goto, E.; Langhans, R. W.; White, G.; Albright, L. D.; Wilcox, D.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    A Linear Programming model has been constructed which aids in selecting appropriate crops for CELSS (Controlled Environment Life Support System) food production. A team of Controlled Environment Agriculture (CEA) faculty, staff, graduate students and invited experts representing more than a dozen disciplines, provided a wide range of expertise in developing the model and the crop production program. The model incorporates nutritional content and controlled-environment based production yields of carefully chosen crops into a framework where a crop mix can be constructed to suit the astronauts' needs. The crew's nutritional requirements can be adequately satisfied with only a few crops (assuming vitamin mineral supplements are provided) but this will not be satisfactory from a culinary standpoint. This model is flexible enough that taste and variety driven food choices can be built into the model.

  16. Consideration in selecting crops for the human-rated life support system: a linear programming model

    NASA Astrophysics Data System (ADS)

    Wheeler, E. F.; Kossowski, J.; Goto, E.; Langhans, R. W.; White, G.; Albright, L. D.; Wilcox, D.

    A Linear Programming model has been constructed which aids in selecting appropriate crops for CELSS (Controlled Environment Life Support System) food production. A team of Controlled Environment Agriculture (CEA) faculty, staff, graduate students and invited experts representing more than a dozen disciplines, provided a wide range of expertise in developing the model and the crop production program. The model incorporates nutritional content and controlled-environment based production yields of carefully chosen crops into a framework where a crop mix can be constructed to suit the astronauts' needs. The crew's nutritional requirements can be adequately satisfied with only a few crops (assuming vitamin mineral supplements are provided) but this will not be satisfactory from a culinary standpoint. This model is flexible enough that taste and variety driven food choices can be built into the model.
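    The crop-selection formulation in the two records above, minimize growing area subject to nutritional constraints plus culinary variety constraints, can be illustrated with a brute-force stand-in for an LP solver. Every number below (crop yields, requirements, the lettuce minimum) is hypothetical, chosen only to show the structure of the optimization, not CELSS data.

```python
from itertools import product

# Per-square-meter daily yields (illustrative numbers only)
crops = {
    "wheat":   {"kcal": 250, "protein_g": 8},
    "soybean": {"kcal": 120, "protein_g": 12},
    "lettuce": {"kcal": 15,  "protein_g": 1},
}
need = {"kcal": 2800, "protein_g": 90}    # one crew member, per day
variety_min = {"lettuce": 2.0}            # culinary constraint: some fresh greens

best = None
steps = [i * 0.5 for i in range(41)]      # candidate areas: 0 to 20 m^2 in 0.5 m^2 steps
for areas in product(steps, repeat=len(crops)):
    alloc = dict(zip(crops, areas))
    if any(alloc[c] < a for c, a in variety_min.items()):
        continue                          # violates the variety constraint
    supply = {n: sum(alloc[c] * crops[c][n] for c in crops) for n in need}
    if all(supply[n] >= need[n] for n in need):
        total = sum(areas)
        if best is None or total < best[0]:
            best = (total, alloc)         # keep the smallest feasible footprint

total_area, allocation = best
```

    A real LP solver replaces the grid search and handles many more crops, nutrients, and taste constraints, but the feasible-region logic (meet every nutrient, respect variety minima, minimize area) is the same.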

  17. Cue Reliance in L2 Written Production

    ERIC Educational Resources Information Center

    Wiechmann, Daniel; Kerz, Elma

    2014-01-01

    Second language learners reach expert levels in relative cue weighting only gradually. On the basis of ensemble machine learning models fit to naturalistic written productions of German advanced learners of English and expert writers, we set out to reverse engineer differences in the weighting of multiple cues in a clause linearization problem. We…

  18. Capturing Intuition Through Interactive Inverse Methods: Examples Drawn From Mechanical Non-Linearities in Structural Geology

    NASA Astrophysics Data System (ADS)

    Moresi, L.; May, D.; Peachey, T.; Enticott, C.; Abramson, D.; Robinson, T.

    2004-12-01

    Can you teach intuition? Obviously we think that this is possible (though it's still just a hunch). People undoubtedly develop intuition for non-linear systems through painstaking repetition of complex tasks until they have sufficient feedback to begin to "see" the emergent behaviour. The better the exploration of the system can be exposed, the quicker the potential for developing an intuitive understanding. We have spent some time considering how to incorporate the intuitive knowledge of field geologists into mechanical modeling of geological processes. Our solution has been to allow an expert geologist to steer (via a GUI) a genetic algorithm inversion of a mechanical forward model towards "structures" or patterns which are plausible in nature. The expert knowledge is then captured by analysis of the individual model parameters which are constrained by the steering (and by analysis of those which are unconstrained). The same system can also be used in reverse, to expose the influence of individual parameters to the non-expert who is trying to learn just what makes a good match between model and observation. The "distance" between models preferred by experts and those preferred by an individual can be shown graphically to provide feedback. The examples we choose are from numerical models of extensional basins. We will first try to give each person some background information on the scientific problem from the poster, and then we will let them loose on the numerical modeling tools with specific tasks to achieve. This will be an experiment in progress; we will later analyse how people use the GUI and whether there is really any significant difference between so-called experts and self-styled novices.

  19. Self-optimizing Pitch Control for Large Scale Wind Turbine Based on ADRC

    NASA Astrophysics Data System (ADS)

    Xia, Anjun; Hu, Guoqing; Li, Zheng; Huang, Dongxiao; Wang, Fengxiang

    2018-01-01

    Since a wind turbine is a complex, nonlinear, strongly coupled system, traditional PI control methods can hardly achieve good control performance. A self-optimizing pitch control method based on active-disturbance-rejection control theory is proposed in this paper. A linear model of the wind turbine is derived by linearizing the aerodynamic torque equation, and the dynamic response of the wind turbine is transformed into a first-order linear system. An expert system is designed to optimize the amplification coefficient according to the pitch rate and the speed deviation. The purpose of the proposed control method is to regulate the amplification coefficient automatically and keep the variations of pitch rate and rotor speed within proper ranges. Simulation results show that the proposed pitch control method can effectively modify the amplification coefficient when it becomes unsuitable, keeping the variations of pitch rate and rotor speed within proper ranges.
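    The expert-system idea, rule-based adjustment of an amplification coefficient driven by pitch rate and speed deviation, can be sketched on a first-order linearized plant. All plant numbers, rule thresholds, and adjustment factors below are hypothetical values chosen only to show the mechanism, not the paper's controller.

```python
# First-order linearized rotor-speed-deviation model with a rule-based gain adjuster.
a, b, dt = 0.2, 1.0, 0.05           # plant pole, input gain, time step (hypothetical)
k = 0.5                             # amplification coefficient to be self-optimized
rate_limit, dev_band = 2.0, 0.02    # expert-rule thresholds on pitch rate / speed deviation

speed_dev, u_prev, ks = 1.0, 0.0, []
for _ in range(300):
    u = k * speed_dev                              # proportional pitch command
    pitch_rate = (u - u_prev) / dt
    if abs(pitch_rate) > rate_limit:               # pitching too aggressively: back off
        k *= 0.9
    elif abs(speed_dev) > dev_band and abs(pitch_rate) < 0.25 * rate_limit:
        k *= 1.05                                  # response sluggish: push the gain up
    speed_dev += dt * (-a * speed_dev - b * u)     # first-order plant update (Euler step)
    u_prev = u
    ks.append(k)
```

    The two if-then rules stand in for the expert system: the gain is cut whenever the commanded pitch rate would exceed its limit and raised while the speed deviation persists with margin to spare, so both quantities stay within their ranges without manual retuning.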

  20. PDE-based geophysical modelling using finite elements: examples from 3D resistivity and 2D magnetotellurics

    NASA Astrophysics Data System (ADS)

    Schaa, R.; Gross, L.; du Plessis, J.

    2016-04-01

    We present a general finite-element solver, escript, tailored to solve geophysical forward and inverse modeling problems in terms of partial differential equations (PDEs) with suitable boundary conditions. Escript’s abstract interface allows geoscientists to focus on solving the actual problem without being experts in numerical modeling. General-purpose finite element solvers have found wide use especially in engineering fields and find increasing application in the geophysical disciplines as these offer a single interface to tackle different geophysical problems. These solvers are useful for data interpretation and for research, but can also be a useful tool in educational settings. This paper serves as an introduction into PDE-based modeling with escript where we demonstrate in detail how escript is used to solve two different forward modeling problems from applied geophysics (3D DC resistivity and 2D magnetotellurics). Based on these two different cases, other geophysical modeling work can easily be realized. The escript package is implemented as a Python library and allows the solution of coupled, linear or non-linear, time-dependent PDEs. Parallel execution for both shared and distributed memory architectures is supported and can be used without modifications to the scripts.
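    The class of problems such a solver handles, a linear PDE discretized into a sparse linear system with boundary conditions, can be shown in miniature. The sketch below is not escript's API; it is a generic 1-D finite-difference Poisson solve (the simplest relative of the DC-resistivity potential equation) using the Thomas algorithm for the resulting tridiagonal system.

```python
def solve_poisson_1d(f, n):
    """Solve -u'' = f on (0, 1) with u(0) = u(1) = 0 by central finite differences."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]            # interior grid points
    a, b, c = [-1.0] * n, [2.0] * n, [-1.0] * n    # tridiagonal stencil (scaled by 1/h^2)
    d = [f(xi) * h * h for xi in x]                # right-hand side times h^2
    # Thomas algorithm: forward elimination ...
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    # ... then back substitution
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return x, u

# -u'' = 2 with homogeneous Dirichlet conditions has the exact solution u = x(1 - x).
x, u = solve_poisson_1d(lambda t: 2.0, 99)
```

    Escript's contribution is precisely hiding this discretization layer: the geoscientist states the PDE coefficients and boundary conditions, and the finite-element assembly and linear solve happen behind the abstract interface.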

  1. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    NASA Astrophysics Data System (ADS)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject-matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is very critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that in the case of lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process.
    The proposed technique seeks to offset some of the problems faced when using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSM). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A test case for quantification of epistemic uncertainty on a large-scale problem, a combined-cycle power generation system, was selected. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability-theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing higher-order technology interactions and improved the predicted system performance.
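    The Dempster-Shafer machinery the dissertation relies on reduces to a small amount of code for discrete frames. Below is standard Dempster's rule of combination; the two expert mass functions over a technology-readiness frame are invented for illustration, and mass assigned to non-singleton sets is exactly how epistemic (lack-of-knowledge) uncertainty is represented without forcing a probability distribution.

```python
def combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the same frame."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2            # mass on disjoint sets is conflict
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # renormalize by the non-conflicting mass
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two experts rate a technology's maturity over the frame {low, med, high}
m_expert1 = {frozenset({"low"}): 0.5,
             frozenset({"low", "med"}): 0.3,
             frozenset({"low", "med", "high"}): 0.2}   # 0.2 of pure ignorance
m_expert2 = {frozenset({"med"}): 0.4,
             frozenset({"low", "med"}): 0.6}
fused = combine(m_expert1, m_expert2)
```

    Note how the fused masses sharpen toward the sets both experts can support, while disagreement shows up as the conflict mass that is normalized away; interval bounds (belief and plausibility) then follow by summing masses of subsets and intersecting sets, respectively.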

  2. BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods

    NASA Astrophysics Data System (ADS)

    Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.

    2017-12-01

    Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in the frequency and intensity of heavy rainfall events in many areas and ongoing urbanization may increase pluvial flood losses in the future. For efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed specifically for pluvial floods. These models usually use simple water-level- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage to private households. The data were gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models, were used to identify the most important loss-influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss-influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge.
Loss predictions are made through Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. With the ability to cope with incomplete information and use expert knowledge, as well as inherently providing quantitative uncertainty information, it is shown that loss models based on BNs are superior to deterministic approaches for pluvial flood risk assessment.
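    The ability to cope with incomplete evidence, highlighted as a key advantage of the BN approach, can be shown with a tiny discrete network: two parents (water level, precaution) feeding a loss-class node. The conditional probability table and priors below are hypothetical illustration values, and inference here is exact enumeration rather than the MCMC used in the study.

```python
# P(loss_class | water_level, precaution): hypothetical CPT for illustration
p_loss = {
    ("low", True):   {"minor": 0.8, "major": 0.2},
    ("low", False):  {"minor": 0.6, "major": 0.4},
    ("high", True):  {"minor": 0.4, "major": 0.6},
    ("high", False): {"minor": 0.1, "major": 0.9},
}
p_level = {"low": 0.7, "high": 0.3}          # prior on water level
p_precaution = {True: 0.5, False: 0.5}       # prior on household precaution

def loss_posterior(level=None, precaution=None):
    """Posterior over loss class; any unobserved parent is summed out via its prior."""
    levels = [level] if level is not None else list(p_level)
    precs = [precaution] if precaution is not None else list(p_precaution)
    post, z = {"minor": 0.0, "major": 0.0}, 0.0
    for lv in levels:
        for pc in precs:
            w = ((p_level[lv] if level is None else 1.0) *
                 (p_precaution[pc] if precaution is None else 1.0))
            for cls, p in p_loss[(lv, pc)].items():
                post[cls] += w * p
            z += w
    return {cls: v / z for cls, v in post.items()}
```

    With full evidence the CPT row is returned directly; with a missing input (e.g. precaution unknown) the network marginalizes over it instead of failing, which is exactly the incomplete-information behavior that makes BN loss models attractive for post-event surveys with gaps.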

  3. A multi-criteria index for ecological evaluation of tropical agriculture in southeastern Mexico.

    PubMed

    Huerta, Esperanza; Kampichler, Christian; Ochoa-Gaona, Susana; De Jong, Ben; Hernandez-Daumas, Salvador; Geissen, Violette

    2014-01-01

    The aim of this study was to generate an easy-to-use index to evaluate the ecological state of agricultural land from a sustainability perspective. We selected environmental indicators, such as the use of organic soil amendments (green manure) versus chemical fertilizers, plant biodiversity (including crop associations), variables which characterize soil conservation in conventional agricultural systems, pesticide use, and the method and frequency of tillage. We monitored the ecological state of 52 agricultural plots to test the performance of the index. The variables were hierarchically aggregated with simple mathematical algorithms, if-then rules, and rule-based fuzzy models, yielding the final multi-criteria index with values from 0 (worst) to 1 (best conditions). We validated the model through independent evaluation by experts, and obtained a linear regression with r2 = 0.61 (p = 2.4e-06, d.f. = 49) between the index output and the experts' evaluation.
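    The hierarchical aggregation described, if-then rules scoring raw indicators, then a weighted combination onto a 0-1 scale, can be sketched in a few lines. The rule thresholds, indicator scores, and weights below are invented for illustration and are not the published index.

```python
def tillage_score(frequency_per_year, mechanized):
    """If-then rules mapping a raw practice to a 0 (worst) .. 1 (best) score."""
    if frequency_per_year == 0:
        return 1.0                     # no tillage: best for soil structure
    if frequency_per_year <= 2 and not mechanized:
        return 0.7                     # light manual tillage
    if frequency_per_year <= 2:
        return 0.5                     # light mechanized tillage
    return 0.2                         # frequent tillage

def aggregate(scores, weights):
    """Weighted mean keeps the composite index on the 0..1 scale."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# One plot's indicator scores (hypothetical), combined into the composite index:
plot = {"soil": 0.8, "biodiversity": 0.6, "pesticide": 0.4,
        "tillage": tillage_score(1, mechanized=False)}
index = aggregate(list(plot.values()), [0.3, 0.3, 0.2, 0.2])
```

    The real index replaces the weighted mean with fuzzy rule bases at some levels of the hierarchy, but the overall pattern (score each indicator, then aggregate upward to a single 0-1 value validated against expert judgment) is the same.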

  4. A topological substructural molecular design approach for predicting mutagenesis end-points of alpha, beta-unsaturated carbonyl compounds.

    PubMed

    Pérez-Garrido, Alfonso; Helguera, Aliuska Morales; López, Gabriel Caravaca; Cordeiro, M Natália D S; Escudero, Amalio Garrido

    2010-01-31

    Chemically reactive alpha,beta-unsaturated carbonyl compounds are common environmental pollutants able to produce a wide range of adverse effects, including, e.g., mutagenicity. This toxic property can often be related to chemical structure, in particular to specific molecular substructures or fragments (alerts), which can then be used in specialized software or expert systems for predictive purposes. In the past, there have been many attempts to predict the mutagenicity of alpha,beta-unsaturated carbonyl compounds through quantitative structure-activity relationships (QSAR), but considering only one exclusive endpoint: the Ames test. Moreover, even though those studies give a comprehensive understanding of the phenomenon, they do not provide substructural information that could be useful for improving expert systems based on structural alerts (SAs). This work reports an evaluation of classification models to probe the mutagenic activity of alpha,beta-unsaturated carbonyl compounds over two endpoints, the Ames and mammalian cell gene mutation tests, based on linear discriminant analysis along with the topological substructural molecular design (TOPS-MODE) approach. The results showed that the TOPS-MODE approach was better at flagging structural alerts for the mutagenicity of these compounds than the expert system TOXTREE. Thus, the application of the present QSAR models can aid toxicologists in risk assessment and in prioritizing testing, as well as in the improvement of expert systems, such as the TOXTREE software, where SAs are implemented.

  5. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  6. A Model-Based Expert System for Space Power Distribution Diagnostics

    NASA Technical Reports Server (NTRS)

    Quinn, Todd M.; Schlegelmilch, Richard F.

    1994-01-01

    When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems that perform model-based diagnosis. A model-based diagnostic expert system for the Space Station Freedom electrical power distribution test bed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Constraint suspension techniques were originally developed for digital systems; however, Marple provides the mechanisms for applying this approach to analog systems, such as the test bed, as well. The expert system was developed using Marple and Lucid Common Lisp running on a Sun SPARC-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This report describes work completed to date and lessons learned while employing model-based diagnostics with constraint suspension within an analog system.
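
    The constraint-suspension technique that Marple provides can be shown on a toy two-stage pipeline; the components, values, and fault below are invented, and real diagnostic systems use far richer behavioral models:

```python
# A toy illustration of constraint suspension for model-based diagnosis
# (the general technique, not TRW's actual Marple code). Each component is
# a constraint linking observed variables; a component is a fault candidate
# if suspending its constraint makes all remaining constraints consistent.

def check(constraints, observations, suspended):
    """True if every non-suspended constraint holds on the observations."""
    return all(name == suspended or rule(observations)
               for name, rule in constraints.items())

constraints = {
    # hypothetical analog stages: M1 doubles x, M2 adds a 3 V offset
    "M1": lambda obs: obs["m"] == 2 * obs["x"],
    "M2": lambda obs: obs["y"] == obs["m"] + 3,
}

obs = {"x": 2, "m": 4, "y": 10}  # y should be 7, so something is wrong

candidates = [name for name in constraints
              if check(constraints, obs, suspended=name)]
print(candidates)  # the only consistent suspension isolates the fault
```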

  7. Extension of mixture-of-experts networks for binary classification of hierarchical data.

    PubMed

    Ng, Shu-Kay; McLachlan, Geoffrey J

    2007-09-01

    For many applied problems in the context of medically relevant artificial intelligence, the data collected exhibit a hierarchical or clustered structure. Ignoring the interdependence between hierarchical data can result in misleading classification. In this paper, we extend the mixture-of-experts (ME) network mechanism for binary classification of hierarchical data. A further extension quantifies cluster-specific information on the data hierarchy by random effects via the generalized linear mixed-effects model (GLMM). The extension of ME networks is implemented by allowing for correlation in the hierarchical data in both the gating and expert networks via the GLMM. The proposed model is illustrated using a real thyroid disease data set. In our study, we consider 7652 thyroid diagnosis records from 1984 to early 1987 with complete information on 20 attribute values. We obtain 10 independent random splits of the data into a training set and a test set in the proportions 85% and 15%. The test sets are used to assess the generalization performance of the proposed model, based on the percentage of misclassifications. For comparison, the results obtained from the ME network with the independence assumption are also included. With the thyroid disease data, the misclassification rate on the test sets for the extended ME network is 8.9%, compared to 13.9% for the ME network. In addition, based on the model selection methods described in Section 2, a network with two experts is selected. These two expert networks can be considered as modeling two groups of patients with high and low incidence rates. Significant variation among the predicted cluster-specific random effects is detected in the patient group with low incidence rate. It is shown that the extended ME network outperforms the ME network for binary classification of hierarchical data.
With the thyroid disease data, useful information on the relative log odds of patients with diagnosed conditions at different periods can be evaluated. This information can be taken into consideration for the assessment of treatment planning of the disease. The proposed extended ME network thus facilitates a more general approach to incorporate data hierarchy mechanism in network modeling.
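
    A minimal sketch of the ME architecture being extended: a softmax gating network mixes the outputs of two logistic experts. The weights are arbitrary, and the paper's GLMM random effects for within-cluster correlation are omitted:

```python
import math

# Minimal mixture-of-experts sketch for binary classification: a softmax
# gate weights two logistic expert networks. All weights are invented for
# illustration; the paper's GLMM random-effects extension is not shown.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def softmax(zs):
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_prob(x, gate_w, experts_w):
    """P(y=1 | x) = sum_h gate_h(x) * expert_h(x)."""
    gates = softmax([sum(w * xi for w, xi in zip(wh, x)) for wh in gate_w])
    probs = [sigmoid(sum(w * xi for w, xi in zip(wh, x))) for wh in experts_w]
    return sum(g * p for g, p in zip(gates, probs))

# two experts, e.g. modeling high- and low-incidence patient groups
p = moe_prob(x=[1.0, 0.5],
             gate_w=[[0.2, -0.1], [-0.2, 0.1]],
             experts_w=[[1.5, 2.0], [-1.0, 0.3]])
print(round(p, 3))
```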

  8. A Three-Level Hierarchical Linear Model Using Student Growth Curve Modeling and Contextual Data

    ERIC Educational Resources Information Center

    Giorgio, Dorian

    2012-01-01

    Educational experts have criticized status models of school accountability, as required by the No Child Left Behind Act (NCLB), describing them as ineffectual in measuring achievement because their one-time assessment of student knowledge ignores student growth. Research on student achievement has instead identified growth models as superior…

  9. Upper arm elevation and repetitive shoulder movements: a general population job exposure matrix based on expert ratings and technical measurements.

    PubMed

    Dalbøge, Annett; Hansson, Gert-Åke; Frost, Poul; Andersen, Johan Hviid; Heilskov-Hansen, Thomas; Svendsen, Susanne Wulff

    2016-08-01

    We recently constructed a general population job exposure matrix (JEM), the Shoulder JEM, based on expert ratings. The overall aim of this study was to convert expert-rated job exposures for upper arm elevation and repetitive shoulder movements to measurement scales. The Shoulder JEM covers all Danish occupational titles, divided into 172 job groups. For 36 of these job groups, we obtained technical measurements (inclinometry) of upper arm elevation and repetitive shoulder movements. To validate the expert-rated job exposures against the measured job exposures, we used Spearman rank correlations and the explained variance (R²) according to linear regression analyses (36 job groups). We used the linear regression equations to convert the expert-rated job exposures for all 172 job groups into predicted measured job exposures. Bland-Altman analyses were used to assess the agreement between the predicted and measured job exposures. The Spearman rank correlations were 0.63 for upper arm elevation and 0.64 for repetitive shoulder movements. The expert-rated job exposures explained 64% and 41% of the variance of the measured job exposures, respectively. The corresponding calibration equations were y = 0.5 %time + 0.16 × expert rating and y = 27 °/s + 0.47 × expert rating. The mean differences between predicted and measured job exposures were zero due to calibration; the 95% limits of agreement were ±2.9 %time for upper arm elevation >90° and ±33 °/s for repetitive shoulder movements. The updated Shoulder JEM can be used to present exposure-response relationships on measurement scales.
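
    The two reported steps, calibrating expert ratings with the regression line for upper arm elevation and checking agreement with Bland-Altman limits, can be sketched as follows. Only the calibration equation comes from the abstract; the example ratings and measurements are invented:

```python
import statistics

# Sketch of the conversion step: map expert ratings onto the measurement
# scale with the published calibration line for %time with upper arm
# elevation >90 degrees, then compute Bland-Altman limits of agreement.
# The ratings/measurements below are synthetic demo values.

def predict_percent_time(expert_rating):
    return 0.5 + 0.16 * expert_rating  # calibration equation from the study

def limits_of_agreement(predicted, measured):
    diffs = [p - m for p, m in zip(predicted, measured)]
    mean = sum(diffs) / len(diffs)
    sd = statistics.stdev(diffs)
    return mean - 1.96 * sd, mean + 1.96 * sd

ratings = [5, 10, 20, 40]
measured = [1.2, 2.3, 3.5, 7.1]
predicted = [predict_percent_time(r) for r in ratings]
lo, hi = limits_of_agreement(predicted, measured)
print([round(p, 2) for p in predicted], (round(lo, 2), round(hi, 2)))
```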

  10. Automatic classification of artifactual ICA-components for artifact removal in EEG signals.

    PubMed

    Winkler, Irene; Haufe, Stefan; Tangermann, Michael

    2011-08-02

    Artifacts contained in EEG recordings hamper both the visual interpretation by experts and the algorithmic processing and analysis (e.g. for Brain-Computer Interfaces (BCI) or for Mental State Monitoring). While hand-optimized selection of source components derived from Independent Component Analysis (ICA) to clean EEG data is widespread, the field could greatly profit from automated solutions based on Machine Learning methods. Existing ICA-based removal strategies depend on explicit recordings of an individual's artifacts or have not been shown to reliably identify muscle artifacts. We propose an automatic method for the classification of general artifactual source components. They are estimated by TDSEP, an ICA method that takes temporal correlations into account. The linear classifier is based on an optimized feature subset determined by a Linear Programming Machine (LPM). The subset is composed of features from the frequency, spatial, and temporal domains. A subject-independent classifier was trained on 640 TDSEP components (reaction time (RT) study, n = 12) that were hand-labeled by experts as artifactual or brain sources and tested on 1080 new components of RT data from the same study. Generalization was tested on new data from two studies (auditory Event Related Potential (ERP) paradigm, n = 18; motor imagery BCI paradigm, n = 80) that used data with different channel setups and from new subjects. Based on six features only, the optimized linear classifier performed on par with the inter-expert disagreement (<10% Mean Squared Error (MSE)) on the RT data. On data of the auditory ERP study, the same pre-calculated classifier generalized well and achieved 15% MSE. On data of the motor imagery paradigm, we demonstrate that the discriminant information used for BCI is preserved when removing up to 60% of the most artifactual source components.
We propose a universal and efficient classifier of ICA components for the subject independent removal of artifacts from EEG data. Based on linear methods, it is applicable for different electrode placements and supports the introspection of results. Trained on expert ratings of large data sets, it is not restricted to the detection of eye- and muscle artifacts. Its performance and generalization ability is demonstrated on data of different EEG studies.
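
    A stand-in for the final classification stage: a linear classifier over a small feature subset separating artifact from brain components. The features and data below are synthetic, and the paper's LPM-based feature selection is not reproduced:

```python
# Toy sketch of a linear classifier over two component features. This is a
# perceptron on synthetic data; it only illustrates the idea of a linear
# decision rule on a small feature subset, not the paper's LPM pipeline.

def train_perceptron(data, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:  # label: 1 = artifact, 0 = brain
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# e.g. (muscle-band power, spatial focality) -> component type
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.1, 0.2), 0), ((0.2, 0.1), 0)]
w, b = train_perceptron(data)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
mse = sum((p - y) ** 2 for p, (_, y) in zip(preds, data)) / len(data)
print(preds, mse)
```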

  11. Models Used to Select Strategic Planning Experts for High Technology Productions

    NASA Astrophysics Data System (ADS)

    Zakharova, Alexandra A.; Grigorjeva, Antonina A.; Tseplit, Anna P.; Ozgogov, Evgenij V.

    2016-04-01

    The article deals with the problems and specific aspects of organizing the work of experts involved in the assessment of companies that manufacture complex high-technology products. A model is presented for evaluating the competences of experts in individual functional areas of expertise. Experts are selected for a group on the basis of tables used to determine a competence level. An expert selection model based on fuzzy logic is proposed; it allows additional requirements for the composition of the expert group to be taken into account, with regard to the required quality and the competence-related preferences of decision-makers. A Web-based information system model is developed for the interaction between experts and decision-makers when carrying out online examinations.
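
    One way to sketch fuzzy-logic expert selection is with membership functions over per-area competence scores combined by a fuzzy AND; the thresholds and scores below are invented, not the article's model:

```python
# Hedged sketch of fuzzy expert selection: a trapezoidal shoulder maps an
# expert's score in a functional area to a "highly competent" membership,
# and a fuzzy AND (minimum) combines areas. All numbers are invented.

def high_membership(score, lo=0.5, hi=0.8):
    """0 below lo, 1 above hi, linear in between."""
    if score <= lo:
        return 0.0
    if score >= hi:
        return 1.0
    return (score - lo) / (hi - lo)

def eligibility(area_scores):
    # expert must be competent in *all* required areas: fuzzy AND = min
    return min(high_membership(s) for s in area_scores)

experts = {"A": [0.9, 0.7, 0.8], "B": [0.95, 0.4, 0.9]}
ranked = sorted(experts, key=lambda e: eligibility(experts[e]), reverse=True)
print(ranked, {e: round(eligibility(experts[e]), 3) for e in experts})
```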

  12. Classification of ECG beats using deep belief network and active learning.

    PubMed

    G, Sayantan; T, Kien P; V, Kadambari K

    2018-04-12

    A new semi-supervised approach based on deep learning and active learning for the classification of electrocardiogram (ECG) signals is proposed. The objective of the proposed work is to model a scientific method for the classification of cardiac irregularities using ECG beats. The model follows the Association for the Advancement of Medical Instrumentation (AAMI) standards and consists of three phases. In phase I, a feature representation of the ECG is learnt using a Gaussian-Bernoulli deep belief network, followed by linear support vector machine (SVM) training in the consecutive phase. It yields three deep models based on the AAMI-defined classes, namely N, V, S, and F. In the last phase, a query generator is introduced to interact with the expert to label a few beats to improve accuracy and sensitivity. The proposed approach shows significant improvement in accuracy with minimal queries posed to the expert and fast online training, as tested on the MIT-BIH Arrhythmia Database and the MIT-BIH Supraventricular Arrhythmia Database (SVDB). With 100 queries labeled by the expert in phase III, the method achieves an accuracy of 99.5% in "S" versus all classifications (SVEB) and 99.4% accuracy in "V" versus all classifications (VEB) on the MIT-BIH Arrhythmia Database. Similarly, accuracies of 97.5% for SVEB and 98.6% for VEB are achieved on the SVDB database. Graphical abstract: deep belief network augmented by active learning for efficient prediction of arrhythmia.
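
    The phase-III query generator can be illustrated with uncertainty sampling: the beats whose predicted class probability is closest to 0.5 are sent to the expert for labeling. The probabilities below are a stand-in for the outputs of the paper's DBN+SVM model:

```python
# Sketch of an uncertainty-based query generator: ask the expert to label
# only the beats the current model is least sure about. The probability
# values are invented stand-ins, not outputs of the actual DBN+SVM model.

def least_confident(pool_probs, n_queries):
    """Indices of the n least confident predictions (p closest to 0.5)."""
    order = sorted(range(len(pool_probs)),
                   key=lambda i: abs(pool_probs[i] - 0.5))
    return order[:n_queries]

# model confidence that each unlabeled beat belongs to, say, class "S"
pool_probs = [0.97, 0.52, 0.10, 0.45, 0.88, 0.60]
queries = least_confident(pool_probs, n_queries=2)
print(sorted(queries))  # these beats go to the expert for labeling
```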

  13. Multispectral imaging burn wound tissue classification system: a comparison of test accuracies between several common machine learning algorithms

    NASA Astrophysics Data System (ADS)

    Squiers, John J.; Li, Weizhi; King, Darlene R.; Mo, Weirong; Zhang, Xu; Lu, Yang; Sellke, Eric W.; Fan, Wensheng; DiMaio, J. Michael; Thatcher, Jeffrey E.

    2016-03-01

    The clinical judgment of expert burn surgeons is currently the standard on which diagnostic and therapeutic decision-making regarding burn injuries is based. Multispectral imaging (MSI) has the potential to increase the accuracy of burn depth assessment and the intraoperative identification of viable wound bed during surgical debridement of burn injuries. A highly accurate classification model must be developed using machine-learning techniques in order to translate MSI data into clinically relevant information. An animal burn model was developed to build an MSI training database and to study the burn tissue classification ability of several models trained via common machine-learning algorithms. The algorithms tested, from least to most complex, were: K-nearest neighbors (KNN), decision tree (DT), linear discriminant analysis (LDA), weighted linear discriminant analysis (W-LDA), quadratic discriminant analysis (QDA), ensemble linear discriminant analysis (EN-LDA), ensemble K-nearest neighbors (EN-KNN), and ensemble decision tree (EN-DT). After the ground-truth database of six tissue types (healthy skin, wound bed, blood, hyperemia, partial injury, full injury) was generated by histopathological analysis, we used 10-fold cross-validation to compare the algorithms' performances based on their accuracies in classifying data against the ground truth, and each algorithm was tested 100 times. The mean test accuracies of the algorithms were KNN 68.3%, DT 61.5%, LDA 70.5%, W-LDA 68.1%, QDA 68.9%, EN-LDA 56.8%, EN-KNN 49.7%, and EN-DT 36.5%. LDA had the highest test accuracy, reflecting the bias-variance tradeoff over the range of complexities inherent to the algorithms tested. Several algorithms were able to match the current standard in burn tissue classification, the clinical judgment of expert burn surgeons. These results will guide further development of an MSI burn tissue classification system.
Given that there are few surgeons and facilities specializing in burn care, this technology may improve the standard of burn care for patients without access to specialized facilities.
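
    The evaluation protocol, k-fold cross-validation of a classifier against ground truth, can be sketched with a nearest-class-mean classifier (a crude stand-in for LDA) on synthetic one-dimensional data; none of the data or numbers relate to the study:

```python
import random

# Sketch of 10-fold cross-validation of a simple classifier (nearest class
# mean) on synthetic two-class data. Both the data and the classifier are
# illustrative stand-ins for the MSI dataset and the LDA-family models.

def nearest_mean_predict(train, x):
    groups = {}
    for xi, y in train:
        groups.setdefault(y, []).append(xi)
    centroids = {y: sum(v) / len(v) for y, v in groups.items()}
    return min(centroids, key=lambda y: abs(x - centroids[y]))

def kfold_accuracy(data, k=10, seed=0):
    data = data[:]
    random.Random(seed).shuffle(data)
    folds = [data[i::k] for i in range(k)]
    correct = 0
    for i in range(k):
        test = folds[i]
        train = [d for j, f in enumerate(folds) if j != i for d in f]
        correct += sum(nearest_mean_predict(train, x) == y for x, y in test)
    return correct / len(data)

# two well-separated synthetic classes, e.g. a single spectral feature
data = [(random.Random(i).gauss(0, 1), "healthy") for i in range(50)] + \
       [(random.Random(i).gauss(4, 1), "burn") for i in range(50)]
print(kfold_accuracy(data))
```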

  14. Modeling and simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanham, R.; Vogt, W.G.; Mickle, M.H.

    1986-01-01

    This book presents the papers given at a conference on computerized simulation. Topics considered at the conference included expert systems, modeling in electric power systems, power systems operating strategies, energy analysis, a linear programming approach to optimum load shedding in transmission systems, econometrics, simulation in natural gas engineering, solar energy studies, artificial intelligence, vision systems, hydrology, multiprocessors, and flow models.

  15. Getting expert systems off the ground: Lessons learned from integrating model-based diagnostics with prototype flight hardware

    NASA Technical Reports Server (NTRS)

    Stephan, Amy; Erikson, Carol A.

    1991-01-01

    As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. To develop expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert system and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well-designed and thoroughly integrated application.

  16. A CLIPS-based expert system for the evaluation and selection of robots

    NASA Technical Reports Server (NTRS)

    Nour, Mohamed A.; Offodile, Felix O.; Madey, Gregory R.

    1994-01-01

    This paper describes the development of a prototype expert system for intelligent selection of robots for manufacturing operations. The paper first develops a comprehensive, three-stage process to model the robot selection problem. The decisions involved in this model easily lend themselves to an expert system application. A rule-based system, based on the selection model, is developed using the CLIPS expert system shell. Data about actual robots is used to test the performance of the prototype system. Further extensions to the rule-based system for data handling and interfacing capabilities are suggested.
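
    A CLIPS-style rule-based selection stage can be mimicked in a few lines: rules fire on robot facts, and a robot is shortlisted when all rules fire. The attributes and thresholds below are invented, not those of the actual system:

```python
# Minimal illustration of a rule-based selection stage in the spirit of a
# CLIPS production system, written in Python: rules fire on robot facts
# and a robot qualifies when every rule fires. All values are invented.

RULES = [
    ("payload-ok", lambda r: r["payload_kg"] >= 10),
    ("reach-ok", lambda r: r["reach_mm"] >= 900),
    ("precision-ok", lambda r: r["repeatability_mm"] <= 0.05),
]

def qualify(robot):
    """Apply every rule to the robot facts; collect the rules that fire."""
    fired = [name for name, cond in RULES if cond(robot)]
    return len(fired) == len(RULES), fired

robots = {
    "R1": {"payload_kg": 12, "reach_mm": 1100, "repeatability_mm": 0.03},
    "R2": {"payload_kg": 6, "reach_mm": 1500, "repeatability_mm": 0.02},
}
shortlist = [n for n, r in robots.items() if qualify(r)[0]]
print(shortlist)
```

    A real CLIPS implementation would express these conditions as `defrule` patterns matched against asserted facts; the Python version only shows the all-rules-must-fire selection logic.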

  17. Clinical reasoning in unimodal interventions in patients with non-specific neck pain in daily physiotherapy practice, a Delphi study.

    PubMed

    Maissan, Francois; Pool, Jan; Stutterheim, Eric; Wittink, Harriet; Ostelo, Raymond

    2018-06-02

    Neck pain is the fourth major cause of disability worldwide, but sufficient evidence regarding its treatment is not available. This study is a first exploratory attempt to gain insight into, and consensus on, the clinical reasoning of experts in patients with non-specific neck pain. First, we aimed to inventory expert opinions regarding the indication for physiotherapy when, other than neck pain, no positive signs and symptoms and no positive diagnostic tests are present. Secondly, we aimed to determine which measurement instruments are being used, and when, to support and objectify the clinical reasoning process. Finally, we wanted to establish consensus among experts regarding the use of unimodal interventions in patients with non-specific neck pain, i.e. their sequential linear clinical reasoning. A Web-based Delphi study was conducted, in which fifteen experts (teachers and researchers) participated. Pain alone was deemed not to be an indication for physiotherapy treatment. PROMs are mainly used for evaluative purposes, and physical tests for diagnostic and evaluative purposes. Eighteen different variants of sequential linear clinical reasoning were investigated within our Delphi study; only 6 of the 18 reached more than 50% consensus. Pain alone is not an indication for physiotherapy. Insight has been obtained into which measurement instruments are used and when. Consensus about sequential linear lines of clinical reasoning was poor.

  18. Does an expert-based evaluation allow us to go beyond the Impact Factor? Experiences from building a ranking of national journals in Poland.

    PubMed

    Kulczycki, Emanuel; Rozkosz, Ewa A

    2017-01-01

    This article discusses the Polish Journal Ranking, which is used in the research evaluation system in Poland. In 2015, the ranking, which represents all disciplines, allocated 17,437 journals into three lists: A, B, and C. The B list constitutes a ranking of Polish journals that are indexed neither in the Web of Science nor in the European Reference Index for the Humanities. This ranking was built by evaluating journals in three dimensions: formal, bibliometric, and expert-based. We have analysed data on 2035 Polish journals from the B list. Our study aims to determine how the expert-based evaluation influenced the results of the final evaluation. In our study, we used structural equation modelling, a regression-based technique, and we designed three pairs of theoretical models for three fields of science: (1) humanities, (2) social sciences, and (3) engineering, natural sciences, and medical sciences. Each pair consisted of the full model and the reduced model (i.e., the model without the expert-based evaluation). Our analysis revealed that the multidimensional evaluation of local journals should not rely only on bibliometric indicators based on the Web of Science or Scopus. Moreover, we have shown that the expert-based evaluation plays a major role in all fields of science. We conclude with recommendations that the formal evaluation should be reduced to verifiable parameters and that the expert-based evaluation should be based on common guidelines for the experts.

  19. Multispectral code excited linear prediction coding and its application in magnetic resonance images.

    PubMed

    Hu, J H; Wang, Y; Cahill, P T

    1997-01-01

    This paper reports a multispectral code excited linear prediction (MCELP) method for the compression of multispectral images. Different linear prediction models and adaptation schemes have been compared. The method that uses a forward-adaptive autoregressive (AR) model has been proven to achieve a good compromise between performance, complexity, and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over nonoverlapping three-dimensional (3-D) macroblocks. Each macroblock is further divided into several 3-D microblocks, and the best excitation signal for each microblock is determined through an analysis-by-synthesis procedure. The MFCELP method has been applied to multispectral magnetic resonance (MR) images. To satisfy the high quality requirement for medical images, the error between the original image set and the synthesized one is further specified using a vector quantizer. This method has been applied to images from 26 clinical MR neuro studies (20 slices/study, three spectral bands/slice, 256x256 pixels/band, 12 b/pixel). The MFCELP method provides a significant visual improvement over the discrete cosine transform (DCT) based Joint Photographic Experts Group (JPEG) method, the wavelet-transform-based embedded zero-tree wavelet (EZW) coding method, and the vector tree (VT) coding method, as well as the multispectral segmented autoregressive moving average (MSARMA) method we developed previously.
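
    The core CELP loop, AR synthesis plus an analysis-by-synthesis search over an excitation codebook, can be reduced to a scalar sketch; real MCELP operates on 3-D macroblocks with vector quantization, and every number below is invented:

```python
# Stripped-down sketch of code-excited linear prediction: predict each
# sample with an AR(1) synthesis filter, then pick, by analysis-by-
# synthesis, the codebook excitation that best reconstructs the signal.

def synthesize(a, excitation, x0):
    out = [x0]
    for e in excitation:
        out.append(a * out[-1] + e)  # AR(1) synthesis filter
    return out

def best_excitation(signal, a, codebook):
    best, best_err = None, float("inf")
    for code in codebook:
        rec = synthesize(a, code, signal[0])
        err = sum((s - r) ** 2 for s, r in zip(signal, rec))
        if err < best_err:
            best, best_err = code, err
    return best, best_err

signal = [1.0, 1.5, 1.6, 1.7]
codebook = [(0.5, 0.1, 0.1), (0.2, 0.2, 0.2), (0.0, 0.0, 0.0)]
code, err = best_excitation(signal, a=1.0, codebook=codebook)
print(code, round(err, 3))
```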

  20. The effect of project-based learning on students' statistical literacy levels for data representation

    NASA Astrophysics Data System (ADS)

    Koparan, Timur; Güven, Bülent

    2015-07-01

    The aim of this study is to determine the effect of a project-based learning approach on 8th-grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th-grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before and once after the application. All the raw scores were converted into linear points using the Winsteps 3.72 modelling program, which performs Rasch analysis; t-tests and an ANCOVA were then carried out on the linear points. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the application were shown through the obtained person-item maps.

  1. Goal programming for land use planning.

    Treesearch

    Enoch F. Bell

    1976-01-01

    A simple transformation of the linear programming model used in land use planning to a goal programming model allows the multiple goals implied by multiple-use management to be explicitly recognized. This report outlines the procedure for accomplishing the transformation and discusses problems with the use of goal programming. Of particular concern are the expert opinions...
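
    The LP-to-goal-programming transformation can be illustrated on a toy land-allocation problem: each goal receives an under-achievement deviation variable, and the objective minimizes the weighted sum of deviations instead of a single profit function. A grid search replaces the simplex method here, and all numbers are invented:

```python
# Toy goal-programming sketch: allocate 100 ha between timber and
# recreation so that the weighted shortfall against two goal levels is
# minimized. Goal levels, yields, and weights are invented for the demo.

GOALS = {  # goal level, per-ha contribution by use, weight on shortfall
    "timber_m3": (300, {"timber": 5, "recreation": 0}, 1.0),
    "visitor_days": (400, {"timber": 0, "recreation": 8}, 1.0),
}

def weighted_shortfall(timber_ha):
    rec_ha = 100 - timber_ha
    total = 0.0
    for target, per_ha, weight in GOALS.values():
        achieved = per_ha["timber"] * timber_ha + per_ha["recreation"] * rec_ha
        d_minus = max(0, target - achieved)  # under-achievement deviation
        total += weight * d_minus            # over-achievement not penalized
    return total

# exhaustive search in 1-ha steps stands in for solving the LP
best = min(range(101), key=weighted_shortfall)
print(best, weighted_shortfall(best))
```

    Because the two goals conflict (timber wants more than 60 ha, recreation wants at most 50 ha), no allocation satisfies both; goal programming returns the compromise with the smallest weighted deviation, which a single-objective LP cannot express.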

  2. a Study on Satellite Diagnostic Expert Systems Using Case-Based Approach

    NASA Astrophysics Data System (ADS)

    Park, Young-Tack; Kim, Jae-Hoon; Park, Hyun-Soo

    1997-06-01

    Many research efforts are ongoing to monitor and diagnose diverse malfunctions of satellite systems as the complexity and number of satellites increase. Currently, much of the monitoring and diagnosis is carried out by human experts, but there is a need to automate much of their routine work. Hence, it is necessary to study expert systems that can perform routine tasks automatically, thereby allowing human experts to devote their expertise to more critical and important areas of monitoring and diagnosis. In this paper, we employ artificial intelligence techniques to model human experts' knowledge and to reason over the constructed knowledge. In particular, case-based approaches are used to construct a knowledge base that models human expert capabilities using previous typical exemplars. We have designed and implemented a prototype case-based system for diagnosing satellite malfunctions using cases. Our system remembers typical failure cases and diagnoses a current malfunction by indexing into the case base. Diverse methods are used to build a more user-friendly interface that allows human experts to build a knowledge base easily.
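
    The case-based diagnosis loop, retrieve the most similar past failure case and reuse its diagnosis, can be sketched as follows; the telemetry features, cases, and similarity measure are all invented:

```python
# Sketch of case-based diagnosis: index past malfunction cases by telemetry
# features and retrieve the nearest case for a new anomaly. The features,
# cases, and inverse-distance similarity are illustrative assumptions.

CASE_BASE = [
    # (battery_v, panel_current_a, temp_c) -> diagnosed cause
    ((22.0, 0.1, 35.0), "solar array drive stuck"),
    ((19.5, 4.0, 15.0), "battery cell degradation"),
    ((27.0, 4.2, 70.0), "thermal control failure"),
]

def similarity(a, b):
    # inverse distance; features are assumed to be on comparable scales
    return 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5)

def diagnose(telemetry):
    case, cause = max(CASE_BASE, key=lambda c: similarity(c[0], telemetry))
    return cause

print(diagnose((21.5, 0.3, 33.0)))
```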

  3. Automatic Classification of Artifactual ICA-Components for Artifact Removal in EEG Signals

    PubMed Central

    2011-01-01

    Background Artifacts contained in EEG recordings hamper both the visual interpretation by experts and the algorithmic processing and analysis (e.g. for Brain-Computer Interfaces (BCI) or for Mental State Monitoring). While hand-optimized selection of source components derived from Independent Component Analysis (ICA) to clean EEG data is widespread, the field could greatly profit from automated solutions based on Machine Learning methods. Existing ICA-based removal strategies depend on explicit recordings of an individual's artifacts or have not been shown to reliably identify muscle artifacts. Methods We propose an automatic method for the classification of general artifactual source components. They are estimated by TDSEP, an ICA method that takes temporal correlations into account. The linear classifier is based on an optimized feature subset determined by a Linear Programming Machine (LPM). The subset is composed of features from the frequency, spatial, and temporal domains. A subject-independent classifier was trained on 640 TDSEP components (reaction time (RT) study, n = 12) that were hand-labeled by experts as artifactual or brain sources and tested on 1080 new components of RT data from the same study. Generalization was tested on new data from two studies (auditory Event Related Potential (ERP) paradigm, n = 18; motor imagery BCI paradigm, n = 80) that used data with different channel setups and from new subjects. Results Based on six features only, the optimized linear classifier performed on par with the inter-expert disagreement (<10% Mean Squared Error (MSE)) on the RT data. On data of the auditory ERP study, the same pre-calculated classifier generalized well and achieved 15% MSE. On data of the motor imagery paradigm, we demonstrate that the discriminant information used for BCI is preserved when removing up to 60% of the most artifactual source components.
Conclusions We propose a universal and efficient classifier of ICA components for the subject independent removal of artifacts from EEG data. Based on linear methods, it is applicable for different electrode placements and supports the introspection of results. Trained on expert ratings of large data sets, it is not restricted to the detection of eye- and muscle artifacts. Its performance and generalization ability is demonstrated on data of different EEG studies. PMID:21810266

  4. Predicting the graft survival for heart-lung transplantation patients: an integrated data mining methodology.

    PubMed

    Oztekin, Asil; Delen, Dursun; Kong, Zhenyu James

    2009-12-01

    Predicting the survival of heart-lung transplant patients has the potential to play a critical role in understanding and improving the matching procedure between the recipient and graft. Although voluminous data related to the transplantation procedures is being collected and stored, only a small subset of the predictive factors has been used in modeling heart-lung transplantation outcomes. Previous studies have mainly focused on applying statistical techniques to a small set of factors selected by domain experts in order to reveal simple linear relationships between the factors and survival. The collection of methods known as 'data mining' offers significant advantages over conventional statistical techniques in dealing with the latter's limitations, such as the normality assumption of observations, independence of observations from each other, and linearity of the relationship between the observations and the output measure(s). There are statistical methods that overcome these limitations, yet they are computationally more expensive and do not provide the fast and flexible solutions that data mining techniques offer on large datasets. The main objective of this study is to improve the prediction of outcomes following combined heart-lung transplantation by proposing an integrated data-mining methodology. A large and feature-rich dataset (16,604 cases with 283 variables) is used to (1) develop machine learning based predictive models and (2) extract the most important predictive factors. Then, using three different variable selection methods, namely (i) variables selected by machine learning methods (decision trees, neural networks, and logistic regression), (ii) expert-defined variables based on a literature review, and (iii) common-sense interaction variables, a consolidated set of factors is generated and used to develop Cox regression models for heart-lung graft survival.
The predictive models' performance in terms of 10-fold cross-validation accuracy rates for two multi-imputed datasets ranged from 79% to 86% for neural networks, from 78% to 86% for logistic regression, and from 71% to 79% for decision trees. The results indicate that the proposed integrated data mining methodology using Cox hazard models better predicted graft survival, with different variables, than the conventional approaches commonly used in the literature. This result is validated by comparing the corresponding gains charts for the proposed methodology and the literature-review-based Cox results, and by comparing the Akaike information criterion (AIC) values obtained from each. The data mining-based methodology proposed in this study reveals that there are undiscovered relationships (i.e., interactions of the existing variables) among the survival-related variables, which help better predict the survival of heart-lung transplants. It also brings a different set of variables into the scene to be evaluated by domain experts and considered prior to organ transplantation.
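
    The 10-fold cross-validation protocol used to score the predictive models can be sketched as follows; the nearest-centroid classifier and the synthetic data are stand-ins for the study's neural networks, logistic regression, and registry data (all names and numbers here are illustrative):

```python
import numpy as np

def ten_fold_cv_accuracy(X, y, n_folds=10, seed=0):
    """k-fold CV accuracy with a simple nearest-centroid stand-in classifier."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, n_folds)
    accs = []
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        # Fit: class centroids on the training split
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        # Predict: nearer centroid wins
        d0 = np.linalg.norm(X[test] - c0, axis=1)
        d1 = np.linalg.norm(X[test] - c1, axis=1)
        pred = (d1 < d0).astype(int)
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))

# Synthetic "survived vs. not" data with separable class means
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 5)), rng.normal(1.5, 1, (200, 5))])
y = np.array([0] * 200 + [1] * 200)
print(round(ten_fold_cv_accuracy(X, y), 2))
```

    The accuracy-rate ranges reported above are simply the spread of this per-fold average across models and imputed datasets.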

  5. Head Motion Modeling for Human Behavior Analysis in Dyadic Interaction

    PubMed Central

    Xiao, Bo; Georgiou, Panayiotis; Baucom, Brian; Narayanan, Shrikanth S.

    2015-01-01

    This paper presents a computational study of head motion in human interaction, notably of its role in conveying interlocutors’ behavioral characteristics. Head motion is physically complex and carries rich information; current modeling approaches based on visual signals, however, are still limited in their ability to adequately capture these important properties. Guided by the methodology of kinesics, we propose a data driven approach to identify typical head motion patterns. The approach follows the steps of first segmenting motion events, then parametrically representing the motion by linear predictive features, and finally generalizing the motion types using Gaussian mixture models. The proposed approach is experimentally validated using video recordings of communication sessions from real couples involved in a couples therapy study. In particular we use the head motion model to classify binarized expert judgments of the interactants’ specific behavioral characteristics where entrainment in head motion is hypothesized to play a role: Acceptance, Blame, Positive, and Negative behavior. We achieve accuracies in the range of 60% to 70% for the various experimental settings and conditions. In addition, we describe a measure of motion similarity between the interaction partners based on the proposed model. We show that the relative change of head motion similarity during the interaction significantly correlates with the expert judgments of the interactants’ behavioral characteristics. These findings demonstrate the effectiveness of the proposed head motion model, and underscore the promise of analyzing human behavioral characteristics through signal processing methods. PMID:26557047
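
    The linear predictive parametrization step can be sketched in a few lines; this is a generic autocorrelation-method LPC applied to a synthetic one-dimensional motion trace, not the authors' exact feature pipeline:

```python
import numpy as np

def lpc_coefficients(x, order=4):
    """Autocorrelation-method LPC: solve the Toeplitz normal equations R a = r."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])

# A head-pitch trace dominated by a slow oscillation (synthetic)
t = np.arange(200)
signal = np.sin(2 * np.pi * t / 25) + 0.05 * np.random.default_rng(0).normal(size=200)
a = lpc_coefficients(signal, order=4)

# One-step-ahead prediction from the 4 previous samples
pred = np.array([a @ signal[i - 4:i][::-1] for i in range(4, 200)])
err = float(np.mean((signal[4:] - pred) ** 2))
print(err < 0.05)
```

    Vectors of such coefficients, extracted per motion event, are the kind of features a Gaussian mixture model can then cluster into typical motion types.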

  6. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Volume 1: Study results

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.

  7. Reliability and performance evaluation of systems containing embedded rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.

    1989-01-01

A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge-base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
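
    A minimal version of the first-stage Markov reliability model might look like the following; the three states and transition probabilities are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Three-state Markov reliability model (states: nominal, degraded, failed).
# Per-step transition probabilities are illustrative placeholders.
P = np.array([
    [0.995, 0.004, 0.001],   # nominal -> {nominal, degraded, failed}
    [0.0,   0.990, 0.010],   # degraded can only persist or fail
    [0.0,   0.0,   1.000],   # failed is absorbing
])

def failure_probability(P, steps):
    """Probability of having been absorbed in the failed state after `steps` transitions."""
    state = np.array([1.0, 0.0, 0.0])   # start in the nominal state
    for _ in range(steps):
        state = state @ P
    return float(state[-1])

p_fail_100 = failure_probability(P, 100)
print(0 < p_fail_100 < 1)
```

    In the full method, entries of P would themselves be functions of the expert system's key performance parameters, so the system-level reliability inherits their uncertainty.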

  8. Control of the NASA Langley 16-Foot Transonic Tunnel with the Self-Organizing Feature Map

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    1998-01-01

A predictive, multiple model control strategy is developed based on an ensemble of local linear models of the nonlinear system dynamics for a transonic wind tunnel. The local linear models are estimated directly from the weights of a Self Organizing Feature Map (SOFM). Local linear modeling of nonlinear autonomous systems with the SOFM is extended to a control framework where the modeled system is nonautonomous, driven by an exogenous input. This extension to a control framework is based on the consideration of a finite number of subregions in the control space. Multiple self organizing feature maps collectively model the global response of the wind tunnel to a finite set of representative prototype controls. These prototype controls partition the control space and incorporate experimental knowledge gained from decades of operation. Each SOFM models the combination of the tunnel with one of the representative controls, over the entire range of operation. The SOFM based linear models are used to predict the tunnel response to a larger family of control sequences which are clustered on the representative prototypes. The control sequence which corresponds to the prediction that best satisfies the requirements on the system output is applied as the external driving signal. Each SOFM provides a codebook representation of the tunnel dynamics corresponding to a prototype control. Different dynamic regimes are organized into topological neighborhoods where the adjacent entries in the codebook represent the minimization of a similarity metric which is the essence of the self organizing feature of the map. Thus, the SOFM is additionally employed to identify the local dynamical regime, and consequently implements a switching scheme that selects the best available model for the applied control. 
Experimental results are presented from controlling the wind tunnel with the proposed method during operational runs in which strict research requirements on the control of the Mach number were met. Comparison with similar runs under the same conditions, with the tunnel controlled by either the existing controller or an expert operator, indicates the superiority of the method.
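
    The predict-then-select control loop can be illustrated with a toy codebook of local linear models; the scalar dynamics, prototype names, and candidate sequences below are invented for the sketch and do not reproduce the SOFM weights:

```python
import numpy as np

# Codebook of local linear models, one per prototype control regime.
# The (A, B) pairs are illustrative stand-ins for SOFM-derived weights.
codebook = {
    "low_drive":  (0.90, 0.10),
    "high_drive": (0.70, 0.40),
}

def predict(model, x0, controls):
    """Roll a local linear model x[k+1] = A*x[k] + B*u[k] over a control sequence."""
    A, B = model
    x, traj = x0, []
    for u in controls:
        x = A * x + B * u
        traj.append(x)
    return np.array(traj)

def best_control(x0, target, candidates):
    """Pick the candidate sequence whose predicted endpoint best meets the target."""
    best, best_err = None, np.inf
    for name, (model_key, seq) in candidates.items():
        endpoint = predict(codebook[model_key], x0, seq)[-1]
        if abs(endpoint - target) < best_err:
            best, best_err = name, abs(endpoint - target)
    return best

candidates = {
    "gentle": ("low_drive", [1.0] * 10),
    "aggressive": ("high_drive", [1.0] * 10),
}
print(best_control(x0=0.0, target=1.3, candidates=candidates))
```

    The real controller additionally uses the SOFM itself to decide which codebook entry (dynamic regime) currently applies before making such predictions.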

  9. An expert system for prediction of aquatic toxicity of contaminants

    USGS Publications Warehouse

    Hickey, James P.; Aldridge, Andrew J.; Passino, Dora R. May; Frank, Anthony M.; Hushon, Judith M.

    1990-01-01

    The National Fisheries Research Center-Great Lakes has developed an interactive computer program in muLISP that runs on an IBM-compatible microcomputer and uses a linear solvation energy relationship (LSER) to predict acute toxicity to four representative aquatic species from the detailed structure of an organic molecule. Using the SMILES formalism for a chemical structure, the expert system identifies all structural components and uses a knowledge base of rules based on an LSER to generate four structure-related parameter values. A separate module then relates these values to toxicity. The system is designed for rapid screening of potential chemical hazards before laboratory or field investigations are conducted and can be operated by users with little toxicological background. This is the first expert system based on LSER, relying on the first comprehensive compilation of rules and values for the estimation of LSER parameters.
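
    The LSER prediction step reduces to evaluating a linear equation in the four solvatochromic parameters; the coefficients below are hypothetical placeholders rather than the system's species-specific regressions:

```python
# Linear solvation energy relationship (LSER) of the general form
#   log(1/LC50) = c0 + m*(V/100) + s*pi_star + b*beta + a*alpha
# The coefficients are hypothetical placeholders, not the published
# species-specific values used by the expert system.
COEFFS = {"c0": 0.6, "m": 3.2, "s": -0.6, "b": -2.1, "a": 0.1}

def lser_log_toxicity(V, pi_star, beta, alpha, coeffs=COEFFS):
    """Predict log(1/LC50) from the four structure-derived parameters."""
    return (coeffs["c0"] + coeffs["m"] * V / 100.0
            + coeffs["s"] * pi_star + coeffs["b"] * beta
            + coeffs["a"] * alpha)

# Hypothetical parameter values for a small aromatic solute
print(round(lser_log_toxicity(V=89.0, pi_star=0.59, beta=0.10, alpha=0.0), 3))
```

    The expert system's real contribution is upstream of this step: deriving the four parameter values from a SMILES structure via its rule base.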

  10. Developing a job-exposure matrix with exposure uncertainty from expert elicitation and data modeling.

    PubMed

    Fischer, Heidi J; Vergara, Ximena P; Yost, Michael; Silva, Michael; Lombardi, David A; Kheifets, Leeka

    2017-01-01

    Job exposure matrices (JEMs) are tools used to classify exposures for job titles based on general job tasks in the absence of individual level data. However, exposure uncertainty due to variations in worker practices, job conditions, and the quality of data has never been quantified systematically in a JEM. We describe a methodology for creating a JEM which defines occupational exposures on a continuous scale and utilizes elicitation methods to quantify exposure uncertainty by assigning exposures probability distributions with parameters determined through expert involvement. Experts use their knowledge to develop mathematical models using related exposure surrogate data in the absence of available occupational level data and to adjust model output against other similar occupations. Formal expert elicitation methods provided a consistent, efficient process to incorporate expert judgment into a large, consensus-based JEM. A population-based electric shock JEM was created using these methods, allowing for transparent estimates of exposure.
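
    Assigning each job title a probability distribution, rather than a point exposure, can be sketched as follows; the job titles and lognormal parameters are illustrative, not elicited values from the study:

```python
import numpy as np

# A toy job-exposure matrix: each job title carries a lognormal exposure
# distribution whose (mu, sigma) parameters stand in for expert-elicited
# values. Job titles and numbers are illustrative only.
jem = {
    "electrician":  {"mu": np.log(2.0), "sigma": 0.8},
    "office_clerk": {"mu": np.log(0.1), "sigma": 0.4},
}

def sample_exposure(job, n, rng):
    """Draw n exposure values for a job title from its elicited distribution."""
    p = jem[job]
    return rng.lognormal(mean=p["mu"], sigma=p["sigma"], size=n)

rng = np.random.default_rng(42)
elec = sample_exposure("electrician", 10_000, rng)
clerk = sample_exposure("office_clerk", 10_000, rng)
# Medians recover the elicited central values; the spread encodes uncertainty
print(round(float(np.median(elec)), 1), round(float(np.median(clerk)), 2))
```

    Downstream epidemiologic analyses can then propagate this per-title uncertainty instead of treating the JEM entry as exact.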

  11. Design of an expert system for the development and formulation of push-pull osmotic pump tablets containing poorly water-soluble drugs.

    PubMed

    Zhang, Zhi-hong; Dong, Hong-ye; Peng, Bo; Liu, Hong-fei; Li, Chun-lei; Liang, Min; Pan, Wei-san

    2011-05-30

The purpose of this article was to build an expert system for the development and formulation of push-pull osmotic pump tablets (PPOP). Hundreds of PPOP formulations containing different poorly water-soluble drugs and pharmaceutically acceptable excipients were studied. The knowledge base, comprising a database and a rule base, was built from the reported results of these formulations and from the experience of other researchers. The prediction model of release behavior was built using a back-propagation (BP) neural network, which excels at nonlinear mapping and learning. The formulation design model, the nucleus of the inference engine, was established on top of this release-behavior prediction model. Finally, the expert system program was constructed in VB.NET with SQL Server. Expert systems are among the most active areas of artificial intelligence, yet to date no expert system has been available for the formulation of controlled-release dosage forms. Moreover, osmotic pump technology (OPT) is steadily maturing worldwide, so applying an expert system to OPT is worthwhile. Famotidine, a water-insoluble drug, was chosen as the model drug to validate the applicability of the developed expert system. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Scalable Options for Extended Skill Building Following Didactic Training in Cognitive-Behavioral Therapy for Anxious Youth: A Pilot Randomized Trial.

    PubMed

    Chu, Brian C; Carpenter, Aubrey L; Wyszynski, Christopher M; Conklin, Phoebe H; Comer, Jonathan S

    2017-01-01

    A sizable gap exists between the availability of evidence-based psychological treatments and the number of community therapists capable of delivering such treatments. Limited time, resources, and access to experts prompt the need for easily disseminable, lower cost options for therapist training and continued support beyond initial training. A pilot randomized trial tested scalable extended support models for therapists following initial training. Thirty-five postdegree professionals (43%) or graduate trainees (57%) from diverse disciplines viewed an initial web-based training in cognitive-behavioral therapy (CBT) for youth anxiety and then were randomly assigned to 10 weeks of expert streaming (ES; viewing weekly online supervision sessions of an expert providing consultation), peer consultation (PC; non-expert-led group discussions of CBT), or fact sheet self-study (FS; weekly review of instructional fact sheets). In initial expectations, trainees rated PC as more appropriate and useful to meet its goals than either ES or FS. At post, all support programs were rated as equally satisfactory and useful for therapists' work, and comparable in increasing self-reported use of CBT strategies (b = .19, p = .02). In contrast, negative linear trends were found on a knowledge quiz (b = -1.23, p = .01) and self-reported beliefs about knowledge (b = -1.50, p < .001) and skill (b = -1.15, p < .001). Attrition and poor attendance presented a moderate concern for PC, and ES was rated as having the lowest implementation potential. Preliminary findings encourage further development of low-cost, scalable options for continued support of evidence-based training.

  13. A Psychological Model for Aggregating Judgments of Magnitude

    NASA Astrophysics Data System (ADS)

    Merkle, Edgar C.; Steyvers, Mark

    In this paper, we develop and illustrate a psychologically-motivated model for aggregating judgments of magnitude across experts. The model assumes that experts' judgments are perturbed from the truth by both systematic biases and random error, and it provides aggregated estimates that are implicitly based on the application of nonlinear weights to individual judgments. The model is also easily extended to situations where experts report multiple quantile judgments. We apply the model to expert judgments concerning flange leaks in a chemical plant, illustrating its use and comparing it to baseline measures.
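
    The bias-plus-error judgment model suggests a simple aggregation sketch: correct each expert's log-judgment for its systematic bias, then average. The biases here are assumed known for illustration, whereas the paper estimates them within the model:

```python
import numpy as np

# Toy bias-corrected aggregation: each expert's log-judgment equals the log
# of the truth plus a systematic bias and random error. With bias estimates
# in hand (assumed known here), aggregation is a bias-corrected mean on the
# log scale, i.e., a reweighted geometric mean of the raw judgments.
rng = np.random.default_rng(7)
truth = 50.0
biases = np.array([0.3, -0.2, 0.1])   # systematic over/under-estimation per expert
noise_sd = 0.05
judgments = truth * np.exp(biases + rng.normal(0, noise_sd, size=3))

naive = float(judgments.mean())
corrected = float(np.exp(np.mean(np.log(judgments) - biases)))
print(abs(corrected - truth) < abs(naive - truth))
```

    Because the correction is applied on the log scale, the aggregate is equivalent to applying nonlinear (multiplicative) weights to the individual judgments, in the spirit of the model described above.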

  14. Key properties of expert movement systems in sport : an ecological dynamics perspective.

    PubMed

    Seifert, Ludovic; Button, Chris; Davids, Keith

    2013-03-01

    This paper identifies key properties of expertise in sport predicated on the performer-environment relationship. Weaknesses of traditional approaches to expert performance, which uniquely focus on the performer and the environment separately, are highlighted by an ecological dynamics perspective. Key properties of expert movement systems include 'multi- and meta-stability', 'adaptive variability', 'redundancy', 'degeneracy' and the 'attunement to affordances'. Empirical research on these expert system properties indicates that skill acquisition does not emerge from the internal representation of declarative and procedural knowledge, or the imitation of expert behaviours to linearly reduce a perceived 'gap' separating movements of beginners and a putative expert model. Rather, expert performance corresponds with the ongoing co-adaptation of an individual's behaviours to dynamically changing, interacting constraints, individually perceived and encountered. The functional role of adaptive movement variability is essential to expert performance in many different sports (involving individuals and teams; ball games and outdoor activities; land and aquatic environments). These key properties signify that, in sport performance, although basic movement patterns need to be acquired by developing athletes, there exists no ideal movement template towards which all learners should aspire, since relatively unique functional movement solutions emerge from the interaction of key constraints.

  15. Computer-Vision-Assisted Palm Rehabilitation With Supervised Learning.

    PubMed

    Vamsikrishna, K M; Dogra, Debi Prosad; Desarkar, Maunendra Sankar

    2016-05-01

Physical rehabilitation supported by computer-assisted interfaces is gaining popularity in the health-care community. In this paper, we have proposed a computer-vision-assisted contactless methodology to facilitate palm and finger rehabilitation. A Leap Motion controller has been interfaced with a computing device to record parameters describing 3-D movements of the palm of a user undergoing rehabilitation. We have proposed an interface using the Unity3D development platform. Our interface is capable of analyzing intermediate steps of rehabilitation without the help of an expert, and it can provide online feedback to the user. Isolated gestures are classified using linear discriminant analysis (DA) and support vector machines (SVM). Finally, a set of discrete hidden Markov models (HMM) has been used to classify gesture sequences performed during rehabilitation. Experimental validation using a large number of samples collected from healthy volunteers reveals that DA and SVM perform similarly when applied to isolated gesture recognition. We have compared the results of HMM-based sequence classification with CRF-based techniques. Our results confirm that both HMM and CRF perform quite similarly when tested on gesture sequences. The proposed system can be used for home-based palm or finger rehabilitation in the absence of experts.

  16. Measures of Agreement Between Many Raters for Ordinal Classifications

    PubMed Central

    Nelson, Kerrie P.; Edwards, Don

    2015-01-01

    Screening and diagnostic procedures often require a physician's subjective interpretation of a patient's test result using an ordered categorical scale to define the patient's disease severity. Due to wide variability observed between physicians’ ratings, many large-scale studies have been conducted to quantify agreement between multiple experts’ ordinal classifications in common diagnostic procedures such as mammography. However, very few statistical approaches are available to assess agreement in these large-scale settings. Existing summary measures of agreement rely on extensions of Cohen's kappa [1 - 5]. These are prone to prevalence and marginal distribution issues, become increasingly complex for more than three experts or are not easily implemented. Here we propose a model-based approach to assess agreement in large-scale studies based upon a framework of ordinal generalized linear mixed models. A summary measure of agreement is proposed for multiple experts assessing the same sample of patients’ test results according to an ordered categorical scale. This measure avoids some of the key flaws associated with Cohen's kappa and its extensions. Simulation studies are conducted to demonstrate the validity of the approach with comparison to commonly used agreement measures. The proposed methods are easily implemented using the software package R and are applied to two large-scale cancer agreement studies. PMID:26095449
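
    The model-based view of agreement can be illustrated by Monte Carlo: raters share a latent severity for each patient but perceive it with rater-specific noise, and agreement falls as that noise grows. This is a simplified stand-in for the paper's ordinal generalized linear mixed model, not its actual estimator:

```python
import numpy as np

# Each patient has a latent severity; each rater perceives it with noise and
# classifies it into ordered categories via shared cutpoints. Agreement is
# the probability that two raters place the same patient in the same category.
cutpoints = np.array([-0.5, 0.5])   # three ordered categories

def agreement(rater_sd, n=50_000, seed=3):
    rng = np.random.default_rng(seed)
    z = rng.normal(size=n)                                  # latent severities
    r1 = np.digitize(z + rng.normal(0, rater_sd, n), cutpoints)
    r2 = np.digitize(z + rng.normal(0, rater_sd, n), cutpoints)
    return float((r1 == r2).mean())

low_noise, high_noise = agreement(0.1), agreement(1.0)
print(low_noise > high_noise)
```

    A model-based summary measure of this kind is a function of fitted variance components, so it avoids the prevalence and marginal-distribution artifacts that affect kappa-type statistics.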

  17. The Last Millennium Reanalysis: Improvements to proxies and proxy modeling

    NASA Astrophysics Data System (ADS)

    Tardif, R.; Hakim, G. J.; Emile-Geay, J.; Noone, D.; Anderson, D. M.

    2017-12-01

    The Last Millennium Reanalysis (LMR) employs a paleoclimate data assimilation (PDA) approach to produce climate field reconstructions (CFRs). Here, we focus on two key factors in PDA generated CFRs: the set of assimilated proxy records and forward models (FMs) used to estimate proxies from climate model output. In the initial configuration of the LMR [Hakim et al., 2016], the proxy dataset of [PAGES2k Consortium, 2013] was used, along with univariate linear FMs calibrated against annually-averaged 20th century temperature datasets. In an updated configuration, proxy records from the recent dataset [PAGES2k Consortium, 2017] are used, while a hierarchy of statistical FMs are tested: (1) univariate calibrated on annual temperature as in the initial configuration, (2) univariate against temperature as in (1) but calibration performed using expert-derived seasonality for individual proxy records, (3) as in (2) but expert proxy seasonality replaced by seasonal averaging determined objectively as part of the calibration process, (4) linear objective seasonal FMs as in (3) but objectively selecting relationships calibrated either on temperature or precipitation, and (5) bivariate linear models calibrated on temperature and precipitation with objectively-derived seasonality. (4) and (5) specifically aim at better representing the physical drivers of tree ring width proxies. Reconstructions generated using the CCSM4 Last Millennium simulation as an uninformed prior are evaluated against various 20th century data products. Results show the benefits of using the new proxy collection, particularly on the detrended global mean temperature and spatial patterns. The positive impact of using proper seasonality and temperature/moisture sensitivities for tree ring width records is also notable. This updated configuration will be used for the first generation of LMR-generated CFRs to be publicly released. 
These also provide a benchmark for future efforts aimed at evaluating the impact of additional proxy records and/or more sophisticated physically-based forward models. References: Hakim, G. J., and co-authors (2016), J. Geophys. Res. Atmos., doi:10.1002/2016JD024751; PAGES2k Consortium (2013), Nat. Geosci., doi:10.1038/ngeo1797; PAGES2k Consortium (2017), Sci. Data, doi:10.1038/sdata.2017.88

  18. Improved quantification of important beer quality parameters based on nonlinear calibration methods applied to FT-MIR spectra.

    PubMed

    Cernuda, Carlos; Lughofer, Edwin; Klein, Helmut; Forster, Clemens; Pawliczek, Marcin; Brandstetter, Markus

    2017-01-01

During the production process of beer, it is of utmost importance to guarantee a high consistency of the beer quality. For instance, the bitterness is an essential quality parameter which has to be controlled within the specifications at the beginning of the production process in the unfermented beer (wort) as well as in final products such as beer and beer mix beverages. Nowadays, analytical techniques for quality control in beer production are mainly based on manual supervision, i.e., samples are taken from the process and analyzed in the laboratory. This typically requires significant effort from lab technicians while only a small fraction of samples can be analyzed, which leads to significant costs for beer breweries and companies. Fourier transform mid-infrared (FT-MIR) spectroscopy was used in combination with nonlinear multivariate calibration techniques to overcome (i) the time-consuming off-line analyses in beer production and (ii) already known limitations of standard linear chemometric methods, like partial least squares (PLS), for important quality parameters such as bitterness, citric acid, total acids, free amino nitrogen, final attenuation, or foam stability (Speers et al., J. Inst. Brewing 2003;109(3):229-235; Zhang et al., J. Inst. Brewing 2012;118(4):361-367). The calibration models are established with enhanced nonlinear techniques based (i) on a new piecewise-linear version of PLS that employs fuzzy rules for locally partitioning the latent variable space and (ii) on extensions of support vector regression variants (ε-PLSSVR and ν-PLSSVR), for overcoming high computation times in high-dimensional problems and time-intensive, inappropriate settings of the kernel parameters. Furthermore, we introduce a new model selection scheme based on bagged ensembles in order to improve robustness and thus the predictive quality of the final models. 
The approaches are tested on real-world calibration data sets for wort and beer mix beverages, and successfully compared to linear methods, showing a clear outperformance in most cases and meeting the model quality requirements defined by the experts at the beer company. Figure: Workflow for calibration of nonlinear model ensembles from FT-MIR spectra in beer production.
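
    The bagged-ensemble model selection idea can be sketched with plain least-squares base models; the synthetic "spectra" and ordinary linear learners below stand in for the FT-MIR data and the fuzzy piecewise-linear PLS / SVR variants:

```python
import numpy as np

# A minimal bagged ensemble of linear calibration models: bootstrap the
# calibration samples, fit least-squares models, and average predictions.
def fit_bagged(X, y, n_models=25, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, n)                    # bootstrap resample
        Xb = np.c_[np.ones(len(idx)), X[idx]]          # intercept column
        w, *_ = np.linalg.lstsq(Xb, y[idx], rcond=None)
        models.append(w)
    return models

def predict_bagged(models, X):
    Xb = np.c_[np.ones(len(X)), X]
    return np.mean([Xb @ w for w in models], axis=0)

# Synthetic "spectra": 8 channels, a bitterness-like target driven by two of them
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 8))
y = 20 + 3.0 * X[:, 1] - 1.5 * X[:, 4] + rng.normal(0, 0.3, 120)
models = fit_bagged(X[:80], y[:80])
rmse = float(np.sqrt(np.mean((predict_bagged(models, X[80:]) - y[80:]) ** 2)))
print(rmse < 1.0)
```

    Averaging over bootstrap refits is what buys the robustness the abstract mentions: no single calibration sample can dominate the final model.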

  19. The weighted priors approach for combining expert opinions in logistic regression experiments

    DOE PAGES

    Quinlan, Kevin R.; Anderson-Cook, Christine M.; Myers, Kary L.

    2017-04-24

When modeling the reliability of a system or component, it is not uncommon for more than one expert to provide very different prior estimates of the expected reliability as a function of an explanatory variable such as age or temperature. Our goal in this paper is to incorporate all information from the experts when choosing a design about which units to test. Bayesian design of experiments has been shown to be very successful for generalized linear models, including logistic regression models. We use this approach to develop methodology for the case where there are several potentially non-overlapping priors under consideration. While multiple priors have been used for analysis in the past, they have never been used in a design context. The Weighted Priors method performs well for a broad range of true underlying model parameter choices and is more robust when compared to other reasonable design choices. Finally, we illustrate the method through multiple scenarios and a motivating example. Additional figures for this article are available in the online supplementary information.
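
    A stripped-down version of the weighted-priors design idea: draw parameters from each expert's prior, score each candidate test condition by a prior-weighted expected information, and pick the best. The priors, weights, and scalar information criterion are illustrative simplifications of the full Bayesian D-optimal machinery:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two experts give very different priors on the (intercept, slope) of a
# logistic reliability model; all numbers here are invented for the sketch.
def draw_params(rng, mean, sd, n):
    return rng.normal(mean, sd, size=(n, 2))

def weighted_score(x, samples_a, samples_b, w=0.5):
    """Prior-weighted expected per-trial information factor p(1-p) at condition x."""
    info = 0.0
    for weight, samples in ((w, samples_a), (1 - w, samples_b)):
        p = sigmoid(samples[:, 0] + samples[:, 1] * x)
        info += weight * np.mean(p * (1 - p))
    return info

rng = np.random.default_rng(5)
expert_a = draw_params(rng, mean=[-2.0, 1.0], sd=0.3, n=2000)  # transition near x = 2
expert_b = draw_params(rng, mean=[-6.0, 1.0], sd=0.3, n=2000)  # transition near x = 6
candidates = np.linspace(0, 8, 33)
best_x = candidates[np.argmax([weighted_score(x, expert_a, expert_b) for x in candidates])]
print(min(abs(best_x - 2.0), abs(best_x - 6.0)) < 1.0)
```

    Even this crude criterion pulls the chosen test condition toward a region where at least one expert expects the response to be informative, rather than splitting the difference where both priors predict near-certain outcomes.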

  1. Use of occupancy models to evaluate expert knowledge-based species-habitat relationships

    USGS Publications Warehouse

    Iglecia, Monica N.; Collazo, Jaime A.; McKerrow, Alexa

    2012-01-01

    Expert knowledge-based species-habitat relationships are used extensively to guide conservation planning, particularly when data are scarce. Purported relationships describe the initial state of knowledge, but are rarely tested. We assessed support in the data for suitability rankings of vegetation types based on expert knowledge for three terrestrial avian species in the South Atlantic Coastal Plain of the United States. Experts used published studies, natural history, survey data, and field experience to rank vegetation types as optimal, suitable, and marginal. We used single-season occupancy models, coupled with land cover and Breeding Bird Survey data, to examine the hypothesis that patterns of occupancy conformed to the species-habitat suitability rankings purported by experts. Purported habitat suitability was validated for two of the three species. As predicted for the Eastern Wood-Pewee (Contopus virens) and Brown-headed Nuthatch (Sitta pusilla), occupancy was strongly influenced by vegetation types classified as “optimal habitat” by the suitability rankings for nuthatches and wood-pewees. Contrary to predictions, Red-headed Woodpecker (Melanerpes erythrocephalus) models that included vegetation types as covariates received similar support from the data as models without vegetation types. For all three species, occupancy was also related to sampling latitude. Our results suggest that covariates representing other habitat requirements might be necessary to model occurrence of generalist species like the woodpecker. The modeling approach described herein provides a means to test expert knowledge-based species-habitat relationships, and hence, help guide conservation planning.
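
    The single-season occupancy likelihood underlying this kind of analysis can be written down directly; the detection histories and grid-search fit below are toy illustrations, not the Breeding Bird Survey data or the authors' estimation procedure:

```python
import numpy as np

# Single-season occupancy model: a site is occupied with probability psi;
# given occupancy, each of K visits detects the species with probability p.
# Sites with no detections may be unoccupied or occupied-but-missed.
def log_likelihood(psi, p, histories):
    ll = 0.0
    for h in histories:
        K, d = len(h), sum(h)
        if d > 0:
            ll += np.log(psi) + d * np.log(p) + (K - d) * np.log(1 - p)
        else:
            ll += np.log(psi * (1 - p) ** K + (1 - psi))
    return ll

# Toy data: 4 sites, 3 visits each (1 = detected)
histories = [(1, 0, 1), (0, 0, 0), (1, 1, 1), (0, 1, 0)]

# Crude grid-search MLE over (psi, p)
grid = np.linspace(0.05, 0.95, 19)
best = max(((log_likelihood(psi, p, histories), psi, p) for psi in grid for p in grid))
print(best[1] > 0.5)
```

    Habitat covariates enter by modeling psi (and optionally p) as logistic functions of vegetation type, which is how the expert rankings were confronted with the data.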

  2. EX.MAIN. Expert System Model for Maintenance and Staff Training.

    ERIC Educational Resources Information Center

    Masturzi, Elio R.

    EX.MAIN, a model for maintenance and staff training which combines knowledge based expert systems and computer based training, was developed jointly by the Department of Production Engineering of the University of Naples and CIRCUMVESUVIANA, the largest private railroad in Italy. It is a global model in the maintenance field which contains both…

  3. Expert systems applied to spacecraft fire safety

    NASA Technical Reports Server (NTRS)

    Smith, Richard L.; Kashiwagi, Takashi

    1989-01-01

    Expert systems are problem-solving programs that combine a knowledge base and a reasoning mechanism to simulate a human expert. The development of an expert system to manage fire safety in spacecraft, in particular the NASA Space Station Freedom, is difficult but clearly advantageous in the long-term. Some needs in low-gravity flammability characteristics, ventilating-flow effects, fire detection, fire extinguishment, and decision models, all necessary to establish the knowledge base for an expert system, are discussed.

  4. Temporal and contextual knowledge in model-based expert systems

    NASA Technical Reports Server (NTRS)

    Toth-Fejel, Tihamer; Heher, Dennis

    1987-01-01

    A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.

  5. Development of Learning Models Based on Problem Solving and Meaningful Learning Standards by Expert Validity for Animal Development Course

    NASA Astrophysics Data System (ADS)

    Lufri, L.; Fitri, R.; Yogica, R.

    2018-04-01

    The purpose of this study is to produce a learning model based on problem solving and meaningful learning standards, validated by expert assessment, for the course of Animal Development. This is development research producing a product in the form of a learning model, which consists of two sub-products: the syntax of the learning model and student worksheets. All of these products are standardized through expert validation. The research data are the validity levels of all sub-products, obtained using a questionnaire filled in by validators from various fields of expertise (field of study, learning strategy, Bahasa). Data were analysed using descriptive statistics. The results show that the problem-solving and meaningful-learning model has been produced. Sub-products declared appropriate by the experts include the syntax of the learning model and the student worksheet.

  6. Cooperating Expert Systems For Space Station Power Distribution Management

    NASA Astrophysics Data System (ADS)

    Nguyen, T. A.; Chiou, W. C.

    1987-02-01

In a complex system such as the manned Space Station, it is deemed necessary that many expert systems perform tasks in a concurrent and cooperative manner. An important question arises: what cooperative-task-performing models are appropriate for multiple expert systems jointly performing tasks? The answer will provide a crucial automation design criterion for the Space Station complex systems architecture. Based on a client/server model for performing tasks, we have developed a system that acts as a front-end to support loosely coupled communications between expert systems running on multiple Symbolics machines. As an example, we use two ART*-based expert systems to demonstrate the concept of parallel symbolic manipulation for power distribution management and a dynamic load planner/scheduler in the simulated Space Station environment. This ongoing work will also explore other cooperative-task-performing models as alternatives and evaluate inter- and intra-expert-system communication mechanisms. It will serve as a testbed and benchmarking tool for other Space Station expert subsystem communication and information exchange.

  7. Objective calibration of regional climate models

    NASA Astrophysics Data System (ADS)

Bellprat, O.; Kotlarski, S.; Lüthi, D.; Schär, C.

    2012-12-01

Climate models are subject to high parametric uncertainty induced by poorly confined model parameters of parameterized physical processes. Uncertain model parameters are typically calibrated in order to increase the agreement of the model with available observations. The common practice is to adjust uncertain model parameters manually, often referred to as expert tuning, which lacks objectivity and transparency in the use of observations. These shortcomings often cloud model inter-comparisons and hinder the implementation of new model parameterizations. Methods that would allow model parameters to be calibrated systematically are unfortunately often not applicable to state-of-the-art climate models, owing to the computational constraints imposed by the high dimensionality and non-linearity of the problem. Here we present an approach to objectively calibrate a regional climate model, using reanalysis-driven simulations and building upon a quadratic metamodel presented by Neelin et al. (2010) that serves as a computationally cheap surrogate of the model. Five model parameters originating from different parameterizations are selected for the optimization according to their influence on the model performance. The metamodel accurately estimates spatial averages of 2 m temperature, precipitation and total cloud cover, with an uncertainty of similar magnitude to the internal variability of the regional climate model. The non-linearities of the parameter perturbations are well captured, such that only a limited number of simulations (20-50) are needed to estimate optimal parameter settings. Parameter interactions are small, which allows the number of simulations to be reduced further. In comparison to an ensemble of the same model that has undergone expert tuning, the calibration yields similar optimal model configurations while achieving an additional reduction of the model error.
The performance range captured is much wider than sampled with the expert-tuned ensemble and the presented methodology is effective and objective. It is argued that objective calibration is an attractive tool and could become standard procedure after introducing new model implementations, or after a spatial transfer of a regional climate model. Objective calibration of parameterizations with regional models could also serve as a strategy toward improving parameterization packages of global climate models.
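The quadratic metamodel idea can be sketched in a few lines: fit a second-order surrogate of model error as a function of one uncertain parameter from a handful of expensive simulations, then read the calibrated value off the surrogate's minimum. The parameter settings and error scores below are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical illustration of a quadratic metamodel: approximate model error
# (e.g. RMSE against reanalysis) as a quadratic function of one uncertain,
# normalized parameter, fitted from a handful of expensive simulations.
params = np.array([0.0, 0.25, 0.5, 0.75, 1.0])     # normalized parameter settings
errors = np.array([2.10, 1.45, 1.12, 1.08, 1.35])  # model error at each setting

# Least-squares fit of error(p) ~ c2*p**2 + c1*p + c0.
c2, c1, c0 = np.polyfit(params, errors, deg=2)

# For a convex fit (c2 > 0), the surrogate's minimum is the calibrated
# parameter estimate, obtained without further runs of the expensive model.
p_opt = -c1 / (2.0 * c2)
```

With several parameters the same idea extends to a multivariate quadratic with interaction terms, which is why the small number of simulations reported (20-50) suffices.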

  8. SWAN: An expert system with natural language interface for tactical air capability assessment

    NASA Technical Reports Server (NTRS)

    Simmons, Robert M.

    1987-01-01

SWAN is an expert system and natural language interface for assessing the war-fighting capability of Air Force units in Europe. The expert system is an object-oriented, knowledge-based simulation with an alternate-worlds facility for performing what-if excursions. Responses from the system take the form of generated text, tables, or graphs. The natural language interface is an expert system in its own right, with a knowledge base and rules which understand how to access external databases, models, or expert systems. The distinguishing feature of the Air Force expert system is its use of meta-knowledge to generate explanations in the frame- and procedure-based environment.

  9. Quantifying the predictive consequences of model error with linear subspace analysis

    USGS Publications Warehouse

    White, Jeremy T.; Doherty, John E.; Hughes, Joseph D.

    2014-01-01

    All computer models are simplified and imperfect simulators of complex natural systems. The discrepancy arising from simplification induces bias in model predictions, which may be amplified by the process of model calibration. This paper presents a new method to identify and quantify the predictive consequences of calibrating a simplified computer model. The method is based on linear theory, and it scales efficiently to the large numbers of parameters and observations characteristic of groundwater and petroleum reservoir models. The method is applied to a range of predictions made with a synthetic integrated surface-water/groundwater model with thousands of parameters. Several different observation processing strategies and parameterization/regularization approaches are examined in detail, including use of the Karhunen-Loève parameter transformation. Predictive bias arising from model error is shown to be prediction specific and often invisible to the modeler. The amount of calibration-induced bias is influenced by several factors, including how expert knowledge is applied in the design of parameterization schemes, the number of parameters adjusted during calibration, how observations and model-generated counterparts are processed, and the level of fit with observations achieved through calibration. Failure to properly implement any of these factors in a prediction-specific manner may increase the potential for predictive bias in ways that are not visible to the calibration and uncertainty analysis process.

  10. Farm-level economics of innovative tillage technologies: the case of no-till in the Altai Krai in Russian Siberia.

    PubMed

    Bavorova, Miroslava; Imamverdiyev, Nizami; Ponkina, Elena

    2018-01-01

In the agricultural Altai Krai in Russian Siberia, soil degradation problems are prevalent. Agronomists recommend "reduced tillage systems," especially no-till, as a sustainable way to cultivate land that is threatened by soil degradation. In the Altai Krai, however, little is known about the technologies used in practice. In this paper, we provide information on plant cultivation technologies used in the Altai Krai and on selected factors preventing farm managers in this region from adopting no-till technology, based on our own quantitative survey conducted across 107 farms in 2015 and 2016. The results of the quantitative survey show that farm managers face high uncertainty regarding the use of no-till technology, including its economics. To close this gap, we provide a systematic analysis of factors influencing the economics of plant production systems, using a farm optimization model (linear programming) for a real farm together with expert estimations. The farm-specific results of the optimization model show that under optimal management and climatic conditions, the expert-specified modern Canadian no-till technology outperforms the farm's min-till technology, but this is not the case under suboptimal conditions with lower yields.
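A toy illustration of the kind of linear-programming farm model described (crops, margins, and constraints below are invented, not the survey's data): a two-activity allocation can be solved by enumerating the vertices of the feasible region, since a linear objective attains its optimum at a vertex.

```python
from itertools import combinations

# Hypothetical two-crop allocation: choose hectares of wheat (x1) and
# sunflower (x2) to maximize gross margin, subject to land and machine-hour
# limits. All figures are illustrative.
#   maximize 120*x1 + 150*x2
#   s.t.  x1 + x2     <= 100   (hectares of land)
#         2*x1 + 4*x2 <= 300   (machine hours)
#         x1, x2      >= 0
A = [(1.0, 1.0), (2.0, 4.0), (-1.0, 0.0), (0.0, -1.0)]
b = [100.0, 300.0, 0.0, 0.0]
c = (120.0, 150.0)

def vertices(A, b):
    """Intersect every pair of constraint lines; keep the feasible points."""
    for (a1, b1), (a2, b2) in combinations(zip(A, b), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:
            continue                     # parallel constraints, no vertex
        x = (b1 * a2[1] - a1[1] * b2) / det
        y = (a1[0] * b2 - b1 * a2[0]) / det
        if all(ai[0] * x + ai[1] * y <= bi + 1e-9 for ai, bi in zip(A, b)):
            yield (x, y)

# A linear objective is maximized at a vertex of the feasible polygon.
best = max(vertices(A, b), key=lambda v: c[0] * v[0] + c[1] * v[1])
margin = c[0] * best[0] + c[1] * best[1]
```

Real farm models of this kind have many activities and constraints and are solved with a simplex or interior-point solver rather than vertex enumeration; the structure of the problem is the same.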

  11. Comparing Cognitive Models of Domain Mastery and Task Performance in Algebra: Validity Evidence for a State Assessment

    ERIC Educational Resources Information Center

    Warner, Zachary B.

    2013-01-01

    This study compared an expert-based cognitive model of domain mastery with student-based cognitive models of task performance for Integrated Algebra. Interpretations of student test results are limited by experts' hypotheses of how students interact with the items. In reality, the cognitive processes that students use to solve each item may be…

  12. Testing Expert-Based versus Student-Based Cognitive Models for a Grade 3 Diagnostic Mathematics Assessment

    ERIC Educational Resources Information Center

    Roduta Roberts, Mary; Alves, Cecilia B.; Chu, Man-Wai; Thompson, Margaret; Bahry, Louise M.; Gotzmann, Andrea

    2014-01-01

    The purpose of this study was to evaluate the adequacy of three cognitive models, one developed by content experts and two generated from student verbal reports for explaining examinee performance on a grade 3 diagnostic mathematics test. For this study, the items were developed to directly measure the attributes in the cognitive model. The…

  13. Expert knowledge maps for knowledge management: a case study in Traditional Chinese Medicine research.

    PubMed

    Cui, Meng; Yang, Shuo; Yu, Tong; Yang, Ce; Gao, Yonghong; Zhu, Haiyan

    2013-10-01

    To design a model to capture information on the state and trends of knowledge creation, at both an individual and an organizational level, in order to enhance knowledge management. We designed a graph-theoretic knowledge model, the expert knowledge map (EKM), based on literature-based annotation. A case study in the domain of Traditional Chinese Medicine research was used to illustrate the usefulness of the model. The EKM successfully captured various aspects of knowledge and enhanced knowledge management within the case-study organization through the provision of knowledge graphs, expert graphs, and expert-knowledge biography. Our model could help to reveal the hot topics, trends, and products of the research done by an organization. It can potentially be used to facilitate knowledge learning, sharing and decision-making among researchers, academicians, students, and administrators of organizations.

  14. Expert systems for automated maintenance of a Mars oxygen production system

    NASA Astrophysics Data System (ADS)

    Huang, Jen-Kuang; Ho, Ming-Tsang; Ash, Robert L.

    1992-08-01

Application of expert system concepts to a breadboard Mars oxygen processor unit has been studied and tested. The research was directed toward developing the methodology required to enable autonomous operation and control of these simple chemical processors at Mars. Failure detection and isolation was the key area of concern, and schemes using forward chaining, backward chaining, knowledge-based expert systems, and rule-based expert systems were examined. Tests and simulations were conducted that investigated self-health checkout, emergency shutdown, and fault detection, in addition to normal control activities. A dynamic system model was developed using the bond-graph technique. The dynamic model agreed well with tests involving sudden reductions in throughput. However, nonlinear effects were observed during tests that incorporated step-function increases in flow variables. Computer simulations and experiments have demonstrated the feasibility of expert systems utilizing rule-based diagnosis and decision-making algorithms.
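A rule-based diagnosis scheme of the forward-chaining kind mentioned can be sketched as follows; the sensor facts and rules are hypothetical stand-ins, not the actual processor knowledge base.

```python
# Minimal forward-chaining sketch: facts are asserted from sensor readings,
# and rules fire repeatedly until no new conclusions can be derived.
# Each rule is (set of premises, conclusion); all names are illustrative.
RULES = [
    ({"pressure_drop"}, "flow_low"),
    ({"flow_low", "pump_on"}, "blockage_suspected"),
    ({"temp_high", "blockage_suspected"}, "shutdown_required"),
]

def forward_chain(facts, rules):
    """Derive the closure of `facts` under `rules` (fires to a fixed point)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"pressure_drop", "pump_on", "temp_high"}, RULES)
```

Backward chaining would instead start from a goal such as `shutdown_required` and work back through rule premises to the sensor facts.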

  15. Designing across ages: Multi-agent-based models and learning electricity

    NASA Astrophysics Data System (ADS)

    Sengupta, Pratim

Electricity is regarded as one of the most challenging topics for students at all levels, from middle school to college (Cohen, Eylon, & Ganiel, 1983; Belcher & Olbert, 2003; Eylon & Ganiel, 1990; Steinberg et al., 1985). Several researchers have suggested that naive misconceptions about electricity stem from a deep incommensurability (Slotta & Chi, 2006; Chi, 2005) or incompatibility (Chi, Slotta & de Leeuw, 1994; Reiner, Slotta, Chi, & Resnick, 2000) between naive and expert knowledge structures. I first present an alternative theoretical framework that adopts an emergent levels-based perspective as proposed by Wilensky & Resnick (1999). From this perspective, macro-level phenomena such as electric current and resistance, as well as the behavior of linear electric circuits, can be conceived of as emergent from simple, body-syntonic interactions between electrons and ions in a circuit. I argue that adopting such a perspective enables us to reconceive commonly noted misconceptions in electricity as behavioral evidence of "slippage between levels": these misconceptions appear when otherwise productive knowledge elements are inappropriately activated by macro-level phenomenological cues alone, whereas the same knowledge elements, when activated by phenomenological cues at both micro- and macro-levels, can engender a deeper, expert-like understanding. I will then introduce NIELS (NetLogo Investigations In Electromagnetism; Sengupta & Wilensky, 2006, 2008, 2009), a low-threshold, high-ceiling (LTHC) learning environment of multi-agent-based computational models that represent phenomena such as electric current and resistance, as well as the behavior of linear electric circuits, as emergent.
I also present results from implementations of NIELS in 5th, 7th and 12th grade classrooms that show the following: (a) how leveraging certain "design elements" over others in NIELS models can create new phenomenological cues, which in turn can be appropriated for learners in different grades; (b) how learners' existing knowledge structures can be bootstrapped to generate deep understanding; (c) how these knowledge structures evolve as the learners progress through the implemented curriculum; (d) improvement of learners' understanding in the post-test compared to the pre-test; and (e) how NIELS students compare with a comparison group of 12th grade students who underwent traditional classroom instruction.

  16. Perception of low dose radiation risks among radiation researchers in Korea.

    PubMed

    Seong, Ki Moon; Kwon, TaeWoo; Seo, Songwon; Lee, Dalnim; Park, Sunhoo; Jin, Young Woo; Lee, Seung-Sook

    2017-01-01

Experts' risk evaluations of radiation exposure strongly influence the public's risk perception. Experts can inform laypersons of significant radiation information, including health knowledge based on experimental data. However, some experts' radiation risk perception is based on non-conclusive scientific evidence (i.e., for radiation levels below 100 millisievert), which is currently under debate. Examining perception levels among experts is important for communication with the public, since these individuals' opinions have often exacerbated the public's confusion. We conducted a survey of Korean radiation researchers to investigate their perceptions of the risks associated with radiation exposure below 100 millisievert. A linear regression analysis revealed that having ≥ 11 years' research experience was a critical factor, inversely associated with radiation risk perception. Increased opportunities to understand radiation effects at < 100 millisievert could alter the public's risk perception of radiation exposure. In addition, radiation researchers believed that more scientific evidence reducing the uncertainty about radiation effects < 100 millisievert is necessary for successful public communication. We concluded that sustained education addressing scientific findings is a critical attribute that will affect the risk perception of radiation exposure.

  17. Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models

    PubMed Central

    2017-01-01

    We describe a fully data driven model that learns to perform a retrosynthetic reaction prediction task, which is treated as a sequence-to-sequence mapping problem. The end-to-end trained model has an encoder–decoder architecture that consists of two recurrent neural networks, which has previously shown great success in solving other sequence-to-sequence prediction tasks such as machine translation. The model is trained on 50,000 experimental reaction examples from the United States patent literature, which span 10 broad reaction types that are commonly used by medicinal chemists. We find that our model performs comparably with a rule-based expert system baseline model, and also overcomes certain limitations associated with rule-based expert systems and with any machine learning approach that contains a rule-based expert system component. Our model provides an important first step toward solving the challenging problem of computational retrosynthetic analysis. PMID:29104927

  18. Comparing the performance of expert user heuristics and an integer linear program in aircraft carrier deck operations.

    PubMed

    Ryan, Jason C; Banerjee, Ashis Gopal; Cummings, Mary L; Roy, Nicholas

    2014-06-01

    Planning operations across a number of domains can be considered as resource allocation problems with timing constraints. An unexplored instance of such a problem domain is the aircraft carrier flight deck, where, in current operations, replanning is done without the aid of any computerized decision support. Rather, veteran operators employ a set of experience-based heuristics to quickly generate new operating schedules. These expert user heuristics are neither codified nor evaluated by the United States Navy; they have grown solely from the convergent experiences of supervisory staff. As unmanned aerial vehicles (UAVs) are introduced in the aircraft carrier domain, these heuristics may require alterations due to differing capabilities. The inclusion of UAVs also allows for new opportunities for on-line planning and control, providing an alternative to the current heuristic-based replanning methodology. To investigate these issues formally, we have developed a decision support system for flight deck operations that utilizes a conventional integer linear program-based planning algorithm. In this system, a human operator sets both the goals and constraints for the algorithm, which then returns a proposed schedule for operator approval. As a part of validating this system, the performance of this collaborative human-automation planner was compared with that of the expert user heuristics over a set of test scenarios. The resulting analysis shows that human heuristics often outperform the plans produced by an optimization algorithm, but are also often more conservative.
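The heuristic-versus-optimizer comparison can be illustrated on a toy deck-scheduling instance (task durations invented): an expert-style "longest task first" heuristic is compared against exhaustive search, which stands in for the exact optimizer on an instance this small.

```python
from itertools import permutations

# Toy stand-in for the comparison described: assign launch-preparation tasks
# (durations in minutes, illustrative) to two catapults so as to minimize
# the time at which the last task finishes (the makespan).
tasks = [7, 5, 4, 3, 3, 2]

def makespan(order, machines=2):
    """List-schedule tasks in the given order onto the least-loaded machine."""
    loads = [0] * machines
    for t in order:
        i = loads.index(min(loads))
        loads[i] += t
    return max(loads)

# Expert-style heuristic: schedule the longest tasks first.
heuristic = makespan(sorted(tasks, reverse=True))

# Exact answer for this tiny instance: try every task ordering.
optimal = min(makespan(p) for p in permutations(tasks))
```

On this instance the heuristic matches the optimum (both reach the lower bound of half the total work); the study's point is that on realistic instances the comparison can go either way, and the heuristic is often less brittle.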

  19. Multivariate modelling of prostate cancer combining magnetic resonance derived T2, diffusion, dynamic contrast-enhanced and spectroscopic parameters.

    PubMed

    Riches, S F; Payne, G S; Morgan, V A; Dearnaley, D; Morgan, S; Partridge, M; Livni, N; Ogden, C; deSouza, N M

    2015-05-01

The objectives were to determine the optimal combination of MR parameters for discriminating tumour within the prostate using linear discriminant analysis (LDA), and to compare model accuracy with that of an experienced radiologist. Multiparameter MRIs were acquired in 24 patients before prostatectomy. Tumour outlines from whole-mount histology, the T2-defined peripheral zone (PZ), and the central gland (CG) were superimposed onto slice-matched parametric maps. T2, apparent diffusion coefficient, initial area under the gadolinium curve, vascular parameters (K(trans), Kep, Ve), and (choline+polyamines+creatine)/citrate were compared between tumour and non-tumour tissues. Receiver operating characteristic (ROC) curves determined sensitivity and specificity at spectroscopic voxel resolution and per lesion, and LDA determined the optimal multiparametric model for identifying tumours. Accuracy was compared with an expert observer. Tumours were significantly different from PZ and CG for all parameters (all p < 0.001). The area under the ROC curve for discriminating tumour from non-tumour was significantly greater (p < 0.001) for the multiparametric model than for individual parameters; at 90% specificity, sensitivity was 41% (MRSI voxel resolution) and 59% per lesion. At this specificity, an expert observer achieved 28% and 49% sensitivity, respectively. The model was more accurate when parameters from all techniques were included and performed better than an expert observer evaluating these data. • The combined model increases diagnostic accuracy in prostate cancer compared with individual parameters • The optimal combined model includes parameters from diffusion, spectroscopy, perfusion, and anatomical MRI • The computed model improves tumour detection compared to an expert viewing parametric maps.
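A minimal sketch of the two-class Fisher LDA at the heart of such a combined model, on synthetic two-feature data rather than patient MR parameters (all values invented): each row is a voxel's feature vector, labelled tumour (1) or non-tumour (0).

```python
import numpy as np

# Synthetic stand-ins for two MR-derived features (e.g. T2 and ADC values),
# drawn from two overlapping Gaussian classes.
rng = np.random.default_rng(0)
non_tumour = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2))
tumour = rng.normal(loc=[1.5, 1.0], scale=0.5, size=(200, 2))
X = np.vstack([non_tumour, tumour])
y = np.array([0] * 200 + [1] * 200)

# Fisher discriminant direction: w = pooled_covariance^-1 (mu1 - mu0).
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
pooled_cov = 0.5 * (np.cov(X[y == 0].T) + np.cov(X[y == 1].T))
w = np.linalg.solve(pooled_cov, mu1 - mu0)

# Classify by projecting onto w and thresholding at the midpoint of the
# projected class means; shifting the threshold trades sensitivity for
# specificity, tracing out the ROC curve.
threshold = 0.5 * ((X[y == 0] @ w).mean() + (X[y == 1] @ w).mean())
accuracy = ((X @ w > threshold).astype(int) == y).mean()
```

The study's model is the same construction with more features (diffusion, perfusion, spectroscopy) and a threshold chosen to fix specificity at 90%.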

  20. Image-based teleconsultation using smartphones or tablets: qualitative assessment of medical experts.

    PubMed

    Boissin, Constance; Blom, Lisa; Wallis, Lee; Laflamme, Lucie

    2017-02-01

Mobile health has promising potential in improving healthcare delivery by facilitating access to expert advice. Enabling experts to review images on their smartphone or tablet may save valuable time. This study aims at assessing whether images viewed by medical specialists on handheld devices such as smartphones and tablets are perceived to be of comparable quality as when viewed on a computer screen. This was a prospective study comparing the perceived quality of 18 images on three different display devices (smartphone, tablet and computer) by 27 participants (4 burn surgeons and 23 emergency medicine specialists). The images, presented in random order, covered clinical (dermatological conditions, burns, ECGs and X-rays) and non-clinical subjects and their perceived quality was assessed using a 7-point Likert scale. Differences in devices' quality ratings were analysed using linear regression models for clustered data adjusting for image type and participants' characteristics (age, gender and medical specialty). Overall, the images were rated good or very good in most instances and more so for the smartphone (83.1%, mean score 5.7) and tablet (78.2%, mean 5.5) than for a standard computer (70.6%, mean 5.2). Both handheld devices had significantly higher ratings than the computer screen, even after controlling for image type and participants' characteristics. Nearly all experts expressed that they would be comfortable using smartphones (n=25) or tablets (n=26) for image-based teleconsultation. This study suggests that handheld devices could be a substitute for computer screens for teleconsultation by physicians working in emergency settings. Published by the BMJ Publishing Group Limited.

  1. Does a web-based feedback training program result in improved reliability in clinicians' ratings of the Global Assessment of Functioning (GAF) Scale?

    PubMed

    Støre-Valen, Jakob; Ryum, Truls; Pedersen, Geir A F; Pripp, Are H; Jose, Paul E; Karterud, Sigmund

    2015-09-01

The Global Assessment of Functioning (GAF) Scale is used in routine clinical practice and research to estimate symptom and functional severity and longitudinal change. Concerns about poor interrater reliability have been raised, and the present study evaluated the effect of a Web-based GAF training program designed to improve interrater reliability in routine clinical practice. Clinicians rated up to 20 vignettes online, and received deviation scores as immediate feedback (i.e., own scores compared with expert raters) after each rating. Growth curves of absolute deviation scores across the vignettes were modeled. A linear mixed effects model, using the clinician's deviation scores from expert raters as the dependent variable, indicated an improvement in reliability during training. Moderation by content of scale (symptoms; functioning), scale range (average; extreme), previous experience with GAF rating, profession, and postgraduate training were assessed. Training reduced deviation scores for inexperienced GAF raters, for individuals in clinical professions other than nursing and medicine, and for individuals with no postgraduate specialization. In addition, training was most beneficial for cases with average severity of symptoms compared with cases with extreme severity. The results support the use of Web-based training with feedback routines as a means to improve the reliability of GAF ratings performed by clinicians in mental health practice. These results especially pertain to clinicians in mental health practice who do not have a masters or doctoral degree.

  2. Expert judgement and uncertainty quantification for climate change

    NASA Astrophysics Data System (ADS)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.
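A toy sketch in the spirit of the reweighting step in probabilistic inversion (all numbers invented): reweight an ensemble of ice-loss samples by exponential tilting so that the weighted mean matches an expert-elicited value, solving for the tilt parameter by bisection.

```python
import math

# Hypothetical ensemble of twenty-first-century ice-loss contributions
# (cm of sea-level rise, illustrative), to be reweighted so the weighted
# mean matches an expert-elicited target. Weights w_i ∝ exp(t * x_i) are
# the minimum-cross-entropy solution for a mean constraint.
samples = [2.0, 4.0, 6.0, 8.0, 10.0]
target_mean = 7.0

def tilted_mean(t):
    """Weighted mean of the samples under tilt parameter t."""
    w = [math.exp(t * x) for x in samples]
    return sum(wi * xi for wi, xi in zip(w, samples)) / sum(w)

# The tilted mean is increasing in t, so bisection finds the matching tilt.
lo, hi = -5.0, 5.0
for _ in range(60):
    mid = (lo + hi) / 2.0
    if tilted_mean(mid) < target_mean:
        lo = mid
    else:
        hi = mid
t = (lo + hi) / 2.0
```

Full probabilistic inversion fuses constraints on several quantiles or variables at once (e.g. by iterative proportional fitting), but each step has this same flavour: adjust ensemble weights until expert-specified statistics are reproduced.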

  3. Assessing uncertainty in sighting records: an example of the Barbary lion.

    PubMed

    Lee, Tamsin E; Black, Simon A; Fellous, Amina; Yamaguchi, Nobuyuki; Angelici, Francesco M; Al Hikmani, Hadi; Reed, J Michael; Elphick, Chris S; Roberts, David L

    2015-01-01

As species become rare and approach extinction, purported sightings can be controversial, especially when scarce management resources are at stake. We consider the probability that each individual sighting of a series is valid. Obtaining these probabilities requires a strict framework to ensure that they are as accurately representative as possible. We used a process, which has been proven to provide accurate estimates from a group of experts, to obtain probabilities for the validation of 32 sightings of the Barbary lion. We consider the scenario where experts are simply asked whether a sighting was valid, as well as one where they are asked to score the sighting based on distinguishability, observer competence, and verifiability. We find that asking experts to provide scores for these three aspects resulted in each sighting being considered more individually, meaning that this new questioning method provides very different estimated probabilities that a sighting is valid, which greatly affects the outcome from an extinction model. We consider linear opinion pooling and logarithmic opinion pooling to combine the three scores, and also to combine opinions on each sighting. We find the two methods produce similar outcomes, allowing the user to focus on chosen features of each method, such as satisfying the marginalisation property or being externally Bayesian.
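The two pooling rules compared can be written down directly; the expert probabilities below are hypothetical, not the study's elicited values.

```python
import math

def linear_pool(probs, weights=None):
    """Linear opinion pool: weighted arithmetic mean of expert probabilities."""
    w = weights or [1.0 / len(probs)] * len(probs)
    return sum(wi * pi for wi, pi in zip(w, probs))

def log_pool(probs, weights=None):
    """Logarithmic opinion pool: weighted geometric mean, renormalized so
    that P(valid) + P(invalid) = 1."""
    w = weights or [1.0 / len(probs)] * len(probs)
    p_valid = math.exp(sum(wi * math.log(pi) for wi, pi in zip(w, probs)))
    p_invalid = math.exp(sum(wi * math.log(1.0 - pi) for wi, pi in zip(w, probs)))
    return p_valid / (p_valid + p_invalid)

# Three experts' probabilities that a single sighting is valid (illustrative).
experts = [0.9, 0.6, 0.7]
```

The linear pool satisfies the marginalisation property, while the logarithmic pool is externally Bayesian; for agreeing experts the logarithmic pool is the more extreme of the two.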

  4. Example-based learning: effects of model expertise in relation to student expertise.

    PubMed

    Boekhout, Paul; van Gog, Tamara; van de Wiel, Margje W J; Gerards-Last, Dorien; Geraets, Jacques

    2010-12-01

Worked examples are very effective for novice learners. They typically present a written-out ideal (didactical) solution for learners to study. This study used worked examples of patient history taking in physiotherapy that presented a non-didactical solution (i.e., based on actual performance). The effects of model expertise (i.e., worked example based on an advanced, third-year student model or an expert physiotherapist model) in relation to students' expertise (i.e., first- or second-year) were investigated. One hundred and thirty-four physiotherapy students (61 first-year and 73 second-year) participated. The design was a 2 × 2 factorial with factors 'Student Expertise' (first-year vs. second-year) and 'Model Expertise' (expert vs. advanced student). Within expertise levels, students were randomly assigned to the Expert Example or the Advanced Student Example condition. All students studied two examples (content depending on their assigned condition) and then completed a retention and a transfer task. They rated their invested mental effort after each example and test task. Second-year students invested less mental effort in studying the examples, and in performing the retention and transfer tasks, than first-year students. They also performed better on the retention test, but not on the transfer test. In contrast to our hypothesis, there was no interaction between student expertise and model expertise: all students who had studied the Expert Examples performed better on the transfer test than students who had studied Advanced Student Examples. This study suggests that when worked examples are based on actual performance, rather than an ideal procedure, expert models are to be preferred over advanced student models.

  5. Use of the self-organising map network (SOMNet) as a decision support system for regional mental health planning.

    PubMed

    Chung, Younjin; Salvador-Carulla, Luis; Salinas-Pérez, José A; Uriarte-Uriarte, Jose J; Iruin-Sanz, Alvaro; García-Alonso, Carlos R

    2018-04-25

Decision-making in mental health systems should be supported by the evidence-informed knowledge transfer of data. Since mental health systems are inherently complex, involving interactions between its structures, processes and outcomes, decision support systems (DSS) need to be developed using advanced computational methods and visual tools to allow full system analysis, whilst incorporating domain experts in the analysis process. In this study, we use a DSS model developed for interactive data mining and domain expert collaboration in the analysis of complex mental health systems to improve system knowledge and evidence-informed policy planning. We combine an interactive visual data mining approach, the self-organising map network (SOMNet), with an operational expert knowledge approach, expert-based collaborative analysis (EbCA), to develop a DSS model. The SOMNet was applied to the analysis of healthcare patterns and indicators of three different regional mental health systems in Spain, comprising 106 small catchment areas and providing healthcare for over 9 million inhabitants. Based on the EbCA, the domain experts in the development team guided and evaluated the analytical processes and results. Another group of 13 domain experts in mental health systems planning and research evaluated the model based on the analytical information of the SOMNet approach for processing information and discovering knowledge in a real-world context. Through the evaluation, the domain experts assessed the feasibility and technology readiness level (TRL) of the DSS model. The SOMNet, combined with the EbCA, effectively processed evidence-based information when analysing system outliers, explaining global and local patterns, and refining key performance indicators with their analytical interpretations. The evaluation results showed that the DSS model was judged feasible by the domain experts and reached level 7 of the TRL (system prototype demonstration in operational environment). 
This study supports the benefits of combining health systems engineering (SOMNet) and expert knowledge (EbCA) to analyse the complexity of health systems research. The use of the SOMNet approach contributes to the demonstration of DSS for mental health planning in practice.
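A minimal one-dimensional self-organising map can convey the core mechanism, in the spirit of the SOMNet approach but not the tool itself: catchment areas described by two normalized indicators are mapped onto a line of units so that similar areas activate nearby units. Data, map size, and learning schedule below are all illustrative.

```python
# Four hypothetical catchment areas, each described by two normalized
# indicators, and a one-dimensional map of four units with hand-picked
# initial weight vectors (deterministic for reproducibility).
data = [(0.1, 0.2), (0.15, 0.25), (0.8, 0.9), (0.85, 0.95)]
units = [[0.2, 0.2], [0.4, 0.4], [0.6, 0.6], [0.8, 0.8]]

def bmu_of(x):
    """Index of the best-matching unit: the nearest weight vector."""
    return min(range(len(units)),
               key=lambda i: sum((units[i][d] - x[d]) ** 2 for d in range(2)))

def train(epochs=200, lr=0.3):
    for epoch in range(epochs):
        radius = 1 if epoch < epochs // 2 else 0   # shrink the neighbourhood
        for x in data:
            b = bmu_of(x)
            for i, w in enumerate(units):
                if abs(i - b) <= radius:           # move BMU and neighbours toward x
                    for d in range(2):
                        w[d] += lr * (x[d] - w[d])

train()
```

After training, the two low-indicator areas and the two high-indicator areas land on opposite ends of the unit line, which is the property that makes SOM output maps readable to domain experts.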

  6. MOAB: a spatially explicit, individual-based expert system for creating animal foraging models

    USGS Publications Warehouse

    Carter, J.; Finn, John T.

    1999-01-01

    We describe the development, structure, and corroboration process of a simulation model of animal behavior (MOAB). MOAB can create spatially explicit, individual-based animal foraging models. Users can create or replicate heterogeneous landscape patterns, and place resources and individual animals of a given species on that landscape to simultaneously simulate the foraging behavior of multiple species. The heuristic rules for animal behavior are maintained in a user-modifiable expert system. MOAB can be used to explore hypotheses concerning the influence of landscape pattern on animal movement and foraging behavior. A red fox (Vulpes vulpes L.) foraging and nest predation model was created to test MOAB's capabilities. Foxes were simulated for 30-day periods using both expert system and random movement rules. Home range size, territory formation and movement patterns were compared with other available simulation studies. A striped skunk (Mephitis mephitis L.) model also was developed. The expert system model proved superior to the stochastic model with respect to territory formation, general movement patterns and home range size.
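
    MOAB itself is a full expert-system shell, but the core contrast it tests, rule-guided versus random foraging movement, can be illustrated with a toy grid simulation. Everything here (grid size, resource density, the single "move toward the nearest resource" rule) is an illustrative assumption, not MOAB's actual rule base:

    ```python
    import random

    def simulate(steps=200, size=21, rule_based=True, seed=1):
        """Toy forager on a grid: either a random walk or a simple
        expert-system-style rule (move toward the nearest resource).
        All parameters are illustrative, not taken from MOAB."""
        rng = random.Random(seed)
        resources = {(rng.randrange(size), rng.randrange(size)) for _ in range(40)}
        x = y = size // 2
        visited, eaten = {(x, y)}, 0
        for _ in range(steps):
            if rule_based and resources:
                # heuristic rule: step toward the nearest resource
                tx, ty = min(resources, key=lambda r: abs(r[0] - x) + abs(r[1] - y))
                dx = (tx > x) - (tx < x)
                dy = (ty > y) - (ty < y)
            else:
                dx, dy = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            x = min(max(x + dx, 0), size - 1)
            y = min(max(y + dy, 0), size - 1)
            visited.add((x, y))
            if (x, y) in resources:
                resources.discard((x, y))
                eaten += 1
        return eaten, len(visited)  # foraging success, home-range proxy
    ```

    Comparing `simulate(rule_based=True)` with `simulate(rule_based=False)` on the same landscape mirrors the fox experiment's comparison of expert-system and random movement rules.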

  7. Quantifying and predicting interpretational uncertainty in cross-sections

    NASA Astrophysics Data System (ADS)

    Randle, Charles; Bond, Clare; Monaghan, Alison; Lark, Murray

    2015-04-01

    Cross-sections are often constructed from data to create a visual impression of the geologist's interpretation of the sub-surface geology. However, as with all interpretations, this vision of the sub-surface geology is uncertain. We have designed and carried out an experiment with the aim of quantifying the uncertainty in geological cross-sections created by experts interpreting borehole data. By analysing different attributes of the data and interpretations we reflect on the main controls on uncertainty. A group of ten expert modellers at the British Geological Survey were asked to interpret an 11.4 km long cross-section from south-east Glasgow, UK. The data provided consisted of map and borehole data of the superficial deposits and shallow bedrock. Each modeller had a unique set of 11 boreholes removed from their dataset, against which their interpretations of the top of the bedrock were compared. This methodology allowed quantification of how far from the 'correct answer' each interpretation is at 11 points along each interpreted cross-section line, through comparison of the interpreted and actual bedrock elevations in the boreholes. This resulted in the collection of 110 measurements of the error to use in further analysis. To determine the potential controls on uncertainty, various attributes relating to the modeller, the interpretation and the data were recorded. Modellers were asked to fill out a questionnaire asking for information such as how much 3D modelling experience they had, and how long it took them to complete the interpretation. They were also asked to record their confidence in their interpretations graphically, in the form of a confidence level drawn onto the cross-section. Initial analysis showed that the majority of the experts' interpreted bedrock elevations were within 5 metres of those recorded in the withheld boreholes.
Their distribution is peaked and symmetrical about a mean of zero, indicating that there was no tendency for the experts to either under- or over-estimate the elevation of the bedrock. More complex analysis was completed in the form of linear mixed effects modelling. The modelling was used to determine if there were any correlations between the error and any other parameter recorded in the questionnaire, section or the initial dataset. This has resulted in the determination of both data-based and interpreter-based controls on uncertainty, adding insight into how uncertainty can be predicted, as well as how interpretation workflows can be improved. Our results will inform further experiments across a wide variety of geological situations to build understanding and best practice workflows for cross-section interpretation to reduce uncertainty.

  8. How much expert knowledge is it worth to put in conceptual hydrological models?

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Zappa, Massimiliano

    2017-04-01

    Both modellers and experimentalists agree on using expert knowledge to improve our conceptual hydrological simulations on ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use very simplified (e.g. topography-based) mapping approaches and invest most of their knowledge in constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed and qualitative knowledge about processes to obtain a spatial distribution of areas with different dominant runoff generation processes (DRPs) that is as realistic as possible, and to define plausible narrow value ranges for each model parameter. Since, most of the time, the modelling goal is exclusively to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfying results, owing to the equifinality of hydrological models, overfitting problems and the numerous sources of uncertainty affecting runoff simulations. Therefore, to test to what extent expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' framework, relying on parameter and process constraints defined from expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. Simulation results highlighted the potential added value of using all the expert knowledge available on a catchment. Combinations of event types and landscapes where even a simplified mapping approach can lead to satisfying results were also identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to meteorological input data and catchment initial conditions.

  9. Modeling of driver's collision avoidance maneuver based on controller switching model.

    PubMed

    Kim, Jong-Hae; Hayakawa, Soichiro; Suzuki, Tatsuya; Hayashi, Koji; Okuma, Shigeru; Tsuchida, Nuio; Shimizu, Masayuki; Kido, Shigeyuki

    2005-12-01

    This paper presents a modeling strategy for human driving behavior based on a controller switching model, focusing on the driver's collision avoidance maneuver. The driving data are collected using a three-dimensional (3-D) driving simulator based on the CAVE Automatic Virtual Environment (CAVE), which provides a stereoscopic immersive virtual environment. In our modeling, the control scenario of the human driver, that is, the mapping from the driver's sensory information to operations such as acceleration, braking, and steering, is expressed by a Piecewise Polynomial (PWP) model. Since the PWP model includes both continuous behaviors given by polynomials and discrete logical conditions, it can be regarded as a class of Hybrid Dynamical System (HDS). The identification problem for the PWP model is formulated as a Mixed Integer Linear Programming (MILP) problem by transforming the switching conditions into binary variables. From the obtained results, it is found that the driver appropriately switches the "control law" according to the sensory information. In addition, the driving characteristics of beginner and expert drivers are compared and discussed. These results enable us to capture not only the physical meaning of the driving skill but also the decision-making aspect (switching conditions) of the driver's collision avoidance maneuver.
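
    The switching structure described above, polynomial control laws selected by discrete logical conditions, can be sketched as a simple piecewise controller. The thresholds and control laws below are illustrative placeholders, not the identified PWP model from the study:

    ```python
    def collision_avoidance_control(gap, closing_speed):
        """Hypothetical switching controller in the spirit of a piecewise
        polynomial (PWP) model: each region of the sensory space selects
        a different control law.  Thresholds and gains are illustrative.
        gap: distance to obstacle [m]; closing_speed: approach rate [m/s]."""
        # discrete logical condition: time-to-collision partitions the state space
        ttc = gap / closing_speed if closing_speed > 0 else float("inf")
        if ttc > 5.0:                    # safe region: mild cruise control
            mode, accel = "cruise", 0.5
        elif ttc > 2.0:                  # caution region: brake ~ urgency (linear polynomial)
            mode, accel = "brake", -2.0 * (5.0 - ttc)
        else:                            # emergency region: full braking / evasion
            mode, accel = "evade", -8.0
        return mode, accel
    ```

    In the paper such region boundaries and polynomial coefficients are not hand-set but identified from driving data by encoding the switching conditions as binary variables in a MILP.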

  10. Model of critical diagnostic reasoning: achieving expert clinician performance.

    PubMed

    Harjai, Prashant Kumar; Tiwari, Ruby

    2009-01-01

    Diagnostic reasoning refers to the analytical processes used to determine patient health problems. While the education curriculum and health care system focus on training nurse clinicians to accurately recognize and rescue clinical situations, assessments of non-expert nurses have yielded less than satisfactory data on diagnostic competency. The contrast between the expert and non-expert nurse clinician raises the important question of how differences in thinking may contribute to a large divergence in accurate diagnostic reasoning. This article recognizes superior organization of one's knowledge base, using prototypes, and quick retrieval of pertinent information, using similarity recognition as two reasons for the expert's superior diagnostic performance. A model of critical diagnostic reasoning, using prototypes and similarity recognition, is proposed and elucidated using case studies. This model serves as a starting point toward bridging the gap between clinical data and accurate problem identification, verification, and management while providing a structure for a knowledge exchange between expert and non-expert clinicians.

  11. Understanding the Life Cycle of Computer-Based Models: The Role of Expert Contributions in Design, Development and Implementation

    ERIC Educational Resources Information Center

    Waight, Noemi; Liu, Xiufeng; Gregorius, Roberto Ma.

    2015-01-01

    This paper examined the nuances of the background process of design and development and follow up classroom implementation of computer-based models for high school chemistry. More specifically, the study examined the knowledge contributions of an interdisciplinary team of experts; points of tensions, negotiations and non-negotiable aspects of…

  12. Classification of microscopy images of Langerhans islets

    NASA Astrophysics Data System (ADS)

    Švihlík, Jan; Kybic, Jan; Habart, David; Berková, Zuzana; Girman, Peter; Kříž, Jan; Zacharovová, Klára

    2014-03-01

    Evaluation of images of Langerhans islets is a crucial procedure for planning an islet transplantation, which is a promising diabetes treatment. This paper deals with segmentation of microscopy images of Langerhans islets and evaluation of islet parameters such as area, diameter, or volume (IE). For all the available images, the ground truth and the islet parameters were independently evaluated by four medical experts. We use a pixelwise linear classifier (perceptron algorithm) and SVM (support vector machine) for image segmentation. The volume is estimated based on circle or ellipse fitting to individual islets. The segmentations were compared with the corresponding ground truth. Quantitative islet parameters were also evaluated and compared with parameters given by medical experts. We can conclude that accuracy of the presented fully automatic algorithm is fully comparable with medical experts.
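
    As a rough sketch of two ingredients named above, a pixelwise linear classifier trained by the perceptron rule, and a volume estimate from an ellipse fit, consider the following; the feature vectors, learning rate and the prolate-spheroid volume formula are illustrative assumptions, not the paper's exact pipeline:

    ```python
    import math

    def train_perceptron(samples, labels, epochs=20, lr=0.1):
        """Pixelwise linear classifier sketch (perceptron rule).
        Features could be e.g. per-pixel colour channels; labels in {-1, +1}."""
        w = [0.0] * len(samples[0])
        b = 0.0
        for _ in range(epochs):
            for x, y in zip(samples, labels):
                # update only on misclassified (or boundary) pixels
                if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                    w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                    b += lr * y
        return w, b

    def predict(w, b, x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

    def islet_volume_from_ellipse(a, b):
        """Volume estimate from an ellipse fit with semi-axes a >= b:
        rotating the ellipse about its major axis gives a prolate spheroid,
        a common approximation (an assumption here, not the paper's formula)."""
        return 4.0 / 3.0 * math.pi * a * b ** 2
    ```

    For a = b the spheroid formula reduces to the volume of a sphere, which is a quick sanity check on the fit-to-volume step.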

  13. Knowledge-based fault diagnosis system for refuse collection vehicle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, CheeFai; Juffrizal, K.; Khalil, S. N.

    The refuse collection vehicle is manufactured by a local vehicle body manufacturer. Currently, the company supplies six models of the waste compactor truck to the local authority as well as to waste management companies. The company faces difficulty acquiring knowledge from the expert when the expert is absent. To solve the problem, the knowledge from the expert can be stored in an expert system, which is able to provide the necessary support to the company when the expert is not available. The implemented process and tool can then be standardized and made more accurate. The knowledge input to the expert system is based on design guidelines and experience from the expert. This project highlights another application of the knowledge-based system (KBS) approach in troubleshooting the refuse collection vehicle production process. The main aim of the research is to develop a novel expert fault diagnosis system framework for the refuse collection vehicle.
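
    A knowledge base of the kind described, expert experience encoded as symptom-to-fault rules, might look like the following minimal forward-chaining sketch; the rules and component names are invented for illustration, not taken from the actual system:

    ```python
    # Hypothetical knowledge base: symptom patterns -> probable fault.
    RULES = [
        ({"compactor_slow", "oil_temp_high"}, "worn hydraulic pump"),
        ({"compactor_slow"}, "low hydraulic oil level"),
        ({"tailgate_stuck", "valve_noise"}, "faulty control valve"),
    ]

    def diagnose(observed):
        """Fire the most specific rule whose symptom set is fully
        contained in the observed symptoms."""
        matches = [(len(cond), fault) for cond, fault in RULES
                   if cond <= set(observed)]
        if not matches:
            return "unknown fault - consult human expert"
        return max(matches)[1]  # prefer the rule matching the most symptoms
    ```

    Preferring the most specific matching rule is one common conflict-resolution strategy in such systems; the fallback answer makes explicit that the KBS supplements, rather than replaces, the human expert.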

  14. The Evonik-Mainz Eye Care-Study (EMECS): Development of an Expert System for Glaucoma Risk Detection in a Working Population

    PubMed Central

    Wahl, Jochen; Barleon, Lorenz; Morfeld, Peter; Lichtmeß, Andrea; Haas-Brähler, Sibylle; Pfeiffer, Norbert

    2016-01-01

    Purpose To develop an expert system for glaucoma screening in a working population based on a human expert procedure using images of optic nerve head (ONH), visual field (frequency doubling technology, FDT) and intraocular pressure (IOP). Methods 4167 of 13037 (32%) employees between 40 and 65 years of Evonik Industries were screened. An experienced glaucoma expert (JW) assessed papilla parameters and evaluated all individual screening results. His classification into “no glaucoma”, “possible glaucoma” and “probable glaucoma” was defined as “gold standard”. A screening model was developed which was tested versus the gold-standard. This model took into account the assessment of the ONH. Values and relationships of CDR and IOP and the FDT were considered additionally and a glaucoma score was generated. The structure of the screening model was specified a priori whereas values of the parameters were chosen post-hoc to optimize sensitivity and specificity of the algorithm. Simple screening models based on IOP and/or FDT were investigated for comparison. Results 111 persons (2.66%) were classified as glaucoma suspects, thereof 13 (0.31%) as probable and 98 (2.35%) as possible glaucoma suspects by the expert. Re-evaluation by the screening model revealed a sensitivity of 83.8% and a specificity of 99.6% for all glaucoma suspects. The positive predictive value of the model was 80.2%, the negative predictive value 99.6%. Simple screening models showed insufficient diagnostic accuracy. Conclusion Adjustment of ONH and symmetry parameters with respect to excavation and IOP in an expert system produced sufficiently satisfying diagnostic accuracy. This screening model seems to be applicable in such a working population with relatively low age and low glaucoma prevalence. Different experts should validate the model in different populations. PMID:27479301

  15. The Evonik-Mainz Eye Care-Study (EMECS): Development of an Expert System for Glaucoma Risk Detection in a Working Population.

    PubMed

    Wahl, Jochen; Barleon, Lorenz; Morfeld, Peter; Lichtmeß, Andrea; Haas-Brähler, Sibylle; Pfeiffer, Norbert

    2016-01-01

    To develop an expert system for glaucoma screening in a working population based on a human expert procedure using images of optic nerve head (ONH), visual field (frequency doubling technology, FDT) and intraocular pressure (IOP). 4167 of 13037 (32%) employees between 40 and 65 years of Evonik Industries were screened. An experienced glaucoma expert (JW) assessed papilla parameters and evaluated all individual screening results. His classification into "no glaucoma", "possible glaucoma" and "probable glaucoma" was defined as "gold standard". A screening model was developed which was tested versus the gold-standard. This model took into account the assessment of the ONH. Values and relationships of CDR and IOP and the FDT were considered additionally and a glaucoma score was generated. The structure of the screening model was specified a priori whereas values of the parameters were chosen post-hoc to optimize sensitivity and specificity of the algorithm. Simple screening models based on IOP and/or FDT were investigated for comparison. 111 persons (2.66%) were classified as glaucoma suspects, thereof 13 (0.31%) as probable and 98 (2.35%) as possible glaucoma suspects by the expert. Re-evaluation by the screening model revealed a sensitivity of 83.8% and a specificity of 99.6% for all glaucoma suspects. The positive predictive value of the model was 80.2%, the negative predictive value 99.6%. Simple screening models showed insufficient diagnostic accuracy. Adjustment of ONH and symmetry parameters with respect to excavation and IOP in an expert system produced sufficiently satisfying diagnostic accuracy. This screening model seems to be applicable in such a working population with relatively low age and low glaucoma prevalence. Different experts should validate the model in different populations.

  16. The Shrinkage Model And Expert System Of Plastic Lens Formation

    NASA Astrophysics Data System (ADS)

    Chang, Rong-Seng

    1988-06-01

    Shrinkage causes both appearance and dimension defects in injected plastic lenses. We have built a model of state equations, with the help of a finite element analysis program, to estimate the volume change (shrinkage and swelling) under combinations of injection variables such as pressure and temperature; a personal computer expert system has then been built to make that knowledge conveniently available to the user in model design, process planning, process operation and other work. The domain knowledge is represented by an R-graph (Relationship-graph) model which states the relationships between variables and equations. This model can be compared with other models in the expert system. If the user has a better model to solve the shrinkage problem, the program evaluates it automatically and a learning file is triggered by the expert system to help the user update the knowledge base and replace the old model with the better one. Rubin's model and Gilmore's model have been input to the expert system. Conflicts are resolved using input both from the user and from the deeper knowledge base. Examples of a cube prism and a convex lens are shown in this paper. The program is written in the muLISP language on an IBM PC-AT. The natural language interface provides English explanations of know-why and know-how, and automatic English translation of the equation rules and the production rules.

  17. Knowledge discovery from data and Monte-Carlo DEA to evaluate technical efficiency of mental health care in small health areas

    PubMed Central

    García-Alonso, Carlos; Pérez-Naranjo, Leonor

    2009-01-01

    Introduction Knowledge management, based on information transfer between experts and analysts, is crucial for the validity and usability of data envelopment analysis (DEA). Aim To design and develop a methodology: i) to assess technical efficiency of small health areas (SHA) in an uncertainty environment, and ii) to transfer information between experts and operational models, in both directions, for improving experts’ knowledge. Method A procedure derived from knowledge discovery from data (KDD) is used to select, interpret and weigh DEA inputs and outputs. Based on KDD results, an expert-driven Monte-Carlo DEA model has been designed to assess the technical efficiency of SHA in Andalusia. Results In terms of probability, SHA 29 is the most efficient, whereas SHA 22 is very inefficient. 73% of the analysed SHA have a probability of being efficient (Pe) > 0.9, and 18% a Pe < 0.5. Conclusions Expert knowledge is necessary to design and validate any operational model. KDD techniques make the transfer of information from experts to any operational model easy, and results obtained from the latter improve experts’ knowledge.
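
    In the single-input/single-output case, DEA's CCR efficiency reduces to each unit's output/input ratio divided by the best observed ratio, which makes a Monte-Carlo wrapper easy to sketch. The noise model, efficiency threshold and data below are illustrative assumptions, not the Andalusian SHA model:

    ```python
    import random

    def efficiency(outputs, inputs):
        """CCR efficiency, single-input/single-output special case:
        each unit's output/input ratio relative to the best ratio."""
        ratios = [o / i for o, i in zip(outputs, inputs)]
        best = max(ratios)
        return [r / best for r in ratios]

    def prob_efficient(outputs, inputs, noise=0.1, runs=2000, seed=0):
        """Monte-Carlo DEA sketch: perturb the data, re-run DEA, and
        report each unit's probability of coming out (near-)efficient."""
        rng = random.Random(seed)
        wins = [0] * len(outputs)
        for _ in range(runs):
            o = [v * rng.uniform(1 - noise, 1 + noise) for v in outputs]
            i = [v * rng.uniform(1 - noise, 1 + noise) for v in inputs]
            for k, e in enumerate(efficiency(o, i)):
                if e >= 0.999:
                    wins[k] += 1
        return [w / runs for w in wins]
    ```

    The general multi-input/multi-output case requires solving one linear program per unit per run, but the probability-of-efficiency output has the same shape as the Pe values reported in the abstract.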

  18. Expert opinion on landslide susceptibility elicited by probabilistic inversion from scenario rankings

    NASA Astrophysics Data System (ADS)

    Lee, Katy; Dashwood, Claire; Lark, Murray

    2016-04-01

    For many natural hazards the opinion of experts, with experience in assessing susceptibility under different circumstances, is a valuable source of information on which to base risk assessments. This is particularly important where incomplete process understanding, and limited data, limit the scope to predict susceptibility by mechanistic or statistical modelling. The expert has a tacit model of a system, based on their understanding of processes and their field experience. This model may vary in quality, depending on the experience of the expert. There is considerable interest in how one may elicit expert understanding by a process which is transparent and robust, to provide a basis for decision support. One approach is to provide experts with a set of scenarios, and then to ask them to rank small overlapping subsets of these with respect to susceptibility. Methods of probabilistic inversion have been used to compute susceptibility scores for each scenario, implicit in the expert ranking. It is also possible to model these scores as functions of measurable properties of the scenarios. This approach has been used to assess susceptibility of animal populations to invasive diseases, to assess risk to vulnerable marine environments and to assess the risk in hypothetical novel technologies for food production. We will present the results of a study in which a group of geologists with varying degrees of expertise in assessing landslide hazards were asked to rank sets of hypothetical simplified scenarios with respect to landslide susceptibility. We examine the consistency of their rankings and the importance of different properties of the scenarios in the tacit susceptibility model that their rankings implied. Our results suggest that this is a promising approach to the problem of how experts can communicate their tacit model of uncertain systems to those who want to make use of their expertise.

  19. Use of an expert system data analysis manager for space shuttle main engine test evaluation

    NASA Technical Reports Server (NTRS)

    Abernethy, Ken

    1988-01-01

    The ability to articulate, collect, and automate the application of the expertise needed for the analysis of space shuttle main engine (SSME) test data would be of great benefit to NASA liquid rocket engine experts. This paper describes a project whose goal is to build a rule-based expert system which incorporates such expertise. Experiential expertise, collected directly from the experts currently involved in SSME data analysis, is used to build a rule base to identify engine anomalies similar to those analyzed previously. Additionally, an alternate method of expertise capture is being explored. This method would generate rules inductively based on calculations made using a theoretical model of the SSME's operation. The latter rules would be capable of diagnosing anomalies which may not have appeared before, but whose effects can be predicted by the theoretical model.

  20. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

    An expert system for web based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples and other related knowledge used in the pre-processing stage of FEA were categorized into analysis-process and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described, followed by an integrated reasoning process based on CBR (case based reasoning) and rule based reasoning. Finally, the analysis process of this expert system in a web based CAE application is illustrated, and an analysis example of a machine tool's column is presented to demonstrate the validity of the system.
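
    The integrated reasoning process described above, retrieve a similar past case, then adapt it with rules, can be sketched as follows; the feature names, case structure and adaptation rule are hypothetical, not the system's actual knowledge model:

    ```python
    def retrieve_case(query, case_base):
        """CBR step: nearest past case by a simple feature distance."""
        def dist(case):
            return sum(abs(case["features"][k] - query[k]) for k in query)
        return min(case_base, key=dist)

    def integrated_reasoning(query, case_base, rules):
        """Sketch of CBR-then-rules reasoning for FEA pre-processing:
        reuse the setup of the closest past analysis, then let
        rule-based knowledge adapt it to the new problem."""
        setup = dict(retrieve_case(query, case_base)["setup"])
        for condition, adapt in rules:
            if condition(query):
                adapt(setup)
        return setup
    ```

    The CBR step supplies a plausible starting point from past analyses; the rules encode expert heuristics (e.g. "refine the mesh for high loads") that generalize beyond any single stored case.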

  1. Application of Hybrid Optimization-Expert System for Optimal Power Management on Board Space Power Station

    NASA Technical Reports Server (NTRS)

    Momoh, James; Chattopadhyay, Deb; Basheer, Omar Ali AL

    1996-01-01

    The space power system has two sources of energy: photo-voltaic blankets and batteries. The optimal power management problem on-board has two broad operations: off-line power scheduling to determine the load allocation schedule of the next several hours based on the forecast of load and solar power availability. The nature of this study puts less emphasis on speed requirement for computation and more importance on the optimality of the solution. The second category problem, on-line power rescheduling, is needed in the event of occurrence of a contingency to optimally reschedule the loads to minimize the 'unused' or 'wasted' energy while keeping the priority on certain type of load and minimum disturbance of the original optimal schedule determined in the first-stage off-line study. The computational performance of the on-line 'rescheduler' is an important criterion and plays a critical role in the selection of the appropriate tool. The Howard University Center for Energy Systems and Control has developed a hybrid optimization-expert systems based power management program. The pre-scheduler has been developed using a non-linear multi-objective optimization technique called the Outer Approximation method and implemented using the General Algebraic Modeling System (GAMS). The optimization model has the capability of dealing with multiple conflicting objectives viz. maximizing energy utilization, minimizing the variation of load over a day, etc. and incorporates several complex interaction between the loads in a space system. The rescheduling is performed using an expert system developed in PROLOG which utilizes a rule-base for reallocation of the loads in an emergency condition viz. shortage of power due to solar array failure, increase of base load, addition of new activity, repetition of old activity etc. 
Both the modules handle decision making on battery charging and discharging and allocation of loads over a time-horizon of a day divided into intervals of 10 minutes. The models have been extensively tested using a case study for the Space Station Freedom and the results for the case study will be presented. Several future enhancements of the pre-scheduler and the 'rescheduler' have been outlined which include graphic analyzer for the on-line module, incorporating probabilistic considerations, including spatial location of the loads and the connectivity using a direct current (DC) load flow model.
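
    The on-line rescheduler's core idea, reallocating loads under a reduced power budget while keeping priority loads and minimising disturbance, can be sketched greedily. This is a simplification under assumed data structures, not the PROLOG rule base described above:

    ```python
    def reschedule(loads, available_power):
        """Rule-based rescheduling sketch: when a contingency (e.g. a
        solar array failure) cuts the power budget, keep loads in
        priority order and shed whatever no longer fits, minimising
        'unused' power.  Load names and priorities are illustrative
        (1 = highest priority)."""
        kept, used = [], 0.0
        for name, power, priority in sorted(loads, key=lambda l: l[2]):
            if used + power <= available_power:
                kept.append(name)
                used += power
        return kept, available_power - used  # scheduled loads, unused power
    ```

    The real rescheduler works over a day split into 10-minute intervals and also decides battery charge/discharge, but each interval reduces to a constrained allocation of this general shape.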

  2. 77 FR 50172 - Expert Forum on the Use of Performance-Based Regulatory Models in the U.S. Oil and Gas Industry...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-20

    ... Transportation, Pipeline and Hazardous Materials Safety Administration (PHMSA) invite interested parties to...] Expert Forum on the Use of Performance-Based Regulatory Models in the U.S. Oil and Gas Industry, Offshore... and gas industry. The meeting will take place at the College of the Mainland, and hosted by the Gulf...

  3. Visual to Parametric Interaction (V2PI)

    PubMed Central

    Maiti, Dipayan; Endert, Alex; North, Chris

    2013-01-01

    Typical data visualizations result from linear pipelines that start by characterizing data using a model or algorithm to reduce the dimension and summarize structure, and end by displaying the data in a reduced dimensional form. Sensemaking may take place at the end of the pipeline when users have an opportunity to observe, digest, and internalize any information displayed. However, some visualizations mask meaningful data structures when model or algorithm constraints (e.g., parameter specifications) contradict information in the data. Yet, due to the linearity of the pipeline, users do not have a natural means to adjust the displays. In this paper, we present a framework for creating dynamic data displays that rely on both mechanistic data summaries and expert judgement. The key is that we develop both the theory and methods of a new human-data interaction to which we refer as “ Visual to Parametric Interaction” (V2PI). With V2PI, the pipeline becomes bi-directional in that users are embedded in the pipeline; users learn from visualizations and the visualizations adjust to expert judgement. We demonstrate the utility of V2PI and a bi-directional pipeline with two examples. PMID:23555552

  4. Mathematical modeling in realistic mathematics education

    NASA Astrophysics Data System (ADS)

    Riyanto, B.; Zulkardi; Putri, R. I. I.; Darmawijoyo

    2017-12-01

    The purpose of this paper is to produce mathematical modelling tasks in Realistic Mathematics Education for junior high school. This study used development research consisting of three stages: analysis, design and evaluation. The success criterion of this study was a local instruction theory for school mathematical modelling learning that is valid and practical for students. The data were analyzed using descriptive methods as follows: (1) walk-through analysis based on the expert comments in the expert review, to obtain a Hypothetical Learning Trajectory for valid mathematical modelling learning; (2) analysis of the results of the one-to-one and small-group reviews, to establish practicality. Based on the expert validation and the students' opinions and answers, the obtained mathematical modelling problems in Realistic Mathematics Education were valid and practical.

  5. An evaluation of an expert system for detecting critical events during anesthesia in a human patient simulator: a prospective randomized controlled study.

    PubMed

    Görges, Matthias; Winton, Pamela; Koval, Valentyna; Lim, Joanne; Stinson, Jonathan; Choi, Peter T; Schwarz, Stephan K W; Dumont, Guy A; Ansermino, J Mark

    2013-08-01

    Perioperative monitoring systems produce a large amount of uninterpreted data, use threshold alarms prone to artifacts, and rely on the clinician to continuously visually track changes in physiological data. To address these deficiencies, we developed an expert system that provides real-time clinical decisions for the identification of critical events. We evaluated the efficacy of the expert system for enhancing critical event detection in a simulated environment. We hypothesized that anesthesiologists would identify critical ventilatory events more rapidly and accurately with the expert system. We used a high-fidelity human patient simulator to simulate an operating room environment. Participants managed 4 scenarios (anesthetic vapor overdose, tension pneumothorax, anaphylaxis, and endotracheal tube cuff leak) in random order. In 2 of their 4 scenarios, participants were randomly assigned to the expert system, which provided trend-based alerts and potential differential diagnoses. Time to detection and time to treatment were measured. Workload questionnaires and structured debriefings were completed after each scenario, and a usability questionnaire at the conclusion of the session. Data were analyzed using a mixed-effects linear regression model; Fisher exact test was used for workload scores. Twenty anesthesiology trainees and 15 staff anesthesiologists with a combined median (range) of 36 (29-66) years of age and 6 (1-38) years of anesthesia experience participated. For the endotracheal tube cuff leak, the expert system caused mean reductions of 128 (99% confidence interval [CI], 54-202) seconds in time to detection and 140 (99% CI, 79-200) seconds in time to treatment. In the other 3 scenarios, a best-case decrease of 97 seconds (lower 99% CI) in time to diagnosis for anaphylaxis and a worst-case increase of 63 seconds (upper 99% CI) in time to treatment for anesthetic vapor overdose were found. 
Participants were highly satisfied with the expert system (median score, 2 on a scale of 1-7). Based on participant debriefings, we identified avoidance of task fixation, reassurance to initiate invasive treatment, and confirmation of a suspected diagnosis as 3 safety-critical areas. When using the expert system, clinically important and statistically significant decreases in time to detection and time to treatment were observed for the endotracheal tube cuff leak scenario. The observed differences in the other 3 scenarios were much smaller and not statistically significant. Further evaluation is required to confirm the clinical utility of real-time expert systems for anesthesia.
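
    A trend-based alert of the kind the expert system provides can be sketched as a sliding-window least-squares slope test, in contrast to a fixed-threshold alarm; the window length and slope limit here are illustrative, not the system's tuned settings:

    ```python
    def trend_alert(values, window=5, slope_limit=-0.5):
        """Alert on a sustained downward trend in a physiological signal,
        rather than waiting for a fixed threshold crossing.  Fits a
        least-squares slope over the last `window` samples and alerts
        when it falls below `slope_limit` (units per sample)."""
        if len(values) < window:
            return False
        y = values[-window:]
        xs = range(window)
        x_mean = (window - 1) / 2.0
        y_mean = sum(y) / window
        num = sum((x - x_mean) * (yi - y_mean) for x, yi in zip(xs, y))
        den = sum((x - x_mean) ** 2 for x in xs)
        return num / den < slope_limit
    ```

    Because the slope averages over the window, brief artifacts are less likely to trigger it than a single out-of-range sample, which is one motivation for trend-based alerting named in the abstract.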

  6. Evaluation of HardSys/HardDraw, An Expert System for Electromagnetic Interactions Modelling

    DTIC Science & Technology

    1993-05-01

This report describes HardSys/HardDraw, an expert system for the modelling of electromagnetic interactions in complex systems, and reviews the main concepts used in its design. The system consists of two main components: HardSys and HardDraw. HardSys is the advisor part of the expert system; it is knowledge-based, that is, it contains a database of models and properties for various types of...

  7. An image overall complexity evaluation method based on LSD line detection

    NASA Astrophysics Data System (ADS)

    Li, Jianan; Duan, Jin; Yang, Xu; Xiao, Bo

    2017-04-01

In the artificial world, both urban traffic roads and engineered buildings contain many linear features. The study of image complexity based on linear information has therefore become an important research direction in digital image processing. This paper detects the straight-line information in an image and uses the detected lines as parameter indices to establish a quantitative and accurate mathematical relationship for complexity. We use the LSD line-detection algorithm, which detects straight lines reliably, and divide the detected lines according to an expert consultation strategy. We then use a neural network to train the weights and obtain the weight coefficients of the indices. The image complexity is calculated by the complexity calculation model. The experimental results show that the proposed method is effective. The number of straight lines in the image and their degree of dispersion and uniformity affect the complexity of the image.
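A weighted combination of line-based indices of this kind can be sketched directly. The following minimal numpy illustration assumes line segments have already been detected (e.g. by an LSD-style detector); the index definitions, squashing constant, and fixed weights are hypothetical stand-ins for the neural-network-trained coefficients described in the abstract:

```python
import numpy as np

def line_complexity(segments, img_w, img_h, weights=(0.5, 0.3, 0.2)):
    """Toy complexity score from detected line segments.

    segments: (N, 4) rows of (x1, y1, x2, y2), e.g. from an
              LSD-style detector.
    weights:  hypothetical index weights (the paper trains these
              with a neural network; here they are fixed).
    """
    seg = np.asarray(segments, dtype=float)
    n = len(seg)
    if n == 0:
        return 0.0
    # Index 1: number of lines, squashed into [0, 1).
    count_idx = n / (n + 50.0)
    # Index 2: dispersion of segment midpoints across the image.
    mids = np.column_stack([(seg[:, 0] + seg[:, 2]) / 2,
                            (seg[:, 1] + seg[:, 3]) / 2])
    diag = np.hypot(img_w, img_h)
    disp_idx = min(1.0, mids.std(axis=0).sum() / diag)
    # Index 3: uniformity of orientations (entropy of an angle histogram).
    ang = np.arctan2(seg[:, 3] - seg[:, 1], seg[:, 2] - seg[:, 0]) % np.pi
    hist, _ = np.histogram(ang, bins=8, range=(0, np.pi))
    p = hist / hist.sum()
    p = p[p > 0]
    unif_idx = -(p * np.log(p)).sum() / np.log(8)
    # Weighted combination, in the spirit of the complexity model.
    w1, w2, w3 = weights
    return w1 * count_idx + w2 * disp_idx + w3 * unif_idx
```

With equal-direction segments the orientation entropy is low, so images dominated by parallel lines score as less complex than images with scattered, varied lines.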

  8. Designing simulator-based training: an approach integrating cognitive task analysis and four-component instructional design.

    PubMed

    Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G

    2012-01-01

    Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component/instructional design model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.

  9. A Bayesian Approach to Model Selection in Hierarchical Mixtures-of-Experts Architectures.

    PubMed

    Tanner, Martin A.; Peng, Fengchun; Jacobs, Robert A.

    1997-03-01

    There does not exist a statistical model that shows good performance on all tasks. Consequently, the model selection problem is unavoidable; investigators must decide which model is best at summarizing the data for each task of interest. This article presents an approach to the model selection problem in hierarchical mixtures-of-experts architectures. These architectures combine aspects of generalized linear models with those of finite mixture models in order to perform tasks via a recursive "divide-and-conquer" strategy. Markov chain Monte Carlo methodology is used to estimate the distribution of the architectures' parameters. One part of our approach to model selection attempts to estimate the worth of each component of an architecture so that relatively unused components can be pruned from the architecture's structure. A second part of this approach uses a Bayesian hypothesis testing procedure in order to differentiate inputs that carry useful information from nuisance inputs. Simulation results suggest that the approach presented here adheres to the dictum of Occam's razor; simple architectures that are adequate for summarizing the data are favored over more complex structures. Copyright 1997 Elsevier Science Ltd. All Rights Reserved.
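The "divide-and-conquer" combination of generalized linear experts under a softmax gate can be shown as a forward pass. This is a minimal one-level numpy sketch, not the hierarchical architecture or the MCMC estimation of the article; the two-expert setup and all weights are invented for illustration:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_predict(X, gate_W, expert_W):
    """Forward pass of a one-level mixture-of-experts regressor.

    gate_W:   (d, K) gating weights -> softmax mixing proportions
    expert_W: (K, d) one linear (GLM) expert per mixture component
    The output is the gate-weighted sum of the expert predictions,
    the divide-and-conquer combination used in these architectures.
    """
    g = softmax(X @ gate_W)        # (n, K) mixing proportions
    preds = X @ expert_W.T         # (n, K) each expert's prediction
    return (g * preds).sum(axis=1)

# Tiny illustration: two experts split the input space by sign of x.
X = np.column_stack([np.linspace(-1, 1, 8), np.ones(8)])  # feature + bias
gate_W = np.array([[8.0, -8.0],    # gate prefers expert 0 for x > 0
                   [0.0, 0.0]])
expert_W = np.array([[2.0, 0.0],   # expert 0: y = 2x
                     [-1.0, 1.0]]) # expert 1: y = 1 - x
y = moe_predict(X, gate_W, expert_W)
```

Pruning unused components, as in the article's model-selection approach, would correspond to dropping experts whose mixing proportions stay near zero across the data.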

  10. A condition metric for Eucalyptus woodland derived from expert evaluations.

    PubMed

    Sinclair, Steve J; Bruce, Matthew J; Griffioen, Peter; Dodd, Amanda; White, Matthew D

    2018-02-01

    The evaluation of ecosystem quality is important for land-management and land-use planning. Evaluation is unavoidably subjective, and robust metrics must be based on consensus and the structured use of observations. We devised a transparent and repeatable process for building and testing ecosystem metrics based on expert data. We gathered quantitative evaluation data on the quality of hypothetical grassy woodland sites from experts. We used these data to train a model (an ensemble of 30 bagged regression trees) capable of predicting the perceived quality of similar hypothetical woodlands based on a set of 13 site variables as inputs (e.g., cover of shrubs, richness of native forbs). These variables can be measured at any site and the model implemented in a spreadsheet as a metric of woodland quality. We also investigated the number of experts required to produce an opinion data set sufficient for the construction of a metric. The model produced evaluations similar to those provided by experts, as shown by assessing the model's quality scores of expert-evaluated test sites not used to train the model. We applied the metric to 13 woodland conservation reserves and asked managers of these sites to independently evaluate their quality. To assess metric performance, we compared the model's evaluation of site quality with the managers' evaluations through multidimensional scaling. The metric performed relatively well, plotting close to the center of the space defined by the evaluators. Given the method provides data-driven consensus and repeatability, which no single human evaluator can provide, we suggest it is a valuable tool for evaluating ecosystem quality in real-world contexts. We believe our approach is applicable to any ecosystem. © 2017 State of Victoria.
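The core of such a metric, an ensemble of bagged regression trees mapping 13 site variables to a perceived quality score, can be sketched with a standard library. This is a hedged illustration, not the authors' fitted model: the training data below are simulated stand-ins for the expert evaluations, and scikit-learn's BaggingRegressor (which bags decision trees by default) stands in for the paper's ensemble of 30 trees:

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(42)

# Hypothetical stand-in for the expert data: 200 hypothetical sites,
# each described by 13 site variables (e.g. shrub cover, native forb
# richness) and given a simulated 0-100 quality score.
X = rng.uniform(0.0, 1.0, size=(200, 13))
y = 100.0 * X[:, :3].mean(axis=1) + rng.normal(0.0, 5.0, size=200)

# An ensemble of bagged regression trees (the paper uses 30).
metric = BaggingRegressor(n_estimators=30, random_state=0).fit(X, y)

# The fitted model acts as the condition metric: measure the 13
# variables at a new site and predict its perceived quality.
new_site = rng.uniform(0.0, 1.0, size=(1, 13))
quality = float(metric.predict(new_site)[0])
```

Because each tree is trained on a bootstrap resample, the spread of the individual trees' predictions also gives a rough indication of how contested a site's score would be among the simulated "experts".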

  11. Uncertainty reasoning in expert systems

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik

    1993-01-01

    Intelligent control is a very successful way to transform the expert's knowledge of the type 'if the velocity is big and the distance from the object is small, hit the brakes and decelerate as fast as possible' into an actual control. To apply this transformation, one must choose appropriate methods for reasoning with uncertainty, i.e., one must: (1) choose the representation for words like 'small', 'big'; (2) choose operations corresponding to 'and' and 'or'; (3) choose a method that transforms the resulting uncertain control recommendations into a precise control strategy. The wrong choice can drastically affect the quality of the resulting control, so the problem of choosing the right procedure is very important. From a mathematical viewpoint these choice problems correspond to non-linear optimization and are therefore extremely difficult. In this project, a new mathematical formalism (based on group theory) is developed that allows us to solve the problem of optimal choice and thus: (1) explain why the existing choices are really the best (in some situations); (2) explain a rather mysterious fact that fuzzy control (i.e., control based on the experts' knowledge) is often better than the control by these same experts; and (3) give choice recommendations for the cases when traditional choices do not work.
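The three choices listed above (a representation for words like "small" and "big", an operation for "and", and a procedure turning fuzzy recommendations into a precise control) can be made concrete in a few lines. This is a minimal single-rule sketch using triangular memberships, min for "and", and centroid defuzzification; all shapes and thresholds are illustrative, not from the report:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_brake(velocity, distance):
    """One rule: IF velocity is big AND distance is small
    THEN braking is strong.  The membership shapes, the use of
    min() for 'and', and centroid defuzzification are exactly the
    kinds of choices the optimization formalism is about; the
    numbers here are illustrative."""
    big_v = tri(velocity, 40.0, 100.0, 160.0)   # 'velocity is big'
    small_d = tri(distance, -1.0, 0.0, 50.0)    # 'distance is small'
    firing = min(big_v, small_d)                # 'and' as minimum
    # Output fuzzy set 'strong braking' clipped at the firing level,
    # then defuzzified by centroid over braking effort in [0, 1].
    u = np.linspace(0.0, 1.0, 101)
    strong = np.maximum(np.minimum((u - 0.4) / 0.6, (1.6 - u) / 0.6), 0.0)
    clipped = np.minimum(strong, firing)
    if clipped.sum() == 0:
        return 0.0
    return float((u * clipped).sum() / clipped.sum())
```

Swapping min for product, or centroid for mean-of-maxima, changes the resulting control law; that sensitivity is what makes the optimal-choice question non-trivial.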

  12. The use of subjective expert opinions in cost optimum design of aerospace structures. [probabilistic failure models

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hanagud, S.

    1975-01-01

    The results of two questionnaires sent to engineering experts are statistically analyzed and compared with objective data from Saturn V design and testing. Engineers were asked how likely it was for structural failure to occur at load increments above and below analysts' stress limit predictions. They were requested to estimate the relative probabilities of different failure causes, and of failure at each load increment given a specific cause. Three mathematical models are constructed based on the experts' assessment of causes. The experts' overall assessment of prediction strength fits the Saturn V data better than the models do, but a model test option (T-3) based on the overall assessment gives more design change likelihood to overstrength structures than does an older standard test option. T-3 compares unfavorably with the standard option in a cost optimum structural design problem. The report reflects a need for subjective data when objective data are unavailable.

  13. Using ecosystem services in decision-making to support sustainable development: Critiques, model development, a case study, and perspectives.

    PubMed

    Zagonari, Fabio

    2016-04-01

    In this paper, I propose a general, consistent, and operational approach that accounts for ecosystem services in a decision-making context: I link ecosystem services to sustainable development criteria; adopt multi-criteria analysis to measure ecosystem services, with weights provided by stakeholders used to account for equity issues; apply both temporal and spatial discount rates; and adopt a technique to order performance of the possible solutions based on their similarity to an ideal solution (TOPSIS) to account for uncertainty about the parameters and functions. Applying this approach in a case study of an offshore research platform in Italy (CNR Acqua Alta) revealed that decisions depend non-linearly on the degree of loss aversion, to a smaller extent on a global focus (as opposed to a local focus), and to the smallest extent on social concerns (as opposed to economic or environmental concerns). Application of the general model to the case study leads to the conclusion that the ecosystem services framework is likely to be less useful in supporting decisions than in identifying the crucial features on which decisions depend, unless experts from different disciplines are involved, stakeholders are represented, and experts and stakeholders achieve mutual understanding. Copyright © 2016 Elsevier B.V. All rights reserved.
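The TOPSIS step, ordering candidate solutions by similarity to an ideal point, is a compact standard algorithm. The sketch below is a generic textbook version in numpy, not the author's implementation:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by similarity to an ideal solution (TOPSIS).

    matrix : (n_alternatives, n_criteria) performance scores
    weights: criteria weights (e.g. elicited from stakeholders)
    benefit: boolean per criterion, True = higher is better
    Returns closeness scores in [0, 1]; higher = closer to ideal.
    """
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float) / np.sum(weights)
    # Vector-normalise each criterion column, then apply the weights.
    v = w * m / np.linalg.norm(m, axis=0)
    # Ideal and anti-ideal points depend on criterion direction.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)
```

A dominating alternative coincides with the ideal point and receives closeness 1; the anti-ideal alternative receives 0.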

  14. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

This annual report summarizes the work completed during the third year of technical effort on the referenced contract. Principal developments continue to focus on the Probabilistic Finite Element Method (PFEM), which has been under development for three years. Essentially all of the linear capabilities within the PFEM code are in place. Major progress was achieved in the applications and verification phase. An EXPERT module architecture was designed and partially implemented. EXPERT is a user-interface module that incorporates an expert system shell for the implementation of a rule-based interface utilizing the experience and expertise of the user community. The Fast Probability Integration (FPI) algorithm continues to demonstrate outstanding performance for the integration of probability density functions of multiple variables. Additionally, an enhanced Monte Carlo simulation algorithm was developed and demonstrated for a variety of numerical strategies.
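The kind of reliability computation that FPI and Monte Carlo simulation address can be illustrated on a toy linear limit state. The sketch below is generic crude Monte Carlo checked against the closed-form answer, not the FPI algorithm or the report's enhanced simulation algorithm:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

# Illustrative limit state g = R - S (capacity minus load) with
# independent normal variables; failure occurs when g < 0.
mu_R, sd_R = 10.0, 1.0
mu_S, sd_S = 6.0, 1.5

# Crude Monte Carlo estimate of the failure probability.
n = 200_000
g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
pf_mc = float(np.mean(g < 0.0))

# For this linear limit state the exact answer is Phi(-beta) with
# reliability index beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2).
beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)
pf_exact = 0.5 * (1.0 + erf(-beta / sqrt(2.0)))
```

Fast probability integration methods exist precisely because crude sampling like this needs very many draws once failure probabilities become small.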

  15. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rizzo, Davinia B.; Blackburn, Mark R.

As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence the process may be an effective tool in harnessing expert knowledge for a BN model.

  16. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE PAGES

    Rizzo, Davinia B.; Blackburn, Mark R.

    2018-03-30

As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence the process may be an effective tool in harnessing expert knowledge for a BN model.

  17. A New Perspective on Modeling Groundwater-Driven Health Risk With Subjective Information

    NASA Astrophysics Data System (ADS)

    Ozbek, M. M.

    2003-12-01

Fuzzy rule-based systems provide an efficient environment for the modeling of expert information in the context of risk management for groundwater contamination problems. In general, their use in the form of conditional pieces of knowledge has been either as a tool for synthesizing control laws from data (i.e., conjunction-based models) or in a knowledge representation and reasoning perspective in Artificial Intelligence (i.e., implication-based models), where only the latter may lead to coherence problems (e.g., input data that lead to logical inconsistency when added to the knowledge base). We implement a two-fold extension to an implication-based groundwater risk model (Ozbek and Pinder, 2002) including: 1) the implementation of sufficient conditions for a coherent knowledge base, and 2) the interpolation of expert statements to supplement gaps in knowledge. The original model assumes statements of public health professionals for the characterization of the exposed individual and the relation of dose and pattern of exposure to its carcinogenic effects. We demonstrate the utility of the extended model in that it: 1) identifies inconsistent statements and establishes coherence in the knowledge base, and 2) minimizes the burden of knowledge elicitation from the experts for utilizing existing knowledge in an optimal fashion.

  18. Interactive Inverse Groundwater Modeling - Addressing User Fatigue

    NASA Astrophysics Data System (ADS)

    Singh, A.; Minsker, B. S.

    2006-12-01

This paper builds on ongoing research on developing an interactive and multi-objective framework to solve the groundwater inverse problem. In this work we solve the classic groundwater inverse problem of estimating a spatially continuous conductivity field, given field measurements of hydraulic heads. The proposed framework is based on an interactive multi-objective genetic algorithm (IMOGA) that not only considers quantitative measures such as calibration error and degree of regularization, but also takes into account expert knowledge about the structure of the underlying conductivity field expressed as subjective rankings of potential conductivity fields by the expert. The IMOGA converges to the optimal Pareto front representing the best trade-off among the qualitative as well as quantitative objectives. However, since the IMOGA is a population-based iterative search, it requires the user to evaluate hundreds of solutions. This leads to the problem of 'user fatigue'. We propose a two-step methodology to combat user fatigue in such interactive systems. The first step is choosing only a few highly representative solutions to be shown to the expert for ranking. Spatial clustering is used to group the search space based on the similarity of the conductivity fields. Sampling is then carried out from different clusters to improve the diversity of solutions shown to the user. Once the expert has ranked representative solutions from each cluster, a machine learning model is used to 'learn user preference' and extrapolate these preferences to the solutions not ranked by the expert. We investigate different machine learning models such as decision trees, a Bayesian learning model, and instance-based weighting to model user preference. In addition, we also investigate ways to improve the performance of these models by providing information about the spatial structure of the conductivity fields (which is what the expert bases his or her rank on).
Results are shown for each of these machine learning models and the advantages and disadvantages for each approach are discussed. These results indicate that using the proposed two-step methodology leads to significant reduction in user-fatigue without deteriorating the solution quality of the IMOGA.

  19. Three CLIPS-based expert systems for solving engineering problems

    NASA Technical Reports Server (NTRS)

    Parkinson, W. J.; Luger, G. F.; Bretz, R. E.

    1990-01-01

We have written three expert systems using the CLIPS PC-based expert system shell. These three expert systems are rule-based and relatively small, with the largest containing slightly fewer than 200 rules. The first expert system is an expert assistant that helps users of the ASPEN computer code choose the proper thermodynamic package for their particular vapor-liquid equilibrium problem. The second expert system was designed to help petroleum engineers choose the proper enhanced oil recovery method for a given reservoir. The effectiveness of each technique is highly dependent upon the reservoir conditions. The third expert system is a combination consultant and control system, designed specifically for silicon carbide whisker growth. Silicon carbide whiskers are an extremely strong product used to make ceramic and metal composites. The manufacture of whiskers is a very complicated process which, to date, has defied a good mathematical model. The process was run by experts who had gained their expertise by trial and error. A system of rules was devised by these experts both for procedure setup and for process control. In this paper we discuss the design, development, and evaluation of the three CLIPS-based programs.

  20. Measuring novices' field mapping abilities using an in-class exercise based on expert task analysis

    NASA Astrophysics Data System (ADS)

    Caulkins, J. L.

    2010-12-01

    We are interested in developing a model of expert-like behavior for improving the teaching methods of undergraduate field geology. Our aim is to assist students in mastering the process of field mapping more efficiently and effectively and to improve their ability to think creatively in the field. To examine expert-mapping behavior, a cognitive task analysis was conducted with expert geologic mappers in an attempt to define the process of geologic mapping (i.e. to understand how experts carry out geological mapping). The task analysis indicates that expert mappers have a wealth of geologic scenarios at their disposal that they compare against examples seen in the field, experiences that most undergraduate mappers will not have had. While presenting students with many geological examples in class may increase their understanding of geologic processes, novices still struggle when presented with a novel field situation. Based on the task analysis, a short (45-minute) paper-map-based exercise was designed and tested with 14 pairs of 3rd year geology students. The exercise asks students to generate probable geologic models based on a series of four (4) data sets. Each data set represents a day’s worth of data; after the first “day,” new sheets simply include current and previously collected data (e.g. “Day 2” data set includes data from “Day 1” plus the new “Day 2” data). As the geologic complexity increases, students must adapt, reject or generate new geologic models in order to fit the growing data set. Preliminary results of the exercise indicate that students who produced more probable geologic models, and produced higher ratios of probable to improbable models, tended to go on to do better on the mapping exercises at the 3rd year field school. These results suggest that those students with more cognitively available geologic models may be more able to use these models in field settings than those who are unable to draw on these models for whatever reason. 
Giving students practice at generating geologic models to explain data may be useful in preparing our students for field mapping exercises.

  1. The Too-Much-Precision Effect.

    PubMed

    Loschelder, David D; Friese, Malte; Schaerer, Michael; Galinsky, Adam D

    2016-12-01

    Past research has suggested a fundamental principle of price precision: The more precise an opening price, the more it anchors counteroffers. The present research challenges this principle by demonstrating a too-much-precision effect. Five experiments (involving 1,320 experts and amateurs in real-estate, jewelry, car, and human-resources negotiations) showed that increasing the precision of an opening offer had positive linear effects for amateurs but inverted-U-shaped effects for experts. Anchor precision backfired because experts saw too much precision as reflecting a lack of competence. This negative effect held unless first movers gave rationales that boosted experts' perception of their competence. Statistical mediation and experimental moderation established the critical role of competence attributions. This research disentangles competing theoretical accounts (attribution of competence vs. scale granularity) and qualifies two putative truisms: that anchors affect experts and amateurs equally, and that more precise prices are linearly more potent anchors. The results refine current theoretical understanding of anchoring and have significant implications for everyday life.

  2. Combined chamber-tower approach: Using eddy covariance measurements to cross-validate carbon fluxes modeled from manual chamber campaigns

    NASA Astrophysics Data System (ADS)

    Brümmer, C.; Moffat, A. M.; Huth, V.; Augustin, J.; Herbst, M.; Kutsch, W. L.

    2016-12-01

Manual carbon dioxide flux measurements with closed chambers at scheduled campaigns are a versatile method to study management effects at small scales in multiple-plot experiments. The eddy covariance technique has the advantage of quasi-continuous measurements but requires large homogeneous areas of a few hectares. To evaluate the uncertainties associated with interpolating from individual campaigns to the whole vegetation period, we installed both techniques at an agricultural site in Northern Germany. The presented comparison covers two cropping seasons, winter oilseed rape in 2012/13 and winter wheat in 2013/14. Modeling half-hourly carbon fluxes from campaigns is commonly performed with non-linear regressions for the light response and respiration. The daily averages of net CO2 modeled from chamber data deviated from eddy covariance measurements in the range of ±5 g C m-2 day-1. To understand the observed differences and to disentangle the effects, we performed four additional setups (expert versus default settings of the algorithm based on non-linear regressions, purely empirical modeling with artificial neural networks versus non-linear regressions, cross-validating using eddy covariance measurements as campaign fluxes, and weekly versus monthly scheduling of campaigns) to model the half-hourly carbon fluxes for the whole vegetation period. The good agreement of the seasonal course of net CO2 at plot and field scale for our agricultural site demonstrates that both techniques are robust and yield consistent results at the seasonal time scale, even for a managed ecosystem with high temporal dynamics in the fluxes. This allows combining the respective advantages of factorial experiments at plot scale with dense time-series data at field scale. Furthermore, the information from the quasi-continuous eddy covariance measurements can be used to derive vegetation proxies to support the interpolation of carbon fluxes in between the manual chamber campaigns.

  3. Thermal Expert System (TEXSYS): Systems autonomy demonstration project, volume 2. Results

    NASA Technical Reports Server (NTRS)

    Glass, B. J. (Editor)

    1992-01-01

    The Systems Autonomy Demonstration Project (SADP) produced a knowledge-based real-time control system for control and fault detection, isolation, and recovery (FDIR) of a prototype two-phase Space Station Freedom external active thermal control system (EATCS). The Thermal Expert System (TEXSYS) was demonstrated in recent tests to be capable of reliable fault anticipation and detection, as well as ordinary control of the thermal bus. Performance requirements were addressed by adopting a hierarchical symbolic control approach-layering model-based expert system software on a conventional, numerical data acquisition and control system. The model-based reasoning capabilities of TEXSYS were shown to be advantageous over typical rule-based expert systems, particularly for detection of unforeseen faults and sensor failures. Volume 1 gives a project overview and testing highlights. Volume 2 provides detail on the EATCS testbed, test operations, and online test results. Appendix A is a test archive, while Appendix B is a compendium of design and user manuals for the TEXSYS software.

  4. Thermal Expert System (TEXSYS): Systems autonomy demonstration project, volume 1. Overview

    NASA Technical Reports Server (NTRS)

    Glass, B. J. (Editor)

    1992-01-01

    The Systems Autonomy Demonstration Project (SADP) produced a knowledge-based real-time control system for control and fault detection, isolation, and recovery (FDIR) of a prototype two-phase Space Station Freedom external active thermal control system (EATCS). The Thermal Expert System (TEXSYS) was demonstrated in recent tests to be capable of reliable fault anticipation and detection, as well as ordinary control of the thermal bus. Performance requirements were addressed by adopting a hierarchical symbolic control approach-layering model-based expert system software on a conventional, numerical data acquisition and control system. The model-based reasoning capabilities of TEXSYS were shown to be advantageous over typical rule-based expert systems, particularly for detection of unforeseen faults and sensor failures. Volume 1 gives a project overview and testing highlights. Volume 2 provides detail on the EATCS test bed, test operations, and online test results. Appendix A is a test archive, while Appendix B is a compendium of design and user manuals for the TEXSYS software.

  5. Thermal Expert System (TEXSYS): Systems autonomy demonstration project, volume 2. Results

    NASA Astrophysics Data System (ADS)

    Glass, B. J.

    1992-10-01

    The Systems Autonomy Demonstration Project (SADP) produced a knowledge-based real-time control system for control and fault detection, isolation, and recovery (FDIR) of a prototype two-phase Space Station Freedom external active thermal control system (EATCS). The Thermal Expert System (TEXSYS) was demonstrated in recent tests to be capable of reliable fault anticipation and detection, as well as ordinary control of the thermal bus. Performance requirements were addressed by adopting a hierarchical symbolic control approach-layering model-based expert system software on a conventional, numerical data acquisition and control system. The model-based reasoning capabilities of TEXSYS were shown to be advantageous over typical rule-based expert systems, particularly for detection of unforeseen faults and sensor failures. Volume 1 gives a project overview and testing highlights. Volume 2 provides detail on the EATCS testbed, test operations, and online test results. Appendix A is a test archive, while Appendix B is a compendium of design and user manuals for the TEXSYS software.

  6. Improving linear transport infrastructure efficiency by automated learning and optimised predictive maintenance techniques (INFRALERT)

    NASA Astrophysics Data System (ADS)

    Jiménez-Redondo, Noemi; Calle-Cordón, Alvaro; Kandler, Ute; Simroth, Axel; Morales, Francisco J.; Reyes, Antonio; Odelius, Johan; Thaduri, Aditya; Morgado, Joao; Duarte, Emmanuele

    2017-09-01

The on-going H2020 project INFRALERT aims to increase rail and road infrastructure capacity in the current framework of increased transportation demand by developing and deploying solutions to optimise maintenance intervention planning. It includes two real pilots for road and railway infrastructure. INFRALERT develops an ICT platform (the expert-based Infrastructure Management System, eIMS) that follows a modular approach comprising several expert-based toolkits. This paper presents the methodologies and preliminary results of the toolkits for i) nowcasting and forecasting of asset condition, ii) alert generation, iii) RAMS & LCC analysis, and iv) decision support. Results from these toolkits for a meshed road network in Portugal under the jurisdiction of Infraestruturas de Portugal (IP) are presented, showing the capabilities of the approaches.

  7. Revised definitions of women's sexual dysfunction.

    PubMed

    Basson, Rosemary; Leiblum, Sandra; Brotto, Lori; Derogatis, Leonard; Fourcroy, Jean; Fugl-Meyer, Kerstin; Graziottin, Alessandra; Heiman, Julia R; Laan, Ellen; Meston, Cindy; Schover, Leslie; van Lankveld, Jacques; Schultz, Willibrord Weijmar

    2004-07-01

    Existing definitions of women's sexual disorders are based mainly on genitally focused events in a linear sequence model (desire, arousal and orgasm). To revise definitions based on an alternative model reflecting women's reasons/incentives for sexual activity beyond any initial awareness of sexual desire. An International Definitions Committee of 13 experts from seven countries repeatedly communicated, proposed new definitions and presented at the 2nd International Consultation on Sexual Medicine in Paris July 2003. Expert opinions/recommendations are based on a process that involved review of evidence-based medical literature, extensive internal committee discussion, informal testing and re-testing of drafted definitions in various clinical settings, public presentation and deliberation. Women have many reasons/incentives for sexual activity. Desire may be experienced once sexual stimuli have triggered arousal. Arousal and desire co-occur and reinforce each other. Women's subjective arousal may be minimally influenced by genital congestion. An absence of desire any time during the sexual experience designates disorder. Arousal disorder subtypes are proposed that separate an absence of subjective arousal from all types of sexual stimulation, from an absence of subjective arousal when the only stimulus is genital. A new arousal disorder has provisionally been suggested, namely that of persistent genital arousal. Orgasm disorder is limited to absence of orgasm despite high subjective arousal. Dyspareunia includes partial painful vaginal entry attempts as well as pain with intercourse. Variable reflex muscle tightening around the vagina and an absence of abnormal physical findings are noted in the definition of vaginismus. Women's sexuality is highly contextual and descriptors are recommended re past psychosexual development, current context, as well as medical status. 
Diagnosing sexual disorders need not imply intrinsic dysfunction of the woman's own sex response system. The International Definitions Committee has recommended a number of fundamental changes to the existing definitions of women's sexual disorders.

  8. A Multi-Criteria Index for Ecological Evaluation of Tropical Agriculture in Southeastern Mexico

    PubMed Central

    Huerta, Esperanza; Kampichler, Christian; Ochoa-Gaona, Susana; De Jong, Ben; Hernandez-Daumas, Salvador; Geissen, Violette

    2014-01-01

    The aim of this study was to generate an easy-to-use index to evaluate the ecological state of agricultural land from a sustainability perspective. We selected environmental indicators such as the use of organic soil amendments (green manure) versus chemical fertilizers, plant biodiversity (including crop associations), variables that characterize soil conservation in conventional agricultural systems, pesticide use, and the method and frequency of tillage. We monitored the ecological state of 52 agricultural plots to test the performance of the index. The variables were hierarchically aggregated with simple mathematical algorithms, if-then rules, and rule-based fuzzy models, yielding the final multi-criteria index with values from 0 (worst) to 1 (best conditions). We validated the model through independent evaluation by experts, and we obtained a linear regression with an r2 = 0.61 (p = 2.4e-06, d.f. = 49) between index output and the experts' evaluation. PMID:25405980
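
    The hierarchical aggregation described above can be sketched with crisp stand-ins for the fuzzy rules. All indicator names, normalizations, and the pesticide rule below are illustrative assumptions, not the published index:

    ```python
    def eco_index(organic_fraction, species_count, pesticide_level, tillage_passes):
        """Hierarchical aggregation sketch: raw indicators are normalized to
        [0, 1] sub-scores, combined by simple if-then rules, and averaged
        into a final index (0 = worst, 1 = best conditions)."""
        soil = min(organic_fraction, 1.0)                # organic vs. chemical inputs
        diversity = min(species_count / 10.0, 1.0)       # plant biodiversity, capped
        chemical = 1.0 - min(pesticide_level, 1.0)       # less pesticide is better
        tillage = 1.0 - min(tillage_passes / 5.0, 1.0)   # fewer passes is better
        score = (soil + diversity + chemical + tillage) / 4.0
        if pesticide_level > 0.8:                        # an if-then rule: heavy
            score = min(score, 0.5)                      # pesticide use caps the index
        return round(score, 3)
    ```

    A real implementation would replace the hard cutoffs with fuzzy membership functions and calibrate the rule base against the 52 monitored plots.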

  9. User Documentation; POTW EXPERT v1.1; An Advisory System for Improving the Performance of Wastewater Treatment Facilities

    EPA Science Inventory

    POTW Expert is a PC-based software program modeled after EPA's Handbook: Retrofitting POTWs (EPA-625/6-89/020) (formerly, Handbook for Improving POTW Performance Using the Composite Correction Program Approach). POTW Expert assists POTW owners and operators, state and local regu...

  10. Toward a theory of distributed word expert natural language parsing

    NASA Technical Reports Server (NTRS)

    Rieger, C.; Small, S.

    1981-01-01

    An approach to natural language meaning-based parsing in which the unit of linguistic knowledge is the word rather than the rewrite rule is described. In the word expert parser, knowledge about language is distributed across a population of procedural experts, each representing a word of the language, and each an expert at diagnosing that word's intended usage in context. The parser is structured around a coroutine control environment in which the generator-like word experts ask questions and exchange information in coming to collective agreement on sentence meaning. The word expert theory is advanced as a better cognitive model of human language expertise than the traditional rule-based approach. The technical discussion is organized around examples taken from the prototype LISP system which implements parts of the theory.
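
    The coroutine control environment can be illustrated with Python generators standing in for the LISP word experts; the two-sense "bank" lexicon below is a toy assumption, not the prototype's actual knowledge:

    ```python
    def word_expert(word, sense_rules):
        """A generator standing in for one procedural word expert: it
        suspends until the parser supplies the surrounding context, then
        diagnoses the word's intended sense from contextual cues."""
        context = yield          # suspend, like the parser's coroutines
        for cue, sense in sense_rules:
            if cue in context:
                yield sense
                return
        yield sense_rules[-1][1]  # fall back to the last listed sense

    # Toy lexicon: 'bank' as a riverside vs. a financial institution.
    expert = word_expert("bank", [("river", "shore"), ("money", "institution")])
    next(expert)                               # start the expert
    sense = expert.send(["money", "deposit"])  # supply context, receive the sense
    ```

    In the full theory, many such experts would run concurrently and exchange information before converging on a sentence meaning.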

  11. Tuberculosis-Diagnostic Expert System: an architecture for translating patients information from the web for use in tuberculosis diagnosis.

    PubMed

    Osamor, Victor C; Azeta, Ambrose A; Ajulo, Oluseyi O

    2014-12-01

    Over 1.5-2 million tuberculosis deaths occur annually. Medical professionals are faced with many challenges in delivering good health-care with unassisted automation in hospitals where several patients need the doctor's attention. To automate the pre-laboratory screening process against tuberculosis infection to aid diagnosis and make it fast and accessible to the public via the Internet. The expert system we have built is designed to also take care of people who do not have access to medical experts, but would want to check their medical status. A rule-based approach has been used, and unified modeling language and the client-server architecture technique were applied to model the system and to develop it as a web-based expert system for tuberculosis diagnosis. Algorithmic rules in the Tuberculosis-Diagnosis Expert System necessitate decision coverage where tuberculosis is either suspected or not suspected. The architecture consists of a rule base, knowledge base, and patient database. These units interact with the inference engine, which receives patients' data through the Internet via a user interface. We present the architecture of the Tuberculosis-Diagnosis Expert System and its implementation. We evaluated it for usability to determine the level of effectiveness, efficiency and user satisfaction. The result of the usability evaluation reveals that the system has a usability score of 4.08 on a scale of 5. This is an indication of more-than-average system performance. Several existing expert systems have been developed for the purpose of supporting different medical diagnoses, but none is designed to translate tuberculosis patients' symptomatic data for online pre-laboratory screening. Our Tuberculosis-Diagnosis Expert System is an effective solution for the implementation of the needed web-based expert system diagnosis. © The Author(s) 2013.
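
    The rule-base-plus-inference-engine architecture amounts to forward chaining over symptomatic data until a "suspected"/"not suspected" verdict is reached. A minimal sketch, with symptom names and thresholds that are purely illustrative:

    ```python
    # Each rule pairs a condition over the symptom record with a verdict.
    RULES = [
        (lambda s: s["cough_weeks"] >= 2 and s["night_sweats"], "suspected"),
        (lambda s: s["weight_loss"] and s["fever"], "suspected"),
    ]

    def screen(symptoms):
        """Return 'suspected' if any rule fires, else 'not suspected'."""
        for condition, verdict in RULES:
            if condition(symptoms):
                return verdict
        return "not suspected"
    ```

    The real system would draw its rules from the clinical knowledge base and route the verdict back to the web interface.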

  12. Characterizing Forest Change Using Community-Based Monitoring Data and Landsat Time Series

    PubMed Central

    DeVries, Ben; Pratihast, Arun Kumar; Verbesselt, Jan; Kooistra, Lammert; Herold, Martin

    2016-01-01

    Increasing awareness of the issue of deforestation and degradation in the tropics has resulted in efforts to monitor forest resources in tropical countries. Advances in satellite-based remote sensing and ground-based technologies have allowed for monitoring of forests with high spatial, temporal and thematic detail. Despite these advances, there is a need to engage communities in monitoring activities and include these stakeholders in national forest monitoring systems. In this study, we analyzed activity data (deforestation and forest degradation) collected by local forest experts over a 3-year period in an Afro-montane forest area in southwestern Ethiopia and corresponding Landsat Time Series (LTS). Local expert data included forest change attributes, geo-location and photo evidence recorded using mobile phones with integrated GPS and photo capabilities. We also assembled LTS using all available data from all spectral bands and a suite of additional indices and temporal metrics based on time series trajectory analysis. We predicted deforestation, degradation or stable forests using random forest models trained with data from local experts and LTS spectral-temporal metrics as model covariates. Resulting models predicted deforestation and degradation with an out of bag (OOB) error estimate of 29% overall, and 26% and 31% for the deforestation and degradation classes, respectively. By dividing the local expert data into training and operational phases corresponding to local monitoring activities, we found that forest change models improved as more local expert data were used. Finally, we produced maps of deforestation and degradation using the most important spectral bands. The results in this study represent some of the first to combine local expert based forest change data and dense LTS, demonstrating the complementary value of both continuous data streams. 
Our results underpin the utility of both datasets and provide a useful foundation for integrated forest monitoring systems relying on data streams from diverse sources. PMID:27018852

  13. Network approaches for expert decisions in sports.

    PubMed

    Glöckner, Andreas; Heinen, Thomas; Johnson, Joseph G; Raab, Markus

    2012-04-01

    This paper focuses on a model comparison to explain choices based on gaze behavior via simulation procedures. We tested two classes of models, a parallel constraint satisfaction (PCS) artificial neural network model and an accumulator model, in a handball decision-making task from a lab experiment. Both models predict action in an option-generation task in which options can be chosen from the perspective of a playmaker in handball (i.e., passing to another player or shooting at the goal). Model simulations are based on a dataset of generated options together with gaze behavior measurements from 74 expert handball players for 22 pieces of video footage. We implemented both classes of models as deterministic vs. probabilistic models, including and excluding fitted parameters. Results indicated that both classes of models can fit and predict participants' initially generated options based on gaze behavior data, and that overall, the classes of models performed about equally well. Early fixations were particularly predictive of choices. We conclude that the analysis of complex environments via network approaches can be successfully applied to the field of experts' decision making in sports and provides perspectives for further theoretical developments. Copyright © 2011 Elsevier B.V. All rights reserved.
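
    An accumulator model of the kind compared here can be sketched in a few lines: fixations deposit evidence on options until one crosses a decision threshold. The weights and threshold below are illustrative, not the paper's fitted parameters:

    ```python
    def accumulate(fixations, threshold=3.0):
        """Each fixation deposits evidence for the fixated option; the first
        option whose tally reaches the threshold is chosen. If nothing
        crosses it, the best-supported option wins."""
        evidence = {}
        for option, weight in fixations:
            evidence[option] = evidence.get(option, 0.0) + weight
            if evidence[option] >= threshold:
                return option
        return max(evidence, key=evidence.get)
    ```

    The property that early fixations are particularly predictive falls out naturally: options fixated first start accumulating evidence first.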

  14. Knowledge Engineering as a Component of the Curriculum for Medical Cybernetists.

    PubMed

    Karas, Sergey; Konev, Arthur

    2017-01-01

    According to a new state educational standard, students who have chosen medical cybernetics as their major must develop a knowledge engineering competency. Previously, in the course "Clinical cybernetics", students practicing project-based learning designed automated workstations for medical personnel using client-server technology. The purpose of the article is to give insight into the project of a new educational module, "Knowledge engineering". Students will acquire expert knowledge by holding interviews and conducting surveys, and then they will formalize it. After that, students will represent declarative expert knowledge in a network model and analyze the knowledge graph. Expert decision-making methods will be applied in software on the basis of a production model of knowledge. Project implementation will result not only in the development of analytical competencies among students, but also in the creation of a practically useful expert system based on student models to support medical decisions. Nowadays, this module is being tested in the educational process.

  15. Microcomputer-based classification of environmental data in municipal areas

    NASA Astrophysics Data System (ADS)

    Thiergärtner, H.

    1995-10-01

    Multivariate data-processing methods used in mineral resource identification can be used to classify urban regions. Using elements of expert systems, geographical information systems, as well as known classification and prognosis systems, it is possible to outline a single model that consists of resistant and temporary parts of a knowledge base, including graphical input and output treatment, and of resistant and temporary elements of a bank of methods and algorithms. Whereas decision rules created by experts will be stored in expert systems directly, powerful classification rules in the form of resistant but latent (implicit) decision algorithms may be implemented in the suggested model. The latent functions will be transformed into temporary explicit decision rules by learning processes depending on the actual task(s), parameter set(s), pixel selection(s), and expert control(s). This takes place in both supervised and unsupervised classification of multivariately described pixel sets representing municipal subareas. The model is outlined briefly and illustrated by results obtained in a target area covering a part of the city of Berlin (Germany).

  16. Induced seismicity closed-form traffic light system for actuarial decision-making during deep fluid injections.

    PubMed

    Mignan, A; Broccardo, M; Wiemer, S; Giardini, D

    2017-10-19

    The rise in the frequency of anthropogenic earthquakes due to deep fluid injections is posing serious economic, societal, and legal challenges to many geo-energy and waste-disposal projects. Existing tools to assess such problems are still inherently heuristic and mostly based on expert elicitation (so-called clinical judgment). We propose, as a complementary approach, an adaptive traffic light system (ATLS) that is a function of a statistical model of induced seismicity. It offers an actuarial judgement of the risk, based on a mapping between earthquake magnitude and risk. Using data from six underground reservoir stimulation experiments, mostly from Enhanced Geothermal Systems, we illustrate how such a data-driven adaptive forecasting system could guarantee a risk-based safety target. The proposed model, which includes a linear relationship between seismicity rate and flow rate, as well as a normal diffusion process for post-injection seismicity, is first confirmed to be representative of the data. Being integrable, the model yields a closed-form ATLS solution that is both transparent and robust. Although simulations verify that the safety target is consistently ensured when the ATLS is applied, the model from which the simulations are generated is validated on a limited dataset, hence still requiring further tests in additional fluid injection environments.
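
    Assuming, as the model does, a seismicity rate linear in injection and Gutenberg-Richter magnitudes, the closed-form traffic-light idea reduces to solving the expected event count for the allowable injected volume. Parameter names and values here are illustrative stand-ins for the paper's fitted quantities:

    ```python
    def max_injected_volume(m_safety, b=1.0, afb=-2.0, n_max=0.1):
        """With the induced-seismicity rate linear in injected volume V and
        Gutenberg-Richter magnitudes, the expected number of events at or
        above a safety magnitude is N = V * 10**(afb - b * m_safety),
        where afb is a site activation parameter and b the G-R slope.
        Solving N <= n_max for V gives the allowable injection volume."""
        return n_max / 10 ** (afb - b * m_safety)
    ```

    In an actuarial setting, the safety magnitude itself would come from the mapping between magnitude and risk, and afb and b would be re-estimated adaptively as injection proceeds.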

  17. Assessing Security of Supply: Three Methods Used in Finland

    NASA Astrophysics Data System (ADS)

    Sivonen, Hannu

    Public Private Partnership (PPP) has an important role in securing supply in Finland. Three methods are used in assessing the level of security of supply. First, in national expert groups, a linear mathematical model has been used. The model is based on interdependency estimates. It ranks societal functions or their more detailed components, such as items in the food supply chain, according to the effect and risk pertinent to the interdependencies. Second, the security of supply is assessed in industrial branch committees (clusters and pools) in the form of indicators. The level of security of supply is assessed against five generic factors (dimension 1) and tens of business-branch-specific functions (dimension 2). Third, in two thousand individual critical companies, the maturity of operational continuity management is assessed using the Capability Maturity Model (CMM) in an extranet application. The pool committees and authorities obtain an anonymous summary. The assessments are used in allocating efforts for securing supply. The efforts may be new instructions, training, exercising, and in some cases, investment and regulation.
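
    The interdependency-based ranking in the first method can be sketched as a weighted in-degree score over a dependency graph; the function names and weights below are hypothetical:

    ```python
    def rank_functions(interdependency):
        """Score each societal function by the summed strength of the
        functions that depend on it, then rank highest-scoring first."""
        scores = {}
        for func, deps in interdependency.items():
            for dep, weight in deps.items():
                scores[dep] = scores.get(dep, 0.0) + weight
        return sorted(scores, key=scores.get, reverse=True)

    # Hypothetical estimates: health care and transport both depend on power.
    deps = {"health": {"power": 0.9, "food": 0.5}, "transport": {"power": 0.7}}
    ```

    The actual Finnish model also weighs risk, but the ranking principle is the same: functions that more of society depends on rise to the top.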

  18. Expert Systems for Libraries at SCIL [Small Computers in Libraries]'88.

    ERIC Educational Resources Information Center

    Kochtanek, Thomas R.; And Others

    1988-01-01

    Six brief papers on expert systems for libraries cover (1) a knowledge-based approach to database design; (2) getting started in expert systems; (3) using public domain software to develop a business reference system; (4) a music cataloging inquiry system; (5) linguistic analysis of reference transactions; and (6) a model of a reference librarian.…

  19. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    USGS Publications Warehouse

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameters that control earthquake-activity rates, beyond what was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, with equal weighting for the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  20. The effects of AVIRIS atmospheric calibration methodology on identification and quantitative mapping of surface mineralogy, Drum Mountains, Utah

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dwyer, John L.

    1993-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different AVIRIS calibration techniques to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-based calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.
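
    Linear spectral unmixing, one of the two analyses applied to the calibrated data, models each pixel spectrum as a mixture of endmember spectra. A minimal two-endmember sketch with made-up spectra:

    ```python
    def unmix(pixel, endmember_a, endmember_b):
        """Model the pixel spectrum as f*A + (1 - f)*B and solve for the
        abundance f of endmember A by least squares. The spectra passed
        in would normally come from a spectral library or the image."""
        diff = [a - b for a, b in zip(endmember_a, endmember_b)]
        resid = [p - b for p, b in zip(pixel, endmember_b)]
        f = sum(d * r for d, r in zip(diff, resid)) / sum(d * d for d in diff)
        return max(0.0, min(1.0, f))  # clamp abundance to its physical range

    # A synthetic pixel that is 25% endmember A and 75% endmember B.
    mineral = [0.2, 0.4, 0.6]
    soil = [0.8, 0.6, 0.4]
    mixed = [0.25 * a + 0.75 * b for a, b in zip(mineral, soil)]
    ```

    Accurate unmixing of real AVIRIS pixels is exactly why calibration to reflectance matters: the mixture model only holds in physical units.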

  1. Literally experts: expertise and the processing of analogical metaphors in pharmaceutical advertising.

    PubMed

    Delbaere, Marjorie; Smith, Malcolm C

    2014-01-01

    This research examined differences between novices and experts in processing analogical metaphors appearing in prescription drug advertisements. In contrast to previous studies on knowledge transfer, no evidence of the superiority of experts in processing metaphors was found. The results from an experiment suggest that expert consumers were more likely to process a metaphor in an ad literally than novices. Our findings point to a condition in which the expertise effect with processing analogies is not the linear relationship assumed in previous studies.

  2. Expert systems for automated maintenance of a Mars oxygen production system

    NASA Technical Reports Server (NTRS)

    Ash, Robert L.; Huang, Jen-Kuang; Ho, Ming-Tsang

    1989-01-01

    A prototype expert system was developed for maintaining autonomous operation of a Mars oxygen production system. Normal operating conditions and failure modes are tested and identified according to certain desired criteria. Several schemes for failure detection and isolation, using forward chaining, backward chaining, and knowledge-based and rule-based reasoning, are devised to perform several housekeeping functions. These functions include self-health checkout, an emergency shutdown program, fault detection, and conventional control activities. An effort was made to derive the dynamic model of the system using the Bond-Graph technique in order to develop a model-based failure detection and isolation scheme by estimation methods. Finally, computer simulations and experimental results demonstrated the feasibility of the expert system, and a preliminary reliability analysis for the oxygen production system is also provided.

  3. A comprehensive information technology system to support physician learning at the point of care.

    PubMed

    Cook, David A; Sorensen, Kristi J; Nishimura, Rick A; Ommen, Steve R; Lloyd, Farrell J

    2015-01-01

    MayoExpert is a multifaceted information system integrated with the electronic medical record (EMR) across Mayo Clinic's multisite health system. It was developed as a technology-based solution to manage information, standardize clinical practice, and promote and document learning in clinical contexts. Features include urgent test result notifications; models illustrating expert-approved care processes; concise, expert-approved answers to frequently asked questions (FAQs); a directory of topic-specific experts; and a portfolio for provider licensure and credentialing. The authors evaluate MayoExpert's reach, effectiveness, adoption, implementation, and maintenance. Evaluation data sources included usage statistics, user surveys, and pilot studies. As of October 2013, MayoExpert was available at 94 clinical sites in 12 states and contained 1,368 clinical topics, answers to 7,640 FAQs, and 92 care process models. In 2012, MayoExpert was accessed at least once by 2,578/3,643 (71%) staff physicians, 900/1,374 (66%) midlevel providers, and 1,728/2,291 (75%) residents and fellows. In a 2013 survey of MayoExpert users with 536 respondents, all features were highly rated (≥67% favorable). More providers reported using MayoExpert to answer questions before/after than during patient visits (68% versus 36%). From November 2012 to April 2013, MayoExpert sent 1,660 notifications of new-onset atrial fibrillation and 1,590 notifications of prolonged QT. MayoExpert has become part of routine clinical and educational operations, and its care process models now define Mayo Clinic best practices. MayoExpert's infrastructure and content will continue to expand with improved templates and content organization, new care process models, additional notifications, better EMR integration, and improved support for credentialing activities.

  4. Engaging communication experts in a Delphi process to identify patient behaviors that could enhance communication in medical encounters

    PubMed Central

    2010-01-01

    Background The communication literature currently focuses primarily on improving physicians' verbal and non-verbal behaviors during the medical interview. The Four Habits Model is a teaching and research framework for physician communication that is based on evidence linking specific communication behaviors with processes and outcomes of care. The Model conceptualizes basic communication tasks as "Habits" and describes the sequence of physician communication behaviors during the clinical encounter associated with improved outcomes. Using the Four Habits Model as a starting point, we asked communication experts to identify the verbal communication behaviors of patients that are important in outpatient encounters. Methods We conducted a 4-round Delphi process with 17 international experts in communication research, medical education, and health care delivery. All rounds were conducted via the internet. In round 1, experts reviewed a list of proposed patient verbal communication behaviors within the Four Habits Model framework. The proposed patient verbal communication behaviors were identified based on a review of the communication literature. The experts could: approve the proposed list; add new behaviors; or modify behaviors. In rounds 2, 3, and 4, they rated each behavior for its fit (agree or disagree) with a particular habit. After each round, we calculated the percent agreement for each behavior and provided these data in the next round. Behaviors receiving more than 70% of experts' votes (either agree or disagree) were considered as achieving consensus. Results Of the 14 originally-proposed patient verbal communication behaviors, the experts modified all but 2, and they added 20 behaviors to the Model in round 1. In round 2, they were presented with 59 behaviors and 14 options to remove specific behaviors for rating. After 3 rounds of rating, the experts retained 22 behaviors. 
This set included behaviors such as asking questions, expressing preferences, and summarizing information. Conclusion The process identified communication tasks and verbal communication behaviors for patients similar to those outlined for physicians in the Four Habits Model. This represents an important step in building a single model that can be applied to teaching patients and physicians the communication skills associated with improved satisfaction and positive outcomes of care. PMID:20403173
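
    The consensus rule used in rounds 2-4 is simple to state in code: a behavior is retained or removed once more than 70% of experts cast the same vote:

    ```python
    def consensus(votes, cutoff=0.70):
        """A behavior reaches consensus when more than `cutoff` of the
        experts cast the same vote (either 'agree' or 'disagree')."""
        agree = votes.count("agree") / len(votes)
        disagree = votes.count("disagree") / len(votes)
        return max(agree, disagree) > cutoff
    ```

    With the study's 17 experts, 12 identical votes (70.6%) are just enough to settle a behavior, which is why several rounds were needed for the remaining items.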

  5. Development of experimental design approach and ANN-based models for determination of Cr(VI) ions uptake rate from aqueous solution onto the solid biodiesel waste residue.

    PubMed

    Shanmugaprakash, M; Sivakumar, V

    2013-11-01

    In the present work, the evaluation capacities of two optimization methodologies, RSM and ANN, were employed and compared for prediction of the Cr(VI) uptake rate using defatted pongamia oil cake (DPOC) in both batch and column mode. The influence of operating parameters was investigated through a central composite design (CCD) of RSM using Design Expert 8.0.7.1 software. The same data were fed as input to an ANN to train a multilayer feed-forward network with the back-propagation algorithm in MATLAB. The performance of the developed ANN models was compared with that of the RSM mathematical models for Cr(VI) uptake rate in terms of the coefficient of determination (R(2)), root mean square error (RMSE) and absolute average deviation (AAD). The estimated values confirm that ANN outperforms RSM, demonstrating the superiority of trained ANN models over RSM models in capturing the non-linear behavior of the given system. Copyright © 2013 Elsevier Ltd. All rights reserved.
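
    The three comparison metrics (R², RMSE, AAD) can be computed directly from observed and predicted uptake values; this sketch uses the standard formulas, with AAD expressed in percent (observed values must be nonzero):

    ```python
    def fit_metrics(observed, predicted):
        """Coefficient of determination (R^2), root mean square error
        (RMSE), and absolute average deviation (AAD, in percent)."""
        n = len(observed)
        mean_obs = sum(observed) / n
        ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        r2 = 1.0 - ss_res / ss_tot
        rmse = (ss_res / n) ** 0.5
        aad = 100.0 / n * sum(abs((p - o) / o) for o, p in zip(observed, predicted))
        return r2, rmse, aad
    ```

    Computing all three on held-out data, as the study does, guards against a model that merely memorizes the design points.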

  6. Mapping three-dimensional geological features from remotely-sensed images and digital elevation models

    NASA Astrophysics Data System (ADS)

    Morris, Kevin Peter

    Accurate mapping of geological structures is important in numerous applications, ranging from mineral exploration through to hydrogeological modelling. Remotely sensed data can provide synoptic views of study areas, enabling mapping of geological units within the area. Structural information may be derived from such data using standard manual photo-geologic interpretation techniques, although these are often inaccurate and incomplete. The aim of this thesis is, therefore, to compile a suite of automated and interactive computer-based analysis routines, designed to help the user map geological structure. These are examined and integrated in the context of an expert system. The data used in this study include Digital Elevation Model (DEM) and Airborne Thematic Mapper images, both with a spatial resolution of 5 m, for a 5 x 5 km area surrounding Llyn Cowlyd, Snowdonia, North Wales. The geology of this area comprises folded and faulted Ordovician sediments intruded throughout by dolerite sills, providing a stringent test for the automated and semi-automated procedures. The DEM is used to highlight geomorphological features which may represent surface expressions of the sub-surface geology. The DEM is created from digitized contours, for which kriging is found to provide the best interpolation routine, based on a number of quantitative measures. Lambertian shading and the creation of slope and change-of-slope datasets are shown to provide the most successful enhancement of DEMs, in terms of highlighting a range of key geomorphological features. The digital image data are used to identify rock outcrops as well as lithologically controlled features in the land cover. To this end, a series of standard spectral enhancements of the images is examined. In this respect, the least-correlated 3-band composite and a principal component composite are shown to give the best visual discrimination of geological and vegetation cover types.
Automatic edge detection (followed by line thinning and extraction) and manual interpretation techniques are used to identify a set of 'geological primitives' (linear or arc features representing lithological boundaries) within these data. Inclusion of the DEM data provides the three-dimensional co-ordinates of these primitives, enabling a least-squares fit to be employed to calculate dip and strike values, based, initially, on the assumption of a simple, linearly dipping structural model. A very large number of scene 'primitives' is identified using these procedures, only some of which have geological significance. Knowledge-based rules are therefore used to identify the relevant ones. For example, rules are developed to identify lake edges, forest boundaries, forest tracks, rock-vegetation boundaries, and areas of geomorphological interest. Confidence in the geological significance of some of the geological primitives is increased where they are found independently in both the DEM and remotely sensed data. The dip and strike values derived in this way are compared to information taken from the published geological map for this area, as well as measurements taken in the field. Many results are shown to correspond closely to those taken from the map and in the field, with an error of < 1°. These data and rules are incorporated into an expert system which, initially, produces a simple model of the geological structure. The system also provides a graphical user interface for manual control and interpretation, where necessary. Although the system currently only allows a relatively simple structural model (linearly dipping with faulting), in the future it will be possible to extend the system to model more complex features, such as anticlines, synclines, thrusts, nappes, and igneous intrusions.
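
The least-squares dip-and-strike calculation can be sketched by fitting a plane z = ax + by + c to the 3-D primitive coordinates via the normal equations. This is a minimal stand-in for the thesis procedure, assuming x is east and y is north:

```python
import math

def dip_and_strike(points):
    """Fit a plane z = a*x + b*y + c to 3-D boundary points by least
    squares, then convert to dip (degrees from horizontal) and dip
    direction (azimuth of steepest descent, degrees clockwise from
    north, with x = east and y = north)."""
    # Assemble the 3x3 normal equations (A^T A) u = A^T z for u = (a, b, c).
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for x, y, z in points:
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            v[i] += row[i] * z
    # Gaussian elimination (no pivoting; adequate for this sketch).
    for i in range(3):
        for j in range(i + 1, 3):
            f = M[j][i] / M[i][i]
            for k in range(3):
                M[j][k] -= f * M[i][k]
            v[j] -= f * v[i]
    u = [0.0] * 3
    for i in (2, 1, 0):
        u[i] = (v[i] - sum(M[i][k] * u[k] for k in range(i + 1, 3))) / M[i][i]
    a, b, _ = u
    dip = math.degrees(math.atan(math.hypot(a, b)))
    azimuth = math.degrees(math.atan2(-a, -b)) % 360.0  # down-dip direction
    return dip, azimuth
```

Strike follows directly as the azimuth perpendicular to the dip direction; folding and faulting break the single-plane assumption, which is why the thesis treats this as only the initial structural model.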

  7. Distributed Web-Based Expert System for Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar

    2005-01-01

    The simulation and modeling of launch operations is based on a representation of the organization of the operations suitable for experimenting with the physical, procedural, software, hardware and psychological aspects of space flight operations. The virtual test bed consists of a weather expert system to advise on the effect of weather on launch operations. It also simulates a toxic gas dispersion model and the risk impact on human health. Because all modeling and simulation is internet-based, it could reduce the cost of launch and range safety operations by enabling extensive research before a particular launch. Each model has an independent decision-making module to derive the best decision for launch.

  8. The SF3M approach to 3-D photo-reconstruction for non-expert users: application to a gully network

    NASA Astrophysics Data System (ADS)

    Castillo, C.; James, M. R.; Redel-Macías, M. D.; Pérez, R.; Gómez, J. A.

    2015-04-01

    3-D photo-reconstruction (PR) techniques have been successfully used to produce high resolution elevation models for different applications and over different spatial scales. However, innovative approaches are required to overcome some limitations that this technique may present in challenging scenarios. Here, we evaluate SF3M, a new graphical user interface for implementing a complete PR workflow based on freely available software (including external calls to VisualSFM and CloudCompare), in combination with a low-cost survey design for the reconstruction of a several-hundred-meters-long gully network. SF3M provided a semi-automated workflow for 3-D reconstruction requiring ~ 49 h (of which only 17% required operator assistance) to obtain a final gully network model of > 17 million points over a gully plan area of 4230 m2. We show that a walking itinerary along the gully perimeter using two lightweight automatic cameras (1 s time-lapse mode) and a 6 m-long pole is an efficient method for 3-D monitoring of gullies, at a low cost (about EUR 1000 for the field equipment) and with modest time requirements (~ 90 min for image collection). A mean error of 6.9 cm at the ground control points was found, mainly due to model deformations derived from the linear geometry of the gully and residual errors in camera calibration. The straightforward image collection and processing approach can be of great benefit for non-expert users working on gully erosion assessment.

  9. Proceedings of the international conference on cybernetics and society

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    This book presents the papers given at a conference on artificial intelligence, expert systems and knowledge bases. Topics considered at the conference included automating expert system development, modeling expert systems, causal maps, data covariances, robot vision, image processing, multiprocessors, parallel processing, VLSI structures, man-machine systems, human factors engineering, cognitive decision analysis, natural language, computerized control systems, and cybernetics.

  10. ICADS: A cooperative decision making model with CLIPS experts

    NASA Technical Reports Server (NTRS)

    Pohl, Jens; Myers, Leonard

    1991-01-01

    A cooperative decision making model is described which is comprised of six concurrently executing domain experts coordinated by a blackboard control expert. The focus application field is architectural design, and the domain experts represent consultants in the area of daylighting, noise control, structural support, cost estimating, space planning, and climate responsiveness. Both the domain experts and the blackboard were implemented as production systems, using an enhanced version of the basic CLIPS package. Acting in unison as an Expert Design Advisor, the domain and control experts react to the evolving design solution progressively developed by the user in a 2-D CAD drawing environment. A Geometry Interpreter maps each drawing action taken by the user to real world objects, such as spaces, walls, windows, and doors. These objects, endowed with geometric and nongeometric attributes, are stored as frames in a semantic network. Object descriptions are derived partly from the geometry of the drawing environment and partly from knowledge bases containing prototypical, generalized information about the building type and site conditions under consideration.

  11. A Cognitive Architecture for Human Performance Process Model Research

    DTIC Science & Technology

    1992-11-01

    individually defined, updatable world representation which is a description of the world as the operator knows it. It contains rules for decisions, an...operate it), and rules of engagement (knowledge about the operator’s expected behavior). The HPP model works in the following way. Information enters...based models depict the problem-solving processes of experts. The experts’ knowledge is represented in symbol structures, along with rules for

  12. Impact of Linear Programming on Computer Development.

    DTIC Science & Technology

    1985-06-01

    soon see. It all really began when Dal Hitchcock, an advisor to General Rawlings, the Air Comptroller, and Marshall Wood, an expert on military...unifying principles. Of course, I thought first to try to adapt the Leontief Input-Output Model. But Marshall and I also talked about certain...still with the Ford Motor Company. I told him about my presentation to General Rawlings on the possibility of a "program Integrator" for planning and

  13. Designing an Agent-Based Model Using Group Model Building: Application to Food Insecurity Patterns in a U.S. Midwestern Metropolitan City.

    PubMed

    Koh, Keumseok; Reno, Rebecca; Hyder, Ayaz

    2018-04-01

    Recent advances in computing resources have increased interest in systems modeling and population health. While group model building (GMB) has been effectively applied in developing system dynamics (SD) models, few studies have used GMB for developing an agent-based model (ABM). This article explores the use of a GMB approach to develop an ABM focused on food insecurity. In our GMB workshops, we modified a set of the standard GMB scripts to develop and validate an ABM in collaboration with local experts and stakeholders. Based on this experience, we learned that GMB is a useful collaborative modeling platform for modelers and community experts to address local population health issues. We also provide suggestions for increasing the use of the GMB approach to develop rigorous, useful, and validated ABMs.

  14. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystems. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, one satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations, and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data are available.
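
    The RSS position error metric mentioned above can be sketched directly from a covariance matrix. A minimal illustration, assuming the position states occupy the first three diagonal entries; the filter layout and numbers are hypothetical, not the AUTOCREW design:

```python
import math

def rss_position_error(P):
    """RSS position error from a Kalman filter covariance matrix P,
    assuming the 3-D position states occupy the first three rows/cols."""
    return math.sqrt(P[0][0] + P[1][1] + P[2][2])

# Hypothetical covariance with 9, 16, and 25 m^2 position variances.
P = [[9.0, 0, 0], [0, 16.0, 0], [0, 0, 25.0]]
print(rss_position_error(P))  # sqrt(50) ≈ 7.07 m
```

A decision metric like this lets the expert system rank candidate navaid configurations by a single scalar.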

  15. Interictal epileptiform discharge characteristics underlying expert interrater agreement.

    PubMed

    Bagheri, Elham; Dauwels, Justin; Dean, Brian C; Waters, Chad G; Westover, M Brandon; Halford, Jonathan J

    2017-10-01

    The presence of interictal epileptiform discharges (IED) in the electroencephalogram (EEG) is a key finding in the medical workup of a patient with suspected epilepsy. However, inter-rater agreement (IRA) regarding the presence of IED is imperfect, leading to incorrect and delayed diagnoses. An improved understanding of which IED attributes mediate expert IRA might help in developing automatic methods for IED detection able to emulate the abilities of experts. Therefore, using a set of IED scored by a large number of experts, we set out to determine which attributes of IED predict expert agreement regarding the presence of IED. IED were annotated on a 5-point scale by 18 clinical neurophysiologists within 200 30-s EEG segments from recordings of 200 patients. 5538 signal analysis features were extracted from the waveforms, including wavelet coefficients, morphological features, signal energy, nonlinear energy operator response, electrode location, and spectrogram features. Feature selection was performed by applying elastic net regression and support vector regression (SVR) was applied to predict expert opinion, with and without the feature selection procedure and with and without several types of signal normalization. Multiple types of features were useful for predicting expert annotations, but particular types of wavelet features performed best. Local EEG normalization also enhanced best model performance. As the size of the group of EEGers used to train the models was increased, the performance of the models leveled off at a group size of around 11. The features that best predict inter-rater agreement among experts regarding the presence of IED are wavelet features, using locally standardized EEG. Our models for predicting expert opinion based on EEGer's scores perform best with a large group of EEGers (more than 10). 
By examining a large group of EEG signal analysis features we found that wavelet features with certain wavelet basis functions performed best to identify IEDs. Local normalization also improves predictability, suggesting the importance of IED morphology over amplitude-based features. Although most IED detection studies in the past have used opinion from three or fewer experts, our study suggests a "wisdom of the crowd" effect, such that pooling over a larger number of expert opinions produces a better correlation between expert opinion and objectively quantifiable features of the EEG. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
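
    As a rough illustration of the pipeline described above (feature selection followed by regression against pooled expert scores), the sketch below substitutes a simple correlation filter for elastic net and ordinary least squares for SVR; the data, feature count, and scores are synthetic stand-ins, not the study's EEG features:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: 200 candidate IEDs x 50 signal features, with the
# mean expert annotation (5-point scale) depending on two features.
X = rng.normal(size=(200, 50))
y = 3.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Step 1 (simplified feature selection): keep the features most
# correlated with the mean expert score; the paper used elastic net.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
keep = np.argsort(corr)[-5:]

# Step 2 (simplified regression): ordinary least squares in place of SVR.
A = np.column_stack([np.ones(len(y)), X[:, keep]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r = np.corrcoef(pred, y)[0, 1]  # correlation with "expert opinion"
print(keep, round(r, 3))
```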

  16. An SSME High Pressure Oxidizer Turbopump diagnostic system using G2 real-time expert system

    NASA Technical Reports Server (NTRS)

    Guo, Ten-Huei

    1991-01-01

    An expert system which diagnoses various seal leakage faults in the High Pressure Oxidizer Turbopump of the SSME was developed using G2 real-time expert system. Three major functions of the software were implemented: model-based data generation, real-time expert system reasoning, and real-time input/output communication. This system is proposed as one module of a complete diagnostic system for the SSME. Diagnosis of a fault is defined as the determination of its type, severity, and likelihood. Since fault diagnosis is often accomplished through the use of heuristic human knowledge, an expert system based approach has been adopted as a paradigm to develop this diagnostic system. To implement this approach, a software shell which can be easily programmed to emulate the human decision process, the G2 Real-Time Expert System, was selected. Lessons learned from this implementation are discussed.

  17. An SSME high pressure oxidizer turbopump diagnostic system using G2(TM) real-time expert system

    NASA Technical Reports Server (NTRS)

    Guo, Ten-Huei

    1991-01-01

    An expert system which diagnoses various seal leakage faults in the High Pressure Oxidizer Turbopump of the SSME was developed using G2(TM) real-time expert system. Three major functions of the software were implemented: model-based data generation, real-time expert system reasoning, and real-time input/output communication. This system is proposed as one module of a complete diagnostic system for Space Shuttle Main Engine. Diagnosis of a fault is defined as the determination of its type, severity, and likelihood. Since fault diagnosis is often accomplished through the use of heuristic human knowledge, an expert system based approach was adopted as a paradigm to develop this diagnostic system. To implement this approach, a software shell which can be easily programmed to emulate the human decision process, the G2 Real-Time Expert System, was selected. Lessons learned from this implementation are discussed.

  18. A computationally efficient method for incorporating spike waveform information into decoding algorithms.

    PubMed

    Ventura, Valérie; Todorova, Sonia

    2015-05-01

    Spike-based brain-computer interfaces (BCIs) have the potential to restore motor ability to people with paralysis and amputation, and have shown impressive performance in the lab. To transition BCI devices from the lab to the clinic, decoding must proceed automatically and in real time, which prohibits the use of algorithms that are computationally intensive or require manual tweaking. A common choice is to avoid spike sorting and treat the signal on each electrode as if it came from a single neuron, which is fast, easy, and therefore desirable for clinical use. But this approach ignores the kinematic information provided by individual neurons recorded on the same electrode. The contribution of this letter is a linear decoding model that extracts kinematic information from individual neurons without spike-sorting the electrode signals. The method relies on modeling sample averages of waveform features as functions of kinematics, which is automatic and requires minimal data storage and computation. In offline reconstruction of arm trajectories of a nonhuman primate performing reaching tasks, the proposed method performs as well as decoders based on expert manual and automatic spike sorting.
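
    A minimal sketch of the idea of decoding kinematics linearly from unsorted per-electrode waveform-feature averages; the linear tuning model, dimensions, and noise levels below are invented for illustration, not taken from the letter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: per time bin, each electrode yields the sample average of a
# spike waveform feature (e.g. amplitude); this average is modeled as a
# linear function of the kinematic state (here 2-D hand velocity).
T, E = 500, 8                        # time bins, electrodes
v = rng.normal(size=(T, 2))          # hypothetical velocities
B = rng.normal(size=(2, E))          # tuning of feature averages to velocity
F = v @ B + rng.normal(scale=0.2, size=(T, E))  # observed feature averages

# Linear decode: least-squares map from feature averages back to velocity.
W, *_ = np.linalg.lstsq(F, v, rcond=None)
v_hat = F @ W
err = np.mean((v_hat - v) ** 2)
print(round(err, 4))
```

No per-spike sorting is required: only the bin-wise feature averages enter the decoder.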

  19. Assessing animal welfare in sow herds using data on meat inspection, medication and mortality.

    PubMed

    Knage-Rasmussen, K M; Rousing, T; Sørensen, J T; Houe, H

    2015-03-01

    This paper aims to contribute to the development of a cost-effective alternative to expensive on-farm animal-based welfare assessment systems. The objective of the study was to design an animal welfare index based on central database information (DBWI), and to validate it against an animal welfare index based on on-farm animal-based measurements (AWI). Data on 63 Danish sow herds with herd sizes of 80 to 2500 sows and an average herd size of 501 were collected from three central databases containing: meat inspection data collected at animal level in the abattoir, mortality data at herd level from the rendering plants of DAKA, and medicine records at both herd and animal group level (sow with piglets, weaners or finishers) from the central database Vetstat. Selected measurements taken from these central databases were used to construct the DBWI. The relative welfare impacts of both individual database measurements and the databases overall were assigned in consultation with a panel consisting of 12 experts. The experts were drawn from production advisory activities, animal science and, in one case, an animal welfare organization. The expert panel weighted each measurement on a scale from 1 (not important) to 5 (very important). The experts also gave opinions on the relative weightings of the measurements within each of the three databases, and on the relative weight of each database in the DBWI. On the basis of this, the aggregated DBWI was normalized. The aggregation of AWI was based on a weighted summary of herd prevalences of 20 clinical and behavioural measurements originating from a 1-day data collection. AWI did not show linear dependency on DBWI. This suggests that DBWI is not suited to replace an animal welfare index using on-farm animal-based measurements.
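
    The aggregation scheme described above, expert importance weights per measurement combined with a relative weight per source database, then normalized, can be sketched as follows. All indicator names, values, and weights are illustrative placeholders, not the study's actual measurements:

```python
# Hypothetical herd-level indicators, scaled to [0, 1] (1 = worst).
herd = {
    "abattoir": {"lung_lesions": 0.30, "tail_damage": 0.10},
    "mortality": {"sow_mortality": 0.20},
    "medicine": {"antibiotic_use": 0.40},
}
expert_weight = {  # expert importance, 1 (not important) .. 5 (very important)
    "lung_lesions": 4, "tail_damage": 3, "sow_mortality": 5, "antibiotic_use": 4,
}
db_weight = {"abattoir": 0.4, "mortality": 0.35, "medicine": 0.25}

def dbwi(herd):
    """Weighted, normalized database welfare index in [0, 1]."""
    total, norm = 0.0, 0.0
    for db, measures in herd.items():
        for m, value in measures.items():
            w = db_weight[db] * expert_weight[m]
            total += w * value
            norm += w
    return total / norm

print(round(dbwi(herd), 3))  # ≈ 0.243 for these toy numbers
```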

  20. [Study on Information Extraction of Clinic Expert Information from Hospital Portals].

    PubMed

    Zhang, Yuanpeng; Dong, Jiancheng; Qian, Danmin; Geng, Xingyun; Wu, Huiqun; Wang, Li

    2015-12-01

    Clinic expert information provides important references for residents in need of hospital care. Usually, such information is hidden in the deep web and cannot be directly indexed by search engines. To extract clinic expert information from the deep web, the first challenge is to make a judgment on forms. This paper proposes a novel method based on a domain model, which is a tree structure constructed by the attributes of search interfaces. With this model, search interfaces can be classified to a domain and filled in with domain keywords. Another challenge is to extract information from the returned web pages indexed by search interfaces. To filter the noise information on a web page, a block importance model is proposed. The experiment results indicated that the domain model yielded a precision 10.83% higher than that of the rule-based method, whereas the block importance model yielded an F₁ measure 10.5% higher than that of the XPath method.

  1. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
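
    The weights-of-evidence calculation underlying such models can be illustrated compactly. The sketch below computes W+ and W- for one binary evidence layer from raster cell counts; the counts are hypothetical, not the San Diego data:

```python
import math

def weights_of_evidence(n_bd, n_b, n_d, n_total):
    """W+ and W- for a binary evidence layer B (e.g. 'near a road') and
    event D (fire ignition), from cell counts: n_bd cells with both,
    n_b with B, n_d with D, n_total overall."""
    p_b_d = n_bd / n_d                       # P(B | D)
    p_b_nd = (n_b - n_bd) / (n_total - n_d)  # P(B | ~D)
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus

# Hypothetical counts: 10,000 cells, 200 ignitions, 3,000 near-road
# cells, 120 ignitions near roads.
wp, wm = weights_of_evidence(120, 3000, 200, 10000)
print(round(wp, 3), round(wm, 3), round(wp - wm, 3))  # contrast = W+ - W-
```

A positive contrast (W+ - W-) indicates the layer is spatially associated with ignitions, which is how a factor such as road proximity enters the model.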

  2. Expert operator's associate: A knowledge based system for spacecraft control

    NASA Technical Reports Server (NTRS)

    Nielsen, Mogens; Grue, Klaus; Lecouat, Francois

    1991-01-01

    The Expert Operator's Associate (EOA) project is presented, which studies the applicability of expert systems for day-to-day space operations. A prototype expert system is developed which operates on-line with an existing spacecraft control system at the European Space Operations Centre, and functions as an 'operator's assistant' in controlling satellites. The prototype is demonstrated using an existing real-time simulation model of the MARECS-B2 telecommunication satellite. By developing a prototype system, the extent to which reliability and effectiveness of operations can be enhanced by AI-based support is examined. In addition the study examines the questions of acquisition and representation of the 'knowledge' for such systems, and the feasibility of 'migration' of some (currently) ground-based functions into future spaceborne autonomous systems.

  3. Traffic simulation in regional modeling : application to interstate infrastructure near the Toledo sea port.

    DOT National Transportation Integrated Search

    2012-06-01

    A small team of university-based transportation system experts and simulation experts has been : assembled to develop, test, and apply an approach to assessing road infrastructure capacity using : micro traffic simulation supported by publically avai...

  4. Intrusion Detection Systems with Live Knowledge System

    DTIC Science & Technology

    2016-05-31

    Ripple-down Rule (RDR) to maintain the knowledge from human experts with knowledge base generated by the Induct RDR, which is a machine-learning based RDR...propose novel approach that uses Ripple-down Rule (RDR) to maintain the knowledge from human experts with knowledge base generated by the Induct RDR...detection model by applying Induct RDR approach. The proposed induct RDR (Ripple Down Rules) approach allows to acquire the phishing detection

  5. A center for commercial development of space: Real-time satellite mapping. Remote sensing-based agricultural information expert system

    NASA Technical Reports Server (NTRS)

    Hadipriono, Fabian C.; Diaz, Carlos F.; Merritt, Earl S.

    1989-01-01

    The research project results in a powerful yet user-friendly CROPCAST expert system for use by a client to determine the crop yield production of a certain crop field. The study is based on the facts that heuristic assessment and decision making in agriculture are significant and dominate much of agribusiness. Transfer of the expert knowledge concerning remote sensing based crop yield production into a specific expert system is the key program in this study. A knowledge base consisting of a root frame, CROP-YIELD-FORECAST, and four subframes, namely, SATELLITE, PLANT-PHYSIOLOGY, GROUND, and MODEL, was developed to accommodate the production rules obtained from the domain expert. The expert system shell Personal Consultant Plus version 4.0 was used for this purpose. An external geographic program was integrated into the system. This project is the first part of a completely built expert system. The study reveals that much effort was given to the development of the rules. Such effort is inevitable if workable, efficient, and accurate rules are desired. Furthermore, abundant help statements and graphics were included. Internal and external display routines add to the visual capability of the system. The work results in a useful tool for the client for making decisions on crop yield production.

  6. A Thermal Expert System (TEXSYS) development overview - AI-based control of a Space Station prototype thermal bus

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Hack, E. C.

    1990-01-01

    A knowledge-based control system for real-time control and fault detection, isolation and recovery (FDIR) of a prototype two-phase Space Station Freedom external thermal control system (TCS) is discussed in this paper. The Thermal Expert System (TEXSYS) has been demonstrated in recent tests to be capable of both fault anticipation and detection and real-time control of the thermal bus. Performance requirements were achieved by using a symbolic control approach, layering model-based expert system software on a conventional numerical data acquisition and control system. The model-based capabilities of TEXSYS were shown to be advantageous during software development and testing. One representative example is given from on-line TCS tests of TEXSYS. The integration and testing of TEXSYS with a live TCS testbed provides some insight on the use of formal software design, development and documentation methodologies to qualify knowledge-based systems for on-line or flight applications.

  7. The blackboard model - A framework for integrating multiple cooperating expert systems

    NASA Technical Reports Server (NTRS)

    Erickson, W. K.

    1985-01-01

    The use of an artificial intelligence (AI) architecture known as the blackboard model is examined as a framework for designing and building distributed systems requiring the integration of multiple cooperating expert systems (MCXS). Aerospace vehicles provide many examples of potential systems, ranging from commercial and military aircraft to spacecraft such as satellites, the Space Shuttle, and the Space Station. One such system, free-flying, spaceborne telerobots to be used in construction, servicing, inspection, and repair tasks around NASA's Space Station, is examined. The major difficulties found in designing and integrating the individual expert system components necessary to implement such a robot are outlined. The blackboard model, a general expert system architecture which seems to address many of the problems found in designing and building such a system, is discussed. A progress report on a prototype system under development called DBB (Distributed BlackBoard model) is given. The prototype will act as a testbed for investigating the feasibility, utility, and efficiency of MCXS-based designs developed under the blackboard model.
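
    The blackboard pattern itself can be shown in a few lines: independent knowledge sources watch a shared data store and post contributions when their trigger condition holds. The "experts" and attributes below are toy placeholders, not the DBB prototype's modules:

```python
# Minimal blackboard sketch: a shared store plus domain experts that
# post to it; a control loop invokes each expert in turn.
class Blackboard:
    def __init__(self):
        self.data = {}

def daylighting_expert(bb):
    # Fires once window geometry is on the blackboard.
    if "window_area" in bb.data and "daylight_ok" not in bb.data:
        bb.data["daylight_ok"] = bb.data["window_area"] >= 2.0

def cost_expert(bb):
    # Hypothetical cost rule keyed off the same shared fact.
    if "window_area" in bb.data and "cost" not in bb.data:
        bb.data["cost"] = 500 + 300 * bb.data["window_area"]

bb = Blackboard()
bb.data["window_area"] = 3.0  # posted by, e.g., a CAD front end
for expert in (daylighting_expert, cost_expert):
    expert(bb)
print(bb.data["daylight_ok"], bb.data["cost"])  # True 1400.0
```

The point of the architecture is that experts never call each other directly; all coordination happens through the shared store.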

  8. Predicting Mycobacterium tuberculosis Complex Clades Using Knowledge-Based Bayesian Networks

    PubMed Central

    Bennett, Kristin P.

    2014-01-01

    We develop a novel approach for incorporating expert rules into Bayesian networks for classification of Mycobacterium tuberculosis complex (MTBC) clades. The proposed knowledge-based Bayesian network (KBBN) treats sets of expert rules as prior distributions on the classes. Unlike prior knowledge-based support vector machine approaches which require rules expressed as polyhedral sets, KBBN directly incorporates the rules without any modification. KBBN uses data to refine rule-based classifiers when the rule set is incomplete or ambiguous. We develop a predictive KBBN model for 69 MTBC clades found in the SITVIT international collection. We validate the approach using two testbeds that model knowledge of the MTBC obtained from two different experts and large DNA fingerprint databases to predict MTBC genetic clades and sublineages. These models represent strains of MTBC using high-throughput biomarkers called spacer oligonucleotide types (spoligotypes), since these are routinely gathered from MTBC isolates of tuberculosis (TB) patients. Results show that incorporating rules into problems can drastically increase classification accuracy if data alone are insufficient. The SITVIT KBBN is publicly available for use on the World Wide Web. PMID:24864238
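
    The core idea of treating expert rules as class priors can be sketched with a toy naive-Bayes-style classifier (a drastic simplification of a Bayesian network); the clades, rule, and per-spacer probabilities below are illustrative only, not the SITVIT model:

```python
import math

classes = ["Beijing", "Haarlem"]

def rule_prior(spacers):
    """Illustrative expert rule as a prior: absence of spacers 1-34
    bumps the prior toward one class."""
    if not any(spacers[:34]):
        return {"Beijing": 0.9, "Haarlem": 0.1}
    return {"Beijing": 0.5, "Haarlem": 0.5}

# Toy per-class probability that each of 43 spacers is present.
likelihood = {
    "Beijing": [0.05] * 34 + [0.9] * 9,
    "Haarlem": [0.8] * 43,
}

def classify(spacers):
    # Combine rule-based prior with data likelihoods in log space.
    score = {}
    for c in classes:
        logp = math.log(rule_prior(spacers)[c])
        for s, p in zip(spacers, likelihood[c]):
            logp += math.log(p if s else 1 - p)
        score[c] = logp
    return max(score, key=score.get)

print(classify([0] * 34 + [1] * 9))
```

When the rule and the data disagree, the likelihood terms can override the prior, which mirrors how KBBN refines incomplete or ambiguous rule sets with data.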

  9. Expert Maintenance Advisor Development for Navy Shipboard Systems

    DTIC Science & Technology

    1994-01-01

    Estoril (EDEN) Chair: Xavier Alaman, Instituto de Ingenieria del Conocimiento, SPAIN "A Model of Handling Uncertainty in Expert Systems," 01 Zhao...for Supervisory Process Control," Xavier Alaman, Instituto de Ingenieria del Conocimiento, SPAIN - (L) INTEGRATED KNOWLEDGE BASED SYSTEMS IN POWER

  10. Clinic expert information extraction based on domain model and block importance model.

    PubMed

    Zhang, Yuanpeng; Wang, Li; Qian, Danmin; Geng, Xingyun; Yao, Dengfu; Dong, Jiancheng

    2015-11-01

    To extract expert clinic information from the Deep Web, there are two challenges to face. The first one is to make a judgment on forms. A novel method based on a domain model, which is a tree structure constructed from the attributes of query interfaces, is proposed. With this model, query interfaces can be classified to a domain and filled in with domain keywords. Another challenge is to extract information from response Web pages indexed by query interfaces. To filter the noisy information on a Web page, a block importance model is proposed, in which both content and spatial features are taken into account. The experimental results indicate that the domain model yields a precision 4.89% higher than that of the rule-based method, whereas the block importance model yields an F1 measure 10.5% higher than that of the XPath method. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. A new scoring system in Cystic Fibrosis: statistical tools for database analysis - a preliminary report.

    PubMed

    Hafen, G M; Hurst, C; Yearwood, J; Smith, J; Dzalilov, Z; Robinson, P J

    2008-10-05

    Cystic fibrosis is the most common fatal genetic disorder in the Caucasian population. Scoring systems for assessment of cystic fibrosis disease severity have been used for almost 50 years, without being adapted to the milder phenotype of the disease in the 21st century. The aim of this current project is to develop a new scoring system using a database and employing various statistical tools. This study protocol reports the development of the statistical tools in order to create such a scoring system. The evaluation is based on the cystic fibrosis database from the cohort at the Royal Children's Hospital in Melbourne. Initially, unsupervised clustering of all data records was performed using a range of clustering algorithms, in particular incremental clustering algorithms. The clusters obtained were characterised using rules from decision trees and the results examined by clinicians. In order to obtain a clearer definition of classes, expert opinion on each individual's clinical severity was sought. After data preparation, including expert opinion of an individual's clinical severity on a 3-point scale (mild, moderate and severe disease), two multivariate techniques were used throughout the analysis to establish a method that would have a better success in feature selection and model derivation: 'Canonical Analysis of Principal Coordinates' (CAP) and 'Linear Discriminant Analysis' (DA). A 3-step procedure was performed with (1) selection of features, (2) extraction of 5 severity classes out of the 3 severity classes defined per expert opinion and (3) establishment of calibration datasets. 
(1) Feature selection: CAP has a more effective "modelling" focus than DA. (2) Extraction of 5 severity classes: after variables were identified as important in discriminating contiguous CF severity groups on the 3-point scale as mild/moderate and moderate/severe, Discriminant Function (DF) was used to determine the new groups: mild, intermediate moderate, moderate, intermediate severe and severe disease. (3) Generated confusion tables showed a misclassification rate of 19.1% for males and 16.5% for females, with a majority of misallocations into adjacent severity classes, particularly for males. Our preliminary data show that using CAP for detection of selection features and linear DA to derive the actual model in a CF database might be helpful in developing a scoring system. However, there are several limitations; in particular, more data entry points are needed to finalize a score, and the statistical tools have to be further refined and validated by re-running the statistical methods in the larger dataset.
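
    The discriminant-analysis step can be illustrated with a two-class Fisher linear discriminant on synthetic data; the features, group means, and scales below are invented stand-ins, not the Melbourne cohort's variables:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for a mild-vs-severe split on two clinical-style features.
n = 100
mild = rng.normal(loc=[90, 95], scale=5, size=(n, 2))
severe = rng.normal(loc=[60, 70], scale=5, size=(n, 2))

mu1, mu2 = mild.mean(axis=0), severe.mean(axis=0)
Sw = np.cov(mild.T) + np.cov(severe.T)   # pooled within-class scatter
w = np.linalg.solve(Sw, mu1 - mu2)       # Fisher discriminant direction
threshold = w @ (mu1 + mu2) / 2          # midpoint decision boundary

def predict(x):
    return "mild" if w @ x > threshold else "severe"

acc = (sum(predict(x) == "mild" for x in mild)
       + sum(predict(x) == "severe" for x in severe)) / (2 * n)
print(acc)
```

Projecting onto the discriminant axis also gives a continuous score, which is how intermediate severity classes can be carved out between the original groups.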

  12. Development and validation of a sensor- and expert model-based training system for laparoscopic surgery: the iSurgeon.

    PubMed

    Kowalewski, Karl-Friedrich; Hendrie, Jonathan D; Schmidt, Mona W; Garrow, Carly R; Bruckner, Thomas; Proctor, Tanja; Paul, Sai; Adigüzel, Davud; Bodenstedt, Sebastian; Erben, Andreas; Kenngott, Hannes; Erben, Young; Speidel, Stefanie; Müller-Stich, Beat P; Nickel, Felix

    2017-05-01

    Training and assessment outside of the operating room is crucial for minimally invasive surgery due to steep learning curves. Thus, we have developed and validated the sensor- and expert model-based laparoscopic training system, the iSurgeon. Participants of different experience levels (novice, intermediate, expert) performed four standardized laparoscopic knots. Instruments and surgeons' joint motions were tracked with an NDI Polaris camera and Microsoft Kinect v1. With frame-by-frame image analysis, the key steps of suturing and knot tying were identified and registered with motion data. Construct validity, concurrent validity, and test-retest reliability were analyzed. The Objective Structured Assessment of Technical Skills (OSATS) was used as the gold standard for concurrent validity. The system showed construct validity by discrimination between experience levels by parameters such as time (novice = 442.9 ± 238.5 s; intermediate = 190.1 ± 50.3 s; expert = 115.1 ± 29.1 s; p < 0.001), total path length (novice = 18,817 ± 10318 mm; intermediate = 9995 ± 3286 mm; expert = 7265 ± 2232 mm; p < 0.001), average speed (novice = 42.9 ± 8.3 mm/s; intermediate = 52.7 ± 11.2 mm/s; expert = 63.6 ± 12.9 mm/s; p < 0.001), angular path (novice = 20,573 ± 12,611°; intermediate = 8652 ± 2692°; expert = 5654 ± 1746°; p < 0.001), number of movements (novice = 2197 ± 1405; intermediate = 987 ± 367; expert = 743 ± 238; p < 0.001), number of movements per second (novice = 5.0 ± 1.4; intermediate = 5.2 ± 1.5; expert = 6.6 ± 1.6; p = 0.025), and joint angle range (for different axes and joints all p < 0.001). Concurrent validity of OSATS and iSurgeon parameters was established. Test-retest reliability was given for 7 out of 8 parameters. The key steps "wrapping the thread around the instrument" and "needle positioning" were most difficult to learn. Validity and reliability of the self-developed sensor-and expert model-based laparoscopic training system "iSurgeon" were established. 
Using multiple parameters proved more reliable than single metric parameters. Wrapping the thread around the instrument and needle positioning were identified as difficult key steps for laparoscopic suturing and knot tying. The iSurgeon could generate automated real-time feedback based on expert models, which may result in shorter learning curves for laparoscopic tasks. Our next steps will be the implementation and evaluation of full procedural training in an experimental model.
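
    Motion parameters such as total path length and average speed are simple functions of the tracked instrument positions. A minimal sketch on a synthetic trajectory (the sampling rate and coordinates are invented, not the NDI Polaris data):

```python
import math

def motion_metrics(positions, hz):
    """Total path length (mm) and average speed (mm/s) from tracked
    3-D tip positions sampled at a fixed rate hz."""
    steps = [math.dist(a, b) for a, b in zip(positions, positions[1:])]
    path_length = sum(steps)
    duration = (len(positions) - 1) / hz
    return path_length, path_length / duration

# Synthetic straight-line trajectory: 2 mm per sample, 100 steps.
traj = [(t * 2.0, 0.0, 0.0) for t in range(101)]
length, speed = motion_metrics(traj, hz=30)
print(length, round(speed, 1))  # 200 mm over 100/30 s → 60 mm/s
```

Novice and expert trajectories differ mainly in how much extra path length and how many direction changes accumulate for the same task.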

  13. Elicitation of neurological knowledge with argument-based machine learning.

    PubMed

    Groznik, Vida; Guid, Matej; Sadikov, Aleksander; Možina, Martin; Georgiev, Dejan; Kragelj, Veronika; Ribarič, Samo; Pirtošek, Zvezdan; Bratko, Ivan

    2013-02-01

    The paper describes the use of expert's knowledge in practice and the efficiency of a recently developed technique called argument-based machine learning (ABML) in the knowledge elicitation process. We are developing a neurological decision support system to help the neurologists differentiate between three types of tremors: Parkinsonian, essential, and mixed tremor (comorbidity). The system is intended to act as a second opinion for the neurologists, and most importantly to help them reduce the number of patients in the "gray area" that require a very costly further examination (DaTSCAN). We strive to elicit comprehensible and medically meaningful knowledge in such a way that it does not come at the cost of diagnostic accuracy. To alleviate the difficult problem of knowledge elicitation from data and domain experts, we used ABML. ABML guides the expert to explain critical special cases which cannot be handled automatically by machine learning. This very efficiently reduces the expert's workload, and combines expert's knowledge with learning data. 122 patients were enrolled into the study. The classification accuracy of the final model was 91%. Equally important, the initial and the final models were also evaluated for their comprehensibility by the neurologists. All 13 rules of the final model were deemed as appropriate to be able to support its decisions with good explanations. The paper demonstrates ABML's advantage in combining machine learning and expert knowledge. The accuracy of the system is very high with respect to the current state-of-the-art in clinical practice, and the system's knowledge base is assessed to be very consistent from a medical point of view. This opens up the possibility to use the system also as a teaching tool. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. External validation of EPIWIN biodegradation models.

    PubMed

    Posthumus, R; Traas, T P; Peijnenburg, W J G M; Hulzebos, E M

    2005-01-01

    The BIOWIN biodegradation models were evaluated for their suitability for regulatory purposes. BIOWIN includes the linear and non-linear BIODEG and MITI models for estimating the probability of rapid aerobic biodegradation, and an expert survey model for estimating primary and ultimate biodegradation. Experimental biodegradation data for 110 newly notified substances were compared with the estimations of the different models. The models were applied separately and in combination to determine which model(s) performed best. The results of this study were compared with the results of other validation studies and other biodegradation models. The BIOWIN models predict not-readily biodegradable substances with high accuracy, in contrast to readily biodegradable ones. In view of the high environmental concern over persistent chemicals, and given the large number of not-readily biodegradable chemicals compared to readily biodegradable ones, a model is preferred that gives a minimum of false positives without a correspondingly high percentage of false negatives. A combination of the BIOWIN models (BIOWIN2 or BIOWIN6) showed the highest predictive value for not-ready biodegradability. However, the highest overall predictivity with the lowest percentage of false predictions was achieved by applying BIOWIN3 (pass level 2.75) together with BIOWIN6.
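The combination rule described above, treating a substance as readily biodegradable only when both BIOWIN3 (pass level 2.75) and BIOWIN6 agree, can be sketched as follows. This is an illustrative sketch, not EPIWIN code; the BIOWIN6 pass criterion of 0.5 is an assumed probability cutoff.

```python
def predict_ready_biodegradable(biowin3_score, biowin6_prob,
                                biowin3_pass=2.75, biowin6_pass=0.5):
    """Classify a substance as readily biodegradable only when both
    models agree (the combination reported to give the lowest
    percentage of false predictions)."""
    return biowin3_score >= biowin3_pass and biowin6_prob > biowin6_pass

# A substance flagged by only one model is treated as not readily
# biodegradable, which minimises false positives.
print(predict_ready_biodegradable(3.1, 0.8))   # both models pass
print(predict_ready_biodegradable(3.1, 0.2))   # BIOWIN6 fails
```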

  15. Comparing Habitat Suitability and Connectivity Modeling Methods for Conserving Pronghorn Migrations

    PubMed Central

    Poor, Erin E.; Loucks, Colby; Jakes, Andrew; Urban, Dean L.

    2012-01-01

    Terrestrial long-distance migrations are declining globally: in North America, nearly 75% have been lost. Yet there has been limited research comparing habitat suitability and connectivity models to identify migration corridors across increasingly fragmented landscapes. Here we use pronghorn (Antilocapra americana) migrations in prairie habitat to compare two types of models that identify habitat suitability: maximum entropy (Maxent) and expert-based (Analytic Hierarchy Process). We used distance to wells, distance to water, NDVI, land cover, distance to roads, terrain shape and fence presence to parameterize the models. We then used the output of these models as cost surfaces to compare two common connectivity models, least-cost modeling (LCM) and circuit theory. Using pronghorn movement data from spring and fall migrations, we identified potential migration corridors by combining each habitat suitability model with each connectivity model. The best performing model combination was Maxent with LCM corridors across both seasons. Maxent out-performed expert-based habitat suitability models for both spring and fall migrations. However, expert-based corridors can perform relatively well and are a cost-effective alternative if species location data are unavailable. Corridors created using LCM out-performed circuit theory, as measured by the number of pronghorn GPS locations present within the corridors. We suggest the use of a tiered approach using different corridor widths for prioritizing conservation and mitigation actions, such as fence removal or conservation easements. PMID:23166656

  16. Comparing habitat suitability and connectivity modeling methods for conserving pronghorn migrations.

    PubMed

    Poor, Erin E; Loucks, Colby; Jakes, Andrew; Urban, Dean L

    2012-01-01

    Terrestrial long-distance migrations are declining globally: in North America, nearly 75% have been lost. Yet there has been limited research comparing habitat suitability and connectivity models to identify migration corridors across increasingly fragmented landscapes. Here we use pronghorn (Antilocapra americana) migrations in prairie habitat to compare two types of models that identify habitat suitability: maximum entropy (Maxent) and expert-based (Analytic Hierarchy Process). We used distance to wells, distance to water, NDVI, land cover, distance to roads, terrain shape and fence presence to parameterize the models. We then used the output of these models as cost surfaces to compare two common connectivity models, least-cost modeling (LCM) and circuit theory. Using pronghorn movement data from spring and fall migrations, we identified potential migration corridors by combining each habitat suitability model with each connectivity model. The best performing model combination was Maxent with LCM corridors across both seasons. Maxent out-performed expert-based habitat suitability models for both spring and fall migrations. However, expert-based corridors can perform relatively well and are a cost-effective alternative if species location data are unavailable. Corridors created using LCM out-performed circuit theory, as measured by the number of pronghorn GPS locations present within the corridors. We suggest the use of a tiered approach using different corridor widths for prioritizing conservation and mitigation actions, such as fence removal or conservation easements.
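Least-cost modeling of the kind compared above treats the habitat suitability output as a cost surface and accumulates cost along candidate routes. A minimal sketch using Dijkstra's algorithm on a toy grid (the cost values are invented for illustration):

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a 2D cost grid: accumulate cell costs
    along 4-connected moves and return the minimum total cost of
    travelling from start to goal (start cell cost included)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Toy cost surface: a low-cost corridor along the top row and right edge.
surface = [[1, 1, 1],
           [9, 9, 1],
           [9, 9, 1]]
print(least_cost_path(surface, (0, 0), (2, 2)))  # 1+1+1+1+1 = 5
```

In the study's terms, the cost surface would come from a Maxent or expert-based suitability model, and corridors are the low-cumulative-cost swaths between seasonal range endpoints.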

  17. Extension of TOPAS for the simulation of proton radiation effects considering molecular and cellular endpoints

    NASA Astrophysics Data System (ADS)

    Polster, Lisa; Schuemann, Jan; Rinaldi, Ilaria; Burigo, Lucas; McNamara, Aimee L.; Stewart, Robert D.; Attili, Andrea; Carlson, David J.; Sato, Tatsuhiko; Ramos Méndez, José; Faddegon, Bruce; Perl, Joseph; Paganetti, Harald

    2015-07-01

    The aim of this work is to extend a widely used proton Monte Carlo tool, TOPAS, towards the modeling of relative biological effect (RBE) distributions in experimental arrangements as well as patients. TOPAS provides a software core which users configure by writing parameter files to, for instance, define application specific geometries and scoring conditions. Expert users may further extend TOPAS scoring capabilities by plugging in their own additional C++ code. This structure was utilized for the implementation of eight biophysical models suited to calculate proton RBE. As far as physics parameters are concerned, four of these models are based on the proton linear energy transfer, while the others are based on DNA double strand break induction and the frequency-mean specific energy, lineal energy, or delta electron generated track structure. The biological input parameters for all models are typically inferred from fits of the models to radiobiological experiments. The model structures have been implemented in a coherent way within the TOPAS architecture. Their performance was validated against measured experimental data on proton RBE in a spread-out Bragg peak using V79 Chinese Hamster cells. This work is an important step in bringing biologically optimized treatment planning for proton therapy closer to the clinical practice as it will allow researchers to refine and compare pre-defined as well as user-defined models.

  18. Extension of TOPAS for the simulation of proton radiation effects considering molecular and cellular endpoints

    PubMed Central

    Polster, Lisa; Schuemann, Jan; Rinaldi, Ilaria; Burigo, Lucas; McNamara, Aimee L.; Stewart, Robert D.; Attili, Andrea; Carlson, David J.; Sato, Tatsuhiko; Méndez, José Ramos; Faddegon, Bruce; Perl, Joseph; Paganetti, Harald

    2015-01-01

    The aim of this work is to extend a widely used proton Monte Carlo tool, TOPAS, towards the modeling of relative biological effect (RBE) distributions in experimental arrangements as well as patients. TOPAS provides a software core which users configure by writing parameter files to, for instance, define application specific geometries and scoring conditions. Expert users may further extend TOPAS scoring capabilities by plugging in their own additional C++ code. This structure was utilized for the implementation of eight biophysical models suited to calculate proton RBE. As far as physics parameters are concerned, four of these models are based on the proton linear energy transfer (LET), while the others are based on DNA Double Strand Break (DSB) induction and the frequency-mean specific energy, lineal energy, or delta electron generated track structure. The biological input parameters for all models are typically inferred from fits of the models to radiobiological experiments. The model structures have been implemented in a coherent way within the TOPAS architecture. Their performance was validated against measured experimental data on proton RBE in a spread-out Bragg peak using V79 Chinese Hamster cells. This work is an important step in bringing biologically optimized treatment planning for proton therapy closer to the clinical practice as it will allow researchers to refine and compare pre-defined as well as user-defined models. PMID:26061666
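Four of the models above are LET-based. A minimal, hypothetical sketch of the simplest such form, RBE = 1 + k·LET, where the slope k is an invented placeholder rather than a parameter of any of the eight TOPAS models:

```python
def rbe_linear_let(let_kev_um, k=0.04):
    """Minimal LET-based RBE estimate, RBE = 1 + k * LET.
    k is a hypothetical tissue-dependent slope, not a TOPAS value."""
    return 1.0 + k * let_kev_um

def rbe_weighted_dose(physical_dose_gy, let_kev_um, k=0.04):
    """RBE-weighted dose for one scoring voxel."""
    return physical_dose_gy * rbe_linear_let(let_kev_um, k)

# The distal edge of a proton beam (higher LET) gets a larger weighting
# than the entrance region for the same physical dose.
print(rbe_weighted_dose(2.0, 2.5))   # entrance-like LET
print(rbe_weighted_dose(2.0, 10.0))  # distal-edge-like LET
```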

  19. Using Hierarchical Cluster Models to Systematically Identify Groups of Jobs With Similar Occupational Questionnaire Response Patterns to Assist Rule-Based Expert Exposure Assessment in Population-Based Studies

    PubMed Central

    Friesen, Melissa C.; Shortreed, Susan M.; Wheeler, David C.; Burstyn, Igor; Vermeulen, Roel; Pronk, Anjoeka; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Schwenn, Molly; Johnson, Alison; Armenti, Karla R.; Silverman, Debra T.; Yu, Kai

    2015-01-01

    Objectives: Rule-based expert exposure assessment based on questionnaire response patterns in population-based studies improves the transparency of the decisions. The number of unique response patterns, however, can be nearly equal to the number of jobs. An expert may reduce the number of patterns that need assessment using expert opinion, but each expert may identify different patterns of responses that identify an exposure scenario. Here, hierarchical clustering methods are proposed as a systematic data reduction step to reproducibly identify similar questionnaire response patterns prior to obtaining expert estimates. As a proof-of-concept, we used hierarchical clustering methods to identify groups of jobs (clusters) with similar responses to diesel exhaust-related questions and then evaluated whether the jobs within a cluster had similar (previously assessed) estimates of occupational diesel exhaust exposure. Methods: Using the New England Bladder Cancer Study as a case study, we applied hierarchical cluster models to the diesel-related variables extracted from the occupational history and job- and industry-specific questionnaires (modules). Cluster models were separately developed for two subsets: (i) 5395 jobs with ≥1 variable extracted from the occupational history indicating a potential diesel exposure scenario, but without a module with diesel-related questions; and (ii) 5929 jobs with both occupational history and module responses to diesel-relevant questions. For each subset, we varied the numbers of clusters extracted from the cluster tree developed for each model from 100 to 1000 groups of jobs. Using previously made estimates of the probability (ordinal), intensity (µg m−3 respirable elemental carbon), and frequency (hours per week) of occupational exposure to diesel exhaust, we examined the similarity of the exposure estimates for jobs within the same cluster in two ways. 
First, the clusters’ homogeneity (defined as >75% with the same estimate) was examined compared to a dichotomized probability estimate (<5 versus ≥5%; <50 versus ≥50%). Second, for the ordinal probability metric and continuous intensity and frequency metrics, we calculated the intraclass correlation coefficients (ICCs) between each job’s estimate and the mean estimate for all jobs within the cluster. Results: Within-cluster homogeneity increased when more clusters were used. For example, ≥80% of the clusters were homogeneous when 500 clusters were used. Similarly, ICCs were generally above 0.7 when ≥200 clusters were used, indicating minimal within-cluster variability. The most within-cluster variability was observed for the frequency metric (ICCs from 0.4 to 0.8). We estimated that using an expert to assign exposure at the cluster-level assignment and then to review each job in non-homogeneous clusters would require ~2000 decisions per expert, in contrast to evaluating 4255 unique questionnaire patterns or 14983 individual jobs. Conclusions: This proof-of-concept shows that using cluster models as a data reduction step to identify jobs with similar response patterns prior to obtaining expert ratings has the potential to aid rule-based assessment by systematically reducing the number of exposure decisions needed. While promising, additional research is needed to quantify the actual reduction in exposure decisions and the resulting homogeneity of exposure estimates within clusters for an exposure assessment effort that obtains cluster-level expert assessments as part of the assessment process. PMID:25477475

  20. Using hierarchical cluster models to systematically identify groups of jobs with similar occupational questionnaire response patterns to assist rule-based expert exposure assessment in population-based studies.

    PubMed

    Friesen, Melissa C; Shortreed, Susan M; Wheeler, David C; Burstyn, Igor; Vermeulen, Roel; Pronk, Anjoeka; Colt, Joanne S; Baris, Dalsu; Karagas, Margaret R; Schwenn, Molly; Johnson, Alison; Armenti, Karla R; Silverman, Debra T; Yu, Kai

    2015-05-01

    Rule-based expert exposure assessment based on questionnaire response patterns in population-based studies improves the transparency of the decisions. The number of unique response patterns, however, can be nearly equal to the number of jobs. An expert may reduce the number of patterns that need assessment using expert opinion, but each expert may identify different patterns of responses that identify an exposure scenario. Here, hierarchical clustering methods are proposed as a systematic data reduction step to reproducibly identify similar questionnaire response patterns prior to obtaining expert estimates. As a proof-of-concept, we used hierarchical clustering methods to identify groups of jobs (clusters) with similar responses to diesel exhaust-related questions and then evaluated whether the jobs within a cluster had similar (previously assessed) estimates of occupational diesel exhaust exposure. Using the New England Bladder Cancer Study as a case study, we applied hierarchical cluster models to the diesel-related variables extracted from the occupational history and job- and industry-specific questionnaires (modules). Cluster models were separately developed for two subsets: (i) 5395 jobs with ≥1 variable extracted from the occupational history indicating a potential diesel exposure scenario, but without a module with diesel-related questions; and (ii) 5929 jobs with both occupational history and module responses to diesel-relevant questions. For each subset, we varied the numbers of clusters extracted from the cluster tree developed for each model from 100 to 1000 groups of jobs. Using previously made estimates of the probability (ordinal), intensity (µg m(-3) respirable elemental carbon), and frequency (hours per week) of occupational exposure to diesel exhaust, we examined the similarity of the exposure estimates for jobs within the same cluster in two ways. 
First, the clusters' homogeneity (defined as >75% with the same estimate) was examined compared to a dichotomized probability estimate (<5 versus ≥5%; <50 versus ≥50%). Second, for the ordinal probability metric and continuous intensity and frequency metrics, we calculated the intraclass correlation coefficients (ICCs) between each job's estimate and the mean estimate for all jobs within the cluster. Within-cluster homogeneity increased when more clusters were used. For example, ≥80% of the clusters were homogeneous when 500 clusters were used. Similarly, ICCs were generally above 0.7 when ≥200 clusters were used, indicating minimal within-cluster variability. The most within-cluster variability was observed for the frequency metric (ICCs from 0.4 to 0.8). We estimated that using an expert to assign exposure at the cluster-level assignment and then to review each job in non-homogeneous clusters would require ~2000 decisions per expert, in contrast to evaluating 4255 unique questionnaire patterns or 14983 individual jobs. This proof-of-concept shows that using cluster models as a data reduction step to identify jobs with similar response patterns prior to obtaining expert ratings has the potential to aid rule-based assessment by systematically reducing the number of exposure decisions needed. While promising, additional research is needed to quantify the actual reduction in exposure decisions and the resulting homogeneity of exposure estimates within clusters for an exposure assessment effort that obtains cluster-level expert assessments as part of the assessment process. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2014.
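The within-cluster agreement check described above can be illustrated with a one-way random-effects intraclass correlation, ICC(1) = (MSB − MSW) / (MSB + (k − 1)·MSW), computed here for equal-size toy clusters (the exposure values are invented):

```python
def icc_oneway(clusters):
    """One-way random-effects ICC(1) for equal-size groups."""
    k = len(clusters[0])                       # jobs per cluster
    n = len(clusters)                          # number of clusters
    grand = sum(sum(c) for c in clusters) / (n * k)
    means = [sum(c) / k for c in clusters]
    # Between- and within-cluster mean squares.
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2
              for c, m in zip(clusters, means) for x in c) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Homogeneous clusters give an ICC near 1; clusters that mix very
# different estimates give a low (here negative) ICC.
tight = [[10, 10, 11], [20, 21, 20], [5, 5, 6]]
mixed = [[10, 20, 5], [11, 21, 6], [10, 20, 5]]
print(round(icc_oneway(tight), 2))
print(round(icc_oneway(mixed), 2))
```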

  1. A Knowledge Navigation Method for the Domain of Customers' Services of Mobile Communication Corporations in China

    NASA Astrophysics Data System (ADS)

    Wu, Jiangning; Wang, Xiaohuan

    The rapidly increasing number of mobile phone users and service types has led to a large accumulation of customer-complaint information. How to use this information to enhance the quality of customer service is a pressing issue. To handle this problem, the paper presents an approach to constructing a domain knowledge map for navigating explicit and tacit knowledge in two ways: building a Topic Map-based explicit knowledge navigation model, which includes domain TM construction, a semantic topic expansion algorithm, and VSM-based similarity calculation; and building a Social Network Analysis-based tacit knowledge navigation model, which includes a multi-relational expert navigation algorithm and criteria to evaluate the performance of expert networks. In doing so, both customer managers and call-center operators can find the appropriate knowledge and experts quickly and accurately. The experimental results show that the above method is very effective for knowledge navigation.
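The VSM-based similarity calculation mentioned above typically reduces to cosine similarity between term-frequency vectors. A minimal sketch with invented complaint and topic texts:

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    """Vector Space Model similarity: represent each text as a
    term-frequency vector and compare by the cosine of the angle."""
    va, vb = Counter(doc_a.split()), Counter(doc_b.split())
    shared = set(va) & set(vb)
    dot = sum(va[t] * vb[t] for t in shared)
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

# Toy example: route a complaint to the closest knowledge-map topic.
complaint = "billing error on mobile data plan"
topic = "mobile data billing problem"
print(round(cosine_similarity(complaint, topic), 2))  # 0.61
```

A real implementation would apply tokenisation, stop-word removal, and tf-idf weighting before the cosine step.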

  2. Linear and nonlinear regression techniques for simultaneous and proportional myoelectric control.

    PubMed

    Hahne, J M; Biessmann, F; Jiang, N; Rehbaum, H; Farina, D; Meinecke, F C; Muller, K-R; Parra, L C

    2014-03-01

    In recent years the number of actively controllable joints in electrically powered hand prostheses has increased significantly. However, the control strategies for these devices in current clinical use are inadequate, as they require separate and sequential control of each degree of freedom (DoF). In this study we systematically compare linear and nonlinear regression techniques for independent, simultaneous, and proportional myoelectric control of wrist movements with two DoFs. These techniques include linear regression, mixture of linear experts (ME), multilayer perceptron, and kernel ridge regression (KRR). They are investigated offline with electromyographic signals acquired from ten able-bodied subjects and one person with congenital upper limb deficiency. The control accuracy is reported as a function of the number of electrodes and the amount and diversity of training data, providing guidance for the requirements in clinical practice. The results showed that KRR, a nonparametric statistical learning method, outperformed the other methods. However, simple transformations in the feature space could linearize the problem, so that linear models could achieve performance similar to KRR at much lower computational cost. ME in particular, a physiologically inspired extension of linear regression, represents a promising candidate for the next generation of prosthetic devices.
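The study's key observation — that a suitable feature-space transformation can linearize the problem so that plain linear regression matches a nonlinear method — can be illustrated with a toy example. The square-root relation below is invented for illustration and is not the transformation used in the study:

```python
import math

def fit_linear(xs, ys):
    """Closed-form ordinary least squares for y = w*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    w = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
        sum((a - mx) ** 2 for a in xs)
    return w, my - w * mx

# Hypothetical feature whose target follows a square-root law:
# nonlinear in the raw feature, so a straight line fits it poorly.
raw = [1.0, 4.0, 9.0, 16.0, 25.0]
target = [math.sqrt(x) for x in raw]

# After transforming the feature with the same sqrt, the relation is
# exactly linear and OLS recovers it perfectly (w ~ 1, b ~ 0).
w, b = fit_linear([math.sqrt(x) for x in raw], target)
print(round(w, 3), round(b, 3))
```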

  3. Proposed Models of Appropriate Website and Courseware for E-Learning in Higher Education: Research Based Design Models

    ERIC Educational Resources Information Center

    Khlaisang, Jintavee

    2010-01-01

    The purpose of this study was to investigate appropriate website and courseware designs for e-learning in higher education. Methods used in this study included data collection, analysis of surveys, in-depth interviews with experts, and an expert focus group. Results indicated that there were 16 components for the website, as well as 16 components for…

  4. Development of multimedia learning based inquiry on vibration and wave material

    NASA Astrophysics Data System (ADS)

    Madeali, H.; Prahani, B. K.

    2018-03-01

    This study aims to develop inquiry-based multimedia learning materials for the physics topic of vibrations and waves that are interesting, easy for students to understand, save teachers' time in delivering the material, and are feasible for use in instruction. This is a Research and Development study following the ADDIE model (Analysis, Design, Development, Implementation, and Evaluation). The inquiry-based multimedia is packaged in hypertext form using Adobe Flash CS6. The inquiry aspect is constructed by showing animations of the target concepts, followed by questions asking students what they observe. The multimedia was then validated by 2 learning experts, 3 material experts, and 3 media experts, and tested on 3 junior high school teachers and 23 students of State Junior High School 5 of Kendari. The results include: (1) validation by the learning, material, and media experts placed the materials in the valid category; (2) trials by teachers and students placed them in the practical category. These results indicate that the inquiry-based multimedia on vibration and wave material is feasible for use in physics learning by grade VIII junior high school students.

  5. Discussion of “Bayesian design of experiments for industrial and scientific applications via gaussian processes”

    DOE PAGES

    Anderson-Cook, Christine M.; Burke, Sarah E.

    2016-10-18

    First, we would like to commend Dr. Woods on his thought-provoking paper and insightful presentation at the 4th Annual Stu Hunter conference. We think that the material presented highlights some important needs in the area of design of experiments for generalized linear models (GLMs). In addition, we agree with Dr. Woods that design of experiments for GLMs does implicitly require expert judgement about model parameters, and hence using a Bayesian approach to capture this knowledge is a natural strategy to summarize what is known, with the opportunity to incorporate associated uncertainty about that information.

  6. Discussion of “Bayesian design of experiments for industrial and scientific applications via gaussian processes”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson-Cook, Christine M.; Burke, Sarah E.

    First, we would like to commend Dr. Woods on his thought-provoking paper and insightful presentation at the 4th Annual Stu Hunter conference. We think that the material presented highlights some important needs in the area of design of experiments for generalized linear models (GLMs). In addition, we agree with Dr. Woods that design of experiments for GLMs does implicitly require expert judgement about model parameters, and hence using a Bayesian approach to capture this knowledge is a natural strategy to summarize what is known, with the opportunity to incorporate associated uncertainty about that information.

  7. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object-oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which are then searched to generate either a graphical fault tree analysis or a failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.

  8. First study of the evolution of the SeDeM expert system parameters based on percolation theory: Monitoring of their critical behavior.

    PubMed

    Galdón, Eduardo; Casas, Marta; Gayango, Manuel; Caraballo, Isidoro

    2016-12-01

    A deep understanding of products and processes has become a requirement for pharmaceutical industries following the Quality by Design principles promoted by the regulatory authorities. With this aim, the SeDeM expert system was developed as a preformulation tool to predict how readily drugs and excipients can be processed by direct compression. The SeDeM system is a step forward in the rational development of a formulation, allowing the normalisation of rheological parameters and the identification of the weaknesses and strengths of a powder or powder blend. However, the method is based on the assumption that disordered systems behave linearly. As percolation theory has demonstrated, powder blends behave as non-linear systems that can undergo abrupt changes in their properties near geometrical phase transitions of the components. The aim of this paper was to analyze for the first time the evolution of the SeDeM parameters in drug/excipient powder blends from the point of view of percolation theory, and to compare the changes predicted by SeDeM with the predictions of percolation theory. For this purpose, powder blends of lactose and theophylline with varying concentrations of the model drug were prepared, and the SeDeM analysis was applied to each blend in order to monitor the evolution of its properties. In addition, percolation thresholds were estimated for these powder blends, and critical points were found for important rheological parameters such as powder flow. Finally, the predictions of percolation theory and SeDeM were compared, concluding that percolation theory can complement the SeDeM method for a more accurate estimation of the Design Space. Copyright © 2016 Elsevier B.V. All rights reserved.
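The abrupt, non-linear changes near a geometrical phase transition that percolation theory predicts can be illustrated with a small site-percolation simulation. Lattice size, trial count, and probabilities are illustrative choices, not values from the study:

```python
import random

def percolates(grid):
    """Does a path of occupied 4-neighbour sites connect the top row
    of the lattice to the bottom row?"""
    n = len(grid)
    frontier = [(0, c) for c in range(n) if grid[0][c]]
    seen = set(frontier)
    while frontier:
        r, c = frontier.pop()
        if r == n - 1:
            return True
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < n and 0 <= nc < n and grid[nr][nc]
                    and (nr, nc) not in seen):
                seen.add((nr, nc))
                frontier.append((nr, nc))
    return False

def spanning_probability(p, n=20, trials=200, seed=1):
    """Monte Carlo estimate of the probability that a random lattice
    with site-occupation fraction p percolates."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials

# The spanning probability jumps sharply near the square-lattice site
# percolation threshold (~0.593): this is the kind of non-linear
# critical behaviour a purely linear preformulation model misses.
print(spanning_probability(0.45), spanning_probability(0.75))
```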

  9. Development of a dynamic framework to explain population patterns of leisure-time physical activity through agent-based modeling.

    PubMed

    Garcia, Leandro M T; Diez Roux, Ana V; Martins, André C R; Yang, Yong; Florindo, Alex A

    2017-08-22

    Despite the increasing body of evidence on the factors influencing leisure-time physical activity (LTPA), our understanding of the mechanisms and interactions that lead to the formation and evolution of population patterns is still limited. Moreover, most frameworks in this field fail to capture dynamic processes. Our aim was to create a dynamic conceptual model depicting the interaction between key psychological attributes of individuals and main aspects of the built and social environments in which they live. This conceptual model will inform and support the development of an agent-based model aimed at exploring how population patterns of LTPA in adults may emerge from the dynamic interplay between psychological traits and built and social environments. We integrated existing theories and models as well as available empirical data (both from literature reviews) and expert opinions (based on a systematic expert assessment of an intermediary version of the model). The model explicitly presents intention as the proximal determinant of LTPA, a relationship dynamically moderated by the built environment (access, quality, and available activities), with the strength of the moderation varying as a function of the person's intention, and influenced both by the social environment (the proximal network's and community's behavior) and by the person's own behavior. Our conceptual model is well supported by evidence and experts' opinions and will inform the design of our agent-based model, as well as data collection and analysis in future investigations of population patterns of LTPA among adults.
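The dynamics described above — intention moderated by built-environment access and nudged by the social environment — can be sketched as a toy agent-based simulation. The update rule and all weights below are invented for illustration; they are not the study's calibrated model:

```python
import random

def step(intentions, access, rng):
    """One simulation tick: each agent is active with probability
    intention * access (the built environment moderates intention),
    and each intention then drifts slightly toward the population's
    mean behaviour (a crude stand-in for the social environment)."""
    active = [rng.random() < i * access for i in intentions]
    mean_active = sum(active) / len(active)
    new_intentions = [min(1.0, 0.9 * i + 0.1 * mean_active)
                      for i in intentions]
    return new_intentions, active

rng = random.Random(0)
intentions = [rng.random() for _ in range(100)]  # heterogeneous agents
for _ in range(50):
    intentions, active = step(intentions, access=0.8, rng=rng)

# Emergent population-level activity prevalence after 50 ticks.
print(round(sum(active) / len(active), 2))
```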

  10. Design And Ground Testing For The Expert PL4/PL5 'Natural And Roughness Induced Transition'

    NASA Astrophysics Data System (ADS)

    Masutti, Davie; Chazot, Olivier; Donelli, Raffaele; de Rosa, Donato

    2011-05-01

    Unpredicted boundary layer transition can dramatically impact the stability of a vehicle and its aerodynamic coefficients, and can reduce the efficiency of the thermal protection system. In this frame, ESA started the EXPERT (European eXPErimental Reentry Testbed) program to provide and perform in-flight experiments in order to obtain aerothermodynamic data for the validation of numerical models and of ground-to-flight extrapolation methodologies. For the boundary layer transition investigation, the EXPERT vehicle is equipped with two specific payloads, PL4 and PL5, concerning respectively the study of natural and roughness-induced transition. The paper is a survey of the design process of these two in-flight experiments, covering the major analyses and findings encountered during the development of the payloads. A large number of transition criteria have been investigated and used to estimate either the criticality of the distributed roughness height arising from nose erosion, or the effectiveness of the isolated roughness element height in forcing boundary layer transition. Supporting the PL4 design, linear stability computations and CFD analyses have been performed by CIRA on the EXPERT flight vehicle to determine the amplification factor of the boundary layer instabilities at different points of the re-entry trajectory. Ground test experiments for PL5 were carried out in the Mach 6 VKI H3 Hypersonic Wind Tunnel with Reynolds numbers ranging from 18E6/m to 26E6/m. Infrared measurements (Stanton number) and flow visualization were used on a 1/16-scale model of the EXPERT vehicle and a flat plate to validate the Potter and Whitfield criterion as a suitable methodology for ground-to-flight extrapolation and for the payload design.

  11. Hierarchical mixture of experts and diagnostic modeling approach to reduce hydrologic model structural uncertainty: STRUCTURAL UNCERTAINTY DIAGNOSTICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2016-04-01

    In most water resources applications, a single model structure might be inadequate to capture the dynamic multi-scale interactions among different hydrological processes. Calibrating single models for dynamic catchments, where multiple dominant processes exist, can result in displacement of errors from structure to parameters, which in turn leads to over-correction and biased predictions. An alternative to a single model structure is to develop local expert structures that are effective in representing the dominant components of the hydrologic process and to adaptively integrate them based on an indicator variable. In this study, the Hierarchical Mixture of Experts (HME) framework is applied to integrate expert model structures representing the different components of the hydrologic process. Various signature diagnostic analyses are used to assess the presence of multiple dominant processes and the adequacy of a single model, as well as to identify the structures of the expert models. The approaches are applied to two distinct catchments, the Guadalupe River (Texas) and the French Broad River (North Carolina) from the Model Parameter Estimation Experiment (MOPEX), using different structures of the HBV model. The results show that the HME approach performs better than the single model for the Guadalupe catchment, where multiple dominant processes are evident in the diagnostic measures, whereas the diagnostics and aggregated performance measures show that the French Broad has a homogeneous catchment response, making the single model adequate to capture the response.
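The HME idea above — soft-weighting local expert models through a gate driven by an indicator variable — can be sketched as follows. The logistic gate, its parameters, and the "wet/dry" expert outputs are illustrative assumptions, not the study's configuration:

```python
import math

def gate(indicator, midpoint=0.5, steepness=10.0):
    """Soft gating function: the weight given to the first expert
    rises smoothly as the indicator (e.g. a wetness index) crosses
    the midpoint. Midpoint and steepness are invented values."""
    return 1.0 / (1.0 + math.exp(-steepness * (indicator - midpoint)))

def hme_predict(indicator, expert_wet, expert_dry):
    """Blend two local expert predictions according to the gate."""
    w = gate(indicator)
    return w * expert_wet + (1.0 - w) * expert_dry

# Dry conditions lean on the dry-catchment expert, wet conditions on
# the wet-catchment expert, with a smooth transition between regimes.
print(round(hme_predict(0.1, expert_wet=5.0, expert_dry=1.0), 2))  # 1.07
print(round(hme_predict(0.9, expert_wet=5.0, expert_dry=1.0), 2))  # 4.93
```

In a full HME the gate parameters are learned jointly with the experts rather than fixed by hand.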

  12. Quantitative investigation of inappropriate regression model construction and the importance of medical statistics experts in observational medical research: a cross-sectional study.

    PubMed

    Nojima, Masanori; Tokunaga, Mutsumi; Nagamura, Fumitaka

    2018-05-05

    To investigate under what circumstances inappropriate use of 'multivariate analysis' is likely to occur and to identify the population that needs more support with medical statistics. The frequency of inappropriate regression model construction in multivariate analysis and related factors were investigated in observational medical research publications. The inappropriate algorithm of using only variables that were significant in univariate analysis was estimated to occur in 6.4% of publications (95% CI 4.8% to 8.5%). This was observed in 1.1% of the publications with a medical statistics expert (hereinafter 'expert') as the first author, in 3.5% if an expert was included as a coauthor, and in 12.2% if experts were not involved. In publications where the number of cases was 50 or less and the study did not include experts, inappropriate algorithm usage was observed at a high proportion of 20.2%. The OR of the involvement of experts for this outcome was 0.28 (95% CI 0.15 to 0.53). A further analysis showed that the involvement of experts and the implementation of inappropriate multivariate analysis are negatively associated at the national level (R=-0.652). Based on the results of this study, the benefit of participation of medical statistics experts is obvious. Experts should be involved for proper confounding adjustment and interpretation of statistical models.
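
    The inappropriate algorithm the study quantifies, pre-screening covariates by univariate significance before building the multivariable model, can be illustrated on synthetic data. The correlation-based t-statistic and the cutoff of 2.0 are simplifying assumptions for the sketch, not the procedure used in any particular publication:

```python
import math, random

def pearson_r(x, y):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def univariate_screen(covariates, outcome, t_cut=2.0):
    """The criticized practice: keep only covariates whose univariate
    association with the outcome is 'significant' (|t| above a cutoff).
    Confounders with weak marginal associations get silently dropped."""
    n = len(outcome)
    kept = []
    for name, values in covariates.items():
        r = pearson_r(values, outcome)
        t = r * math.sqrt(n - 2) / math.sqrt(1 - r * r)
        if abs(t) > t_cut:
            kept.append(name)
    return kept

random.seed(0)
n = 200
x1 = [random.gauss(0, 1) for _ in range(n)]   # true predictor
x2 = [random.gauss(0, 1) for _ in range(n)]   # pure noise
y = [a + random.gauss(0, 1) for a in x1]
print(univariate_screen({"x1": x1, "x2": x2}, y))
```

    The danger, as the paper notes, is that confounders needing adjustment can fail the univariate cut and never enter the final model.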

  13. Evolving Expert Knowledge Bases: Applications of Crowdsourcing and Serious Gaming to Advance Knowledge Development for Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Floryan, Mark

    2013-01-01

    This dissertation presents a novel effort to develop ITS technologies that adapt by observing student behavior. In particular, we define an evolving expert knowledge base (EEKB) that structures a domain's information as a set of nodes and the relationships that exist between those nodes. The structure of this model is not the particularly novel…

  14. SF3M software: 3-D photo-reconstruction for non-expert users and its application to a gully network

    NASA Astrophysics Data System (ADS)

    Castillo, C.; James, M. R.; Redel-Macías, M. D.; Pérez, R.; Gómez, J. A.

    2015-08-01

    Three-dimensional photo-reconstruction (PR) techniques have been successfully used to produce high-resolution surface models for different applications and over different spatial scales. However, innovative approaches are required to overcome some limitations that this technique may present for field image acquisition in challenging scene geometries. Here, we evaluate SF3M, a new graphical user interface for implementing a complete PR workflow based on freely available software (including external calls to VisualSFM and CloudCompare), in combination with a low-cost survey design for the reconstruction of a several-hundred-metres-long gully network. SF3M provided a semi-automated workflow for 3-D reconstruction requiring ~ 49 h (of which only 17 % required operator assistance) to obtain a final gully network model of > 17 million points over a gully plan area of 4230 m2. We show that a walking itinerary along the gully perimeter using two lightweight automatic cameras (1 s time-lapse mode) and a 6 m long pole is an efficient method for 3-D monitoring of gullies, at low cost (~ EUR 1000 for the field equipment) and with modest time requirements (~ 90 min for image collection). A mean error of 6.9 cm at the ground control points was found, mainly due to model deformations derived from the linear geometry of the gully and residual errors in camera calibration. The straightforward image collection and processing approach can be of great benefit for non-expert users working on gully erosion assessment.

  15. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    NASA Technical Reports Server (NTRS)

    Kautzmann, Frank N., III

    1988-01-01

    Expert systems that support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are involved. A series of prototypes is being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. Such a study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems, along with models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics at a meta-level for each expert system involved in an integrated, larger class of expert systems. The broadest possible category of interacting expert systems is described, along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  16. Evaluation of Visual Field Progression in Glaucoma: Quasar Regression Program and Event Analysis.

    PubMed

    Díaz-Alemán, Valentín T; González-Hernández, Marta; Perera-Sanz, Daniel; Armas-Domínguez, Karintia

    2016-01-01

    To determine the sensitivity, specificity and agreement between the Quasar program, glaucoma progression analysis (GPA II) event analysis and expert opinion in the detection of glaucomatous progression. The Quasar program is based on linear regression analysis of both mean defect (MD) and pattern standard deviation (PSD). Each series of visual fields was evaluated by three methods: Quasar, GPA II and a panel of four experts. The sensitivity, specificity and agreement (kappa) for each method were calculated, using expert opinion as the reference standard. The study included 439 SITA Standard visual fields of 56 eyes of 42 patients, with a mean of 7.8 ± 0.8 visual fields per eye. When suspected cases of progression were considered stable, sensitivity and specificity of Quasar, GPA II and the experts were 86.6% and 70.7%, 26.6% and 95.1%, and 86.6% and 92.6% respectively. When suspected cases of progression were considered as progressing, sensitivity and specificity of Quasar, GPA II and the experts were 79.1% and 81.2%, 45.8% and 90.6%, and 85.4% and 90.6% respectively. The agreement between Quasar and GPA II when suspected cases were considered stable or progressing was 0.03 and 0.28 respectively. The degree of agreement between Quasar and the experts when suspected cases were considered stable or progressing was 0.472 and 0.507. The degree of agreement between GPA II and the experts when suspected cases were considered stable or progressing was 0.262 and 0.342. The combination of MD and PSD regression analysis in the Quasar program showed better agreement with the experts and higher sensitivity than GPA II.
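
    The underlying idea, fitting least-squares trends to the MD and PSD series and flagging progression from the slopes, can be sketched as below. The slope cutoffs and the combination rule are hypothetical illustrations, not Quasar's actual statistical criteria:

```python
def ols_slope(t, y):
    """Ordinary least-squares slope of y against time t."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    return sum((a - mt) * (b - my) for a, b in zip(t, y)) / \
        sum((a - mt) ** 2 for a in t)

def flag_progression(times, md, psd, md_cut=-0.5, psd_cut=0.5):
    """Hypothetical rule: progressing if MD declines and PSD rises.
    (Cutoff values in dB/year are illustrative assumptions.)"""
    return ols_slope(times, md) < md_cut and ols_slope(times, psd) > psd_cut

years = [0, 1, 2, 3, 4, 5, 6, 7]
md = [-2.0, -2.6, -3.3, -3.9, -4.5, -5.2, -5.8, -6.4]   # dB, worsening
psd = [2.0, 2.7, 3.4, 4.1, 4.8, 5.5, 6.2, 6.9]          # dB, rising
print(flag_progression(years, md, psd))  # prints True
```

    Regressing both indices, rather than MD alone, is what lets a trend-based program pick up localized loss (reflected in PSD) as well as diffuse loss.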

  17. Eliciting expert opinion for economic models: an applied example.

    PubMed

    Leal, José; Wordsworth, Sarah; Legood, Rosa; Blair, Edward

    2007-01-01

    Expert opinion is considered as a legitimate source of information for decision-analytic modeling where required data are unavailable. Our objective was to develop a practical computer-based tool for eliciting expert opinion about the shape of the uncertainty distribution around individual model parameters. We first developed a prepilot survey with departmental colleagues to test a number of alternative approaches to eliciting opinions on the shape of the uncertainty distribution around individual parameters. This information was used to develop a survey instrument for an applied clinical example. This involved eliciting opinions from experts to inform a number of parameters involving Bernoulli processes in an economic model evaluating DNA testing for families with a genetic disease, hypertrophic cardiomyopathy. The experts were cardiologists, clinical geneticists, and laboratory scientists working with cardiomyopathy patient populations and DNA testing. Our initial prepilot work suggested that the more complex elicitation techniques advocated in the literature were difficult to use in practice. In contrast, our approach achieved a reasonable response rate (50%), provided logical answers, and was generally rated as easy to use by respondents. The computer software user interface permitted graphical feedback throughout the elicitation process. The distributions obtained were incorporated into the model, enabling the use of probabilistic sensitivity analysis. There is clearly a gap in the literature between theoretical elicitation techniques and tools that can be used in applied decision-analytic models. The results of this methodological study are potentially valuable for other decision analysts deriving expert opinion.
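
    For the Bernoulli-process parameters mentioned above, a common way to turn an elicited "most likely value" into an uncertainty distribution is a Beta distribution matched to the expert's mode and confidence. The mode-and-strength parameterization below is an assumption for illustration; the paper does not specify its exact fitting procedure:

```python
def beta_from_mode(mode, strength):
    """Beta(a, b) with the given mode and total 'strength' a + b.
    Larger strength encodes a more confident expert; requires strength > 2."""
    a = mode * (strength - 2) + 1
    b = strength - a
    return a, b

# Hypothetical elicitation: expert believes the probability is most
# likely 0.8 and is fairly confident (strength 12).
a, b = beta_from_mode(0.8, 12)
print(a, b)                  # prints 9.0 3.0
print(round(a / (a + b), 3))  # posterior-style mean, prints 0.75
```

    The resulting Beta(a, b) can be sampled directly in a probabilistic sensitivity analysis, which is how the elicited distributions were used in the model.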

  18. Expertise facilitates the transfer of anticipation skill across domains.

    PubMed

    Rosalie, Simon M; Müller, Sean

    2014-02-01

    It is unclear whether perceptual-motor skill transfer is based upon similarity between the learning and transfer domains per identical elements theory, or facilitated by an understanding of underlying principles in accordance with general principle theory. Here, the predictions of identical elements theory, general principle theory, and aspects of a recently proposed model for the transfer of perceptual-motor skill with respect to expertise in the learning and transfer domains are examined. The capabilities of expert karate athletes, near-expert karate athletes, and novices to anticipate and respond to stimulus skills derived from taekwondo and Australian football were investigated in ecologically valid contexts using an in situ temporal occlusion paradigm and complex whole-body perceptual-motor skills. Results indicated that the karate experts and near-experts are as capable of using visual information to anticipate and guide motor skill responses as domain experts and near-experts in the taekwondo transfer domain, but only karate experts could perform like domain experts in the Australian football transfer domain. Findings suggest that transfer of anticipation skill is based upon expertise and an understanding of principles but may be supplemented by similarities that exist between the stimulus and response elements of the learning and transfer domains.

  19. To sort or not to sort: the impact of spike-sorting on neural decoding performance.

    PubMed

    Todorova, Sonia; Sadtler, Patrick; Batista, Aaron; Chase, Steven; Ventura, Valérie

    2014-10-01

    Brain-computer interfaces (BCIs) are a promising technology for restoring motor ability to paralyzed patients. Spiking-based BCIs have successfully been used in clinical trials to control multi-degree-of-freedom robotic devices. Current implementations of these devices require a lengthy spike-sorting step, which is an obstacle to moving this technology from the lab to the clinic. A viable alternative is to avoid spike-sorting, treating all threshold crossings of the voltage waveform on an electrode as coming from one putative neuron. It is not known, however, how much decoding information might be lost by ignoring spike identity. We present a full analysis of the effects of spike-sorting schemes on decoding performance. Specifically, we compare how well two common decoders, the optimal linear estimator and the Kalman filter, reconstruct the arm movements of non-human primates performing reaching tasks, when receiving input from various sorting schemes. The schemes we tested included: using threshold crossings without spike-sorting; expert-sorting discarding the noise; expert-sorting, including the noise as if it were another neuron; and automatic spike-sorting using waveform features. We also decoded from a joint statistical model for the waveforms and tuning curves, which does not involve an explicit spike-sorting step. Discarding the threshold crossings that cannot be assigned to neurons degrades decoding: no spikes should be discarded. Decoding based on spike-sorted units outperforms decoding based on electrode voltage crossings: spike-sorting is useful. The four waveform-based spike-sorting methods tested here yield similar decoding efficiencies: a fast and simple method is competitive. Decoding using the joint waveform and tuning model shows promise but is not consistently superior. Our results indicate that simple automated spike-sorting performs as well as the more computationally or manually intensive methods used here. Even basic spike-sorting adds value to the low-threshold waveform-crossing methods often employed in BCI decoding.
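
    The optimal linear estimator referenced above amounts to a least-squares map from unit activity to kinematics. A minimal two-unit, one-dimensional sketch on synthetic, noise-free data (not the paper's decoder or its actual dimensionality):

```python
def fit_ole(rates, velocity):
    """Least-squares weights mapping two units' firing rates to 1-D velocity,
    solved directly from the 2x2 normal equations."""
    s11 = sum(r[0] * r[0] for r in rates)
    s12 = sum(r[0] * r[1] for r in rates)
    s22 = sum(r[1] * r[1] for r in rates)
    b1 = sum(r[0] * v for r, v in zip(rates, velocity))
    b2 = sum(r[1] * v for r, v in zip(rates, velocity))
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det)

# Synthetic tuning: velocity = 0.5*unit1 - 0.2*unit2 (noise-free sketch).
rates = [(10, 5), (20, 8), (15, 12), (8, 3), (25, 10)]
vel = [0.5 * a - 0.2 * b for a, b in rates]
w = fit_ole(rates, vel)
print(round(w[0], 3), round(w[1], 3))  # prints 0.5 -0.2
```

    Whether the input rows are sorted units or raw electrode threshold crossings changes only the feature definition, which is exactly the comparison the study runs.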

  20. To sort or not to sort: the impact of spike-sorting on neural decoding performance

    NASA Astrophysics Data System (ADS)

    Todorova, Sonia; Sadtler, Patrick; Batista, Aaron; Chase, Steven; Ventura, Valérie

    2014-10-01

    Objective. Brain-computer interfaces (BCIs) are a promising technology for restoring motor ability to paralyzed patients. Spiking-based BCIs have successfully been used in clinical trials to control multi-degree-of-freedom robotic devices. Current implementations of these devices require a lengthy spike-sorting step, which is an obstacle to moving this technology from the lab to the clinic. A viable alternative is to avoid spike-sorting, treating all threshold crossings of the voltage waveform on an electrode as coming from one putative neuron. It is not known, however, how much decoding information might be lost by ignoring spike identity. Approach. We present a full analysis of the effects of spike-sorting schemes on decoding performance. Specifically, we compare how well two common decoders, the optimal linear estimator and the Kalman filter, reconstruct the arm movements of non-human primates performing reaching tasks, when receiving input from various sorting schemes. The schemes we tested included: using threshold crossings without spike-sorting; expert-sorting discarding the noise; expert-sorting, including the noise as if it were another neuron; and automatic spike-sorting using waveform features. We also decoded from a joint statistical model for the waveforms and tuning curves, which does not involve an explicit spike-sorting step. Main results. Discarding the threshold crossings that cannot be assigned to neurons degrades decoding: no spikes should be discarded. Decoding based on spike-sorted units outperforms decoding based on electrode voltage crossings: spike-sorting is useful. The four waveform-based spike-sorting methods tested here yield similar decoding efficiencies: a fast and simple method is competitive. Decoding using the joint waveform and tuning model shows promise but is not consistently superior. Significance. Our results indicate that simple automated spike-sorting performs as well as the more computationally or manually intensive methods used here. Even basic spike-sorting adds value to the low-threshold waveform-crossing methods often employed in BCI decoding.

  1. A microanalytic study of self-regulated learning processes of expert, non-expert, and at-risk science students

    NASA Astrophysics Data System (ADS)

    Dibenedetto, Maria K.

    2009-12-01

    The present investigation sought to examine differences in the self-regulated learning processes and beliefs of students who vary in their level of expertise in science and to investigate if there are gender differences. Participants were 51 ethnically diverse 11th grade students from three parochial high schools consisting of 34 females and 17 males. Students were grouped as either expert, non-expert, or at-risk based on the school's classification. Students were provided with a short passage on tornados to read and study. The two achievement measures obtained were the Tornado Knowledge Test: ten short-answer questions, and the Conceptual Model Test: a question which required the students to draw and describe the three sequential images of tornado development from the textual description of the three phases. A microanalytic methodology was used which consists of asking a series of questions aimed at assessing students' psychological behaviors, feelings, and thoughts in each of Zimmerman's three phases of self-regulation: forethought, performance, and reflection. These questions were asked of the students while they were engaged in learning. Two additional measures were obtained: the Rating Student Self-Regulated Learning Outcomes: A Teacher Scale (RSSRL) and the Self-Efficacy for Self-Regulated Learning (SELF). Analysis of variance, chi square analysis, and post hoc test results showed significant expertise differences, large effect sizes, and positive linear trends on most measures. Regarding gender, there were significant differences on only two measures. Correlational analyses also revealed significant relations among the self-regulatory subprocesses across the three phases. The microanalytic measures were combined across the three phases and entered into a regression formula to predict the students' scores on the Tornado Knowledge Test.
These self-regulatory processes explained 77% of the variance in the Tornado Knowledge Test, which was a significant and substantial effect. Prior to this investigation, there have been no studies which have tested Zimmerman's three phase model on an academic task, such as science, within an expertise framework. Implications from the present study suggest that students varying in expertise level in science achievement also vary in self-regulatory behavior, and that gender is not a significant factor.

  2. TROUBLE 3: A fault diagnostic expert system for Space Station Freedom's power system

    NASA Technical Reports Server (NTRS)

    Manner, David B.

    1990-01-01

    Designing Space Station Freedom has given NASA many opportunities to develop expert systems that automate onboard operations of space based systems. One such development, TROUBLE 3, an expert system that was designed to automate the fault diagnostics of Space Station Freedom's electric power system, is described. TROUBLE 3's design is complicated by the fact that Space Station Freedom's power system is evolving and changing. TROUBLE 3 has to be made flexible enough to handle such changes with minimal modification to the program. Three types of expert systems were studied: rule-based, set-covering, and model-based. A set-covering approach was selected for TROUBLE 3 because it offered the flexibility missing from the other approaches. With this flexibility, TROUBLE 3 is not limited to Space Station Freedom applications; it can easily be adapted to handle any diagnostic system.

  3. Defining landscape resistance values in least-cost connectivity models for the invasive grey squirrel: a comparison of approaches using expert-opinion and habitat suitability modelling.

    PubMed

    Stevenson-Holt, Claire D; Watts, Kevin; Bellamy, Chloe C; Nevin, Owen T; Ramsey, Andrew D

    2014-01-01

    Least-cost models are widely used to study the functional connectivity of habitat within a varied landscape matrix. A critical step in the process is identifying resistance values for each land cover based upon the facilitating or impeding impact on species movement. Ideally resistance values would be parameterised with empirical data, but due to a shortage of such information, expert-opinion is often used. However, the use of expert-opinion is seen as subjective, human-centric and unreliable. This study derived resistance values from grey squirrel habitat suitability models (HSM) in order to compare the utility and validity of this approach with more traditional, expert-led methods. Models were built and tested with MaxEnt, using squirrel presence records and a categorical land cover map for Cumbria, UK. Predictions on the likelihood of squirrel occurrence within each land cover type were inverted, providing resistance values which were used to parameterise a least-cost model. The resulting habitat networks were measured and compared to those derived from a least-cost model built with previously collated information from experts. The expert-derived and HSM-inferred least-cost networks differ in precision. The HSM-informed networks were smaller and more fragmented because of the higher resistance values attributed to most habitats. These results are discussed in relation to the applicability of both approaches for conservation and management objectives, providing guidance to researchers and practitioners attempting to apply and interpret a least-cost approach to mapping ecological networks.
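
    The key step above, inverting habitat suitability into resistance and running a least-cost computation, can be sketched as follows. The linear inversion, the 1-100 resistance range, and the toy suitability grid are assumptions for illustration, not the study's MaxEnt outputs:

```python
import heapq

def suitability_to_resistance(p, max_res=100.0):
    """Invert a habitat suitability score (0-1) into a movement resistance."""
    return 1.0 + (1.0 - p) * (max_res - 1.0)

def least_cost(grid, start, goal):
    """Dijkstra over a 2-D resistance grid (4-neighbour moves);
    accumulated cost includes every cell entered, including the start."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: grid[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Hypothetical suitability map: a corridor of good habitat through poor matrix.
suit = [[0.9, 0.1, 0.1],
        [0.8, 0.9, 0.1],
        [0.1, 0.8, 0.9]]
res = [[suitability_to_resistance(p) for p in row] for row in suit]
print(round(least_cost(res, (0, 0), (2, 2)), 1))  # prints 74.3
```

    Higher resistance for most habitats, as in the HSM-informed surfaces, raises these path costs and shrinks the resulting networks, which is the fragmentation effect the study reports.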

  4. Shared Mechanisms in the Estimation of Self-Generated Actions and the Prediction of Other's Actions by Humans.

    PubMed

    Ikegami, Tsuyoshi; Ganesh, Gowrishankar

    2017-01-01

    The question of how humans predict outcomes of observed motor actions by others is a fundamental problem in cognitive and social neuroscience. Previous theoretical studies have suggested that the brain uses parts of the forward model (used to estimate sensory outcomes of self-generated actions) to predict outcomes of observed actions. However, this hypothesis has remained controversial due to the lack of direct experimental evidence. To address this issue, we analyzed the behavior of darts experts in an action-understanding learning paradigm and utilized computational modeling to examine how outcome prediction of observed actions affected the participants' ability to estimate their own actions. We recruited darts experts because sports experts are known to have an accurate outcome estimation of their own actions as well as prediction of actions observed in others. We first show that learning to predict the outcomes of observed dart throws deteriorates an expert's abilities to both produce his own darts actions and estimate the outcome of his own throws (or self-estimation). Next, we introduce a state-space model to explain the trial-by-trial changes in the darts performance and self-estimation through our experiment. The model-based analysis reveals that the change in an expert's self-estimation is explained only by considering a change in the individual's forward model, showing that an improvement in an expert's ability to predict outcomes of observed actions affects the individual's forward model. These results suggest that parts of the same forward model are utilized in humans to both estimate outcomes of self-generated actions and predict outcomes of observed actions.
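
    The trial-by-trial dynamics described above can be caricatured by the simplest error-driven state update. This is a generic learning-rule sketch with an assumed learning rate, not the authors' fitted state-space model:

```python
def simulate_learning(true_outcome, trials, alpha=0.3, x0=0.0):
    """Error-driven update of an internal (forward-model-like) estimate:
    x[t+1] = x[t] + alpha * (true_outcome - x[t])."""
    x = x0
    history = []
    for _ in range(trials):
        x += alpha * (true_outcome - x)
        history.append(x)
    return history

h = simulate_learning(true_outcome=1.0, trials=10)
print(round(h[0], 3), round(h[-1], 3))  # prints 0.3 0.972
```

    In the paper's analysis it is a change in this kind of internal state, rather than in motor execution alone, that accounts for the observed shift in self-estimation.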

  5. Detection of epileptic seizure in EEG signals using linear least squares preprocessing.

    PubMed

    Roshan Zamir, Z

    2016-09-01

    An epileptic seizure is a transient event of abnormal excessive neuronal discharge in the brain. This unwanted event can be obstructed by detection of electrical changes in the brain that happen before the seizure takes place. The automatic detection of seizures is necessary since the visual screening of EEG recordings is a time consuming task and requires experts to improve the diagnosis. Much of the prior research in detection of seizures has been based on artificial neural networks, genetic programming, and wavelet transforms. Although the highest achieved accuracy for classification is 100%, there are drawbacks, such as the existence of unbalanced datasets and the lack of investigation of performance consistency. To address these, four linear least squares-based preprocessing models are proposed to extract key features of an EEG signal in order to detect seizures. The first two models are newly developed. The original signal (EEG) is approximated by a sinusoidal curve. Its amplitude is modeled by a polynomial function and compared with a previously developed spline function. Different statistical measures, namely classification accuracy, true positive and negative rates, false positive and negative rates and precision, are utilised to assess the performance of the proposed models. These metrics are derived from confusion matrices obtained from classifiers. Different classifiers are used over the original dataset and the set of extracted features. The proposed models significantly reduce the dimension of the classification problem and the computational time while the classification accuracy is improved in most cases. The first and third models are promising feature extraction methods with the classification accuracy of 100%. Logistic, LazyIB1, LazyIB5, and J48 are the best classifiers. Their true positive and negative rates are 1 while false positive and negative rates are 0 and the corresponding precision values are 1.
Numerical results suggest that these models are robust and efficient for detecting epileptic seizures.
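
    When the frequency of the approximating sinusoid is fixed, fitting its amplitude components is an ordinary linear least-squares problem, which is the sense in which these preprocessing models are linear. A minimal sketch on synthetic data; the model structure here is illustrative, not the paper's exact formulation:

```python
import math

def fit_sinusoid(t, y, omega):
    """Linear least squares for y ~ A*sin(omega*t) + B*cos(omega*t).
    With omega fixed, the problem is linear in A and B (2x2 normal equations)."""
    s = [math.sin(omega * ti) for ti in t]
    c = [math.cos(omega * ti) for ti in t]
    sss = sum(a * a for a in s)
    scc = sum(a * a for a in c)
    ssc = sum(a * b for a, b in zip(s, c))
    bs = sum(a * yi for a, yi in zip(s, y))
    bc = sum(a * yi for a, yi in zip(c, y))
    det = sss * scc - ssc * ssc
    return (scc * bs - ssc * bc) / det, (sss * bc - ssc * bs) / det

# Synthetic 'EEG-like' signal with known coefficients.
t = [i * 0.01 for i in range(200)]
y = [3.0 * math.sin(5 * ti) + 0.5 * math.cos(5 * ti) for ti in t]
A, B = fit_sinusoid(t, y, omega=5)
print(round(A, 3), round(B, 3))  # prints 3.0 0.5
```

    The fitted coefficients (A, B) then serve as low-dimensional features for a downstream classifier, which is the dimension-reduction role the abstract describes.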

  6. Expert and non-expert knowledge in medical practice.

    PubMed

    Nordin, I

    2000-01-01

    One problematic aspect of the rationality of medical practice concerns the relation between expert knowledge and non-expert knowledge. In medical practice it is important to match medical knowledge with the self-knowledge of the individual patient. This paper tries to study the problem of such matching by describing a model for technological paradigms and comparing it with an ideal of technological rationality. The professionalised experts tend to base their decisions and actions mostly on medical knowledge while the rationality of medicine also involves just as important elements of the personal evaluation and knowledge of the patients. Since both types of knowledge are necessary for rational decisions, the gap between the expert and the non-expert has to be bridged in some way. A solution to the problem is suggested in terms of pluralism, with the patient as ultimate decision-maker.

  7. Flood damage modeling based on expert knowledge: Insights from French damage model for agricultural sector

    NASA Astrophysics Data System (ADS)

    Grelot, Frédéric; Agenais, Anne-Laurence; Brémond, Pauline

    2015-04-01

    In France, since 2011, it is mandatory for local communities to conduct cost-benefit analysis (CBA) of their flood management projects, to make them eligible for financial support from the State. Meanwhile, as a support, the French Ministry in charge of Environment proposed a methodology to fulfill CBA. Like for many other countries, this methodology is based on the estimation of flood damage. However, existing models to estimate flood damage were judged not convenient for a national-wide use. As a consequence, the French Ministry in charge of Environment launched studies to develop damage models for different sectors, such as: residential sector, public infrastructures, agricultural sector, and commercial and industrial sector. In this presentation, we aim at presenting and discussing methodological choices of those damage models. They all share the same principle: no sufficient data from past events were available to build damage models on a statistical analysis, so modeling was based on expert knowledge. We will focus on the model built for agricultural activities and more precisely for agricultural lands. This model was based on feedback from 30 agricultural experts who experienced floods in their geographical areas. They were selected to have a representative experience of crops and flood conditions in France. The model is composed of: (i) damaging functions, which reveal physiological vulnerability of crops, (ii) action functions, which correspond to farmers' decision rules for carrying on crops after a flood, and (iii) economic agricultural data, which correspond to featured characteristics of crops in the geographical area where the flood management project studied takes place. The two first components are generic and the third one is specific to the area studied. It is, thus, possible to produce flood damage functions adapted to different agronomic and geographical contexts. 
In the end, the model was applied to obtain a pool of damage functions giving damage in euros by hectare for 14 agricultural lands categories. As a conclusion, we will discuss the validation step of the model. Although the model was validated by experts, we analyse how it could gain insight from comparison with past events.

  9. Plus Disease in Retinopathy of Prematurity: Improving Diagnosis by Ranking Disease Severity and Using Quantitative Image Analysis.

    PubMed

    Kalpathy-Cramer, Jayashree; Campbell, J Peter; Erdogmus, Deniz; Tian, Peng; Kedarisetti, Dharanish; Moleta, Chace; Reynolds, James D; Hutcheson, Kelly; Shapiro, Michael J; Repka, Michael X; Ferrone, Philip; Drenser, Kimberly; Horowitz, Jason; Sonmez, Kemal; Swan, Ryan; Ostmo, Susan; Jonas, Karyn E; Chan, R V Paul; Chiang, Michael F

    2016-11-01

    To determine expert agreement on relative retinopathy of prematurity (ROP) disease severity and whether computer-based image analysis can model relative disease severity, and to propose consideration of a more continuous severity score for ROP. We developed 2 databases of clinical images of varying disease severity (100 images and 34 images) as part of the Imaging and Informatics in ROP (i-ROP) cohort study and recruited expert physician, nonexpert physician, and nonphysician graders to classify and perform pairwise comparisons on both databases. Six participating expert ROP clinician-scientists, each with a minimum of 10 years of clinical ROP experience and 5 ROP publications, and 5 image graders (3 physicians and 2 nonphysician graders) who analyzed images that were obtained during routine ROP screening in neonatal intensive care units. Images in both databases were ranked by average disease classification (classification ranking), by pairwise comparison using the Elo rating method (comparison ranking), and by correlation with the i-ROP computer-based image analysis system. Interexpert agreement (weighted κ statistic) compared with the correlation coefficient (CC) between experts on pairwise comparisons and correlation between expert rankings and computer-based image analysis modeling. There was variable interexpert agreement on diagnostic classification of disease (plus, preplus, or normal) among the 6 experts (mean weighted κ, 0.27; range, 0.06-0.63), but good correlation between experts on comparison ranking of disease severity (mean CC, 0.84; range, 0.74-0.93) on the set of 34 images. Comparison ranking provided a severity ranking that was in good agreement with ranking obtained by classification ranking (CC, 0.92). Comparison ranking on the larger dataset by both expert and nonexpert graders demonstrated good correlation (mean CC, 0.97; range, 0.95-0.98). The i-ROP system was able to model this continuous severity with good correlation (CC, 0.86). 
Experts diagnose plus disease on a continuum, with poor absolute agreement on classification but good relative agreement on disease severity. These results suggest that the use of pairwise rankings and a continuous severity score, such as that provided by the i-ROP system, may improve agreement on disease severity in the future. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
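
    The Elo-based comparison ranking named in this record can be sketched as follows. This is a generic, simplified Elo update applied to pairwise "which image is more severe" judgments, not the i-ROP implementation; the K-factor and starting rating are conventional defaults.

```python
# Hedged sketch: rank items (e.g. images by judged severity) from
# pairwise comparisons using standard Elo updates.

def elo_update(r_winner, r_loser, k=32.0):
    """Update two Elo ratings after one pairwise comparison."""
    expected_win = 1.0 / (1.0 + 10 ** ((r_loser - r_winner) / 400.0))
    r_winner += k * (1.0 - expected_win)
    r_loser -= k * (1.0 - expected_win)
    return r_winner, r_loser

def rank_by_comparisons(n_items, comparisons, k=32.0):
    """comparisons: iterable of (winner_index, loser_index) pairs.
    Returns item indices sorted from highest to lowest rating."""
    ratings = [1500.0] * n_items
    for w, l in comparisons:
        ratings[w], ratings[l] = elo_update(ratings[w], ratings[l], k)
    return sorted(range(n_items), key=lambda i: -ratings[i])
```

    Repeating the comparisons stabilizes the ordering; an item that consistently "wins" severity comparisons ends with the highest rating.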

  10. Estimating distribution and connectivity of recolonizing American marten in the northeastern United States using expert elicitation techniques

    USGS Publications Warehouse

    Aylward, C.M.; Murdoch, J.D.; Donovan, Therese M.; Kilpatrick, C.W.; Bernier, C.; Katz, J.

    2018-01-01

    The American marten Martes americana is a species of conservation concern in the northeastern United States due to widespread declines from over‐harvesting and habitat loss. Little information exists on current marten distribution and how landscape characteristics shape patterns of occupancy across the region, which could help develop effective recovery strategies. The rarity of marten and the lack of historical distribution records are also problematic for region‐wide conservation planning. Expert opinion can provide a source of information for estimating species–landscape relationships and is especially useful when empirical data are sparse. We created a survey to elicit expert opinion and build a model that describes marten occupancy in the northeastern United States as a function of landscape conditions. We elicited opinions from 18 marten experts, including wildlife managers, trappers and researchers. Each expert estimated occupancy probability at 30 sites in their geographic region of expertise. We then fit the response data with a set of 58 models that incorporated the effects of covariates related to forest characteristics, climate, anthropogenic impacts and competition at two spatial scales (1.5 and 5 km radii), and used model selection techniques to determine the best model in the set. Three top models had strong empirical support, which we model averaged based on AIC weights. The final model included effects of five covariates at the 5‐km scale: percent canopy cover (positive), percent spruce‐fir land cover (positive), winter temperature (negative), elevation (positive) and road density (negative). A receiver operating characteristic curve indicated that the model performed well based on recent occurrence records. We mapped distribution across the region and used circuit theory to estimate movement corridors between isolated core populations.
The results demonstrate the effectiveness of expert‐opinion data at modeling occupancy for rare species and provide tools for planning marten recovery in the northeastern United States.
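
    The AIC-weight model averaging described in this record follows a standard recipe: convert AIC differences to Akaike weights, then average per-model predictions. A minimal sketch (the AIC values and predictions below are made up for illustration, not the study's 58 models):

```python
import math

def akaike_weights(aic_values):
    """Convert a list of AIC scores into Akaike weights summing to 1."""
    best = min(aic_values)
    deltas = [a - best for a in aic_values]          # delta-AIC per model
    rel_likelihoods = [math.exp(-0.5 * d) for d in deltas]
    total = sum(rel_likelihoods)
    return [r / total for r in rel_likelihoods]

def model_average(predictions, weights):
    """Weighted average of per-model predictions (e.g. occupancy at one site)."""
    return sum(p * w for p, w in zip(predictions, weights))
```

    The best model (lowest AIC) always receives the largest weight, and models within a few AIC units still contribute meaningfully to the averaged prediction.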

  11. Conditioning of high voltage radio frequency cavities by using fuzzy logic in connection with rule based programming

    NASA Astrophysics Data System (ADS)

    Perreard, S.; Wildner, E.

    1994-12-01

    Many processes are controlled by experts using some kind of mental model to decide on actions and make conclusions. This model, based on heuristic knowledge, can often be represented by rules and does not have to be particularly accurate. Such is the case for the problem of conditioning high voltage RF cavities; the expert has to decide, by observing some criteria, whether to increase or to decrease the voltage and by how much. A program has been implemented which can be applied to a class of similar problems. The kernel of the program is a small rule base, which is independent of the kind of cavity. To model a specific cavity, we use fuzzy logic which is implemented as a separate routine called by the rule base, to translate from numeric to symbolic information.
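
    The numeric-to-symbolic translation that the fuzzy routine performs for the rule base can be sketched with standard triangular membership functions. The monitored quantity, breakpoints, and labels below are invented for illustration; they stand in for whatever criteria the cavity expert actually observes.

```python
# Hedged sketch: fuzzify a numeric reading into degrees of membership
# in symbolic terms that a rule base can then reason over.

def triangular(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify_reading(x):
    """Map a numeric reading to membership degrees in symbolic terms.
    Breakpoints here are illustrative, not cavity-specific values."""
    return {
        "low": triangular(x, -1.0, 0.0, 1.0),
        "medium": triangular(x, 0.5, 1.5, 2.5),
        "high": triangular(x, 2.0, 3.0, 4.0),
    }
```

    A rule such as "if reading is high, decrease the voltage" then fires with a strength given by the membership degree, which keeps the rule base itself independent of the specific cavity.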

  12. Viewing Knowledge Bases as Qualitative Models.

    ERIC Educational Resources Information Center

    Clancey, William J.

    The concept of a qualitative model provides a unifying perspective for understanding how expert systems differ from conventional programs. Knowledge bases contain qualitative models of systems in the world, that is, primarily non-numeric descriptions that provide a basis for explaining and predicting behavior and formulating action plans. The…

  13. A statistical approach to identify, monitor, and manage incomplete curated data sets.

    PubMed

    Howe, Douglas G

    2018-04-02

    Many biological knowledge bases gather data through expert curation of published literature. High data volume, selective partial curation, delays in access, and publication of data prior to the ability to curate it can result in incomplete curation of published data. Knowing which data sets are incomplete and how incomplete they are remains a challenge. Awareness that a data set may be incomplete is important for proper interpretation, to avoid flawed hypothesis generation, and can justify further exploration of published literature for additional relevant data. Computational methods to assess data set completeness are needed. One such method is presented here. In this work, a multivariate linear regression model was used to identify genes in the Zebrafish Information Network (ZFIN) Database having incomplete curated gene expression data sets. Starting with 36,655 gene records from ZFIN, data aggregation, cleansing, and filtering reduced the set to 9870 gene records suitable for training and testing the model to predict the number of expression experiments per gene. Feature engineering and selection identified the following predictive variables: the number of journal publications; the number of journal publications already attributed for gene expression annotation; the percent of journal publications already attributed for expression data; the gene symbol; and the number of transgenic constructs associated with each gene. Twenty-five percent of the gene records (2483 genes) were used to train the model. The remaining 7387 genes were used to test the model. Of the 7387 tested genes, 122 were identified as missing expression annotations because their residuals fell below the model's lower 95% confidence bound, and 165 because their residuals fell above the upper bound. The model had precision of 0.97 and recall of 0.71 at the negative 95% confidence interval and precision of 0.76 and recall of 0.73 at the positive 95% confidence interval.
This method can be used to identify data sets that are incompletely curated, as demonstrated using the gene expression data set from ZFIN. This information can help both database resources and data consumers gauge when it may be useful to look further for published data to augment the existing expertly curated information.
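
    The core screening step described in this record — fit a linear model, then flag points whose residuals fall outside a 95% band — can be sketched in a few lines. The data here are synthetic stand-ins, and a plain residual standard deviation with a z = 1.96 band is used as a simplification of a full prediction interval.

```python
import numpy as np

def fit_and_flag(X, y, z=1.96):
    """Fit ordinary least squares (with intercept) and return boolean
    masks for points below / above an approximate 95% residual band."""
    X1 = np.column_stack([np.ones(len(X)), X])      # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)   # OLS coefficients
    residuals = y - X1 @ beta
    half_width = z * residuals.std(ddof=X1.shape[1])
    return residuals < -half_width, residuals > half_width
```

    In the ZFIN use case, points below the band correspond to genes with fewer curated expression experiments than the model predicts, i.e. candidates for incomplete curation.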

  14. Using collective expert judgements to evaluate quality measures of mass spectrometry images.

    PubMed

    Palmer, Andrew; Ovchinnikova, Ekaterina; Thuné, Mikael; Lavigne, Régis; Guével, Blandine; Dyatlov, Andrey; Vitek, Olga; Pineau, Charles; Borén, Mats; Alexandrov, Theodore

    2015-06-15

    Imaging mass spectrometry (IMS) is a maturing technique of molecular imaging. Confidence in the reproducible quality of IMS data is essential for its integration into routine use. However, the predominant method for assessing quality is visual examination, a time-consuming, unstandardized and non-scalable approach. So far, the problem of assessing the quality has only been marginally addressed and existing measures do not account for the spatial information of IMS data. Importantly, no approach exists for unbiased evaluation of potential quality measures. We propose a novel approach for evaluating potential measures by creating a gold-standard set using collective expert judgements upon which we evaluated image-based measures. To produce a gold standard, we engaged 80 IMS experts, each to rate the relative quality between 52 pairs of ion images from MALDI-TOF IMS datasets of rat brain coronal sections. Experts' optional feedback on their expertise, the task and the survey showed that (i) they had diverse backgrounds and sufficient expertise, (ii) the task was properly understood, and (iii) the survey was comprehensible. A moderate inter-rater agreement was achieved with Krippendorff's alpha of 0.5. A gold-standard set of 634 pairs of images with accompanying ratings was constructed and showed a high agreement of 0.85. Eight families of potential measures with a range of parameters and statistical descriptors, giving 143 in total, were evaluated. Both signal-to-noise and spatial chaos-based measures performed highly with a correlation of 0.7 to 0.9 with the gold standard ratings. Moreover, we showed that a composite measure with the linear coefficients (trained on the gold standard with regularized least squares optimization and lasso) showed a strong linear correlation of 0.94 and an accuracy of 0.98 in predicting which image in a pair was of higher quality.
The anonymized data collected from the survey and the Matlab source code for data processing can be found at: https://github.com/alexandrovteam/IMS_quality. © The Author 2015. Published by Oxford University Press.
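
    The composite-measure step in this record — fitting linear coefficients over candidate quality measures against gold-standard ratings with regularized least squares — can be sketched with a closed-form ridge solution. The measures and ratings below are synthetic; the paper additionally used lasso, which has no closed form and is omitted here.

```python
import numpy as np

def fit_composite(measures, ratings, alpha=1.0):
    """Ridge regression: coefficients for a linear composite measure.
    measures: (n_samples, n_measures) array; ratings: (n_samples,) array."""
    X, y = np.asarray(measures), np.asarray(ratings)
    n_features = X.shape[1]
    # Closed-form ridge solution: (X^T X + alpha I)^{-1} X^T y
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

def composite_score(coef, measure_values):
    """Apply the learned linear combination to one image's measures."""
    return float(np.dot(coef, measure_values))
```

    With the coefficients fitted, comparing two images reduces to comparing their composite scores, which is how pairwise quality predictions can be read off.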

  15. Efficient fuzzy Bayesian inference algorithms for incorporating expert knowledge in parameter estimation

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad

    2016-05-01

    Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision essentially embedded in expert provided information. In order to solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference' which is the result of integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert provided information, (2) it allows uncertainty and imprecision to be modeled distinctly, and (3) it presents a framework for fusing expert provided information regarding the various inputs of the Bayesian inference algorithm. However, an important obstacle to employing fuzzy Bayesian inference in groundwater numerical modeling applications is the computational burden, as the required number of numerical model simulations often becomes prohibitively large and computationally infeasible. In this paper, a novel approach to accelerating the fuzzy Bayesian inference algorithm is proposed, based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. Then the proposed approach is applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf.
An expert elicitation methodology is developed and applied to the real-world test case in order to provide a road map for the use of fuzzy Bayesian inference in groundwater modeling applications.
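
    The surrogate-screening idea in this record — use a cheap approximation of the posterior to decide which candidate parameters deserve an expensive simulator run — can be illustrated schematically. Everything below is an invented toy (Gaussian likelihoods standing in for a numerical model and its surrogate), not the paper's algorithm.

```python
import math

def expensive_likelihood(theta, obs):
    """Stand-in for a likelihood requiring a costly numerical model run."""
    return math.exp(-0.5 * (theta - obs) ** 2)

def surrogate_likelihood(theta, obs):
    """Cheap approximation (here: a deliberately wider Gaussian)."""
    return math.exp(-0.125 * (theta - obs) ** 2)

def screened_evaluations(candidates, obs, threshold=0.1):
    """Run the expensive model only where the surrogate looks promising,
    skipping candidates the surrogate already rules out."""
    results = {}
    for theta in candidates:
        if surrogate_likelihood(theta, obs) >= threshold:
            results[theta] = expensive_likelihood(theta, obs)
    return results
```

    Because the surrogate is wider than the true likelihood, it errs on the side of keeping candidates, which is the safe direction for a screening tool.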

  16. Localized Smart-Interpretation

    NASA Astrophysics Data System (ADS)

    Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom

    2014-05-01

    The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model, the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data in many cases is so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that a lot of the information available from the geophysical surveys is unexploited, which is a problem, because the resulting geological model does not fulfill its full potential and hence is less trustworthy. We suggest an approach to geological modeling that (1) allows all geophysical data to be considered when building the geological model, (2) is fast, and (3) allows quantification of the geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as for example the depth to the base of a ground water reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretations, [m1, m2, ...]. This makes it possible to quantify how the geological expert performs interpretation through f(d,m). As the geological expert proceeds with interpreting, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases.
    When a model f(d,m) has successfully been inferred, we are able to simulate how the geological expert would perform an interpretation given some external information m, through f(d|m). We will demonstrate this method applied to geological interpretation and densely sampled airborne electromagnetic data. In short, our goal is to build a statistical model describing how a geological expert performs geological interpretation given some geophysical data. We then wish to use this statistical model to perform semi-automatic interpretation, everywhere such geophysical data exist, in a manner consistent with the choices made by a geological expert. Benefits of such a statistical model are that (1) it provides a quantification of how a geological expert performs interpretation based on available diverse data, (2) all available geophysical information can be used, and (3) it allows much faster interpretation of large data sets.
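
    One very simple way to learn a mapping from auxiliary information m to an expert's interpretation d from past examples is nearest-neighbour regression. This is our own illustrative simplification, not the authors' statistical model f(d,m); the example pairs are synthetic.

```python
# Hedged sketch: predict an interpretation d (e.g. a depth) at a new
# location from the k most similar past (m, d) expert examples.

def knn_predict(examples, m_new, k=3):
    """examples: list of (m_vector, d) pairs from past interpretations.
    Returns the mean d of the k nearest examples in m-space."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(examples, key=lambda ex: dist(ex[0], m_new))[:k]
    return sum(d for _, d in nearest) / len(nearest)
```

    As the expert keeps interpreting, new (m, d) pairs are appended to the example list, which is the same "model improves with more interpreted points" behaviour described above.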

  17. Effects of Instructional Design with Mental Model Analysis on Learning.

    ERIC Educational Resources Information Center

    Hong, Eunsook

    This paper presents a model for systematic instructional design that includes mental model analysis together with the procedures used in developing computer-based instructional materials in the area of statistical hypothesis testing. The instructional design model is based on the premise that the objective for learning is to achieve expert-like…

  18. Evaluation models of some morphological characteristics for talent scouting in sport.

    PubMed

    Rogulj, Nenad; Papić, Vladan; Cavala, Marijana

    2009-03-01

    In this paper, for the purpose of expert system evaluation within the scientific project "Talent scouting in sport", two methodological approaches for recognizing an athlete's morphological compatibility with various sports have been presented, evaluated and compared. The first approach is based on fuzzy logic and expert opinion about the compatibility of proposed hypothetical morphological models for 14 different sports which are part of the expert system. The second approach is based on determining the differences between the morphological characteristics of a tested individual and top athletes' morphological characteristics for a particular sport. The logical and mathematical bases of both methodological approaches have been explained in detail. High prognostic efficiency in recognizing an individual's sport has been determined. Some improvements for further development of both methods have been proposed. Results of the research so far suggest that this or similar approaches can be successfully used for detecting an individual's morphological compatibility with different sports. Also, it is expected to be useful in the selection of young talents for a particular sport.
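
    The second approach in this record — comparing an individual's measurements with top athletes' profiles per sport — can be sketched as a distance ranking. The prototype numbers below are invented stand-ins (e.g. height and weight), not the study's morphological models; in practice, measurements would be normalized before computing distances.

```python
# Hedged sketch: rank sports by how close an individual's measurement
# vector is to each sport's top-athlete prototype.

def closest_sports(profile, prototypes):
    """prototypes: dict sport -> measurement vector.
    Returns sport names ordered from most to least compatible."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return sorted(prototypes, key=lambda sport: dist(prototypes[sport], profile))
```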

  19. A Model of Expertise: A Case Study of a Second Language Teacher Educator

    ERIC Educational Resources Information Center

    Asaba, Mayumi

    2018-01-01

    This study investigates the characteristics of an L2 expert teacher educator. The expert participant was selected based on the criteria suggested by educational expertise studies: years of teaching experience, high reputation among multiple constituencies, and evidence of impact on student performance. The data collection included observations,…

  20. Diversified models for portfolio selection based on uncertain semivariance

    NASA Astrophysics Data System (ADS)

    Chen, Lin; Peng, Jin; Zhang, Bo; Rosyida, Isnaini

    2017-02-01

    Since financial markets are complex, future security returns are sometimes represented mainly by experts' estimations due to a lack of historical data. This paper proposes a semivariance method for diversified portfolio selection, in which the security returns are given by experts' estimations and depicted as uncertain variables. In the paper, three properties of the semivariance of uncertain variables are verified. Based on the concept of semivariance of uncertain variables, two types of mean-semivariance diversified models for uncertain portfolio selection are proposed. Since the models are complex, a hybrid intelligent algorithm based on the 99-method and a genetic algorithm is designed to solve them. In this hybrid intelligent algorithm, the 99-method is applied to compute the expected value and semivariance of uncertain variables, and the genetic algorithm is employed to seek the best allocation plan for portfolio selection. Finally, several numerical examples are presented to illustrate the modelling idea and the effectiveness of the algorithm.
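
    For intuition, the mean-semivariance idea can be illustrated with ordinary sample returns. Note this is the classical analogue only: the paper works with uncertain variables and the 99-method, which this sketch does not attempt to reproduce.

```python
# Hedged sketch: sample semivariance (downside risk) and a simple
# mean-semivariance objective, higher being better.

def semivariance(returns):
    """Average squared shortfall below the mean return."""
    mean = sum(returns) / len(returns)
    downside = [(mean - r) ** 2 for r in returns if r < mean]
    return sum(downside) / len(returns)

def mean_semivariance_score(returns, risk_aversion=1.0):
    """Expected return penalized by downside risk only."""
    mean = sum(returns) / len(returns)
    return mean - risk_aversion * semivariance(returns)
```

    Unlike variance, semivariance penalizes only below-mean outcomes, so a return stream with the same mean but a heavier downside scores worse.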

  1. Applications of artificial intelligence 1993: Knowledge-based systems in aerospace and industry; Proceedings of the Meeting, Orlando, FL, Apr. 13-15, 1993

    NASA Technical Reports Server (NTRS)

    Fayyad, Usama M. (Editor); Uthurusamy, Ramasamy (Editor)

    1993-01-01

    The present volume on applications of artificial intelligence with regard to knowledge-based systems in aerospace and industry discusses machine learning and clustering, expert systems and optimization techniques, monitoring and diagnosis, and automated design and expert systems. Attention is given to the integration of AI reasoning systems and hardware description languages, case-based reasoning, knowledge retrieval, and training systems, and scheduling and planning. Topics addressed include the preprocessing of remotely sensed data for efficient analysis and classification, autonomous agents as air combat simulation adversaries, intelligent data presentation for real-time spacecraft monitoring, and an integrated reasoner for diagnosis in satellite control. Also discussed are a knowledge-based system for the design of heat exchangers, reuse of design information for model-based diagnosis, automatic compilation of expert systems, and a case-based approach to handling aircraft malfunctions.

  2. Elicitation of quantitative data from a heterogeneous expert panel: formal process and application in animal health.

    PubMed

    Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S

    2002-02-01

    This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. The Classical model was used to weight the experts' assessments in order to construct a single distribution per variable; in this model, an expert's quality is typically based on his or her performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol, in combination with the proposed elicitation and analysis techniques, resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
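
    The performance-based weighting idea — score each expert on seed variables with known answers, then pool estimates with weights proportional to that score — can be sketched as below. The scoring rule here (inverse mean absolute error) is a simple stand-in, not Cooke's Classical-model calibration score, and the numbers are invented.

```python
# Hedged sketch: pool point estimates from several experts, weighting
# each expert by performance on seed questions with known answers.

def pool_estimates(expert_estimates, seed_errors):
    """expert_estimates: one estimate per expert for the target variable.
    seed_errors: per expert, the list of errors made on seed variables.
    Returns (pooled_estimate, weights)."""
    scores = [1.0 / (1e-9 + sum(abs(e) for e in errs) / len(errs))
              for errs in seed_errors]                 # better seeds -> higher score
    total = sum(scores)
    weights = [s / total for s in scores]
    pooled = sum(w * est for w, est in zip(weights, expert_estimates))
    return pooled, weights
```

    An expert who was wildly wrong on the seed variables is thus nearly ignored in the pooled value, which is the essential behaviour of performance-based weighting.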

  3. High-resolution modeling of antibody structures by a combination of bioinformatics, expert knowledge, and molecular simulations.

    PubMed

    Shirai, Hiroki; Ikeda, Kazuyoshi; Yamashita, Kazuo; Tsuchiya, Yuko; Sarmiento, Jamica; Liang, Shide; Morokata, Tatsuaki; Mizuguchi, Kenji; Higo, Junichi; Standley, Daron M; Nakamura, Haruki

    2014-08-01

    In the second antibody modeling assessment, we used a semiautomated template-based structure modeling approach for 11 blinded antibody variable region (Fv) targets. The structural modeling method involved several steps, including template selection for framework and canonical structures of complementarity-determining regions (CDRs), homology modeling, energy minimization, and expert inspection. The submitted models for Fv modeling in Stage 1 had the lowest average backbone root mean square deviation (RMSD) (1.06 Å). Comparison to crystal structures showed the most accurate Fv models were generated for 4 out of 11 targets. We found that the successful modeling in Stage 1 mainly was due to expert-guided template selection for CDRs, especially for CDR-H3, based on our previously proposed empirical method (H3-rules) and the use of position specific scoring matrix-based scoring. Loop refinement using fragment assembly and multicanonical molecular dynamics (McMD) was applied to CDR-H3 loop modeling in Stage 2. Fragment assembly and McMD produced putative structural ensembles with low free energy values that were scored based on the OSCAR all-atom force field and conformation density in principal component analysis space, respectively, as well as the degree of consensus between the two sampling methods. The quality of 8 out of 10 targets improved as compared with Stage 1. For 4 out of 10 Stage-2 targets, our method generated top-scoring models with RMSD values of less than 1 Å. In this article, we discuss the strengths and weaknesses of our approach as well as possible directions for improvement to generate better predictions in the future. © 2014 Wiley Periodicals, Inc.

  4. Combining Decision Rules from Classification Tree Models and Expert Assessment to Estimate Occupational Exposure to Diesel Exhaust for a Case-Control Study

    PubMed Central

    Friesen, Melissa C.; Wheeler, David C.; Vermeulen, Roel; Locke, Sarah J.; Zaebst, Dennis D.; Koutros, Stella; Pronk, Anjoeka; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Malats, Nuria; Schwenn, Molly; Johnson, Alison; Armenti, Karla R.; Rothman, Nathanial; Stewart, Patricia A.; Kogevinas, Manolis; Silverman, Debra T.

    2016-01-01

    Objectives: To efficiently and reproducibly assess occupational diesel exhaust exposure in a Spanish case-control study, we examined the utility of applying decision rules that had been extracted from expert estimates and questionnaire response patterns using classification tree (CT) models from a similar US study. Methods: First, previously extracted CT decision rules were used to obtain initial ordinal (0–3) estimates of the probability, intensity, and frequency of occupational exposure to diesel exhaust for the 10 182 jobs reported in a Spanish case-control study of bladder cancer. Second, two experts reviewed the CT estimates for 350 jobs randomly selected from strata based on each CT rule’s agreement with the expert ratings in the original study [agreement rate, from 0 (no agreement) to 1 (perfect agreement)]. Their agreement with each other and with the CT estimates was calculated using weighted kappa (κw) and guided our choice of jobs for subsequent expert review. Third, an expert review comprised all jobs with lower confidence (low-to-moderate agreement rates or discordant assignments, n = 931) and a subset of jobs with a moderate to high CT probability rating and with moderately high agreement rates (n = 511). Logistic regression was used to examine the likelihood that an expert provided a different estimate than the CT estimate based on the CT rule agreement rates, the CT ordinal rating, and the availability of a module with diesel-related questions. Results: Agreement between estimates made by two experts and between estimates made by each of the experts and the CT estimates was very high for jobs with estimates that were determined by rules with high CT agreement rates (κw: 0.81–0.90). For jobs with estimates based on rules with lower agreement rates, moderate agreement was observed between the two experts (κw: 0.42–0.67) and poor-to-moderate agreement was observed between the experts and the CT estimates (κw: 0.09–0.57).
In total, the expert review of 1442 jobs changed 156 probability estimates, 128 intensity estimates, and 614 frequency estimates. The expert was more likely to provide a different estimate when the CT rule agreement rate was <0.8, when the CT ordinal ratings were low to moderate, or when a module with diesel questions was available. Conclusions: Our reliability assessment provided important insight into where to prioritize additional expert review; as a result, only 14% of the jobs underwent expert review, substantially reducing the exposure assessment burden. Overall, we found that we could efficiently, reproducibly, and reliably apply CT decision rules from one study to assess exposure in another study. PMID:26732820
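
    The weighted kappa statistic used throughout this record can be sketched for two raters giving ordinal (0–3) ratings. This is a standard linearly weighted Cohen's kappa; the rating vectors in the test are synthetic, and the original study may have used a different weighting scheme (e.g. quadratic).

```python
import numpy as np

def weighted_kappa(r1, r2, n_cat=4):
    """Linearly weighted Cohen's kappa for two equal-length lists of
    ordinal ratings in {0, ..., n_cat-1}."""
    obs = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()                                   # joint proportions
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    i, j = np.indices((n_cat, n_cat))
    w = np.abs(i - j) / (n_cat - 1)                    # linear disagreement weights
    return 1 - (w * obs).sum() / (w * expected).sum()
```

    The linear weights make a one-category disagreement count less than a three-category disagreement, which matters for ordinal exposure ratings.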

  5. Expert Systems for Real-Time Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Cassisi, C.; Cannavo, F.; Montalto, P.; Motta, P.; Schembra, G.; Aliotta, M. A.; Cannata, A.; Patanè, D.; Prestifilippo, M.

    2014-12-01

    In the last decade, the capability to monitor and quickly respond to remote detection of volcanic activity has been greatly improved through use of advanced techniques and semi-automatic software applications installed in most of the 24h control rooms devoted to volcanic surveillance. Ability to monitor volcanoes is being advanced by new technology, such as broad-band seismology, microphone networks mainly recording in the infrasonic frequency band, satellite observations of ground deformation, high-quality video surveillance systems (also in the infrared band), improved sensors for volcanic gas measurements, and advances in computer power and speed, leading to improvements in data transmission, data analysis and modeling techniques. One of the most critical points in the real-time monitoring chain is the evaluation of the volcano state from all the measurements. At present, most of this task is delegated to one or more human experts in volcanology. Unfortunately, the volcano state assessment becomes harder if we observe that, due to the coupling of highly non-linear and complex volcanic dynamic processes, the measurable effects can show a rich range of different behaviors. Moreover, due to intrinsic uncertainties and possible failures in some recorded data, precise state assessment is usually not achievable. Hence, the volcano state needs to be expressed in probabilistic terms that take account of uncertainties. In the framework of the PON SIGMA project (Integrated Cloud-Sensor System for Advanced Multirisk Management), we have developed an expert system approach to estimate the ongoing volcano state from all the available measurements and with minimal human interaction. The approach is based on a hidden Markov model and deals with uncertainties and probabilities. We tested the proposed approach on data coming from the Mt. Etna (Italy) continuous monitoring networks for the period 2011-2013.
Results show that this approach can be a valuable tool to aid the operator in volcano real-time monitoring.
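
    The hidden-Markov-model machinery behind such a probabilistic state estimate can be sketched with the standard forward (filtering) recursion. The two states, transition matrix, and emission probabilities below are invented toy numbers, not the Etna system's parameters.

```python
import numpy as np

def forward_filter(pi, A, B, observations):
    """HMM forward filtering. pi: initial state probabilities,
    A[i, j]: transition probability i -> j,
    B[i, k]: P(observation k | state i).
    Returns P(state | all observations so far)."""
    belief = pi * B[:, observations[0]]
    belief /= belief.sum()
    for obs in observations[1:]:
        belief = (belief @ A) * B[:, obs]   # predict, then weight by evidence
        belief /= belief.sum()              # renormalize to a probability vector
    return belief

pi = np.array([0.9, 0.1])                   # toy states: quiet, unrest
A = np.array([[0.95, 0.05],
              [0.20, 0.80]])
B = np.array([[0.8, 0.2],                   # toy observations: 0 = calm, 1 = anomalous
              [0.3, 0.7]])
state_probs = forward_filter(pi, A, B, [1, 1, 1])
```

    Repeated anomalous observations shift the belief toward the unrest state even though the prior favors quiet, which is exactly the probabilistic behaviour wanted from such a monitoring aid.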

  6. Identification and Control of Aircrafts using Multiple Models and Adaptive Critics

    NASA Technical Reports Server (NTRS)

    Principe, Jose C.

    2007-01-01

    We compared two possible implementations of local linear models for control: one approach is based on a self-organizing map (SOM) to cluster the dynamics, followed by a set of linear models operating at each cluster. The gating function is therefore hard (a single local model represents the regional dynamics). This simplifies the controller design since there is a one-to-one mapping between controllers and local models. The second approach uses a soft gate within a probabilistic framework based on a Gaussian mixture model (GMM; also called a dynamic mixture of experts). In this approach several models may be active at a given time; we can expect a smaller number of models, but the controller design is more involved, with potentially better noise rejection characteristics. Our experiments showed that the SOM provides the overall best performance at high SNRs, but its performance degrades faster than the GMM's under the same noise conditions. The SOM approach required about an order of magnitude more models than the GMM, so in terms of implementation cost, the GMM is preferable. The design of the SOM is straightforward, while the design of the GMM controllers, although still reasonable, is more involved and needs more care in the selection of the parameters. Either of these locally linear approaches outperforms global nonlinear controllers based on neural networks, such as the time delay neural network (TDNN). Therefore, in essence, the local model approach warrants practical implementation. In order to call the attention of the control community to this design methodology, we successfully extended the multiple model approach to PID controllers (still the most widely used control scheme in industry today) and wrote a paper on this subject. The echo state network (ESN) is a recurrent neural network with the special characteristic that only the output parameters are trained. The recurrent connections are preset according to the problem domain and are fixed.
    In a nutshell, the states of the reservoir of recurrent processing elements implement a projection space, where the desired response is optimally projected. This architecture trades a large increase in the dimension of the recurrent layer for training efficiency. However, the power of recurrent neural networks can be brought to bear on practical, difficult problems. Our goal was to implement an adaptive critic architecture implementing Bellman's approach to optimal control. However, we could only characterize the ESN performance as a critic in value function evaluation, which is just one of the pieces of the overall adaptive critic controller. The results were very convincing, and the simplicity of the implementation was unparalleled.

  7. Advanced Technology Training System on Motor-Operated Valves

    NASA Technical Reports Server (NTRS)

    Wiederholt, Bradley J.; Widjaja, T. Kiki; Yasutake, Joseph Y.; Isoda, Hachiro

    1993-01-01

    This paper describes how features from the field of Intelligent Tutoring Systems are applied to the Motor-Operated Valve (MOV) Advanced Technology Training System (ATTS). The MOV ATTS is a training system developed at Galaxy Scientific Corporation for the Central Research Institute of Electric Power Industry in Japan and the Electric Power Research Institute in the United States. The MOV ATTS combines traditional computer-based training approaches with system simulation, integrated expert systems, and student and expert modeling. The primary goal of the MOV ATTS is to reduce human errors that occur during MOV overhaul and repair. The MOV ATTS addresses this goal by providing basic operational information about the MOV, simulating MOV operation, providing practice in troubleshooting MOV failures, and tailoring this training to the needs of each individual student. The MOV ATTS integrates multiple expert models (functional and procedural) to provide advice and feedback to students; the integration also provides expert-model validation support to developers. Student modeling is supported by two separate student models: one registers and updates the student's current knowledge of basic MOV information, while the other logs the student's actions and errors during troubleshooting exercises. These two models are used to provide tailored feedback to the student during the MOV course.

  8. A Novel Approach to Support Majority Voting in Spatial Group MCDM Using a Density-Induced OWA Operator for Seismic Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Moradi, M.; Delavar, M. R.; Moshiri, B.; Khamespanah, F.

    2014-10-01

    Among the most frightening of disasters, earthquakes frequently cause huge damage to buildings, facilities and human beings. Although predicting the characteristics of an earthquake seems impossible, its loss and damage are predictable in advance. Seismic loss estimation models evaluate the extent to which urban areas are vulnerable to earthquakes. Many factors contribute to the vulnerability of urban areas against earthquakes, including the age and height of buildings, the quality of the materials, the density of population and the location of flammable facilities. Seismic vulnerability assessment is therefore a multi-criteria problem. A number of multi-criteria decision-making models have been proposed that rely on a single expert. The main objective of this paper is to propose a model that facilitates group multi-criteria decision making based on the concept of majority voting. The main idea of majority voting is to provide a computational tool that measures the degree to which different experts support each other's opinions, and to make a decision with regard to this measure. The applicability of this model is examined in the Tehran metropolitan area, which is located in a seismically active region. The results indicate that neglecting the experts who receive lower degrees of support from the others enables the decision makers to avoid extreme strategies. Moreover, a computational method is proposed to calculate the degree of optimism in the experts' opinions.
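
    The record above aggregates expert scores with a density-induced OWA operator. The sketch below shows plain OWA aggregation with a fixed, majority-leaning weight vector (a density-induced variant would instead derive the weights from how clustered the expert opinions are); all scores and weights are invented.

```python
def owa(values, weights):
    # Ordered weighted averaging: weights attach to rank positions rather
    # than to particular experts, so the same expert counts a lot or a
    # little depending on where their score falls in the ordering.
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Five experts' vulnerability scores for one urban block (invented numbers).
scores = [0.9, 0.4, 0.5, 0.45, 0.1]
# Majority-leaning weights: mass on the middle ranks, none on the extremes,
# so poorly supported outlier opinions are effectively neglected.
majority_weights = [0.0, 0.3, 0.4, 0.3, 0.0]
```

    With these weights, owa(scores, majority_weights) gives 0.45 versus a plain mean of 0.47: the unsupported extreme score 0.9 is discounted, matching the "avoid extreme strategies" behaviour described above.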

  9. Real-time diagnostics for a reusable rocket engine

    NASA Technical Reports Server (NTRS)

    Guo, T. H.; Merrill, W.; Duyar, A.

    1992-01-01

    A hierarchical, decentralized diagnostic system is proposed for the Real-Time Diagnostic System component of the Intelligent Control System (ICS) for reusable rocket engines. The proposed diagnostic system has three layers of information processing: condition monitoring, fault mode detection, and expert system diagnostics. The condition monitoring layer is the first level of signal processing; here, important features of the sensor data are extracted. These processed data are then used by the higher-level fault mode detection layer to perform preliminary diagnosis of potential faults at the component level. Because of the closely coupled nature of the rocket engine propulsion system components, a given engine condition may be expected to trigger more than one fault mode detector. Expert knowledge is needed to resolve the conflicting reports from the various failure mode detectors; this is the function of the diagnostic expert layer. The heuristic nature of this decision process makes an expert system approach desirable. Implementation of the real-time diagnostic system described above requires a wide spectrum of information processing capability. In the condition monitoring layer, fast data processing is often needed for feature extraction and signal conditioning, usually followed by detection logic to determine the selected faults at the component level. Three different techniques are used to attack different fault detection problems in the NASA LeRC ICS testbed simulation. The first is a neural-network application for real-time sensor validation, which includes failure detection, isolation, and accommodation. The second is a model-based fault diagnosis system using on-line parameter identification. Beyond these model-based diagnostic schemes, many failure modes still need to be diagnosed using heuristic expert knowledge.
The heuristic expert knowledge is implemented using a real-time expert system tool called G2 by Gensym Corp. Finally, the distributed diagnostic system requires another level of intelligence to oversee the fault mode reports generated by the component fault detectors. The decision making at this level is best done using a rule-based expert system; this level of expert knowledge is also implemented using G2.
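
    The three-layer flow described above can be sketched as a simple pipeline; every name, threshold, and rule below is illustrative and is not the logic of the NASA LeRC ICS testbed.

```python
def condition_monitor(samples):
    # Layer 1, condition monitoring: extract features from a sensor window
    # (here just the mean level and the peak-to-peak swing).
    mean = sum(samples) / len(samples)
    swing = max(samples) - min(samples)
    return {"mean": mean, "swing": swing}

def component_detectors(features):
    # Layer 2, fault mode detection: simple threshold logic per detector;
    # a single condition can trigger more than one report.
    reports = []
    if features["mean"] > 10.0:
        reports.append("overpressure")
    if features["swing"] > 4.0:
        reports.append("sensor_instability")
    return reports

def expert_arbitration(reports):
    # Layer 3, rule-based expert layer: resolve conflicting reports.
    if "overpressure" in reports and "sensor_instability" in reports:
        return "sensor fault suspected"
    if "overpressure" in reports:
        return "overpressure confirmed"
    return "nominal"
```

    Fast feature extraction stays in the first layer while only arbitration reaches the heuristic layer, mirroring the spectrum of processing speeds described above.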

  10. Development of computer informational system of diagnostics integrated optical materials, elements, and devices

    NASA Astrophysics Data System (ADS)

    Volosovitch, Anatoly E.; Konopaltseva, Lyudmila I.

    1995-11-01

    Well-known methods of optical diagnostics, databases for their storage, and expert systems (ES) for their development are analyzed. A computer informational system is developed, based on a hybrid ES built on a modern DBMS. As examples, the structural and constructive circuits of hybrid integrated-optical devices based on laser diodes, diffusion waveguides, geodetic lenses, package-free linear photodiode arrays, etc., are presented. The features of the methods and test results, as well as promising directions of work on hybrid integrated-optical devices in the field of metrology, are discussed.

  11. Desiderata for product labeling of medical expert systems.

    PubMed

    Geissbühler, A; Miller, R A

    1997-12-01

    The proliferation and increasing complexity of medical expert systems raise ethical and legal concerns about the ability of practitioners to protect their patients from defective or misused software. Appropriate product labeling of expert systems can help clinical users to understand software indications and limitations. Mechanisms of action and knowledge representation schema should be explained in layperson's terminology. User qualifications and resources available for acquiring the skills necessary to understand and critique the system output should be listed. The processes used for building and maintaining the system's knowledge base are key determinants of the product's quality, and should be carefully documented. To meet these desiderata, a printed label is insufficient. The authors suggest a new, more active, model of product labeling for medical expert systems that involves embedding 'knowledge of the knowledge base', creating user-specific data, and sharing global information using the Internet.

  12. An inventory on rotational kinematics of a particle: unravelling misconceptions and pitfalls in reasoning

    NASA Astrophysics Data System (ADS)

    Mashood, K. K.; Singh, Vijay A.

    2012-09-01

    Student difficulties regarding the angular velocity (ω) and angular acceleration (α) of a particle have remained relatively unexplored in contrast to their linear counterparts. We present an inventory comprising multiple choice questions aimed at probing misconceptions and eliciting ill-suited reasoning patterns. The development of the inventory was based on interactions with students, teachers and experts. We report misconceptions, some of which are parallel to those found earlier in linear kinematics. Fixations with inappropriate prototypes were uncovered. Many students and even teachers mistakenly assume that all rotational motion is necessarily circular. A persistent notion that the direction of ω and α should be ‘along’ the motion exists. Instances of indiscriminate usage of equations were identified.

  13. Sugeno-Fuzzy Expert System Modeling for Quality Prediction of Non-Contact Machining Process

    NASA Astrophysics Data System (ADS)

    Sivaraos; Khalim, A. Z.; Salleh, M. S.; Sivakumar, D.; Kadirgama, K.

    2018-03-01

    Modeling can be categorised into four main domains: prediction, optimisation, estimation and calibration. In this paper, the Takagi-Sugeno-Kang (TSK) fuzzy logic method is examined as a prediction modelling method for investigating the taper quality of laser lathing, which seeks to replace traditional lathe machines with 3D laser lathing in order to achieve the desired cylindrical shape of stock materials. Three design parameters were selected: feed rate, cutting speed and depth of cut. A total of twenty-four experiments were conducted, with eight sequential runs replicated three times. The TSK fuzzy predictive model achieved an accuracy rate of 99%, which suggests that it is a suitable and practical method for the non-linear laser lathing process.
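
    A first-order TSK model of the kind examined above can be sketched as Gaussian memberships gating linear consequents. The rule parameters below are invented for a single input, not the paper's fitted three-parameter model (feed rate, cutting speed, depth of cut).

```python
import math

# Two illustrative first-order TSK rules over one input (feed rate):
# (membership mean, membership sigma, consequent slope, consequent intercept)
rules = [
    (100.0, 30.0, 0.02, 1.0),   # IF feed is LOW  THEN taper = 0.02*feed + 1.0
    (200.0, 30.0, -0.01, 5.0),  # IF feed is HIGH THEN taper = -0.01*feed + 5.0
]

def tsk_predict(x):
    # Firing strengths from Gaussian membership functions.
    w = [math.exp(-((x - m) ** 2) / (2.0 * s ** 2)) for (m, s, _, _) in rules]
    # Defuzzification: firing-strength-weighted average of linear consequents.
    num = sum(wi * (a * x + b) for wi, (_, _, a, b) in zip(w, rules))
    return num / sum(w)
```

    At feed = 150 both rules fire equally and the output is 3.75, the midpoint of the two consequents; away from the overlap region one rule dominates.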

  14. Inside the black box: starting to uncover the underlying decision rules used in one-by-one expert assessment of occupational exposure in case-control studies

    PubMed Central

    Wheeler, David C.; Burstyn, Igor; Vermeulen, Roel; Yu, Kai; Shortreed, Susan M.; Pronk, Anjoeka; Stewart, Patricia A.; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Schwenn, Molly; Johnson, Alison; Silverman, Debra T.; Friesen, Melissa C.

    2014-01-01

    Objectives Evaluating occupational exposures in population-based case-control studies often requires exposure assessors to review each study participant's reported occupational information job-by-job to derive exposure estimates. Although such assessments likely follow underlying decision rules, they usually lack transparency, are time-consuming, and have uncertain reliability and validity. We aimed to identify the underlying rules to enable documentation, review, and future use of these expert-based exposure decisions. Methods Classification and regression trees (CART; predictions from a single tree) and random forests (predictions from many trees) were used to identify the underlying rules from the questionnaire responses and an expert's exposure assignments for occupational diesel exhaust exposure for several metrics: binary exposure probability and ordinal exposure probability, intensity, and frequency. Data were split into training (n=10,488 jobs), testing (n=2,247), and validation (n=2,248) sets. Results The CART and random forest models' predictions agreed with 92–94% of the expert's binary probability assignments. For the ordinal probability, intensity, and frequency metrics, the two models extracted decision rules more successfully for unexposed and highly exposed jobs (86–90% and 57–85%, respectively) than for low or medium exposed jobs (7–71%). Conclusions CART and random forest models extracted decision rules, accurately predicted an expert's exposure decisions for the majority of jobs, and identified questionnaire response patterns that would require further expert review if the rules were applied to other jobs in the same or a different study. This approach makes the exposure assessment process in case-control studies more transparent and creates a mechanism to efficiently replicate exposure decisions in future studies. PMID:23155187
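
    The core of the CART step is a split search: find the questionnaire item that best separates the expert's assignments. The miniature version below (pure Python, invented jobs and labels) shows one such search using Gini impurity; a real CART recurses on the resulting subsets.

```python
def gini(labels):
    # Gini impurity of a set of 0/1 exposure labels.
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(jobs, labels, questions):
    # jobs: dicts of yes/no questionnaire responses; labels: expert's 0/1 calls.
    # Returns the (question, weighted impurity) pair with the purest split.
    best = None
    for q in questions:
        yes = [l for j, l in zip(jobs, labels) if j[q]]
        no = [l for j, l in zip(jobs, labels) if not j[q]]
        score = (len(yes) * gini(yes) + len(no) * gini(no)) / len(labels)
        if best is None or score < best[1]:
            best = (q, score)
    return best

# Invented data: four jobs and the expert's binary diesel-exposure calls.
jobs = [
    {"drives_truck": True,  "office_work": False},
    {"drives_truck": True,  "office_work": False},
    {"drives_truck": False, "office_work": True},
    {"drives_truck": False, "office_work": False},
]
labels = [1, 1, 0, 0]
```

    On this toy data the search recovers "drives_truck" as a perfect split (weighted Gini 0), i.e. a candidate decision rule "IF the job involves driving a truck THEN exposed".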

  15. Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Seah, Chin

    2009-01-01

    During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide a historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example of how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices: what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so that "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.

  16. Prioritizing public-private partnership models for public hospitals of Iran based on performance indicators.

    PubMed

    Gholamzadeh Nikjoo, Raana; Jabbari Beyrami, Hossein; Jannati, Ali; Asghari Jaafarabadi, Mohammad

    2012-01-01

    The present study was conducted to scrutinize Public-Private Partnership (PPP) models in public hospitals of different countries based on performance indicators, in order to select appropriate models for Iran's hospitals. In this mixed (quantitative-qualitative) study, a systematic review and an expert panel were conducted to identify the various models of PPP as well as performance indicators. In the second step we prioritized the performance indicators and PPP models based on the selected performance indicators using the Analytical Hierarchy Process (AHP) technique. The data were analyzed with Excel 2007 and Expert Choice 11 software. In the quality-effectiveness area, indicators such as the rate of hospital infections (100%), hospital accident prevalence rate (73%), pure rate of hospital mortality (63%), and patient satisfaction percentage (53%); in the accessibility-equity area, indicators such as average inpatient waiting time (100%) and average outpatient waiting time (74%); and in the financial-efficiency area, indicators including average length of stay (100%), bed occupation ratio (99%), and specific income to total cost ratio (97%) were chosen as the key performance indicators. In the prioritization of the PPP models, the clinical outsourcing, management, privatization, BOO (build, own, operate) and non-clinical outsourcing models achieved high priority for the various performance-indicator areas. This study provides the most common PPP options in the field of public hospitals and gathers suitable evidence from experts for choosing the appropriate PPP option for public hospitals. The effect of private-sector presence on public hospital performance will differ depending on which PPP options are undertaken.
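
    The AHP step described above can be sketched with the common geometric-mean approximation of the priority vector; the pairwise comparison matrix below is invented for illustration and does not represent the study's expert judgments.

```python
import math

# Hypothetical pairwise comparisons between three performance-indicator
# areas (entry [i][j] = how much more important area i is than area j).
comparison = [
    # quality  access  finance
    [1.0,      3.0,    2.0],    # quality-effectiveness
    [1 / 3.0,  1.0,    0.5],    # accessibility-equity
    [0.5,      2.0,    1.0],    # financial-efficiency
]

def ahp_weights(matrix):
    # Geometric mean of each row, normalized to sum to one: the standard
    # approximation of AHP's principal-eigenvector priority weights.
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]
```

    Under these invented judgments, ahp_weights(comparison) returns roughly [0.54, 0.16, 0.30], i.e. the quality-effectiveness area dominates the ranking.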

  17. Prioritizing Public-Private Partnership Models for Public Hospitals of Iran Based on Performance Indicators

    PubMed Central

    Gholamzadeh Nikjoo, Raana; Jabbari Beyrami, Hossein; Jannati, Ali; Asghari Jaafarabadi, Mohammad

    2012-01-01

    Background: The present study was conducted to scrutinize Public-Private Partnership (PPP) models in public hospitals of different countries based on performance indicators, in order to select appropriate models for Iran's hospitals. Methods: In this mixed (quantitative-qualitative) study, a systematic review and an expert panel were conducted to identify the various models of PPP as well as performance indicators. In the second step we prioritized the performance indicators and PPP models based on the selected performance indicators using the Analytical Hierarchy Process (AHP) technique. The data were analyzed with Excel 2007 and Expert Choice 11 software. Results: In the quality-effectiveness area, indicators such as the rate of hospital infections (100%), hospital accident prevalence rate (73%), pure rate of hospital mortality (63%), and patient satisfaction percentage (53%); in the accessibility-equity area, indicators such as average inpatient waiting time (100%) and average outpatient waiting time (74%); and in the financial-efficiency area, indicators including average length of stay (100%), bed occupation ratio (99%), and specific income to total cost ratio (97%) were chosen as the key performance indicators. In the prioritization of the PPP models, the clinical outsourcing, management, privatization, BOO (build, own, operate) and non-clinical outsourcing models achieved high priority for the various performance-indicator areas. Conclusion: This study provides the most common PPP options in the field of public hospitals and gathers suitable evidence from experts for choosing the appropriate PPP option for public hospitals. The effect of private-sector presence on public hospital performance will differ depending on which PPP options are undertaken. PMID:24688942

  18. The Implementation of Blended Learning Using Android-Based Tutorial Video in Computer Programming Course II

    NASA Astrophysics Data System (ADS)

    Huda, C.; Hudha, M. N.; Ain, N.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.

    2018-01-01

    The computer programming course is theoretical, and sufficient practice is necessary to facilitate conceptual understanding and encourage creativity in designing computer programs/animations. The development of a tutorial video for Android-based blended learning is needed to guide students. Using Android-based instructional material, students can learn independently anywhere and anytime. The tutorial video can facilitate students' understanding of the concepts, materials, and procedures of programming/animation making in detail. This study employed a Research and Development method adapting Thiagarajan's 4D model. The developed Android-based instructional material and tutorial video were validated by experts in instructional media and experts in physics education. The expert validation results showed that the Android-based material was comprehensive and very feasible. The tutorial video was deemed feasible, receiving an average score of 92.9%. It was also revealed that students' conceptual understanding, skills, and creativity in designing computer programs/animations improved significantly.

  19. A phenomenological biological dose model for proton therapy based on linear energy transfer spectra.

    PubMed

    Rørvik, Eivind; Thörnqvist, Sara; Stokkevåg, Camilla H; Dahle, Tordis J; Fjaera, Lars Fredrik; Ytre-Hauge, Kristian S

    2017-06-01

    The relative biological effectiveness (RBE) of protons varies with the radiation quality, quantified by the linear energy transfer (LET). Most phenomenological models employ a linear dependency on the dose-averaged LET (LETd) to calculate the biological dose. However, several experiments have indicated a possible non-linear trend. Our aim was to investigate whether biological dose models including non-linear LET dependencies should be considered, by introducing a LET-spectrum-based dose model. The RBE-LET relationship was investigated by fitting polynomials of 1st to 5th degree to a database of 85 data points from aerobic in vitro experiments. We included both unweighted and weighted regression, the latter taking into account experimental uncertainties. Statistical testing was performed to decide whether higher-degree polynomials provided better fits to the data than lower degrees. The newly developed models were compared to three published LETd-based models for a simulated spread-out Bragg peak (SOBP) scenario. The weighted regression analysis favored a non-linear RBE-LET relationship, with a quartic polynomial found to best represent the experimental data (P = 0.010). The results of the unweighted regression analysis were on the borderline of statistical significance for non-linear functions (P = 0.053), and with the current database a linear dependency could not be rejected. For the SOBP scenario, the weighted non-linear model estimated a mean RBE value (1.14) similar to the three established models (1.13-1.17); the unweighted model calculated a considerably higher RBE value (1.22). The analysis indicated that non-linear models could give a better representation of the RBE-LET relationship. However, this is not decisive, as inclusion of the experimental uncertainties in the regression analysis had a significant impact on the determination and ranking of the models.
As differences between the models were observed for the SOBP scenario, both non-linear LET-spectrum-based and linear LETd-based models should be further evaluated in clinically realistic scenarios. © 2017 American Association of Physicists in Medicine.
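
    The weighted polynomial regression underlying the model ranking can be sketched by solving the weighted normal equations directly. The helper below is a generic pure-Python sketch with invented data points, not the paper's fitting code or its 85-point database.

```python
def polyfit_weighted(xs, ys, ws, degree):
    # Weighted least-squares polynomial fit via the normal equations
    # A c = b, where the weights ws would encode experimental uncertainties
    # (e.g. inverse variances of the RBE measurements).
    n = degree + 1
    A = [[sum(w * x ** (i + j) for x, w in zip(xs, ws)) for j in range(n)]
         for i in range(n)]
    b = [sum(w * y * x ** i for x, y, w in zip(xs, ys, ws)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * coeffs[c] for c in range(r + 1, n))
        coeffs[r] = (b[r] - s) / A[r][r]
    return coeffs  # ascending order: c0 + c1*x + c2*x**2 + ...
```

    Passing the uncertainties through ws changes which coefficients the fit favors, which is exactly why the weighted and unweighted analyses above can rank the polynomial degrees differently.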

  20. Plant Distribution Data Show Broader Climatic Limits than Expert-Based Climatic Tolerance Estimates

    PubMed Central

    Curtis, Caroline A.; Bradley, Bethany A.

    2016-01-01

    Background Although increasingly sophisticated environmental measures are being applied to species distribution models, the focus remains on using climatic data to provide estimates of habitat suitability. Climatic tolerance estimates based on expert knowledge are available for a wide range of plants via the USDA PLANTS database. We aim to test how climatic tolerance inferred from plant distribution records relates to tolerance estimated by experts. Further, we use this information to identify circumstances in which species distributions are more likely to approximate climatic tolerance. Methods We compiled expert-knowledge estimates of minimum and maximum precipitation and minimum temperature tolerance for over 1800 conservation plant species from the ‘plant characteristics’ information in the USDA PLANTS database. We derived climatic tolerance from distribution data downloaded from the Global Biodiversity Information Facility (GBIF) and corresponding climate data from WorldClim. We compared expert-derived climatic tolerance to the empirical estimates to find the difference between the inferred climate niches (ΔCN), and tested whether ΔCN was influenced by growth form or range size. Results Climate niches calculated from distribution data were significantly broader than expert-based tolerance estimates (Mann-Whitney p-values << 0.001). The average plant could tolerate 24 mm lower minimum precipitation, 14 mm higher maximum precipitation, and 7 °C lower minimum temperatures based on distribution data relative to expert-based tolerance estimates. Species with larger ranges had greater ΔCN for minimum precipitation and minimum temperature. For maximum precipitation and minimum temperature, forbs and grasses tended to have larger ΔCN, while grasses and trees had larger ΔCN for minimum precipitation.
Conclusion Our results show that climatic limits inferred from distribution data are consistently broader than USDA PLANTS experts' estimates and likely provide more robust estimates of climatic tolerance, especially for widespread forbs and grasses. These findings suggest that widely available expert-based climatic tolerance estimates underrepresent species' fundamental niche and likely fail to capture even the realized niche. PMID:27870859
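
    The niche comparison reduces to subtracting an expert-stated limit from the empirical limit of the occurrence records. A minimal sketch for minimum temperature, with all numbers invented:

```python
# Coldest-month temperatures (degrees C) at five invented occurrence
# records, and an invented USDA-PLANTS-style expert minimum-temperature
# tolerance for the same species.
occurrence_min_temp_c = [-12.0, -3.5, -18.0, -7.2, -15.1]
expert_min_temp_c = -11.0

def delta_cn(occurrences, expert_limit):
    # Difference between the inferred climate niches: positive when the
    # distribution data imply a broader (colder) limit than the expert's.
    return expert_limit - min(occurrences)
```

    Here the expert limit of -11 °C against a coldest record of -18 °C gives ΔCN = 7 °C, in the same direction as the result reported above (distribution data broader than expert estimates).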

  1. Estimating community health needs against a Triple Aim background: What can we learn from current predictive risk models?

    PubMed

    Elissen, Arianne M J; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk

    2015-05-01

    To support providers and commissioners in accurately assessing their local populations' health needs, this study produces an overview of Dutch predictive risk models for health care, focusing specifically on the type, combination and relevance of included determinants for achieving the Triple Aim (improved health, better care experience, and lower costs). We conducted a mixed-methods study combining document analyses, interviews and a Delphi study. Predictive risk models were identified based on a web search and expert input. Participating in the study were Dutch experts in predictive risk modelling (interviews; n=11) and experts in healthcare delivery, insurance and/or funding methodology (Delphi panel; n=15). Ten predictive risk models were analysed, comprising 17 unique determinants. Twelve were considered relevant by experts for estimating community health needs. Although some compositional similarities were identified between models, the combination and operationalisation of determinants varied considerably. Existing predictive risk models provide a good starting point, but optimally balancing resources and targeting interventions on the community level will likely require a more holistic approach to health needs assessment. Development of additional determinants, such as measures of people's lifestyle and social network, may require policies pushing the integration of routine data from different (healthcare) sources. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. "It's Harder Than We Thought It Would Be": A Comparative Case Study of Expert-Novice Experimentation Strategies.

    ERIC Educational Resources Information Center

    Hmelo-Silver, Cindy E.; Nagarajan, Anandi; Day, Roger S.

    2002-01-01

    Compares a group of expert cancer researchers with four groups of fourth year medical students (the "novice" groups) engaged in the task of designing a clinical trial to test a new cancer drug using a computer-based modeling tool, the Oncology Thinking Cap. (Contains 24 references.) (Author/YDS)

  3. Investigating Learning Space for Research Workspaces in Higher Education in Malaysia

    ERIC Educational Resources Information Center

    Yusof, Norhafezah; Hashim, Rosna Awang; Kian, Chan Kok

    2016-01-01

    Purpose: The purpose of this paper is to investigate learning space for research workspaces in Higher Education Institutions (HEIs) in Malaysia based on the evaluations by experts and university research workers on a practical model for creating an effective research learning space. It examines expert analyses of the notion of a suitable research…

  4. Development and Evaluation of an Adaptive Computerized Training System (ACTS). R&D Report 78-1.

    ERIC Educational Resources Information Center

    Knerr, Bruce W.; Nawrocki, Leon H.

    This report describes the development of a computer based system designed to train electronic troubleshooting procedures. The ACTS uses artificial intelligence techniques to develop models of student and expert troubleshooting behavior as they solve a series of troubleshooting problems on the system. Comparisons of the student and expert models…

  5. Quality control of 3D Geological Models using an Attention Model based on Gaze

    NASA Astrophysics Data System (ADS)

    Busschers, Freek S.; van Maanen, Peter-Paul; Brouwer, Anne-Marie

    2014-05-01

    The Geological Survey of the Netherlands (GSN) produces 3D stochastic geological models of the upper 50 meters of the Dutch subsurface. The voxel models are regarded as essential for answering subsurface questions on, for example, aggregate resources, groundwater flow, land subsidence, and the planning of large-scale infrastructural works such as tunnels. GeoTOP is the most recent and detailed generation of 3D voxel models; it describes 3D lithological variability up to a depth of 50 m using voxels of 100*100*0.5 m. Due to the expected increase in data flow, model output and user demands, the development of (semi-)automated quality-control systems is becoming more important. Besides numerical control systems, capturing model errors as seen from the expert geologist's viewpoint is of increasing interest. We envision the use of eye gaze to support and speed up the detection of errors in the geological voxel models. As a first step in this direction we explored the gaze behavior of 12 geological experts from the GSN during quality control of part of the GeoTOP 3D geological model using an eye-tracker. Gaze is used as input to an attention model that yields 'attended areas' for each examined image of the GeoTOP model and each individual expert. We compared these attended areas to errors as marked by the experts using a mouse. Results show that: 1) attended areas as determined from experts' gaze data largely match GeoTOP errors as indicated by the experts using a mouse, and 2) a substantial part of the match can be reached using only gaze data from the first few seconds of the time geologists spend searching for errors. These results open up the possibility of faster GeoTOP model control using gaze, if geologists accept a small decrease in error detection accuracy. Attention data may also be used to make independent comparisons between geologists varying in focus and expertise.
This would facilitate more effective use of experts in specific projects or areas. Part of such a procedure could be to confront geological experts with their own results, allowing training steps to improve their geological expertise and eventually the GeoTOP model itself. Beyond the directions indicated above, future research should focus on concrete implementations that facilitate and optimize error detection in present and future 3D voxel models, which are commonly characterized by very large amounts of data.
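
    Comparing gaze-derived attended areas with mouse-marked errors can be sketched as a hit rate over a flattened grid of model cells; the function and the boolean grids below are illustrative, not the study's attention model.

```python
def hit_rate(attended, marked):
    # Fraction of expert-marked error cells that fall inside attended areas
    # (both given as flat boolean grids over the same model cells).
    hits = sum(1 for a, m in zip(attended, marked) if a and m)
    total = sum(1 for m in marked if m)
    return hits / total if total else 0.0

# Invented 6-cell example: gaze attended cells 0, 1, 3; the expert marked
# errors in cells 0, 3, 4.
attended = [True, True, False, True, False, False]
marked = [True, False, False, True, True, False]
```

    Recomputing the attended grid from only the first seconds of gaze would quantify the report's second finding, that early gaze already captures much of the match.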

  6. Which Dimensions of Patient-Centeredness Matter? - Results of a Web-Based Expert Delphi Survey.

    PubMed

    Zill, Jördis M; Scholl, Isabelle; Härter, Martin; Dirmaier, Jörg

    2015-01-01

    Present models and definitions of patient-centeredness revealed a lack of conceptual clarity. Based on a prior systematic literature review, we developed an integrative model with 15 dimensions of patient-centeredness. The aims of this study were to 1) validate, and 2) prioritize these dimensions. A two-round web-based Delphi study was conducted, and 297 international experts were invited to participate. In round one they were asked to 1) rate each dimension on a nine-point scale for relevance and clarity, 2) add missing dimensions, and 3) prioritize the dimensions. In round two, experts received feedback about the results of round one and were asked to reflect on and re-rate their own results. A dimension failed validation if it received a median < 7 on either criterion. 105 experts participated in round one and 71 in round two. In round one, one new dimension was suggested and included for discussion in round two; in round two, this dimension did not reach sufficient ratings to be included in the model. Eleven dimensions reached a median ≥ 7 on both criteria (relevance and clarity); four dimensions had a median < 7 on one or both criteria. The five dimensions rated as most important were: patient as a unique person, patient involvement in care, patient information, clinician-patient communication, and patient empowerment. Eleven of the 15 dimensions were thus validated through experts' ratings. Further research on the four dimensions that received insufficient ratings is recommended. The priority order of the dimensions can help researchers and clinicians to focus on the most important dimensions of patient-centeredness. Overall, the model provides a useful framework that can be used in the development of measures, interventions, and medical education curricula, as well as the adoption of a new perspective in health policy.
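
    The round-two cut-off can be sketched directly; the helper and ratings below are illustrative, assuming only the stated criterion that a dimension must reach a median of at least 7 on both relevance and clarity.

```python
import statistics

def validated(relevance_ratings, clarity_ratings, cutoff=7):
    # A dimension is retained only if its median rating meets the cutoff
    # on BOTH criteria; a median below it on either criterion rejects it.
    return (statistics.median(relevance_ratings) >= cutoff
            and statistics.median(clarity_ratings) >= cutoff)
```

    For example, nine-point ratings with medians of 8 (relevance) and 7 (clarity) pass, while a clarity median of 6 rejects the dimension.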

  7. Leveraging Strengths Assessment and Intervention Model (LeStAIM): A Theoretical Strength-Based Assessment Framework

    ERIC Educational Resources Information Center

    Laija-Rodriguez, Wilda; Grites, Karen; Bouman, Doug; Pohlman, Craig; Goldman, Richard L.

    2013-01-01

    Current assessments in the schools are based on a deficit model (Epstein, 1998). "The National Association of School Psychologists (NASP) Model for Comprehensive and Integrated School Psychological Services" (2010), federal initiatives and mandates, and experts in the field of assessment have highlighted the need for the comprehensive…

  8. Expert knowledge elicitation using computer simulation: the organization of frail elderly case management as an illustration.

    PubMed

    Chiêm, Jean-Christophe; Van Durme, Thérèse; Vandendorpe, Florence; Schmitz, Olivier; Speybroeck, Niko; Cès, Sophie; Macq, Jean

    2014-08-01

    Various elderly case management projects have been implemented in Belgium. This type of long-term health care intervention involves contextual factors and human interactions. These underlying complex mechanisms can usefully be informed by field experts' knowledge, which is hard to make explicit. However, computer simulation has been suggested as one possible method of overcoming the difficulty of articulating such elicited qualitative views. A simulation model of case management was designed using an agent-based methodology, based on the initial qualitative research material. Variables and rules of interaction were formulated into a simple conceptual framework. This model was implemented and used as a support for a structured discussion with experts in case management. The rigorous formulation provided by the agent-based methodology clarified the descriptions of the interventions and the problems encountered regarding: the diverse network topologies of health care actors in the project; the adaptation time required by the intervention; the communication between the health care actors; the institutional context; the organization of care; and the role of the case manager and his or her personal ability to interpret the informal demands of the frail older person. The simulation model should be seen primarily as a tool for thinking and learning. A number of insights were gained as part of a valuable cognitive process. Computer simulation supporting the elicitation of field experts' knowledge can lead to better-informed decisions in the organization of complex health care interventions. © 2013 John Wiley & Sons, Ltd.

  9. The influence of expertise on brain activation of the action observation network during anticipation of tennis and volleyball serves.

    PubMed

    Balser, Nils; Lorey, Britta; Pilgramm, Sebastian; Naumann, Tim; Kindermann, Stefan; Stark, Rudolf; Zentgraf, Karen; Williams, A Mark; Munzert, Jörn

    2014-01-01

    In many daily activities, and especially in sport, it is necessary to predict the effects of others' actions in order to initiate appropriate responses. Recently, researchers have suggested that the action-observation network (AON), including the cerebellum, plays an essential role during such anticipation, particularly in expert sport performers. In the present study, we examined the influence of task-specific expertise on the AON by investigating differences between two expert groups trained in different sports while anticipating action effects. Altogether, 15 tennis and 16 volleyball experts anticipated the direction of observed tennis and volleyball serves while undergoing functional magnetic resonance imaging (fMRI). The expert group in each sport acted as novice controls in the other sport, with which they had only limited experience. When contrasting anticipation in both expertise conditions with the corresponding untrained sport, stronger activation of AON areas (SPL, SMA), and particularly of cerebellar structures, was observed. Furthermore, neural activation within the cerebellum and the SPL was linearly correlated with participants' anticipation performance, irrespective of the specific expertise. For the SPL, this relationship also held when an expert performed a domain-specific anticipation task. Notably, the stronger activation of the cerebellum as well as of the SMA and the SPL in the expertise conditions suggests that experts rely on more fine-tuned perceptual-motor representations, refined over years of training, when anticipating the effects of others' actions in their preferred sport. The association of activation within the SPL and the cerebellum with task achievement suggests that these areas are the predominant brain sites involved in fast motor predictions.
The SPL reflects the processing of domain-specific contextual information and the cerebellum the usage of a predictive internal model to solve the anticipation task.

  10. Group prioritisation with unknown expert weights in incomplete linguistic context

    NASA Astrophysics Data System (ADS)

    Cheng, Dong; Cheng, Faxin; Zhou, Zhili; Wang, Juan

    2017-09-01

    In this paper, we study a group prioritisation problem in situations when the expert weights are completely unknown and their judgement preferences are linguistic and incomplete. Starting from the theory of relative entropy (RE) and multiplicative consistency, an optimisation model is provided for deriving an individual priority vector without estimating the missing value(s) of an incomplete linguistic preference relation. In order to address the unknown expert weights in the group aggregating process, we define two new kinds of expert weight indicators based on RE: proximity entropy weight and similarity entropy weight. Furthermore, a dynamic-adjusting algorithm (DAA) is proposed to obtain an objective expert weight vector and capture the dynamic properties involved in it. Unlike the extant literature of group prioritisation, the proposed RE approach does not require pre-allocation of expert weights and can solve incomplete preference relations. An interesting finding is that once all the experts express their preference relations, the final expert weight vector derived from the DAA is fixed irrespective of the initial settings of expert weights. Finally, an application example is conducted to validate the effectiveness and robustness of the RE approach.
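
    The idea of deriving expert weights from relative entropy can be sketched as follows. The paper's proximity and similarity entropy indicators are more elaborate; this toy example only illustrates weighting an expert down when their priority vector diverges (in relative-entropy terms) from the group view:

```python
import numpy as np

# Toy priority vectors elicited from three experts over four alternatives
# (illustrative values, not from the paper).
priorities = np.array([
    [0.40, 0.30, 0.20, 0.10],
    [0.35, 0.30, 0.25, 0.10],
    [0.10, 0.20, 0.30, 0.40],   # an outlying expert
])

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) for strictly positive vectors."""
    return float(np.sum(p * np.log(p / q)))

group_mean = priorities.mean(axis=0)
group_mean /= group_mean.sum()

# Experts far from the consensus (large divergence) receive smaller weight.
divergences = np.array([relative_entropy(p, group_mean) for p in priorities])
weights = 1.0 / (1.0 + divergences)
weights /= weights.sum()
```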

  11. A multiprofessional information model for Brazilian primary care: Defining a consensus model towards an interoperable electronic health record.

    PubMed

    Braga, Renata Dutra

    2016-06-01

    To develop a multiprofessional information model to be used in the decision-making process in primary care in Brazil. This was an observational study with a descriptive and exploratory approach, using action research associated with the Delphi method. A group of 13 health professionals made up a panel of experts that, through individual and group meetings, drew up a preliminary health information records model. The questionnaire used to validate this model included four questions based on a Likert scale. These questions evaluated the completeness and relevance of information on each of the four pillars that composed the model. The changes suggested in each round of evaluation were included when accepted by the majority (≥ 50%). This process was repeated as many times as necessary to obtain the desirable and recommended consensus level (> 50%), and the final version became the consensus model. Multidisciplinary health training of the panel of experts allowed a consensus model to be obtained based on four categories of health information, called pillars: Data Collection, Diagnosis, Care Plan and Evaluation. The obtained consensus model was considered valid by the experts and can contribute to the collection and recording of multidisciplinary information in primary care, as well as the identification of relevant concepts for defining electronic health records at this level of complexity in health care. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Quantitative assessment of the relationships among ecological, morphological and aesthetic values in a river rehabilitation initiative.

    PubMed

    McCormick, Ashlee; Fisher, Karen; Brierley, Gary

    2015-04-15

    Promoting community support in rehabilitation efforts through incorporation of aesthetic considerations is an important component of environmental management. This research utilised a small-scale survey methodology to explore relationships among the ecological and morphological goals of scientists and the aesthetic goals of the public using the Twin Streams Catchment, Auckland, New Zealand, as a case study. Analyses using a linear model and a generalised linear mixed model showed statistically significant relationships between perceived naturalness of landscapes and their aesthetic ratings, and among ratings of perceived naturalness and ecological integrity and morphological condition. Expert measures of health and the aesthetic evaluations of the public were well aligned, indicating public preferences for landscapes of high ecological integrity with good morphological condition. Further analysis revealed participants used 'cues to care' to rate naturalness. This suggests that environmental education endeavours could further align values with these cues in efforts to enhance approaches to landscape sustainability. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. TES: A modular systems approach to expert system development for real-time space applications

    NASA Technical Reports Server (NTRS)

    Cacace, Ralph; England, Brenda

    1988-01-01

    A major goal of the Space Station era is to reduce reliance on support from ground-based experts. The development of software programs using expert systems technology is one means of reaching this goal without requiring crew members to become intimately familiar with the many complex spacecraft subsystems. Development of an expert systems program requires validation of the software with actual flight hardware. By combining accurate hardware and software modelling techniques with a modular systems approach to expert systems development, the validation of these software programs can be completed with minimum risk and effort. The TIMES Expert System (TES) is an application that monitors and evaluates real-time data to perform fault detection and fault isolation tasks as they would otherwise be carried out by a knowledgeable designer. The development process and primary features of TES, the modular systems approach, and the lessons learned are discussed.

  14. Tip-tilt disturbance model identification based on non-linear least squares fitting for Linear Quadratic Gaussian control

    NASA Astrophysics Data System (ADS)

    Yang, Kangjian; Yang, Ping; Wang, Shuai; Dong, Lizhi; Xu, Bing

    2018-05-01

    We propose a method to identify a tip-tilt disturbance model for Linear Quadratic Gaussian control. The identification method, based on the Levenberg-Marquardt method, requires little prior information and no auxiliary system, and it is convenient for identifying the tip-tilt disturbance model on-line for real-time control. This makes it straightforward for Linear Quadratic Gaussian control to run efficiently in different adaptive optics systems for vibration mitigation. The validity of Linear Quadratic Gaussian control combined with this tip-tilt disturbance model identification method is verified with experimental data, replayed in simulation.
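
    The core of such an identification step is a non-linear least squares fit by the Levenberg-Marquardt method. A minimal sketch on synthetic data, assuming a single vibration peak (the paper's disturbance model is richer, and the data below are simulated, not the experimental traces):

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic tip-tilt trace: one vibration peak (amplitude, frequency,
# phase) plus measurement noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.1, 200)               # 0.1 s sampled at 2 kHz
a0, f0, phi0 = 0.8, 30.0, 0.4                # true amplitude, Hz, phase
y = a0 * np.sin(2*np.pi*f0*t + phi0) + 0.05 * rng.standard_normal(t.size)

def residuals(params):
    a, f, phi = params
    return a * np.sin(2*np.pi*f*t + phi) - y

# method='lm' selects Levenberg-Marquardt; a rough initial guess is the
# only prior information needed, and no auxiliary system is required.
fit = least_squares(residuals, x0=[1.0, 28.0, 0.0], method='lm')
a_hat, f_hat, phi_hat = fit.x
```

    The identified parameters can then feed the disturbance model of an LQG controller.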

  15. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  16. Portfolio optimization by using linear programing models based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Sukono; Hidayat, Y.; Lesmana, E.; Putra, A. S.; Napitupulu, H.; Supian, S.

    2018-01-01

    In this paper, we discuss investment portfolio optimization using a linear programming model based on a genetic algorithm. It is assumed that portfolio risk is measured by absolute standard deviation and that each investor has a risk tolerance for the investment portfolio. The optimization problem is formulated as a linear programming model, and the optimum solution is then determined using a genetic algorithm. As a numerical illustration, we analyze some of the stocks traded on the capital market in Indonesia. The analysis shows that portfolio optimization performed with the genetic algorithm approach produces a more efficient portfolio than optimization performed with a linear programming algorithm approach. Genetic algorithms can therefore be considered an alternative for determining the optimal investment portfolio, particularly with linear programming models.
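
    The genetic-algorithm step can be sketched in a few lines. This is a minimal illustration on simulated returns (the paper uses Indonesian market data), with mean absolute deviation standing in for the absolute-deviation risk measure and a risk-tolerance penalty in the fitness function:

```python
import numpy as np

# Toy daily returns for four stocks (rows = days); simulated, not market data.
rng = np.random.default_rng(1)
returns = rng.normal(loc=[0.001, 0.0008, 0.0012, 0.0005],
                     scale=0.01, size=(250, 4))

def fitness(w, risk_tolerance=1.0):
    """Mean portfolio return penalised by mean absolute deviation,
    scaled by the investor's risk tolerance."""
    port = returns @ w
    return port.mean() - risk_tolerance * np.mean(np.abs(port - port.mean()))

def normalise(pop):
    """Keep weights long-only and summing to one (the simplex)."""
    pop = np.clip(pop, 0.0, None)
    return pop / pop.sum(axis=1, keepdims=True)

# A minimal genetic algorithm: selection by fitness, arithmetic
# crossover, Gaussian mutation.
pop = normalise(rng.random((60, 4)))
for _ in range(200):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-30:]]          # keep the better half
    mates = parents[rng.integers(0, 30, size=30)]
    children = normalise(0.5 * (parents + mates)
                         + 0.01 * rng.standard_normal((30, 4)))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
```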

  17. Proceedings of the 1986 IEEE international conference on systems, man and cybernetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-01-01

    This book presents the papers given at a conference on man-machine systems. Topics considered at the conference included neural model-based cognitive theory and engineering, user interfaces, adaptive and learning systems, human interaction with robotics, decision making, the testing and evaluation of expert systems, software development, international conflict resolution, intelligent interfaces, automation in man-machine system design aiding, knowledge acquisition in expert systems, advanced architectures for artificial intelligence, pattern recognition, knowledge bases, and machine vision.

  18. FoSSI: the family of simplified solver interfaces for the rapid development of parallel numerical atmosphere and ocean models

    NASA Astrophysics Data System (ADS)

    Frickenhaus, Stephan; Hiller, Wolfgang; Best, Meike

    The portable software FoSSI is introduced that—in combination with additional free solver software packages—allows for an efficient and scalable parallel solution of large sparse linear equations systems arising in finite element model codes. FoSSI is intended to support rapid model code development, completely hiding the complexity of the underlying solver packages. In particular, the model developer need not be an expert in parallelization and is yet free to switch between different solver packages by simple modifications of the interface call. FoSSI offers an efficient and easy, yet flexible interface to several parallel solvers, most of them available on the web, such as PETSC, AZTEC, MUMPS, PILUT and HYPRE. FoSSI makes use of the concept of handles for vectors, matrices, preconditioners and solvers, that is frequently used in solver libraries. Hence, FoSSI allows for a flexible treatment of several linear equations systems and associated preconditioners at the same time, even in parallel on separate MPI-communicators. The second special feature in FoSSI is the task specifier, being a combination of keywords, each configuring a certain phase in the solver setup. This enables the user to control a solver through a single subroutine. Furthermore, FoSSI has rather similar features for all solvers, making a fast solver intercomparison or exchange an easy task. FoSSI is community software, proven in an adaptive 2D-atmosphere model and a 3D-primitive equation ocean model, both formulated in finite elements. The present paper discusses perspectives of an OpenMP-implementation of parallel iterative solvers based on domain decomposition methods. This approach to OpenMP solvers is rather attractive, as the code for domain-local operations of factorization, preconditioning and matrix-vector product can be readily taken from a sequential implementation that is also suitable for use in an MPI-variant.
Code development in this direction is in an advanced state under the name ScOPES: the Scalable Open Parallel sparse linear Equations Solver.

  19. Using expert opinion to evaluate a habitat effectiveness model for elk in western Oregon and Washington.

    Treesearch

    Richard S. Holthausen; Michael J. Wisdom; John Pierce; Daniel K. Edwards; Mary M. Rowland

    1994-01-01

    We used expert opinion to evaluate the predictive reliability of a habitat effectiveness model for elk in western Oregon and Washington. Twenty-five experts in elk ecology were asked to rate habitat quality for 16 example landscapes. Rankings and ratings of 21 experts were significantly correlated with model output. Expert opinion and model predictions differed for 4...

  20. How Do Novice and Expert Learners Represent, Understand, and Discuss Geologic Time?

    NASA Astrophysics Data System (ADS)

    Layow, Erica Amanda

    This dissertation examined the representations novice and expert learners constructed for the geologic timescale. Learners engaged in a three-part activity. The purpose was to compare novice learners' representations to those of expert learners. This provided insight into the similarities and differences between their strategies for event ordering, assigning values and scale to the geologic timescale model, as well as their language and practices in completing the model. Learner responses were analyzed qualitatively, informed by an expert-novice theoretical framework grounded in phenomenography. These data highlighted learners' metacognitive thoughts that might not otherwise be shared through lectures or laboratory activities. Learners' responses were analyzed using a discourse framework that positioned learners as knowers. Novice and expert learners both excelled at ordering and discussing events before the Phanerozoic, but were challenged by events during the Phanerozoic. Novice learners had difficulty assigning values to events and establishing a scale for their models. Expert learners expressed difficulty with determining a scale because of the size of the model, yet eventually used anchor points and unitized the model to establish a scale. Despite challenges in constructing their models, novice learners spoke confidently, using claims and few hedging phrases that signaled confidence in the statements made. Experts used more hedges than novices; however, the hedging comments were made about more complex conceptions. Using both phenomenographic and discourse analysis approaches foregrounded learners' discussions of how they perceived geologic time and their ways of knowing and doing.
This research is intended to enhance the geoscience community's understanding of the ways novice and expert learners think and discuss conceptions of geologic time, including the events and values of time, and the strategies used to determine accuracy of scale. This knowledge will provide a base from which to support geoscience curriculum development at the university level, specifically to design activities that will not only engage and express learners' metacognitive scientific practices, but to encourage their construction of scientific identities and membership in the geoscience community.

  1. Assessing the chances of success: naïve statistics versus kind experience.

    PubMed

    Hogarth, Robin M; Mukherjee, Kanchan; Soyer, Emre

    2013-01-01

    Additive integration of information is ubiquitous in judgment and has been shown to be effective even when multiplicative rules of probability theory are prescribed. We explore the generality of these findings in the context of estimating probabilities of success in contests. We first define a normative model of these probabilities that takes account of relative skill levels in contests where only a limited number of entrants can win. We then report 4 experiments using a scenario about a competition. Experiments 1 and 2 both elicited judgments of probabilities, and, although participants' responses demonstrated considerable variability, their mean judgments provide a good fit to a simple linear model. Experiment 3 explored choices. Most participants entered most contests and showed little awareness of appropriate probabilities. Experiment 4 investigated effects of providing aids to calculate probabilities, specifically, access to expert advice and 2 simulation tools. With these aids, estimates were accurate and decisions varied appropriately with economic consequences. We discuss implications by considering when additive decision rules are dysfunctional, the interpretation of overconfidence based on contest-entry behavior, and the use of aids to help people make better decisions.
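
    The normative benchmark for such contest probabilities can be approximated by simulation. The paper's model is analytic; this Monte Carlo sketch, with an assumed performance-equals-skill-plus-noise setup, only illustrates how relative skill and a limited number of prizes jointly determine the chance of success:

```python
import random

def win_probability(my_skill, rival_skills, n_prizes, trials=20000, seed=7):
    """Monte Carlo estimate of the chance of finishing in a prize slot.
    Each entrant's performance is skill plus symmetric Gaussian noise."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        mine = my_skill + rng.gauss(0, 1)
        better = sum(1 for s in rival_skills if s + rng.gauss(0, 1) > mine)
        wins += better < n_prizes
    return wins / trials

# Example: average skill among nine equally skilled rivals, three prizes.
# By symmetry the true probability is 3/10.
p = win_probability(0.0, [0.0] * 9, n_prizes=3)
```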

  2. High-level user interfaces for transfer function design with semantics.

    PubMed

    Salama, Christof Rezk; Keller, Maik; Kohlmann, Peter

    2006-01-01

    Many sophisticated techniques for the visualization of volumetric data such as medical data have been published. While existing techniques are mature from a technical point of view, managing the complexity of visual parameters is still difficult for non-expert users. To this end, this paper presents new ideas to facilitate the specification of optical properties for direct volume rendering. We introduce an additional level of abstraction for parametric models of transfer functions. The proposed framework allows visualization experts to design high-level transfer function models which can intuitively be used by non-expert users. The results are user interfaces which provide semantic information for specialized visualization problems. The proposed method is based on principal component analysis as well as on concepts borrowed from computer animation.

  3. A brief history and technical review of the expert system research

    NASA Astrophysics Data System (ADS)

    Tan, Haocheng

    2017-09-01

    The expert system is a computer system that emulates the decision-making ability of a human expert and aims to solve complex problems by reasoning over knowledge. It is an important branch of artificial intelligence. In this paper, we first briefly introduce the development and basic structure of the expert system. Then, from the perspective of the enabling technology, we classify current expert systems and elaborate on four types: the Rule-Based Expert System, the Framework-Based Expert System, the Fuzzy Logic-Based Expert System and the Neural Network-Based Expert System.

  4. PSG-EXPERT. An expert system for the diagnosis of sleep disorders.

    PubMed

    Fred, A; Filipe, J; Partinen, M; Paiva, T

    2000-01-01

    This paper describes PSG-EXPERT, an expert system in the domain of sleep disorders exploring polysomnographic data. The developed software tool is addressed from two points of view: (1)--as an integrated environment for the development of diagnosis-oriented expert systems; (2)--as an auxiliary diagnosis tool in the particular domain of sleep disorders. Developed over a Windows platform, this software tool extends one of the most popular shells--CLIPS (C Language Integrated Production System) with the following features: backward chaining engine; graph-based explanation facilities; knowledge editor including a fuzzy fact editor and a rules editor, with facts-rules integrity checking; belief revision mechanism; built-in case generator and validation module. It therefore provides graphical support for knowledge acquisition, edition, explanation and validation. From an application domain point of view, PSG-Expert is an auxiliary diagnosis system for sleep disorders based on polysomnographic data, that aims at assisting the medical expert in his diagnosis task by providing automatic analysis of polysomnographic data, summarising the results of this analysis in terms of a report of major findings and possible diagnosis consistent with the polysomnographic data. Sleep disorders classification follows the International Classification of Sleep Disorders. Major features of the system include: browsing on patients data records; structured navigation on Sleep Disorders descriptions according to ASDA definitions; internet links to related pages; diagnosis consistent with polysomnographic data; graphical user-interface including graph-based explanatory facilities; uncertainty modelling and belief revision; production of reports; connection to remote databases.

  5. Conceptual FOM design tool

    NASA Astrophysics Data System (ADS)

    Krause, Lee S.; Burns, Carla L.

    2000-06-01

    This paper discusses the research currently in progress to develop the Conceptual Federation Object Model Design Tool. The objective of the Conceptual FOM (C-FOM) Design Tool effort is to provide domain and subject matter experts, such as scenario developers, with automated support for understanding and utilizing available HLA simulation and other simulation assets during HLA Federation development. The C-FOM Design Tool will import Simulation Object Models from HLA reuse repositories, such as the MSSR, to populate the domain space that will contain all the objects and their supported interactions. In addition, the C-FOM tool will support the conversion of non-HLA legacy models into HLA- compliant models by applying proven abstraction techniques against the legacy models. Domain experts will be able to build scenarios based on the domain objects and interactions in both a text and graphical form and export a minimal FOM. The ability for domain and subject matter experts to effectively access HLA and non-HLA assets is critical to the long-term acceptance of the HLA initiative.

  6. Three-dimensional modeling of flexible pavements : research implementation plan.

    DOT National Transportation Integrated Search

    2006-02-14

    Many of the asphalt pavement analysis programs are based on linear elastic models. A linear viscoelastic model would be superior to linear elastic models for analyzing the response of asphalt concrete pavements to loads. There is a need to devel...

  7. Validity and validation of expert (Q)SAR systems.

    PubMed

    Hulzebos, E; Sijm, D; Traas, T; Posthumus, R; Maslankiewicz, L

    2005-08-01

    At a recent workshop in Setubal (Portugal), principles were drafted to assess the suitability of (quantitative) structure-activity relationships ((Q)SARs) for assessing the hazards and risks of chemicals. In the present study we applied some of the Setubal principles to test the validity of three (Q)SAR expert systems and validate the results. These principles include a mechanistic basis, the availability of a training set and validation. ECOSAR, BIOWIN and DEREK for Windows have a mechanistic or empirical basis. ECOSAR has a training set for each QSAR. For half of the structural fragments, the number of chemicals in the training set is greater than four. Based on structural fragments and log Kow, ECOSAR uses linear regression to predict ecotoxicity. Validating ECOSAR for three 'valid' classes results in a predictivity of ≥ 64%. BIOWIN uses (non-)linear regressions to predict the probability of biodegradability based on fragments and molecular weight. It has a large training set and predicts non-ready biodegradability well. DEREK for Windows predictions are supported by a mechanistic rationale and literature references. The structural alerts in this program have been developed with a training set of positive and negative toxicity data. However, to support the prediction, only a limited number of chemicals in the training set are presented to the user. DEREK for Windows predicts effects by 'if-then' reasoning. The program predicts best for mutagenicity and carcinogenicity. Each structural fragment in ECOSAR and DEREK for Windows needs to be evaluated and validated separately.
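
    A fragment-based linear regression of the kind ECOSAR uses can be sketched as follows. The fragment counts, log Kow values and toxicity endpoints below are made up for illustration; they are not ECOSAR's published training sets or coefficients:

```python
import numpy as np

# Illustrative QSAR training data.
# Columns of X: log Kow, count of chloro fragments, count of nitro fragments.
X = np.array([
    [1.5, 0, 0],
    [2.1, 1, 0],
    [3.0, 2, 0],
    [2.5, 0, 1],
    [4.0, 1, 1],
    [3.2, 0, 0],
])
y = np.array([-2.1, -3.0, -4.2, -3.4, -5.1, -3.6])   # e.g. log LC50

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(log_kow, n_chloro, n_nitro):
    """Predicted log toxicity for a new structure."""
    return float(coef @ [1.0, log_kow, n_chloro, n_nitro])
```

    In this toy fit, each additional chloro or nitro fragment lowers the predicted log LC50, i.e. increases predicted toxicity.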

  8. A comparison of two methods for expert elicitation in health technology assessments.

    PubMed

    Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken

    2016-07-26

    When data needed to inform parameters in decision models are lacking, formal elicitation of expert judgement can be used to characterise parameter uncertainty. Although numerous methods for eliciting expert opinion as probability distributions exist, there is little research to suggest whether one method is more useful than any other method. This study had three objectives: (i) to obtain subjective probability distributions characterising parameter uncertainty in the context of a health technology assessment; (ii) to compare two elicitation methods by eliciting the same parameters in different ways; (iii) to collect subjective preferences of the experts for the different elicitation methods used. Twenty-seven clinical experts were invited to participate in an elicitation exercise to inform a published model-based cost-effectiveness analysis of alternative treatments for prostate cancer. Participants were individually asked to express their judgements as probability distributions using two different methods - the histogram and hybrid elicitation methods - presented in a random order. Individual distributions were mathematically aggregated across experts with and without weighting. The resulting combined distributions were used in the probabilistic analysis of the decision model and mean incremental cost-effectiveness ratios and the expected values of perfect information (EVPI) were calculated for each method, and compared with the original cost-effectiveness analysis. Scores on the ease of use of the two methods and the extent to which the probability distributions obtained from each method accurately reflected the expert's opinion were also recorded. Six experts completed the task. Mean ICERs from the probabilistic analysis ranged between £162,600-£175,500 per quality-adjusted life year (QALY) depending on the elicitation and weighting methods used. 
Compared to having no information, use of expert opinion decreased decision uncertainty: the EVPI value at the £30,000 per QALY threshold decreased by 74-86 % from the original cost-effectiveness analysis. Experts indicated that the histogram method was easier to use, but attributed a perception of more accuracy to the hybrid method. Inclusion of expert elicitation can decrease decision uncertainty. Here, choice of method did not affect the overall cost-effectiveness conclusions, but researchers intending to use expert elicitation need to be aware of the impact different methods could have.
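
    Mathematical aggregation of elicited histogram distributions is commonly done by linear opinion pooling. A minimal sketch with two hypothetical experts and illustrative probability bins (the study's actual distributions and weighting schemes are not reproduced here):

```python
import numpy as np

# Each expert assigns probability mass to the same outcome bins.
bins = ["0-10%", "10-20%", "20-30%", "30-40%"]
expert_a = np.array([0.1, 0.4, 0.4, 0.1])
expert_b = np.array([0.0, 0.2, 0.5, 0.3])

def linear_pool(dists, weights=None):
    """Weighted average of expert distributions; equal weights by default."""
    dists = np.asarray(dists, dtype=float)
    w = (np.full(len(dists), 1.0 / len(dists)) if weights is None
         else np.asarray(weights, dtype=float))
    pooled = w @ dists
    return pooled / pooled.sum()

equal = linear_pool([expert_a, expert_b])
weighted = linear_pool([expert_a, expert_b], weights=[0.7, 0.3])
```

    The pooled distribution can then drive the probabilistic analysis of the decision model.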

  9. [Assessment of an educational technology in the string literature about breastfeeding].

    PubMed

    de Oliveira, Paula Marciana Pinheiro; Pagliuca, Lorita Marlena Freitag

    2013-02-01

    The goal of this study was to assess educational technology in the string literature about breastfeeding. The study was conducted between March and September 2009 by breastfeeding experts and experts on string literature. A psychometric model was adopted as the theoretical-methodological framework. For data collection, an instrument was used to assess the content about breastfeeding and the string literature rules. The analysis was based on comparisons of the notes and critical reflections of experts. Ethical guidelines were followed during the study. After the assessments, the educational technology was adjusted until all of the experts agreed. The assessment of educational technology can reduce obstacles to information dissemination and can lead to improvements in quality of life.

  10. Genomic prediction based on data from three layer lines using non-linear regression models.

    PubMed

    Huang, Heyun; Windig, Jack J; Vereijken, Addie; Calus, Mario P L

    2014-11-06

    Most studies on genomic prediction with reference populations that include multiple lines or breeds have used linear models. Data heterogeneity due to using multiple populations may conflict with model assumptions used in linear regression methods. In an attempt to alleviate potential discrepancies between assumptions of linear models and multi-population data, two types of alternative models were used: (1) a multi-trait genomic best linear unbiased prediction (GBLUP) model that modelled trait by line combinations as separate but correlated traits and (2) non-linear models based on kernel learning. These models were compared to conventional linear models for genomic prediction for two lines of brown layer hens (B1 and B2) and one line of white hens (W1). The three lines each had 1004 to 1023 training and 238 to 240 validation animals. Prediction accuracy was evaluated by estimating the correlation between observed phenotypes and predicted breeding values. When the training dataset included only data from the evaluated line, non-linear models yielded at best a similar accuracy as linear models. In some cases, when adding a distantly related line, the linear models showed a slight decrease in performance, while non-linear models generally showed no change in accuracy. When only information from a closely related line was used for training, linear models and non-linear radial basis function (RBF) kernel models performed similarly. The multi-trait GBLUP model took advantage of the estimated genetic correlations between the lines. Combining linear and non-linear models improved the accuracy of multi-line genomic prediction. Linear models and non-linear RBF models performed very similarly for genomic prediction, despite the expectation that non-linear models could deal better with the heterogeneous multi-population data. 
This heterogeneity of the data can be overcome by modelling trait by line combinations as separate but correlated traits, which avoids the occasional occurrence of large negative accuracies when the evaluated line was not included in the training dataset. Furthermore, when using a multi-line training dataset, non-linear models provided information on the genotype data that was complementary to the linear models, which indicates that the underlying data distributions of the three studied lines were indeed heterogeneous.
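
    The contrast between linear predictors and RBF-kernel models discussed above can be sketched with a minimal kernel ridge regression; `gamma` and the ridge penalty `lam` are illustrative hyperparameters, not values from the study.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kernel_ridge(X_train, y_train, X_test, gamma=1.0, lam=1e-6):
    """Kernel ridge regression: a non-linear analogue of linear genomic models."""
    K = rbf_kernel(X_train, X_train, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)
    return rbf_kernel(X_test, X_train, gamma) @ alpha
```

    With a linear kernel in place of the RBF kernel, the same machinery reduces to a ridge-penalised linear model, which is why the two families are directly comparable.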

  11. Aerothermal Assessment of the EXPERT Flap in the SCIROCCO Wind Tunnel

    NASA Astrophysics Data System (ADS)

    Walpot, L.; Di Clemente, M.; Vos, J.; Etchells, J.; Trifoni, E.; Thoemel, J.; Gavira, J.

    2011-05-01

    In the frame of the “In-Flight Test Measurement Techniques for Aerothermodynamics” activity of the EXPERT Program, the EXPERT Instrumented Open Flap Assembly experiment has the objective to verify the design/sensor integration and validate the CFD tools. Ground-based measurements were made in Europe’s largest high enthalpy plasma facility, Scirocco in Italy. Two EXPERT flaps of the flight article, instrumented with 14 thermocouples, 5 pressure ports, a pyrometer, and an IR camera mounted in the cavity of the instrumented flap, will collect in-flight data. During the Scirocco experiment, an EXPERT flap model identical to the flight article was mounted at 45 deg on a holder including a cavity and was subjected to a hot plasma flow at an enthalpy of up to 11 MJ/kg and a stagnation pressure of 7 bar. The test model carries the same pressure sensors as the flight article. State-of-the-art hypersonic codes were then used to perform code-to-code and wind tunnel-to-code comparisons, including the thermal response of the flap as collected during the tests by the sensors and camera.

  12. Expert system training and control based on the fuzzy relation matrix

    NASA Technical Reports Server (NTRS)

    Ren, Jie; Sheridan, T. B.

    1991-01-01

    Fuzzy knowledge, for which the terms of reference are not crisp but overlapping, seems to characterize human expertise. This can be seen from the fact that an experienced human operator can control some complex plants better than a computer can. Proposed here is the use of fuzzy theory to build a fuzzy expert relation matrix (FERM) from given rules and/or examples, in either linguistic terms or numerical values, to mimic human processes of perception and decision making. The knowledge base is codified in terms of many implicit fuzzy rules. Fuzzy knowledge thus codified may also be compared with explicit rules specified by a human expert. It can also provide a basis for modeling the human operator and allow comparison of what a human operator says with what he does in practice. Two experiments were performed. The first, control of liquid level in a tank, demonstrates how the FERM knowledge base is elicited and trained. The other shows how to use a FERM, built up from linguistic rules, to control an inverted pendulum without a dynamic model.
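
    The core inference step over a fuzzy relation matrix can be sketched with max-min composition; the relation values and input memberships below are invented for illustration and are not from the paper.

```python
import numpy as np

def fuzzy_infer(R, x):
    """Max-min composition: y_j = max_i min(x_i, R[i, j]).

    R: fuzzy relation matrix (input terms x output terms).
    x: membership degrees of the current input in each input term.
    """
    R = np.asarray(R, dtype=float)
    x = np.asarray(x, dtype=float)
    return np.minimum(x[:, None], R).max(axis=0)
```

    The output vector gives the membership of each control action, which is typically defuzzified (e.g. by a weighted centroid) into a crisp command.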

  13. Many-core graph analytics using accelerated sparse linear algebra routines

    NASA Astrophysics Data System (ADS)

    Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric

    2016-05-01

    Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without requiring the customer to make any changes to their analytics code, thanks to compatibility with existing graph APIs.
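
    The core idea of graph traversal as linear algebra can be sketched as a breadth-first search driven by matrix-vector products; dense NumPy stands in here for a true sparse GraphBLAS backend, and the graph is invented for illustration.

```python
import numpy as np

def bfs_levels(adjacency, source):
    """BFS where each traversal step is one vector-matrix product, GraphBLAS-style.

    adjacency: n x n 0/1 matrix, adjacency[i][j] = 1 for an edge i -> j.
    Returns the BFS level of each vertex (-1 if unreachable from source).
    """
    A = np.asarray(adjacency, dtype=int)
    n = A.shape[0]
    frontier = np.zeros(n, dtype=bool)
    frontier[source] = True
    visited = frontier.copy()
    levels = np.full(n, -1)
    levels[source] = 0
    level = 0
    while frontier.any():
        level += 1
        # expanding the frontier == multiplying it by the adjacency matrix
        reached = (frontier.astype(int) @ A) > 0
        frontier = reached & ~visited
        levels[frontier] = level
        visited |= frontier
    return levels
```

    In a real GraphBLAS implementation the same loop runs over a sparse matrix with a Boolean semiring, so the cost scales with edges touched rather than n².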

  14. Can upstaging of ductal carcinoma in situ be predicted at biopsy by histologic and mammographic features?

    NASA Astrophysics Data System (ADS)

    Shi, Bibo; Grimm, Lars J.; Mazurowski, Maciej A.; Marks, Jeffrey R.; King, Lorraine M.; Maley, Carlo C.; Hwang, E. Shelley; Lo, Joseph Y.

    2017-03-01

    Reducing the overdiagnosis and overtreatment associated with ductal carcinoma in situ (DCIS) requires accurate prediction of the invasive potential at cancer screening. In this work, we investigated the utility of pre-operative histologic and mammographic features to predict upstaging of DCIS. The goal was to provide an intentionally conservative baseline performance using readily available data from radiologists and pathologists and only linear models. We conducted a retrospective analysis on 99 patients with DCIS. Of those, 25 were upstaged to invasive cancer at the time of definitive surgery. Pre-operative factors, including both the histologic features extracted from stereotactic core needle biopsy (SCNB) reports and the mammographic features annotated by an expert breast radiologist, were investigated with statistical analysis. Furthermore, we built classification models based on those features in an attempt to predict the presence of an occult invasive component in DCIS, with generalization performance assessed by receiver operating characteristic (ROC) curve analysis. Histologic features, including nuclear grade and DCIS subtype, did not show statistically significant differences between cases with pure DCIS and with DCIS plus invasive disease. However, three mammographic features, i.e., the major axis length of the DCIS lesion, the BI-RADS level of suspicion, and the radiologist's assessment, did achieve statistical significance. Using those three statistically significant features as input, a linear discriminant model was able to distinguish patients with DCIS plus invasive disease from those with pure DCIS, with AUC-ROC equal to 0.62. Overall, mammograms used for breast screening contain useful information that can be perceived by radiologists and help predict occult invasive components in DCIS.
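
    A minimal sketch of the kind of linear discriminant plus ROC evaluation described above, on invented toy features rather than the study's data; the ridge term and the tie-ignoring AUC are simplifying assumptions.

```python
import numpy as np

def fisher_direction(X_neg, X_pos, ridge=1e-6):
    """Fisher linear discriminant: direction best separating the two classes."""
    Sw = (np.cov(X_neg.T, bias=True) * len(X_neg)
          + np.cov(X_pos.T, bias=True) * len(X_pos))   # within-class scatter
    return np.linalg.solve(Sw + ridge * np.eye(X_neg.shape[1]),
                           X_pos.mean(0) - X_neg.mean(0))

def auc_roc(scores, labels):
    """Rank-based AUC-ROC (ties ignored for brevity)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

    Projecting each case onto the Fisher direction yields a scalar score, and the AUC summarizes how well those scores rank upstaged above non-upstaged cases.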

  15. Development of Conceptual Models for Internet Search: A Case Study.

    ERIC Educational Resources Information Center

    Uden, Lorna; Tearne, Stephen; Alderson, Albert

    This paper describes the creation and evaluation of a World Wide Web-based courseware module, using conceptual models based on constructivism, that teaches novices how to use the Internet for searching. Questionnaires and interviews were used to understand the difficulties of a group of novices. The conceptual model of the experts for the task was…

  16. Renewable energy education and industrial arts: linking knowledge producers with knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, R.L.

    This study introduces renewable energy technology into the industrial arts programs in the State of New Hampshire by providing the following information for decision making: (1) a broad-based perspective on renewable energy technology; (2) the selection of an educational change model; (3) data from a needs analysis; (4) an initial screening of potential teacher-trainers. The Wolf-Welsh Linkage Model was selected as the knowledge production/utilization model for bridging the knowledge gap between renewable energy experts and industrial arts teachers. Ninety-six renewable energy experts were identified by a three-step peer nomination process (92% response rate). The experts stressed the conceptual foundations, economic justifications, and the scientific and quantitative basics of renewable energy technology. The teachers focused on wood-burning technology, educational strategies, and the more popular alternative energy sources such as windpower, hydropower, photovoltaics, and biomass. The most emphatic contribution of the needs analysis was the experts' and teachers' shared perception that residential/commercial building design, retrofitting, and construction is the single most important practical, technical area for the application of renewable energy technology.

  17. The SCERTS[TM] Model: A Comprehensive Educational Approach for Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Prizant, Barry M.; Wetherby, Amy M.; Rubin, Emily; Laurent, Amy C.; Rydell, Patrick J.

    2005-01-01

    A groundbreaking synthesis of developmental, relationship-based, and skill-based approaches, The SCERTS[TM] Model provides a framework for improving communication and social-emotional abilities in individuals with autism spectrum disorders (ASD) and their families. Developed by internationally recognized experts, SCERTS[TM] supports developmental…

  18. The Analysis on Systematic Development of College Microlecture

    ERIC Educational Resources Information Center

    Liu, Xiaohong; Wang, Lisi

    2013-01-01

    In order to apply micro lectures to college education successfully, construct new teaching and learning strategies and teaching model, this paper proposes characteristics of college microlecture based on the college education features and construct microlecture structure model based on the definitions by the experts and scholars. Microlecture's…

  19. Development of a biologically based dose response (BBDR) model for arsenic induced cancer

    EPA Science Inventory

    We are developing a biologically based dose response (BBDR) model for arsenic carcinogenicity in order to reduce uncertainty in estimates of low dose risk by maximizing the use of relevant data on the mode of action. Expert consultation and literature review are being conducted t...

  20. Information Retrieval Using UMLS-based Structured Queries

    PubMed Central

    Fagan, Lawrence M.; Berrios, Daniel C.; Chan, Albert; Cucina, Russell; Datta, Anupam; Shah, Maulik; Surendran, Sujith

    2001-01-01

    During the last three years, we have developed and described components of ELBook, a semantically based information-retrieval system [1-4]. Using these components, domain experts can specify a query model, indexers can use the query model to index documents, and end-users can search these documents for instances of indexed queries.

  1. Graph-based real-time fault diagnostics

    NASA Technical Reports Server (NTRS)

    Padalkar, S.; Karsai, G.; Sztipanovits, J.

    1988-01-01

    A real-time fault detection and diagnosis capability is absolutely crucial in the design of large-scale space systems. Some of the existing AI-based fault diagnostic techniques, like expert systems and qualitative modelling, are frequently ill-suited for this purpose. Expert systems are often inadequately structured, difficult to validate and suffer from knowledge acquisition bottlenecks. Qualitative modelling techniques sometimes generate a large number of failure source alternatives, thus hampering speedy diagnosis. In this paper we present a graph-based technique which is well suited for real-time fault diagnosis, structured knowledge representation and acquisition, and testing and validation. A Hierarchical Fault Model of the system to be diagnosed is developed. At each level of the hierarchy, there exist fault propagation digraphs denoting causal relations between failure modes of subsystems. The edges of such a digraph are weighted with fault propagation time intervals. Efficient and restartable graph algorithms are used for on-line speedy identification of failure source components.
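
    One building block of such a diagnosis, finding which components could have caused an observed alarm, can be sketched as a backward walk over the fault propagation digraph. The edge structure, delay bounds, and component names below are invented for illustration, and the timing information is carried but not exploited here.

```python
from collections import deque

def candidate_sources(edges, alarmed_node):
    """Walk a fault-propagation digraph backwards from an alarmed component.

    edges: dict mapping node -> list of (successor, min_delay, max_delay).
    Returns all nodes whose failure could have propagated to alarmed_node.
    """
    # build the reverse adjacency once
    reverse = {}
    for src, outs in edges.items():
        for dst, lo, hi in outs:
            reverse.setdefault(dst, []).append(src)
    seen, queue = {alarmed_node}, deque([alarmed_node])
    while queue:
        node = queue.popleft()
        for pred in reverse.get(node, []):
            if pred not in seen:
                seen.add(pred)
                queue.append(pred)
    return seen - {alarmed_node}
```

    A real-time diagnoser would additionally prune candidates whose propagation-time intervals are inconsistent with the observed alarm times.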

  2. Analyzing the Language of Therapist Empathy in Motivational Interview based Psychotherapy

    PubMed Central

    Xiao, Bo; Can, Dogan; Georgiou, Panayiotis G.; Atkins, David; Narayanan, Shrikanth S.

    2016-01-01

    Empathy is an important aspect of social communication, especially in medical and psychotherapy applications. Measures of empathy can offer insights into the quality of therapy. We use an N-gram language model based maximum likelihood strategy to classify empathic versus non-empathic utterances and report the precision and recall of classification for various parameters. High recall is obtained with unigram features, while bigram features achieve the highest F1-score. Based on the utterance level models, a group of lexical features are extracted at the therapy session level. The effectiveness of these features in modeling session level annotator perceptions of empathy is evaluated through correlation with expert-coded session level empathy scores. Our combined feature set achieved a correlation of 0.558 between predicted and expert-coded empathy scores. Results also suggest that the longer term empathy perception process may be more related to isolated empathic salient events. PMID:27602411
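
    The unigram maximum-likelihood decision between two class language models might be sketched as follows; the toy utterances and the add-one smoothing choice are illustrative, not the paper's setup.

```python
import math
from collections import Counter

def train_unigram(texts):
    """Unigram counts, total token count, and vocabulary size for one class."""
    counts = Counter(w for t in texts for w in t.lower().split())
    return counts, sum(counts.values()), len(counts)

def log_likelihood(text, model):
    counts, total, vocab = model
    # add-one smoothing so unseen words get non-zero probability
    return sum(math.log((counts[w] + 1) / (total + vocab + 1))
               for w in text.lower().split())

def classify(text, empathic_model, other_model):
    """Maximum-likelihood decision between the two class models."""
    return ("empathic"
            if log_likelihood(text, empathic_model)
            >= log_likelihood(text, other_model)
            else "non-empathic")
```

    Bigram models follow the same pattern with word-pair counts conditioned on the previous word.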

  3. School Funding and Resource Allocation: How It Impacts Instructional Practices at the School Level

    ERIC Educational Resources Information Center

    Wall, Shelly R.

    2012-01-01

    In the 2006-2007 school year, the State of Wyoming adopted an evidenced-based school funding model. The Wyoming funding model reviewed in this study is considered an evidence-based approach, utilizing expert judgment to determine educational funding. In an evidence-based approach, educational strategies are identified and a dollar figure is…

  4. Automatic computation of 2D cardiac measurements from B-mode echocardiography

    NASA Astrophysics Data System (ADS)

    Park, JinHyeong; Feng, Shaolei; Zhou, S. Kevin

    2012-03-01

    We propose a robust and fully automatic algorithm which computes the 2D echocardiography measurements recommended by the American Society of Echocardiography. The algorithm employs knowledge-based imaging technologies which can learn an expert's knowledge from training images and the expert's annotations. Based on the models constructed in the learning stage, the algorithm searches for the initial locations of the landmark points for the measurements by utilizing the heart structure of the left ventricle, including the mitral valve and aortic valve. It employs a pseudo-anatomic M-mode image, generated by accumulating the line images in the 2D parasternal long-axis view over time, to refine the measurement landmark points. Experimental results with a large volume of data show that the algorithm runs fast and is robust, with accuracy comparable to an expert's.

  5. Regional income inequality model based on Theil index decomposition and weighted variance coefficient

    NASA Astrophysics Data System (ADS)

    Sitepu, H. R.; Darnius, O.; Tambunan, W. N.

    2018-03-01

    Regional income inequality is an important issue in the study of the economic development of a region. Rapid economic development may not be in accordance with people's per capita income. Methods of measuring regional income inequality have been suggested by many experts. This research used the Theil index and the weighted variation coefficient to measure regional income inequality. Based on the Theil index, the decomposition of regional income into work-force productivity and work-force participation components can be presented as a linear relation. Using economic assumptions for sector j, sectoral income values, and work-force rates, the work-force productivity imbalance can be decomposed into between-sector and intra-sector components. Next, the weighted variation coefficient is defined for the revenue and the productivity of the work force. From the square of the weighted variation coefficient, the decomposition of the regional revenue imbalance can be analyzed by determining how much each component contributes to the regional imbalance; in this research, nine sectors of economic activity were analyzed.
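
    The Theil index and its between-group/within-group decomposition can be sketched as follows; the income figures and grouping in the example are invented, and positive incomes are assumed throughout.

```python
import numpy as np

def theil_index(incomes):
    """Theil T index: (1/n) * sum_i (y_i / mean) * ln(y_i / mean)."""
    y = np.asarray(incomes, dtype=float)
    s = y / y.mean()
    return float(np.mean(s * np.log(s)))

def theil_decomposition(incomes, groups):
    """Split total inequality into between-group and within-group parts.

    Satisfies: theil_index(incomes) == between + within.
    """
    y = np.asarray(incomes, dtype=float)
    g = np.asarray(groups)
    Y, n = y.sum(), len(y)
    between = within = 0.0
    for grp in np.unique(g):
        yg = y[g == grp]
        share, pop_share = yg.sum() / Y, len(yg) / n
        between += share * np.log(share / pop_share)   # inequality across groups
        within += share * theil_index(yg)              # inequality inside groups
    return between, within
```

    The same pattern applies whether the groups are regions or economic sectors, as in the study.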

  6. Automatic segmentation of invasive breast carcinomas from dynamic contrast-enhanced MRI using time series analysis.

    PubMed

    Jayender, Jagadaeesan; Chikarmane, Sona; Jolesz, Ferenc A; Gombos, Eva

    2014-08-01

    To accurately segment invasive ductal carcinomas (IDCs) from dynamic contrast-enhanced MRI (DCE-MRI) using time series analysis based on linear dynamic system (LDS) modeling. Quantitative segmentation methods based on black-box modeling and pharmacokinetic modeling are highly dependent on imaging pulse sequence, timing of bolus injection, arterial input function, imaging noise, and fitting algorithms. We modeled the underlying dynamics of the tumor by an LDS and used the system parameters to segment the carcinoma on the DCE-MRI. Twenty-four patients with biopsy-proven IDCs were analyzed. The lesions segmented by the algorithm were compared with an expert radiologist's segmentation and the output of a commercial software, CADstream. The results are quantified in terms of the accuracy and sensitivity of detecting the lesion and the amount of overlap, measured in terms of the Dice similarity coefficient (DSC). The segmentation algorithm detected the tumor with 90% accuracy and 100% sensitivity when compared with the radiologist's segmentation and 82.1% accuracy and 100% sensitivity when compared with the CADstream output. The overlap of the algorithm output with the radiologist's segmentation and CADstream output, computed in terms of the DSC was 0.77 and 0.72, respectively. The algorithm also shows robust stability to imaging noise. Simulated imaging noise with zero mean and standard deviation equal to 25% of the base signal intensity was added to the DCE-MRI series. The amount of overlap between the tumor maps generated by the LDS-based algorithm from the noisy and original DCE-MRI was DSC = 0.95. The time-series analysis based segmentation algorithm provides high accuracy and sensitivity in delineating the regions of enhanced perfusion corresponding to tumor from DCE-MRI. © 2013 Wiley Periodicals, Inc.
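
    The Dice similarity coefficient used above to score segmentation overlap is straightforward to sketch for binary masks:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """DSC = 2|A ∩ B| / (|A| + |B|) for two binary segmentation masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    # convention: two empty masks agree perfectly
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```

    A DSC of 1 means identical masks and 0 means no overlap; the study's 0.77 against the radiologist indicates substantial agreement.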

  7. Automatic Segmentation of Invasive Breast Carcinomas from DCE-MRI using Time Series Analysis

    PubMed Central

    Jayender, Jagadaeesan; Chikarmane, Sona; Jolesz, Ferenc A.; Gombos, Eva

    2013-01-01

    Purpose Quantitative segmentation methods based on black-box modeling and pharmacokinetic modeling are highly dependent on imaging pulse sequence, timing of bolus injection, arterial input function, imaging noise and fitting algorithms. To accurately segment invasive ductal carcinomas (IDCs) from dynamic contrast enhanced MRI (DCE-MRI) using time series analysis based on linear dynamic system (LDS) modeling. Methods We modeled the underlying dynamics of the tumor by an LDS and used the system parameters to segment the carcinoma on the DCE-MRI. Twenty-four patients with biopsy-proven IDCs were analyzed. The lesions segmented by the algorithm were compared with an expert radiologist’s segmentation and the output of a commercial software, CADstream. The results are quantified in terms of the accuracy and sensitivity of detecting the lesion and the amount of overlap, measured in terms of the Dice similarity coefficient (DSC). Results The segmentation algorithm detected the tumor with 90% accuracy and 100% sensitivity when compared to the radiologist’s segmentation and 82.1% accuracy and 100% sensitivity when compared to the CADstream output. The overlap of the algorithm output with the radiologist’s segmentation and CADstream output, computed in terms of the DSC, was 0.77 and 0.72, respectively. The algorithm also shows robust stability to imaging noise. Simulated imaging noise with zero mean and standard deviation equal to 25% of the base signal intensity was added to the DCE-MRI series. The amount of overlap between the tumor maps generated by the LDS-based algorithm from the noisy and original DCE-MRI was DSC=0.95. Conclusion The time-series analysis based segmentation algorithm provides high accuracy and sensitivity in delineating the regions of enhanced perfusion corresponding to tumor from DCE-MRI. PMID:24115175

  8. Structural protein descriptors in 1-dimension and their sequence-based predictions.

    PubMed

    Kurgan, Lukasz; Disfani, Fatemeh Miri

    2011-09-01

    The last few decades observed an increasing interest in the development and application of 1-dimensional (1D) descriptors of protein structure. These descriptors project 3D structural features onto 1D strings of residue-wise structural assignments. They cover a wide range of structural aspects including conformation of the backbone, burying depth/solvent exposure and flexibility of residues, and inter-chain residue-residue contacts. We perform a first-of-its-kind comprehensive comparative review of the existing 1D structural descriptors. We define, review and categorize ten structural descriptors, and we also describe, summarize and contrast over eighty computational models that are used to predict these descriptors from the protein sequences. We show that the majority of the recent sequence-based predictors utilize machine learning models, with the most popular being neural networks, support vector machines, hidden Markov models, and support vector and linear regressions. These methods provide high-throughput predictions and most of them are accessible to a non-expert user via web servers and/or stand-alone software packages. We empirically evaluate several recent sequence-based predictors of secondary structure, disorder, and solvent accessibility descriptors using a benchmark set based on CASP8 targets. Our analysis shows that the secondary structure can be predicted with over 80% accuracy and segment overlap (SOV), disorder with over 0.9 AUC, 0.6 Matthews Correlation Coefficient (MCC), and 75% SOV, and relative solvent accessibility with PCC of 0.7 and MCC of 0.6 (0.86 when homology is used). We demonstrate that the secondary structure predicted from sequence without the use of homology modeling is as good as the structure extracted from the 3D folds predicted by top-performing template-based methods.
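
    The Matthews Correlation Coefficient cited in the benchmark figures above follows directly from the binary confusion counts:

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """MCC for binary labels; returns 0 when any marginal count is zero."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0
```

    MCC ranges from -1 (total disagreement) through 0 (chance level) to +1 (perfect prediction), which makes it robust for the imbalanced classes typical of disorder prediction.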

  9. Assimilation of a knowledge base and physical models to reduce errors in passive-microwave classifications of sea ice

    NASA Technical Reports Server (NTRS)

    Maslanik, J. A.; Key, J.

    1992-01-01

    An expert system framework has been developed to classify sea ice types using satellite passive microwave data, an operational classification algorithm, spatial and temporal information, ice types estimated from a dynamic-thermodynamic model, output from a neural network that detects the onset of melt, and knowledge about season and region. The rule base imposes boundary conditions upon the ice classification, modifies parameters in the ice algorithm, determines a 'confidence' measure for the classified data, and under certain conditions, replaces the algorithm output with model output. Results demonstrate the potential power of such a system for minimizing overall error in the classification and for providing non-expert data users with a means of assessing the usefulness of the classification results for their applications.

  10. Vivaldi: visualization and validation of biomacromolecular NMR structures from the PDB.

    PubMed

    Hendrickx, Pieter M S; Gutmanas, Aleksandras; Kleywegt, Gerard J

    2013-04-01

    We describe Vivaldi (VIsualization and VALidation DIsplay; http://pdbe.org/vivaldi), a web-based service for the analysis, visualization, and validation of NMR structures in the Protein Data Bank (PDB). Vivaldi provides access to model coordinates and several types of experimental NMR data using interactive visualization tools, augmented with structural annotations and model-validation information. The service presents information about the modeled NMR ensemble, validation of experimental chemical shifts, residual dipolar couplings, distance and dihedral angle constraints, as well as validation scores based on empirical knowledge and databases. Vivaldi was designed for both expert NMR spectroscopists and casual non-expert users who wish to obtain a better grasp of the information content and quality of NMR structures in the public archive. Copyright © 2013 Wiley Periodicals, Inc.

  11. Development of uncertainty-based work injury model using Bayesian structural equation modelling.

    PubMed

    Chatterjee, Snehamoy

    2014-01-01

    This paper proposed a Bayesian method-based structural equation model (SEM) of miners' work injury for an underground coal mine in India. The environmental and behavioural variables for work injury were identified and causal relationships were developed. For Bayesian modelling, prior distributions of the SEM parameters are necessary to develop the model. In this paper, two approaches were adopted to obtain prior distributions for the factor loading parameters and structural parameters of the SEM. In the first approach, the prior distributions were considered as fixed distribution functions with specific parameter values, whereas, in the second approach, prior distributions of the parameters were generated from experts' opinions. The posterior distributions of these parameters were obtained by applying Bayes' rule. Markov Chain Monte Carlo sampling in the form of Gibbs sampling was applied for sampling from the posterior distribution. The results revealed that all coefficients of the structural and measurement model parameters are statistically significant under the experts' opinion-based priors, whereas two coefficients are not statistically significant when the fixed prior-based distributions are applied. The error statistics reveal that the Bayesian structural model provides a reasonably good fit of work injury, with a high coefficient of determination (0.91) and less mean squared error as compared to traditional SEM.
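
    The Gibbs-sampling step can be illustrated on the simplest conjugate case, the mean and precision of normal data; the vague Normal and Gamma prior values below are illustrative choices, not the paper's, and a full SEM would alternate over many more parameter blocks in the same way.

```python
import numpy as np

def gibbs_normal(data, n_iter=2000, seed=0):
    """Gibbs sampler for the mean (mu) and precision (tau) of normal data
    under conjugate Normal and Gamma priors (illustrative 'vague' values)."""
    rng = np.random.default_rng(seed)
    y = np.asarray(data, dtype=float)
    n = len(y)
    mu0, tau0 = 0.0, 1e-6        # vague Normal prior on the mean
    a0, b0 = 1e-3, 1e-3          # vague Gamma prior on the precision
    mu, tau = y.mean(), 1.0      # initial state
    samples = []
    for _ in range(n_iter):
        # draw mu | tau, y  from its Normal full conditional
        prec = tau0 + n * tau
        mean = (tau0 * mu0 + tau * y.sum()) / prec
        mu = rng.normal(mean, 1.0 / np.sqrt(prec))
        # draw tau | mu, y  from its Gamma full conditional
        a = a0 + n / 2
        b = b0 + 0.5 * ((y - mu) ** 2).sum()
        tau = rng.gamma(a, 1.0 / b)
        samples.append((mu, tau))
    return np.array(samples)
```

    Discarding an initial burn-in and averaging the remaining draws approximates the posterior means, mirroring how the paper's SEM coefficients are summarized.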

  12. Crowdsourcing: a valid alternative to expert evaluation of robotic surgery skills.

    PubMed

    Polin, Michael R; Siddiqui, Nazema Y; Comstock, Bryan A; Hesham, Helai; Brown, Casey; Lendvay, Thomas S; Martino, Martin A

    2016-11-01

    Robotic-assisted gynecologic surgery is common, but requires unique training. A validated assessment tool for evaluating trainees' robotic surgery skills is Robotic-Objective Structured Assessments of Technical Skills. We sought to assess whether crowdsourcing can be used as an alternative to expert surgical evaluators in scoring Robotic-Objective Structured Assessments of Technical Skills. The Robotic Training Network produced the Robotic-Objective Structured Assessments of Technical Skills, which evaluate trainees across 5 dry lab robotic surgical drills. Robotic-Objective Structured Assessments of Technical Skills were previously validated in a study of 105 participants, where dry lab surgical drills were recorded, de-identified, and scored by 3 expert surgeons using the Robotic-Objective Structured Assessments of Technical Skills checklist. Our methods-comparison study uses these previously obtained recordings and expert surgeon scores. Mean scores per participant from each drill were separated into quartiles. Crowdworkers were trained and calibrated on Robotic-Objective Structured Assessments of Technical Skills scoring using a representative recording of a skilled and novice surgeon. Following this, 3 recordings from each scoring quartile for each drill were randomly selected. Crowdworkers evaluated the randomly selected recordings using Robotic-Objective Structured Assessments of Technical Skills. Linear mixed effects models were used to derive mean crowdsourced ratings for each drill. Pearson correlation coefficients were calculated to assess the correlation between crowdsourced and expert surgeons' ratings. In all, 448 crowdworkers reviewed videos from 60 dry lab drills, and completed a total of 2517 Robotic-Objective Structured Assessments of Technical Skills assessments within 16 hours. 
Crowdsourced Robotic-Objective Structured Assessments of Technical Skills ratings were highly correlated with expert surgeon ratings across each of the 5 dry lab drills (r ranging from 0.75-0.91). Crowdsourced assessments of recorded dry lab surgical drills using a validated assessment tool are a rapid and suitable alternative to expert surgeon evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.
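
    The Pearson correlation coefficient used above to compare crowd and expert ratings is a standard calculation:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two rating vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))
```

    Values near +1, as reported for the five drills (0.75-0.91), indicate that crowd ratings rise and fall closely in step with expert ratings.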

  13. Prioritizing Measures of Digital Patient Engagement: A Delphi Expert Panel Study

    PubMed Central

    2017-01-01

    Background Establishing a validated scale of patient engagement through use of information technology (ie, digital patient engagement) is the first step to understanding its role in health and health care quality, outcomes, and efficient implementation by health care providers and systems. Objective The aim of this study was to develop and prioritize measures of digital patient engagement based on patients’ use of the US Department of Veterans Affairs (VA)’s MyHealtheVet (MHV) portal, focusing on the MHV/Blue Button and Secure Messaging functions. Methods We aligned two models from the information systems and organizational behavior literatures to create a theory-based model of digital patient engagement. On the basis of this model, we conducted ten key informant interviews to identify potential measures from existing VA studies and consolidated the measures. We then conducted three rounds of modified Delphi rating by 12 national eHealth experts via Web-based surveys to prioritize the measures. Results All 12 experts completed the study’s three rounds of modified Delphi ratings, resulting in two sets of final candidate measures representing digital patient engagement for Secure Messaging (58 measures) and MHV/Blue Button (71 measures). These measure sets map to Donabedian’s three types of quality measures: (1) antecedents (eg, patient demographics); (2) processes (eg, a novel measure of Web-based care quality); and (3) outcomes (eg, patient engagement). Conclusions This national expert panel study using a modified Delphi technique prioritized candidate measures to assess digital patient engagement through patients’ use of VA’s My HealtheVet portal. The process yielded two robust measures sets prepared for future piloting and validation in surveys among Veterans. PMID:28550008

  14. Prioritizing Measures of Digital Patient Engagement: A Delphi Expert Panel Study.

    PubMed

    Garvin, Lynn A; Simon, Steven R

    2017-05-26

    Establishing a validated scale of patient engagement through use of information technology (ie, digital patient engagement) is the first step to understanding its role in health and health care quality, outcomes, and efficient implementation by health care providers and systems. The aim of this study was to develop and prioritize measures of digital patient engagement based on patients' use of the US Department of Veterans Affairs (VA)'s MyHealtheVet (MHV) portal, focusing on the MHV/Blue Button and Secure Messaging functions. We aligned two models from the information systems and organizational behavior literatures to create a theory-based model of digital patient engagement. On the basis of this model, we conducted ten key informant interviews to identify potential measures from existing VA studies and consolidated the measures. We then conducted three rounds of modified Delphi rating by 12 national eHealth experts via Web-based surveys to prioritize the measures. All 12 experts completed the study's three rounds of modified Delphi ratings, resulting in two sets of final candidate measures representing digital patient engagement for Secure Messaging (58 measures) and MHV/Blue Button (71 measures). These measure sets map to Donabedian's three types of quality measures: (1) antecedents (eg, patient demographics); (2) processes (eg, a novel measure of Web-based care quality); and (3) outcomes (eg, patient engagement). This national expert panel study using a modified Delphi technique prioritized candidate measures to assess digital patient engagement through patients' use of VA's My HealtheVet portal. The process yielded two robust measures sets prepared for future piloting and validation in surveys among Veterans. ©Lynn A Garvin, Steven R Simon. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 26.05.2017.

  15. Statistical Methodology for the Analysis of Repeated Duration Data in Behavioral Studies.

    PubMed

    Letué, Frédérique; Martinez, Marie-José; Samson, Adeline; Vilain, Anne; Vilain, Coriandre

    2018-03-15

Repeated duration data are frequently used in behavioral studies. Classical linear or log-linear mixed models are often inadequate to analyze such data, because they usually consist of nonnegative and skew-distributed variables. Therefore, we recommend the use of a statistical methodology specific to duration data. We propose a methodology based on Cox mixed models, implemented in the R language. This semiparametric model is indeed flexible enough to fit duration data. To compare log-linear and Cox mixed models in terms of goodness-of-fit on real data sets, we also provide a procedure based on simulations and quantile-quantile plots. We present two examples from a data set of speech and gesture interactions, which illustrate the limitations of linear and log-linear mixed models, as compared to Cox models. The linear models are not validated on our data, whereas Cox models are. Moreover, in the second example, the Cox model exhibits a significant effect that the linear model does not. We provide methods to select the best-fitting models for repeated duration data and to compare statistical methodologies. In this study, we show that Cox models are best suited to the analysis of our data set.
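The basic mismatch the authors describe, fitting nonnegative, right-skewed durations with a Gaussian linear model, can be illustrated with a toy version of their simulation-and-quantile comparison. This sketch is in Python rather than R, and all distributions and sample sizes are invented for illustration; the paper's actual procedure compares Cox mixed models fitted to real data.

```python
import math
import random
import statistics
from statistics import NormalDist

random.seed(1)

# Simulate nonnegative, right-skewed "durations", the data shape the
# authors say defeats classical linear (Gaussian) mixed models.
durs = [random.lognormvariate(0.0, 0.8) for _ in range(5000)]

def empirical_quantile(xs, p):
    s = sorted(xs)
    return s[min(len(s) - 1, int(p * len(s)))]

probs = [i / 20 for i in range(1, 20)]  # 0.05 .. 0.95

# Gaussian fit on the raw scale (the "linear model" view of the data)
nd_raw = NormalDist(statistics.mean(durs), statistics.stdev(durs))
# Gaussian fit on the log scale (the "log-linear model" view)
logs = [math.log(d) for d in durs]
nd_log = NormalDist(statistics.mean(logs), statistics.stdev(logs))

def qq_gap(model_quantile):
    """Mean gap between empirical and model quantiles: a numerical
    stand-in for eyeballing a quantile-quantile plot."""
    return sum(abs(empirical_quantile(durs, p) - model_quantile(p))
               for p in probs) / len(probs)

gap_linear = qq_gap(nd_raw.inv_cdf)
gap_loglin = qq_gap(lambda p: math.exp(nd_log.inv_cdf(p)))
print(gap_linear, gap_loglin)  # the raw-scale Gaussian fits far worse
# Note: the raw-scale Gaussian even assigns mass to negative durations.
```

The same quantile-gap comparison, applied to simulated data from each candidate model, is the spirit of the paper's goodness-of-fit procedure.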

  16. Assessing socioeconomic vulnerability to dengue fever in Cali, Colombia: statistical vs expert-based modeling

    PubMed Central

    2013-01-01

    Background As a result of changes in climatic conditions and greater resistance to insecticides, many regions across the globe, including Colombia, have been facing a resurgence of vector-borne diseases, and dengue fever in particular. Timely information on both (1) the spatial distribution of the disease, and (2) prevailing vulnerabilities of the population are needed to adequately plan targeted preventive intervention. We propose a methodology for the spatial assessment of current socioeconomic vulnerabilities to dengue fever in Cali, a tropical urban environment of Colombia. Methods Based on a set of socioeconomic and demographic indicators derived from census data and ancillary geospatial datasets, we develop a spatial approach for both expert-based and purely statistical-based modeling of current vulnerability levels across 340 neighborhoods of the city using a Geographic Information System (GIS). The results of both approaches are comparatively evaluated by means of spatial statistics. A web-based approach is proposed to facilitate the visualization and the dissemination of the output vulnerability index to the community. Results The statistical and the expert-based modeling approach exhibit a high concordance, globally, and spatially. The expert-based approach indicates a slightly higher vulnerability mean (0.53) and vulnerability median (0.56) across all neighborhoods, compared to the purely statistical approach (mean = 0.48; median = 0.49). Both approaches reveal that high values of vulnerability tend to cluster in the eastern, north-eastern, and western part of the city. These are poor neighborhoods with high percentages of young (i.e., < 15 years) and illiterate residents, as well as a high proportion of individuals being either unemployed or doing housework. Conclusions Both modeling approaches reveal similar outputs, indicating that in the absence of local expertise, statistical approaches could be used, with caution. 
By decomposing identified vulnerability “hotspots” into their underlying factors, our approach provides valuable information on both (1) the location of neighborhoods, and (2) vulnerability factors that should be given priority in the context of targeted intervention strategies. The results support decision makers in allocating resources in a manner that may reduce existing susceptibilities and strengthen resilience, and thus help to reduce the burden of vector-borne diseases. PMID:23945265
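The expert-based variant of such a composite index can be sketched as a min-max-normalised, expert-weighted sum of indicators. All neighbourhood names, indicator values, and weights below are invented for illustration, not taken from the Cali study.

```python
# Toy composite-index construction for three neighbourhoods; higher
# indicator values mean higher vulnerability.
indicators = {
    "barrio_A": {"pct_under_15": 34.0, "pct_illiterate": 12.0, "pct_unemployed": 18.0},
    "barrio_B": {"pct_under_15": 21.0, "pct_illiterate": 4.0,  "pct_unemployed": 9.0},
    "barrio_C": {"pct_under_15": 29.0, "pct_illiterate": 9.0,  "pct_unemployed": 15.0},
}
expert_weights = {"pct_under_15": 0.3, "pct_illiterate": 0.3, "pct_unemployed": 0.4}

keys = list(expert_weights)
lo = {k: min(v[k] for v in indicators.values()) for k in keys}
hi = {k: max(v[k] for v in indicators.values()) for k in keys}

def vulnerability(vals):
    """Min-max normalise each indicator to [0, 1], then take the
    expert-weighted average (the expert-based variant; a purely
    statistical variant would derive the weights from the data)."""
    return sum(expert_weights[k] * (vals[k] - lo[k]) / (hi[k] - lo[k])
               for k in keys)

index = {n: round(vulnerability(v), 2) for n, v in indicators.items()}
print(index)  # barrio_A most vulnerable, barrio_B least
```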

  17. Regression to fuzziness method for estimation of remaining useful life in power plant components

    NASA Astrophysics Data System (ADS)

    Alamaniotis, Miltiadis; Grelle, Austin; Tsoukalas, Lefteri H.

    2014-10-01

Mitigation of severe accidents in power plants requires the reliable operation of all systems and the on-time replacement of mechanical components. Therefore, the continuous surveillance of power systems is a crucial concern for the overall safety, cost control, and on-time maintenance of a power plant. In this paper a methodology called regression to fuzziness is presented that estimates the remaining useful life (RUL) of power plant components. The RUL is defined as the difference between the time that a measurement was taken and the estimated failure time of that component. The methodology aims to compensate for a potential lack of historical data by modeling an expert's operational experience and expertise applied to the system. It initially identifies critical degradation parameters and their associated value range. Once completed, the operator's experience is modeled through fuzzy sets which span the entire parameter range. This model is then synergistically used with linear regression and a component's failure point to estimate the RUL. The proposed methodology is tested on estimating the RUL of a turbine (the basic electrical generating component of a power plant) in three different cases. Results demonstrate the benefits of the methodology for components for which operational data is not readily available and emphasize the significance of the selection of fuzzy sets and the effect of knowledge representation on the predicted output. To verify the effectiveness of the methodology, it was benchmarked against a data-based simple linear regression model, which was shown to perform equal to or worse than the presented methodology. Furthermore, the methodology comparison highlighted the improvement in estimation offered by the adoption of appropriate fuzzy sets for parameter representation.
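A minimal sketch of the core idea: fit a linear degradation trend to measurements, encode an expert's judgment as triangular fuzzy sets over the parameter range, and combine the two to project time-to-failure. All parameter names, membership functions, rates, and the failure threshold below are hypothetical, and the 50/50 blend is a deliberately crude stand-in for the paper's combination.

```python
# Hypothetical degradation history: (time in hours, wear parameter in mm).
times = [0, 100, 200, 300, 400]
wear  = [1.0, 1.6, 2.1, 2.7, 3.2]
FAILURE_LEVEL = 6.0  # assumed failure point for this parameter

def linfit(xs, ys):
    """Ordinary least-squares line; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

def tri(x, a, b, c):
    """Triangular membership function over [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Expert judgment encoded as fuzzy sets spanning the parameter range;
# each set carries the expert's degradation-rate estimate (mm/h).
expert_sets = [  # (label, a, b, c, expert rate)
    ("low",    0.0, 1.5, 3.0, 0.005),
    ("medium", 2.0, 3.5, 5.0, 0.007),
    ("high",   4.0, 5.5, 6.5, 0.010),
]

a0, a1 = linfit(times, wear)   # data-driven trend (slope = rate)
x_now = wear[-1]

# Fuzzy-weighted expert rate at the current wear level.
num = sum(tri(x_now, a, b, c) * r for _, a, b, c, r in expert_sets)
den = sum(tri(x_now, a, b, c) for _, a, b, c, _ in expert_sets)
# Simple 50/50 blend of expert and regression rates (illustrative only).
rate = 0.5 * (num / den) + 0.5 * a1 if den else a1

rul = (FAILURE_LEVEL - x_now) / rate
print(round(rul))  # 448
```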

  18. Rank-based estimation in the ℓ1-regularized partly linear model for censored outcomes with application to integrated analyses of clinical predictors and gene expression data.

    PubMed

    Johnson, Brent A

    2009-10-01

We consider estimation and variable selection in the partial linear model for censored data. The partial linear model for censored data is a direct extension of the accelerated failure time model, the latter of which is a very important alternative model to the proportional hazards model. We extend rank-based lasso-type estimators to a model that may contain nonlinear effects. Variable selection in such a partial linear model has direct application to high-dimensional survival analyses that attempt to adjust for clinical predictors. In the microarray setting, previous methods can adjust for other clinical predictors by assuming that clinical and gene expression data enter the model linearly in the same fashion. Here, we select important variables after adjusting for prognostic clinical variables, but the clinical effects are assumed nonlinear. Our estimator is based on stratification and can be extended naturally to account for multiple nonlinear effects. We illustrate the utility of our method through simulation studies and application to the Wisconsin prognostic breast cancer data set.

  19. Optimal operating rules definition in complex water resource systems combining fuzzy logic, expert criteria and stochastic programming

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2016-04-01

This contribution presents a methodology for defining optimal seasonal operating rules in multireservoir systems coupling expert criteria and stochastic optimization. Both sources of information are combined using fuzzy logic. The structure of the operating rules is defined based on expert criteria, via a joint expert-technician framework consisting of a series of meetings, workshops, and surveys carried out between reservoir managers and modelers. As a result, the decision-making process used by managers can be assessed and expressed using fuzzy logic: fuzzy rule-based systems are employed to represent the operating rules and fuzzy regression procedures are used for forecasting future inflows. Once this is done, a stochastic optimization algorithm can be used to define optimal decisions and transform them into fuzzy rules. Finally, the optimal fuzzy rules and the inflow prediction scheme are combined into a Decision Support System for making seasonal forecasts and simulating the effect of different alternatives in response to the initial system state and the foreseen inflows. The approach presented has been applied to the Jucar River Basin (Spain). Reservoir managers explained how the system is operated, taking into account the reservoirs' states at the beginning of the irrigation season and the inflows previewed during that season. According to the information given by them, the Jucar River Basin operating policies were expressed via two fuzzy rule-based (FRB) systems that estimate the amount of water to be allocated to the users and how the reservoir storages should be balanced to guarantee those deliveries. A stochastic optimization model using Stochastic Dual Dynamic Programming (SDDP) was developed to define optimal decisions, which are transformed into optimal operating rules by embedding them into the two FRBs previously created. As a benchmark, historical records are used to develop alternative operating rules. 
A fuzzy linear regression procedure was employed to foresee future inflows depending on present and past hydrological and meteorological variables actually used by the reservoir managers to define likely inflow scenarios. A Decision Support System (DSS) was created coupling the FRB systems and the inflow prediction scheme in order to give the user a set of possible optimal releases in response to the reservoir states at the beginning of the irrigation season and the fuzzy inflow projections made using hydrological and meteorological information. The results show that the optimal DSS created using the FRB operating policies is able to increase the amount of water allocated to the users by 20 to 50 Mm3 per irrigation season with respect to the current policies. Consequently, the mechanism used to define optimal operating rules and transform them into a DSS is able to increase the water deliveries in the Jucar River Basin, combining expert criteria and optimization algorithms in an efficient way. This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) and FEDER funds. It also has received funding from the European Union's Horizon 2020 research and innovation programme under the IMPREX project (grant agreement no: 641.811).

  20. Integration of perception and reasoning in fast neural modules

    NASA Technical Reports Server (NTRS)

    Fritz, David G.

    1989-01-01

    Artificial neural systems promise to integrate symbolic and sub-symbolic processing to achieve real time control of physical systems. Two potential alternatives exist. In one, neural nets can be used to front-end expert systems. The expert systems, in turn, are developed with varying degrees of parallelism, including their implementation in neural nets. In the other, rule-based reasoning and sensor data can be integrated within a single hybrid neural system. The hybrid system reacts as a unit to provide decisions (problem solutions) based on the simultaneous evaluation of data and rules. Discussed here is a model hybrid system based on the fuzzy cognitive map (FCM). The operation of the model is illustrated with the control of a hypothetical satellite that intelligently alters its attitude in space in response to an intersecting micrometeorite shower.

  1. A Dual Hesitant Fuzzy Multigranulation Rough Set over Two-Universe Model for Medical Diagnoses

    PubMed Central

    Zhang, Chao; Li, Deyu; Yan, Yan

    2015-01-01

In medical science, disease diagnosis is one of the difficult tasks for medical experts who are confronted with challenges in dealing with a lot of uncertain medical information. Moreover, different medical experts may interpret the medical knowledge base in ways that differ slightly from one another. Thus, to solve the problems of uncertain data analysis and group decision making in disease diagnoses, we propose a new rough set model called dual hesitant fuzzy multigranulation rough set over two universes by combining the dual hesitant fuzzy set and multigranulation rough set theories. In the framework of our study, both the definition and some basic properties of the proposed model are presented. Finally, we give a general approach which is applied to a decision making problem in disease diagnoses, and the effectiveness of the approach is demonstrated by a numerical example. PMID:26858772

  2. A prototype knowledge-based simulation support system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, T.R.; Roberts, S.D.

    1987-04-01

As a preliminary step toward the goal of an intelligent automated system for simulation modeling support, we explore the feasibility of the overall concept by generating and testing a prototypical framework. A prototype knowledge-based computer system was developed to support a senior level course in industrial engineering so that the overall feasibility of an expert simulation support system could be studied in a controlled and observable setting. The system behavior mimics the diagnostic (intelligent) process performed by the course instructor and teaching assistants, finding logical errors in INSIGHT simulation models and recommending appropriate corrective measures. The system was programmed in a non-procedural language (PROLOG) and designed to run interactively with students working on course homework and projects. The knowledge-based structure supports intelligent behavior, providing its users with access to an evolving accumulation of expert diagnostic knowledge. The non-procedural approach facilitates the maintenance of the system and helps merge the roles of expert and knowledge engineer by allowing new knowledge to be easily incorporated without regard to the existing flow of control. The background, features and design of the system are described and preliminary results are reported. Initial success is judged to demonstrate the utility of the reported approach and support the ultimate goal of an intelligent modeling system which can support simulation modelers outside the classroom environment. Finally, future extensions are suggested.

  3. Quantifying the conservation gains from shared access to linear infrastructure.

    PubMed

    Runge, Claire A; Tulloch, Ayesha I T; Gordon, Ascelin; Rhodes, Jonathan R

    2017-12-01

    The proliferation of linear infrastructure such as roads and railways is a major global driver of cumulative biodiversity loss. One strategy for reducing habitat loss associated with development is to encourage linear infrastructure providers and users to share infrastructure networks. We quantified the reductions in biodiversity impact and capital costs under linear infrastructure sharing of a range of potential mine to port transportation links for 47 mine locations operated by 28 separate companies in the Upper Spencer Gulf Region of South Australia. We mapped transport links based on least-cost pathways for different levels of linear-infrastructure sharing and used expert-elicited impacts of linear infrastructure to estimate the consequences for biodiversity. Capital costs were calculated based on estimates of construction costs, compensation payments, and transaction costs. We evaluated proposed mine-port links by comparing biodiversity impacts and capital costs across 3 scenarios: an independent scenario, where no infrastructure is shared; a restricted-access scenario, where the largest mining companies share infrastructure but exclude smaller mining companies from sharing; and a shared scenario where all mining companies share linear infrastructure. Fully shared development of linear infrastructure reduced overall biodiversity impacts by 76% and reduced capital costs by 64% compared with the independent scenario. However, there was considerable variation among companies. Our restricted-access scenario showed only modest biodiversity benefits relative to the independent scenario, indicating that reductions are likely to be limited if the dominant mining companies restrict access to infrastructure, which often occurs without policies that promote sharing of infrastructure. Our research helps illuminate the circumstances under which infrastructure sharing can minimize the biodiversity impacts of development. © 2017 The Authors. 
Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  4. Model Predictive Control of Type 1 Diabetes: An in Silico Trial

    PubMed Central

    Magni, Lalo; Raimondo, Davide M.; Bossi, Luca; Man, Chiara Dalla; De Nicolao, Giuseppe; Kovatchev, Boris; Cobelli, Claudio

    2007-01-01

Background The development of the artificial pancreas has received a new impulse from recent technological advancements in subcutaneous continuous glucose monitoring and subcutaneous insulin pump delivery systems. However, the availability of innovative sensors and actuators, although essential, does not guarantee optimal glycemic regulation. Closed-loop control of blood glucose levels still poses technological challenges to the automatic control expert, most notable of which are the inevitable time delays between glucose sensing and insulin actuation. Methods A new in silico model is exploited for both design and validation of a linear model predictive control (MPC) glucose control system. The starting point is a recently developed meal glucose–insulin model in health, which is modified to describe the metabolic dynamics of a person with type 1 diabetes mellitus. The population distribution of the model parameters, originally obtained in 204 healthy subjects, is modified to describe diabetic patients. Individual models of virtual patients are extracted from this distribution. A discrete-time MPC is designed for all the virtual patients from a unique input–output-linearized approximation of the full model based on the average population values of the parameters. The in silico trial simulates 4 consecutive days, during which the patient receives breakfast, lunch, and dinner each day. Results Provided that the regulator undergoes some individual tuning, satisfactory results are obtained even if the control design relies solely on the average patient model. Only the weight on the glucose concentration error needs to be tuned in a quite straightforward and intuitive way. The ability of the MPC to take advantage of meal announcement information is demonstrated. Imperfect knowledge of the amount of ingested glucose causes only marginal deterioration of performance. 
In general, MPC results in better regulation than proportional-integral-derivative control, significantly limiting the oscillation of glucose levels. Conclusions The proposed in silico trial shows the potential of MPC for artificial pancreas design. The main features are a capability to consider meal announcement information, delay compensation, and simplicity of tuning and implementation. PMID:19885152
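The receding-horizon idea, including the benefit of meal announcement, can be sketched on a scalar toy model. Everything below (the one-state dynamics, weights, horizon, and meal size) is invented for illustration and bears no relation to the paper's in silico simulator.

```python
import numpy as np

# Toy scalar model of glucose deviation from target:
#   g[k+1] = a*g[k] + b*u[k] + m[k]   (u = insulin input, m = announced meal)
a, b = 0.95, -0.5
N = 10          # prediction horizon
q, r = 1.0, 0.1 # weights on glucose error and insulin effort

def mpc_step(g, meals):
    """Minimise sum of q*g[k]^2 + r*u[k]^2 over the horizon, using the
    announced meals, and return only the first input (receding horizon)."""
    F = np.array([a ** (k + 1) for k in range(N)])  # free response of g
    G = np.zeros((N, N))                             # input response
    M = np.zeros((N, N))                             # meal response
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
            M[i, j] = a ** (i - j)
    free = F * g + M @ meals[:N]        # predicted g with u = 0
    # Unconstrained quadratic cost -> normal equations.
    u = np.linalg.solve(q * G.T @ G + r * np.eye(N), -q * G.T @ free)
    return u[0]

# Closed-loop run with one announced meal at step 20.
meal = np.zeros(80)
meal[20] = 8.0
g, trace = 0.0, []
for k in range(60):
    u = mpc_step(g, meal[k:])
    g = a * g + b * u + meal[k]
    trace.append(g)
print(max(abs(x) for x in trace[30:]))  # deviation settles after the meal
```

Because the meal enters the prediction as soon as it falls inside the horizon, the controller starts compensating before the disturbance actually hits, which is the meal-announcement advantage the abstract describes.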

  5. Take-the-best in expert-novice decision strategies for residential burglary.

    PubMed

    Garcia-Retamero, Rocio; Dhami, Mandeep K

    2009-02-01

    We examined the decision strategies and cue use of experts and novices in a consequential domain: crime. Three participant groups decided which of two residential properties was more likely to be burgled, on the basis of eight cues such as location of the property. The two expert groups were experienced burglars and police officers, and the novice group was composed of graduate students. We found that experts' choices were best predicted by a lexicographic heuristic strategy called take-the-best that implies noncompensatory information processing, whereas novices' choices were best predicted by a weighted additive linear strategy that implies compensatory processing. The two expert groups, however, differed in the cues they considered important in making their choices, and the police officers were actually more similar to novices in this regard. These findings extend the literature on judgment, decision making, and expertise, and have implications for criminal justice policy.
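The contrast between the two strategies can be made concrete in a few lines: take-the-best stops at the first discriminating cue, while the weighted additive rule lets many weaker cues outvote a strong one. Cue names, validities, and house profiles below are invented for illustration, not taken from the study.

```python
# Two properties described by binary cues (1 = cue present).
CUES = ["corner_location", "no_alarm", "hedges_conceal", "letterbox_full",
        "no_car_in_drive", "old_locks", "detached", "dark_at_night"]

# Cue validities (how often the cue discriminates correctly), descending.
VALIDITY = [0.90, 0.85, 0.80, 0.75, 0.70, 0.65, 0.60, 0.55]

house_a = [1, 0, 1, 0, 1, 0, 1, 0]
house_b = [0, 1, 1, 1, 0, 1, 0, 1]

def take_the_best(a, b, validities):
    """Inspect cues in order of validity; stop at the first one that
    discriminates (noncompensatory processing)."""
    for va, vb, _ in sorted(zip(a, b, validities), key=lambda t: -t[2]):
        if va != vb:
            return "A" if va > vb else "B"
    return "tie"

def weighted_additive(a, b, weights):
    """Sum weighted cue values for each option (compensatory processing)."""
    sa = sum(w * v for w, v in zip(weights, a))
    sb = sum(w * v for w, v in zip(weights, b))
    return "A" if sa > sb else "B" if sb > sa else "tie"

print(take_the_best(house_a, house_b, VALIDITY))      # "A": top cue decides
print(weighted_additive(house_a, house_b, VALIDITY))  # "B": many cues outvote it
```

On this deliberately constructed pair, the two strategies disagree, which is exactly the kind of case that lets choice data discriminate expert from novice processing.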

  6. Similarity and accuracy of mental models formed during nursing handovers: A concept mapping approach.

    PubMed

    Drach-Zahavy, Anat; Broyer, Chaya; Dagan, Efrat

    2017-09-01

Shared mental models are crucial for constructing mutual understanding of the patient's condition during a clinical handover. Yet, scant research, if any, has empirically explored mental models of the parties involved in a clinical handover. This study aimed to examine the similarities among mental models of incoming and outgoing nurses, and to test their accuracy by comparing them with mental models of expert nurses. A cross-sectional study explored nurses' mental models via the concept mapping technique across 40 clinical handovers. Data were collected via concept mapping of the incoming, outgoing, and expert nurses' mental models (total of 120 concept maps). Similarity and accuracy indexes for concepts and associations were calculated to compare the different maps. About one fifth of the concepts emerged in both outgoing and incoming nurses' concept maps (concept similarity=23%±10.6). Concept accuracy indexes were 35%±18.8 for incoming and 62%±19.6 for outgoing nurses' maps. Although incoming nurses absorbed fewer concepts and associations (23% and 12%, respectively), they partially closed the gap (35% and 22%, respectively) relative to expert nurses' maps. The correlations between concept similarities, and incoming as well as outgoing nurses' concept accuracy, were significant (r=0.43, p<0.01; r=0.68 p<0.01, respectively). Finally, in 90% of the maps, outgoing nurses added information concerning the processes enacted during the shift, beyond the expert nurses' gold standard. Two seemingly contradicting processes in the handover were identified. "Information loss", captured by the low similarity indexes among the mental models of incoming and outgoing nurses; and "information restoration", based on accuracy measures indexes among the mental models of the incoming nurses. Based on mental model theory, we propose possible explanations for these processes and derive implications for how to improve a clinical handover. Copyright © 2017 Elsevier Ltd. 
All rights reserved.

  7. Evolving MCDM Applications Using Hybrid Expert-Based ISM and DEMATEL Models: An Example of Sustainable Ecotourism

    PubMed Central

    Chuang, Huan-Ming

    2013-01-01

Ecological degradation is an escalating global threat. Increasingly, people are expressing awareness and priority for concerns about environmental problems surrounding them. Environmental protection issues are highlighted. An appropriate information technology tool, the growing popular social network system (virtual community, VC), facilitates public education and engagement with applications for existing problems effectively. Particularly, the exploration of related involvement behavior of VC member engagement is an interesting topic. Nevertheless, member engagement processes comprise interrelated sub-processes that reflect an interactive experience within VCs as well as the value co-creation model. To address the top-focused ecotourism VCs, this study presents an application of a hybrid expert-based ISM model and DEMATEL model based on multi-criteria decision making tools to investigate the complex multidimensional and dynamic nature of member engagement. Our research findings provide insightful managerial implications and suggest that the viral marketing of ecotourism protection is of concern to practitioners and academicians alike. PMID:24453902

  8. Evolving MCDM applications using hybrid expert-based ISM and DEMATEL models: an example of sustainable ecotourism.

    PubMed

    Chuang, Huan-Ming; Lin, Chien-Ku; Chen, Da-Ren; Chen, You-Shyang

    2013-01-01

Ecological degradation is an escalating global threat. Increasingly, people are expressing awareness and priority for concerns about environmental problems surrounding them. Environmental protection issues are highlighted. An appropriate information technology tool, the growing popular social network system (virtual community, VC), facilitates public education and engagement with applications for existing problems effectively. Particularly, the exploration of related involvement behavior of VC member engagement is an interesting topic. Nevertheless, member engagement processes comprise interrelated sub-processes that reflect an interactive experience within VCs as well as the value co-creation model. To address the top-focused ecotourism VCs, this study presents an application of a hybrid expert-based ISM model and DEMATEL model based on multi-criteria decision making tools to investigate the complex multidimensional and dynamic nature of member engagement. Our research findings provide insightful managerial implications and suggest that the viral marketing of ecotourism protection is of concern to practitioners and academicians alike.

  9. Design of Linear Control System for Wind Turbine Blade Fatigue Testing

    NASA Astrophysics Data System (ADS)

    Toft, Anders; Roe-Poulsen, Bjarke; Christiansen, Rasmus; Knudsen, Torben

    2016-09-01

This paper proposes a linear method for wind turbine blade fatigue testing at Siemens Wind Power. The setup consists of a blade, an actuator (motor and load mass) that acts on the blade with a sinusoidal moment, and a distribution of strain gauges to measure the blade flexure. Based on the frequency of the sinusoidal input, the blade will start oscillating with a given gain, hence the objective of the fatigue test is to make the blade oscillate with a controlled amplitude. The system currently in use is based on frequency control, which involves some non-linearities that make the system difficult to control. To make a linear controller, a different approach has been chosen, namely a controller that regulates not the input frequency but the input amplitude. A non-linear mechanical model for the blade and the motor has been constructed. This model has been simplified based on the desired output, namely the amplitude of the blade. Furthermore, the model has been linearised to make it suitable for linear analysis and control design methods. The controller is designed based on the simplified and linearised model, and its gain parameter is determined using pole placement. The model variants have been simulated in the MATLAB toolbox Simulink, which shows that the controller design based on the simple model performs adequately with the non-linear model. Moreover, the developed controller solves the robustness issue found in the existing solution and also reduces the energy needed for actuation, as it always operates at the blade eigenfrequency.
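Pole placement for a two-state linear model can be sketched with Ackermann's formula. The lightly damped oscillator below is an illustrative stand-in, not Siemens blade dynamics, and the target poles are arbitrary well-damped choices.

```python
import numpy as np

# Toy linearised model x' = Ax + Bu with states
# [amplitude error, its derivative]; numbers are illustrative only.
A = np.array([[0.0, 1.0],
              [-4.0, -0.4]])  # lightly damped oscillator
B = np.array([[0.0], [1.0]])

def place_2x2(A, B, poles):
    """Ackermann's formula for a SISO 2-state system:
    K = [0 1] * inv([B AB]) * (A - p1*I)(A - p2*I)."""
    C = np.hstack([B, A @ B])  # controllability matrix
    phi = (A - poles[0] * np.eye(2)) @ (A - poles[1] * np.eye(2))
    return np.array([[0.0, 1.0]]) @ np.linalg.inv(C) @ phi

K = place_2x2(A, B, poles=(-3.0, -5.0))  # well-damped targets
cl_eigs = np.linalg.eigvals(A - B @ K)
print(np.sort(cl_eigs.real))  # ≈ [-5, -3]
```

The same computation generalises to n states (Ackermann's formula with the last row of the inverse controllability matrix), which is the standard route to a single gain parameter of the kind the paper tunes.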

  10. An Application to the Prediction of LOD Change Based on General Regression Neural Network

    NASA Astrophysics Data System (ADS)

    Zhang, X. H.; Wang, Q. J.; Zhu, J. J.; Zhang, H.

    2011-07-01

Traditional prediction of the LOD (length of day) change was based on linear models, such as the least squares model and the autoregressive technique. Due to the complex non-linear features of the LOD variation, the performances of the linear model predictors are not fully satisfactory. This paper applies a non-linear neural network, the general regression neural network (GRNN) model, to forecast the LOD change, and the results are analyzed and compared with those obtained with the back propagation neural network and other models. The comparison shows that the performance of the GRNN model in the prediction of the LOD change is efficient and feasible.
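A GRNN is essentially Nadaraya-Watson kernel regression: a kernel-weighted average of training targets, which makes a small sketch easy. The series below is a synthetic stand-in for LOD-like non-linear variation, not the authors' data, and the bandwidth is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def grnn_predict(x_tr, y_tr, x_q, sigma):
    """GRNN prediction: Gaussian-kernel-weighted average of targets."""
    d2 = (x_q[:, None] - x_tr[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_tr) / w.sum(axis=1)

# Synthetic non-linear series (trend + periodic term + noise).
x = rng.uniform(0, 4, 300)
y = 0.2 * x + np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(300)

x_tr, y_tr = x[:250], y[:250]
x_te, y_te = x[250:], y[250:]

# Linear least-squares baseline (the "traditional" approach).
slope, intercept = np.polyfit(x_tr, y_tr, 1)
rmse_lin = np.sqrt(np.mean((slope * x_te + intercept - y_te) ** 2))
rmse_grnn = np.sqrt(np.mean((grnn_predict(x_tr, y_tr, x_te, 0.05) - y_te) ** 2))
print(rmse_lin, rmse_grnn)  # the GRNN tracks the non-linear structure
```

The single smoothing parameter sigma plays the role of the GRNN's spread factor; it is the only quantity to tune, which is part of the model's appeal for forecasting tasks like this.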

  11. The numerical modelling and process simulation for the fault diagnosis of rotary kiln incinerator.

    PubMed

    Roh, S D; Kim, S W; Cho, W S

    2001-10-01

The numerical modelling and process simulation for the fault diagnosis of a rotary kiln incinerator were accomplished. In the numerical modelling, the two models applied within the kiln are the combustion chamber model, including the mass and energy balance equations for two combustion chambers, and a 3D thermal model. The combustion chamber model predicts temperature within the kiln, flue gas composition, flux, and heat of combustion. Using the combustion chamber model and the 3D thermal model, the production rules for the process simulation can be obtained through interrelation analysis between control and operation variables. The process simulation of the kiln then runs automatically on these production rules. The process simulation aims to provide fundamental solutions to the problems in the incineration process by introducing an online expert control system to provide integrity in process control and management. Knowledge-based expert control systems use symbolic logic and heuristic rules to find solutions for various types of problems. The system was implemented as a hybrid intelligent expert control system, interconnected with the process control systems, with capabilities for process diagnosis, analysis, and control.

  12. An on-line expert system for diagnosing environmentally induced spacecraft anomalies using CLIPS

    NASA Technical Reports Server (NTRS)

    Lauriente, Michael; Rolincik, Mark; Koons, Harry C; Gorney, David

    1993-01-01

    A new rule-based expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over two hundred rules and provides links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information, and the system accepts answers with varying degrees of confidence, or 'unknown', to any question. The expert system not only provides scientists with needed risk analysis and confidence estimates not available in standard numerical models or databases, but it is also an effective learning tool. In addition, the architecture of the expert system allows easy additions to the knowledge base and the database. For example, new frames concerning orbital debris and ionospheric scintillation are being considered. The system currently runs on a MicroVAX and uses the C Language Integrated Production System (CLIPS).

  13. Waveform Design for Wireless Power Transfer

    NASA Astrophysics Data System (ADS)

    Clerckx, Bruno; Bayguzina, Ekaterina

    2016-12-01

    Far-field Wireless Power Transfer (WPT) has attracted significant attention in recent years. Despite the rapid progress, the emphasis of the research community in the last decade has remained largely concentrated on improving the design of the energy harvester (so-called rectenna) and has left aside the effect of transmitter design. In this paper, we study the design of the transmit waveform so as to enhance the DC power at the output of the rectenna. We derive a tractable model of the non-linearity of the rectenna and compare it with a linear model conventionally used in the literature. We then use those models to design novel multisine waveforms that are adaptive to the channel state information (CSI). Interestingly, while the linear model favours narrowband transmission with all the power allocated to a single frequency, the non-linear model favours a power allocation over multiple frequencies. Through realistic simulations, waveforms designed based on the non-linear model are shown to provide significant gains (in terms of harvested DC power) over those designed based on the linear model and over non-adaptive waveforms. We also compute analytically the theoretical scaling laws of the harvested energy for various waveforms as a function of the number of sinewaves and transmit antennas. Those scaling laws highlight the benefits of CSI knowledge at the transmitter in WPT and of a WPT design based on a non-linear rectenna model over a linear model. Results also motivate the study of a promising architecture relying on large-scale multisine multi-antenna waveforms for WPT. As a final note, results stress the importance of modeling and accounting for the non-linearity of the rectenna in any system design involving wireless power.
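The intuition that a non-linear rectenna model rewards spreading power over multiple tones can be sketched with a truncated-Taylor proxy for the rectified output, z ≈ k2·E[y²] + k4·E[y⁴]: the fourth-order term grows when in-phase tones add coherently. The constants `k2`, `k4`, the frequencies, and the power budget below are illustrative, not values from the paper.

```python
import numpy as np

def dc_proxy(amplitudes, freqs, k2=0.0034, k4=0.3829, n=200000):
    """Proxy for rectified DC output under a truncated Taylor diode model:
    z ~ k2*E[y^2] + k4*E[y^4] (k2, k4 are illustrative constants)."""
    t = np.linspace(0.0, 1.0, n, endpoint=False)
    y = sum(a * np.cos(2 * np.pi * f * t) for a, f in zip(amplitudes, freqs))
    return k2 * np.mean(y ** 2) + k4 * np.mean(y ** 4)

P = 1.0  # total transmit-power budget (arbitrary units)

# Single tone carrying all the power (what a linear model would favour)
z_single = dc_proxy([np.sqrt(2 * P)], [1.0])

# Eight in-phase tones sharing the same power budget
N = 8
amps = [np.sqrt(2 * P / N)] * N
z_multi = dc_proxy(amps, list(range(1, N + 1)))
```

Both waveforms have the same average power (equal k2 terms), so the multisine gain comes entirely from the fourth-order term, mirroring the scaling-law argument in the abstract.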

  14. Multi-Year Revenue and Expenditure Forecasting for Small Municipal Governments.

    DTIC Science & Technology

    1981-03-01

    Management Audit; Econometric Revenue Forecast; Gap and Impact Analysis; Deterministic Expenditure Forecast; Municipal Forecasting; Municipal Budget. ...together with a multi-year revenue and expenditure forecasting model for the City of Monterey, California. The Monterey model includes an econometric revenue forecast, together with forecasts based on expert judgment and trend analysis.

  15. A scan statistic for identifying optimal risk windows in vaccine safety studies using self-controlled case series design.

    PubMed

    Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M

    2013-08-30

    In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure RCTs provide little insight on the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.
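A stripped-down version of the core idea, scanning candidate risk windows and keeping the one that maximizes a two-rate likelihood, can be sketched as follows. This toy omits the within-individual dependence and the age/seasonality adjustments that the paper's scan statistic handles, and all rates and window lengths are invented for illustration.

```python
import math
import random

random.seed(42)

obs_days, true_window, n = 365, 14, 2000
base_rate, window_rate = 0.001, 0.05  # events per person-day (illustrative)

# Simulate event days: elevated rate inside the true post-vaccination window
event_days = []
for _ in range(n):
    for day in range(obs_days):
        rate = window_rate if day < true_window else base_rate
        if random.random() < rate:
            event_days.append(day)

def log_lik(k_in, t_in, k_out, t_out):
    """Poisson log-likelihood (up to constants) with separate in/out rates."""
    ll = 0.0
    if k_in > 0:
        ll += k_in * math.log(k_in / t_in)
    if k_out > 0:
        ll += k_out * math.log(k_out / t_out)
    return ll

candidates = [7, 14, 21, 28, 60]
scores = {}
for w in candidates:
    k_in = sum(1 for d in event_days if d < w)
    k_out = len(event_days) - k_in
    t_in, t_out = n * w, n * (obs_days - w)  # person-days at risk
    scores[w] = log_lik(k_in, t_in, k_out, t_out)

best_window = max(scores, key=scores.get)
```

Because the null likelihood is the same for every candidate window, maximizing the two-rate log-likelihood is equivalent to maximizing the likelihood ratio over windows.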

  16. Comparison of climate envelope models developed using expert-selected variables versus statistical selection

    USGS Publications Warehouse

    Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romañach, Stephanie; Watling, James I.; Mazzotti, Frank J.

    2017-01-01

    Climate envelope models are widely used to describe potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between these two approaches for contemporary predictions, and the spatial correlation, spatial overlap and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method, and there was low overlap in the variable sets (<40%) between the two methods. Despite these differences in variable sets (expert versus statistical), models had high performance metrics (>0.9 for area under the curve (AUC) and >0.7 for true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration between maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. Difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques. 
Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using statistical methods of variable selection is a useful first step, especially when there is a need to model a large number of species or expert knowledge of the species is limited. Expert input can then be used to refine models that seem unrealistic or for species that experts believe are particularly sensitive to change. It also emphasizes the importance of using multiple models to reduce uncertainty and improve map outputs for conservation planning. Where outputs overlap or show the same direction of change there is greater certainty in the predictions. Areas of disagreement can be used for learning by asking why the models do not agree, and may highlight areas where additional on-the-ground data collection could improve the models.

  17. A place for agent-based models. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    NASA Astrophysics Data System (ADS)

    Barbaro, Alethea

    2015-03-01

    Agent-based models have been widely applied in theoretical ecology to explain migrations and other collective animal movements [2,5,8]. As D'Orsogna and Perc have expertly highlighted in [6], the recent emergence of crime modeling has opened another interesting avenue for mathematical investigation. The area of crime modeling is particularly suited to agent-based models, because these models offer a great deal of flexibility and also ease communication among criminologists, law enforcement and modelers.

  18. A Model of Instructional Supervision That Meets Today's Needs.

    ERIC Educational Resources Information Center

    Beck, John J.; Seifert, Edward H.

    1983-01-01

    The proposed Instructional Technologist Model is based on a closed loop feedback system allowing for continuous monitoring of teachers by expert instructional technologists. Principals are thereby released for instructional evaluation and general educational management. (MJL)

  19. Estimation of group means when adjusting for covariates in generalized linear models.

    PubMed

    Qu, Yongming; Luo, Junxiang

    2015-01-01

    Generalized linear models are commonly used to analyze categorical data such as binary, count, and ordinal outcomes. Adjusting for important prognostic factors or baseline covariates in generalized linear models may improve the estimation efficiency. The model-based mean for a treatment group produced by most software packages estimates the response at the mean covariate, not the mean response for this treatment group in the studied population. Although this is not an issue for linear models, the model-based group mean estimates in generalized linear models could be seriously biased for the true group means. We propose a new method to estimate the group mean consistently, with the corresponding variance estimation. Simulations showed that the proposed method produces an unbiased estimator for the group means and provides the correct coverage probability. The proposed method was applied to analyze hypoglycemia data from clinical trials in diabetes. Copyright © 2014 John Wiley & Sons, Ltd.
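The gap between "response at the mean covariate" and "mean response" is a Jensen-type effect of the non-linear link, easy to demonstrate with a hypothetical fitted logistic model (the coefficients and covariate distribution below are invented for illustration, not from the paper):

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fitted logistic model for one treatment group:
# logit(p) = b0 + b1 * x, with a spread-out baseline covariate x
b0, b1 = -1.0, 2.0
xs = [random.gauss(0.0, 1.5) for _ in range(100000)]

# "Model-based mean" reported by many packages: response at the mean covariate
resp_at_mean_x = sigmoid(b0 + b1 * (sum(xs) / len(xs)))

# Population-averaged group mean: average the response over the covariate values
mean_response = sum(sigmoid(b0 + b1 * x) for x in xs) / len(xs)

gap = mean_response - resp_at_mean_x
```

For an identity link the two quantities coincide, which is why the bias only arises in non-linear generalized linear models.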

  20. Simulation Of Combat With An Expert System

    NASA Technical Reports Server (NTRS)

    Provenzano, J. P.

    1989-01-01

    Proposed expert system predicts outcomes of combat situations. Called COBRA (Combat Outcome Based on Rules for Attrition), the system selects rules for mathematical modeling of losses and discrete events in combat according to previous experiences. It is used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA is intended for simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, the knowledge-based system enables qualitative as well as quantitative simulations.

  1. Reduced-order model based feedback control of the modified Hasegawa-Wakatani model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goumiri, I. R.; Rowley, C. W.; Ma, Z.

    2013-04-15

    In this work, the development of model-based feedback control that stabilizes an unstable equilibrium is obtained for the Modified Hasegawa-Wakatani (MHW) equations, a classic model in plasma turbulence. First, a balanced truncation (a model reduction technique that has proven successful in flow control design problems) is applied to obtain a low dimensional model of the linearized MHW equations. Then, a model-based feedback controller is designed for the reduced order model using linear quadratic regulators. Finally, a linear quadratic Gaussian controller, which is more resistant to disturbances, is deduced. The controller is applied on the non-reduced, nonlinear MHW equations to stabilize the equilibrium and suppress the transition to drift-wave induced turbulence.
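The linear-quadratic-regulator step can be illustrated on a toy unstable linear system (not the MHW reduced-order model itself) using SciPy's continuous-time algebraic Riccati solver; the matrices below are invented for the sketch.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy stand-in for a reduced-order unstable linear model: xdot = A x + B u
A = np.array([[1.0, 1.0],
              [0.0, -1.0]])   # one unstable mode (eigenvalue +1)
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)                  # state weighting
R = np.array([[1.0]])          # control weighting

# LQR design: solve the continuous-time algebraic Riccati equation,
# then form the optimal state-feedback gain K = R^{-1} B^T P
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

closed_loop_eigs = np.linalg.eigvals(A - B @ K)
```

The same gain design applied to a balanced-truncation reduced model is what the abstract describes, with an observer added on top to obtain the LQG controller.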

  2. Ontology based decision system for breast cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Trabelsi Ben Ameur, Soumaya; Cloppet, Florence; Wendling, Laurent; Sellami, Dorra

    2018-04-01

    In this paper, we focus on analysis and diagnosis of breast masses inspired by expert concepts and rules. Accordingly, a Bag of Words is built based on the ontology of breast cancer diagnosis, accurately described in the Breast Imaging Reporting and Data System. To fill the gap between low-level knowledge and expert concepts, a semantic annotation is developed using a machine learning tool. Then, breast masses are classified as benign or malignant according to expert rules implicitly modeled with a set of classifiers (KNN, ANN, SVM and Decision Tree). This semantic context of analysis offers a frame where we can include external factors and other meta-knowledge, such as patient risk factors, as well as exploiting more than one modality. Based on MRI and DECEDM modalities, our developed system achieves a recognition rate of 99.7% with the Decision Tree, where an improvement of 24.7% is obtained owing to the semantic analysis.

  3. A knowledge-based support system for mechanical ventilation of the lungs. The KUSIVAR concept and prototype.

    PubMed

    Rudowski, R; Frostell, C; Gill, H

    1989-09-01

    KUSIVAR is an expert system for mechanical ventilation of adult patients suffering from respiratory insufficiency. Its main objective is to provide guidance in respirator management. The knowledge base includes both qualitative, rule-based knowledge and quantitative knowledge expressed in the form of mathematical models (expert control), which are used for prediction of arterial gas tensions and for optimization purposes. The system is data driven and uses a forward chaining mechanism for rule invocation. The interaction with the user will be performed in advisory, critiquing, semi-automatic and automatic modes. The system is at present in an advanced prototype stage. Prototyping is performed using KEE (Knowledge Engineering Environment) on a Sperry Explorer workstation. For further development and clinical use the expert system will be downloaded to an advanced PC. The system is intended to support therapy with a Siemens-Elema Servoventilator 900 C.

  4. Toward efficient biomechanical-based deformable image registration of lungs for image-guided radiotherapy

    NASA Astrophysics Data System (ADS)

    Al-Mayah, Adil; Moseley, Joanne; Velec, Mike; Brock, Kristy

    2011-08-01

    Both accuracy and efficiency are critical for the implementation of biomechanical model-based deformable registration in clinical practice. The focus of this investigation is to evaluate the potential of improving the efficiency of the deformable image registration of the human lungs without loss of accuracy. Three-dimensional finite element models have been developed using image data of 14 lung cancer patients. Each model consists of two lungs, tumor and external body. Sliding of the lungs inside the chest cavity is modeled using a frictionless surface-based contact model. The effect of the type of element, finite deformation and elasticity on the accuracy and computing time is investigated. Linear and quadratic tetrahedral elements are used with linear and nonlinear geometric analysis. Two types of material properties are applied, namely elastic and hyperelastic. The accuracy of each of the four models is examined using a number of anatomical landmarks representing the vessel bifurcation points distributed across the lungs. The registration error is not significantly affected by the element type or linearity of analysis, with an average vector error of around 2.8 mm. The displacement differences between linear and nonlinear analysis methods are calculated for all lung nodes, and a maximum value of 3.6 mm is found in one of the nodes near the entrance of the bronchial tree into the lungs. The 95th percentile of displacement difference ranges between 0.4 and 0.8 mm. However, the time required for the analysis is reduced from 95 min in the quadratic-element nonlinear-geometry model to 3.4 min in the linear-element linear-geometry model. Therefore, using linear tetrahedral elements with linear elastic materials and linear geometry is preferable for modeling the breathing motion of lungs for image-guided radiotherapy applications.

  5. Effect of Increased Intensity of Physiotherapy on Patient Outcomes After Stroke: An Economic Literature Review and Cost-Effectiveness Analysis

    PubMed Central

    Chan, B

    2015-01-01

    Background Functional improvements have been seen in stroke patients who have received an increased intensity of physiotherapy. This requires additional costs in the form of increased physiotherapist time. Objectives The objective of this economic analysis is to determine the cost-effectiveness of increasing the intensity of physiotherapy (duration and/or frequency) during inpatient rehabilitation after stroke, from the perspective of the Ontario Ministry of Health and Long-term Care. Data Sources The inputs for our economic evaluation were extracted from articles published in peer-reviewed journals and from reports from government sources or the Canadian Stroke Network. Where published data were not available, we sought expert opinion and used inputs based on the experts' estimates. Review Methods The primary outcome we considered was cost per quality-adjusted life-year (QALY). We also evaluated functional strength training because of its similarities to physiotherapy. We used a 2-state Markov model to evaluate the cost-effectiveness of functional strength training and increased physiotherapy intensity for stroke inpatient rehabilitation. The model had a lifetime timeframe with a 5% annual discount rate. We then used sensitivity analyses to evaluate uncertainty in the model inputs. Results We found that functional strength training and higher-intensity physiotherapy resulted in lower costs and improved outcomes over a lifetime. However, our sensitivity analyses revealed high levels of uncertainty in the model inputs, and therefore in the results. Limitations There is a high level of uncertainty in this analysis due to the uncertainty in model inputs, with some of the major inputs based on expert panel consensus or expert opinion. In addition, the utility outcomes were based on a clinical study conducted in the United Kingdom (i.e., 1 study only, and not in an Ontario or Canadian setting). 
Conclusions Functional strength training and higher-intensity physiotherapy may result in lower costs and improved health outcomes. However, these results should be interpreted with caution. PMID:26366241
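A minimal sketch of the kind of 2-state (alive/dead) Markov cost-effectiveness model with a 5% annual discount rate described above; every input below (mortality, utilities, costs, horizon) is a hypothetical placeholder, not a value from the review.

```python
# 2-state Markov cohort model: each year the cohort accrues discounted
# QALYs and costs while alive, then a fraction transitions to "dead".

def lifetime_outcomes(p_die, utility, annual_cost, upfront_cost,
                      years=40, discount=0.05):
    alive = 1.0
    qalys = costs = 0.0
    for year in range(years):
        d = 1.0 / (1.0 + discount) ** year
        qalys += alive * utility * d
        costs += alive * annual_cost * d
        alive *= (1.0 - p_die)
    return qalys, costs + upfront_cost

# Usual-care vs higher-intensity physiotherapy (hypothetical inputs):
# the higher-intensity arm costs more up front but yields a higher utility.
q_usual, c_usual = lifetime_outcomes(0.05, 0.60, 3000.0, 10000.0)
q_high, c_high = lifetime_outcomes(0.05, 0.65, 3000.0, 12000.0)

delta_qalys = q_high - q_usual
delta_costs = c_high - c_usual
icer = delta_costs / delta_qalys  # incremental cost per QALY gained
```

In the review's base case the higher-intensity arm dominated (lower costs, better outcomes); the sketch simply shows the mechanics of the lifetime-horizon, discounted comparison.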

  6. Replication of clinical innovations in multiple medical practices.

    PubMed

    Henley, N S; Pearce, J; Phillips, L A; Weir, S

    1998-11-01

    Many clinical innovations had been successfully developed and piloted in individual medical practice units of Kaiser Permanente in North Carolina during 1995 and 1996. Difficulty in replicating these clinical innovations consistently throughout all 21 medical practice units led to development of the interdisciplinary Clinical Innovation Implementation Team, which was formed by using existing resources from various departments across the region. REPLICATION MODEL: Based on a model of transfer of best practices, the implementation team developed a process and tools (master schedule and activity matrix) to quickly replicate successful pilot projects throughout all medical practice units. The process involved the following steps: identifying a practice and delineating its characteristics and measures (source identification); identifying a team to receive the (new) practice; piloting the practice; and standardizing, including the incorporation of learnings. The model includes the following components for each innovation: sending and receiving teams, an innovation coordinator role, an innovation expert role, a location expert role, a master schedule, and a project activity matrix. Communication depended on a partnership among the location experts (local knowledge and credibility), the innovation coordinator (process expertise), and the innovation experts (content expertise). Results after 12 months of working with the 21 medical practice units include integration of diabetes care team services into the practices, training of more than 120 providers in the use of personal computers and an icon-based clinical information system, and integration of a planwide self-care program into the medical practices--all with measurable improved outcomes. The model for sequential replication and the implementation team structure and function should be successful in other organizational settings.

  7. Assessment on EXPERT Descent and Landing System Aerodynamics

    NASA Astrophysics Data System (ADS)

    Wong, H.; Muylaert, J.; Northey, D.; Riley, D.

    2009-01-01

    EXPERT is a re-entry vehicle designed for validation of aero-thermodynamic models, numerical schemes in Computational Fluid Dynamics codes and test facilities for measuring flight data under an Earth re-entry environment. This paper addresses the design of the descent and landing sequence for EXPERT. It includes the descent sequence, the choice of drogue and main parachutes, and the parachute deployment condition, which can be supersonic or subsonic. The analysis is based mainly on an engineering tool, PASDA, together with some hand calculations for parachute sizing and design. The tool consists of a detailed 6-DoF simulation performed with the aerodynamics database of the vehicle, an empirical wakes model and the International Standard Atmosphere database. The aerodynamics database for the vehicle is generated by DNW experimental data and CFD codes within the framework of an ESA contract to CIRA. The analysis is presented in terms of altitude, velocity, accelerations, angle-of-attack, pitch angle and angle of rigging line. Discussion on the advantages and disadvantages of each parachute deployment condition is included, in addition to some comparison with the available data, based on a Monte-Carlo method, from a Russian company, FSUE NIIPS. The performance of EXPERT is shown to be strongly sensitive to wind speed. Supersonic deployment of the drogue shows better stability performance, at the expense of a larger G-load, than subsonic deployment. Further optimization of the parachute design is necessary in order to fulfill all the EXPERT specifications.

  8. Developing Expert System for Tuberculosis Diagnose to Support Knowledge Sharing in the Era of National Health Insurance System

    NASA Astrophysics Data System (ADS)

    Lidya, L.

    2017-03-01

    National Health Insurance has been implemented since 1 January 2014. A number of new policies have been established, including a multilevel referral system. The multilevel referral system classifies health care centers into three levels and determines that the flow of patient treatment should start from a first-level health care center. There are 144 kinds of diseases that must be treated at the first level, which mainly consists of general physicians. Unfortunately, the competence of physicians at the first level may not yet meet the competence standard. To improve physicians' knowledge, the government has created many events to accelerate knowledge sharing. However, this still needs time and many resources to give significant results. An expert system is a kind of software that provides consulting services to non-expert users in accordance with its area of expertise. It can improve the effectiveness and efficiency of knowledge sharing and learning. This research developed a model of a TB diagnosis expert system which complies with the standard procedure of TB diagnosis and regulation. The proposed expert system has the following characteristics: it provides a facility to manage multimedia clinical data, supports the complexity of TB diagnosis (combining rule-based and case-based expert systems), and offers an interactive interface, good usability, multi-platform support, and evolutionary development.

  9. Near-infrared chemical imaging (NIR-CI) as a process monitoring solution for a production line of roll compaction and tableting.

    PubMed

    Khorasani, Milad; Amigo, José M; Sun, Changquan Calvin; Bertelsen, Poul; Rantanen, Jukka

    2015-06-01

    In the present study, the application of near-infrared chemical imaging (NIR-CI) supported by chemometric modeling as a non-destructive tool for monitoring and assessing the roller compaction and tableting processes was investigated. Based on preliminary risk assessment, discussion with experts and current work from the literature, the critical process parameters (roll pressure and roll speed) and critical quality attributes (ribbon porosity, granule size, amount of fines, tablet tensile strength) were identified and a design space was established. Five experimental runs with different process settings were carried out, which yielded intermediates (ribbons, granules) and final products (tablets) with different properties. A principal component analysis (PCA) based model of the NIR images was applied to map the ribbon porosity distribution. The ribbon porosity distribution gained from the PCA-based NIR-CI was used to develop predictive models for granule size fractions. Predictive methods with acceptable R² values could be used to predict the granule particle size. A partial least squares regression (PLS-R) based model of the NIR-CI was used to map and predict the chemical distribution and content of the active compound for both roller-compacted ribbons and the corresponding tablets. In order to select the optimal process setting, the standard deviation of tablet tensile strength and tablet weight for each tablet batch was considered. Strong linear correlations were established between tablet tensile strength and the amount of fines, and between tablet tensile strength and granule size. These approaches are considered to have a potentially large impact on quality monitoring and control of continuously operating manufacturing lines, such as roller compaction and tableting processes. Copyright © 2015 Elsevier B.V. All rights reserved.
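The PCA-based mapping step can be sketched on synthetic data: spectra are generated so that a latent porosity-like factor drives the dominant variation, and the first principal component recovers it. Everything below (band shapes, noise level, factor range) is invented for illustration and is not the study's data or model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "NIR spectra": each pixel's spectrum is a fixed baseline plus a
# porosity-like factor scaling a characteristic band, plus measurement noise.
wavelengths = np.linspace(0.0, 1.0, 120)
baseline = np.exp(-(wavelengths - 0.3) ** 2 / 0.02)
band = np.exp(-(wavelengths - 0.7) ** 2 / 0.01)

n_pixels = 500
porosity = rng.uniform(0.1, 0.4, n_pixels)
spectra = (baseline[None, :]
           + porosity[:, None] * band[None, :]
           + 0.01 * rng.standard_normal((n_pixels, len(wavelengths))))

# PCA via SVD on mean-centred spectra
centred = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
pc1_scores = centred @ Vt[0]

# First-component scores should track the latent porosity-like factor
corr = np.corrcoef(pc1_scores, porosity)[0, 1]
```

Reshaping `pc1_scores` back onto the image grid would give the kind of porosity map described in the abstract.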

  10. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    USGS Publications Warehouse

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
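Why carrying the elicited cause-of-death probabilities through the analysis beats hard assignment can be shown with a minimal Monte Carlo sketch. This is far simpler than the paper's Bayesian hierarchical data-augmentation model, and the elicitation distribution below is invented for illustration.

```python
import random

random.seed(3)

# Each mortality event carries an elicited probability that death was due to
# cause A (versus cause B); here the true cause is generated from that
# probability, i.e. the observer's beliefs are well calibrated.
n = 100000
elicited = [random.uniform(0.3, 0.9) for _ in range(n)]
true_cause_a = [random.random() < p for p in elicited]

true_frac = sum(true_cause_a) / n

# Hard assignment: label each death with its single most likely cause
hard_frac = sum(1 for p in elicited if p > 0.5) / n

# Soft (augmentation-style) estimate: average the elicited probabilities
soft_frac = sum(elicited) / n

hard_err = abs(hard_frac - true_frac)
soft_err = abs(soft_frac - true_frac)
```

Hard assignment systematically inflates the majority cause whenever the elicited probabilities are asymmetric, which is the bias the data-augmentation framework avoids.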

  11. An exploration of partnership through interactions between young 'expert' patients with cystic fibrosis and healthcare professionals.

    PubMed

    MacDonald, Kath; Irvine, Lindesay; Smith, Margaret Coulter

    2015-12-01

    To explore how young 'expert patients' living with Cystic Fibrosis and the healthcare professionals with whom they interact perceive partnership and negotiate care. Modern healthcare policy encourages partnership, engagement and self-management of long-term conditions. This philosophy is congruent with the model adopted in the care of those with Cystic Fibrosis, where self-management, trust and mutual respect are perceived to be integral to the development of the ongoing patient/professional relationship. Self-management is associated with the term 'expert patient': an individual with a long-term condition whose knowledge and skills are valued and used in partnership with healthcare professionals. However, the term 'expert patient' is debated in the literature, as are the motivation for its use and the assumptions implicit in the term. A qualitative exploratory design informed by Interpretivism and Symbolic Interactionism was conducted. Thirty-four consultations were observed, and 23 semi-structured interviews were conducted with 10 patients, 2 carers and 12 healthcare professionals. Data were analysed thematically using the five stages of 'Framework', a matrix-based qualitative data analysis approach, and were subject to peer review and respondent validation. The study received full ethical approval. Three main themes emerged: experiences of partnership, attributes of the expert patient and constructions of illness. Sub-themes of the 'ceremonial order of the clinic', negotiation and trust in relationships, and perceptions of the expert patient are presented. The model of consultation may be a barrier to person-centred care. Healthcare professionals show leniency in negotiations, but do not always trust patients' accounts. The term 'expert patient' is unpopular and remains contested. Gaining insight into structures and processes that enable or inhibit partnership can lead to a collaborative approach to service redesign and a revision of the consultation model. 
© 2015 John Wiley & Sons Ltd.

  12. Aqueduct: a methodology to measure and communicate global water risks

    NASA Astrophysics Data System (ADS)

    Gassert, Francis; Reig, Paul

    2013-04-01

    The Aqueduct Water Risk Atlas (Aqueduct) is a publicly available, global database and interactive tool that maps indicators of water-related risks for decision makers worldwide. Aqueduct makes use of the latest geo-statistical modeling techniques to compute a composite index and translate the most recently available hydrological data into practical information on water-related risks for companies, investors, and governments alike. Twelve global indicators are grouped into a Water Risk Framework designed in response to growing concerns from private sector actors around water scarcity, water quality, climate change, and increasing demand for freshwater. The Aqueduct framework organizes indicators into three categories of risk that bring together multiple dimensions of water-related risk into comprehensive aggregated scores, and includes indicators of water stress, variability in supply, storage, flood, drought, groundwater, water quality, and social conflict, addressing both spatial and temporal variation in water hazards. Indicators are selected based on relevance to water users, availability and robustness of global data sources, and expert consultation, and are collected from existing datasets or derived from a Global Land Data Assimilation System (GLDAS) based integrated water balance model. Indicators are normalized using a threshold approach, and composite scores are computed using a linear aggregation scheme that allows for dynamic weighting to capture users' unique exposure to water hazards. By providing consistent scores across the globe, the Aqueduct Water Risk Atlas enables rapid comparison across diverse aspects of water risk. Companies can use this information to prioritize actions, investors to leverage financial interest to improve water management, and governments to engage with the private sector to seek solutions for more equitable and sustainable water governance.
The Aqueduct Water Risk Atlas enables practical applications of scientific data, helping non-expert audiences better understand and evaluate risks facing water users. This presentation will discuss the methodology used to combine the indicator values into aggregated risk scores and lessons learned from working with diverse audiences in academia, development institutions, and the public and private sectors.
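
    The two aggregation steps described above can be sketched in code: threshold-based normalization of raw indicator values onto a common ordinal scale, followed by a dynamically weighted linear combination. The indicator names, thresholds, scale, and weights below are illustrative assumptions, not Aqueduct's actual values.

```python
# Sketch of threshold normalization plus weighted linear aggregation.
# All indicator names, thresholds, and weights are invented for illustration.

def normalize(value, thresholds):
    """Map a raw indicator value onto a 0-5 score using ordered thresholds."""
    score = 0
    for t in thresholds:
        if value >= t:
            score += 1
    return min(score, 5)

def composite_score(scores, weights):
    """Weighted linear aggregation; weights reflect a user's exposure profile."""
    total_w = sum(weights.values())
    return sum(scores[k] * w for k, w in weights.items()) / total_w

raw = {"baseline_water_stress": 0.62, "drought_severity": 0.35}
thresholds = {"baseline_water_stress": [0.1, 0.2, 0.4, 0.8, 1.0],
              "drought_severity": [0.2, 0.4, 0.6, 0.8, 1.0]}
scores = {k: normalize(raw[k], thresholds[k]) for k in raw}
# A water-intensive user might weight stress more heavily than drought.
weights = {"baseline_water_stress": 2.0, "drought_severity": 1.0}
print(composite_score(scores, weights))
```

    Changing the weights dictionary re-scores the same normalized indicators, which is what allows the "dynamic weighting" for different user profiles.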

  13. The role of perceived impact on relationship quality in pharmacists' willingness to influence indication-based off-label prescribing decisions.

    PubMed

    Basak, Ramsankar; Bentley, John P; McCaffrey, David J; Bouldin, Alicia S; Banahan, Benjamin F

    2015-05-01

    Little is known about factors that affect pharmacists' roles in off-label prescribing. This study examined the effect of perceived impact on relationship quality (IRQ) on hospital pharmacists' willingness to influence a physician's decision regarding an indication-based off-label medication order (WTIP) (i.e., beyond FDA-approved indications) and the moderating roles of the appropriateness of the medication order and the relative expert power of the pharmacist. Pharmacists practicing in U.S. hospitals, recruited from membership rolls of state affiliates of the American Society of Health-System Pharmacists, were sent an electronic link to a questionnaire via their respective affiliates. A cross-sectional, randomized, 2 × 2 experimental design was used; participants were assigned to one of the indication-based off-label medication order scenarios. Relative expert power (i.e., power differential between the pharmacist and the physician) and appropriateness of the prescription were manipulated. Perceived IRQ was measured with multiple items. Pharmacists' WTIP in the scenario was the outcome variable. A total of 243 responses were included in multiple linear regression analyses. After controlling for dependence power, information power, communication effectiveness, perceived responsibility, and attitude, pharmacists' WTIP was negatively affected by perceived IRQ (estimate = -0.309, P < 0.05). This effect was more pronounced in groups exposed to the scenario where the pharmacist had lower relative expert power (estimate = -0.438, P < 0.05) and where the medication was less appropriate (estimate = -0.503, P < 0.05). Although pharmacists were willing to ensure the rationality of off-label prescribing, their WTIP was affected by a complex array of factors: the perceived impact of influence attempts on relationship quality between the pharmacist and the prescriber, the pharmacist's relative expert power, and the appropriateness of the off-label prescription.
Increasing pharmacists' expert power and collaboration with physicians and promoting pharmacists' multifaceted contribution, collaborative or independent, to patient care may facilitate pharmacist services in off-label pharmaceutical care. Published by Elsevier Ltd.

  14. EXPLICIT: a feasibility study of remote expert elicitation in health technology assessment.

    PubMed

    Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken

    2017-09-04

    Expert opinion is often sought to complement available information needed to inform model-based economic evaluations in health technology assessments. In this context, we define expert elicitation as the process of encoding expert opinion on a quantity of interest, together with associated uncertainty, as a probability distribution. When availability for face-to-face expert elicitation with a facilitator is limited, elicitation can be conducted remotely, overcoming challenges of finding an appropriate time to meet the expert and allowing access to experts situated too far away for practical face-to-face sessions. However, distance elicitation is associated with reduced response rates and limited assistance for the expert during the elicitation session. The aim of this study was to inform the development of a remote elicitation tool by exploring the influence of mode of elicitation on elicited beliefs. An Excel-based tool (EXPLICIT) was developed to assist the elicitation session, including the preparation of the expert and recording of their responses. General practitioners (GPs) were invited to provide expert opinion about population alcohol consumption behaviours. They were randomised to complete the elicitation by either a face-to-face meeting or email. EXPLICIT was used in the elicitation sessions for both arms. Fifteen GPs completed the elicitation session. Those conducted by email were longer than the face-to-face sessions (13 min 30 s vs 10 min 26 s, p = 0.1) and the email-elicited estimates contained less uncertainty. However, the resulting aggregated distributions were comparable. EXPLICIT was useful in both facilitating the elicitation task and in obtaining expert opinion from experts via email. The findings support the opinion that remote, self-administered elicitation is a viable approach within the constraints of HTA to inform policy making, although poor response rates may be observed and additional time for individual sessions may be required.
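
    The abstract defines elicitation as encoding an expert's opinion, with its uncertainty, as a probability distribution. One remote-friendly format for doing this is the "chips and bins" histogram, in which each expert spreads a fixed number of chips over value bins; the sketch below (with invented bins and chip counts, not data from the EXPLICIT study) normalizes two such histograms and aggregates them by equal-weight linear pooling.

```python
# Hypothetical "chips and bins" elicitation: bins, chip counts, and the
# quantity of interest are all invented for illustration.

BINS = [(0, 10), (10, 20), (20, 30), (30, 40)]  # e.g. % of a population

def to_distribution(chips):
    """Normalize chip counts into a probability distribution over BINS."""
    total = sum(chips)
    return [c / total for c in chips]

def linear_pool(dists):
    """Equal-weight linear opinion pool across experts."""
    n = len(dists)
    return [sum(d[i] for d in dists) / n for i in range(len(BINS))]

def mean_of(dist):
    """Expected value using bin midpoints."""
    return sum(p * (lo + hi) / 2 for p, (lo, hi) in zip(dist, BINS))

expert_a = to_distribution([2, 10, 6, 2])   # fairly certain, mode in 10-20
expert_b = to_distribution([0, 4, 12, 4])   # shifted higher
pooled = linear_pool([expert_a, expert_b])
print(round(mean_of(pooled), 2))
```

    Aggregating the individual distributions this way is one reason the study could find comparable pooled results even when individual email-elicited estimates carried less uncertainty.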

  15. An evaluation of selected (Q)SARs/expert systems for predicting skin sensitisation potential.

    PubMed

    Fitzpatrick, J M; Roberts, D W; Patlewicz, G

    2018-06-01

    Predictive testing to characterise substances for their skin sensitisation potential has historically been based on animal models such as the Local Lymph Node Assay (LLNA) and the Guinea Pig Maximisation Test (GPMT). In recent years, EU regulations have provided a strong incentive to develop non-animal alternatives, such as expert systems software. Here we selected three different types of expert systems: VEGA (statistical), Derek Nexus (knowledge-based) and TIMES-SS (hybrid), and evaluated their performance using two large sets of animal data: one set of 1249 substances from eChemportal and a second set of 515 substances from NICEATM. A model was considered successful at predicting skin sensitisation potential if it had at least the same balanced accuracy as the LLNA and the GPMT had in predicting the other outcomes, which ranged from 79% to 86%. We found that the highest balanced accuracy of any of the expert systems evaluated was 65% when making global predictions. For substances within the domain of TIMES-SS, however, balanced accuracies for the two datasets were found to be 79% and 82%. In those cases where a chemical was within the TIMES-SS domain, the TIMES-SS skin sensitisation hazard prediction had the same confidence as the result from LLNA or GPMT.

  16. Assessing the risk of Nipah virus establishment in Australian flying-foxes.

    PubMed

    Roche, S E; Costard, S; Meers, J; Field, H E; Breed, A C

    2015-07-01

    Nipah virus (NiV) is a recently emerged zoonotic virus that causes severe disease in humans. The reservoir hosts for NiV, bats of the genus Pteropus (known as flying-foxes), are found across the Asia-Pacific, including Australia. While NiV has not been detected in Australia, evidence for NiV infection has been found in flying-foxes in some of Australia's closest neighbours. A qualitative risk assessment was undertaken to assess the risk of NiV establishing in Australian flying-foxes through flying-fox movements from nearby regions. Events surrounding the emergence of new diseases are typically uncertain and in this study an expert opinion workshop was used to address gaps in knowledge. Given the difficulties in combining expert opinion, five different combination methods were analysed to assess their influence on the risk outcome. Under the baseline scenario where the median was used to combine opinions, the risk was estimated to be very low. However, this risk increased when the mean and linear opinion pooling combination methods were used. This assessment highlights the effects that different methods for combining expert opinion have on final risk estimates and the caution needed when interpreting these outcomes given the high degree of uncertainty in expert opinion. This work has provided a flexible model framework for assessing the risk of NiV establishment in Australian flying-foxes through bat movements which can be updated when new data become available.
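
    The sensitivity to the combination method can be illustrated in a few lines: expert probability estimates are often right-skewed, so the mean (and, similarly, an equal-weight linear opinion pool of the experts' full distributions) sits above the median. The estimates below are invented for illustration, not values from the workshop.

```python
# Invented expert estimates of an establishment probability, with one
# pessimistic outlier; shows why median- and mean-based combination differ.
import statistics

estimates = [0.001, 0.002, 0.002, 0.005, 0.08]

median_risk = statistics.median(estimates)  # the study's baseline scenario
mean_risk = statistics.mean(estimates)      # pulled upward by the outlier

print(median_risk, mean_risk)
```

    The median discards the outlier entirely, while the mean lets a single pessimistic expert raise the combined risk by an order of magnitude, which mirrors the sensitivity reported in the abstract.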

  17. Parameterising User Uptake in Economic Evaluations: The role of discrete choice experiments.

    PubMed

    Terris-Prestholt, Fern; Quaife, Matthew; Vickerman, Peter

    2016-02-01

    Model-based economic evaluations of new interventions have shown that user behaviour (uptake) is a critical driver of overall impact achieved. However, early economic evaluations, prior to introduction, often rely on assumed levels of uptake based on expert opinion or uptake of similar interventions. In addition to the likely uncertainty surrounding these uptake assumptions, they also do not allow for uptake to be a function of product, intervention, or user characteristics. This letter proposes using uptake projections from discrete choice experiments (DCE) to better parameterize uptake and substitution in cost-effectiveness models. A simple impact model is developed and illustrated using an example from the HIV prevention field in South Africa. Comparison between the conventional approach and the DCE-based approach shows that, in our example, DCE-based impact predictions varied by up to 50% from conventional estimates and provided far more nuanced projections. In the absence of observed uptake data and to model the effect of variations in intervention characteristics, DCE-based uptake predictions are likely to greatly improve models parameterizing uptake solely based on expert opinion. This is particularly important for global and national level decision making around introducing new and probably more expensive interventions, particularly where resources are most constrained. © 2016 The Authors. Health Economics published by John Wiley & Sons Ltd.

  18. A Method for Generating Reduced-Order Linear Models of Multidimensional Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Chicatelli, Amy; Hartley, Tom T.

    1998-01-01

    Simulation of high speed propulsion systems may be divided into two categories, nonlinear and linear. The nonlinear simulations are usually based on multidimensional computational fluid dynamics (CFD) methodologies and tend to provide high resolution results that show the fine detail of the flow. Consequently, these simulations are large, numerically intensive, and run much slower than real-time. The linear simulations are usually based on large lumping techniques that are linearized about a steady-state operating condition. These simplistic models often run at or near real-time but do not always capture the detailed dynamics of the plant. Under a grant sponsored by the NASA Lewis Research Center, Cleveland, Ohio, a new method has been developed that can be used to generate improved linear models for control design from multidimensional steady-state CFD results. This CFD-based linear modeling technique provides a small perturbation model that can be used for control applications and real-time simulations. It is important to note the utility of the modeling procedure: all that is needed to obtain a linear model of the propulsion system is the geometry and steady-state operating conditions from a multidimensional CFD simulation or experiment. This research represents a beginning step in establishing a bridge between the controls discipline and the CFD discipline so that the control engineer is able to effectively use multidimensional CFD results in control system design and analysis.
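
    The general idea of a small-perturbation model about a steady-state operating point can be sketched with a finite-difference linearization. The toy dynamics and operating point below are invented stand-ins for real CFD data; the abstract's actual procedure works from multidimensional CFD results, not a closed-form function.

```python
# Finite-difference linearization of a toy nonlinear system x' = f(x, u)
# about a steady state, yielding the small-perturbation model
# d(dx)/dt = A*dx + B*du. The function f is an illustrative stand-in.

def f(x, u):
    """Toy nonlinear dynamics; stands in for the CFD-derived model."""
    return -x**2 + u

def linearize(f, x0, u0, eps=1e-6):
    """Return (A, B) via central differences about the operating point."""
    A = (f(x0 + eps, u0) - f(x0 - eps, u0)) / (2 * eps)
    B = (f(x0, u0 + eps) - f(x0, u0 - eps)) / (2 * eps)
    return A, B

x0, u0 = 2.0, 4.0          # steady state: f(2, 4) = 0
A, B = linearize(f, x0, u0)
print(A, B)                 # analytically A = -2*x0 = -4, B = 1
```

    The resulting (A, B) pair is exactly the kind of compact linear model that runs in real time for control design, in contrast to the full nonlinear simulation.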

  19. A linear programming model for preserving privacy when disclosing patient spatial information for secondary purposes.

    PubMed

    Jung, Ho-Won; El Emam, Khaled

    2014-05-29

    A linear programming (LP) model was proposed to create de-identified data sets that maximally include spatial detail (e.g., geocodes such as ZIP or postal codes, census blocks, and locations on maps) while complying with the HIPAA Privacy Rule's Expert Determination method, i.e., ensuring that the risk of re-identification is very small. The LP model determines the transition probability from an original location of a patient to a new randomized location. However, it has a limitation for the cases of areas with a small population (e.g., median of 10 people in a ZIP code). We extend the previous LP model to accommodate the cases of a smaller population in some locations, while creating de-identified patient spatial data sets which ensure the risk of re-identification is very small. Our LP model was applied to a data set of 11,740 postal codes in the City of Ottawa, Canada. On this data set we demonstrated the limitations of the previous LP model, in that it produces improbable results, and showed how our extensions to deal with small areas allows the de-identification of the whole data set. The LP model described in this study can be used to de-identify geospatial information for areas with small populations with minimal distortion to postal codes. Our LP model can be extended to include other information, such as age and gender.
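
    A heavily simplified stand-in for this kind of LP can be written with SciPy (assumed installed): choose transition probabilities p[i][j] from true location i to disclosed location j that maximize the chance of keeping the original location, subject to each disclosed location having an expected candidate pool of at least k people. The two-location setup, populations, and k are invented for illustration and are far smaller than the paper's 11,740 postal codes.

```python
# Toy de-identification LP: variables are the entries of a 2x2 transition
# matrix P, flattened as x = [p00, p01, p10, p11]. Populations and k are
# invented; the objective maximizes p00 + p11 (location preservation).
from scipy.optimize import linprog

pop = [100, 5]   # location 1 has the small population the extension targets
k = 20           # minimum expected population behind a disclosed location

c = [-1, 0, 0, -1]            # maximize p00 + p11
A_eq = [[1, 1, 0, 0],         # row 0 of P sums to 1
        [0, 0, 1, 1]]         # row 1 of P sums to 1
b_eq = [1, 1]
A_ub = [[-pop[0], 0, -pop[1], 0],   # expected pool at disclosed loc 0 >= k
        [0, -pop[0], 0, -pop[1]]]   # expected pool at disclosed loc 1 >= k
b_ub = [-k, -k]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 4, method="highs")
p00, p01, p10, p11 = res.x
print(round(p00, 3), round(p01, 3), round(p11, 3))
```

    In the optimum, records from the large area are shifted into the small area just enough (p01 = 0.15) to make the small area's candidate pool acceptable, while everything else stays put, which is the "minimal distortion" behaviour the abstract describes.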

  20. Prompt comprehension in UNIX command production.

    PubMed

    Doane, S M; McNamara, D S; Kintsch, W; Polson, P G; Clawson, D M

    1992-07-01

    We hypothesize that a cognitive analysis based on the construction-integration theory of comprehension (Kintsch, 1988) can predict what is difficult about generating complex composite commands in the UNIX operating system. We provide empirical support for assumptions of the Doane, Kintsch, and Polson (1989, 1990) construction-integration model for generating complex commands in UNIX. We asked users whose UNIX experience varied to produce complex UNIX commands, and then provided help prompts whenever the commands that they produced were erroneous. The help prompts were designed to assist subjects with respect to both the knowledge and the memory processes that our UNIX modeling efforts have suggested are lacking in less expert users. It appears that experts respond to different prompts than do novices. First, expert performance is helped by the presentation of abstract information, whereas novice and intermediate performance is modified by the presentation of concrete information. Second, while the presentation of specific prompts helps less expert subjects, it does not provide sufficient information to obtain correct performance. Our analyses suggest that information about the ordering of commands is required to help the less expert with both knowledge and memory load problems in a manner consistent with skill acquisition theories.

  1. Biochemical methane potential prediction of plant biomasses: Comparing chemical composition versus near infrared methods and linear versus non-linear models.

    PubMed

    Godin, Bruno; Mayer, Frédéric; Agneessens, Richard; Gerin, Patrick; Dardenne, Pierre; Delfosse, Philippe; Delcarte, Jérôme

    2015-01-01

    The reliability of different models to predict the biochemical methane potential (BMP) of various plant biomasses using a multispecies dataset was compared. The most reliable prediction models of the BMP were those based on the near infrared (NIR) spectrum compared to those based on the chemical composition. The NIR predictions of local (specific regression and non-linear) models were able to estimate quantitatively, rapidly, cheaply and easily the BMP. Such a model could be further used for biomethanation plant management and optimization. The predictions of non-linear models were more reliable compared to those of linear models. The presentation form (green-dried, silage-dried and silage-wet form) of biomasses to the NIR spectrometer did not influence the performances of the NIR prediction models. The accuracy of the BMP method should be improved to enhance further the BMP prediction models. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Comparison of Natural Language Processing Rules-based and Machine-learning Systems to Identify Lumbar Spine Imaging Findings Related to Low Back Pain.

    PubMed

    Tan, W Katherine; Hassanpour, Saeed; Heagerty, Patrick J; Rundell, Sean D; Suri, Pradeep; Huhdanpaa, Hannu T; James, Kathryn; Carrell, David S; Langlotz, Curtis P; Organ, Nancy L; Meier, Eric N; Sherman, Karen J; Kallmes, David F; Luetmer, Patrick H; Griffith, Brent; Nerenz, David R; Jarvik, Jeffrey G

    2018-03-28

    To evaluate a natural language processing (NLP) system built with open-source tools for identification of lumbar spine imaging findings related to low back pain on magnetic resonance and x-ray radiology reports from four health systems. We used a limited data set (de-identified except for dates) sampled from lumbar spine imaging reports of a prospectively assembled cohort of adults. From N = 178,333 reports, we randomly selected N = 871 to form a reference-standard dataset, consisting of N = 413 x-ray reports and N = 458 MR reports. Using standardized criteria, four spine experts annotated the presence of 26 findings, where 71 reports were annotated by all four experts and 800 were each annotated by two experts. We calculated inter-rater agreement and finding prevalence from annotated data. We randomly split the annotated data into development (80%) and testing (20%) sets. We developed an NLP system from both rule-based and machine-learned models. We validated the system using accuracy metrics such as sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). The multirater annotated dataset achieved inter-rater agreement of Cohen's kappa > 0.60 (substantial agreement) for 25 of 26 findings, with finding prevalence ranging from 3% to 89%. In the testing sample, rule-based and machine-learned predictions both had comparable average specificity (0.97 and 0.95, respectively). The machine-learned approach had a higher average sensitivity (0.94, compared to 0.83 for rules-based), and a higher overall AUC (0.98, compared to 0.90 for rules-based). Our NLP system performed well in identifying the 26 lumbar spine findings, as benchmarked by reference-standard annotation by medical experts. Machine-learned models provided substantial gains in model sensitivity with slight loss of specificity, and overall higher AUC. Copyright © 2018 The Association of University Radiologists. All rights reserved.
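
    The validation metrics quoted above (sensitivity and specificity) reduce to simple counts over the reference-standard labels. This self-contained sketch uses made-up gold and predicted labels for a single finding, not data from the study.

```python
# Sensitivity and specificity from binary gold/predicted labels.
# The label vectors below are invented for illustration.

def sensitivity_specificity(gold, pred):
    tp = sum(g and p for g, p in zip(gold, pred))          # true positives
    tn = sum(not g and not p for g, p in zip(gold, pred))  # true negatives
    fn = sum(g and not p for g, p in zip(gold, pred))      # missed findings
    fp = sum(not g and p for g, p in zip(gold, pred))      # false alarms
    return tp / (tp + fn), tn / (tn + fp)

gold = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
sens, spec = sensitivity_specificity(gold, pred)
print(sens, spec)
```

    The trade-off reported in the abstract, higher sensitivity for the machine-learned models at a slight cost in specificity, is exactly a shift in the balance between the fn and fp counts above.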

  3. Improving the efficiency of a user-driven learning system with reconfigurable hardware. Application to DNA splicing.

    PubMed

    Lemoine, E; Merceron, D; Sallantin, J; Nguifo, E M

    1999-01-01

    This paper describes a new approach to problem solving by splitting up problem component parts between software and hardware. Our main idea arises from the combination of two previously published works. The first one proposed a conceptual environment of concept modelling in which the machine and the human expert interact. The second one reported an algorithm based on a reconfigurable hardware system which outperforms any previously published genetic database scanning hardware or algorithms. Here we show how efficient the interaction between the machine and the expert is when the concept modelling is based on a reconfigurable hardware system. Their cooperation is thus achieved at a real-time interaction speed. The designed system has been partially applied to the recognition of primate splice junction sites in genetic sequences.

  4. Progressive intervention strategy for the gait of sub-acute stroke patient using the International Classification of Functioning, Disability, and Health tool.

    PubMed

    Kang, Tae-Woo; Cynn, Heon-Seock

    2017-01-01

    The International Classification of Functioning, Disability, and Health (ICF) provides models for functions and disabilities. The ICF is presented as a frame that enables organizing physical therapists' clinical practice for application. The purpose of the present study was to describe processes through which stroke patients are assessed and treated based on the ICF model. The patient was a 65-year-old female diagnosed with right cerebral artery infarction with left hemiparesis. Progressive interventions were applied, such as those aiming at sitting and standing for the first two weeks, gait intervention for the third and fourth weeks, and those aiming at sitting from a standing position for the fifth and sixth weeks. The ICF model provides rehabilitation experts with a frame that enables them to accurately identify and understand their patients' problems. The ICF model helps the experts understand not only their patients' body structure, function, activity, and participation, but also their problems related to personal and environmental factors. The experts could efficiently make decisions and provide optimum treatment at clinics using the ICF model.

  5. Agent based reasoning for the non-linear stochastic models of long-range memory

    NASA Astrophysics Data System (ADS)

    Kononovicius, A.; Gontis, V.

    2012-02-01

    We extend Kirman's model by introducing a variable event time scale. The proposed flexible time scale is equivalent to the variable trading activity observed in financial markets. The stochastic version of the extended Kirman agent-based model is compared to the non-linear stochastic models of long-range memory in financial markets. The agent-based model, which provides a matching macroscopic description, serves as a microscopic justification of the earlier proposed stochastic model exhibiting power-law statistics.
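
    The underlying Kirman herding mechanism can be sketched in a few lines: N agents occupy one of two states, and a randomly chosen agent switches either spontaneously (rate epsilon) or by recruitment proportional to the size of the other group. The parameter values and the fixed event time step below are illustrative; the paper's extension is precisely to make the event time scale variable.

```python
# Minimal Kirman herding simulation with a fixed event time scale.
# Parameter values (eps, h, n, number of steps) are illustrative.
import random

def kirman_step(k, n, eps, h, rng):
    """One update: k agents in state A out of n; return the new k."""
    agent_in_a = rng.random() < k / n          # pick a random agent
    if agent_in_a:
        p_switch = eps + h * (n - k) / (n - 1)  # recruited by state B
        return k - 1 if rng.random() < p_switch else k
    p_switch = eps + h * k / (n - 1)            # recruited by state A
    return k + 1 if rng.random() < p_switch else k

rng = random.Random(42)
n, k = 100, 50
for _ in range(10000):
    k = kirman_step(k, n, eps=0.002, h=0.05, rng=rng)
print(0 <= k <= n)
```

    With small eps relative to h, the fraction k/n spends long stretches near the extremes before switching, the herding behaviour whose macroscopic limit matches the non-linear stochastic models discussed in the abstract.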

  6. Bad-breath: Perceptions and misconceptions of Nigerian adults.

    PubMed

    Nwhator, S O; Isiekwe, G I; Soroye, M O; Agbaje, M O

    2015-01-01

    To provide baseline data about bad-breath perception and misconceptions among Nigerian adults. Multi-center cross-sectional study of individuals aged 18-64 years using examiner-administered questionnaires. Age comparisons were based on the model of emerging adults versus full adults. Data were recoded for statistical analyses, and univariate and secondary log-linear statistics were applied. Participants had lopsided perceptions about bad-breath. While 730 (90.8%) identified the dentist as the expert on halitosis and 719 (89.4%) knew that bad-breath is not contagious, only 4.4% and 2.5% associated bad-breath with tooth decay and gum disease respectively. There were no significant sex differences but the older adults showed better knowledge in a few instances. Most respondents (747, 92.9%) would tell a spouse about their bad-breath and 683 (85%) would tell a friend. Participants had lopsided knowledge and perceptions about bad-breath. Most Nigerian adults are their "brothers' keepers" who would tell a spouse or friend about their halitosis so they could seek treatment.

  7. Comparing The Effectiveness of a90/95 Calculations (Preprint)

    DTIC Science & Technology

    2006-09-01

    Nachtsheim, John Neter, William Li, Applied Linear Statistical Models, 5th ed., McGraw-Hill/Irwin, 2005; Mood, Graybill and Boes, Introduction… …curves is based on methods that are only valid for ordinary linear regression. Requirements for a valid ordinary least-squares regression model: … linear. For example, … is a linear model; … is not. 2. Uniform variance (homoscedasticity)…

  8. Research on complex 3D tree modeling based on L-system

    NASA Astrophysics Data System (ADS)

    Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li

    2018-03-01

    L-system, as a fractal iterative system, can simulate complex geometric patterns. Based on field observation data of trees and the knowledge of forestry experts, this paper extracted modeling constraint rules and obtained an L-system rule set. Using the self-developed L-system modeling software, the rule set was parsed to generate complex 3D tree models. The results showed that the geometrical modeling method based on L-systems can be used to describe the morphological structure of complex trees and generate 3D tree models.
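
    The core L-system mechanism the paper builds on is iterated parallel rewriting of a start axiom under production rules. The rule below is a textbook branching-tree example, not the forestry-derived rule set from the paper.

```python
# Parallel string rewriting, the core of any L-system.
# 'F' = draw a branch segment, '[' / ']' = push/pop turtle state,
# '+' / '-' = turn; a renderer would interpret the final string.

def expand(axiom, rules, iterations):
    """Apply all production rules in parallel for the given number of steps."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

rules = {"F": "F[+F]F[-F]F"}   # classic branching rule
print(expand("F", rules, 2))
```

    Each iteration replaces every 'F' simultaneously, so the string (and the rendered tree's branching detail) grows geometrically with the iteration count.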

  9. A probabilistic maintenance model for diesel engines

    NASA Astrophysics Data System (ADS)

    Pathirana, Shan; Abeygunawardane, Saranga Kumudu

    2018-02-01

    In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on the practical model concepts discussed in the literature. The developed model is solved using real data obtained from inspection and maintenance histories of diesel engines and experts' views. Reliability indices and costs were calculated for the present maintenance policy of diesel engines. A sensitivity analysis is conducted to observe the effect of inspection-based preventive maintenance on the life cycle cost of diesel engines.

  10. Nonlinear rescaling of control values simplifies fuzzy control

    NASA Technical Reports Server (NTRS)

    Vanlangingham, H.; Tsoukkas, A.; Kreinovich, V.; Quintana, C.

    1993-01-01

    Traditional control theory is well-developed mainly for linear control situations. In non-linear cases there is no general method of generating a good control, so we have to rely on the ability of the experts (operators) to control them. If we want to automate their control, we must acquire their knowledge and translate it into a precise control strategy. The experts' knowledge is usually represented in non-numeric terms, namely, in terms of uncertain statements of the type 'if the obstacle is straight ahead, the distance to it is small, and the velocity of the car is medium, press the brakes hard'. Fuzzy control is a methodology that translates such statements into precise formulas for control. The necessary first step of this strategy consists of assigning membership functions to all the terms that the expert uses in his rules (in our sample phrase these words are 'small', 'medium', and 'hard'). The appropriate choice of a membership function can drastically improve the quality of a fuzzy control. In the simplest cases, we can take the functions whose domains have equally spaced endpoints. Because of that, many software packages for fuzzy control are based on this choice of membership functions. This choice is not very efficient in more complicated cases. Therefore, methods have been developed that use neural networks or genetic algorithms to 'tune' membership functions. But this tuning takes lots of time (for example, several thousand iterations are typical for neural networks). In some cases there are evident physical reasons why equally spaced domains do not work: e.g., if the control variable u is always positive (i.e., if we control temperature in a reactor), then negative values (that are generated by equal spacing) simply make no sense. In this case it sounds reasonable to choose another scale u' = f(u) to represent u, so that equal spacing will work fine for u'.
In the present paper we formulate the problem of finding the best rescaling function, solve this problem, and show (on a real-life example) that after an optimal rescaling, the un-tuned fuzzy control can be as good as the best state-of-the-art traditional non-linear controls.
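
    The rescaling idea above can be sketched concretely: for a strictly positive control variable u, equally spaced triangular membership functions placed on u' = log(u) avoid the meaningless negative endpoints that equal spacing on u itself would produce. The logarithmic choice of f, the term centers, and the half-width are illustrative assumptions, not the paper's optimal rescaling.

```python
# Equally spaced triangular membership functions in a rescaled coordinate
# u' = log(u), for a positive-only variable. Centers and widths are invented.
import math

def triangular(x, a, b, c):
    """Standard triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def memberships(u, centers, half_width):
    """Degrees of membership of u' = log(u) in equally spaced terms."""
    x = math.log(u)
    return [triangular(x, c - half_width, c, c + half_width) for c in centers]

# Terms 'low', 'medium', 'high' for a temperature-like variable u > 0.
centers = [0.0, 1.0, 2.0]   # equally spaced in the rescaled coordinate
degrees = memberships(math.e, centers, half_width=1.0)  # log(e) = 1
print(degrees)
```

    Because the spacing is uniform only in the rescaled coordinate, every membership function covers physically meaningful (positive) values of u, which is the point of choosing f before any tuning.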

  11. Relationship Between Time Consumption and Quality of Responses to Drug-related Queries: A Study From Seven Drug Information Centers in Scandinavia.

    PubMed

    Amundstuen Reppe, Linda; Lydersen, Stian; Schjøtt, Jan; Damkier, Per; Rolighed Christensen, Hanne; Peter Kampmann, Jens; Böttiger, Ylva; Spigset, Olav

    2016-07-01

    The aims of this study were to assess the quality of responses produced by drug information centers (DICs) in Scandinavia, and to study the association between the time consumed in processing queries and the quality of the responses. We posed six identical drug-related queries to seven DICs in Scandinavia, and the time consumption required for processing them was estimated. Clinical pharmacologists (internal experts) and general practitioners (external experts) reviewed responses individually. We used mixed model linear regression analyses to study the associations between time consumption on one hand and the summarized quality scores and the overall impression of the responses on the other hand. Both expert groups generally assessed the quality of the responses as "satisfactory" to "good." A few responses were criticized for being poorly synthesized and less relevant, of which none were quality-assured using co-signatures. For external experts, an increase in time consumption was statistically significantly associated with a decrease in common quality score (change in score, -0.20 per hour of work; 95% CI, -0.33 to -0.06; P = 0.004), and overall impression (change in score, -0.05 per hour of work; 95% CI, -0.08 to -0.01; P = 0.005). No such associations were found for the internal experts' assessment. To our knowledge, this is the first study of the association between time consumption and quality of responses to drug-related queries in DICs. The quality of the responses was in general good, but time consumption and quality were only weakly associated in this setting. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Utilizing Expert Knowledge in Estimating Future STS Costs

    NASA Technical Reports Server (NTRS)

    Fortner, David B.; Ruiz-Torres, Alex J.

    2004-01-01

    A method of estimating the costs of future space transportation systems (STSs) involves classical activity-based cost (ABC) modeling combined with systematic utilization of the knowledge and opinions of experts to extend the process-flow knowledge of existing systems to systems that involve new materials and/or new architectures. The expert knowledge is particularly helpful in filling gaps that arise in computational models of processes because of inconsistencies in historical cost data. Heretofore, the costs of planned STSs have been estimated following a "top-down" approach that tends to force the architectures of new systems to incorporate process flows like those of the space shuttles. In this ABC-based method, one makes assumptions about the processes, but otherwise follows a "bottom-up" approach that does not force the new system architecture to incorporate a space-shuttle-like process flow. Prototype software has been developed to implement this method. Through further development of software, it should be possible to extend the method beyond the space program to almost any setting in which there is a need to estimate the costs of a new system and to extend the applicable knowledge base in order to make the estimate.

  13. The Reliability Estimation for the Open Function of Cabin Door Affected by the Imprecise Judgment Corresponding to Distribution Hypothesis

    NASA Astrophysics Data System (ADS)

    Yu, Z. P.; Yue, Z. F.; Liu, W.

    2018-05-01

    With the development of artificial intelligence, more and more reliability experts have noticed the role of subjective information in the reliability design of complex systems. Based on a limited amount of experimental data together with expert judgments, we divide reliability estimation under a distribution hypothesis into a cognition process and a reliability calculation. To illustrate this modification, we take information fusion based on intuitionistic fuzzy belief functions as the diagnosis model of the cognition process, and complete the reliability estimation for the open function of a cabin door affected by imprecise judgments corresponding to the distribution hypothesis.

  14. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    NASA Astrophysics Data System (ADS)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

    Under the project SENSOR-IA, which received financial funding from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the optimization of a machining process in real time through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) that communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens rule model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in the SE. The tests were done with approximate rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database in order to extract more precise rules.

  15. A model based on temporal dynamics of fixations for distinguishing expert radiologists' scanpaths

    NASA Astrophysics Data System (ADS)

    Gandomkar, Ziba; Tay, Kevin; Brennan, Patrick C.; Mello-Thoms, Claudia

    2017-03-01

    This study investigated a model which distinguishes expert radiologists from less experienced radiologists based on features describing spatio-temporal dynamics of their eye movement during interpretation of digital mammograms. Eye movements of four expert and four less experienced radiologists were recorded during interpretation of 120 two-view digital mammograms of which 59 had biopsy proven cancers. For each scanpath, a two-dimensional recurrence plot, which represents the radiologist's refixation pattern, was generated. From each plot, six features indicating the spatio-temporal dynamics of fixations were extracted. The first feature measured the percentage of recurrent fixations; the second indicated the percentage of recurrent fixations which was fixated later in several consecutive fixations; the third measured the percentage of recurrent fixations that form a repeated sequence of fixations and the fourth assessed whether the recurrent fixations were occurring sequentially close together. The number of switches between the two mammographic views was also measured, as was the average number of consecutive fixations in each view before switching. These six features along with total time on case and average fixation duration were fed into a support vector machine whose performance was evaluated using 10-fold cross validation. The model achieved a sensitivity of 86.3% and a specificity of 85.2% for distinguishing experts' scanpaths. The obtained result suggests that spatio-temporal dynamics of eye movements can characterize expertise level and has potential applications for monitoring the development of expertise among radiologists as a result of different training regimes and continuing education schemes.
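    The first of the six features, the percentage of recurrent fixations, can be computed directly from a scanpath. A minimal sketch with a hypothetical recurrence radius and toy fixation coordinates (the remaining features and the SVM step are omitted):

```python
import numpy as np

def recurrence_rate(fixations, radius):
    """Fraction of fixation pairs closer than `radius` (the percentage of
    recurrent fixations, i.e., the density of points in the off-diagonal
    part of a recurrence plot).

    `fixations` is an (n, 2) array of gaze coordinates.
    """
    fx = np.asarray(fixations, dtype=float)
    d = np.linalg.norm(fx[:, None, :] - fx[None, :, :], axis=-1)
    n = len(fx)
    iu = np.triu_indices(n, k=1)  # pairs i < j only, excluding self-pairs
    return float(np.mean(d[iu] < radius))
```

    The other recurrence features (determinism, laminarity, etc.) are derived from diagonal and vertical line structures in the same boolean matrix.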

  16. Local hyperspectral data multisharpening based on linear/linear-quadratic nonnegative matrix factorization by integrating lidar data

    NASA Astrophysics Data System (ADS)

    Benhalouche, Fatima Zohra; Karoui, Moussa Sofiane; Deville, Yannick; Ouamri, Abdelaziz

    2015-10-01

    In this paper, a new Spectral-Unmixing-based approach, using Nonnegative Matrix Factorization (NMF), is proposed to locally multi-sharpen hyperspectral data by integrating a Digital Surface Model (DSM) obtained from LIDAR data. In this new approach, the nature of the local mixing model is detected by using the local variance of the object elevations. The hyper/multispectral images are explored using small zones. In each zone, the variance of the object elevations is calculated from the DSM data in this zone. This variance is compared to a threshold value and the adequate linear/linear-quadratic spectral unmixing technique is used in the considered zone to independently unmix hyperspectral and multispectral data, using an adequate linear/linear-quadratic NMF-based approach. The obtained spectral and spatial information thus respectively extracted from the hyper/multispectral images are then recombined in the considered zone, according to the selected mixing model. Experiments based on synthetic hyper/multispectral data are carried out to evaluate the performance of the proposed multi-sharpening approach and literature linear/linear-quadratic approaches used on the whole hyper/multispectral data. In these experiments, real DSM data are used to generate synthetic data containing linear and linear-quadratic mixed pixel zones. The DSM data are also used for locally detecting the nature of the mixing model in the proposed approach. Globally, the proposed approach yields good spatial and spectral fidelities for the multi-sharpened data and significantly outperforms the used literature methods.
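    The linear branch of the unmixing step factorizes each zone's data matrix into nonnegative endmember spectra and abundances. A minimal sketch of plain linear NMF with Lee-Seung multiplicative updates (the linear-quadratic variant, which augments the model with pairwise endmember products, is omitted):

```python
import numpy as np

def nmf(V, r, iters=500, seed=0):
    """Factorize nonnegative V (pixels x bands) as V ~ W @ H with
    W, H >= 0, via Lee-Seung multiplicative updates.

    Sketch of the linear mixing branch only.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + 0.1  # abundances
    H = rng.random((r, m)) + 0.1  # endmember spectra
    eps = 1e-12                   # avoid division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

    In the paper's pipeline the abundances from the multispectral image and the spectra from the hyperspectral image are recombined zone by zone.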

  17. A Bayesian Belief Network approach to assess the potential of non wood forest products for small scale forest owners

    NASA Astrophysics Data System (ADS)

    Vacik, Harald; Huber, Patrick; Hujala, Teppo; Kurtilla, Mikko; Wolfslehner, Bernhard

    2015-04-01

    It is an integral element of the European understanding of sustainable forest management to foster the design and marketing of forest products, non-wood forest products (NWFPs) and services that go beyond the production of timber. Despite the relevance of NWFPs in Europe, forest management and planning methods have been traditionally tailored towards wood and wood products, because most forest management models and silviculture techniques were developed to ensure a sustained production of timber. Although several approaches exist which explicitly consider NWFPs as management objectives in forest planning, specific models are needed for the assessment of their production potential in different environmental contexts and for different management regimes. Empirical data supporting a comprehensive assessment of the potential of NWFPs are rare, thus making development of statistical models particularly problematic. However, the complex causal relationships between the sustained production of NWFPs, the available ecological resources, as well as the organizational and the market potential of forest management regimes are well suited for knowledge-based expert models. Bayesian belief networks (BBNs) are a kind of probabilistic graphical model that have become very popular among practitioners and scientists, mainly due to the powerful probability theory involved, which makes BBNs suitable for dealing with a wide range of environmental problems. In this contribution we present the development of a Bayesian belief network to assess the potential of NWFPs for small scale forest owners. A three stage iterative process with stakeholder and expert participation was used to develop the Bayesian Network within the frame of the StarTree Project. The group of participants varied in the stages of the modelling process.
    A core team, consisting of one technical expert and two domain experts, was responsible for the entire modelling process as well as for the first prototype of the network structure, including nodes and relationships. A top-level causal network was further decomposed into sub-level networks. Stakeholder participation including a group of experts from different related subject areas was used in model verification and validation. We demonstrate that BBNs can be used to transfer expert knowledge from science to practice and thus have the ability to contribute to improved problem understanding of non-expert decision makers for a sustainable production of NWFPs.
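    The kind of inference such a network supports can be illustrated with a toy two-parent fragment. The node names, priors and conditional probabilities below are invented for illustration; they are not StarTree values:

```python
# Hypothetical priors over two parent nodes.
p_resource = {"good": 0.6, "poor": 0.4}
p_market = {"strong": 0.5, "weak": 0.5}
# Hypothetical CPT: P(NWFP potential = high | resource, market).
p_high = {
    ("good", "strong"): 0.9,
    ("good", "weak"):   0.5,
    ("poor", "strong"): 0.4,
    ("poor", "weak"):   0.1,
}

def p_potential_high(resource=None, market=None):
    """Exact inference by enumeration: marginalize unobserved parents,
    condition on any observed ones."""
    num = 0.0
    den = 0.0
    for r, pr in p_resource.items():
        if resource is not None and r != resource:
            continue
        for m, pm in p_market.items():
            if market is not None and m != market:
                continue
            w = pr * pm
            num += w * p_high[(r, m)]
            den += w
    return num / den
```

    Real BBN tools use the same semantics with more efficient propagation algorithms; the elicitation process described above supplies the CPT entries.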

  18. Nonlinearity measure and internal model control based linearization in anti-windup design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perev, Kamen

    2013-12-18

    This paper considers the problem of internal model control based linearization in anti-windup design. The nonlinearity measure concept is used for quantifying the control system degree of nonlinearity. The linearizing effect of a modified internal model control structure is presented by comparing the nonlinearity measures of the open-loop and closed-loop systems. It is shown that the linearization properties are improved by increasing the control system local feedback gain. However, it is emphasized that at the same time the stability of the system deteriorates. The conflicting goals of stability and linearization are resolved by solving the design problem in different frequency ranges.
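    A nonlinearity measure quantifies the distance of a system from its best linear approximation. The paper works with dynamic operators over signal spaces; the simplest static, discretized analogue can be sketched as:

```python
import numpy as np

def nonlinearity_measure(f, inputs):
    """Relative distance of a static map `f` from its best least-squares
    linear approximation over a grid of inputs.

    A static, discrete analogue of the operator-norm definition used for
    dynamic systems; zero for a linear map, positive otherwise.
    """
    u = np.asarray(inputs, dtype=float)
    y = np.array([f(v) for v in u])
    g = float(u @ y / (u @ u))  # best linear gain in least-squares sense
    return float(np.linalg.norm(y - g * u) / np.linalg.norm(y))
```

    In the anti-windup setting, comparing this measure for the open-loop and closed-loop maps shows the linearizing effect of the internal model control structure.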

  19. 29 CFR 18.703 - Bases of opinion testimony by experts.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Bases of opinion testimony by experts. The facts or data in the particular case upon which an expert bases an opinion or inference may be those perceived by or made known to the expert at or before the... 29 Labor 1 2010-07-01 2010-07-01 true Bases of opinion testimony by experts. 18.703 Section 18.703...

  20. 29 CFR 18.703 - Bases of opinion testimony by experts.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 1 2011-07-01 2011-07-01 false Bases of opinion testimony by experts. 18.703 Section 18... Bases of opinion testimony by experts. The facts or data in the particular case upon which an expert bases an opinion or inference may be those perceived by or made known to the expert at or before the...

  1. 29 CFR 18.703 - Bases of opinion testimony by experts.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 1 2013-07-01 2013-07-01 false Bases of opinion testimony by experts. 18.703 Section 18... Bases of opinion testimony by experts. The facts or data in the particular case upon which an expert bases an opinion or inference may be those perceived by or made known to the expert at or before the...

  2. 29 CFR 18.703 - Bases of opinion testimony by experts.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 1 2012-07-01 2012-07-01 false Bases of opinion testimony by experts. 18.703 Section 18... Bases of opinion testimony by experts. The facts or data in the particular case upon which an expert bases an opinion or inference may be those perceived by or made known to the expert at or before the...

  3. Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation.

    NASA Astrophysics Data System (ADS)

    Lawley, Russell; Barron, Mark; Lee, Katy

    2014-05-01

    R. Lawley, M. Barron and K. Lee, NERC - British Geological Survey, Environmental Science Centre, Keyworth, Nottingham, UK, NG12 5GG. The boundaries mapped in traditional field geological survey are subject to a wide range of inherent uncertainties. A map at a survey-scale of 1:10,000 is created by a combination of terrain interpretation, direct observations from boreholes and exposures (often sparsely distributed), and indirect interpretation of proxy variables such as soil properties, vegetation and remotely sensed images. A critical factor influencing the quality of the final map is the skill and experience of the surveyor to bring this information together in a coherent conceptual model. The users of geological data comprising or based on mapped boundaries are increasingly aware of these uncertainties, and want to know how to manage them. The growth of 3D modelling, which takes 2D surveys as a starting point, adds urgency to the need for a better understanding of survey uncertainties; particularly where 2D mapping of variable vintage has been compiled into a national coverage. Previous attempts to apply confidence on the basis of metrics such as data density, survey age or survey techniques have proved useful for isolating single, critical, factors but do not generally succeed in evaluating geological mapping 'in the round', because they cannot account for the 'conceptual' skill set of the surveyor. The British Geological Survey (BGS) is using expert elicitation methods to gain a better understanding of uncertainties within the national geological map of Great Britain. The expert elicitation approach starts with the assumption that experienced surveyors have an intuitive sense of the uncertainty of the boundaries that they map, based on a tacit model of geology and its complexity and the nature of the surveying process.
The objective of elicitation is to extract this model in a useable, quantitative, form by a robust and transparent procedure. At BGS expert elicitation is being used to evaluate the uncertainty of mapped boundaries in different common mapping scenarios, with a view to building a 'collective' understanding of the challenges each scenario presents. For example, a 'sharp contact (at surface) between highly contrasting sedimentary rocks' represents one level of survey challenge that should be accurately met by all surveyors, even novices. In contrast, a 'transitional boundary defined by localised facies-variation' may require much more experience to resolve (without recourse to significantly more sampling). We will describe the initial phase of this exercise in which uncertainty models were elicited for mapped boundaries in six contrasting scenarios. Each scenario was presented to a panel of experts with varied expertise and career history. In five cases it was possible to arrive at a consensus model, in a sixth case experts with different experience took different views of the nature of the mapping problem. We will discuss our experience of the use of elicitation methodology and the implications of our results for further work at the BGS to quantify uncertainty in map products. In particular we will consider the value of elicitation as a means to capture the expertise of individuals as they retire, and as the composition of the organization's staff changes in response to the management and policy decisions.

  4. USING LINEAR AND POLYNOMIAL MODELS TO EXAMINE THE ENVIRONMENTAL STABILITY OF VIRUSES

    EPA Science Inventory

    The article presents the development of model equations for describing the fate of viral infectivity in environmental samples. Most of the models were based upon the use of a two-step linear regression approach. The first step employs regression of log base 10 transformed viral t...
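    The first regression step described above (fitting log base 10 transformed titers against time) can be sketched on hypothetical data:

```python
import numpy as np

def decay_rate(days, titer):
    """Step one of the two-step approach: regress log10 titer on time.

    The slope is the inactivation rate in log10 units per day; titer
    values here would come from environmental sample assays.
    """
    slope, intercept = np.polyfit(days, np.log10(titer), 1)
    return slope, intercept
```

    The second step of such approaches typically regresses the fitted rates against environmental variables (e.g., temperature) across samples.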

  5. Stereotactic radiosurgery for intracranial metastases: linac-based and gamma-dedicated unit approach.

    PubMed

    Alongi, Filippo; Fiorentino, Alba; Mancosu, Pietro; Navarria, Pierina; Giaj Levra, Niccolò; Mazzola, Rosario; Scorsetti, Marta

    2016-07-01

    For intracranial metastases, the role of stereotactic radiosurgery (SRS) or fractionated stereotactic radiotherapy is well recognized. Historically, the first stereotactic device able to irradiate a brain tumor volume was the Gamma Knife® (GK). With the technological advancement of the linear accelerator (Linac), interest in Linac-based SRS applications has grown continuously. For decades, GK was assumed to be superior to Linac-based SRS for brain tumors in terms of dose conformity and rapid dose fall-off close to the target. Expert commentary: Recently, owing to Linac technological advances, the choice of GK-based SRS is no longer so exclusive. The current review discusses in detail the technical and clinical aspects of the two approaches for brain metastases.

  6. Modeling with Young Students--Quantitative and Qualitative.

    ERIC Educational Resources Information Center

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  7. A Model for Effective Teaching and Learning in Research Methods.

    ERIC Educational Resources Information Center

    Poindexter, Paula M.

    1998-01-01

    Proposes a teaching model for making research relevant. Presents a case study of the model as used in advertising and public relations research classes. Notes that the model consists of a knowledge base, team process, a realistic goal-oriented experience, self-management, expert consultation, and evaluation and synthesis. Discusses resulting…

  8. Value of biologic therapy: a forecasting model in three disease areas.

    PubMed

    Paramore, L Clark; Hunter, Craig A; Luce, Bryan R; Nordyke, Robert J; Halbert, R J

    2010-01-01

    Forecast the return on investment (ROI) for advances in biologic therapies in years 2015 and 2030, based upon impact on disease prevalence, morbidity, and mortality for asthma, diabetes, and colorectal cancer. A deterministic, spreadsheet-based, forecasting model was developed based on trends in demographics and disease epidemiology. 'Return' was defined as reductions in disease burden (prevalence, morbidity, mortality) translated into monetary terms; 'investment' was defined as the incremental costs of biologic therapy advances. Data on disease prevalence, morbidity, mortality, and associated costs were obtained from government survey statistics or published literature. Expected impact of advances in biologic therapies was based on expert opinion. Gains in quality-adjusted life years (QALYs) were valued at $100,000 per QALY. The base case analysis, in which reductions in disease prevalence and mortality predicted by the expert panel are not considered, shows the resulting ROIs remain positive for asthma and diabetes but fall below $1 for colorectal cancer. Analysis involving expert panel predictions indicated positive ROI results for all three diseases at both time points, ranging from $207 for each incremental dollar spent on biologic therapies to treat asthma in 2030, to $4 for each incremental dollar spent on biologic therapies to treat colorectal cancer in 2015. If QALYs are not considered, the resulting ROIs remain positive for all three diseases at both time points. Society may expect substantial returns from investments in innovative biologic therapies. These benefits are most likely to be realized in an environment of appropriate use of new molecules. The potential variance between forecasted (from expert opinion) and actual future health outcomes could be significant. Similarly, the forecasted growth in use of biologic therapies relied upon unvalidated market forecasts.
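    The ROI arithmetic reduces to monetized burden reduction divided by incremental spending. A sketch with invented figures (not the study's inputs):

```python
def roi(qaly_gain, value_per_qaly, cost_savings, incremental_cost):
    """Return per incremental dollar: monetized burden reduction
    (QALYs valued in dollars, plus direct cost savings) divided by the
    added spending on biologic therapies.
    """
    return (qaly_gain * value_per_qaly + cost_savings) / incremental_cost

# Hypothetical example: 100 QALYs gained at $100,000/QALY, $2M direct
# savings, against $3M of incremental biologic spending.
example = roi(100.0, 100000.0, 2000000.0, 3000000.0)  # dollars returned per dollar
```

    The study's deterministic model applies this logic per disease and per forecast year, with expert-elicited effect sizes.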

  9. Markovian negentropies in bioinformatics. 1. A picture of footprints after the interaction of the HIV-1 Psi-RNA packaging region with drugs.

    PubMed

    Díaz, Humberto González; de Armas, Ronal Ramos; Molina, Reinaldo

    2003-11-01

    Many experts worldwide have highlighted the potential of RNA molecules as drug targets for the chemotherapeutic treatment of a range of diseases. In particular, the molecular pockets of RNA in the HIV-1 packaging region have been postulated as promising sites for antiviral action. The discovery of simpler methods to accurately represent drug-RNA interactions could therefore become an interesting and rapid way to generate models that are complementary to docking-based systems. The entropies of a vibrational Markov chain have been introduced here as physically meaningful descriptors for the local drug-nucleic acid complexes. A study of the interaction of the antibiotic Paromomycin with the packaging region of the RNA present in type-1 HIV has been carried out as an illustrative example of this approach. A linear discriminant function gave rise to excellent discrimination among 80.13% of interacting/non-interacting sites. More specifically, the model classified 36/45 nucleotides (80.0%) that interacted with paromomycin and, in addition, 85/106 (80.2%) footprinted (non-interacting) sites from the RNA viral sequence were recognized. The model showed a high Matthews' correlation coefficient (C = 0.64). The Jackknife method was also used to assess the stability and predictability of the model by leaving out adenines, C, G, or U. Matthews' coefficients and overall accuracies for these approaches were between 0.55 and 0.68 and 75.8 and 82.7, respectively. On the other hand, a linear regression model predicted the local binding affinity constants between a specific nucleotide and the aforementioned antibiotic (R2 = 0.83, Q2 = 0.825). These kinds of models may play an important role either in the discovery of new anti-HIV compounds or in the elucidation of their mode of action. On request from the corresponding author (humbertogd@cbq.uclv.edu.cu or humbertogd@navegalia.com).

  10. Workshop on Biological Integrity of Coral Reefs August 21-22 ...

    EPA Pesticide Factsheets

    This report summarizes an EPA-sponsored workshop on coral reef biological integrity held at the Caribbean Coral Reef Institute in La Parguera, Puerto Rico on August 21-22, 2012. The goals of this workshop were to:
    • Identify key qualitative and quantitative ecological characteristics (reef attributes) that determine the condition of linear coral reefs inhabiting shallow waters (<12 m) in southwestern Puerto Rico.
    • Use those reef attributes to recommend categorical condition rankings for establishing a biological condition gradient.
    • Ascertain through expert consensus those reef attributes that characterize biological integrity (a natural, fully-functioning system of organisms and communities) for coral reefs.
    • Develop a conceptual, narrative model that describes how biological attributes of coral reefs change along a gradient of increasing anthropogenic stress.
    The workshop brought together scientists with expertise in coral reef taxonomic groups (e.g., stony corals, fishes, sponges, gorgonians, algae, seagrasses and macroinvertebrates), as well as community structure, organism condition, ecosystem function and ecosystem connectivity. The experts evaluated photos and videos from 12 stations collected during EPA Coral Reef surveys (2010 & 2011) from Puerto Rico on coral reefs exhibiting a wide range of conditions. The experts individually rated each station as to observed condition (“good”, “fair” or “poor”) and documented their rationale for

  11. Knowledge acquisition and representation using fuzzy evidential reasoning and dynamic adaptive fuzzy Petri nets.

    PubMed

    Liu, Hu-Chen; Liu, Long; Lin, Qing-Lian; Liu, Nan

    2013-06-01

    The two most important issues of expert systems are the acquisition of domain experts' professional knowledge and the representation and reasoning of the knowledge rules that have been identified. First, during expert knowledge acquisition processes, the domain expert panel often demonstrates different experience and knowledge from one another and produces different types of knowledge information such as complete and incomplete, precise and imprecise, and known and unknown because of its cross-functional and multidisciplinary nature. Second, as a promising tool for knowledge representation and reasoning, fuzzy Petri nets (FPNs) still suffer a couple of deficiencies. The parameters in current FPN models could not accurately represent the increasingly complex knowledge-based systems, and the rules in most existing knowledge inference frameworks could not be dynamically adjustable according to propositions' variation as human cognition and thinking. In this paper, we present a knowledge acquisition and representation approach using the fuzzy evidential reasoning approach and dynamic adaptive FPNs to solve the problems mentioned above. As is illustrated by the numerical example, the proposed approach can well capture experts' diversity experience, enhance the knowledge representation power, and reason the rule-based knowledge more intelligently.
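    The firing of a single fuzzy production rule in an FPN is often modeled as a thresholded min-composition scaled by the rule's certainty factor. A minimal sketch of that common convention (one of several used in the FPN literature; the paper's dynamic adaptive extension adjusts these parameters during reasoning):

```python
def fire_rule(truth_degrees, certainty_factor, threshold=0.5):
    """Fire one fuzzy rule: it fires only if every input proposition's
    truth degree exceeds the threshold; the output token's truth degree
    is min(inputs) scaled by the rule's certainty factor."""
    t = min(truth_degrees)
    return t * certainty_factor if t > threshold else 0.0
```

    Chaining such firings over the net's places and transitions yields the rule-based inference the abstract describes.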

  12. Cost drivers and resource allocation in military health care systems.

    PubMed

    Fulton, Larry; Lasdon, Leon S; McDaniel, Reuben R

    2007-03-01

    This study illustrates the feasibility of incorporating technical efficiency considerations in the funding of military hospitals and identifies the primary drivers for hospital costs. Secondary data collected for 24 U.S.-based Army hospitals and medical centers for the years 2001 to 2003 are the basis for this analysis. Technical efficiency was measured by using data envelopment analysis; subsequently, efficiency estimates were included in logarithmic-linear cost models that specified cost as a function of volume, complexity, efficiency, time, and facility type. These logarithmic-linear models were compared against stochastic frontier analysis models. A parsimonious, three-variable, logarithmic-linear model composed of volume, complexity, and efficiency variables exhibited a strong linear relationship with observed costs (R(2) = 0.98). This model also proved reliable in forecasting (R(2) = 0.96). Based on our analysis, as much as $120 million might be reallocated to improve the United States-based Army hospital performance evaluated in this study.
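    The three-variable logarithmic-linear model can be sketched as an ordinary least-squares fit in log space. The data below are synthetic, generated from assumed coefficients, not the Army hospital figures:

```python
import numpy as np

def fit_loglinear(cost, volume, complexity, efficiency):
    """Fit log(cost) = b0 + b1*log(volume) + b2*log(complexity)
    + b3*log(efficiency) by least squares (a sketch of the
    parsimonious three-variable model described above)."""
    X = np.column_stack([np.ones_like(cost), np.log(volume),
                         np.log(complexity), np.log(efficiency)])
    beta, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
    return beta
```

    In the study, the efficiency regressor itself comes from a prior data envelopment analysis step.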

  13. Combination and selection of traffic safety expert judgments for the prevention of driving risks.

    PubMed

    Cabello, Enrique; Conde, Cristina; de Diego, Isaac Martín; Moguerza, Javier M; Redchuk, Andrés

    2012-11-02

    In this paper, we describe a new framework to combine experts’ judgments for the prevention of driving risks in a cabin truck. In addition, the methodology shows how to choose among the experts the one whose predictions best fit the environmental conditions. The methodology is applied over data sets obtained from a high immersive cabin truck simulator in natural driving conditions. A nonparametric model, based on Nearest Neighbors combined with Restricted Least Squares methods, is developed. Three experts were asked to evaluate the driving risk using a Visual Analog Scale (VAS), in order to measure the driving risk in a truck simulator where the vehicle dynamics factors were stored. Numerical results show that the methodology is suitable for embedding in real time systems.
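    The restricted least-squares step can be illustrated as combining expert risk scores with weights constrained to sum to one (the nearest-neighbor localization used in the paper is omitted, and the data are invented):

```python
import numpy as np

def combine_experts(P, y):
    """Weights for combining expert predictions, restricted to sum to
    one: an equality-constrained least-squares sketch, standing in for
    the paper's Restricted Least Squares step.

    P is (observations x experts); y is the reference risk score.
    """
    # Substitute w_k = 1 - sum(w_1..w_{k-1}) and solve unconstrained LS.
    Z = P[:, :-1] - P[:, -1:]
    r = y - P[:, -1]
    w_head, *_ = np.linalg.lstsq(Z, r, rcond=None)
    return np.append(w_head, 1.0 - w_head.sum())
```

    Negative weights can occur with this plain equality constraint; adding nonnegativity would require a quadratic-programming solver.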

  14. A Comparison of Computational Cognitive Models: Agent-Based Systems Versus Rule-Based Architectures

    DTIC Science & Technology

    2003-03-01

    Java™ How To Program, Prentice Hall, 1999. Friedman-Hill, E., Jess, The Expert System Shell for the Java Platform, Sandia National Laboratories, 2001. ...transition from the descriptive NDM theory to a computational model raises several questions: Who is an experienced decision maker? How do you model the ...progression from being a novice to an experienced decision maker? How does the model account for previous experiences? Are there situations where

  15. Difference-based ridge-type estimator of parameters in restricted partial linear model with correlated errors.

    PubMed

    Wu, Jibo

    2016-01-01

    In this article, a generalized difference-based ridge estimator is proposed for the vector parameter in a partial linear model when the errors are dependent. It is supposed that some additional linear constraints may hold to the whole parameter space. Its mean-squared error matrix is compared with the generalized restricted difference-based estimator. Finally, the performance of the new estimator is explained by a simulation study and a numerical example.
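    The ridge core of such an estimator is standard. A minimal sketch (omitting the paper's differencing transform, the restriction to the constrained parameter space, and the correlated-error structure):

```python
import numpy as np

def ridge(X, y, k):
    """Ridge estimator beta = (X'X + kI)^{-1} X'y; with k = 0 it
    reduces to ordinary least squares. The biasing parameter k trades
    variance for bias, which is the mean-squared-error comparison the
    paper carries out against the restricted difference-based estimator.
    """
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
```

    In the partial linear setting, X and y would first be premultiplied by a differencing matrix to remove the nonparametric component.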

  16. Toward a better integration of roughness in rockfall simulations - a sensitivity study with the RockyFor3D model

    NASA Astrophysics Data System (ADS)

    Monnet, Jean-Matthieu; Bourrier, Franck; Milenkovic, Milutin

    2017-04-01

    Advances in numerical simulation and analysis of real-size field experiments have supported the development of process-based rockfall simulation models. Availability of high resolution remote sensing data and high-performance computing now make it possible to implement them for operational applications, e.g. risk zoning and protection structure design. One key parameter regarding rock propagation is the surface roughness, sometimes defined as the variation in height perpendicular to the slope (Pfeiffer and Bowen, 1989). Roughness-related input parameters for rockfall models are usually determined by experts on the field. In the RockyFor3D model (Dorren, 2015), three values related to the distribution of obstacles (deposited rocks, stumps, fallen trees,... as seen from the incoming rock) relatively to the average slope are estimated. The use of high resolution digital terrain models (DTMs) questions both the scale usually adopted by experts for roughness assessment and the relevance of modeling hypotheses regarding the rock / ground interaction. Indeed, experts interpret the surrounding terrain as obstacles or ground depending on the overall visibility and on the nature of objects. Digital models represent the terrain with a certain amount of smoothing, depending on the sensor capacities. Besides, the rock rebound on the ground is modeled by changes in the velocities of the gravity center of the block due to impact. Thus, the use of a DTM with resolution smaller than the block size might have little relevance while increasing computational burden. The objective of this work is to investigate the issue of scale relevance with simulations based on RockyFor3D in order to derive guidelines for roughness estimation by field experts. First a sensitivity analysis is performed to identify the combinations of parameters (slope, soil roughness parameter, rock size) where the roughness values have a critical effect on rock propagation on a regular hillside. 
Second, a more complex hillside is simulated by combining three components: a) a global trend (planar surface), b) local systematic components (sine waves), c) random roughness (Gaussian, zero-mean noise). The parameters for simulating these components are estimated for three typical scenarios of rockfall terrains: soft soil, fine scree and coarse scree, based on expert knowledge and available airborne and terrestrial laser scanning data. For each scenario, the reference terrain is created and used to compute input data for RockyFor3D simulations at different scales, i.e. DTMs with resolutions from 0.5 m to 20 m and associated roughness parameters. Subsequent analysis mainly focuses on the sensitivity of simulations both in terms of run-out envelope and kinetic energy distribution. Guidelines drawn from the results are expected to help experts handle the scale issue while integrating remote sensing data and field measurements of roughness in rockfall simulations.
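    The three-component terrain scheme described above (planar trend + sine-wave systematic component + Gaussian random roughness) can be sketched in one line per component; the parameter values are illustrative, not the paper's scenario estimates:

```python
import numpy as np

def synthetic_terrain(x, slope=0.5, amp=0.2, wavelength=5.0, sigma=0.05,
                      seed=0):
    """Terrain elevation profile built from the three components:
    a) planar trend, b) sine-wave systematic roughness, c) Gaussian
    zero-mean random roughness."""
    rng = np.random.default_rng(seed)
    trend = slope * x
    systematic = amp * np.sin(2.0 * np.pi * x / wavelength)
    noise = rng.normal(0.0, sigma, x.shape)
    return trend + systematic + noise
```

    Resampling such a profile at DTM resolutions from 0.5 m to 20 m reproduces the scale sensitivity the simulations investigate.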

  17. Statistical models for fever forecasting based on advanced body temperature monitoring.

    PubMed

    Jordan, Jorge; Miro-Martinez, Pau; Vargas, Borja; Varela-Entrecanales, Manuel; Cuesta-Frau, David

    2017-02-01

    Body temperature monitoring provides health carers with key clinical information about the physiological status of patients. Temperature readings are taken periodically to detect febrile episodes and consequently implement the appropriate medical countermeasures. However, fever is often difficult to assess at early stages, or remains undetected until the next reading, possibly a few hours later. The objective of this article is to develop a statistical model to forecast fever before a temperature threshold is exceeded, in order to improve the therapeutic approach to the subjects involved. To this end, temperature series of 9 patients admitted to a general internal medicine ward were obtained with a continuous monitoring Holter device, collecting measurements of peripheral and core temperature once per minute. These series were used to develop different statistical models that could quantify the probability of a fever spike in the following 60 minutes. A validation series was collected to assess the accuracy of the models. Finally, the results were compared with the analysis of some series by experienced clinicians. Two different models were developed: a logistic regression model and a linear discriminant analysis model. Both exhibited a fever peak forecasting accuracy greater than 84%. When compared with experts' assessment, both models identified 35 (97.2%) of 36 fever spikes. The models proposed are highly accurate in forecasting the appearance of fever spikes within a short period in patients with suspected or confirmed febrile-related illnesses. Copyright © 2016 Elsevier Inc. All rights reserved.
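As a rough illustration of the logistic-regression approach, a minimal sketch might map the current temperature and its recent slope to a spike probability. The coefficients below are made-up placeholders; the abstract does not report the fitted values:

```python
import math

def fever_spike_probability(temp_now, temp_slope, intercept=-18.0,
                            b_temp=0.45, b_slope=12.0):
    """Probability of a fever spike in the next 60 min from a logistic model.

    Coefficients are illustrative placeholders, not the study's fitted values.
    temp_now in degrees Celsius, temp_slope in degrees Celsius per minute.
    """
    z = intercept + b_temp * temp_now + b_slope * temp_slope
    return 1.0 / (1.0 + math.exp(-z))

# A rising core temperature yields a higher predicted spike probability
p_stable = fever_spike_probability(37.0, 0.00)
p_rising = fever_spike_probability(37.8, 0.05)
```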

  18. Modeling and Inverse Controller Design for an Unmanned Aerial Vehicle Based on the Self-Organizing Map

    NASA Technical Reports Server (NTRS)

    Cho, Jeongho; Principe, Jose C.; Erdogmus, Deniz; Motter, Mark A.

    2005-01-01

    The next generation of aircraft will have dynamics that vary considerably over the operating regime, and a single controller will have difficulty meeting the design specifications. In this paper, a SOM-based local linear modeling scheme for an unmanned aerial vehicle (UAV) is developed to design a set of inverse controllers. The SOM selects the operating regime depending only on the embedded output space information and avoids normalization of the input data. Each local linear model is associated with a linear controller, which is easy to design. Switching of the controllers is done synchronously with the active local linear model that tracks the different operating conditions. The proposed multiple modeling and control strategy has been successfully tested in a simulator that models the LoFLYTE UAV.

  19. Creating of structure of facts for the knowledge base of an expert system for wind power plant's equipment diagnosis

    NASA Astrophysics Data System (ADS)

    Duer, Stanisław; Wrzesień, Paweł; Duer, Radosław

    2017-10-01

    This article describes the rules and conditions for building a structure (a set) of facts for the expert knowledge base of an intelligent system that diagnoses wind power plant equipment. Given the particular operational conditions of the technical object, that is, the set of a wind power plant's equipment, this is a significant issue. A structural model of the wind power plant's equipment is developed, and from it a functional-diagnostic model is elaborated. That model is the basis for determining the primary elements of the object structure, as well as for interpreting the set of diagnostic signals and their reference signals. The key content of this paper is a description of the rules for building facts on the basis of the developed analytical dependences; these rules describe how a set of diagnostic information is transformed into a specific set of facts. The article consists of four chapters, each addressing a particular aspect of the subject.

  20. A performance model for GPUs with caches

    DOE PAGES

    Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...

    2014-06-24

    To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies, we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On an AMD GPU architecture, the Radeon HD 6970, the model estimates with an error rate of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
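A sampling-based linear runtime model of the kind described can be sketched as an ordinary least-squares fit of runtime against workload size, extrapolated from a few early sampling points to the full workload. The workload sizes and timings below are invented for illustration:

```python
import numpy as np

# Hypothetical sampling points: runtime measured for a few small workloads,
# then extrapolated linearly to the full workload (all numbers invented).
samples_items = np.array([1e4, 5e4, 1e5, 2e5])   # sampled work-item counts
samples_ms    = np.array([0.8, 2.9, 5.6, 10.9])  # measured runtimes in ms

# Least-squares fit: runtime ~ slope * items + offset
A = np.vstack([samples_items, np.ones_like(samples_items)]).T
(slope, offset), *_ = np.linalg.lstsq(A, samples_ms, rcond=None)

predicted_ms = slope * 4e5 + offset  # extrapolate to the full 400k-item workload
```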

  1. Section-constrained local geological interface dynamic updating method based on the HRBF surface

    NASA Astrophysics Data System (ADS)

    Guo, Jiateng; Wu, Lixin; Zhou, Wenhui; Li, Chaoling; Li, Fengdan

    2018-02-01

    Boundaries, attitudes and sections are the most common data acquired from regional field geological surveys, and they are used for three-dimensional (3D) geological modelling. However, constructing topologically consistent 3D geological models from rapid and automatic regional modelling with convenient local modifications remains unresolved. In previous works, the Hermite radial basis function (HRBF) surface was introduced for the simulation of geological interfaces from geological boundaries and attitudes, which allows 3D geological models to be automatically extracted from the modelling area by the interfaces. However, the reasonableness and accuracy of unsupervised subsurface modelling are limited without further modifications generated through explanations and analyses performed by geology experts. In this paper, we provide flexible and convenient manual interactive manipulation tools for geologists to sketch constraint lines, and these tools may help geologists transform and apply their expert knowledge to the models. In the modified modelling workflow, the geological sections were treated as auxiliary constraints to construct more reasonable 3D geological models. The geometric characteristics of section lines were abstracted to coordinates and normal vectors, and along with the transformed coordinates and vectors from boundaries and attitudes, these characteristics were adopted to co-calculate the implicit geological surface function parameters of the HRBF equations and form constrained geological interfaces from topographic (boundaries and attitudes) and subsurface data (sketched sections). Based on this new modelling method, a prototype system was developed, in which the section lines could be imported from databases or interactively sketched, and the models could be immediately updated after the new constraints were added.
Experimental comparisons showed that all boundary, attitude and section data are well represented in the constrained models, which are consistent with expert explanations and help improve the quality of the models.

  2. AGU Climate Scientists Offer Question-and-Answer Service for Media

    NASA Astrophysics Data System (ADS)

    Jackson, Stacy

    2010-03-01

    In fall 2009, AGU launched a member-driven pilot project to improve the accuracy of climate science coverage in the media and to improve public understanding of climate science. The project's goal was to increase the accessibility of climate science experts to journalists across the full spectrum of media outlets. As a supplement to the traditional one-to-one journalist-expert relationship model, the project tested the novel approach of providing a question-and-answer (Q&A) service with a pool of expert scientists and a Web-based interface with journalists. Questions were explicitly limited to climate science to maintain a nonadvocacy, nonpartisan perspective.

  3. The influence of expertise on brain activation of the action observation network during anticipation of tennis and volleyball serves

    PubMed Central

    Balser, Nils; Lorey, Britta; Pilgramm, Sebastian; Naumann, Tim; Kindermann, Stefan; Stark, Rudolf; Zentgraf, Karen; Williams, A. Mark; Munzert, Jörn

    2014-01-01

    In many daily activities, and especially in sport, it is necessary to predict the effects of others' actions in order to initiate appropriate responses. Recently, researchers have suggested that the action–observation network (AON) including the cerebellum plays an essential role during such anticipation, particularly in sport expert performers. In the present study, we examined the influence of task-specific expertise on the AON by investigating differences between two expert groups trained in different sports while anticipating action effects. Altogether, 15 tennis and 16 volleyball experts anticipated the direction of observed tennis and volleyball serves while undergoing functional magnetic resonance imaging (fMRI). The expert group in each sport acted as novice controls in the other sport, with which they had only limited experience. When contrasting anticipation in both expertise conditions with the corresponding untrained sport, a stronger activation of AON areas (SPL, SMA), and particularly of cerebellar structures, was observed. Furthermore, the neural activation within the cerebellum and the SPL was linearly correlated with participants' anticipation performance, irrespective of the specific expertise. For the SPL, this relationship also holds when an expert performs a domain-specific anticipation task. Notably, the stronger activation of the cerebellum as well as of the SMA and the SPL in the expertise conditions suggests that experts rely on their more fine-tuned perceptual-motor representations that have improved during years of training when anticipating the effects of others' actions in their preferred sport. The association of activation within the SPL and the cerebellum with the task achievement suggests that these areas are the predominant brain sites involved in fast motor predictions. The SPL reflects the processing of domain-specific contextual information and the cerebellum the use of a predictive internal model to solve the anticipation task. PMID:25136305

  4. Water Sensation During Passive Propulsion for Expert and Nonexpert Swimmers.

    PubMed

    Kusanagi, Kenta; Sato, Daisuke; Hashimoto, Yasuhiro; Yamada, Norimasa

    2017-06-01

    This study determined whether expert swimmers, compared with nonexperts, have superior movement perception and physical sensations of propulsion in water. Expert (national level competitors, n = 10) and nonexpert (able to swim 50 m in > 3 styles, n = 10) swimmers estimated distance traveled in water with their eyes closed. Both groups indicated their subjective physical sensations in the water. For each of two trials, two-dimensional coordinates were obtained from video recordings using the two-dimensional direct linear transformation method for calculating changes in speed. The mean absolute error of the difference between the actual and estimated distance traveled in the water was significantly lower for expert swimmers (0.90 ± 0.71 m) compared with nonexpert swimmers (3.85 ± 0.84 m). Expert swimmers described the sensation of propulsion in water in cutaneous terms as the "sense of flow" and sensation of "skin resistance." Therefore, expert swimmers appear to have a superior sense of distance during their movement in the water compared with that of nonexpert swimmers. In addition, expert swimmers may have a better perception of movement in water. We propose that expert swimmers integrate sensations and proprioceptive senses, enabling them to better perceive and estimate distance moved through water.

  5. A preference-based approach to deriving breeding objectives: applied to sheep breeding.

    PubMed

    Byrne, T J; Amer, P R; Fennessy, P F; Hansen, P; Wickham, B W

    2012-05-01

    Using internet-based software known as 1000Minds, choice-experiment surveys were administered to experts and farmers from the Irish sheep industry to capture their preferences with respect to the relative importance - represented by part-worth utilities - of target traits in the definition of a breeding objective for sheep in Ireland. Sheep production in Ireland can be broadly separated into lowland and hill farming systems; therefore, each expert was asked to answer the survey first as if he or she were a lowland farmer and second as a hill farmer. In addition to the experts, a group of lowland and a group of hill farmers were surveyed to assess whether, and to what extent, the groups' preferences differ from the experts' preferences. The part-worth utilities obtained from the surveys were converted into relative economic value terms per unit change in each trait. These measures - referred to as 'preference economic values' (pEVs) - were compared with economic values for the traits obtained from bio-economic models. The traits 'value per lamb at the meat processor' and 'lamb survival to slaughter' were revealed as being the two most important traits for the surveyed experts responding as lowland and hill farmers, respectively. In contrast, 'number of foot baths per year for ewes' and 'number of anthelmintic treatments per year for ewes' were the two least important traits. With the exception of 'carcase fat class' (P < 0.05), there were no statistically significant differences in the mean pEVs obtained from the surveyed experts under both the lowland and hill farming scenarios. Compared with the economic values obtained from bio-economic models, the pEVs for 'lambing difficulty' when the experts responded as lowland farmers were higher (P < 0.001); and they were lower (P < 0.001) for 'carcase conformation class', 'carcase fat class' (less negative) and 'ewe mature weight' (less negative) under both scenarios. 
Compared with surveyed experts, pEVs from lowland farmers differed significantly for 'lambing difficulty', 'lamb survival to slaughter', 'average days to slaughter of lambs', 'number of foot baths per year for ewes', 'number of anthelmintic treatments per year for ewes' and 'ewe mature weight'. Compared with surveyed experts, pEVs from hill farmers differed significantly for 'lambing difficulty', 'average days to slaughter of lambs' and 'number of foot baths per year for ewes'. This study indicates that preference-based tools have the potential to contribute to the definition of breeding objectives where production and price data are not available.

  6. Evaluating the Sustainability of School-Based Health Centers.

    PubMed

    Navarro, Stephanie; Zirkle, Dorothy L; Barr, Donald A

    2017-01-01

    The United States is facing a surge in the number of school-based health centers (SBHCs) owing to their success in delivering positive health outcomes and increasing access to care. To preserve this success, experts have developed frameworks for creating sustainable SBHCs; however, little research has affirmed or added to these models. This research seeks to analyze elements of sustainability in a case study of three SBHCs in San Diego, California, with the purpose of creating a research-based framework of SBHC sustainability to supplement expertly derived models. Using a mixed methods study design, data were collected from interviews with SBHC stakeholders, observations in SBHCs, and SBHC budgets. A grounded theory qualitative analysis and a quantitative budget analysis were completed to develop a theoretical framework for the sustainability of SBHCs. Forty-one interviews were conducted, 6 hours of observations were completed, and 3 years of SBHC budgets were analyzed to identify care coordination, community buy-in, community awareness, and SBHC partner cooperation as key themes of sustainability promoting patient retention for sustainable billing and reimbursement levels. These findings highlight the unique ways in which SBHCs gain community buy-in and awareness by becoming trusted sources of comprehensive and coordinated care within communities and among vulnerable populations. Findings also support ideas from expert models of SBHC sustainability calling for well-defined and executed community partnerships and quality coordinated care in the procurement of sustainable SBHC funding.

  7. Longitudinal data analyses using linear mixed models in SPSS: concepts, procedures and illustrations.

    PubMed

    Shek, Daniel T L; Ma, Cecilia M S

    2011-01-05

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analysis package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.

  8. Longitudinal Data Analyses Using Linear Mixed Models in SPSS: Concepts, Procedures and Illustrations

    PubMed Central

    Shek, Daniel T. L.; Ma, Cecilia M. S.

    2011-01-01

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analysis package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented. PMID:21218263
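The core idea of a random-intercept LMM for longitudinal data like this can be illustrated with a small simulation (all numbers invented): each subject has their own baseline, and within-subject centering removes those baselines so the fixed-effect slope over time is recovered despite the clustered observations.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated longitudinal data: 50 subjects, 6 waves (echoing the six-wave
# design), with a random intercept per subject. All values are illustrative.
n_subj, n_wave = 50, 6
time = np.tile(np.arange(n_wave), n_subj).astype(float)
subject = np.repeat(np.arange(n_subj), n_wave)

true_slope = 0.30
u = rng.normal(0.0, 1.0, n_subj)  # random intercepts, one per subject
y = 2.0 + true_slope * time + u[subject] + rng.normal(0.0, 0.2, n_subj * n_wave)

# Within-subject centering removes the random intercepts, so plain OLS on
# the demeaned data recovers the fixed-effect slope.
subj_mean = np.bincount(subject, y)[subject] / n_wave
y_c = y - subj_mean
t_c = time - time.mean()
slope_hat = (t_c * y_c).sum() / (t_c * t_c).sum()
```

In practice the full LMM (e.g. SPSS MIXED) also estimates the variance components; this sketch only shows why ignoring the clustering, as GLM does, is the problem the abstract describes.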

  9. Robust nonlinear system identification: Bayesian mixture of experts using the t-distribution

    NASA Astrophysics Data System (ADS)

    Baldacchino, Tara; Worden, Keith; Rowson, Jennifer

    2017-02-01

    A novel variational Bayesian mixture of experts model for robust regression of bifurcating and piece-wise continuous processes is introduced. The mixture of experts model is a powerful model which probabilistically splits the input space, allowing different models to operate in the separate regions. However, current methods have no fail-safe against outliers. In this paper, a robust mixture of experts model is proposed which consists of Student-t mixture models at the gates and Student-t distributed experts, trained via Bayesian inference. The Student-t distribution has heavier tails than the Gaussian distribution, and so it is more robust to outliers, noise and non-normality in the data. Using both simulated data and real data obtained from the Z24 bridge, this robust mixture of experts performs better than its Gaussian counterpart when outliers are present. In particular, it provides robustness to outliers in two forms: unbiased regression parameter estimates, and robustness to overfitting with overly complex models.
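The heavier tails that motivate the Student-t experts can be checked directly from the density formulas; the choice `nu = 3` below is an arbitrary illustrative value:

```python
import math

def normal_pdf(x):
    """Standard normal density."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def student_t_pdf(x, nu=3.0):
    """Standard Student-t density with nu degrees of freedom."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1.0 + x * x / nu) ** (-(nu + 1) / 2)

# An observation 4 standard deviations out is far more plausible under the
# t-distribution, so t-distributed gates and experts penalise it far less
# harshly than Gaussian ones would -- the source of the robustness claimed.
ratio = student_t_pdf(4.0) / normal_pdf(4.0)
```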

  10. Modeling of salt and pH gradient elution in ion-exchange chromatography.

    PubMed

    Schmidt, Michael; Hafner, Mathias; Frech, Christian

    2014-01-01

    The separation of proteins by internally and externally generated pH gradients in chromatofocusing on ion-exchange columns is a well-established analytical method with a large number of applications. In this work, a stoichiometric displacement model was used to describe the retention behavior of lysozyme on SP Sepharose FF and a monoclonal antibody on Fractogel SO3 (S) in linear salt and pH gradient elution. The pH dependence of the binding charge B in the linear gradient elution model is introduced using a protein net charge model, while the pH dependence of the equilibrium constant is based on a thermodynamic approach. The model parameter and pH dependences are calculated from linear salt gradient elutions at different pH values as well as from linear pH gradient elutions at different fixed salt concentrations. The application of the model for the well-characterized protein lysozyme resulted in almost identical model parameters based on either linear salt or pH gradient elution data. For the antibody, only the approach based on linear pH gradients is feasible because of the limited pH range useful for salt gradient elution. The application of the model for the separation of an acid variant of the antibody from the major monomeric form is discussed. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
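In the stoichiometric displacement model referred to above, the retention factor follows approximately k' = A * c_salt**(-B), so the binding charge B appears as the negative slope of a log-log plot of retention against salt concentration. The concentrations and constants below are illustrative, not values from the paper:

```python
import numpy as np

# Stoichiometric displacement model: k' ~ A * c_salt**(-B), i.e. log k'
# is linear in log c_salt with slope -B (B = binding charge).
c_salt = np.array([0.05, 0.10, 0.20, 0.40])  # salt concentrations in M (invented)
B_true, A = 4.0, 1e-4                        # illustrative parameters
k_prime = A * c_salt ** (-B_true)            # synthetic retention factors

# Recover B from linear-gradient-style retention data via a log-log fit
slope, intercept = np.polyfit(np.log10(c_salt), np.log10(k_prime), 1)
B_est = -slope
```

The paper additionally makes B and the equilibrium constant pH-dependent; this sketch shows only the salt-gradient backbone of the model.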

  11. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses.

    PubMed

    Fuller, Robert William; Wong, Tony E; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
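A toy version of probabilistic inversion can be sketched as rejection sampling: draw parameters from a broad prior and keep those whose model output is consistent with an expert-assessed range. The model, prior, and interval below are invented placeholders, not the paper's AIS model or its elicited assessments:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "ice-sheet model": sea-level contribution (cm) grows linearly with a
# sensitivity parameter. Hypothetical expert assessment: the contribution
# lies in [5, 30] cm. Everything here is illustrative.
def ais_contribution(sensitivity, warming=3.0):
    return sensitivity * warming

prior = rng.uniform(0.0, 20.0, 100_000)          # broad initial prior draws
out = ais_contribution(prior)
posterior = prior[(out >= 5.0) & (out <= 30.0)]  # keep draws consistent with experts
```

The surviving draws form a prior consistent with the expert assessment; the paper then further constrains such inferred priors with instrumental and paleoclimatic data in a Bayesian inversion.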

  12. Expert AIV: Study and Prototyping of an Expert System, To Support the Conceptual AIV Phases Of Space Programs

    NASA Astrophysics Data System (ADS)

    Andrina, G.; Basso, V.; Saitta, L.

    2004-08-01

    The effort to optimise the AIV process has in recent years focused mainly on the standardisation of approaches and on the application of new methodologies. But the earlier the intervention, the greater the benefits in terms of cost and schedule. Until now, the early phases of the AIV process have relied on standards that must be tailored through company and personal expertise. A study has therefore been conducted to explore the possibility of developing an expert system that helps in making choices in the early, conceptual phase of Assembly, Integration and Verification, namely the Model Philosophy and the test definition. The work focused on a hybrid approach, allowing interaction between historical data and human expertise. The expert system that has been prototyped exploits both information elicited from domain experts and the results of a Data Mining activity on the existing databases of completed projects' verification data. The Data Mining algorithms allow the extraction of past experience resident in the ESA/MATD database, which contains information in the form of statistical summaries, costs, and frequencies of on-ground and in-flight failures. The non-trivial associations found could then be used by the experts to manage new decisions in a controlled, standards-driven way at the beginning of or during the AIV process. Moreover, Expert AIV could allow compilation of a set of feasible AIV schedules to support further programmatic-driven choices.

  13. A Deep Learning Architecture for Temporal Sleep Stage Classification Using Multivariate and Multimodal Time Series.

    PubMed

    Chambon, Stanislas; Galtier, Mathieu N; Arnal, Pierrick J; Wainrib, Gilles; Gramfort, Alexandre

    2018-04-01

    Sleep stage classification constitutes an important preliminary exam in the diagnosis of sleep disorders. It is traditionally performed by a sleep expert who assigns a sleep stage to each 30 s of the signal, based on the visual inspection of signals such as electroencephalograms (EEGs), electrooculograms (EOGs), electrocardiograms, and electromyograms (EMGs). We introduce here the first deep learning approach for sleep stage classification that learns end-to-end without computing spectrograms or extracting handcrafted features, that exploits all multivariate and multimodal polysomnography (PSG) signals (EEG, EMG, and EOG), and that can exploit the temporal context of each 30-s window of data. For each modality, the first layer learns linear spatial filters that exploit the array of sensors to increase the signal-to-noise ratio, and the last layer feeds the learnt representation to a softmax classifier. Our model is compared to alternative automatic approaches based on convolutional networks or decision trees. Results obtained on 61 publicly available PSG records with up to 20 EEG channels demonstrate that our network architecture yields state-of-the-art performance. Our study reveals a number of insights on the spatiotemporal distribution of the signal of interest: a good tradeoff for optimal classification performance measured with balanced accuracy is to use 6 EEG with 2 EOG (left and right) and 3 EMG chin channels. Also, exploiting 1 min of data before and after each data segment offers the strongest improvement when a limited number of channels are available. Like sleep experts, our system exploits the multivariate and multimodal nature of PSG signals in order to deliver state-of-the-art classification performance with a small computational cost.

  14. A randomized trial using telehealth technology to link caregivers with dementia care experts for in-home caregiving support: FamTechCare protocol.

    PubMed

    Williams, Kristine; Blyler, Diane; Vidoni, Eric D; Shaw, Clarissa; Wurth, JoEllen; Seabold, Denise; Perkhounkova, Yelena; Van Sciver, Angela

    2018-06-01

    The number of persons with dementia (PWD) in the United States is expected to reach 16 million by 2050. Due to the behavioral and psychological symptoms of dementia, caregivers face challenging in-home care situations that lead to a range of negative health outcomes such as anxiety and depression for the caregivers and nursing home placement for PWD. Supporting Family Caregivers with Technology for Dementia Home Care (FamTechCare) is a multisite randomized controlled trial evaluating the effects of a telehealth intervention on caregiver well-being and PWD behavioral symptoms. The FamTechCare intervention provides individualized dementia-care strategies to in-home caregivers based on video recordings that the caregiver creates of challenging care situations. A team of dementia care experts reviews videos submitted by caregivers and provides interventions to improve care weekly for the experimental group. Caregivers in the control group receive feedback for improving care based on a weekly phone call with the interventionist and receive feedback on their videos at the end of the 3-month study. Using linear mixed modeling, we will compare experimental and control group outcomes (PWD behavioral symptoms and caregiver burden) after 1 and 3 months. An exploratory descriptive design will identify a typology of interventions for telehealth support for in-home dementia caregivers. Finally, the cost for FamTechCare will be determined and examined in relation to hypothesized effects on PWD behavioral symptoms, placement rates, and caregiver burden. This research will provide the foundation for future research on telehealth interventions with this population, especially for families in rural or remote locations. © 2018 Wiley Periodicals, Inc.

  15. Nonlinear aeroservoelastic analysis of a controlled multiple-actuated-wing model with free-play

    NASA Astrophysics Data System (ADS)

    Huang, Rui; Hu, Haiyan; Zhao, Yonghui

    2013-10-01

    In this paper, the effects of structural nonlinearity due to free-play in both leading-edge and trailing-edge outboard control surfaces on the linear flutter control system are analyzed for an aeroelastic model of three-dimensional multiple-actuated-wing. The free-play nonlinearities in the control surfaces are modeled theoretically by using the fictitious mass approach. The nonlinear aeroelastic equations of the presented model can be divided into nine sub-linear modal-based aeroelastic equations according to the different combinations of deflections of the leading-edge and trailing-edge outboard control surfaces. The nonlinear aeroelastic responses can be computed based on these sub-linear aeroelastic systems. To demonstrate the effects of nonlinearity on the linear flutter control system, a single-input and single-output controller and a multi-input and multi-output controller are designed based on the unconstrained optimization techniques. The numerical results indicate that the free-play nonlinearity can lead to either limit cycle oscillations or divergent motions when the linear control system is implemented.
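The free-play nonlinearity itself is a simple dead zone in the hinge restoring moment: the control surface rotates freely inside a small band and only meets stiffness beyond it. A minimal sketch, with illustrative stiffness and free-play angle rather than the paper's values:

```python
def freeplay_moment(theta, k=1.0, delta=0.01):
    """Restoring moment of a control-surface hinge with free-play.

    Inside the dead zone |theta| <= delta the surface rotates freely
    (zero moment); outside it, stiffness k acts on the excess deflection.
    k and delta are illustrative values, not those of the study.
    """
    if theta > delta:
        return k * (theta - delta)
    if theta < -delta:
        return k * (theta + delta)
    return 0.0
```

It is this piecewise structure (here for both leading- and trailing-edge surfaces) that partitions the aeroelastic equations into the sub-linear systems mentioned above and can push a linearly-designed controller into limit cycle oscillations or divergence.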

  16. Modelling efforts needed to advance herpes simplex virus (HSV) vaccine development: Key findings from the World Health Organization Consultation on HSV Vaccine Impact Modelling.

    PubMed

    Gottlieb, Sami L; Giersing, Birgitte; Boily, Marie-Claude; Chesson, Harrell; Looker, Katharine J; Schiffer, Joshua; Spicknall, Ian; Hutubessy, Raymond; Broutet, Nathalie

    2017-06-21

    Development of a vaccine against herpes simplex virus (HSV) is an important goal for global sexual and reproductive health. In order to more precisely define the health and economic burden of HSV infection and the theoretical impact and cost-effectiveness of an HSV vaccine, in 2015 the World Health Organization convened an expert consultation meeting on HSV vaccine impact modelling. The experts reviewed existing model-based estimates and dynamic models of HSV infection to outline critical future modelling needs to inform development of a comprehensive business case and preferred product characteristics for an HSV vaccine. This article summarizes key findings and discussions from the meeting on modelling needs related to HSV burden, costs, and vaccine impact, essential data needs to carry out those models, and important model components and parameters. Copyright © 2017. Published by Elsevier Ltd.

  17. A consensus reaching model for 2-tuple linguistic multiple attribute group decision making with incomplete weight information

    NASA Astrophysics Data System (ADS)

    Zhang, Wancheng; Xu, Yejun; Wang, Huimin

    2016-01-01

    The aim of this paper is to put forward a consensus reaching method for multi-attribute group decision-making (MAGDM) problems with linguistic information, in which the weight information of experts and attributes is unknown. First, some basic concepts and operational laws of the 2-tuple linguistic label are introduced. Then, a grey relational analysis method and a maximising deviation method are proposed to calculate the incomplete weight information of experts and attributes, respectively. To eliminate conflict in the group, a weight-updating model is employed to derive the weights of experts based on their contribution to the consensus reaching process. After conflict elimination, the final group preference can be obtained, which gives the ranking of the alternatives. The model can effectively avoid the information distortion which occurs regularly in linguistic information processing. Finally, an illustrative example is given to illustrate the application of the proposed method, and a comparative analysis with existing methods is offered to show its advantages.
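    The maximising deviation idea can be sketched in a few lines: an attribute that discriminates more strongly between the alternatives receives a larger weight. A simplified crisp-number sketch (the paper's version operates on 2-tuple linguistic labels; the simple sum-normalization here is an assumption):

```python
def maximizing_deviation_weights(scores):
    """Attribute weights via the maximizing-deviation idea.

    scores: list of rows, one per alternative; columns are attributes.
    An attribute whose column shows larger pairwise deviations between
    alternatives discriminates more, so it gets a larger weight.
    """
    m = len(scores)          # number of alternatives
    n = len(scores[0])       # number of attributes
    dev = []
    for j in range(n):
        col = [scores[i][j] for i in range(m)]
        # total absolute pairwise deviation within attribute j
        dev.append(sum(abs(a - b) for a in col for b in col))
    total = sum(dev)
    return [d / total for d in dev]
```

Here the first attribute, on which all alternatives score identically, would receive zero weight.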

  18. Resting-State Functional Magnetic Resonance Imaging for Language Preoperative Planning

    PubMed Central

    Branco, Paulo; Seixas, Daniela; Deprez, Sabine; Kovacs, Silvia; Peeters, Ronald; Castro, São L.; Sunaert, Stefan

    2016-01-01

    Functional magnetic resonance imaging (fMRI) is a well-known non-invasive technique for the study of brain function. One of its most common clinical applications is preoperative language mapping, essential for the preservation of function in neurosurgical patients. Typically, fMRI is used to track task-related activity, but poor task performance and movement artifacts can be critical limitations in clinical settings. Recent advances in resting-state protocols open new possibilities for pre-surgical mapping of language potentially overcoming these limitations. To test the feasibility of using resting-state fMRI instead of conventional active task-based protocols, we compared results from fifteen patients with brain lesions while performing a verb-to-noun generation task and while at rest. Task-activity was measured using a general linear model analysis and independent component analysis (ICA). Resting-state networks were extracted using ICA and further classified in two ways: manually by an expert and by using an automated template matching procedure. The results revealed that the automated classification procedure correctly identified language networks as compared to the expert manual classification. We found a good overlay between task-related activity and resting-state language maps, particularly within the language regions of interest. Furthermore, resting-state language maps were as sensitive as task-related maps, and had higher specificity. Our findings suggest that resting-state protocols may be suitable to map language networks in a quick and clinically efficient way. PMID:26869899

  19. An operation support expert system based on on-line dynamics simulation and fuzzy reasoning for startup schedule optimization in fossil power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsumoto, H.; Eki, Y.; Kaji, A.

    1993-12-01

    An expert system which can support operators of fossil power plants in creating the optimum startup schedule and executing it accurately is described. The optimum turbine speed-up and load-up pattern is obtained through an iterative procedure based on fuzzy reasoning, using quantitative calculations from plant dynamics models and qualitative knowledge in the form of schedule optimization rules with fuzziness. The rules represent relationships between stress margins and modification rates of the schedule parameters. Simulation analysis shows that the system provides quick and accurate plant startups.

  20. Planning bioinformatics workflows using an expert system.

    PubMed

    Chen, Xiaoling; Chang, Jeffrey T

    2017-04-15

    Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system composed of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and implementation: https://github.com/jefftc/changlab. Contact: jeffrey.t.chang@uth.tmc.edu.
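    Backward chaining of the kind BETSY performs can be illustrated with a toy rule base: starting from the requested result, the engine recursively plans how to produce each required input from the data that is available. The rule names and data types below are hypothetical, not BETSY's actual knowledge base:

```python
# Each rule maps a goal data type to (required input types, tool name).
# These rules are invented for illustration only.
RULES = {
    "aligned_reads": (["fastq"], "align"),
    "expression_matrix": (["aligned_reads"], "count"),
    "de_genes": (["expression_matrix"], "diffexp"),
}

def plan(goal, available, rules=RULES):
    """Backward-chain from `goal` to `available` data types.

    Returns an ordered list of tool steps, or None if no rule chain
    can produce the goal from the available data.
    """
    if goal in available:
        return []                      # already have it: nothing to run
    if goal not in rules:
        return None                    # no rule can produce this type
    inputs, tool = rules[goal]
    steps = []
    for inp in inputs:
        sub = plan(inp, available, rules)
        if sub is None:
            return None
        steps.extend(sub)
    return steps + [tool]
```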

  2. Hospital-based expert model for health technology procurement planning in hospitals.

    PubMed

    Miniati, R; Cecconi, G; Frosini, F; Dori, F; Regolini, J; Iadanza, E; Biffi Gentili, G

    2014-01-01

    Although technology innovation in healthcare has brought major improvements in care level and patient quality of life in recent years, hospital complexity and management costs have increased. Planning for medical equipment procurement within hospitals is therefore increasingly important in order to sustainably provide appropriate technology for both routine activity and innovative procedures. To support hospital decision makers in technology procurement planning, an expert model was designed, as reported in the following paper. It combines the most widely used approaches for technology evaluation, taking into consideration Health Technology Assessment (HTA) and the Medical Equipment Replacement Model (MERM). The design phases include an initial definition of prioritization algorithms, a weighting process through experts' interviews, and a final model validation step that included both statistical testing and comparison with real decisions. In conclusion, the designed model provides a semi-automated tool that, through the use of multidisciplinary information, is able to prioritize different requests for technology acquisition in hospitals. Validation outcomes improved the model's accuracy and created different "user profiles" according to the specific needs of decision makers.

  3. Boilermodel: A Qualitative Model-Based Reasoning System Implemented in Ada

    DTIC Science & Technology

    1991-09-01

    complement to shipboard engineering training. ...investment (in terms of man-hours lost, equipment maintenance, materials, etc.) for initial training. On-going training is also required to sustain a... REASONING FROM MODELS: Model-based expert systems have been written in many languages and for many different architectures. Knowledge representation also

  4. Task analysis method for procedural training curriculum development.

    PubMed

    Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan

    2014-06-01

    A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments.

  5. Development and validation of a mass casualty conceptual model.

    PubMed

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
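    The three consensus criteria above (interquartile range, stability of the response distribution between rounds, and percent agreement) can be checked mechanically for each indicator question. A minimal sketch, assuming agreement is operationalized as the share of ratings of 6 or higher on the 7-point scale (that cutoff is an assumption, not stated in the abstract):

```python
from statistics import quantiles

def consensus_reached(ratings, prev_ratings, agree_threshold=0.70):
    """Check the three Delphi stopping criteria for one indicator:
    IQR of no more than 1 scale point, less than a 15% shift in
    responses between rounds, and at least 70% agreement.
    ratings / prev_ratings: per-expert scores from this and the
    previous round, in the same expert order.
    """
    q1, _, q3 = quantiles(ratings, n=4)          # quartiles of this round
    iqr_ok = (q3 - q1) <= 1
    shift = sum(1 for a, b in zip(ratings, prev_ratings) if a != b) / len(ratings)
    stable = shift < 0.15
    agree = sum(1 for r in ratings if r >= 6) / len(ratings) >= agree_threshold
    return iqr_ok and stable and agree
```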

  6. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the necessity of strengthening and revising theory with empirical data.
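    The confirmation test at the heart of this approach can be sketched simply: a prediction counts as confirmed when the predicted effect size falls inside the confidence interval around the observed estimate. A minimal sketch, assuming a symmetric 95% interval (z = 1.96), which is one plausible operationalization rather than the paper's exact procedure:

```python
def prediction_confirmed(predicted, observed, se, z=1.96):
    """Return True if the theory-based predicted effect size lies
    inside the z-interval around the observed effect size estimate.

    predicted: effect size predicted from theory or prior data
    observed:  effect size estimated from the new sample
    se:        standard error of the observed estimate
    """
    lo, hi = observed - z * se, observed + z * se
    return lo <= predicted <= hi
```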

  7. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  8. Which Dimensions of Patient-Centeredness Matter? - Results of a Web-Based Expert Delphi Survey

    PubMed Central

    Zill, Jördis M.; Scholl, Isabelle; Härter, Martin; Dirmaier, Jörg

    2015-01-01

    Background: A review of present models and definitions of patient-centeredness revealed a lack of conceptual clarity. Based on a prior systematic literature review, we developed an integrative model with 15 dimensions of patient-centeredness. The aims of this study were to 1) validate and 2) prioritize these dimensions. Method: A two-round web-based Delphi study was conducted. 297 international experts were invited to participate. In round one, they were asked to 1) give an individual rating on a nine-point scale on the relevance and clarity of the dimensions, 2) add missing dimensions, and 3) prioritize the dimensions. In round two, experts received feedback about the results of round one and were asked to reflect on and re-rate their own results. The cut-off for the validation of a dimension was a median < 7 on one of the criteria. Results: 105 experts participated in round one and 71 in round two. In round one, one new dimension was suggested and included for discussion in round two. In round two, this dimension did not reach sufficient ratings to be included in the model. Eleven dimensions reached a median ≥ 7 on both criteria (relevance and clarity). Four dimensions had a median < 7 on one or both criteria. The five dimensions rated as most important were: patient as a unique person, patient involvement in care, patient information, clinician-patient communication, and patient empowerment. Discussion: Eleven of the 15 dimensions were validated through experts' ratings. Further research on the four dimensions that received insufficient ratings is recommended. The priority order of the dimensions can help researchers and clinicians focus on the most important dimensions of patient-centeredness. Overall, the model provides a useful framework that can be used in the development of measures, interventions, and medical education curricula, as well as the adoption of a new perspective in health policy. PMID:26539990
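    The validation cut-off described above is straightforward to encode: a dimension survives only if its median rating reaches 7 on both criteria. A minimal sketch:

```python
from statistics import median

def validate_dimension(relevance, clarity, cutoff=7):
    """A dimension is retained only if the median expert rating on the
    nine-point scale is >= cutoff for BOTH relevance and clarity;
    a median below the cutoff on either criterion drops it.
    relevance, clarity: per-expert ratings for one dimension.
    """
    return median(relevance) >= cutoff and median(clarity) >= cutoff
```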

  9. A Bibliography of Externally Published Works by the SEI Engineering Techniques Program

    DTIC Science & Technology

    1992-08-01

    media, and virtual reality * model-based engineering * programming languages * reuse * software architectures * software engineering as a discipline... "Knowledge-Based Engineering Environments." IEEE Expert 3, 2 (May 1988): 18-23, 26-32. Audience: Practitioner [Klein89b] Klein, D.V. "Comparison of... Terms with Software Reuse Terminology: A Model-Based Approach." ACM SIGSOFT Software Engineering Notes 16, 2 (April 1991): 45-51. Audience: Practitioner

  10. Recommending Education Materials for Diabetic Questions Using Information Retrieval Approaches

    PubMed Central

    Wang, Yanshan; Shen, Feichen; Liu, Sijia; Rastegar-Mojarad, Majid; Wang, Liwei

    2017-01-01

    Background: Self-management is crucial to diabetes care, and providing expert-vetted content for answering patients' questions is key to facilitating patient self-management. Objective: The aim is to investigate the use of information retrieval techniques in recommending patient education materials for patients' diabetic questions. Methods: We compared two retrieval algorithms, one based on Latent Dirichlet Allocation topic modeling (topic modeling-based model) and one based on semantic group (semantic group-based model), with a baseline retrieval model, the vector space model (VSM), in recommending diabetic patient education materials for diabetic questions posted on the TuDiabetes forum. The evaluation was based on a gold-standard dataset consisting of 50 randomly selected diabetic questions, where the relevancy of diabetic education materials to the questions was manually assigned by two experts. The performance was assessed using the precision of top-ranked documents. Results: We retrieved 7510 diabetic questions from the forum and 144 diabetic patient education materials from the patient education database at Mayo Clinic. The rate at which words in each corpus mapped to the Unified Medical Language System (UMLS) was significantly different (P<.001). The topic modeling-based model outperformed the other retrieval algorithms. For example, for the top-retrieved document, the precision of the topic modeling-based, semantic group-based, and VSM models was 67.0%, 62.8%, and 54.3%, respectively. Conclusions: This study demonstrated that topic modeling can mitigate the vocabulary difference, and it achieved the best performance in recommending education materials for answering patients' questions. One direction for future work is to assess the generalizability of our findings and to extend our study to other disease areas, other patient education material resources, and online forums. PMID:29038097
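    The precision-of-top-ranked-documents metric used in this evaluation can be sketched as a standard precision-at-k computation (the document identifiers below are hypothetical):

```python
def precision_at_k(ranked_docs, relevant, k):
    """Fraction of the top-k retrieved documents judged relevant.

    ranked_docs: document ids in retrieval order (best first)
    relevant:    set of ids the expert annotators marked relevant
    k:           evaluation depth (k=1 scores only the top document)
    """
    top = ranked_docs[:k]
    return sum(1 for d in top if d in relevant) / k
```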

  11. Verification of a quality management theory: using a delphi study.

    PubMed

    Mosadeghrad, Ali Mohammad

    2013-11-01

    A model of quality management called Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared including a road map and performance measurement. The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence.

  13. Multivariable control of the Space Shuttle Remote Manipulator System using linearization by state feedback. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Gettman, Chang-Ching LO

    1993-01-01

    This thesis develops and demonstrates an approach to nonlinear control system design using linearization by state feedback. The design provides improved transient response behavior, allowing faster maneuvering of payloads by the SRMS. Modeling uncertainty is accounted for by a second feedback loop designed around the feedback-linearized dynamics. A classical feedback loop is developed to provide the easy implementation required by the relatively small onboard computers. Feedback linearization also allows the use of higher-bandwidth model-based compensation in the outer loop, since it helps maintain stability in the presence of the nonlinearities typically neglected in model-based designs.
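    The core of linearization by state feedback can be shown on a single-state system: choosing the input to cancel the plant nonlinearity leaves a linear closed loop, for which an outer (e.g. classical or model-based) compensator can then be designed. An illustrative scalar sketch, not the SRMS dynamics:

```python
import math

def feedback_linearize(f, g, x, v):
    """For a plant x' = f(x) + g(x) * u, the input u = (v - f(x)) / g(x)
    cancels the nonlinearity, so the closed loop behaves as x' = v and
    the outer loop can be designed with linear methods.
    """
    return (v - f(x)) / g(x)

# Example: pendulum-like plant x' = -sin(x) + u (illustrative model)
f = lambda x: -math.sin(x)
g = lambda x: 1.0
x, v = 0.7, -2.0
u = feedback_linearize(f, g, x, v)
xdot = f(x) + g(x) * u          # closed-loop derivative equals v
```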

  14. Optimization Research of Generation Investment Based on Linear Programming Model

    NASA Astrophysics Data System (ADS)

    Wu, Juan; Ge, Xueqian

    Linear programming is an important branch of operational research and a mathematical method for assisting people in carrying out scientific management. GAMS is an advanced simulation and optimization modeling language that combines many classes of mathematical programming, such as linear programming (LP), nonlinear programming (NLP), and mixed-integer programming (MIP), with system simulation. In this paper, optimized generation investment decision-making is simulated and analyzed based on a linear programming model. Finally, the optimal installed capacity of power plants and the final total cost are obtained, which provides a rational decision-making basis for optimized investments.
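    A generation-investment LP of the kind described can be sketched in miniature: minimize total cost subject to a demand constraint and per-plant capacity bounds. For this simple structure the LP optimum is obtained by filling the cheapest capacity first; the plant names, costs, and capacities below are invented for illustration (the paper builds its model in GAMS):

```python
def plan_capacity(plants, demand):
    """Minimum-cost installed capacity meeting a demand constraint.

    Solves the simple LP: minimize sum(cost_i * x_i) subject to
    sum(x_i) >= demand and 0 <= x_i <= cap_i.  With a single demand
    constraint, the optimum fills the cheapest capacity first.
    plants: list of (name, unit_cost, max_capacity) tuples.
    """
    build, total_cost = {}, 0.0
    for name, cost, cap in sorted(plants, key=lambda p: p[1]):
        x = min(cap, demand)            # build as much cheap capacity as needed
        build[name] = x
        total_cost += cost * x
        demand -= x
        if demand <= 0:
            break
    return build, total_cost

# Illustrative data: (plant type, cost per MW, capacity limit in MW)
plants = [("coal", 40.0, 300), ("gas", 55.0, 200), ("peaker", 90.0, 100)]
build, cost = plan_capacity(plants, 450)
```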

  15. An intelligent training system for space shuttle flight controllers

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Wang, Lui; Baffes, Paul; Hua, Grace

    1988-01-01

    An autonomous intelligent training system which integrates expert system technology with training/teaching methodologies is described. The system was designed to train Mission Control Center (MCC) Flight Dynamics Officers (FDOs) to deploy a certain type of satellite from the Space Shuttle. The Payload-assist module Deploys/Intelligent Computer-Aided Training (PD/ICAT) system consists of five components: a user interface, a domain expert, a training session manager, a trainee model, and a training scenario generator. The interface provides the trainee with information of the characteristics of the current training session and with on-line help. The domain expert (DeplEx for Deploy Expert) contains the rules and procedural knowledge needed by the FDO to carry out the satellite deploy. The DeplEx also contains mal-rules which permit the identification and diagnosis of common errors made by the trainee. The training session manager (TSM) examines the actions of the trainee and compares them with the actions of DeplEx in order to determine appropriate responses. A trainee model is developed for each individual using the system. The model includes a history of the trainee's interactions with the training system and provides evaluative data on the trainee's current skill level. A training scenario generator (TSG) designs appropriate training exercises for each trainee based on the trainee model and the training goals. All of the expert system components of PD/ICAT communicate via a common blackboard. The PD/ICAT is currently being tested. Ultimately, this project will serve as a vehicle for developing a general architecture for intelligent training systems together with a software environment for creating such systems.

  16. An intelligent training system for space shuttle flight controllers

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Wang, Lui; Baffles, Paul; Hua, Grace

    1988-01-01

    An autonomous intelligent training system which integrates expert system technology with training/teaching methodologies is described. The system was designed to train Mission Control Center (MCC) Flight Dynamics Officers (FDOs) to deploy a certain type of satellite from the Space Shuttle. The Payload-assist module Deploys/Intelligent Computer-Aided Training (PD/ICAT) system consists of five components: a user interface, a domain expert, a training session manager, a trainee model, and a training scenario generator. The interface provides the trainee with information of the characteristics of the current training session and with on-line help. The domain expert (Dep1Ex for Deploy Expert) contains the rules and procedural knowledge needed by the FDO to carry out the satellite deploy. The Dep1Ex also contains mal-rules which permit the identification and diagnosis of common errors made by the trainee. The training session manager (TSM) examines the actions of the trainee and compares them with the actions of Dep1Ex in order to determine appropriate responses. A trainee model is developed for each individual using the system. The model includes a history of the trainee's interactions with the training system and provides evaluative data on the trainee's current skill level. A training scenario generator (TSG) designs appropriate training exercises for each trainee based on the trainee model and the training goals. All of the expert system components of PD/ICAT communicate via a common blackboard. The PD/ICAT is currently being tested. Ultimately, this project will serve as a vehicle for developing a general architecture for intelligent training systems together with a software environment for creating such systems.

  17. A population-based tissue probability map-driven level set method for fully automated mammographic density estimations.

    PubMed

    Kim, Youngwoo; Hong, Byung Woo; Kim, Seung Ja; Kim, Jong Hyo

    2014-07-01

    A major challenge when distinguishing glandular tissues on mammograms, especially for area-based estimations, lies in determining a boundary on a hazy transition zone from adipose to glandular tissues. This stems from the nature of mammography, which is a projection of superimposed tissues consisting of different structures. In this paper, the authors present a novel segmentation scheme which incorporates the learned prior knowledge of experts into a level set framework for fully automated mammographic density estimations. The authors modeled the learned knowledge as a population-based tissue probability map (PTPM) that was designed to capture the classification of experts' visual systems. The PTPM was constructed using an image database of a selected population consisting of 297 cases. Three mammogram experts extracted regions for dense and fatty tissues on digital mammograms, which was an independent subset used to create a tissue probability map for each ROI based on its local statistics. This tissue class probability was taken as a prior in the Bayesian formulation and was incorporated into a level set framework as an additional term to control the evolution and followed the energy surface designed to reflect experts' knowledge as well as the regional statistics inside and outside of the evolving contour. A subset of 100 digital mammograms, which was not used in constructing the PTPM, was used to validate the performance. The energy was minimized when the initial contour reached the boundary of the dense and fatty tissues, as defined by experts. The correlation coefficient between mammographic density measurements made by experts and measurements by the proposed method was 0.93, while that with the conventional level set was 0.47. The proposed method showed a marked improvement over the conventional level set method in terms of accuracy and reliability. 
This result suggests that the proposed method successfully incorporated the learned knowledge of the experts' visual systems and has potential to be used as an automated and quantitative tool for estimations of mammographic breast density levels.

  18. Adolescents’ Use of Indoor Tanning: A Large-Scale Evaluation of Psychosocial, Environmental, and Policy-Level Correlates

    PubMed Central

    Woodruff, Susan I.; Slymen, Donald J.; Sallis, James F.; Forster, Jean L.; Clapp, Elizabeth J.; Hoerster, Katherine D.; Pichon, Latrice C.; Weeks, John R.; Belch, George E.; Weinstock, Martin A.; Gilmer, Todd

    2011-01-01

    Objectives. We evaluated psychosocial, built-environmental, and policy-related correlates of adolescents’ indoor tanning use. Methods. We developed 5 discrete data sets in the 100 most populous US cities, based on interviews of 6125 adolescents (aged 14–17 years) and their parents, analysis of state indoor tanning laws, interviews with enforcement experts, computed density of tanning facilities, and evaluations of these 3399 facilities’ practices regarding access by youths. After univariate analyses, we constructed multilevel models with generalized linear mixed models (GLMMs). Results. In the past year, 17.1% of girls and 3.2% of boys had used indoor tanning. The GLMMs indicated that several psychosocial or demographic variables significantly predicted use, including being female, older, and White; having a larger allowance and a parent who used indoor tanning and allowed their adolescent to use it; and holding certain beliefs about indoor tanning's consequences. Living within 2 miles of a tanning facility also was a significant predictor. Residing in a state with youth-access legislation was not significantly associated with use. Conclusions. Current laws appear ineffective in reducing indoor tanning; bans likely are needed. Parents have an important role in prevention efforts. PMID:21421947

  19. Evaluation of a Performance-Based Expert Elicitation: WHO Global Attribution of Foodborne Diseases.

    PubMed

    Aspinall, W P; Cooke, R M; Havelaar, A H; Hoffmann, S; Hald, T

    2016-01-01

    For many societally important science-based decisions, data are inadequate, unreliable or non-existent, and expert advice is sought. In such cases, procedures for eliciting structured expert judgments (SEJ) are increasingly used. This raises questions regarding validity and reproducibility. This paper presents new findings from a large-scale international SEJ study intended to estimate the global burden of foodborne disease on behalf of WHO. The study involved 72 experts distributed over 134 expert panels, with panels comprising thirteen experts on average. Elicitations were conducted in five languages. Performance-based weighted solutions for target questions of interest were formed for each panel. These weights were based on individual experts' statistical accuracy and informativeness, determined using between ten and fifteen calibration variables from the experts' field with known values. Equal-weights combinations were also calculated. The main conclusions on expert performance are: (1) SEJ does provide a science-based method for attribution of the global burden of foodborne diseases; (2) equal weighting of experts per panel increased statistical accuracy to acceptable levels, but at the cost of informativeness; (3) performance-based weighting increased informativeness, while retaining accuracy; (4) due to study constraints, individual experts' accuracies were generally lower than in other SEJ studies; and (5) there was a negative correlation between experts' informativeness and statistical accuracy which attenuated as accuracy improved, revealing that the least accurate experts drive the negative correlation. It is shown, however, that performance-based weighting has the ability to yield statistically accurate and informative combinations of experts' judgments, thereby offsetting this contrary influence. 
The present findings suggest that application of SEJ on a large scale is feasible, and motivate the development of enhanced training and tools for remote elicitation of multiple, internationally-dispersed panels.
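The performance-based weighting rests on scoring each expert's statistical accuracy on calibration variables with known values. The toy scoring below illustrates the idea for (5%, 50%, 95%) quantile assessments; the full classical model converts this statistic to a chi-square p-value and multiplies by an information score, so these simplified functions are illustrative only.

```python
import math

# Expected mass in the four inter-quantile bins for (5%, 50%, 95%) assessments.
P_EXPECTED = [0.05, 0.45, 0.45, 0.05]

def bin_counts(quantiles, realizations):
    """Count how many realizations fall in each inter-quantile bin."""
    counts = [0, 0, 0, 0]
    for (q05, q50, q95), x in zip(quantiles, realizations):
        if x <= q05:
            counts[0] += 1
        elif x <= q50:
            counts[1] += 1
        elif x <= q95:
            counts[2] += 1
        else:
            counts[3] += 1
    return counts

def calibration_score(quantiles, realizations):
    """2N * KL(empirical || expected): near zero when the expert's bins
    are hit at the theoretically expected rates (a proxy for accuracy)."""
    counts = bin_counts(quantiles, realizations)
    n = sum(counts)
    stat = 0.0
    for c, p in zip(counts, P_EXPECTED):
        if c > 0:
            stat += 2 * c * math.log((c / n) / p)
    return stat
```

A well-calibrated expert scores near zero; an expert whose realizations pile up outside the stated 90% intervals scores high and would receive little weight.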

  20. A new adaptive multiple modelling approach for non-linear and non-stationary systems

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Gong, Yu; Hong, Xia

    2016-07-01

    This paper proposes a novel adaptive multiple modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models which are all linear. With data available in an online fashion, the performance of all candidate sub-models is monitored based on the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error based on a recent data window and to apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever is better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
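The closed-form combination step, minimising windowed MSE under a sum-to-one constraint, admits the standard Lagrange-multiplier solution sketched below. The regularisation term and window handling are assumptions of this sketch, not the authors' exact formulation.

```python
import numpy as np

def combine_predictions(preds, y, window=20):
    """Sum-to-one combination weights for M sub-model predictions,
    minimising MSE over the most recent data window.

    preds: (T, M) sub-model predictions; y: (T,) targets.
    """
    E = preds[-window:] - y[-window:, None]   # windowed prediction errors
    R = E.T @ E / len(E)                      # error covariance estimate
    R += 1e-8 * np.eye(R.shape[0])            # regularise for invertibility
    ones = np.ones(R.shape[0])
    w = np.linalg.solve(R, ones)
    return w / (ones @ w)                     # Lagrange solution: R^-1 1 / (1' R^-1 1)
```

A sub-model with near-zero recent error dominates the combination, which is the intended behaviour under non-stationarity.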

  1. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    PubMed Central

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diverse interfaces, are tightly coupled, and are closely bound to particular simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address this problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is then elaborated, which contains two phases: (1) domain experts create simulation computational modules, observing three principles to achieve their independence; (2) a model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that a model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751

  2. Evaluation of expert system application based on usability aspects

    NASA Astrophysics Data System (ADS)

    Munaiseche, C. P. C.; Liando, O. E. S.

    2016-04-01

    Usability is usually defined as the degree of human acceptance of a product or system, based on users' understanding of, and correct reaction to, its interface. The performance of a web application is influenced by the quality of its interface in supporting the information transfer process. Ideally, before an expert system application is installed in its operational environment, it should first be evaluated by usability testing. This research aimed to measure the usability of an expert system application using tasks as the interaction medium. The study used an expert system application for diagnosing human skin disease, applying a questionnaire method that utilizes tasks as the interaction medium for measuring usability. Participants executed specific tasks while the usability of the application was observed. The usability aspects observed were learnability, efficiency, memorability, errors, and satisfaction, with each questionnaire question representing one aspect. The results present the usability value for each aspect; the overall average across all five usability aspects was 4.28, indicating that the tested expert system application falls in the "excellent" range of usability, so the application can be deployed for operational use. The main contribution of the study is that it is a first step in using a task model for usability evaluation of expert system application software.

  3. An Integrated Textbook, Video, and Software Environment for Novice and Expert Prolog Programmers. Technical Report No. 23.

    ERIC Educational Resources Information Center

    Eisenstadt, Marc; Brayshaw, Mike

    This paper describes a Prolog execution model which serves as the uniform basis of textbook material, video-based teaching material, and an advanced graphical user interface for Prolog programmers. The model, based upon an augmented AND/OR tree representation of Prolog programs, uses an enriched "status box" in place of the traditional…

  4. Development of an Adolescent Alcohol Misuse Intervention Based on the Prototype Willingness Model: A Delphi Study

    ERIC Educational Resources Information Center

    Davies, Emma; Martin, Jilly; Foxcroft, David

    2016-01-01

    Purpose: The purpose of this paper is to report on the use of the Delphi method to gain expert feedback on the identification of behaviour change techniques (BCTs) and development of a novel intervention to reduce adolescent alcohol misuse, based on the Prototype Willingness Model (PWM) of health risk behaviour. Design/methodology/approach: Four…

  5. Knowledge-Based Information Retrieval.

    ERIC Educational Resources Information Center

    Ford, Nigel

    1991-01-01

    Discussion of information retrieval focuses on theoretical and empirical advances in knowledge-based information retrieval. Topics discussed include the use of natural language for queries; the use of expert systems; intelligent tutoring systems; user modeling; the need for evaluation of system effectiveness; and examples of systems, including…

  6. Identifying psychophysiological indices of expert vs. novice performance in deadly force judgment and decision making

    PubMed Central

    Johnson, Robin R.; Stone, Bradly T.; Miranda, Carrie M.; Vila, Bryan; James, Lois; James, Stephen M.; Rubio, Roberto F.; Berka, Chris

    2014-01-01

    Objective: To demonstrate that psychophysiology may have applications for objective assessment of expertise development in deadly force judgment and decision making (DFJDM). Background: Modern training techniques focus on improving decision-making skills with participative assessment between trainees and subject matter experts, primarily through subjective observation; objective metrics need to be developed. The current proof-of-concept study explored the potential for psychophysiological metrics in deadly force judgment contexts. Method: Twenty-four participants (novice, expert) were recruited. All wore a wireless electroencephalography (EEG) device to collect psychophysiological data during high-fidelity simulated deadly force judgment and decision-making scenarios using a modified Glock firearm. Participants were exposed to 27 video scenarios, one-third of which would have justified use of deadly force. Pass/fail was determined by whether the participant used deadly force appropriately. Results: Experts had a significantly higher pass rate compared to novices (p < 0.05). Multiple metrics were shown to distinguish novices from experts. Hierarchical regression analyses indicate that psychophysiological variables are able to explain 72% of the variability in expert performance, but only 37% in novices. Discriminant function analysis (DFA) using psychophysiological metrics was able to discern between experts and novices with 72.6% accuracy. Conclusion: While limited by the small sample size, the results suggest that psychophysiology may be developed for use as an objective measure of expertise in DFJDM. Specifically, discriminant function measures may have the potential to objectively identify expert skill acquisition. Application: Psychophysiological metrics may create a performance model with the potential to optimize simulator-based DFJDM training. 
These performance models could be used for trainee feedback, and/or by the instructor to assess performance objectively. PMID:25100966

  7. Breakfast barriers and opportunities for children living in a Dutch disadvantaged neighbourhood.

    PubMed

    van Kleef, Ellen; Vingerhoeds, Monique H; Vrijhof, Milou; van Trijp, Hans C M

    2016-12-01

    The objective of this study was to explore parents', children's, and experts' beliefs and experiences about breakfast motivation, opportunity, and ability and elicit their thoughts on effective interventions to encourage healthy breakfast consumption. The setting was a disadvantaged neighbourhood in Rotterdam, the Netherlands. Focus groups with mothers and children and semi-structured individual interviews with experts were conducted. Interview guides were developed based on the motivation, opportunity, and ability consumer psychology model. Thirty-two mothers of primary school children participated in five group discussions, eight focus groups were conducted with 44 children, and nine experts participated in interviews. Data from expert interviews and group discussions were coded and thematically analysed. The following themes emerged from the focus groups: (1) generally high motivation to have breakfast, (2) improved performance at school is key motivator, (3) limited time hinders breakfast, and (4) lack of nutritional knowledge about high quality breakfast. Experts mentioned lack of effort, knowledge, and time; financial constraints; and environmental issues (food availability) as barriers to breakfasting healthily. Several ways to encourage healthy breakfasting habits were identified: (1) involvement of both children and parents, (2) role models inspiring change, and (3) interactive educational approaches. Experts perceived more problems and challenges in achieving healthy breakfast habits than did mothers and children. Lack of opportunity (according to the children and experts) and ability (according to the experts) were identified, although the motivation to eat a healthy breakfast was present. Predominant barriers are lack of time and nutritional knowledge. Overall, findings suggest educational and social marketing approaches as interventions to encourage healthy breakfast consumption. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. A Multidimensional Model for Child Maltreatment Prevention Readiness in Low- and Middle-Income Countries

    ERIC Educational Resources Information Center

    Mikton, Christopher; Mehra, Radhika; Butchart, Alexander; Addiss, David; Almuneef, Maha; Cardia, Nancy; Cheah, Irene; Chen, JingQi; Makoae, Mokhantso; Raleva, Marija

    2011-01-01

    The study's aim was to develop a multidimensional model for the assessment of child maltreatment prevention readiness in low- and middle-income countries. The model was developed based on a conceptual review of relevant existing models and approaches, an international expert consultation, and focus groups in six countries. The final model…

  9. Probability Modeling and Thinking: What Can We Learn from Practice?

    ERIC Educational Resources Information Center

    Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze

    2016-01-01

    Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…

  10. [Discriminant analysis to predict the clinical diagnosis of primary immunodeficiencies: a preliminary report].

    PubMed

    Murata, Chiharu; Ramírez, Ana Belén; Ramírez, Guadalupe; Cruz, Alonso; Morales, José Luis; Lugo-Reyes, Saul Oswaldo

    2015-01-01

    The features in a clinical history from a patient with suspected primary immunodeficiency (PID) direct the differential diagnosis through pattern recognition. PIDs are a heterogeneous group of more than 250 congenital diseases with increased susceptibility to infection, inflammation, autoimmunity, allergy and malignancy. Linear discriminant analysis (LDA) is a multivariate supervised classification method that sorts objects of study into groups by finding linear combinations of a number of variables. Our aim was to identify the features that best explain membership of pediatric PID patients in a defect group or disease. An analytic cross-sectional study was done with a pre-existing database of clinical and laboratory records from 168 patients with PID followed at the National Institute of Pediatrics during 1991-2012; this database was used to build linear discriminant models explaining each patient's membership in the different defect groups and in the most prevalent PIDs in our registry. After a preliminary run, only 30 features were included (4 demographic, 10 clinical, 10 laboratory, 6 germs), with which the training models were developed through a stepwise regression algorithm. We compared the automatic feature selection with a selection made by a human expert, and then assessed the diagnostic usefulness of the resulting models (sensitivity, specificity, prediction accuracy and kappa coefficient), with 95% confidence intervals. The models incorporated 6 to 14 features to explain membership of PID patients in the five most abundant defect groups (combined, antibody, well-defined, dysregulation and phagocytosis), and in the four most prevalent PID diseases (X-linked agammaglobulinemia, chronic granulomatous disease, common variable immunodeficiency and ataxia-telangiectasia). In practically all cases of feature selection the machine outperformed the human expert. 
Diagnosis prediction using the equations created had a global accuracy of 83 to 94%, with sensitivity of 60 to 100%, specificity of 83 to 95% and kappa coefficient of 0.37 to 0.76. In general, the selected features have clinical plausibility, and the practical advantage of utilizing only clinical attributes, infecting germs and routine lab results (blood cell counts and serum immunoglobulins). The performance of the model as a diagnostic tool was acceptable. The study's main limitations are the limited sample size and the lack of cross-validation. This is only the first step in the construction of a machine learning system, with a wider approach that includes a larger database and different methodologies, to assist the clinical diagnosis of primary immunodeficiencies.
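As a sketch of the underlying technique, a minimal two-class Fisher linear discriminant can be written as follows. This is a generic stand-in for illustration, not the stepwise, multi-group pipeline the authors describe.

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class Fisher linear discriminant.
    Returns the projection vector w and the midpoint threshold."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix.
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    Sw += 1e-6 * np.eye(Sw.shape[0])          # regularise for invertibility
    w = np.linalg.solve(Sw, m1 - m0)          # Fisher direction: Sw^-1 (m1 - m0)
    threshold = w @ (m0 + m1) / 2             # decision boundary at the midpoint
    return w, threshold

def predict(X, w, threshold):
    """Assign class 1 to samples projecting above the threshold."""
    return (X @ w > threshold).astype(int)
```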

  11. Classifying Facial Actions

    PubMed Central

    Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.

    2010-01-01

    The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284
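The Gabor wavelet representation that performed best is built from local filters like the one sketched below (a minimal real-valued Gabor kernel; the parameter values are illustrative, not those used in the paper).

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """A single real Gabor filter: cosine carrier modulated by a
    Gaussian envelope, oriented at angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotate coordinates
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

# A small bank over 4 orientations; convolving face images with such a
# bank yields the local, high-spatial-frequency features discussed above.
bank = [gabor_kernel(15, wavelength=6.0, theta=t, sigma=3.0)
        for t in np.linspace(0, np.pi, 4, endpoint=False)]
```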

  12. An automatic multi-atlas prostate segmentation in MRI using a multiscale representation and a label fusion strategy

    NASA Astrophysics Data System (ADS)

    Álvarez, Charlens; Martínez, Fabio; Romero, Eduardo

    2015-01-01

    Pelvic magnetic resonance images (MRI) are used in prostate cancer radiotherapy (RT) as part of radiation planning. Modern protocols require a manual delineation, a tedious and variable activity that may take about 20 minutes per patient, even for trained experts. That considerable time is an important workflow burden in most radiological services. Automatic or semi-automatic methods might improve efficiency by decreasing measurement times while conserving the required accuracy. This work presents a fully automatic atlas-based segmentation strategy that selects the most similar templates for a new MRI using a robust multi-scale SURF analysis. A new segmentation is then achieved by a linear combination of the selected templates, which are previously non-rigidly registered towards the new image. The proposed method shows reliable segmentations, obtaining an average Dice coefficient of 79% when compared with the expert manual segmentation, under a leave-one-out scheme with the training database.
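The template-selection-plus-fusion idea can be sketched in a few lines. Note that this sketch substitutes a sum-of-squared-differences similarity for the paper's multi-scale SURF matching and omits the non-rigid registration step entirely, so it is a toy illustration only.

```python
import numpy as np

def fuse_labels(target, atlas_images, atlas_labels, k=3):
    """Toy multi-atlas fusion: pick the k atlases most similar to the
    target, then take a similarity-weighted average of their label maps
    and threshold at 0.5."""
    ssd = np.array([np.sum((target - a) ** 2) for a in atlas_images])
    order = np.argsort(ssd)[:k]               # k most similar atlases
    w = 1.0 / (1.0 + ssd[order])              # more similar -> higher weight
    w /= w.sum()
    fused = sum(wi * atlas_labels[i] for wi, i in zip(w, order))
    return (fused >= 0.5).astype(int)
```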

  13. Compartmental and Data-Based Modeling of Cerebral Hemodynamics: Linear Analysis.

    PubMed

    Henley, B C; Shin, D C; Zhang, R; Marmarelis, V Z

    Compartmental and data-based modeling of cerebral hemodynamics are alternative approaches that utilize distinct model forms and have been employed in the quantitative study of cerebral hemodynamics. This paper examines the relation between a compartmental equivalent-circuit and a data-based input-output model of dynamic cerebral autoregulation (DCA) and CO2-vasomotor reactivity (DVR). The compartmental model is constructed as an equivalent-circuit utilizing putative first principles and previously proposed hypothesis-based models. The linear input-output dynamics of this compartmental model are compared with data-based estimates of the DCA-DVR process. This comparative study indicates that there are some qualitative similarities between the two-input compartmental model and experimental results.
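The data-based input-output side of such a comparison is often estimated as a linear finite-impulse-response model fit by least squares. The sketch below is a generic example of that estimation, not the authors' estimator.

```python
import numpy as np

def fit_fir(u, y, order=3):
    """Estimate a linear FIR input-output model
    y[n] ~ sum_k h[k] * u[n-k] by ordinary least squares."""
    T = len(u)
    # Each regressor row holds the current and past inputs [u[n], ..., u[n-order+1]].
    rows = [u[n - order + 1:n + 1][::-1] for n in range(order - 1, T)]
    Phi = np.array(rows)
    h, *_ = np.linalg.lstsq(Phi, y[order - 1:], rcond=None)
    return h
```

Fitting such a model to measured pressure and flow signals gives the data-based impulse response against which a compartmental equivalent-circuit's linear dynamics can be compared.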

  14. Knowledge-based diagnosis for aerospace systems

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.

    1988-01-01

    The need for automated diagnosis in aerospace systems and the approach of using knowledge-based systems are examined. Research issues in knowledge-based diagnosis which are important for aerospace applications are treated along with a review of recent relevant research developments in Artificial Intelligence. The design and operation of some existing knowledge-based diagnosis systems are described. The systems described and compared include the LES expert system for liquid oxygen loading at NASA Kennedy Space Center, the FAITH diagnosis system developed at the Jet Propulsion Laboratory, the PES procedural expert system developed at SRI International, the CSRL approach developed at Ohio State University, the StarPlan system developed by Ford Aerospace, the IDM integrated diagnostic model, and the DRAPhys diagnostic system developed at NASA Langley Research Center.

  15. Creating a test blueprint for a progress testing program: A paired-comparisons approach.

    PubMed

    von Bergmann, HsingChi; Childs, Ruth A

    2018-03-01

    Creating a new testing program requires the development of a test blueprint that determines how the items on each test form are distributed across possible content areas and practice domains. To achieve validity, the categories of a blueprint are typically based on the judgments of content experts. How experts' judgments are elicited and combined is important to the quality of the resulting test blueprints. Content experts in dentistry participated in a day-long faculty-wide workshop to discuss, refine, and confirm the categories and their relative weights. After reaching agreement on the categories and their definitions, the experts judged the relative importance of each category pair, registering their judgments anonymously using iClicker, an audience response system. Judgments were combined in two ways: a simple calculation that could be performed during the workshop, and a multidimensional scaling of the judgments performed later. The content experts were able to produce a set of relative weights using this approach. The multidimensional scaling yielded a three-dimensional model with the potential to provide deeper insights into the basis of the experts' judgments. The approach developed and demonstrated in this study can be applied across academic disciplines to elicit and combine content experts' judgments for the development of test blueprints.
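One simple way to turn paired-comparison judgments into category weights is to normalise each category's total preference share. The paper's in-workshop calculation is not specified, so this is only a plausible stand-in.

```python
import numpy as np

def blueprint_weights(pref):
    """Blueprint category weights from a paired-comparison matrix.
    pref[i, j] = share of experts preferring category i over category j.
    Row sums measure each category's total preference, normalised to 1."""
    p = np.asarray(pref, dtype=float)
    np.fill_diagonal(p, 0.0)      # a category is not compared with itself
    row = p.sum(axis=1)
    return row / row.sum()
```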

  16. Alternative Instructional Design Paradigms: What's Worth Discussing and What Isn't.

    ERIC Educational Resources Information Center

    Willis, Jerry

    1998-01-01

    Examines the paradigm debate over established (behavioral and cognitive) and alternative (constructivist) models of instructional design (ID). Discusses instructional strategies and principles, new terms versus new meaning, "straw man" and personalized arguments, expert-determined goals, research-based versus "brand-X" models,…

  17. Completing and Adapting Models of Biological Processes

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana; Hinchey, Michael G.; Raffelt, Harald; Rash, James L.; Rouff, Christopher A.; Steffen, Bernhard

    2006-01-01

    We present a learning-based method for model completion and adaptation, which is based on the combination of two approaches: 1) R2D2C, a technique for mechanically transforming system requirements via provably equivalent models to running code, and 2) automata learning-based model extrapolation. The intended impact of this new combination is to make model completion and adaptation accessible to experts of the field, like biologists or engineers. The principle is briefly illustrated by generating models of biological procedures concerning gene activities in the production of proteins, although the main application is going to concern autonomic systems for space exploration.

  18. Improved knowledge diffusion model based on the collaboration hypernetwork

    NASA Astrophysics Data System (ADS)

    Wang, Jiang-Pan; Guo, Qiang; Yang, Guang-Yong; Liu, Jian-Guo

    2015-06-01

    The process of absorbing knowledge has become an essential element of innovation in firms and of adapting to changes in the competitive environment. In this paper, we present an improved knowledge diffusion hypernetwork (IKDH) model based on the idea that knowledge spreads from the target node to all its neighbors in terms of the hyperedge and knowledge stock. We apply the average knowledge stock V(t), the variance σ²(t), and the variance coefficient c(t) to evaluate the performance of knowledge diffusion. Analyzing different knowledge diffusion modes, selection rules for highly knowledgeable nodes, hypernetwork sizes, and hypernetwork structures, the results show that the diffusion speed of the IKDH model is 3.64 times faster than that of the traditional knowledge diffusion (TKDH) model. Besides, it is three times faster to diffuse knowledge by randomly selecting "expert" nodes than by selecting large-hyperdegree nodes as "expert" nodes. Furthermore, either a closer network structure or a smaller network size results in faster knowledge diffusion.
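The diffusion rule and the evaluation metrics can be sketched directly from the description above. The absorption rate of 0.5 is an assumed parameter of this sketch, not a value taken from the paper.

```python
import statistics

def diffuse_step(stocks, hyperedges, target):
    """One step of the IKDH idea (toy version): the target node pushes
    knowledge to every neighbour sharing one of its hyperedges; a
    neighbour absorbs only if the target's stock exceeds its own."""
    new = dict(stocks)
    for edge in hyperedges:
        if target in edge:
            for node in edge:
                if node != target and stocks[target] > stocks[node]:
                    # Absorb half of the knowledge gap (assumed rate).
                    new[node] = stocks[node] + 0.5 * (stocks[target] - stocks[node])
    return new

def diffusion_metrics(stocks):
    """Average stock V(t), variance sigma^2(t), and variance coefficient c(t)."""
    vals = list(stocks.values())
    v = statistics.fmean(vals)
    var = statistics.pvariance(vals)
    return v, var, (var ** 0.5) / v if v else float("inf")
```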

  19. From complex questionnaire and interviewing data to intelligent Bayesian Network models for medical decision support

    PubMed Central

    Constantinou, Anthony Costa; Fenton, Norman; Marsh, William; Radlinski, Lukasz

    2016-01-01

    Objectives: 1) To develop a rigorous and repeatable method for building effective Bayesian network (BN) models for medical decision support from complex, unstructured and incomplete patient questionnaires and interviews that inevitably contain examples of repetitive, redundant and contradictory responses; 2) To exploit expert knowledge in the BN development since further data acquisition is usually not possible; 3) To ensure the BN model can be used for interventional analysis; 4) To demonstrate why using data alone to learn the model structure and parameters is often unsatisfactory even when extensive data is available. Method: The method is based on applying a range of recent BN developments targeted at helping experts build BNs given limited data. While most of the components of the method are based on established work, its novelty is that it provides a rigorous consolidated and generalised framework that addresses the whole life-cycle of BN model development. The method is based on two original and recent validated BN models in forensic psychiatry, known as DSVM-MSS and DSVM-P. Results: When employed with the same datasets, the DSVM-MSS demonstrated competitive to superior predictive performance (AUC scores 0.708 and 0.797) against the state-of-the-art (AUC scores ranging from 0.527 to 0.705), and the DSVM-P demonstrated superior predictive performance (cross-validated AUC score of 0.78) against the state-of-the-art (AUC scores ranging from 0.665 to 0.717). More importantly, the resulting models go beyond improving predictive accuracy and into usefulness for risk management purposes through intervention, and enhanced decision support in terms of answering complex clinical questions that are based on unobserved evidence. 
Conclusions: This development process is applicable to any application domain which involves large-scale decision analysis based on such complex information, rather than based on data with hard facts, and in conjunction with the incorporation of expert knowledge for decision support via intervention. The novelty extends to challenging the decision scientists to reason about building models based on what information is really required for inference, rather than based on what data is available and hence, forces decision scientists to use available data in a much smarter way. PMID:26830286
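The kind of query the conclusions emphasise, inference on unobserved evidence, is what distinguishes a BN from a purely data-driven predictor. A toy two-node network illustrates it; the structure and probabilities below are invented for illustration and are not taken from the DSVM models.

```python
# A toy two-node medical BN (Risk -> Symptom) with expert-elicited CPTs.
P_RISK = {True: 0.2, False: 0.8}
P_SYMPTOM = {True: {True: 0.9, False: 0.1},    # P(symptom | risk=True)
             False: {True: 0.3, False: 0.7}}   # P(symptom | risk=False)

def posterior_risk(symptom):
    """P(risk | symptom) by enumeration over the joint distribution."""
    joint = {r: P_RISK[r] * P_SYMPTOM[r][symptom] for r in (True, False)}
    z = sum(joint.values())
    return joint[True] / z
```

Observing the symptom raises the risk posterior above its prior, and the same machinery supports intervention-style "what if" questions that data-only models cannot answer.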

  20. From complex questionnaire and interviewing data to intelligent Bayesian network models for medical decision support.

    PubMed

    Constantinou, Anthony Costa; Fenton, Norman; Marsh, William; Radlinski, Lukasz

    2016-02-01

    (1) To develop a rigorous and repeatable method for building effective Bayesian network (BN) models for medical decision support from complex, unstructured and incomplete patient questionnaires and interviews that inevitably contain examples of repetitive, redundant and contradictory responses; (2) To exploit expert knowledge in the BN development since further data acquisition is usually not possible; (3) To ensure the BN model can be used for interventional analysis; (4) To demonstrate why using data alone to learn the model structure and parameters is often unsatisfactory even when extensive data is available. The method is based on applying a range of recent BN developments targeted at helping experts build BNs given limited data. While most of the components of the method are based on established work, its novelty is that it provides a rigorous consolidated and generalised framework that addresses the whole life-cycle of BN model development. The method is based on two original and recent validated BN models in forensic psychiatry, known as DSVM-MSS and DSVM-P. When employed with the same datasets, the DSVM-MSS demonstrated competitive to superior predictive performance (AUC scores 0.708 and 0.797) against the state-of-the-art (AUC scores ranging from 0.527 to 0.705), and the DSVM-P demonstrated superior predictive performance (cross-validated AUC score of 0.78) against the state-of-the-art (AUC scores ranging from 0.665 to 0.717). More importantly, the resulting models go beyond improving predictive accuracy and into usefulness for risk management purposes through intervention, and enhanced decision support in terms of answering complex clinical questions that are based on unobserved evidence. 
This development process is applicable to any application domain which involves large-scale decision analysis based on such complex information, rather than based on data with hard facts, and in conjunction with the incorporation of expert knowledge for decision support via intervention. The novelty extends to challenging the decision scientists to reason about building models based on what information is really required for inference, rather than based on what data is available and hence, forces decision scientists to use available data in a much smarter way. Copyright © 2016 Elsevier B.V. All rights reserved.
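At its core, answering clinical queries from unobserved evidence in a Bayesian network reduces to Bayesian updating. A minimal two-node illustration (disease → symptom, by enumeration, with invented probabilities; nothing like the full DSVM structures) might look like this:

```python
def posterior_given_evidence(prior_d, p_s_given_d, p_s_given_not_d):
    """Exact inference in a two-node BN (Disease -> Symptom) by
    enumeration: returns P(Disease=true | Symptom=true)."""
    joint_true = prior_d * p_s_given_d            # P(D=1, S=1)
    joint_false = (1 - prior_d) * p_s_given_not_d  # P(D=0, S=1)
    return joint_true / (joint_true + joint_false)

# With a 1% prior, 90% sensitivity and a 5% false-positive rate,
# a positive symptom still leaves the posterior well below 50%.
print(posterior_given_evidence(0.01, 0.9, 0.05))
```

The same enumeration principle scales (exponentially) to larger networks, which is why real BN tools use factored inference instead.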

  1. New insights into the classification and nomenclature of cortical GABAergic interneurons.

    PubMed

    DeFelipe, Javier; López-Cruz, Pedro L; Benavides-Piccione, Ruth; Bielza, Concha; Larrañaga, Pedro; Anderson, Stewart; Burkhalter, Andreas; Cauli, Bruno; Fairén, Alfonso; Feldmeyer, Dirk; Fishell, Gord; Fitzpatrick, David; Freund, Tamás F; González-Burgos, Guillermo; Hestrin, Shaul; Hill, Sean; Hof, Patrick R; Huang, Josh; Jones, Edward G; Kawaguchi, Yasuo; Kisvárday, Zoltán; Kubota, Yoshiyuki; Lewis, David A; Marín, Oscar; Markram, Henry; McBain, Chris J; Meyer, Hanno S; Monyer, Hannah; Nelson, Sacha B; Rockland, Kathleen; Rossier, Jean; Rubenstein, John L R; Rudy, Bernardo; Scanziani, Massimo; Shepherd, Gordon M; Sherwood, Chet C; Staiger, Jochen F; Tamás, Gábor; Thomson, Alex; Wang, Yun; Yuste, Rafael; Ascoli, Giorgio A

    2013-03-01

    A systematic classification and accepted nomenclature of neuron types is much needed but is currently lacking. This article describes a possible taxonomical solution for classifying GABAergic interneurons of the cerebral cortex based on a novel, web-based interactive system that allows experts to classify neurons with pre-determined criteria. Using Bayesian analysis and clustering algorithms on the resulting data, we investigated the suitability of several anatomical terms and neuron names for cortical GABAergic interneurons. Moreover, we show that supervised classification models could automatically categorize interneurons in agreement with experts' assignments. These results demonstrate a practical and objective approach to the naming, characterization and classification of neurons based on community consensus.

  2. Distributed semantic networks and CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Rodriguez, Tony

    1991-01-01

    Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes, interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge based, cooperative decision making model utilizing both rule based and procedural experts.

  3. Predicting disease progression from short biomarker series using expert advice algorithm

    NASA Astrophysics Data System (ADS)

    Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki

    2015-05-01

    Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, there is no method so far that estimates the patient state from very short time-series of a biomarker for making diagnosis and/or prognosis by employing the information of previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend a machine-learning framework of ``prediction with expert advice'' to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model predicted well the individual biomarker series of patients with prostate cancer that are used as clinical samples.
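The "prediction with expert advice" framework that the authors extend can be illustrated by its standard baseline, the exponentially weighted average forecaster. The sketch below (plain Python, squared loss, illustrative learning rate) shows the baseline only, not the paper's unstable-dynamics extension:

```python
import math

def weighted_majority_forecast(expert_preds, outcomes, eta=0.5):
    """Exponentially weighted average forecaster.

    expert_preds[t][i] is expert i's prediction at step t; outcomes[t]
    is the realized value.  Returns the forecaster's predictions and
    the final expert weights (squared loss)."""
    n_experts = len(expert_preds[0])
    w = [1.0] * n_experts
    forecasts = []
    for preds, y in zip(expert_preds, outcomes):
        total = sum(w)
        forecasts.append(sum(wi * p for wi, p in zip(w, preds)) / total)
        # Multiplicative update: down-weight experts with large loss.
        w = [wi * math.exp(-eta * (p - y) ** 2)
             for wi, p in zip(w, preds)]
    return forecasts, w
```

In the disease-progression setting, each "expert" would correspond to a model fitted to a previous patient's biomarker series, and the weights track which previous patients best match the current one.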

  4. Predicting disease progression from short biomarker series using expert advice algorithm.

    PubMed

    Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki

    2015-05-20

    Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, there is no method so far that estimates the patient state from very short time-series of a biomarker for making diagnosis and/or prognosis by employing the information of previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend a machine-learning framework of "prediction with expert advice" to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model predicted well the individual biomarker series of patients with prostate cancer that are used as clinical samples.

  5. A novel AIDS/HIV intelligent medical consulting system based on expert systems.

    PubMed

    Ebrahimi, Alireza Pour; Toloui Ashlaghi, Abbas; Mahdavy Rad, Maryam

    2013-01-01

The purpose of this paper is to propose a novel intelligent model for AIDS/HIV data based on an expert system, and to use it to develop an intelligent medical consulting system for AIDS/HIV. In this descriptive research, 752 frequently asked questions (FAQs) about AIDS/HIV were gathered from numerous websites about the disease. To perform the data mining and extract the intelligent model, the six stages of the CRISP-DM method were completed for the FAQs: business understanding, data understanding, data preparation, modelling, evaluation and deployment. The C5.0 tree classification algorithm was used for modelling. Also, the rational unified process (RUP) was used to develop the web-based medical consulting software; its stages are inception, elaboration, construction and transition. The developed intelligent model has been used in the infrastructure of the software: based on the client's inquiry and keywords, related FAQs are displayed to the client according to rank. FAQs' ranks are determined gradually from how often clients read them. Based on the displayed FAQs, test and entertainment links are also displayed. The accuracy of the AIDS/HIV intelligent web-based medical consulting system is estimated to be 78.76%. AIDS/HIV medical consulting systems have been developed using an intelligent infrastructure. Being equipped with an intelligent model, providing consulting services on systematic textual data, and providing side services based on the client's activities make the implemented system unique. The research has been approved by the Iranian Ministry of Health and Medical Education as practical.

  6. Using collective expert judgements to evaluate quality measures of mass spectrometry images

    PubMed Central

    Palmer, Andrew; Ovchinnikova, Ekaterina; Thuné, Mikael; Lavigne, Régis; Guével, Blandine; Dyatlov, Andrey; Vitek, Olga; Pineau, Charles; Borén, Mats; Alexandrov, Theodore

    2015-01-01

    Motivation: Imaging mass spectrometry (IMS) is a maturing technique of molecular imaging. Confidence in the reproducible quality of IMS data is essential for its integration into routine use. However, the predominant method for assessing quality is visual examination, a time-consuming, unstandardized and non-scalable approach. So far, the problem of assessing quality has only been marginally addressed, and existing measures do not account for the spatial information of IMS data. Importantly, no approach exists for unbiased evaluation of potential quality measures. Results: We propose a novel approach for evaluating potential measures by creating a gold-standard set using collective expert judgements, upon which we evaluated image-based measures. To produce a gold standard, we engaged 80 IMS experts, each rating the relative quality between 52 pairs of ion images from MALDI-TOF IMS datasets of rat brain coronal sections. Experts’ optional feedback on their expertise, the task and the survey showed that (i) they had diverse backgrounds and sufficient expertise, (ii) the task was properly understood, and (iii) the survey was comprehensible. A moderate inter-rater agreement was achieved, with a Krippendorff’s alpha of 0.5. A gold-standard set of 634 pairs of images with accompanying ratings was constructed and showed a high agreement of 0.85. Eight families of potential measures with a range of parameters and statistical descriptors, 143 in total, were evaluated. Both signal-to-noise and spatial chaos-based measures performed highly, with a correlation of 0.7 to 0.9 with the gold-standard ratings. Moreover, we showed that a composite measure with linear coefficients (trained on the gold standard with regularized least squares optimization and lasso) showed a strong linear correlation of 0.94 and an accuracy of 0.98 in predicting which image in a pair was of higher quality. 
Availability and implementation: The anonymized data collected from the survey and the Matlab source code for data processing can be found at: https://github.com/alexandrovteam/IMS_quality. Contact: theodore.alexandrov@embl.de PMID:26072506
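The reported 0.98 accuracy refers to predicting which image in a gold-standard pair was rated higher by the experts. A minimal sketch of that evaluation loop is below; the function and argument names are illustrative, not taken from the released Matlab code:

```python
def pairwise_accuracy(measure, pairs, expert_prefers_first):
    """Fraction of gold-standard pairs on which a quality measure ranks
    the expert-preferred image higher.

    `pairs` is a list of (img_a, img_b) inputs accepted by `measure`;
    `expert_prefers_first[k]` is True when experts rated img_a higher."""
    correct = 0
    for (a, b), pref_a in zip(pairs, expert_prefers_first):
        if (measure(a) > measure(b)) == pref_a:
            correct += 1
    return correct / len(pairs)
```

Any candidate measure (signal-to-noise, spatial chaos, or a trained composite) can be dropped in as `measure` and scored against the same gold standard.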

  7. ESKAPE/CF: A Knowledge Acquisition Tool for Expert Systems Using Cognitive Feedback

    DTIC Science & Technology

    1991-03-01

    Naval Postgraduate School, Monterey, California (DTIC accession AD-A241 815). Thesis: ESKAPE/CF, a knowledge acquisition tool for expert systems using cognitive feedback. The work presents a knowledge acquisition tool using cognitive feedback (ESKAPE/CF), based on Lens model techniques which have demonstrated effectiveness in capturing policy knowledge.

  8. Sorting Through the Safety Data Haystack: Using Machine Learning to Identify Individual Case Safety Reports in Social-Digital Media.

    PubMed

    Comfort, Shaun; Perera, Sujan; Hudson, Zoe; Dorrell, Darren; Meireis, Shawman; Nagarajan, Meenakshi; Ramakrishnan, Cartic; Fine, Jennifer

    2018-06-01

    There is increasing interest in social digital media (SDM) as a data source for pharmacovigilance activities; however, SDM is considered a low information content data source for safety data. Given that pharmacovigilance itself operates in a high-noise, lower-validity environment without objective 'gold standards' beyond process definitions, the introduction of large volumes of SDM into the pharmacovigilance workflow has the potential to exacerbate issues with limited manual resources to perform adverse event identification and processing. Recent advances in medical informatics have resulted in methods for developing programs which can assist human experts in the detection of valid individual case safety reports (ICSRs) within SDM. In this study, we developed rule-based and machine learning (ML) models for classifying ICSRs from SDM and compared their performance with that of human pharmacovigilance experts. We used a random sampling from a collection of 311,189 SDM posts that mentioned Roche products and brands in combination with common medical and scientific terms sourced from Twitter, Tumblr, Facebook, and a spectrum of news media blogs to develop and evaluate three iterations of an automated ICSR classifier. The ICSR classifier models consisted of sub-components to annotate the relevant ICSR elements and a component to make the final decision on the validity of the ICSR. Agreement with human pharmacovigilance experts was chosen as the preferred performance metric and was evaluated by calculating the Gwet AC1 statistic (gKappa). The best performing model was tested against the Roche global pharmacovigilance expert using a blind dataset and put through a time test of the full 311,189-post dataset. During this effort, the initial strict rule-based approach to ICSR classification resulted in a model with an accuracy of 65% and a gKappa of 46%. Adding an ML-based adverse event annotator improved the accuracy to 74% and gKappa to 60%. 
This was further improved by the addition of an additional ML ICSR detector. On a blind test set of 2500 posts, the final model demonstrated a gKappa of 78% and an accuracy of 83%. In the time test, it took the final model 48 h to complete a task that would have taken an estimated 44,000 h for human experts to perform. The results of this study indicate that an effective and scalable solution to the challenge of ICSR detection in SDM includes a workflow using an automated ML classifier to identify likely ICSRs for further human SME review.
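The Gwet AC1 (gKappa) agreement statistic used as the performance metric can be computed, for two raters, from the published formula as sketched below; this is a minimal illustration, not the study's code:

```python
def gwet_ac1(ratings_a, ratings_b, categories=(0, 1)):
    """Gwet's AC1 chance-corrected agreement for two raters.

    AC1 = (pa - pe) / (1 - pe), with chance agreement
    pe = (1/(Q-1)) * sum_c pi_c * (1 - pi_c), where pi_c is the mean
    marginal proportion of category c across the two raters."""
    n = len(ratings_a)
    pa = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    pe = 0.0
    for c in categories:
        pi_c = (ratings_a.count(c) + ratings_b.count(c)) / (2 * n)
        pe += pi_c * (1 - pi_c)
    pe /= (len(categories) - 1)
    return (pa - pe) / (1 - pe)
```

Unlike Cohen's kappa, AC1 stays well behaved when one category dominates, which matters for ICSR detection where valid reports are rare.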

  9. Impact of Diagnosticity on the Adequacy of Models for Cognitive Diagnosis under a Linear Attribute Structure: A Simulation Study

    ERIC Educational Resources Information Center

    de La Torre, Jimmy; Karelitz, Tzur M.

    2009-01-01

    Compared to unidimensional item response models (IRMs), cognitive diagnostic models (CDMs) based on latent classes represent examinees' knowledge and item requirements using discrete structures. This study systematically examines the viability of retrofitting CDMs to IRM-based data with a linear attribute structure. The study utilizes a procedure…

  10. Women's Endorsement of Models of Sexual Response: Correlates and Predictors.

    PubMed

    Nowosielski, Krzysztof; Wróbel, Beata; Kowalczyk, Robert

    2016-02-01

    Few studies have investigated endorsement of female sexual response models, and no single model has been accepted as a normative description of women's sexual response. The aim of the study was to establish how women from a population-based sample endorse current theoretical models of the female sexual response--the linear models and circular model (partial and composite Basson models)--as well as predictors of endorsement. Accordingly, 174 heterosexual women aged 18-55 years were included in a cross-sectional study: 74 women diagnosed with female sexual dysfunction (FSD) based on DSM-5 criteria and 100 non-dysfunctional women. The description of sexual response models was used to divide subjects into four subgroups: linear (Masters-Johnson and Kaplan models), circular (partial Basson model), mixed (linear and circular models in similar proportions, reflective of the composite Basson model), and a different model. Women were asked to choose which of the models best described their pattern of sexual response and how frequently they engaged in each model. Results showed that 28.7% of women endorsed the linear models, 19.5% the partial Basson model, 40.8% the composite Basson model, and 10.9% a different model. Women with FSD endorsed the partial Basson model and a different model more frequently than did non-dysfunctional controls. Individuals who were dissatisfied with a partner as a lover were more likely to endorse a different model. Based on the results, we concluded that the majority of women endorsed a mixed model combining the circular response with the possibility of an innate desire triggering a linear response. Further, relationship difficulties, not FSD, predicted model endorsement.

  11. Analysis on 3RWB model (Reduce, reuse, recycle, and waste bank) in comprehensive waste management toward community-based zero waste

    NASA Astrophysics Data System (ADS)

    Affandy, Nur Azizah; Isnaini, Enik; Laksono, Arif Budi

    2017-06-01

    Waste management has become a serious issue in Indonesia. Waste production in Lamongan Regency is increasing linearly with population growth and current community activities, creating a gap between waste production and waste management. It is a critical problem that should be solved immediately. In response, the Government of Lamongan Regency has enacted a new waste-management policy through a program named Lamongan Green and Clean (LGC). The collected data showed that "wet waste" or "organic waste" was approximately 63% of total domestic waste; under such conditions the waste can be expected to decompose quite quickly. Observation showed that the generated waste was approximately 0.25 kg/person/day. Meanwhile, the population of Tumenggungan Village, Lamongan (data obtained from the Lamongan district monograph, 2012) was 4651 people, so the total waste in Lamongan was estimated at approximately 0.25 kg/person/day x 4651 people = 930 kg/day. Within the 3RWB model, several stages have to be conducted. In the planning stage, self-awareness in selecting and managing waste is promoted among the communities, motivated by its potential benefit; community awareness of waste management grew significantly. In the socialization stage, village staff, environmental experts, and policymakers each bear a significant role in disseminating awareness among the people. In the implementation phase, waste management with the 3RWB model is promoted by applying it in the community, from waste selection and management through to the sale of recycled products via the waste bank. In the evaluation stage, village managers, environmental experts, and waste managers are expected to regularly supervise and evaluate the whole waste-management activity.

  12. Expert elicitation of population-level effects of disturbance

    USGS Publications Warehouse

    Fleishman, Erica; Burgman, Mark; Runge, Michael C.; Schick, Robert S; Krauss, Scott; Popper, Arthur N.; Hawkins, Anthony

    2016-01-01

    Expert elicitation is a rigorous method for synthesizing expert knowledge to inform decision making and is reliable and practical when field data are limited. We evaluated the feasibility of applying expert elicitation to estimate population-level effects of disturbance on marine mammals. Diverse experts estimated parameters related to mortality and sublethal injury of North Atlantic right whales (Eubalaena glacialis). We are now eliciting expert knowledge on the movement of right whales among geographic regions to parameterize a spatial model of health. Expert elicitation complements methods such as simulation models or extrapolations from other species, sometimes with greater accuracy and less uncertainty.

  13. Rotary Wing Propulsion Specialists' Meeting, Williamsburg, VA, Nov. 13-15, 1990, Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1990-01-01

    Topics presented include sound diffraction at a sharp trailing edge in a supersonic flow, the MTR390 turboshaft development program, a progress report on the electrostatic engine monitoring system, some corrosion-resistant magnesium alloys, handling severe inlet conditions in aircraft fuel pumps, and an overview of inlet protection systems for Army aircraft. Also presented are the advanced control system architecture for the T800 engine, an expert system to perform on-line controller restructuring for abrupt model changes, an enhanced APU for the H-60 series and SH-2G helicopters, and a linear theory of the North Atlantic blocking during January 1979.

  14. Assessing experience in the deliberate practice of running using a fuzzy decision-support system

    PubMed Central

    Roveri, Maria Isabel; Manoel, Edison de Jesus; Onodera, Andrea Naomi; Ortega, Neli R. S.; Tessutti, Vitor Daniel; Vilela, Emerson; Evêncio, Nelson

    2017-01-01

    The judgement of skill experience and its levels is ambiguous though it is crucial for decision-making in sport sciences studies. We developed a fuzzy decision support system to classify experience of non-elite distance runners. Two Mamdani subsystems were developed based on expert running coaches’ knowledge. In the first subsystem, the linguistic variables of training frequency and volume were combined and the output defined the quality of running practice. The second subsystem yielded the level of running experience from the combination of the first subsystem output with the number of competitions and practice time. The model results were highly consistent with the judgment of three expert running coaches (r>0.88, p<0.001) and also with five other expert running coaches (r>0.86, p<0.001). From the expert’s knowledge and the fuzzy model, running experience is beyond the so-called "10-year rule" and depends not only on practice time, but on the quality of practice (training volume and frequency) and participation in competitions. The fuzzy rule-based model was very reliable, valid, deals with the marked ambiguities inherent in the judgment of experience and has potential applications in research, sports training, and clinical settings. PMID:28817655
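A Mamdani system of the kind described combines rule firing strengths (min for AND, max for OR) with centroid defuzzification. The toy two-rule sketch below shows the mechanics only; the membership functions and thresholds are invented for illustration, not the coaches' elicited ones:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def practice_quality(freq_per_week, volume_km):
    """Two-rule Mamdani sketch on a 0..10 quality scale:
    IF freq is high AND volume is high THEN quality is good;
    IF freq is low OR volume is low THEN quality is poor."""
    hi_f = tri(freq_per_week, 2, 7, 12)
    lo_f = tri(freq_per_week, -5, 0, 5)
    hi_v = tri(volume_km, 20, 60, 100)
    lo_v = tri(volume_km, -40, 0, 40)
    fire_good = min(hi_f, hi_v)          # AND -> min
    fire_poor = max(lo_f, lo_v)          # OR  -> max
    grid = [i * 0.5 for i in range(21)]  # output universe 0..10
    num = den = 0.0
    for q in grid:
        # Clip each output set by its rule strength, aggregate by max.
        mu = max(min(fire_good, tri(q, 5, 10, 15)),
                 min(fire_poor, tri(q, -5, 0, 5)))
        num += q * mu
        den += mu
    return num / den if den else 5.0     # discrete centroid
```

Chaining two such subsystems, as the authors do, just feeds the defuzzified output of the first in as a crisp input to the second.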

  15. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses

    PubMed Central

    Wong, Tony E.; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections. PMID:29287095
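As a rough illustration of constraining a prior with an expert assessment, one can keep only parameter draws whose model output falls inside an expert-assessed interval. This simple accept/reject scheme is a crude stand-in for the probabilistic inversion the authors actually use, and all names here are illustrative:

```python
import random

def expert_constrained_prior(model, expert_low, expert_high,
                             prior_sampler, n_draws=10000, seed=1):
    """Draw parameters from a broad prior and keep those whose model
    output lies inside an expert-assessed interval, yielding an
    empirical prior consistent with the assessment."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_draws):
        theta = prior_sampler(rng)
        if expert_low <= model(theta) <= expert_high:
            kept.append(theta)
    return kept
```

The kept sample can then be confronted with observational data in a subsequent Bayesian step, mirroring the two-stage structure described in the abstract.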

  16. Dengue Disease Risk Mental Models in the City of Dhaka, Bangladesh: Juxtapositions and Gaps Between the Public and Experts.

    PubMed

    Dhar-Chowdhury, Parnali; Haque, C Emdad; Driedger, S Michelle

    2016-05-01

    Worldwide, more than 50 million cases of dengue fever are reported every year in at least 124 countries, and it is estimated that approximately 2.5 billion people are at risk for dengue infection. In Bangladesh, the recurrence of dengue has become a growing public health threat. Notably, knowledge and perceptions of dengue disease risk, particularly among the public, are not well understood. Recognizing the importance of assessing risk perception, we adopted a comparative approach to examine a generic methodology to assess diverse sets of beliefs related to dengue disease risk. Our study mapped existing knowledge structures regarding the risk associated with dengue virus, its vector (Aedes mosquitoes), water container use, and human activities in the city of Dhaka, Bangladesh. "Public mental models" were developed from interviews and focus group discussions with diverse community groups; "expert mental models" were formulated based on open-ended discussions with experts in the pertinent fields. A comparative assessment of the public's and experts' knowledge and perception of dengue disease risk has revealed significant gaps in the perception of: (a) disease risk indicators and measurements; (b) disease severity; (c) control of disease spread; and (d) the institutions responsible for intervention. This assessment further identifies misconceptions in public perception regarding: (a) causes of dengue disease; (b) dengue disease symptoms; (c) dengue disease severity; (d) dengue vector ecology; and (e) dengue disease transmission. Based on these results, recommendations are put forward for improving communication of dengue risk and practicing local community engagement and knowledge enhancement in Bangladesh. © 2015 Society for Risk Analysis.

  17. Evaluation of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) and the Collected Data.

    PubMed

    Huff, Andrew G; Hodges, James S; Kennedy, Shaun P; Kircher, Amy

    2015-08-01

    To protect and secure food resources for the United States, it is crucial to have a method to compare food systems' criticality. In 2007, the U.S. government funded development of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) to determine which food and agriculture systems were most critical to the nation. FASCAT was developed in a collaborative process involving government officials and food industry subject matter experts (SMEs). After development, data were collected using FASCAT to quantify threats, vulnerabilities, consequences, and the impacts on the United States from failure of evaluated food and agriculture systems. To examine FASCAT's utility, linear regression models were used to determine: (1) which groups of questions posed in FASCAT were better predictors of cumulative criticality scores; (2) whether the items included in FASCAT's criticality method or the smaller subset of FASCAT items included in DHS's risk analysis method predicted similar criticality scores. Akaike's information criterion was used to determine which regression models best described criticality, and a mixed linear model was used to shrink estimates of criticality for individual food and agriculture systems. The results indicated that: (1) some of the questions used in FASCAT strongly predicted food or agriculture system criticality; (2) the FASCAT criticality formula was a stronger predictor of criticality compared to the DHS risk formula; (3) the cumulative criticality formula predicted criticality more strongly than weighted criticality formula; and (4) the mixed linear regression model did not change the rank-order of food and agriculture system criticality to a large degree. © 2015 Society for Risk Analysis.
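Comparing linear regression models with Akaike's information criterion, as done here, rests on AIC = n ln(RSS/n) + 2k under a Gaussian likelihood. A minimal sketch with one-predictor OLS follows; the parameter counts k are a modelling choice, and the data are invented:

```python
import math

def fit_simple_ols(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, rss)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    rss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return a, b, rss

def aic(rss, n, k):
    """Gaussian-likelihood AIC; k counts all fitted parameters
    (regression coefficients plus the error variance)."""
    return n * math.log(rss / n) + 2 * k
```

Fitting competing criticality formulas to the same scores and picking the lowest AIC is exactly the kind of comparison the abstract describes.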

  18. Users manual for an expert system (HSPEXP) for calibration of the hydrological simulation program; Fortran

    USGS Publications Warehouse

    Lumb, A.M.; McCammon, R.B.; Kittle, J.L.

    1994-01-01

    Expert system software was developed to assist less experienced modelers with calibration of a watershed model and to facilitate the interaction between the modeler and the modeling process not provided by mathematical optimization. A prototype was developed with artificial intelligence software tools, a knowledge engineer, and two domain experts. The manual procedures used by the domain experts were identified and the prototype was then coded by the knowledge engineer. The expert system consists of a set of hierarchical rules designed to guide the calibration of the model through a systematic evaluation of model parameters. When the prototype was completed and tested, it was rewritten for portability and operational use and was named HSPEXP. The watershed model Hydrological Simulation Program--Fortran (HSPF) is used in the expert system. This report is the users manual for HSPEXP and contains a discussion of the concepts and detailed steps and examples for using the software. The system has been tested on watersheds in the States of Washington and Maryland, and the system correctly identified the model parameters to be adjusted and the adjustments led to improved calibration.

  19. Controls/CFD Interdisciplinary Research Software Generates Low-Order Linear Models for Control Design From Steady-State CFD Results

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    1997-01-01

    The NASA Lewis Research Center is developing analytical methods and software tools to create a bridge between the controls and computational fluid dynamics (CFD) disciplines. Traditionally, control design engineers have used coarse nonlinear simulations to generate information for the design of new propulsion system controls. However, such traditional methods are not adequate for modeling the propulsion systems of complex, high-speed vehicles like the High Speed Civil Transport. To properly model the relevant flow physics of high-speed propulsion systems, one must use simulations based on CFD methods. Such CFD simulations have become useful tools for engineers that are designing propulsion system components. The analysis techniques and software being developed as part of this effort are an attempt to evolve CFD into a useful tool for control design as well. One major aspect of this research is the generation of linear models from steady-state CFD results. CFD simulations, often used during the design of high-speed inlets, yield high resolution operating point data. Under a NASA grant, the University of Akron has developed analytical techniques and software tools that use these data to generate linear models for control design. The resulting linear models have the same number of states as the original CFD simulation, so they are still very large and computationally cumbersome. Model reduction techniques have been successfully applied to reduce these large linear models by several orders of magnitude without significantly changing the dynamic response. The result is an accurate, easy to use, low-order linear model that takes less time to generate than those generated by traditional means. The development of methods for generating low-order linear models from steady-state CFD is most complete at the one-dimensional level, where software is available to generate models with different kinds of input and output variables. 
One-dimensional methods have been extended somewhat so that linear models can also be generated from two- and three-dimensional steady-state results. Standard techniques are adequate for reducing the order of one-dimensional CFD-based linear models. However, reduction of linear models based on two- and three-dimensional CFD results is complicated by very sparse, ill-conditioned matrices. Some novel approaches are being investigated to solve this problem.

  20. A General Accelerated Degradation Model Based on the Wiener Process.

    PubMed

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
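A time-scale-transformed Wiener degradation path of the general form X(t) = μΛ(t) + σB(Λ(t)) can be simulated as below. Λ(t) = t^b is one common transformation (b = 1 recovers the linear path) and is used here only as an illustrative stand-in for the paper's general model:

```python
import math
import random

def simulate_wiener_degradation(mu, sigma, b, t_grid, seed=0):
    """Simulate one path of X(t) = mu*L(t) + sigma*B(L(t)) with the
    nonlinear time scale L(t) = t**b, sampled at the times in t_grid."""
    rng = random.Random(seed)
    path, x, prev_L = [], 0.0, 0.0
    for t in t_grid:
        L = t ** b
        dL = L - prev_L
        # Independent Gaussian increment on the transformed time scale.
        x += mu * dL + sigma * math.sqrt(dL) * rng.gauss(0.0, 1.0)
        path.append(x)
        prev_L = L
    return path
```

Averaging many such paths recovers the mean degradation μΛ(t), which is what ADT inference exploits when estimating μ, σ and the acceleration relationship from stress-level data.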

  2. Issues and recent advances in optimal experimental design for site investigation (Invited)

    NASA Astrophysics Data System (ADS)

    Nowak, W.

    2013-12-01

    This presentation provides an overview of issues and recent advances in model-based experimental design for site exploration. The issues and advances addressed are (1) how to provide an adequate envelope to prior uncertainty, (2) how to define the information needs in a task-oriented manner, (3) how to measure the expected impact of a data set that is not yet available but only planned to be collected, and (4) how best to perform the optimization of the data collection plan. Among other shortcomings of the state of the art, it is identified that there is a lack of demonstrator studies in which exploration schemes based on expert judgment are compared to exploration schemes obtained by optimal experimental design. Such studies will be necessary to address the often-voiced concern that experimental design is an academic exercise with little improvement potential over the well-trained gut feeling of field experts. When addressing this concern, a specific focus has to be given to uncertainty in model structure, parameterizations and parameter values, and to related surprises that data often bring about in field studies, but never in synthetic-data-based studies. The background of this concern is that, initially, conceptual uncertainty may be so large that surprises are the rule rather than the exception. In such situations, field experts have a large body of experience in handling the surprises, and expert judgment may be good enough compared to meticulous optimization based on a model that is about to be falsified by the incoming data. In order to meet surprises adequately and adapt to them, there needs to be a sufficient representation of conceptual uncertainty within the models used. Also, it is useless to optimize an entire design under this initial range of uncertainty. Thus, the goal setting of the optimization should include the objective to reduce conceptual uncertainty. 
A possible way out is to upgrade experimental design theory towards real-time interaction with the ongoing site investigation, such that surprises in the data are immediately accounted for to restrict the conceptual uncertainty and update the optimization of the plan.

  3. Expert anticipatory skill in striking sports: a review and a model.

    PubMed

    Müller, Sean; Abernethy, Bruce

    2012-06-01

    Expert performers in striking sports can hit objects moving at high speed with incredible precision. Exceptionally well developed anticipation skills are necessary to cope with the severe constraints on interception. In this paper we provide a review of the empirical evidence regarding expert interception in striking sports and propose a preliminary model of expert anticipation. Central to the review and the model is the notion that the visual information used to guide the sequential phases of the striking action is systematically different between experts and nonexperts. Knowing the factors that contribute to expert anticipation, and how anticipation may guide skilled performance in striking sports, has practical implications for assessment and training across skill levels.

  4. Guidance, navigation, and control subsystem equipment selection algorithm using expert system methods

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1991-01-01

    Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the spacecraft design and cost model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The knowledge engineering system (KES) expert system development tool was used to implement a smarter equipment selection algorithm than is currently achievable through the use of a standard database system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are described, and once the details of the design are characterized, an example of its implementation is demonstrated.

  5. Available pressure amplitude of linear compressor based on phasor triangle model

    NASA Astrophysics Data System (ADS)

    Duan, C. X.; Jiang, X.; Zhi, X. Q.; You, X. K.; Qiu, L. M.

    2017-12-01

    The linear compressor for cryocoolers possesses the advantages of long-life operation, high efficiency, low vibration and compact structure. It is significant to study the match mechanisms between the compressor and the cold finger, which determine the working efficiency of the cryocooler. However, the output characteristics of the linear compressor are complicated, since they are affected by many interacting parameters. The existing matching methods are simplified and mainly focus on the compressor efficiency and output acoustic power, while neglecting the important output parameter of pressure amplitude. In this study, a phasor triangle model based on an analysis of the forces on the piston is proposed. It can be used to predict not only the output acoustic power and the efficiency, but also the pressure amplitude of the linear compressor. Calculated results agree well with the measurement results of the experiment. With this phasor triangle model, the theoretical maximum output pressure amplitude of the linear compressor can be calculated simply from a known charging pressure and operating frequency. Compared with the mechanical and electrical model of the linear compressor, the new model provides an intuitive understanding of the match mechanism with a faster computational process. The model can also explain the experimentally observed proportional relationship between the output pressure amplitude and the piston displacement. By further model analysis, this phenomenon is identified as an expression of an unmatched design of the compressor. The phasor triangle model may provide an alternative method for compressor design and matching with the cold finger.

  6. Development and validation of surgical training tool: cystectomy assessment and surgical evaluation (CASE) for robot-assisted radical cystectomy for men.

    PubMed

    Hussein, Ahmed A; Sexton, Kevin J; May, Paul R; Meng, Maxwell V; Hosseini, Abolfazl; Eun, Daniel D; Daneshmand, Siamak; Bochner, Bernard H; Peabody, James O; Abaza, Ronney; Skinner, Eila C; Hautmann, Richard E; Guru, Khurshid A

    2018-04-13

    We aimed to develop a structured scoring tool: cystectomy assessment and surgical evaluation (CASE) that objectively measures and quantifies performance during robot-assisted radical cystectomy (RARC) for men. A multinational 10-surgeon expert panel collaborated towards development and validation of CASE. The critical steps of RARC in men were deconstructed into nine key domains, each assessed by five anchors. Content validation was done utilizing the Delphi methodology. Each anchor was assessed in terms of context, score concordance, and clarity. The content validity index (CVI) was calculated for each aspect. A CVI ≥ 0.75 represented consensus, and this statement was removed from the next round. This process was repeated until consensus was achieved for all statements. CASE was used to assess de-identified videos of RARC to determine reliability and construct validity. Linearly weighted percent agreement was used to assess inter-rater reliability (IRR). A logit model for odds ratio (OR) was used to assess construct validation. The expert panel reached consensus on CASE after four rounds. The final eight domains of the CASE included: pelvic lymph node dissection, development of the peri-ureteral space, lateral pelvic space, anterior rectal space, control of the vascular pedicle, anterior vesical space, control of the dorsal venous complex, and apical dissection. IRR > 0.6 was achieved for all eight domains. Experts outperformed trainees across all domains. We developed and validated a reliable structured, procedure-specific tool for objective evaluation of surgical performance during RARC. CASE may help differentiate novice from expert performances.
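
    The consensus bookkeeping described for the Delphi rounds — compute an item-level content validity index, retire statements that reach CVI ≥ 0.75, and re-circulate the rest — can be sketched as below; the rating scale and the "relevant" cutoff are assumptions for illustration:

```python
def content_validity_index(ratings, relevant_min=4):
    """Item-level CVI: fraction of panelists rating the item at or above
    `relevant_min` on an assumed 5-point scale."""
    return sum(r >= relevant_min for r in ratings) / len(ratings)

def delphi_round(items, cvi_cut=0.75):
    """Split items into those reaching consensus (retired from later rounds)
    and those carried into the next Delphi round."""
    retained, next_round = [], []
    for name, ratings in items.items():
        (retained if content_validity_index(ratings) >= cvi_cut else next_round).append(name)
    return retained, next_round
```

    Repeating this round over the surviving items until `next_round` is empty mirrors the four rounds the expert panel needed to reach consensus.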

  7. An implementation framework for wastewater treatment models requiring a minimum programming expertise.

    PubMed

    Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J

    2009-01-01

    Mathematical modelling in environmental biotechnology has traditionally been a difficult resource to access for researchers and students without programming expertise. The great degree of flexibility required from model implementation platforms to be suitable for research applications restricts their use to expert programmers. More user-friendly software packages, however, do not normally incorporate the flexibility needed for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for both flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure, and almost immediate model simulation, after only a minimal Matlab code definition, is possible. The framework proposed also provides researchers with programming expertise a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been successfully used in a number of research studies.

  8. Entropy based quantification of Ki-67 positive cell images and its evaluation by a reader study

    NASA Astrophysics Data System (ADS)

    Niazi, M. Khalid Khan; Pennell, Michael; Elkins, Camille; Hemminger, Jessica; Jin, Ming; Kirby, Sean; Kurt, Habibe; Miller, Barrie; Plocharczyk, Elizabeth; Roth, Rachel; Ziegler, Rebecca; Shana'ah, Arwa; Racke, Fred; Lozanski, Gerard; Gurcan, Metin N.

    2013-03-01

    Presence of Ki-67, a nuclear protein, is typically used to measure cell proliferation. The quantification of the Ki-67 proliferation index is performed visually by the pathologist; however, this is subject to inter- and intra-reader variability. Automated techniques utilizing digital image analysis by computers have emerged. The large variations in specimen preparation, staining, and imaging, as well as true biological heterogeneity of tumor tissue, often result in variable intensities in Ki-67 stained images. These variations affect the performance of currently developed methods. To optimize the segmentation of Ki-67 stained cells, one should define a data-dependent transformation that will account for these color variations instead of defining a fixed linear transformation to separate different hues. To address these issues in images of tissue stained with Ki-67, we propose a methodology that exploits the intrinsic properties of CIE L∗a∗b∗ color space to translate this complex problem into an automatic entropy based thresholding problem. The developed method was evaluated through two reader studies with pathology residents and expert hematopathologists. Agreement between the proposed method and the expert pathologists was good (CCC = 0.80).
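
    A minimal form of the entropy-based thresholding step (a Kapur-style criterion applied to a single channel; the published pipeline works in CIE L∗a∗b∗ space and is more involved) might look like:

```python
import numpy as np

def max_entropy_threshold(channel, bins=256):
    """Return the histogram split that maximises the sum of the entropies
    of the two resulting classes (Kapur-style maximum-entropy threshold)."""
    hist, edges = np.histogram(channel, bins=bins)
    p = hist / hist.sum()
    c = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        w0, w1 = c[t - 1], 1.0 - c[t - 1]
        if w0 <= 0 or w1 <= 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        h = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0])) \
            - np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h > best_h:
            best_t, best_h = t, h
    return edges[best_t]
```

    Because the split is chosen from each image's own histogram, the threshold adapts to staining and intensity variations rather than relying on a fixed cut.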

  9. Perception of low dose radiation risks among radiation researchers in Korea

    PubMed Central

    Seo, Songwon; Lee, Dalnim; Park, Sunhoo; Jin, Young Woo; Lee, Seung-Sook

    2017-01-01

    Experts’ risk evaluation of radiation exposure strongly influences the public’s risk perception. Experts can inform laypersons of significant radiation information, including health knowledge based on experimental data. However, some experts’ radiation risk perception is often based on non-conclusive scientific evidence (i.e., radiation levels below 100 millisievert), which is currently under debate. Examining perception levels among experts is important for communication with the public, since these individuals’ opinions have often exacerbated the public’s confusion. We conducted a survey of Korean radiation researchers to investigate their perceptions of the risks associated with radiation exposure below 100 millisievert. A linear regression analysis revealed that having ≥ 11 years’ research experience was a critical factor inversely associated with radiation risk perception. Increased opportunities to understand radiation effects at < 100 millisievert could alter the public’s risk perception of radiation exposure. In addition, radiation researchers conceived that more scientific evidence reducing the uncertainty of radiation effects < 100 millisievert is necessary for successful public communication. We concluded that sustained education addressing scientific findings is a critical attribute that will affect the risk perception of radiation exposure. PMID:28166286

  10. Efficacy of diabetes nurse expert team program to improve nursing confidence and expertise in caring for hospitalized patients with diabetes mellitus.

    PubMed

    Corl, Dawn E; McCliment, Sean; Thompson, Rachel E; Suhr, Louise D; Wisse, Brent E

    2014-01-01

    Nursing care for hospitalized patients with diabetes has become more complex as evidence accumulates that inpatient glycemic control improves outcomes. Previous studies have highlighted challenges for educators in providing inpatient diabetes education to nurses. In this article, the authors show that a unit-based diabetes nurse expert team model, developed and led by a diabetes clinical nurse specialist, effectively increased nurses' confidence and expertise in inpatient diabetes care. Adapting this model in other institutions may be a cost-effective way to improve inpatient diabetes care and safety as well as promote professional growth of staff nurses.

  11. Development of a model of the tobacco industry's interference with tobacco control programmes

    PubMed Central

    Trochim, W; Stillman, F; Clark, P; Schmitt, C

    2003-01-01

    Objective: To construct a conceptual model of tobacco industry tactics to undermine tobacco control programmes for the purposes of: (1) developing measures to evaluate industry tactics, (2) improving tobacco control planning, and (3) supplementing current or future frameworks used to classify and analyse tobacco industry documents. Design: Web based concept mapping was conducted, including expert brainstorming, sorting, and rating of statements describing industry tactics. Statistical analyses used multidimensional scaling and cluster analysis. Interpretation of the resulting maps was accomplished by an expert panel during a face-to-face meeting. Subjects: 34 experts, selected because of their previous encounters with industry resistance or because of their research into industry tactics, took part in some or all phases of the project. Results: Maps with eight non-overlapping clusters in two dimensional space were developed, with importance ratings of the statements and clusters. Cluster and quadrant labels were agreed upon by the experts. Conclusions: The conceptual maps summarise the tactics used by the industry and their relationships to each other, and suggest a possible hierarchy for measures that can be used in statistical modelling of industry tactics and for review of industry documents. Finally, the maps enable hypothesis of a likely progression of industry reactions as public health programmes become more successful, and therefore more threatening to industry profits. PMID:12773723

  12. Model-Based Engine Control Architecture with an Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The non-linear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.
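
    A generic EKF predict/update cycle — propagate the state through the nonlinear models, propagate the covariance through their local linearisations — can be written compactly; this is a textbook sketch, not the C-MAPSS40k implementation:

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle of an extended Kalman filter.

    f, h are the nonlinear transition and measurement models;
    F_jac, H_jac return their Jacobians at a given state."""
    # Predict: state through the nonlinear model, covariance through
    # the linearisation at the current estimate.
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update: standard Kalman correction with the linearised measurement.
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

    The only difference from a linear Kalman filter is that the prediction and the innovation use the nonlinear functions f and h directly, which is what removes the error otherwise introduced by an off-line piecewise-linear model.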

  13. Extending Correlation Filter-Based Visual Tracking by Tree-Structured Ensemble and Spatial Windowing.

    PubMed

    Gundogdu, Erhan; Ozkan, Huseyin; Alatan, A Aydin

    2017-11-01

    Correlation filters have been successfully used in visual tracking due to their modeling power and computational efficiency. However, the state-of-the-art correlation filter-based (CFB) tracking algorithms tend to quickly discard the previous poses of the target, since they consider only a single filter in their models. On the contrary, our approach is to register multiple CFB trackers for previous poses and exploit the registered knowledge when an appearance change occurs. To this end, we propose a novel tracking algorithm [of complexity O(D)] based on a large ensemble of CFB trackers. The ensemble [of size O(2^D)] is organized over a binary tree (depth D), and learns the target appearance subspaces such that each constituent tracker becomes an expert of a certain appearance. During tracking, the proposed algorithm combines only the appearance-aware relevant experts to produce boosted tracking decisions. Additionally, we propose a versatile spatial windowing technique to enhance the individual expert trackers. For this purpose, spatial windows are learned for target objects as well as the correlation filters, and then the windowed regions are processed for more robust correlations. In our extensive experiments on benchmark datasets, we achieve a substantial performance increase by using the proposed tracking algorithm together with the spatial windowing.

  14. Development of an instructional model for higher order thinking in science among secondary school students: a fuzzy Delphi approach

    NASA Astrophysics Data System (ADS)

    Saido, G. A. M.; Siraj, S.; DeWitt, D.; Al-Amedy, O. S.

    2018-05-01

    It is important for science students to develop higher order thinking (HOT) so that they can reason like scientists in the field. In this study, a HOT instructional model for secondary school science was developed with experts. The model would focus on reflective thinking (RT) and science process skills (SPS) among Grade 7 students. The Fuzzy Delphi Method (FDM) was employed to determine consensus among a panel of 20 experts. First, semi-structured interviews were conducted among the experts to generate the elements required for the model. Then, a questionnaire was developed using a seven-point linguistic scale based on these elements. The defuzzification value was calculated for each item, and a threshold value (d) of 0.75 was used to determine consensus for the items in the questionnaire. The alpha-cut value of >0.5 was used to select the phases and sub-phases in the model. The elements in the model were ranked to identify the sub-phases which had to be emphasised for implementation in instruction. Consensus was achieved on the phases of the HOT instructional model: engagement, investigation, explanation, conclusion and reflection. An additional 24 learning activities to encourage RT skills and SPS among students were also identified to develop HOT skills in science.

  15. Numerical solution of non-linear dual-phase-lag bioheat transfer equation within skin tissues.

    PubMed

    Kumar, Dinesh; Kumar, P; Rai, K N

    2017-11-01

    This paper deals with numerical modeling and simulation of heat transfer in skin tissues using a non-linear dual-phase-lag (DPL) bioheat transfer model under a periodic heat flux boundary condition. The blood perfusion is assumed temperature-dependent, which results in a non-linear DPL bioheat transfer model that predicts more accurate results. A numerical method of lines, based on finite difference and Runge-Kutta (4,5) schemes, is used to solve the present non-linear problem. For a specific case, the exact solution has been obtained and compared with the present numerical scheme, and the two are in good agreement. A comparison based on a model selection criterion (AIC) has been made among non-linear DPL models in which the variation of blood perfusion rate with temperature is of constant, linear or exponential type, against the experimental data, and it has been found that the non-linear DPL model with exponential variation of blood perfusion rate is closest to the experimental data. In addition, it is found that due to the absence of phase-lag phenomena, the Pennes bioheat transfer model reaches steady state more quickly and always predicts higher temperatures than the thermal and DPL non-linear models. The effects of the coefficient of blood perfusion rate, dimensionless heating frequency and Kirchhoff number on the dimensionless temperature distribution have also been analyzed. The whole analysis is presented in dimensionless form. Copyright © 2017 Elsevier Inc. All rights reserved.
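
    The method-of-lines approach — discretise space with finite differences and hand the resulting ODE system to a Runge-Kutta integrator — can be sketched for the simpler Pennes-type equation with an exponential, temperature-dependent perfusion term (the full DPL model adds two phase lags and is omitted here); all coefficients and boundary values are illustrative, not the paper's:

```python
import numpy as np
from scipy.integrate import solve_ivp

def pennes_mol(n=41, L=0.01, alpha=1.4e-7, wb0=5e-4, Ta=37.0, q_flux=50.0):
    """Method-of-lines sketch of a 1-D Pennes-type bioheat equation.

    Central differences in space, Runge-Kutta (solve_ivp) in time.
    q_flux is the imposed surface heat flux divided by conductivity (K/m);
    the deep boundary is held at core temperature Ta."""
    dx = L / (n - 1)

    def rhs(t, T):
        w = wb0 * np.exp(0.02 * (T - Ta))   # perfusion grows exponentially with T
        lap = np.empty_like(T)
        lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
        lap[0] = 2.0 * (T[1] - T[0] + q_flux * dx) / dx**2  # flux at the skin surface
        lap[-1] = 0.0                        # core node held at Ta (Dirichlet)
        dT = alpha * lap + w * (Ta - T)
        dT[-1] = 0.0
        return dT

    T0 = np.full(n, Ta)
    sol = solve_ivp(rhs, (0.0, 200.0), T0, method="RK45", rtol=1e-6)
    return sol.y[:, -1]
```

    The DPL extension replaces the first-order time derivative with lagged heat-flux and temperature-gradient terms, but the discretise-then-integrate structure stays the same.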

  16. The effect of observing novice and expert performance on acquisition of surgical skills on a robotic platform

    PubMed Central

    Harris, David J.; Vine, Samuel J.; Wilson, Mark R.; McGrath, John S.; LeBel, Marie-Eve

    2017-01-01

    Background Observational learning plays an important role in surgical skills training, following the traditional model of learning from expertise. Recent findings have, however, highlighted the benefit of observing not only expert performance but also error-strewn performance. The aim of this study was to determine which model (novice vs. expert) would lead to the greatest benefits when learning robotically assisted surgical skills. Methods 120 medical students with no prior experience of robotically-assisted surgery completed a ring-carrying training task on three occasions; baseline, post-intervention and at one-week follow-up. The observation intervention consisted of a video model performing the ring-carrying task, with participants randomly assigned to view an expert model, a novice model, a mixed expert/novice model or no observation (control group). Participants were assessed for task performance and surgical instrument control. Results There were significant group differences post-intervention, with expert and novice observation groups outperforming the control group, but there were no clear group differences at a retention test one week later. There was no difference in performance between the expert-observing and error-observing groups. Conclusions Similar benefits were found when observing the traditional expert model or the error-strewn model, suggesting that viewing poor performance may be as beneficial as viewing expertise in the early acquisition of robotic surgical skills. Further work is required to understand, then inform, the optimal curriculum design when utilising observational learning in surgical training. PMID:29141046

  17. Expertise and category-based induction.

    PubMed

    Proffitt, J B; Coley, J D; Medin, D L

    2000-07-01

    The authors examined inductive reasoning among experts in a domain. Three types of tree experts (landscapers, taxonomists, and parks maintenance personnel) completed 3 reasoning tasks. In Experiment 1, participants inferred which of 2 novel diseases would affect "more other kinds of trees" and provided justifications for their choices. In Experiment 2, the authors used modified instructions and asked which disease would be more likely to affect "all trees." In Experiment 3, the conclusion category was eliminated altogether, and participants were asked to generate a list of other affected trees. Among these populations, typicality and diversity effects were weak to nonexistent. Instead, experts' reasoning was influenced by "local" coverage (extension of the property to members of the same folk family) and causal-ecological factors. The authors concluded that domain knowledge leads to the use of a variety of reasoning strategies not captured by current models of category-based induction.

  18. Projection of climatic suitability for Aedes albopictus Skuse (Culicidae) in Europe under climate change conditions

    NASA Astrophysics Data System (ADS)

    Fischer, Dominik; Thomas, Stephanie Margarete; Niemitz, Franziska; Reineking, Björn; Beierkuhnlein, Carl

    2011-07-01

    During the last decades the disease vector Aedes albopictus (Ae. albopictus) has rapidly spread around the globe. The spread of this species raises serious public health concerns. Here, we model the present distribution and the future climatic suitability of Europe for this vector in the face of climate change. In order to achieve the most realistic current prediction and future projection, we compare the performance of four different modelling approaches, differentiated by the selection of climate variables (based on expert knowledge vs. statistical criteria) and by the geographical range of presence records (native range vs. global range). First, models of the native and global range were built with MaxEnt and were either based on (1) statistically selected climatic input variables or (2) input variables selected with expert knowledge from the literature. Native models show high model performance (AUC: 0.91-0.94) for the native range, but do not predict the European distribution well (AUC: 0.70-0.72). Models based on the global distribution of the species, however, were able to identify all regions where Ae. albopictus is currently established, including Europe (AUC: 0.89-0.91). In a second step, the modelled bioclimatic envelope of the global range was projected to future climatic conditions in Europe using two emission scenarios implemented in the regional climate model COSMO-CLM for three time periods: 2011-2040, 2041-2070, and 2071-2100. For both global-driven models, the results indicate that climatically suitable areas for the establishment of Ae. albopictus will increase in western and central Europe already in 2011-2040 and, with a temporal delay, in eastern Europe. On the other hand, a decline in climatically suitable areas in southern Europe is pronounced in the expert-knowledge-based model. Our projections appear unaffected by non-analogue climate, as this is not detected by Multivariate Environmental Similarity Surface analysis. 
The generated risk maps can aid in identifying suitable habitats for Ae. albopictus and hence support monitoring and control activities to avoid disease vector establishment.

  19. Deformed Palmprint Matching Based on Stable Regions.

    PubMed

    Wu, Xiangqian; Zhao, Qiushi

    2015-12-01

    Palmprint recognition (PR) is an effective technology for personal recognition. A main problem, which deteriorates the performance of PR, is the deformations of palmprint images. This problem becomes more severe on contactless occasions, in which images are acquired without any guiding mechanisms, and hence critically limits the applications of PR. To solve the deformation problems, in this paper, a model for non-linearly deformed palmprint matching is derived by approximating non-linear deformed palmprint images with piecewise-linear deformed stable regions. Based on this model, a novel approach for deformed palmprint matching, named key point-based block growing (KPBG), is proposed. In KPBG, an iterative M-estimator sample consensus algorithm based on scale invariant feature transform features is devised to compute piecewise-linear transformations to approximate the non-linear deformations of palmprints, and then, the stable regions complying with the linear transformations are decided using a block growing algorithm. Palmprint feature extraction and matching are performed over these stable regions to compute matching scores for decision. Experiments on several public palmprint databases show that the proposed models and the KPBG approach can effectively solve the deformation problem in palmprint verification and outperform the state-of-the-art methods.

  20. Lightning related fatalities in livestock: veterinary expertise and the added value of lightning location data.

    PubMed

    Vanneste, E; Weyens, P; Poelman, D R; Chiers, K; Deprez, P; Pardon, B

    2015-01-01

    Although lightning strike is an important cause of sudden death in livestock on pasture and among the main reasons why insurance companies consult an expert veterinarian, scientific information on this subject is limited. The aim of the present study was to provide objective information on the circumstantial evidence and pathological findings in lightning related fatalities (LRF), based on a retrospective analysis of 410 declarations, examined by a single expert veterinarian in Flanders, Belgium, from 1998 to 2012. Predictive logistic models for compatibility with LRF were constructed based on anamnestic, environmental and pathological factors. In addition, the added value of lightning location data (LLD) was evaluated. Pathognomonic singe lesions were present in 84/194 (43%) confirmed reports. Factors which remained significantly associated with LRF in the multivariable model were age, presence of a tree or open water in the near surroundings, tympany and presence of feed in the oral cavity at the time of investigation. This basic model had a sensitivity (Se) of 53.8% and a specificity (Sp) of 88.2%. Relying only on LLD to confirm LRF in livestock resulted in a high Se (91.3%), but a low Sp (41.2%), leading to a high probability that a negative case would be wrongly accepted as an LRF. The best results were obtained when combining the model based on the veterinary expert investigation (circumstantial evidence and pathological findings), together with the detection of cloud-to-ground (CG) lightning at the time and location of death (Se 89.1%; Sp 66.7%). Copyright © 2014 Elsevier Ltd. All rights reserved.
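
    The sensitivity/specificity shifts reported here follow the usual pattern when imperfect tests are combined. Under a conditional-independence assumption (which real data need not satisfy, as the study's combined figures illustrate) the textbook combination formulas are:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Se = TP / (TP + FN); Sp = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def parallel_combination(se1, sp1, se2, sp2):
    """Either-test-positive rule: sensitivity rises, specificity falls."""
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

def serial_combination(se1, sp1, se2, sp2):
    """Both-tests-positive rule: specificity rises, sensitivity falls."""
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)
```

    These directions match the qualitative finding above: lightning location data alone is sensitive but unspecific, while requiring agreement with the expert-investigation model restores specificity at some cost in sensitivity.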

  1. Medicines for cancers in children: The WHO model for selection of essential medicines

    PubMed Central

    Robertson, Jane; Barr, Ronald; Forte, Gilles; Ondari, Clive

    2015-01-01

    Pressures to include more cancer medicines in the WHO Model List of Essential Medicines (EML) pose challenges for the Expert Committee responsible for recommending changes to the list. How do medicines for cancer fit within a definition of essential medicines as those meeting the priority health needs of the population? Will identifying a medicine as “essential” offer some leverage to improve access to effective cancer medicines in low and middle‐income countries (LMICs)? The addition of a number of medicines for the treatment of cancers in children to the Model List of Essential Medicines for Children (EMLc) in 2011 provides important insights into previous Expert Committee decision‐making and offers a platform for future deliberations. As combination chemotherapy is required for effective treatment of many malignancies, a disease‐based approach makes more sense than an agent‐based approach. Inadequate financing to purchase essential medicines is a reality in many LMICs, thus a consideration of health impact is central to decisions on the selection and procurement of medicines. Inclusion in national EMLs should identify medicines that have priority for procurement in the public sector. This article will discuss some of the factors taken into account by the Expert Committee in developing the WHO EMLc. We argue that the disease‐based approach coupled with the assessment of the magnitude of the clinical benefit provides an appropriate approach for considering further additions of medicines for pediatric cancers and for the review of the adult cancer section of the Model List. Pediatr Blood Cancer 2015;62:1689–1693. © 2015 Wiley Periodicals, Inc. PMID:25929524

  2. On structure-exploiting trust-region regularized nonlinear least squares algorithms for neural-network learning.

    PubMed

    Mizutani, Eiji; Demmel, James W

    2003-01-01

    This paper briefly introduces our numerical linear algebra approaches for solving structured nonlinear least squares problems arising from 'multiple-output' neural-network (NN) models. Our algorithms feature trust-region regularization, and exploit sparsity of either the 'block-angular' residual Jacobian matrix or the 'block-arrow' Gauss-Newton Hessian (the Fisher information matrix, in the statistical sense), depending on problem scale, so as to render a large class of NN-learning algorithms 'efficient' in both memory and operation costs. Using a relatively large real-world nonlinear regression application, we explain algorithmic strengths and weaknesses, analyzing simulation results obtained by both direct and iterative trust-region algorithms with two distinct NN models: 'multilayer perceptrons' (MLP) and 'complementary mixtures of MLP-experts' (or neuro-fuzzy modular networks).
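    The spirit of trust-region regularization can be shown with Levenberg-style damping of a Gauss-Newton step on a toy one-parameter nonlinear least-squares problem: large damping yields short, safe gradient-like steps; small damping yields full Gauss-Newton steps. This is a sketch of the general technique, not the paper's block-structured algorithm:

```python
import math

def fit_exponential(xs, ys, a=0.0, lam=1e-3, iters=100):
    """Fit y = exp(a*x) by damped Gauss-Newton (Levenberg-Marquardt flavor).
    The damping term lam plays the role a trust-region radius plays."""
    def cost(a):
        return sum((math.exp(a * x) - y) ** 2 for x, y in zip(xs, ys))
    c = cost(a)
    for _ in range(iters):
        # Gauss-Newton ingredients: J^T r and J^T J for the 1-D parameter a
        jr = sum(x * math.exp(a * x) * (math.exp(a * x) - y) for x, y in zip(xs, ys))
        jj = sum((x * math.exp(a * x)) ** 2 for x in xs)
        step = -jr / (jj + lam)
        c_new = cost(a + step)
        if c_new < c:          # step accepted: relax damping
            a, c = a + step, c_new
            lam *= 0.5
        else:                  # step rejected: increase damping, retry
            lam *= 2.0
    return a
```

    The accept/reject logic with adaptive damping is the scalar analogue of growing or shrinking the trust region based on how well the quadratic model predicted the actual cost reduction.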

  3. Kalman filter with a linear state model for PDR+WLAN positioning and its application to assisting a particle filter

    NASA Astrophysics Data System (ADS)

    Raitoharju, Matti; Nurminen, Henri; Piché, Robert

    2015-12-01

    Indoor positioning based on wireless local area network (WLAN) signals is often enhanced using pedestrian dead reckoning (PDR) based on an inertial measurement unit. The state evolution model in PDR is usually nonlinear. We present a new linear state evolution model for PDR. In simulated-data and real-data tests of tightly coupled WLAN-PDR positioning, the positioning accuracy with this linear model is better than with the traditional models when the initial heading is not known, which is a common situation. The proposed method is computationally light and is also suitable for smoothing. Furthermore, we present modifications to WLAN positioning based on Gaussian coverage areas and show how a Kalman filter using the proposed model can be used for integrity monitoring and (re)initialization of a particle filter.
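    The flavor of a Kalman filter with a linear state evolution model can be sketched with the classic constant-velocity example (a generic illustration with invented noise parameters, not the paper's PDR model):

```python
def kf_track(zs, dt=1.0, q=1e-4, r=0.01):
    """Kalman filter for state [position, velocity] with position-only measurements.
    Linear state model: pos' = pos + dt*vel, vel' = vel."""
    x = [0.0, 0.0]                        # state estimate
    p = [[1.0, 0.0], [0.0, 1.0]]          # estimate covariance
    for z in zs:
        # predict: x = F x, P = F P F^T + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        p00 = p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + q
        # update with position measurement z (H = [1, 0])
        s = p00 + r                       # innovation variance
        k0, k1 = p00 / s, p10 / s         # Kalman gain
        y = z - x[0]                      # innovation
        x = [x[0] + k0 * y, x[1] + k1 * y]
        p = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return x
```

    Because the state model is linear, the same predict/update algebra supports smoothing and the integrity-monitoring use the abstract mentions; a nonlinear PDR model would instead force an extended or particle filter.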

  4. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.
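    A rule-based diagnostic knowledge base of the kind described can be sketched as data plus a tiny inference loop. The sensor names, thresholds, and fault labels below are invented for illustration and are not from the SSF/EPS testbed:

```python
# Each rule pairs a suspected fault with a predicate over a telemetry snapshot.
RULES = [
    ("switchgear trip", lambda t: t["bus_voltage"] < 110 and t["load_current"] == 0),
    ("overload",        lambda t: t["load_current"] > t["rated_current"]),
    ("sensor fault",    lambda t: t["bus_voltage"] < 0),
]

def diagnose(telemetry):
    """Forward-chain over the rule base: return every fault whose condition fires."""
    return [fault for fault, cond in RULES if cond(telemetry)]
```

    A production shell adds conflict resolution, certainty factors, and explanation facilities on top of this basic match-and-fire loop.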

  5. Multiple neural network approaches to clinical expert systems

    NASA Astrophysics Data System (ADS)

    Stubbs, Derek F.

    1990-08-01

    We briefly review the concept of computer-aided medical diagnosis and more extensively review the existing literature on neural network applications in the field. Neural networks can function as simple expert systems for diagnosis or prognosis. Using a public database, we develop a neural network for the diagnosis of a major presenting symptom while discussing the development process and possible approaches. Biomedicine is an incredibly diverse and multidisciplinary field, and it is not surprising that neural networks, with their many applications, are finding more and more uses in the highly non-linear field of biomedicine. I want to concentrate on neural networks as medical expert systems for clinical diagnosis or prognosis. Expert systems started out as sets of computerized "if-then" rules. Everything was reduced to Boolean logic, and the promised land of computer experts was said to be in sight. It never came. Why? First, the computer code explodes as the number of "ifs" increases, since all the "ifs" have to interact. Second, experts are not very good at reducing expertise to language: it turns out that experts recognize patterns and use non-verbal, intuitive decision processes. Third, learning by example rather than learning by rule is the way natural brains work, and making computers work by rule-learning is hideously labor intensive. Neural networks can learn from example. They learn the results
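    "Learning by example rather than by rule" can be shown with the smallest possible neural model: a perceptron trained on a labeled truth table instead of being hand-coded with if-then rules (a generic sketch, not from the paper):

```python
def train_perceptron(samples, epochs=25, lr=0.1):
    """Learn weights for a single threshold unit from labeled examples only."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # learning signal: disagreement with example
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

    No "if-then" rule for the target concept is ever written down; the decision boundary is induced from the examples, which is the contrast with rule-based expert systems the abstract draws.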

  6. Development of a five-year mortality model in systemic sclerosis patients by different analytical approaches.

    PubMed

    Beretta, Lorenzo; Santaniello, Alessandro; Cappiello, Francesca; Chawla, Nitesh V; Vonk, Madelon C; Carreira, Patricia E; Allanore, Yannick; Popa-Diaconu, D A; Cossu, Marta; Bertolotti, Francesca; Ferraccioli, Gianfranco; Mazzone, Antonino; Scorza, Raffaella

    2010-01-01

    Systemic sclerosis (SSc) is a multiorgan disease with high mortality rates. Several clinical features have been associated with poor survival in different populations of SSc patients, but no clear and reproducible prognostic model to assess individual survival prediction in scleroderma patients has ever been developed. We used Cox regression and three data mining-based classifiers (Naïve Bayes Classifier [NBC], Random Forests [RND-F] and logistic regression [Log-Reg]) to develop a robust and reproducible 5-year prognostic model. All the models were built and internally validated by means of 5-fold cross-validation on a population of 558 Italian SSc patients. Their predictive ability and capability of generalisation were then tested on an independent population of 356 patients recruited from 5 external centres, and finally compared to the predictions made by two SSc domain experts on the same population. The NBC outperformed the Cox-based classifier and the other data mining algorithms after internal cross-validation (area under the receiver operating characteristic curve, AUROC: NBC = 0.759; RND-F = 0.736; Log-Reg = 0.754; Cox = 0.724). The NBC also had a markedly better trade-off between sensitivity and specificity (measured as balanced accuracy, BA) than the Cox-based classifier when tested on the independent population of SSc patients (BA: NBC = 0.769, Cox = 0.622). The NBC was also superior to the domain experts in predicting 5-year survival in this population (AUROC = 0.829 vs. 0.788 and BA = 0.769 vs. 0.67). We provide a model to make consistent 5-year prognostic predictions in SSc patients. Its internal validity, capability of generalisation, and reduced uncertainty compared to human experts support its use at the bedside. Available at: http://www.nd.edu/~nchawla/survival.xls.
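    Balanced accuracy, the headline comparison metric here, is simply the mean of sensitivity and specificity, which keeps a classifier honest on imbalanced survival data. A quick sketch:

```python
def balanced_accuracy(labels, preds):
    """Mean of sensitivity (recall on positives) and specificity (recall on negatives),
    for binary labels/predictions coded as 0 and 1."""
    tp = sum(1 for l, p in zip(labels, preds) if l == 1 and p == 1)
    fn = sum(1 for l, p in zip(labels, preds) if l == 1 and p == 0)
    tn = sum(1 for l, p in zip(labels, preds) if l == 0 and p == 0)
    fp = sum(1 for l, p in zip(labels, preds) if l == 0 and p == 1)
    return 0.5 * (tp / (tp + fn) + tn / (tn + fp))
```

    Unlike plain accuracy, a classifier that simply predicts the majority class scores only 0.5 here, which is why BA is the fairer comparison between the NBC, the Cox-based classifier, and the human experts.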

  7. Model-Based Reasoning in the Detection of Satellite Anomalies

    DTIC Science & Technology

    1990-12-01

    Conference on Artificial Intelligence. 1363-1368. Detroit, Michigan, August 89. Chu, Wei-Hai. "Generic Expert System Shell for Diagnostic Reasoning... Intelligence. 1324-1330. Detroit, Michigan, August 89. de Kleer, Johan and Brian C. Williams. "Diagnosing Multiple Faults," Artificial Intelligence, 32(1): 97...Benjamin Kuipers. "Model-Based Monitoring of Dynamic Systems," Proceedings of the Eleventh International Joint Conference on Artificial Intelligence. 1238

  8. Motivating Teachers to Enact Free-Choice Project-Based Learning in Science and Technology (PBLSAT): Effects of a Professional Development Model

    ERIC Educational Resources Information Center

    Fallik, Orna; Eylon, Bat-Sheva; Rosenfeld, Sherman

    2008-01-01

    We investigated the effects of a long-term, continuous professional development (CPD) model designed to support teachers in enacting Project-Based Learning (PBLSAT). How do novice PBLSAT teachers view their acquisition of PBLSAT skills, and how do expert PBLSAT teachers, who have enacted the program for 5-7 years, perceive the program? Novice teachers…

  9. Reduced-Order Model Based Feedback Control For Modified Hasegawa-Wakatani Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goumiri, I. R.; Rowley, C. W.; Ma, Z.

    2013-01-28

    In this work, model-based feedback control that stabilizes an unstable equilibrium is developed for the Modified Hasegawa-Wakatani (MHW) equations, a classic model in plasma turbulence. First, balanced truncation (a model reduction technique that has proven successful in flow control design problems) is applied to obtain a low-dimensional model of the linearized MHW equations. Then a model-based feedback controller is designed for the reduced-order model using linear quadratic regulators (LQR). Finally, a linear quadratic Gaussian (LQG) controller, which is more resistant to disturbances, is deduced. The controller is applied to the non-reduced, nonlinear MHW equations to stabilize the equilibrium and suppress the transition to drift-wave induced turbulence.
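    The LQR design step can be sketched in the scalar case, where the discrete algebraic Riccati equation is iterated to a fixed point and the resulting gain stabilizes an unstable mode. This is a toy illustration of the technique, not the reduced MHW model:

```python
def lqr_scalar(a, b, q, r, iters=500):
    """Scalar discrete-time LQR: minimize sum(q*x^2 + r*u^2) for x' = a*x + b*u.
    Iterates the Riccati recursion P = q + a^2*P - (a*b*P)^2 / (r + b^2*P)
    to its fixed point and returns the feedback gain K (control law u = -K*x)."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)
```

    For an unstable open-loop mode (|a| > 1), the closed-loop dynamics a - b*K end up inside the unit circle; in the paper the same computation is done in matrix form on the balanced-truncation reduced model.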

  10. An Analysis of Turkey's PISA 2015 Results Using Two-Level Hierarchical Linear Modelling

    ERIC Educational Resources Information Center

    Atas, Dogu; Karadag, Özge

    2017-01-01

    In the field of education, most of the data collected are multi-level structured. Cities, city-based schools, school-based classes and finally students in the classrooms constitute a hierarchical structure. Hierarchical linear models give more accurate results compared to standard models when the data set has a structure going as far down as individuals,…

  11. Linear dependence between the wavefront gradient and the masked intensity for the point source with a CCD sensor

    NASA Astrophysics Data System (ADS)

    Yang, Huizhen; Ma, Liang; Wang, Bin

    2018-01-01

    In contrast to a conventional adaptive optics (AO) system, the wavefront-sensorless (WFSless) AO system does not need a wavefront sensor to measure the wavefront aberrations. It is simpler than conventional AO in system architecture and can be applied under complex conditions. The model-based WFSless system has great potential for real-time correction because of its fast convergence. The control algorithm of the model-based WFSless system rests on an important theoretical result: the linear relation between the mean-square gradient (MSG) magnitude of the wavefront aberration and the second moment of the masked intensity distribution in the focal plane (also called the masked detector signal, MDS). This paper discusses, from theory and simulation, the linear dependence between MSG and MDS for a point source imaged with a CCD sensor. The theoretical relationship between MSG and MDS is given based on our previous work. To verify the linear relation for the point source, we set up an imaging model under atmospheric turbulence. Additionally, the value of the MDS deviates from the theoretical value because of detector noise, and this deviation affects the correction performance. The theoretical results under noise are obtained through derivation, and the linear relation between MSG and MDS under noise is then examined using the imaging model. Results show that the linear relation between MSG and MDS is maintained well under noise, which provides theoretical support for applications of the model-based WFSless system.
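    The MDS is a mask-weighted second moment of the focal-plane intensity, which grows as aberrations spread the spot. A hedged numerical sketch on synthetic Gaussian spots (illustrative only; the paper's actual mask and imaging model differ in detail, and here the mask is taken as the squared distance from the image center):

```python
import math

def mds(intensity, size):
    """Normalized second moment of an intensity map, weighted by squared
    distance from the image center (a quadratic mask)."""
    c = (size - 1) / 2.0
    total = sum(sum(row) for row in intensity)
    weighted = sum(intensity[i][j] * ((i - c) ** 2 + (j - c) ** 2)
                   for i in range(size) for j in range(size))
    return weighted / total

def gaussian_spot(size, sigma):
    """Synthetic focal-plane spot; a larger sigma mimics a larger aberration."""
    c = (size - 1) / 2.0
    return [[math.exp(-((i - c) ** 2 + (j - c) ** 2) / (2 * sigma ** 2))
             for j in range(size)] for i in range(size)]
```

    A wider spot yields a larger MDS (for an isotropic Gaussian the value approaches 2*sigma^2), and this monotone dependence on aberration strength is what the model-based control algorithm exploits.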

  12. Expert Coaching in Weight Loss: Retrospective Analysis

    PubMed Central

    Kushner, Robert F; Hill, James O; Lindquist, Richard; Brunning, Scott; Margulies, Amy

    2018-01-01

    Background Providing coaches as part of a weight management program is a common practice to increase participant engagement and weight loss success. Understanding coach and participant interactions and how these interactions impact weight loss success needs to be further explored for coaching best practices. Objective The purpose of this study was to analyze the coach and participant interaction in a 6-month weight loss intervention administered by Retrofit, a personalized weight management and Web-based disease prevention solution. The study specifically examined the association between different methods of coach-participant interaction and weight loss and tried to understand the level of coaching impact on weight loss outcome. Methods A retrospective analysis was performed using 1432 participants enrolled from 2011 to 2016 in the Retrofit weight loss program. Participants were males and females aged 18 years or older with a baseline body mass index of ≥25 kg/m², who also provided at least one weight measurement beyond baseline. First, a detailed analysis of different coach-participant interaction was performed using both intent-to-treat and completer populations. Next, a multiple regression analysis was performed using all measures associated with coach-participant interactions involving expert coaching sessions, live weekly expert-led Web-based classes, and electronic messaging and feedback. Finally, 3 significant predictors (P<.001) were analyzed in depth to reveal the impact on weight loss outcome. Results Participants in the Retrofit weight loss program lost a mean 5.14% (SE 0.14) of their baseline weight, with 44% (SE 0.01) of participants losing at least 5% of their baseline weight. 
A multiple regression model (R2=.158, P<.001) identified the following top 3 measures as significant predictors of weight loss at 6 months: expert coaching session attendance (P<.001), live weekly Web-based class attendance (P<.001), and food log feedback days per week (P<.001). Attending 80% of expert coaching sessions, attending 60% of live weekly Web-based classes, and receiving a minimum of 1 food log feedback day per week were associated with clinically significant weight loss. Conclusions Participants’ one-on-one expert coaching session attendance, live weekly expert-led interactive Web-based class attendance, and the number of food log feedback days per week from the expert coach were significant predictors of weight loss in this 6-month intervention. PMID:29535082

  13. Mathematical Modelling in Engineering: An Alternative Way to Teach Linear Algebra

    ERIC Educational Resources Information Center

    Domínguez-García, S.; García-Planas, M. I.; Taberna, J.

    2016-01-01

    Technological advances require that basic science courses for engineering, including Linear Algebra, emphasize the development of mathematical strengths associated with modelling and interpretation of results, which are not limited only to calculus abilities. Based on this consideration, we have proposed a project-based learning, giving a dynamic…

  14. Experimental research on mathematical modelling and unconventional control of clinker kiln in cement plants

    NASA Astrophysics Data System (ADS)

    Rusu-Anghel, S.

    2017-01-01

    Analytical modeling of the cement manufacturing process is difficult because of its complexity and has not yielded sufficiently precise mathematical models. In this paper, based on a statistical model of the process and using the knowledge of human experts, a fuzzy system for automatic control of the clinkering process was designed.
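    A fuzzy controller of the kind described encodes expert rules through membership functions and combines them with a weighted-average defuzzification. A minimal zero-order Sugeno sketch; the variable, rules, and numbers are invented for illustration and are not from the paper's kiln controller:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at 1 when x = b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuel_adjustment(temp_error):
    """Two expert rules: 'kiln too cold -> add fuel', 'kiln too hot -> cut fuel'.
    Output is the firing-strength-weighted average of the rule consequents."""
    rules = [
        (tri(temp_error, -100, -50, 0), +5.0),   # too cold: increase fuel rate
        (tri(temp_error, 0, 50, 100),   -5.0),   # too hot: decrease fuel rate
    ]
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0
```

    The appeal for a process like clinkering is that operator heuristics ("if the burning zone runs cold, add fuel") map directly onto rules, with the membership functions tuned against the statistical process model.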

  15. K-Means Subject Matter Expert Refined Topic Model Methodology

    DTIC Science & Technology

    2017-01-01

    K-means Subject Matter Expert Refined Topic Model Methodology: Topic Model Estimation via K-means. U.S. Army TRADOC Analysis Center-Monterey, 700 Dyer Road... January 2017. Theodore T. Allen, Ph.D.; Zhenhuan... Contract number: W9124N-15-P-0022.

  16. Automatic ICD-10 multi-class classification of cause of death from plaintext autopsy reports through expert-driven feature selection.

    PubMed

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali

    2017-01-01

    Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports belong to nine different accident-related causes of death. A master feature vector was prepared by extracting unigram features, with lexical categorization, from the collected autopsy reports. This master feature vector was used to detect the cause of death [according to the International Classification of Diseases, version 10 (ICD-10)] through five automated feature selection schemes, the proposed expert-driven approach, five feature subset sizes, and five machine learning classifiers. Model performance was evaluated using macro-averaged precision, recall, and F-measure, as well as accuracy and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures, approaching 85% to 90% for most metrics, with a feature subset size of 30. The proposed system also showed an improvement of approximately 14% to 16% in overall accuracy compared with the existing techniques and the four baselines. The proposed system is feasible and practical to use for automatic classification of the ICD-10-coded cause of death from autopsy reports.
It assists pathologists in accurately and rapidly determining the underlying cause of death based on autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports.
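    The feature pipeline described (unigram extraction followed by selection of a fixed-size feature subset, then vectorization for a classifier) can be sketched generically. In the expert-driven variant, a clinician-curated term list would replace the document-frequency ranking below; all terms here are invented examples, not from the study's reports:

```python
from collections import Counter

def select_features(docs, k):
    """Rank unigrams by document frequency and keep the top k as the feature set."""
    df = Counter()
    for doc in docs:
        # sorted() makes tie-breaking in most_common deterministic
        df.update(sorted(set(doc.lower().split())))
    return [term for term, _ in df.most_common(k)]

def vectorize(doc, features):
    """Term-count vector for one document over the selected feature set."""
    counts = Counter(doc.lower().split())
    return [counts[f] for f in features]
```

    The resulting vectors feed any of the classifiers compared in the study (random forest, J48, etc.); the point of the expert-driven step is that domain knowledge, rather than a purely statistical ranking, decides which unigrams enter the subset.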

  17. Automatic ICD-10 multi-class classification of cause of death from plaintext autopsy reports through expert-driven feature selection

    PubMed Central

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali

    2017-01-01

    Objectives Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Methods Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports belong to nine different accident-related causes of death. A master feature vector was prepared by extracting unigram features, with lexical categorization, from the collected autopsy reports. This master feature vector was used to detect the cause of death [according to the International Classification of Diseases, version 10 (ICD-10)] through five automated feature selection schemes, the proposed expert-driven approach, five feature subset sizes, and five machine learning classifiers. Model performance was evaluated using macro-averaged precision, recall, and F-measure, as well as accuracy and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Results Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures, approaching 85% to 90% for most metrics, with a feature subset size of 30. The proposed system also showed an improvement of approximately 14% to 16% in overall accuracy compared with the existing techniques and the four baselines. Conclusion The proposed system is feasible and practical to use for automatic classification of the ICD-10-coded cause of death from autopsy reports.
It assists pathologists in accurately and rapidly determining the underlying cause of death based on autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports. PMID:28166263

  18. A MODELING AND SIMULATION LANGUAGE FOR BIOLOGICAL CELLS WITH COUPLED MECHANICAL AND CHEMICAL PROCESSES

    PubMed Central

    Somogyi, Endre; Glazier, James A.

    2017-01-01

    Biological cells are the prototypical example of active matter. Cells sense and respond to mechanical, chemical and electrical environmental stimuli with a range of behaviors, including dynamic changes in morphology and mechanical properties, chemical uptake and secretion, cell differentiation, proliferation, death, and migration. Modeling and simulation of such dynamic phenomena poses a number of computational challenges. A modeling language describing cellular dynamics must naturally represent complex intra and extra-cellular spatial structures and coupled mechanical, chemical and electrical processes. Domain experts will find a modeling language most useful when it is based on concepts, terms and principles native to the problem domain. A compiler must then be able to generate an executable model from this physically motivated description. Finally, an executable model must efficiently calculate the time evolution of such dynamic and inhomogeneous phenomena. We present a spatial hybrid systems modeling language, compiler and mesh-free Lagrangian based simulation engine which will enable domain experts to define models using natural, biologically motivated constructs and to simulate time evolution of coupled cellular, mechanical and chemical processes acting on a time varying number of cells and their environment. PMID:29303160

  19. A MODELING AND SIMULATION LANGUAGE FOR BIOLOGICAL CELLS WITH COUPLED MECHANICAL AND CHEMICAL PROCESSES.

    PubMed

    Somogyi, Endre; Glazier, James A

    2017-04-01

    Biological cells are the prototypical example of active matter. Cells sense and respond to mechanical, chemical and electrical environmental stimuli with a range of behaviors, including dynamic changes in morphology and mechanical properties, chemical uptake and secretion, cell differentiation, proliferation, death, and migration. Modeling and simulation of such dynamic phenomena poses a number of computational challenges. A modeling language describing cellular dynamics must naturally represent complex intra and extra-cellular spatial structures and coupled mechanical, chemical and electrical processes. Domain experts will find a modeling language most useful when it is based on concepts, terms and principles native to the problem domain. A compiler must then be able to generate an executable model from this physically motivated description. Finally, an executable model must efficiently calculate the time evolution of such dynamic and inhomogeneous phenomena. We present a spatial hybrid systems modeling language, compiler and mesh-free Lagrangian based simulation engine which will enable domain experts to define models using natural, biologically motivated constructs and to simulate time evolution of coupled cellular, mechanical and chemical processes acting on a time varying number of cells and their environment.

  20. Descriptive Linear modeling of steady-state visual evoked response

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Junker, A. M.; Kenner, K.

    1986-01-01

    A study is being conducted to explore the use of the steady-state visual-evoked electrocortical response as an indicator of cognitive task loading. The application of linear descriptive modeling to steady-state visual evoked response (VER) data is summarized. Two aspects of linear modeling are reviewed: (1) unwrapping the phase-shift portion of the frequency response, and (2) parsimonious characterization of task-loading effects in terms of changes in model parameters. Model-based phase unwrapping appears to be most reliable in applications, such as manual control, where theoretical models are available. Linear descriptive modeling of the VER has not yet been shown to provide consistent and readily interpretable results.
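    Phase unwrapping, the first modeling step mentioned, removes the artificial 2π jumps that arise when phase is reported modulo 2π. A minimal sketch of nearest-multiple unwrapping along frequency (the model-based variant the abstract favors would instead use a theoretical phase curve as the reference):

```python
import math

def unwrap(wrapped):
    """Add multiples of 2*pi so that consecutive phase differences
    lie in [-pi, pi), removing wrap-around discontinuities."""
    out = [wrapped[0]]
    for p in wrapped[1:]:
        d = (p - out[-1] + math.pi) % (2 * math.pi) - math.pi
        out.append(out[-1] + d)
    return out
```

    This heuristic assumes the true phase changes by less than π between adjacent frequency points; when it does not, a theoretical model of the expected phase slope (as in manual-control applications) disambiguates the multiple of 2π, which is the reliability point made above.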
