A Boundary Condition for Simulation of Flow Over Porous Surfaces
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Bonhaus, Daryl L.; Vatsa, Veer N.; Bauer, Steven X. S.; Tinetti, Ana F.
2001-01-01
A new boundary condition is presented for simulating the flow over passively porous surfaces. The model builds on the prior work of R. H. Bush to eliminate the need for constructing a grid within an underlying plenum, thereby simplifying the numerical modeling of passively porous flow control systems and reducing computational cost. Code experts for two structured-grid flow solvers, TLNS3D and CFL3D, and one unstructured-grid solver, USM3Dns, collaborated with an experimental porosity expert to develop the model and implement it in their respective codes. Results presented for the three codes on a slender forebody with circumferential porosity and a wing with leading-edge porosity demonstrate good agreement with experimental data and a remarkable ability to predict the aggregate aerodynamic effects of surface porosity with a simple boundary condition.
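A Bush-type passive-porosity boundary condition replaces the plenum grid with a transpiration velocity at the surface, driven by the difference between the local surface pressure and a common plenum pressure. The sketch below is an illustrative orifice-law formulation under assumed parameter values (the discharge coefficient, porosity, and the quadratic law itself are assumptions for illustration, not the exact model of the paper):

```python
import math

def transpiration_velocity(p_surface, p_plenum, rho, porosity, discharge_coeff=0.65):
    """Wall-normal transpiration velocity through a passively porous surface.

    Positive values denote blowing (flow out of the plenum into the
    boundary layer); negative values denote suction.  The quadratic
    orifice law and coefficient values are illustrative assumptions.
    """
    dp = p_plenum - p_surface
    speed = discharge_coeff * porosity * math.sqrt(2.0 * abs(dp) / rho)
    return math.copysign(speed, dp)

# A single plenum pressure equalizes the surface: suction where surface
# pressure exceeds the plenum pressure, blowing where it is lower.
p_plenum = 101000.0
print(transpiration_velocity(101500.0, p_plenum, rho=1.2, porosity=0.22))  # suction (< 0)
print(transpiration_velocity(100500.0, p_plenum, rho=1.2, porosity=0.22))  # blowing (> 0)
```

Applied as a boundary condition, this removes the need to mesh the plenum: the solver only needs the local surface pressure and one scalar plenum pressure.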
Expert systems applied to spacecraft fire safety
NASA Technical Reports Server (NTRS)
Smith, Richard L.; Kashiwagi, Takashi
1989-01-01
Expert systems are problem-solving programs that combine a knowledge base and a reasoning mechanism to simulate a human expert. The development of an expert system to manage fire safety in spacecraft, in particular the NASA Space Station Freedom, is difficult but clearly advantageous in the long-term. Some needs in low-gravity flammability characteristics, ventilating-flow effects, fire detection, fire extinguishment, and decision models, all necessary to establish the knowledge base for an expert system, are discussed.
Quality control of 3D Geological Models using an Attention Model based on Gaze
NASA Astrophysics Data System (ADS)
Busschers, Freek S.; van Maanen, Peter-Paul; Brouwer, Anne-Marie
2014-05-01
The Geological Survey of the Netherlands (GSN) produces 3D stochastic geological models of the upper 50 meters of the Dutch subsurface. The voxel models are regarded as essential in answering subsurface questions on, for example, aggregate resources, groundwater flow, land subsidence studies and the planning of large-scale infrastructural works such as tunnels. GeoTOP is the most recent and detailed generation of 3D voxel models. This model describes 3D lithological variability up to a depth of 50 m using voxels of 100*100*0.5 m. Due to the expected increase in data flow, model output and user demands, the development of (semi-)automated quality control systems will become increasingly important in the near future. Besides numerical control systems, capturing model errors as seen from the expert geologist's viewpoint is of increasing interest. We envision the use of eye gaze to support and speed up detection of errors in the geological voxel models. As a first step in this direction, we explore the gaze behavior of 12 geological experts from the GSN during quality control of part of the GeoTOP 3D geological model using an eye tracker. Gaze is used as input to an attention model that yields 'attended areas' for each individual examined image of the GeoTOP model and each individual expert. We compared these attended areas to errors as marked by the experts using a mouse. Results show that: 1) attended areas as determined from experts' gaze data largely match GeoTOP errors as indicated by the experts using a mouse, and 2) a substantial part of the match can be reached using only gaze data from the first few seconds of the time geologists spend searching for errors. These results open up the possibility of faster GeoTOP model control using gaze if geologists accept a small decrease in error detection accuracy. Attention data may also be used to make independent comparisons between different geologists varying in focus and expertise.
This would facilitate a more effective use of experts in specific projects or areas. Part of such a procedure could be to confront geological experts with their own results, allowing training steps to improve their geological expertise and eventually improve the GeoTOP model. Besides the directions indicated above, future research should focus on concrete implementation of facilitating and optimizing error detection in present and future 3D voxel models, which are commonly characterized by very large amounts of data.
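A common way to turn raw fixation points into "attended areas" is to sum a Gaussian blob per fixation and threshold the result, then score overlap against the mouse-marked error mask. The sketch below is a hypothetical illustration of that pipeline; the kernel width, threshold, and grid size are assumptions, not the study's actual attention model:

```python
import numpy as np

def attended_area(fixations, shape, sigma=5.0, threshold=0.5):
    """Binary 'attended area' mask from gaze fixation points.

    Each fixation contributes a Gaussian blob; pixels whose summed
    attention exceeds a fraction of the maximum count as attended.
    Parameter values are illustrative assumptions.
    """
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    amap = np.zeros(shape)
    for (y, x) in fixations:
        amap += np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma ** 2))
    return amap >= threshold * amap.max()

# Overlap between the attended area and expert mouse-marked errors.
errors = np.zeros((40, 40), dtype=bool)
errors[18:23, 18:23] = True                      # a marked model error
mask = attended_area([(20, 20), (21, 19)], (40, 40))
hit_rate = (mask & errors).sum() / errors.sum()  # fraction of error pixels attended
print(round(float(hit_rate), 2))                 # 1.0
```

The hit rate computed this way is one simple measure of how well gaze-derived attention matches explicitly marked errors.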
An expert system for water quality modelling.
Booty, W G; Lam, D C; Bobba, A G; Wong, I; Kay, D; Kerby, J P; Bowen, G S
1992-12-01
The RAISON-micro (Regional Analysis by Intelligent System ON a micro-computer) expert system is being used to predict the effects of mine effluents on receiving waters in Ontario. The potential of this system to assist regulatory agencies and mining industries to define more acceptable effluent limits was shown in an initial study. This system has been further developed so that the expert system helps the model user choose the most appropriate model for a particular application from a hierarchy of models. The system currently contains seven models which range from steady state to time dependent models, for both conservative and nonconservative substances in rivers and lakes. The menu driven expert system prompts the model user for information such as the nature of the receiving water system, the type of effluent being considered, and the range of background data available for use as input to the models. The system can also be used to determine the nature of the environmental conditions at the site which are not available in the textual information database, such as the components of river flow. Applications of the water quality expert system are presented for representative mine sites in the Timmins area of Ontario.
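The model-selection step described above is naturally expressed as a small rule base mapping user answers (receiving water type, substance behavior, time dependence) to one model in the hierarchy. The sketch below is hypothetical; the model names and rules are illustrative, not RAISON's actual knowledge base:

```python
# Toy rule base for choosing a water quality model from a hierarchy,
# in the spirit of the menu-driven expert system described above.
def select_model(water_body, substance, time_dependent):
    """Return an (illustrative) model name for the given site answers."""
    if water_body == "river":
        if substance == "conservative":
            return "river_dynamic_mixing" if time_dependent else "river_steady_dilution"
        return "river_decay_transport"      # nonconservative substances decay
    if water_body == "lake":
        return "lake_box_dynamic" if time_dependent else "lake_box_steady"
    raise ValueError("unknown water body: " + water_body)

print(select_model("river", "conservative", False))   # river_steady_dilution
print(select_model("lake", "nonconservative", True))  # lake_box_dynamic
```

Encoding the choice as explicit rules keeps the selection auditable, which matters when regulators and industry must agree on why a particular model was applied.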
Utilizing Expert Knowledge in Estimating Future STS Costs
NASA Technical Reports Server (NTRS)
Fortner, David B.; Ruiz-Torres, Alex J.
2004-01-01
A method of estimating the costs of future space transportation systems (STSs) involves classical activity-based cost (ABC) modeling combined with systematic utilization of the knowledge and opinions of experts to extend the process-flow knowledge of existing systems to systems that involve new materials and/or new architectures. The expert knowledge is particularly helpful in filling gaps that arise in computational models of processes because of inconsistencies in historical cost data. Heretofore, the costs of planned STSs have been estimated following a "top-down" approach that tends to force the architectures of new systems to incorporate process flows like those of the space shuttles. In this ABC-based method, one makes assumptions about the processes, but otherwise follows a "bottom-up" approach that does not force the new system architecture to incorporate a space-shuttle-like process flow. Prototype software has been developed to implement this method. Through further development of the software, it should be possible to extend the method beyond the space program to almost any setting in which there is a need to estimate the costs of a new system and to extend the applicable knowledge base in order to make the estimate.
In Situ Distribution Guided Analysis and Visualization of Transonic Jet Engine Simulations.
Dutta, Soumya; Chen, Chun-Ming; Heinlein, Gregory; Shen, Han-Wei; Chen, Jen-Ping
2017-01-01
Study of flow instability in turbine engine compressors is crucial to understand the inception and evolution of engine stall. Aerodynamics experts have been working on detecting the early signs of stall in order to devise novel stall suppression technologies. A state-of-the-art Navier-Stokes based, time-accurate computational fluid dynamics simulator, TURBO, has been developed in NASA to enhance the understanding of flow phenomena undergoing rotating stall. Despite the proven high modeling accuracy of TURBO, the excessive simulation data prohibits post-hoc analysis in both storage and I/O time. To address these issues and allow the expert to perform scalable stall analysis, we have designed an in situ distribution guided stall analysis technique. Our method summarizes statistics of important properties of the simulation data in situ using a probabilistic data modeling scheme. This data summarization enables statistical anomaly detection for flow instability in post analysis, which reveals the spatiotemporal trends of rotating stall for the expert to conceive new hypotheses. Furthermore, the verification of the hypotheses and exploratory visualization using the summarized data are realized using probabilistic visualization techniques such as uncertain isocontouring. Positive feedback from the domain scientist has indicated the efficacy of our system in exploratory stall analysis.
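The core idea of distribution-based summarization is to keep only a compact statistical summary (e.g. a histogram) per spatial block in situ, then detect anomalies post hoc from the summaries alone. A minimal sketch, with synthetic stand-in data and an L1-distance anomaly score as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def summarize(block, bins, lo, hi):
    """In situ summarization: keep only a normalized histogram per block."""
    hist, _ = np.histogram(block, bins=bins, range=(lo, hi))
    return hist / hist.sum()

# Illustrative stand-in for simulator output: pressure in most spatial
# blocks is near 1.0; one block carries a stall-like disturbance.
blocks = [rng.normal(1.0, 0.05, 4096) for _ in range(8)]
blocks[5] = rng.normal(0.7, 0.15, 4096)   # anomalous block (hypothetical)

summaries = [summarize(b, 32, 0.0, 2.0) for b in blocks]

# Post hoc anomaly detection on the summaries alone: flag the block
# whose histogram is farthest (L1 distance) from the median histogram.
median = np.median(summaries, axis=0)
dist = [np.abs(s - median).sum() for s in summaries]
flagged = int(np.argmax(dist))
print(flagged)   # 5
```

Each 4096-sample block is reduced to 32 numbers, which is the storage/I-O saving that makes post hoc analysis tractable.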
Expert systems for automated maintenance of a Mars oxygen production system
NASA Astrophysics Data System (ADS)
Huang, Jen-Kuang; Ho, Ming-Tsang; Ash, Robert L.
1992-08-01
Application of expert system concepts to a breadboard Mars oxygen processor unit has been studied and tested. The research was directed toward developing the methodology required to enable autonomous operation and control of these simple chemical processors at Mars. Failure detection and isolation was the key area of concern, and schemes using forward chaining, backward chaining, knowledge-based expert systems, and rule-based expert systems were examined. Tests and simulations were conducted that investigated self-health checkout, emergency shutdown, and fault detection, in addition to normal control activities. A dynamic system model was developed using the bond-graph technique. The dynamic model agreed well with tests involving sudden reductions in throughput. However, nonlinear effects were observed during tests that incorporated step-function increases in flow variables. Computer simulations and experiments have demonstrated the feasibility of expert systems utilizing rule-based diagnosis and decision-making algorithms.
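Forward chaining, one of the diagnosis schemes examined above, repeatedly fires rules whose premises are satisfied until no new conclusions appear. The sketch below is a toy illustration; the rule contents and sensor names are hypothetical, not the breadboard unit's actual logic:

```python
# Toy forward-chaining rule engine for fault diagnosis.  Each rule is
# (set of premises, conclusion); firing continues to a fixed point.
rules = [
    ({"zirconia_current_low", "feed_pressure_ok"}, "cell_degraded"),
    ({"feed_pressure_low"}, "compressor_fault"),
    ({"cell_degraded"}, "reduce_throughput"),
    ({"compressor_fault"}, "emergency_shutdown"),
]

def forward_chain(facts):
    facts = set(facts)
    changed = True
    while changed:                       # fire rules until nothing new is derived
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

out = forward_chain({"zirconia_current_low", "feed_pressure_ok"})
print("reduce_throughput" in out)   # True
```

Backward chaining would instead start from a goal (e.g. "why is output low?") and work back through the same rules to the sensor facts that support it.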
Expert Game experiment predicts emergence of trust in professional communication networks.
Bendtsen, Kristian Moss; Uekermann, Florian; Haerter, Jan O
2016-10-25
Strong social capital is increasingly recognized as an organizational advantage. Better knowledge sharing and reduced transaction costs increase work efficiency. To mimic the formation of the associated communication network, we propose the Expert Game, where each individual must find a specific expert and receive her help. Participants act in an impersonal environment and under time constraints that provide short-term incentives for noncooperative behavior. Despite these constraints, we observe cooperation between individuals and the self-organization of a sustained trust network, which facilitates efficient communication channels with increased information flow. We build a behavioral model that explains the experimental dynamics. Analysis of the model reveals an exploitation protection mechanism and measurable social capital, which quantitatively describe the economic utility of trust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanham, R.; Vogt, W.G.; Mickle, M.H.
1986-01-01
This book presents the papers given at a conference on computerized simulation. Topics considered at the conference included expert systems, modeling in electric power systems, power systems operating strategies, energy analysis, a linear programming approach to optimum load shedding in transmission systems, econometrics, simulation in natural gas engineering, solar energy studies, artificial intelligence, vision systems, hydrology, multiprocessors, and flow models.
Wong, Linda; Hill, Beth L; Hunsberger, Benjamin C; Bagwell, C Bruce; Curtis, Adam D; Davis, Bruce H
2015-01-01
Leuko64™ (Trillium Diagnostics) is a flow cytometric assay that measures neutrophil CD64 expression and serves as an in vitro indicator of infection/sepsis or the presence of a systemic acute inflammatory response. Leuko64 assay currently utilizes QuantiCALC, a semiautomated software that employs cluster algorithms to define cell populations. The software reduces subjective gating decisions, resulting in interanalyst variability of <5%. We evaluated a completely automated approach to measuring neutrophil CD64 expression using GemStone™ (Verity Software House) and probability state modeling (PSM). Four hundred and fifty-seven human blood samples were processed using the Leuko64 assay. Samples were analyzed on four different flow cytometer models: BD FACSCanto II, BD FACScan, BC Gallios/Navios, and BC FC500. A probability state model was designed to identify calibration beads and three leukocyte subpopulations based on differences in intensity levels of several parameters. PSM automatically calculates CD64 index values for each cell population using equations programmed into the model. GemStone software uses PSM that requires no operator intervention, thus totally automating data analysis and internal quality control flagging. Expert analysis with the predicate method (QuantiCALC) was performed. Interanalyst precision was evaluated for both methods of data analysis. PSM with GemStone correlates well with the expert manual analysis, r² = 0.99675 for the neutrophil CD64 index values with no intermethod bias detected. The average interanalyst imprecision for the QuantiCALC method was 1.06% (range 0.00-7.94%), which was reduced to 0.00% with the GemStone PSM. The operator-to-operator agreement in GemStone was a perfect correlation, r² = 1.000. Automated quantification of CD64 index values produced results that strongly correlate with expert analysis using a standard gate-based data analysis method. 
PSM successfully evaluated flow cytometric data generated by multiple instruments across multiple lots of the Leuko64 kit in all 457 cases. The probability-based method provides greater objectivity, higher data analysis speed, and allows for greater precision for in vitro diagnostic flow cytometric assays. © 2015 International Clinical Cytometry Society.
Montangero, Agnes; Belevi, Hasan
2007-03-01
Simple models based on the physical and biochemical processes occurring in septic tanks, pit latrines and urine diversion latrines were developed to determine the nutrient flows in these systems. Nitrogen and phosphorus separation in different output materials from these on-site sanitation installations were thus determined. Moreover, nutrient separation in septic tanks was also assessed through literature values and by eliciting expert judgement. Use of a formal expert elicitation technique proved effective, particularly in the context of developing countries, where data are often scarce but expert judgement is readily available. In Vietnam, only 5-14% and 11-27% of the nitrogen and phosphorus input, respectively, are removed from septic tanks with the faecal sludge. The remaining fraction leaves the tank via the liquid effluent. Unlike septic tanks, urine diversion latrines immobilize most of the nutrients, either in the form of stored urine or dehydrated faecal matter. These latrines thus contribute to reducing the nutrient load in the environment and to lowering the consumption of energy and non-renewable resources for fertiliser production.
NASA Astrophysics Data System (ADS)
Rusu-Anghel, S.
2017-01-01
Analytical modeling of the cement manufacturing process flow is difficult because of its complexity and has not yielded sufficiently precise mathematical models. In this paper, based on a statistical model of the process and using the knowledge of human experts, a fuzzy system for automatic control of the clinkering process was designed.
Aerothermal Assessment of the EXPERT Flap in the SCIROCCO Wind Tunnel
NASA Astrophysics Data System (ADS)
Walpot, L.; Di Clemente, M.; Vos, J.; Etchells, J.; Trifoni, E.; Thoemel, J.; Gavira, J.
2011-05-01
In the frame of the “In-Flight Test Measurement Techniques for Aerothermodynamics” activity of the EXPERT Program, the EXPERT Instrumented Open Flap Assembly experiment has the objective of verifying the design and sensor integration and validating the CFD tools. Ground-based measurements were made in Europe’s largest high-enthalpy plasma facility, Scirocco, in Italy. Two EXPERT flaps of the flight article, instrumented with 14 thermocouples, 5 pressure ports, a pyrometer, and an IR camera mounted in the flap cavity, will collect in-flight data. During the Scirocco experiment, an EXPERT flap model identical to the flight article was mounted at 45 deg on a holder including a cavity and was subjected to a hot plasma flow at an enthalpy of up to 11 MJ/kg at a stagnation pressure of 7 bar. The test model carries the same pressure sensors as the flight article. State-of-the-art hypersonic codes were then used to perform code-to-code and wind-tunnel-to-code comparisons, including the thermal response of the flap as collected during the tests by the sensors and camera.
NASA Astrophysics Data System (ADS)
Minier, Jean-Pierre; Chibbaro, Sergio; Pope, Stephen B.
2014-11-01
In this paper, we establish a set of criteria which are applied to discuss various formulations under which Lagrangian stochastic models can be found. These models are used for the simulation of fluid particles in single-phase turbulence as well as for the fluid seen by discrete particles in dispersed turbulent two-phase flows. The purpose of the present work is to provide guidelines, useful for experts and non-experts alike, which are shown to be helpful to clarify issues related to the form of Lagrangian stochastic models. A central issue is to put forward reliable requirements which must be met by Lagrangian stochastic models and a new element brought by the present analysis is to address the single- and two-phase flow situations from a unified point of view. For that purpose, we consider first the single-phase flow case and check whether models are fully consistent with the structure of the Reynolds-stress models. In the two-phase flow situation, coming up with clear-cut criteria is more difficult and the present choice is to require that the single-phase situation be well-retrieved in the fluid-limit case, elementary predictive abilities be respected and that some simple statistical features of homogeneous fluid turbulence be correctly reproduced. This analysis does not address the question of the relative predictive capacities of different models but concentrates on their formulation since advantages and disadvantages of different formulations are not always clear. Indeed, hidden in the changes from one structure to another are some possible pitfalls which can lead to flaws in the construction of practical models and to physically unsound numerical calculations. A first interest of the present approach is illustrated by considering some models proposed in the literature and by showing that these criteria help to assess whether these Lagrangian stochastic models can be regarded as acceptable descriptions. 
A second interest is to indicate how future developments can be safely built, which is also relevant for stochastic subgrid models for particle-laden flows in the context of Large Eddy Simulations.
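The simplest of the Lagrangian stochastic models discussed above is the simplified Langevin model (SLM) for a fluid-particle velocity component. An elementary consistency check of the kind the authors advocate is to verify a numerical integration against an analytic statistic; with frozen coefficients the SLM is an Ornstein-Uhlenbeck process with a known stationary variance. Parameter values below are illustrative:

```python
import numpy as np

def simulate_slm_component(n=20000, dt=1e-3, steps=4000, k=1.5, eps=1.0, C0=2.1, seed=1):
    """One velocity component of the simplified Langevin model (SLM),

        dU = -(1/2 + 3*C0/4) * (eps/k) * U dt + sqrt(C0 * eps) dW,

    integrated with Euler-Maruyama for fixed (illustrative) k and eps.
    """
    rng = np.random.default_rng(seed)
    a = (0.5 + 0.75 * C0) * (eps / k)     # inverse relaxation timescale
    u = np.zeros(n)
    for _ in range(steps):
        u += -a * u * dt + np.sqrt(C0 * eps * dt) * rng.standard_normal(n)
    return u

# With frozen coefficients the SLM is an Ornstein-Uhlenbeck process
# whose stationary variance is C0*eps/(2*a); check the simulation
# reproduces it.
C0, eps, k = 2.1, 1.0, 1.5
a = (0.5 + 0.75 * C0) * (eps / k)
u = simulate_slm_component()
print(abs(u.var() - C0 * eps / (2 * a)) < 0.05)   # True
```

Such checks on elementary statistics are exactly the kind of requirement the paper proposes for deciding whether a given stochastic formulation is acceptable.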
Andrew J. Shirk; Michael A. Schroeder; Leslie A. Robb; Samuel A. Cushman
2015-01-01
The ability of landscapes to impede species' movement or gene flow may be quantified by resistance models. Few studies have assessed the performance of resistance models parameterized by expert opinion. In addition, resistance models differ in terms of spatial and thematic resolution as well as their focus on the ecology of a particular species or more generally on the...
Maria C. Mateo-Sanchez; Niko Balkenhol; Samuel Cushman; Trinidad Perez; Ana Dominguez; Santiago Saura
2015-01-01
Most current methods to assess connectivity begin with landscape resistance maps. The prevailing resistance models are commonly based on expert opinion and, more recently, on a direct transformation of habitat suitability. However, habitat associations are not necessarily accurate indicators of dispersal, and thus may fail as a surrogate of resistance to...
Collaborative testing of turbulence models
NASA Technical Reports Server (NTRS)
Bradshaw, Peter; Launder, Brian E.; Lumley, John L.
1991-01-01
A review is given of an ongoing international project, in which data from experiments on, and simulations of, turbulent flows are distributed to developers of (time-averaged) engineering turbulence models. The predictions of each model are sent to the organizers and redistributed to all the modelers, plus some experimentalists and other experts (total approx. 120), for comment. The 'reaction time' of modelers has proved to be much longer than anticipated, partly because the comparisons with data have prompted many modelers to improve their models or numerics.
Corzo, Gerald; Solomatine, Dimitri
2007-05-01
Natural phenomena are multistationary and are composed of a number of interacting processes, so a single model handling all processes often suffers from inaccuracies. A solution is to partition the data in relation to such processes using available domain knowledge or expert judgment, to train separate models for each of the processes, and to merge them in a modular model (committee). In this paper a problem of water flow forecasting in watershed hydrology is considered, where the flow process can be viewed as consisting of two subprocesses -- base flow and excess flow -- so that these two processes can be separated. Several approaches to data separation are studied. Two case studies with different forecast horizons are considered. Parameters of the algorithms responsible for data partitioning are optimized using genetic algorithms and global pattern search. It was found that modularization of ANN models using domain knowledge makes models more accurate, compared with a global model trained on the whole data set, especially when the forecast horizon (and hence the complexity of the modelled processes) is increased.
Developing expertise in surgery.
Alderson, David
2010-01-01
The concept of expertise is widely embraced but poorly defined in surgery. Dictionary definitions differentiate between authority and experience, while a third view sees expertise as a mind-set rather than a status. Both absolute and relative models of expertise have been developed, and each allows a richer understanding of the application of these concepts to emerge. Trainees must develop both independent and interdependent expertise, and an appreciation of the essentially constructivist and uncertain nature of medical knowledge. Approach may be more important than innate talent; the concepts of 'flow', sustained 'deliberate practice' and 'adaptive expertise' are examples of expert approaches to learning. Non-analytical reasoning plays a key role in decision making at expert levels of practice. A technically gifted surgeon may be seen as a safety hazard rather than an expert if inter-dependent expertise has not been developed. Key roles of a surgical educator are to facilitate the development of an expert approach to education and to enable entry into and movement towards the centre of an expert community of practice.
VisFlow - Web-based Visualization Framework for Tabular Data with a Subset Flow Model.
Yu, Bowen; Silva, Claudio T
2017-01-01
Data flow systems allow the user to design a flow diagram that specifies the relations between system components which process, filter or visually present the data. Visualization systems may benefit from user-defined data flows as an analysis typically consists of rendering multiple plots on demand and performing different types of interactive queries across coordinated views. In this paper, we propose VisFlow, a web-based visualization framework for tabular data that employs a specific type of data flow model called the subset flow model. VisFlow focuses on interactive queries within the data flow, overcoming the limitation of interactivity from past computational data flow systems. In particular, VisFlow applies embedded visualizations and supports interactive selections, brushing and linking within a visualization-oriented data flow. The model requires all data transmitted by the flow to be a data item subset (i.e. groups of table rows) of some original input table, so that rendering properties can be assigned to the subset unambiguously for tracking and comparison. VisFlow features the analysis flexibility of a flow diagram, and at the same time reduces the diagram complexity and improves usability. We demonstrate the capability of VisFlow on two case studies with domain experts on real-world datasets showing that VisFlow is capable of accomplishing a considerable set of visualization and analysis tasks. The VisFlow system is available as open source on GitHub.
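The defining constraint of the subset flow model is that every edge transmits a subset of rows of one original table, so rendering properties can be attached to rows unambiguously. A minimal sketch of that idea (node names and the tiny table are hypothetical, not VisFlow's actual API):

```python
# Minimal "subset flow" sketch: every edge carries a subset of row
# indices of one original table.
table = [
    {"name": "A", "mpg": 30, "cyl": 4},
    {"name": "B", "mpg": 22, "cyl": 6},
    {"name": "C", "mpg": 15, "cyl": 8},
    {"name": "D", "mpg": 33, "cyl": 4},
]

def source():
    return set(range(len(table)))                 # all rows of the input table

def filter_node(subset, pred):
    return {i for i in subset if pred(table[i])}  # filtering yields a subset

def union_node(*subsets):                         # set union merges branches
    return set().union(*subsets)

efficient = filter_node(source(), lambda r: r["mpg"] > 25)
small     = filter_node(source(), lambda r: r["cyl"] == 4)
brushed   = union_node(efficient, small)

# Because every item is a row of the original table, a brushing color
# propagates through the diagram without ambiguity.
colors = {table[i]["name"]: "red" for i in brushed}
print(sorted(colors))   # ['A', 'D']
```

Restricting the flow to row subsets is what makes brushing and linking across coordinated views well-defined: two branches can always be compared because they index the same underlying rows.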
Deep nets vs expert designed features in medical physics: An IMRT QA case study.
Interian, Yannet; Rideout, Vincent; Kearney, Vasant P; Gennatas, Efstathios; Morin, Olivier; Cheung, Joey; Solberg, Timothy; Valdes, Gilmer
2018-03-30
The purpose of this study was to compare the performance of Deep Neural Networks against a technique designed by domain experts in the prediction of gamma passing rates for Intensity Modulated Radiation Therapy Quality Assurance (IMRT QA). A total of 498 IMRT plans across all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding window technique on Clinac iX or TrueBeam Linacs. Measurements were performed using a commercial 2D diode array, and passing rates for 3%/3 mm local dose/distance-to-agreement (DTA) were recorded. Separately, fluence maps calculated for each plan were used as inputs to a convolutional neural network (CNN). The CNNs were trained to predict IMRT QA gamma passing rates using TensorFlow and Keras. A set of model architectures, inspired by the convolutional blocks of the VGG-16 ImageNet model, were constructed and implemented. Synthetic data, generated by rotating and translating the fluence maps during training, was used to boost the performance of the CNNs. Dropout, batch normalization, and data augmentation were utilized to help train the model. The performance of the CNNs was compared to a generalized Poisson regression model, previously developed for this application, which used 78 expert-designed features. Deep Neural Networks without domain knowledge achieved comparable performance to a baseline system designed by domain experts in the prediction of 3%/3 mm local gamma passing rates. An ensemble of neural nets resulted in a mean absolute error (MAE) of 0.70 ± 0.05, and the domain expert model resulted in an MAE of 0.74 ± 0.06. Convolutional neural networks (CNNs) with transfer learning can predict IMRT QA passing rates by automatically designing features from the fluence maps without human expert supervision. Predictions from CNNs are comparable to a system carefully designed by physicist experts. © 2018 American Association of Physicists in Medicine.
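The augmentation step described above (synthetic samples from rotated and translated fluence maps) can be sketched with plain array operations. The map size and shift range below are illustrative assumptions, not the study's actual settings:

```python
import numpy as np

def augment(fluence, rng):
    """One synthetic training sample from a fluence map: a random
    90-degree rotation followed by a small wrap-around translation.
    Sizes and shift ranges are illustrative assumptions."""
    rotated = np.rot90(fluence, k=rng.integers(0, 4))
    dy, dx = rng.integers(-2, 3, size=2)
    return np.roll(np.roll(rotated, dy, axis=0), dx, axis=1)

rng = np.random.default_rng(42)
fluence = np.zeros((10, 10))
fluence[2:5, 3:8] = 1.0                     # toy "fluence map"
batch = [augment(fluence, rng) for _ in range(8)]

# Augmentation changes geometry but preserves total fluence.
print(all(abs(b.sum() - fluence.sum()) < 1e-9 for b in batch))   # True
```

Because rotations and wrap-around shifts conserve the summed fluence, the augmented samples remain physically plausible inputs while multiplying the effective training set size.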
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-28
... bring together experts from diverse backgrounds and experiences including electric system operators... transmission switching; AC optimal power flow modeling; and use of active and dynamic transmission ratings. In... variability of the system, including forecast error? [cir] How can outage probability be captured in...
[Model for unplanned self extubation of ICU patients using system dynamics approach].
Song, Yu Gil; Yun, Eun Kyoung
2015-04-01
In this study a system dynamics methodology was used to identify correlation and nonlinear feedback structure among factors affecting unplanned extubation (UE) of ICU patients and to construct and verify a simulation model. Factors affecting UE were identified through a theoretical background established by reviewing literature and preceding studies and referencing various statistical data. Related variables were decided through verification of content validity by an expert group. A causal loop diagram (CLD) was made based on the variables. Stock & Flow modeling using Vensim PLE Plus Version 6.0b was performed to establish a model for UE. Based on the literature review and expert verification, 18 variables associated with UE were identified and a CLD was prepared. From the prepared CLD, a model was developed by converting to the Stock & Flow Diagram. Results of the simulation showed that patient stress, patient agitation, restraint application, patient mobility, and individual intensive nursing were the variables with the greatest effect on UE probability. To verify agreement of the UE model with real situations, simulation with 5 cases was performed. Equation check and sensitivity analysis on TIME STEP were executed to validate model integrity. Results show that identification of a proper model enables prediction of UE probability. This prediction allows for adjustment of related factors and provides basic data to develop nursing interventions to decrease UE.
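A stock-and-flow model of the kind built above is, numerically, a set of stocks integrated over TIME STEP with Euler's method (which is how Vensim advances the model). A minimal one-stock sketch; the variable names and rates are hypothetical, not the study's actual equations:

```python
# One stock ("patient agitation") with an inflow driven by stress and an
# outflow from intensive nursing, integrated Euler-style per TIME STEP.
def simulate(steps=100, dt=0.25, stress=0.8, nursing=0.5, agitation0=0.0):
    agitation = agitation0
    history = []
    for _ in range(steps):
        inflow = stress * (1.0 - agitation)      # stress raises agitation
        outflow = nursing * agitation            # nursing calms the patient
        agitation += dt * (inflow - outflow)     # the stock integrates net flow
        history.append(agitation)
    return history

h = simulate()
# The stock settles at the balancing-loop equilibrium stress/(stress+nursing).
print(round(h[-1], 3))   # 0.615
```

Sensitivity analysis on TIME STEP then amounts to rerunning with smaller dt and confirming the trajectory is unchanged, which is the model-integrity check the study reports.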
A knowledge-based system for controlling automobile traffic
NASA Technical Reports Server (NTRS)
Maravas, Alexander; Stengel, Robert F.
1994-01-01
Transportation network capacity variations arising from accidents, roadway maintenance activity, and special events as well as fluctuations in commuters' travel demands complicate traffic management. Artificial intelligence concepts and expert systems can be useful in framing policies for incident detection, congestion anticipation, and optimal traffic management. This paper examines the applicability of intelligent route guidance and control as decision aids for traffic management. Basic requirements for managing traffic are reviewed, concepts for studying traffic flow are introduced, and mathematical models for modeling traffic flow are examined. Measures for quantifying transportation network performance levels are chosen, and surveillance and control strategies are evaluated. It can be concluded that automated decision support holds great promise for aiding the efficient flow of automobile traffic over limited-access roadways, bridges, and tunnels.
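Among the classic macroscopic traffic-flow models the paper reviews, Greenshields' relation is the simplest: speed falls linearly with density, so flow is parabolic in density and capacity occurs at half the jam density. Parameter values below are illustrative (km/h, vehicles/km):

```python
def greenshields_flow(density, v_free=100.0, k_jam=120.0):
    """Greenshields' macroscopic traffic-flow model: speed falls
    linearly with density, v = v_free * (1 - k/k_jam), so the flow
    q = k * v is parabolic in k.  Parameter values are illustrative."""
    speed = v_free * (1.0 - density / k_jam)
    return density * speed

# Capacity occurs at half the jam density: q_max = v_free * k_jam / 4.
flows = [greenshields_flow(k) for k in range(0, 121, 10)]
k_at_max = 10 * max(range(len(flows)), key=flows.__getitem__)
print(k_at_max, max(flows))   # 60 3000.0
```

Relations like this give traffic managers the quantitative link between measured density and achievable throughput that surveillance and control strategies are evaluated against.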
The Higher Education Policy of Global Experts Recruitment Program: Focused on China
ERIC Educational Resources Information Center
Kim, Hanna
2017-01-01
There is an increasing interest in how to train and use national experts around the world. Major advanced countries are putting their national efforts into attracting global experts overseas and preventing domestic experts from flowing out of their countries. China has also endeavored much to attract global experts for its economic development and…
Neural network river forecasting through baseflow separation and binary-coded swarm optimization
NASA Astrophysics Data System (ADS)
Taormina, Riccardo; Chau, Kwok-Wing; Sivakumar, Bellie
2015-10-01
The inclusion of expert knowledge in data-driven streamflow modeling is expected to yield more accurate estimates of river quantities. Modular models (MMs) designed to work on different parts of the hydrograph are a preferred way to implement such an approach. Previous studies have suggested that better predictions of total streamflow could be obtained via modular Artificial Neural Networks (ANNs) trained to perform an implicit baseflow separation. These MMs fit the baseflow and excess flow components separately, as produced by a digital filter, and reconstruct the total flow by adding the two signals at the output. The optimization of the filter parameters and ANN architectures is carried out through global search techniques. Despite the favorable premises, the real effectiveness of such MMs has been tested only on a few case studies, and the quality of the baseflow separation they perform has never been thoroughly assessed. In this work, we compare the performance of MMs against global models (GMs) for nine different gaging stations in the northern United States. Binary-coded swarm optimization is employed for the identification of filter parameters and model structure, while Extreme Learning Machines, instead of ANNs, are used to drastically reduce the large computational times required to perform the experiments. The results show no evidence that MMs outperform GMs for predicting the total flow. In addition, the baseflow produced by the MMs largely underestimates the actual baseflow component expected for most of the considered gages. This occurs because the values of the filter parameters maximizing overall accuracy do not reflect the geological characteristics of the river basins. Indeed, the results show that setting the filter parameters according to expert knowledge yields accurate baseflow separation but lower accuracy of total flow predictions, suggesting that these two objectives are intrinsically conflicting rather than compatible.
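The digital filter behind such implicit baseflow separation is commonly a recursive filter of the Lyne-Hollick type. A single forward pass can be sketched as follows; the filter form and parameter value are assumptions about a standard technique, not necessarily the paper's exact configuration:

```python
def lyne_hollick_baseflow(streamflow, alpha=0.925):
    """One-pass Lyne-Hollick recursive digital filter (sketch).

    Splits total streamflow into quickflow and baseflow. alpha is the
    filter parameter (typically 0.9-0.95). A single forward pass is a
    simplification; practical implementations often use multiple passes.
    """
    quick = 0.0
    baseflow = []
    prev_q = streamflow[0]
    for q in streamflow:
        # recursive quickflow update driven by changes in total flow
        quick = alpha * quick + 0.5 * (1.0 + alpha) * (q - prev_q)
        quick = max(quick, 0.0)
        # baseflow is the remainder, constrained to [0, total flow]
        baseflow.append(max(q - quick, 0.0))
        prev_q = q
    return baseflow
```

The point made in the abstract is that optimizing `alpha` (and related parameters) for total-flow accuracy can drive the split away from the geologically plausible baseflow this filter is meant to extract.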
Research Electronic Data Capture (REDCap®) used as an audit tool with a built-in database.
Kragelund, Signe H; Kjærsgaard, Mona; Jensen-Fangel, Søren; Leth, Rita A; Ank, Nina
2018-05-01
The aim of this study was to develop an audit tool with a built-in database using Research Electronic Data Capture (REDCap®) as part of an antimicrobial stewardship program at a regional hospital in the Central Denmark Region, and to analyse the need, if any, to involve more than one expert in the evaluation of cases of antimicrobial treatment, and the level of agreement among the experts. Patients treated with systemic antimicrobials in the period from 1 September 2015 to 31 August 2016 were included, in total 722 cases. Data were collected retrospectively and entered manually. The audit was based on seven flow charts regarding: (1) initiation of antimicrobial treatment; (2) infection; (3) prescription and administration of antimicrobials; (4) discontinuation of antimicrobials; (5) reassessment within 48 h after the first prescription of antimicrobials; (6) microbiological sampling in the period between suspicion of infection and the first administration of antimicrobials; and (7) microbiological results. The audit was based on automatic calculations drawing on the entered data and on expert assessments. Initially, two experts completed the audit, and in the cases in which they disagreed, a third expert was consulted. In 31.9% of the cases, the two experts agreed on all elements of the audit. In 66.2%, the two experts reached agreement by discussing the cases. Finally, 1.9% of the cases were completed in cooperation with a third expert. The experts assessed 3406 flow charts, of which they agreed on 75.8%. We succeeded in creating an audit tool with a built-in database that facilitates independent expert evaluation using REDCap. We found a large inter-observer difference that needs to be considered when constructing a project based on expert judgements. Our two experts agreed on most of the flow charts after discussion, whereas the third expert's intervention did not have any influence on the overall assessment. Copyright © 2018 Elsevier Inc. All rights reserved.
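The reported percentages correspond to simple case-level inter-observer agreement, which can be sketched as follows (the data structure is an illustrative assumption, not REDCap's export format):

```python
def agreement_summary(expert1, expert2):
    """Raw inter-observer agreement on case-level audit judgements.

    Returns the fraction of cases on which two experts agree; cases of
    disagreement would then be discussed or escalated to a third expert,
    as in the audit workflow described. Inputs are parallel lists of
    per-case assessments (illustrative structure).
    """
    assert len(expert1) == len(expert2), "one assessment per case per expert"
    agree = sum(a == b for a, b in zip(expert1, expert2))
    return agree / len(expert1)
```

For a rigorous analysis one would normally also report a chance-corrected statistic such as Cohen's kappa, since raw agreement can be inflated when one judgement dominates.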
Analysis of the Structure of Surgical Activity for a Suturing and Knot-Tying Task
Vedula, S. Swaroop; Malpani, Anand O.; Tao, Lingling; Chen, George; Gao, Yixin; Poddar, Piyush; Ahmidi, Narges; Paxton, Christopher; Vidal, Rene; Khudanpur, Sanjeev; Hager, Gregory D.; Chen, Chi Chiung Grace
2016-01-01
Background Surgical tasks are performed in a sequence of steps, and technical skill evaluation includes assessing task flow efficiency. Our objective was to describe differences in task flow for expert and novice surgeons for a basic surgical task. Methods We used a hierarchical semantic vocabulary to decompose and annotate maneuvers and gestures for 135 instances of a surgeon’s knot performed by 18 surgeons. We compared counts of maneuvers and gestures, and analyzed task flow by skill level. Results Experts used fewer gestures to perform the task (26.29; 95% CI = 25.21 to 27.38 for experts vs. 31.30; 95% CI = 29.05 to 33.55 for novices) and made fewer errors in gestures than novices (1.00; 95% CI = 0.61 to 1.39 vs. 2.84; 95% CI = 2.3 to 3.37). Transitions among maneuvers, and among gestures within each maneuver for expert trials were more predictable than novice trials. Conclusions Activity segments and state flow transitions within a basic surgical task differ by surgical skill level, and can be used to provide targeted feedback to surgical trainees. PMID:26950551
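One way to quantify how "predictable" transitions among annotated gestures are, in the spirit of the contrast reported between expert and novice trials, is the average conditional entropy of the next gesture. The metric below is an illustrative sketch, not the paper's exact analysis:

```python
from collections import Counter
import math

def transition_entropy(sequence):
    """Conditional entropy H(next | current), in bits, of gesture
    transitions in an annotated task sequence.

    Lower entropy means more predictable flow. Illustrative metric only;
    the paper's own transition analysis may differ.
    """
    pairs = Counter(zip(sequence, sequence[1:]))
    from_counts = Counter(sequence[:-1])
    n = len(sequence) - 1
    h = 0.0
    for (a, b), count in pairs.items():
        p_joint = count / n                  # joint prob of transition a->b
        p_cond = count / from_counts[a]      # conditional prob of b given a
        h -= p_joint * math.log2(p_cond)
    return h
```

A strictly alternating (fully deterministic) sequence scores 0 bits, while erratic novice-like sequences score higher, giving a single number to track across skill levels.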
Hoang, Linh; van Griensven, Ann; van der Keur, Peter; Refsgaard, Jens Christian; Troldborg, Lars; Nilsson, Bertel; Mynett, Arthur
2014-01-01
The European Union Water Framework Directive requires an integrated pollution prevention plan at the river basin level. Hydrological river basin modeling tools are therefore promising tools to support the quantification of pollution originating from different sources. A limited number of studies have reported on the use of these models to predict pollution fluxes in tile-drained basins. This study focused on evaluating different modeling tools and modeling concepts to quantify the flow and nitrate fluxes in the Odense River basin using DAISY-MIKE SHE (DMS) and the Soil and Water Assessment Tool (SWAT). The results show that SWAT accurately predicted flow at daily and monthly time steps, whereas simulation of nitrate fluxes was more accurate at a monthly time step. In comparison to the DMS model, which takes into account the uncertainty of soil hydraulic and slurry parameters, SWAT results for flow and nitrate fit well within the range of DMS simulated values in high-flow periods but were slightly lower in low-flow periods. Despite the similarities of simulated flow and nitrate fluxes at the basin outlet, the two models predicted very different separations into flow components (overland flow, tile drainage, and groundwater flow) as well as nitrate fluxes from flow components. It was concluded that the assessment of which model better represents reality in terms of flow paths should not be based solely on standard statistical metrics for the entire river basin but also needs to consider additional data, field experiments, and the opinions of field experts. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
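A standard statistical metric for comparing simulated and observed flow in such studies is the Nash-Sutcliffe efficiency (used here as a representative example; the paper may report additional metrics):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency (NSE), a standard goodness-of-fit
    metric for hydrological models such as SWAT.

    NSE = 1 means a perfect fit; NSE <= 0 means the model is no better
    than simply predicting the mean of the observations.
    """
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den
```

The abstract's caveat is precisely that two models can achieve similar NSE at the basin outlet while disagreeing badly on internal flow paths, which is why field data and expert opinion are needed alongside such metrics.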
Sensitivity analysis of the Gupta and Park chemical models on the heat flux by DSMC and CFD codes
NASA Astrophysics Data System (ADS)
Morsa, Luigi; Festa, Giandomenico; Zuppardi, Gennaro
2012-11-01
The present study is the logical continuation of a former paper by the first author in which the influence of the chemical models by Gupta and by Park on the computation of heat flux on the Orion and EXPERT capsules was evaluated. Tests were carried out with the direct simulation Monte Carlo code DS2V and the computational fluid dynamics (CFD) code H3NS. DS2V implements the Gupta model, while H3NS implements the Park model. In order to compare the effects of the chemical models, the Park model was also implemented in DS2V. The results showed that DS2V and H3NS compute a different composition both in the flow field and on the surface, even when using the same chemical model (Park). Furthermore, DS2V computes different compositions in the flow field with the two chemical models but the same composition on the surface, and therefore the same heat flux. In the present study, in order to evaluate the influence of these chemical models in a CFD code as well, the Gupta and Park models have been implemented in FLUENT. Tests with DS2V and FLUENT have been carried out for the EXPERT capsule at an altitude of 70 km and a velocity of 5000 m/s. The capsule experiences a hypersonic, continuum low-density regime. Due to the energy level of the flow, the vibration equation, lacking in the original version of FLUENT, has been implemented. The results of the heat flux computation verify that FLUENT is quite sensitive to the Gupta and Park chemical models: at the stagnation point, the percentage difference between the models is about 13%. By contrast, the DS2V results with the two models are practically equivalent.
Intuitive Visualization of Transient Flow: Towards a Full 3D Tool
NASA Astrophysics Data System (ADS)
Michel, Isabel; Schröder, Simon; Seidel, Torsten; König, Christoph
2015-04-01
Visualization of geoscientific data is a challenging task, especially when targeting a non-professional audience. In particular, the graphical presentation of transient vector data can be a significant problem. With STRING, Fraunhofer ITWM (Kaiserslautern, Germany), in collaboration with delta h Ingenieurgesellschaft mbH (Witten, Germany), developed commercial software for intuitive 2D visualization of 3D flow problems. Through the intuitive character of the visualization, experts can more easily convey their findings to non-professional audiences. In STRING, pathlets moving with the flow provide an intuition of the velocity and direction of both steady-state and transient flow fields. The visualization concept is based on the Lagrangian view of the flow, which means that the pathlets move along the direction given by pathlines. In order to capture every detail of the flow, an advanced method for intelligent, time-dependent seeding of the pathlets is implemented, based on ideas of the Finite Pointset Method (FPM) originally conceived at and continuously developed by Fraunhofer ITWM. Furthermore, the same method removes pathlets during the visualization to avoid visual cluttering. Additional scalar flow attributes, for example concentration or potential, can either be mapped directly onto the pathlets or displayed behind them on the 2D visualization plane. The extensive capabilities of STRING are demonstrated with the help of different applications in groundwater modeling. We will discuss the strengths and current restrictions of STRING which have surfaced during daily use of the software, for example by delta h. Although the software focuses on the graphical presentation of flow data for non-professional audiences, its intuitive visualization has also proven useful to experts when investigating details of flow fields. Due to the popular reception of STRING and its limitation to 2D, the need arises for an extension to a full 3D tool.
Currently STRING can generate animations of single 2D cuts, either planar or curved surfaces, through 3D simulation domains. To provide a general tool for experts enabling also direct exploration and analysis of large 3D flow fields the software needs to be extended to intuitive as well as interactive visualizations of entire 3D flow domains. The current research concerning this project, which is funded by the Federal Ministry for Economic Affairs and Energy (Germany), is presented.
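The Lagrangian pathlet movement described above amounts to integrating pathlines through a time-dependent velocity field. A minimal Runge-Kutta sketch, omitting STRING's intelligent seeding and removal logic, might look like this (the function names and interface are assumptions, not STRING's API):

```python
def trace_pathline(velocity, seed, t0=0.0, dt=0.01, steps=200):
    """Trace a pathline through a time-dependent 2D velocity field
    using classical 4th-order Runge-Kutta.

    `velocity(x, y, t) -> (u, v)` is any callable field. This sketches
    the Lagrangian view that STRING's pathlets are based on; seeding,
    removal, and rendering are omitted.
    """
    x, y, t = seed[0], seed[1], t0
    path = [(x, y)]
    for _ in range(steps):
        k1 = velocity(x, y, t)
        k2 = velocity(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1], t + 0.5 * dt)
        k3 = velocity(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1], t + 0.5 * dt)
        k4 = velocity(x + dt * k3[0], y + dt * k3[1], t + dt)
        x += dt / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        y += dt / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        t += dt
        path.append((x, y))
    return path
```

For a steady rotational field the traced points stay on a circle, a quick sanity check that the integrator preserves the flow's structure well enough for visualization.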
Castillo, Edward; Castillo, Richard; White, Benjamin; Rojo, Javier; Guerrero, Thomas
2012-01-01
Compressible flow based image registration operates under the assumption that the mass of the imaged material is conserved from one image to the next. Depending on how the mass conservation assumption is modeled, the performance of existing compressible flow methods is limited by factors such as image quality, noise, large magnitude voxel displacements, and computational requirements. The Least Median of Squares Filtered Compressible Flow (LFC) method introduced here is based on a localized, nonlinear least squares, compressible flow model that describes the displacement of a single voxel that lends itself to a simple grid search (block matching) optimization strategy. Spatially inaccurate grid search point matches, corresponding to erroneous local minimizers of the nonlinear compressible flow model, are removed by a novel filtering approach based on least median of squares fitting and the forward search outlier detection method. The spatial accuracy of the method is measured using ten thoracic CT image sets and large samples of expert determined landmarks (available at www.dir-lab.com). The LFC method produces an average error within the intra-observer error on eight of the ten cases, indicating that the method is capable of achieving a high spatial accuracy for thoracic CT registration. PMID:22797602
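The least-median-of-squares criterion at the core of the filtering step can be illustrated with a brute-force line fit. This sketch conveys the robust criterion only; it is not the paper's LFC implementation, which applies the idea to voxel displacement fields:

```python
import itertools
import statistics

def lms_line(points):
    """Least-median-of-squares line fit by exhaustive search over
    point pairs.

    Returns the (slope, intercept) of the candidate line minimizing the
    median of squared residuals. Because the median ignores the largest
    residuals, gross outliers (like erroneous block-matching matches)
    do not pull the fit. Brute-force sketch for small point sets.
    """
    best = None
    for (x1, y1), (x2, y2) in itertools.combinations(points, 2):
        if x1 == x2:
            continue  # skip vertical candidate lines
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        med = statistics.median((y - (m * x + c)) ** 2 for x, y in points)
        if best is None or med < best[0]:
            best = (med, m, c)
    return best[1], best[2]
```

With four collinear points and one gross outlier, the fit recovers the true line exactly, whereas an ordinary least-squares fit would be dragged toward the outlier.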
Seeing Fluid Physics via Visual Expertise Training
NASA Astrophysics Data System (ADS)
Hertzberg, Jean; Goodman, Katherine; Curran, Tim
2016-11-01
In a course on Flow Visualization, students often expressed that their perception of fluid flows had increased, implying the acquisition of a type of visual expertise, akin to that of radiologists or dog show judges. In the first steps towards measuring this expertise, we emulated an experimental design from psychology. The study had two groups of participants: "novices" with no formal fluids education, and "experts" who had passed as least one fluid mechanics course. All participants were trained to place static images of fluid flows into two categories (laminar and turbulent). Half the participants were trained on flow images with a specific format (Von Kármán vortex streets), and the other half on a broader group. Novices' results were in line with past perceptual expertise studies, showing that it is easier to transfer learning from a broad category to a new specific format than vice versa. In contrast, experts did not have a significant difference between training conditions, suggesting the experts did not undergo the same learning process as the novices. We theorize that expert subjects were able to access their conceptual knowledge about fluids to perform this new, visual task. This finding supports new ways of understanding conceptual learning.
Simplified Thermo-Chemical Modelling For Hypersonic Flow
NASA Astrophysics Data System (ADS)
Sancho, Jorge; Alvarez, Paula; Gonzalez, Ezequiel; Rodriguez, Manuel
2011-05-01
Hypersonic flows are associated with high temperatures, generally caused by the strong shock waves that appear in such flows. At high temperatures the vibrational degrees of freedom of the molecules may become excited, the molecules may dissociate into atoms, the molecules or free atoms may ionize, and molecular or ionic species that are unimportant at lower temperatures may be formed. To take these effects into account, a chemical model is needed; this model should be simple enough to be handled by a CFD code, yet sufficiently precise to capture the most important physics. This work concerns the validation of a chemical non-equilibrium model, implemented in a commercial CFD code, for obtaining the flow field around bodies in hypersonic flow. The selected non-equilibrium model is composed of seven species and six direct reactions together with their inverses. The commercial CFD code in which the non-equilibrium model has been implemented is FLUENT. For the validation, the X38/Sphynx Mach 20 case is rebuilt on a reduced geometry, including the 1/3 Lref forebody. This case has been run in the laminar regime, with a non-catalytic wall and radiative-equilibrium wall temperature. The validated non-equilibrium model is then applied to the EXPERT (European Experimental Re-entry Test-bed) vehicle at a specified trajectory point (Mach number 14). This case has also been run in the laminar regime, with a non-catalytic wall and radiative-equilibrium wall temperature.
NASA Astrophysics Data System (ADS)
Lidya, L.
2017-03-01
National Health Insurance has been implemented since 1 January 2014. A number of new policies have been established, including a multilevel referral system. The multilevel referral system classifies health care centers into three levels and determines that the flow of patient treatment should start from a first-level health care center. There are 144 kinds of diseases that must be treated at the first level, which is mainly staffed by general physicians. Unfortunately, the competence of physicians at the first level may not yet meet the required standard. To improve physicians' knowledge, the government has created many events to accelerate knowledge sharing; however, this still needs time and many resources to give significant results. An expert system is software that provides consulting services to non-expert users in accordance with its area of expertise, and it can improve the effectiveness and efficiency of knowledge sharing and learning. This research developed a model of a TB-diagnosis expert system that complies with the standard procedure and regulations of TB diagnosis. The proposed expert system has the following characteristics: a facility to manage multimedia clinical data; support for the complexity of TB diagnosis (combining rule-based and case-based reasoning); an interactive interface; good usability; multi-platform support; and an evolutionary design.
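The rule-based half of such a combined rule-based/case-based system can be sketched as forward chaining over if-then rules. The rules and fact names below are hypothetical placeholders, NOT actual TB diagnostic criteria or the paper's knowledge base:

```python
# Minimal forward-chaining rule engine sketch for a diagnostic expert
# system. Rules and facts are hypothetical placeholders, NOT actual TB
# diagnostic criteria.

RULES = [
    ({"cough>2weeks", "night_sweats"}, "suspect_tb"),
    ({"suspect_tb", "sputum_smear_positive"}, "refer_confirmatory_test"),
]

def forward_chain(facts, rules=RULES):
    """Repeatedly fire any rule whose conditions are all satisfied,
    adding its conclusion as a new fact, until nothing changes."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts
```

Chaining lets intermediate conclusions ("suspect_tb") trigger further rules, which is how a consultation-style system walks a non-expert user through a standard diagnostic procedure step by step.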
Sheikhtaheri, Abbas; Sadoughi, Farahnaz; Hashemi Dehaghi, Zahra
2014-09-01
The complexity of clinical decisions justifies the utilization of information systems based on artificial intelligence (e.g. expert systems and neural networks) to achieve better decisions; however, the application of these systems in the medical domain faces some challenges. We aimed to review the applications of these systems in the medical domain and discuss such challenges. Following a brief introduction to expert systems and neural networks, with a few examples, the challenges of these systems in the medical domain are discussed. We found that the applications of expert systems and artificial neural networks have increased in the medical domain. These systems have shown many advantages, such as utilization of experts' knowledge, capturing rare knowledge, more time for assessment of the decision, more consistent decisions, and a shorter decision-making process. In spite of all these advantages, there are challenges ahead in developing and using such systems, including maintenance, the experts required, entering patients' data into the system, problems of knowledge acquisition, problems in modeling medical knowledge, evaluation and validation of system performance, wrong recommendations and responsibility, the limited domains of such systems, and the necessity of integrating such systems into routine workflows. We concluded that expert systems and neural networks can be successfully used in medicine; however, there are many concerns and questions to be answered through future studies and discussions.
Modeling methods for merging computational and experimental aerodynamic pressure data
NASA Astrophysics Data System (ADS)
Haderlie, Jacob C.
This research describes a process to model surface pressure data sets as a function of wing geometry from computational and wind tunnel sources and then merge them into a single predicted value. The described merging process will enable engineers to integrate these data sets with the goal of utilizing the advantages of each data source while overcoming the limitations of both; this provides a single, combined data set to support analysis and design. The main challenge with this process is accurately representing each data source everywhere on the wing. Additionally, this effort demonstrates methods to model wind tunnel pressure data as a function of angle of attack as an initial step towards a merging process that uses both location on the wing and flow conditions (e.g., angle of attack, flow velocity or Reynolds number) as independent variables. This surrogate model of pressure as a function of angle of attack can be useful for engineers who need to predict the location of zero-order discontinuities, e.g., flow separation or normal shocks. Because, to the author's best knowledge, there is no published, well-established merging method for aerodynamic pressure data (here, the coefficient of pressure Cp), this work identifies promising modeling and merging methods, and then makes a critical comparison of these methods. Surrogate models represent the pressure data for both data sets. Cubic B-spline surrogate models represent the computational simulation results. Machine learning and multi-fidelity surrogate models represent the experimental data. This research compares three surrogates for the experimental data (sequential--a.k.a. online--Gaussian processes, batch Gaussian processes, and multi-fidelity additive corrector) on the merits of accuracy and computational cost. 
The Gaussian process (GP) methods employ cubic B-spline CFD surrogates as a model basis function to build a surrogate model of the WT data, and this usage of the CFD surrogate in building the WT data could serve as a "merging" because the resulting WT pressure prediction uses information from both sources. In the GP approach, this model basis function concept seems to place more "weight" on the Cp values from the wind tunnel (WT) because the GP surrogate uses the CFD to approximate the WT data values. Conversely, the computationally inexpensive additive corrector method uses the CFD B-spline surrogate to define the shape of the spanwise distribution of the Cp while minimizing prediction error at all spanwise locations for a given arc length position; this, too, combines information from both sources to make a prediction of the 2-D WT-based Cp distribution, but the additive corrector approach gives more weight to the CFD prediction than to the WT data. Three surrogate models of the experimental data as a function of angle of attack are also compared for accuracy and computational cost. These surrogates are a single Gaussian process model (a single "expert"), product of experts, and generalized product of experts. The merging approach provides a single pressure distribution that combines experimental and computational data. The batch Gaussian process method provides a relatively accurate surrogate that is computationally acceptable, and can receive wind tunnel data from port locations that are not necessarily parallel to a variable direction. On the other hand, the sequential Gaussian process and additive corrector methods must receive a sufficient number of data points aligned with one direction, e.g., from pressure port bands (tap rows) aligned with the freestream. The generalized product of experts best represents wind tunnel pressure as a function of angle of attack, but at higher computational cost than the single expert approach. 
The format of the application data from computational and experimental sources in this work precluded the merging process from including flow condition variables (e.g., angle of attack) among the independent variables, so the merging process is only conducted in the wing geometry variables of arc length and span. The merging process of Cp data allows a more "hands-off" approach to aircraft design and analysis (i.e., not as many engineers needed to debate the Cp distribution shape) and generates Cp predictions at any location on the wing. However, the costs that come with these benefits are engineer time (learning how to build surrogates), computational time in constructing the surrogates, and surrogate accuracy (surrogates introduce error into data predictions). This dissertation effort used the Trap Wing / First AIAA CFD High-Lift Prediction Workshop as a relevant transonic wing with a multi-element high-lift system, and this work identified that the batch GP model for the WT data and the B-spline surrogate for the CFD might best be combined using expert belief weights to describe Cp as a function of location on the wing element surface. (Abstract shortened by ProQuest.)
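The (generalized) product-of-experts combination compared above merges independent Gaussian predictions by multiplying their densities, i.e. precision-weighting their means. A minimal sketch of that combination rule (the generalized variant's per-expert weighting is omitted) is:

```python
def product_of_experts(means, variances):
    """Combine independent Gaussian 'expert' predictions via a product
    of experts: the combined distribution has summed precisions and a
    precision-weighted mean.

    Minimal sketch of the combination rule; the generalized product of
    experts additionally weights each expert's contribution.
    """
    precisions = [1.0 / v for v in variances]
    total_prec = sum(precisions)
    mean = sum(p * m for p, m in zip(precisions, means)) / total_prec
    return mean, 1.0 / total_prec
```

Note that the combined variance is always smaller than any individual expert's variance, reflecting that each expert contributes information; confident (low-variance) experts dominate the merged mean.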
Center for Hypersonic Combined Cycle Flow Physics
2015-03-24
team of expert experimentalists and numerical and chemical kinetic modelers. Flowfields were examined in the turbine/ramjet dual-inlet mode transition...using data from the NASA Glenn IMX facility and RANS calculations. In the ramjet/scramjet mode regime a dual-mode combustion wind tunnel was developed...
Subject terms: hypersonic combined cycle propulsion, turbine/ram dual-inlet transition, ram/scram dual-mode transition, hypervelocity regime, RANS, Hybrid
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1989-01-01
The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Numerical Analysis of Incipient Separation on 53 Deg Swept Diamond Wing
NASA Technical Reports Server (NTRS)
Frink, Neal T.
2015-01-01
A systematic analysis of incipient separation and subsequent vortex formation from moderately swept blunt leading edges is presented for a 53 deg swept diamond wing. This work contributes to a collective body of knowledge generated within the NATO/STO AVT-183 Task Group titled 'Reliable Prediction of Separated Flow Onset and Progression for Air and Sea Vehicles'. The objective is to extract insights from the experimentally measured and numerically computed flow fields that might enable turbulence experts to further improve their models for predicting swept blunt leading-edge flow separation. Details of vortex formation are inferred from numerical solutions after establishing a good correlation of the global flow field and surface pressure distributions between wind tunnel measurements and computed flow solutions. From this, significant and sometimes surprising insights into the nature of incipient separation and part-span vortex formation are derived from the wealth of information available in the computational solutions.
A prototype knowledge-based simulation support system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, T.R.; Roberts, S.D.
1987-04-01
As a preliminary step toward the goal of an intelligent automated system for simulation modeling support, we explore the feasibility of the overall concept by generating and testing a prototypical framework. A prototype knowledge-based computer system was developed to support a senior-level course in industrial engineering so that the overall feasibility of an expert simulation support system could be studied in a controlled and observable setting. The system behavior mimics the diagnostic (intelligent) process performed by the course instructor and teaching assistants, finding logical errors in INSIGHT simulation models and recommending appropriate corrective measures. The system was programmed in a non-procedural language (PROLOG) and designed to run interactively with students working on course homework and projects. The knowledge-based structure supports intelligent behavior, providing its users with access to an evolving accumulation of expert diagnostic knowledge. The non-procedural approach facilitates the maintenance of the system and helps merge the roles of expert and knowledge engineer by allowing new knowledge to be easily incorporated without regard to the existing flow of control. The background, features, and design of the system are described, and preliminary results are reported. Initial success is judged to demonstrate the utility of the reported approach and to support the ultimate goal of an intelligent modeling system that can support simulation modelers outside the classroom environment. Finally, future extensions are suggested.
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1988-01-01
This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
A CLIPS expert system for clinical flow cytometry data analysis
NASA Technical Reports Server (NTRS)
Salzman, G. C.; Duque, R. E.; Braylan, R. C.; Stewart, C. C.
1990-01-01
An expert system is being developed using CLIPS to assist clinicians in the analysis of multivariate flow cytometry data from cancer patients. Cluster analysis is used to find subpopulations representing various cell types in multiple datasets each consisting of four to five measurements on each of 5000 cells. CLIPS facts are derived from results of the clustering. CLIPS rules are based on the expertise of Drs. Stewart, Duque, and Braylan. The rules incorporate certainty factors based on case histories.
A Fuzzy Expert System for Fault Management of Water Supply Recovery in the ALSS Project
NASA Technical Reports Server (NTRS)
Tohala, Vapsi J.
1998-01-01
Modeling with new software is a challenge. CONFIG is designed to work with many types of systems in which discrete and continuous processes occur. The CONFIG software was used to model two subsystems of the Water Recovery system: ICB and TFB. The model currently works manually and only for water flows, with further implementation to be done in the future. Activities in the models still need to be implemented based on testing of the hardware for phase III. More improvements to CONFIG are in progress to make it more user-friendly software.
NASA Astrophysics Data System (ADS)
Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve
2016-04-01
Road and railway networks are one of the key factors in a country's economic growth. Inadequate infrastructure networks could be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can trigger other natural hazards such as landslides and debris flow. Disruptions caused by landslides are similar to those of floods and increase maintenance costs considerably. The effect of natural disasters on society is likely to increase due to a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least-cost-path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements, as well as historic incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.
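The core SMCA operation with expert judgements is a cell-wise weighted linear combination of normalized criterion layers. A minimal sketch follows; the layer contents and weights are illustrative assumptions, not the study's values:

```python
def weighted_overlay(criteria, weights):
    """Cell-wise weighted linear combination, the core SMCA operation.

    Each criterion is a normalized raster layer (list of cell scores in
    [0, 1], e.g. slope, soil type, rainfall intensity); weights come
    from expert judgement and must sum to 1. Returns the combined
    susceptibility score per cell. Layer names and weights here are
    illustrative assumptions.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "expert weights must sum to 1"
    n_cells = len(criteria[0])
    return [sum(w * layer[i] for w, layer in zip(weights, criteria))
            for i in range(n_cells)]
```

The sensitivity testing mentioned above corresponds to perturbing the expert weights and observing how much the resulting susceptibility surface, and hence the least-cost road alignment, changes.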
Prietula, M J; Feltovich, P J; Marchak, F
2000-01-01
We propose that considering four categories of task factors can facilitate knowledge elicitation efforts in the analysis of complex cognitive tasks: materials, strategies, knowledge characteristics, and goals. A study was conducted to examine the effects of altering aspects of two of these task categories on problem-solving behavior across skill levels: materials and goals. Two versions of an applied engineering problem were presented to expert, intermediate, and novice participants. Participants were to minimize the cost of running a steam generation facility by adjusting steam generation levels and flows. One version was cast in the form of a dynamic, computer-based simulation that provided immediate feedback on flows, costs, and constraint violations, thus incorporating key variable dynamics of the problem context. The other version was cast as a static computer-based model, with no dynamic components, cost feedback, or constraint checking. Experts performed better than the other groups across material conditions, and, when required, the presentation of the goal assisted the experts more than the other groups. The static group generated richer protocols than the dynamic group, but the dynamic group solved the problem in significantly less time. Little effect of feedback was found for intermediates, and none for novices. We conclude that demonstrating differences in performance in this task requires different materials than explicating underlying knowledge that leads to performance. We also conclude that substantial knowledge is required to exploit the information yielded by the dynamic form of the task or the explicit solution goal. This simple model can help to identify the contextual factors that influence elicitation and specification of knowledge, which is essential in the engineering of joint cognitive systems.
Expert System for ASIC Imaging
NASA Astrophysics Data System (ADS)
Gupta, Shri N.; Arshak, Khalil I.; McDonnell, Pearse; Boyce, Conor; Duggan, Andrew
1989-07-01
With the developments in artificial intelligence techniques over the last few years, building advisory, scheduling and similar classes of systems has become very convenient using tools such as PROLOG. This paper describes an expert system that helps lithographers and process engineers in several ways. The methodology used is to model each workstation according to its input, output and control parameters, combine these workstations in a logical sequence based on past experience, and work out a process schedule for a job. In addition, all the requirements vis-a-vis a particular job's parameters are converted into decision rules. For example, the exposure and develop times for wafers with different feature sizes would differ. This expert system has been written in Turbo Prolog. By building up a large number of rules, one can tune the program to any facility and use it for applications as diverse as advisory help and troubleshooting. Leitner (1) has described an advisory expert system that is being used at National Semiconductor. That system is quite different from the one reported in the present paper: for one, the approach differs; for another, the present system stresses job flow and process.
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.
1988-01-01
A complete listing is given of the expert system rules for the Entry phase of the Onboard Navigation (ONAV) Ground Based Expert Trainer System for aircraft/space shuttle navigation. These source listings appear in the same format as utilized and required by the C Language Integrated Production System (CLIPS) expert system shell which is the basis for the ONAV entry system. A schematic overview is given of how the rules are organized. These groups result from a partitioning of the rules according to the overall function which a given set of rules performs. This partitioning was established and maintained according to that established in the knowledge specification document. In addition, four other groups of rules are specified. The four groups (control flow, operator inputs, output management, and data tables) perform functions that affect all the other functional rule groups. As the name implies, control flow ensures that the rule groups are executed in the order required for proper operation; operator input rules control the introduction into the CLIPS fact base of various kinds of data required by the expert system; output management rules control the updating of the ONAV expert system user display screen during execution of the system; and data tables are static information utilized by many different rule sets gathered in one convenient place.
Symbolic Knowledge Processing for the Acquisition of Expert Behavior: A Study in Medicine.
1984-05-01
information. It provides a model for this type of study, suggesting a different approach to the problem of learning and efficiency of knowledge-based...flow of information 2.2. Scope and description of the subsystems Three subsystems perform distinct operations using the preceding knowledge sources...which actually yields a new knowledge representation where new external information is encoded in the combination and ordering of elements of the
An expert judgment model applied to estimating the safety effect of a bicycle facility.
Leden, L; Gårder, P; Pulkkinen, U
2000-07-01
This paper presents a risk index model that can be used for assessing the safety effect of countermeasures. The model estimates risk in a multiplicative way, which makes it possible to analyze the impact of different factors separately. Expert judgments are incorporated through a Bayesian error model. The variance of the risk estimate is determined by Monte-Carlo simulation. The model was applied to assess the safety effect of a new design of a bicycle crossing. The intent was to gain safety by raising the crossings to reduce vehicle speeds and by making the crossings more visible by painting them in a bright color. Before the implementations, bicyclists were riding on bicycle crossings of conventional Swedish type, i.e. similar to crosswalks but delineated by white squares rather than solid lines or zebra markings. Automobile speeds were reduced as anticipated. However, it seems as if the positive effect of this was more or less canceled out by increased bicycle speeds. The safety per bicyclist was still improved by approximately 20%. This improvement was primarily caused by an increase in bicycle flow, since the data show that more bicyclists at a given location seem to benefit overall bicyclist safety. The increase in bicycle flow was probably caused by the new layout of the crossings, since bicyclists perceived them as safer and causing less delay. Some future development work is suggested. Pros and cons of the methodology used are discussed. The most crucial parameter to be added is probably a model describing the interaction between motorists and bicyclists, for example, how risk is influenced by the lateral position of the bicyclist in relation to the motorist. It is concluded that the interaction seems to be optimal when both groups share the roadway.
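The multiplicative risk structure with Monte-Carlo variance estimation described above can be sketched as follows. The factor names, distributions and numerical values are hypothetical illustrations, not taken from the study:

```python
import random

def risk_index(speed_factor, visibility_factor, flow_factor):
    # Multiplicative model: overall risk is the product of separate factors,
    # so each factor's impact can be analyzed independently.
    return speed_factor * visibility_factor * flow_factor

def monte_carlo_risk(n=10_000, seed=1):
    """Propagate uncertainty in each factor to the overall risk estimate."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        # Hypothetical uncertainty around each factor's point estimate.
        speed = rng.gauss(0.85, 0.05)       # lower vehicle speeds reduce risk
        visibility = rng.gauss(0.90, 0.05)  # brighter markings reduce risk
        flow = rng.gauss(0.95, 0.05)        # safety-in-numbers effect
        samples.append(risk_index(speed, visibility, flow))
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, var

mean, var = monte_carlo_risk()
print(f"risk ratio after treatment: {mean:.2f} (variance {var:.4f})")
```

A ratio below 1 would indicate a net safety gain; the simulated variance quantifies how confident that conclusion is.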
Ko, Henry
2016-05-01
The healthcare field contains a multitude of opportunities for science communication. Given the many stakeholders dancing together in a multidirectional tango of communication, we need to ask how much the deficit model applies to the health field. History dictates that healthcare professionals are the holders of all knowledge, and the patients and other stakeholders are the ones that need the scientific information communicated to them. This essay argues otherwise, in part due to the rise of shared decision-making and of patients and other stakeholders acting as partners in healthcare. The traditional deficit model in health held that: (1) doctors were experts and patients were consumers, (2) it is impossible for the public to grasp the many disciplines of knowledge in medicine, (3) if experts have trouble keeping up with medical research then the public surely can't keep up, and (4) it is safer for healthcare professionals to communicate to the public using a deficit model. However, with the rise of partnerships with patients in healthcare decision-making, the deficit model might be weakening. Examples of public participation in healthcare decision-making include: (1) crowd-sourcing public participation in systematic reviews, (2) public participation in health policy, (3) public collaboration in health research, and (4) health consumer groups acting as producers of health information. With the challenges to the deficit model in science communication in health, caution is needed with the increasing role of technology and social media, and how these may affect the legitimacy of healthcare information flows away from the healthcare professional. © The Author(s) 2016.
Wolf, Sebastian; Brölz, Ellen; Keune, Philipp M; Wesa, Benjamin; Hautzinger, Martin; Birbaumer, Niels; Strehl, Ute
2015-02-01
Functional hemispheric asymmetry is assumed to constitute one underlying neurophysiological mechanism of flow-experience and skilled psycho-motor performance in table tennis athletes. We hypothesized that when initiating motor execution during motor imagery, elite table tennis players show higher right- than left-hemispheric temporal activity and stronger right temporal-premotor than left temporal-premotor theta coherence compared to amateurs. We additionally investigated whether less pronounced left temporal cortical activity is associated with more world rank points and more flow-experience. To this aim, electroencephalographic data were recorded in 14 expert and 15 amateur table tennis players. Subjects watched videos of an opponent serving a ball and were instructed to imagine themselves responding with a specific table tennis stroke. Alpha asymmetry scores were calculated by subtracting left from right hemispheric 8-13 Hz alpha power. 4-7 Hz theta coherence was calculated between temporal (T3/T4) and premotor (Fz) cortex. Experts showed a significantly stronger shift towards lower relative left-temporal brain activity compared to amateurs and a significantly stronger right temporal-premotor coherence than amateurs. The shift towards lower relative left-temporal brain activity in experts was associated with more flow-experience, and lower relative left temporal activity was correlated with more world rank points. The present findings suggest that skilled psycho-motor performance in elite table tennis players reflects less desynchronized brain activity in the left hemisphere and more coherent brain activity between fronto-temporal and premotor oscillations in the right hemisphere. This pattern probably reflects less interference of irrelevant verbal-analytical communication with motor-control mechanisms, which promotes flow-experience and predicts world rank in experts. Copyright © 2015 Elsevier B.V. All rights reserved.
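The asymmetry score described above (right minus left 8-13 Hz alpha power) can be sketched as follows. The sampling rate, synthetic channel data and Welch parameters are assumptions for illustration, not the study's recording setup:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Mean power spectral density within [lo, hi] Hz (Welch estimate)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def alpha_asymmetry(left_temporal, right_temporal, fs=FS):
    """Asymmetry score: right minus left 8-13 Hz alpha power.
    Higher alpha power marks *lower* cortical activity, so a negative
    score here means the left channel shows less activity."""
    return (band_power(right_temporal, fs, 8, 13)
            - band_power(left_temporal, fs, 8, 13))

# Synthetic demo: the left channel carries a stronger 10 Hz alpha rhythm.
t = np.arange(0, 10, 1 / FS)
rng = np.random.default_rng(0)
left = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
right = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
print(alpha_asymmetry(left, right))  # negative: left alpha dominates
```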
NASA Astrophysics Data System (ADS)
Jaramillo, L. V.; Stone, M. C.; Morrison, R. R.
2017-12-01
Decision-making for natural resource management is complex, especially for fire-impacted watersheds in the Southwestern US, because of the vital importance of water resources, the exorbitant cost of fire management and restoration, and the risks of the wildland-urban interface (WUI). While riparian and terrestrial vegetation are extremely important to ecosystem health and provide ecosystem services, loss of vegetation due to wildfire, post-fire flooding, and debris flows can lead to further degradation of the watershed and increased vulnerability to erosion and debris flow. Land managers are charged with taking measures to mitigate degradation of the watershed effectively and efficiently with limited time, money, and data. For our study, a Bayesian network (BN) approach is implemented to understand vegetation potential for Kasha-Katuwe Tent Rocks National Monument in the fire-impacted Peralta Canyon Watershed, New Mexico, USA. We implement both two-dimensional hydrodynamic and Bayesian network modeling to incorporate spatial variability in the system. Our coupled modeling framework presents vegetation recruitment and succession potential for three representative plant types (native riparian, native terrestrial, and non-native) under several hydrologic scenarios and management actions. In our BN model, we use variables that address timing, hydrologic, and groundwater conditions as well as recruitment and succession constraints for the plant types based on expert knowledge and literature. Our approach allows us to utilize small and incomplete data, incorporate expert knowledge, and explicitly account for uncertainty in the system. Our findings can be used to help land managers and local decision-makers determine their plan of action to increase watershed health and resilience.
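A miniature of the Bayesian-network idea, with hypothetical nodes and expert-style conditional probability tables (none of the variables or values are from the study), might look like:

```python
# Discrete nodes with expert-elicited conditional probability tables (CPTs),
# queried by simple enumeration. All names and numbers are invented.
CPT_FLOOD = {"spring": 0.6, "summer": 0.4}   # P(flood timing)
CPT_GW = {"shallow": 0.5, "deep": 0.5}       # P(groundwater depth)
# P(recruitment = yes | flood timing, groundwater depth)
CPT_RECRUIT = {
    ("spring", "shallow"): 0.8,
    ("spring", "deep"): 0.4,
    ("summer", "shallow"): 0.5,
    ("summer", "deep"): 0.1,
}

def recruitment_probability():
    """Marginal P(recruitment) for one illustrative plant type,
    summing over the parent nodes' states."""
    return sum(
        CPT_FLOOD[f] * CPT_GW[g] * CPT_RECRUIT[(f, g)]
        for f in CPT_FLOOD for g in CPT_GW
    )

print(round(recruitment_probability(), 3))  # 0.48
```

Changing a CPT entry is the BN analogue of a management action or hydrologic scenario: the marginal recruitment probability updates accordingly.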
Development and exemplification of a model for Teacher Assessment in Primary Science
NASA Astrophysics Data System (ADS)
Davies, D. J.; Earle, S.; McMahon, K.; Howe, A.; Collier, C.
2017-09-01
The Teacher Assessment in Primary Science project is funded by the Primary Science Teaching Trust and based at Bath Spa University. The study aims to develop a whole-school model of valid, reliable and manageable teacher assessment to inform practice and make a positive impact on primary-aged children's learning in science. The model is based on a data-flow 'pyramid' (analogous to the flow of energy through an ecosystem), whereby the rich formative assessment evidence gathered in the classroom is summarised for monitoring, reporting and evaluation purposes [Nuffield Foundation. (2012). Developing policy, principles and practice in primary school science assessment. London: Nuffield Foundation]. Using a design-based research (DBR) methodology, the authors worked in collaboration with teachers from project schools and other expert groups to refine, elaborate, validate and operationalise the data-flow 'pyramid' model, resulting in the development of a whole-school self-evaluation tool. In this paper, we argue that a DBR approach to theory-building and school improvement drawing upon teacher expertise has led to the identification, adaptation and successful scaling up of a promising approach to school self-evaluation in relation to assessment in science.
Design And Ground Testing For The Expert PL4/PL5 'Natural And Roughness Induced Transition'
NASA Astrophysics Data System (ADS)
Masutti, Davie; Chazot, Olivier; Donelli, Raffaele; de Rosa, Donato
2011-05-01
Unpredicted boundary layer transition can dramatically impact the stability of a vehicle and its aerodynamic coefficients, and reduce the efficiency of the thermal protection system. In this context, ESA started the EXPERT (European eXPErimental Reentry Testbed) program to provide and perform in-flight experiments in order to obtain aerothermodynamic data for the validation of numerical models and of ground-to-flight extrapolation methodologies. For the boundary layer transition investigation, the EXPERT vehicle is equipped with two specific payloads, PL4 and PL5, concerning respectively the study of natural and roughness-induced transition. The paper surveys the design process of these two in-flight experiments and covers the major analyses and findings encountered during the development of the payloads. A large number of transition criteria have been investigated and used to estimate either the criticality of the distributed roughness height arising from nose erosion, or the effectiveness of the isolated roughness element height in forcing boundary layer transition. Supporting the PL4 design, linear stability computations and CFD analyses have been performed by CIRA on the EXPERT flight vehicle to determine the amplification factor of the boundary layer instabilities at different points of the re-entry trajectory. Ground test experiments regarding the PL5 were carried out in the Mach 6 VKI H3 Hypersonic Wind Tunnel at unit Reynolds numbers ranging from 18E6/m to 26E6/m. Infrared measurements (Stanton number) and flow visualization are used on a 1/16-scale model of the EXPERT vehicle and a flat plate to validate the Potter and Whitfield criterion as a suitable methodology for ground-to-flight extrapolation and for the payload design.
Independent Review of Simulation of Net Infiltration for Present-Day and Potential Future Climates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Review Panel: Soroosh Sorooshian, Ph.D., Panel Chairperson, University of California, Irvine; Jan M. H. Hendrickx, Ph.D., New Mexico Institute of Mining and Technology; Binayak P. Mohanty, Ph.D., Texas A&M University
The DOE Office of Civilian Radioactive Waste Management (OCRWM) tasked Oak Ridge Institute for Science and Education (ORISE) with providing an independent expert review of the documented model and prediction results for net infiltration of water into the unsaturated zone at Yucca Mountain. The specific purpose of the model, as documented in the report MDL-NBS-HS-000023, Rev. 01, is “to provide a spatial representation, including epistemic and aleatory uncertainty, of the predicted mean annual net infiltration at the Yucca Mountain site ...” (p. 1-1) The expert review panel assembled by ORISE concluded that the model report does not provide a technically credible spatial representation of net infiltration at Yucca Mountain. Specifically, the ORISE Review Panel found that: • A critical lack of site-specific meteorological, surface, and subsurface information prevents verification of (i) the net infiltration estimates, (ii) the uncertainty estimates of parameters caused by their spatial variability, and (iii) the assumptions used by the modelers (ranges and distributions) for the characterization of parameters. The paucity of site-specific data used by the modeling team for model implementation and validation is a major deficiency in this effort. • The model does not incorporate at least one potentially important hydrologic process. Subsurface lateral flow is not accounted for by the model, and the assumption that the effect of subsurface lateral flow is negligible is not adequately justified. This issue is especially critical for the wetter climate periods. This omission may be one reason the model results appear to underestimate net infiltration beneath wash environments and therefore imprecisely represent the spatial variability of net infiltration.
• While the model uses assumptions consistently, such as uniform soil depths and a constant vegetation rooting depth, such assumptions may not be appropriate for this net infiltration simulation because they oversimplify a complex landscape and associated hydrologic processes, especially since the model assumptions have not been adequately corroborated by field and laboratory observations at Yucca Mountain.
Common sense reasoning about petroleum flow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, S.
1981-02-01
This paper describes an expert system for understanding and reasoning in a petroleum resources domain. A basic model is implemented in FRL (Frame Representation Language). Expertise is encoded as rule frames. The model consists of a set of episodic contexts which are sequentially generated over time. Reasoning occurs in separate reasoning contexts consisting of a buffer frame and packets of rules. These function similarly to small production systems. Reasoning is linked to the model through an interface of Sentinels (instance-driven demons) which notice anomalous conditions. Heuristics and metaknowledge are used through the creation of further reasoning contexts which overlay the simpler ones.
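The Sentinel mechanism (instance-driven demons attached to frame slots) might be sketched as follows; the frame, slot and threshold names are invented for illustration, and FRL itself is not reproduced:

```python
# Minimal frame system with demons: assigning a slot value fires any
# attached watcher, mimicking how Sentinels notice anomalous conditions
# and hand them to a reasoning context.
class Frame:
    def __init__(self, name):
        self.name = name
        self.slots = {}
        self.demons = {}  # slot -> callbacks fired on assignment

    def attach_sentinel(self, slot, demon):
        self.demons.setdefault(slot, []).append(demon)

    def set(self, slot, value):
        self.slots[slot] = value
        for demon in self.demons.get(slot, []):
            demon(self, value)

anomalies = []  # stand-in for a reasoning context's buffer
well = Frame("well-7")
# Sentinel notices an anomalous pressure drop and records it for reasoning.
well.attach_sentinel(
    "pressure",
    lambda f, v: anomalies.append((f.name, v)) if v < 100 else None,
)
well.set("pressure", 250)  # normal reading: sentinel stays quiet
well.set("pressure", 80)   # anomalous reading: sentinel fires
print(anomalies)  # [('well-7', 80)]
```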
Knowledge Engineering (Or, Catching Black Cats in Dark Rooms).
ERIC Educational Resources Information Center
Ruyle, Kim E.
1993-01-01
Discusses knowledge engineering, its relationship to artificial intelligence, and possible applications to developing expert systems, job aids, and technical training. The educational background of knowledge engineers is considered; the role of subject matter experts is described; and examples of flow charts, lists, and pictorial representations…
Early warning, warning or alarm systems for natural hazards? A generic classification.
NASA Astrophysics Data System (ADS)
Sättele, Martina; Bründl, Michael; Straub, Daniel
2013-04-01
Early warning, warning and alarm systems have gained popularity in recent years as cost-efficient measures for dangerous natural hazard processes such as floods, storms, rock and snow avalanches, debris flows, rock and ice falls, landslides, flash floods, glacier lake outburst floods, forest fires and even earthquakes. These systems can generate information before an event causes loss of property and life. In this way, they mainly mitigate the overall risk by reducing the presence probability of endangered objects. These systems are typically prototypes tailored to specific project needs. Despite their importance there is no recognised system classification. This contribution classifies warning and alarm systems into three classes: i) threshold systems, ii) expert systems and iii) model-based expert systems. The result is a generic classification, which takes the characteristics of the natural hazard process itself and the related monitoring possibilities into account. The choice of the monitoring parameters directly determines the system's lead time. The classification of 52 active systems moreover revealed typical system characteristics for each system class. i) Threshold systems monitor dynamic process parameters of ongoing events (e.g. water level of a debris flow) and provide only short lead times. They have a local geographical coverage, and a predefined threshold determines whether an alarm is automatically activated to warn endangered objects, authorities and system operators. ii) Expert systems monitor direct changes in the variable disposition (e.g. crack opening before a rock avalanche) or trigger events (e.g. heavy rain) at a local scale before the main event starts and thus offer extended lead times. The final alarm decision incorporates human, model and organisational factors. iii) Model-based expert systems monitor indirect changes in the variable disposition (e.g. 
snow temperature, height or solar radiation that influence the occurrence probability of snow avalanches) or trigger events (e.g. heavy snowfall) to predict spontaneous hazard events in advance. They encompass regional or national measuring networks and satisfy additional demands such as the standardisation of the measuring stations. The developed classification and the characteristics revealed for each class yield valuable input for quantifying the reliability of warning and alarm systems. Importantly, this will make it easier to compare them with well-established standard mitigation measures such as dams, nets and galleries within an integrated risk management approach.
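Class i), a threshold system, can be illustrated with a minimal sketch; the monitored parameter and threshold value are hypothetical, not drawn from any of the 52 classified systems:

```python
# Sketch of a threshold system: it monitors a dynamic process parameter of
# an ongoing event and automatically raises an alarm when a predefined
# threshold is exceeded. Threshold and readings are illustrative only.
FLOW_DEPTH_THRESHOLD_M = 1.2  # hypothetical debris-flow depth threshold

def check_alarm(readings_m, threshold=FLOW_DEPTH_THRESHOLD_M):
    """Return the first reading exceeding the threshold, else None.
    A non-None result would trigger warnings to endangered objects,
    authorities and system operators."""
    for depth in readings_m:
        if depth > threshold:
            return depth
    return None

print(check_alarm([0.3, 0.7, 1.5, 2.0]))  # 1.5 triggers the alarm
```

The short lead time of this class is visible in the sketch: the alarm can only fire once the event itself is already producing readings.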
TARGET: Rapid Capture of Process Knowledge
NASA Technical Reports Server (NTRS)
Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.
1993-01-01
TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, that is designed to support this type of knowledge capture undertaking. This paper will describe the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers and managers will be discussed. This discussion includes the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach to generating production rules for incorporation in and development of a CLIPS-based expert system will be elaborated. TARGET also permits experts to visually describe procedural tasks as a common medium for knowledge refinement by the expert community and knowledge engineer, making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET. 
A description of efforts to support TARGET's interoperability issues on PCs, Macintoshes and UNIX workstations concludes the paper.
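The idea of generating production rules from a captured task procedure might be sketched as below. This is not TARGET's actual implementation; the procedure, step and rule names are invented, and only the general CLIPS `defrule` shape is assumed:

```python
# Walk an ordered list of procedural steps and emit one CLIPS-style
# production rule per step, chaining each step to the completion of its
# predecessor so the expert system enforces the captured ordering.
def generate_rules(procedure, steps):
    rules = []
    prev = None
    for i, step in enumerate(steps, start=1):
        condition = f"(completed {prev})" if prev else f"(start {procedure})"
        rules.append(
            f"(defrule {procedure}-step-{i}\n"
            f"   {condition}\n"
            f"   =>\n"
            f"   (assert (perform {step})))"
        )
        prev = step
    return rules

rules = generate_rules("purge-line",
                       ["close-valve", "vent-pressure", "open-bypass"])
print(rules[1])
```

In TARGET itself the input would come from the graphically captured task hierarchy rather than a hand-written list, but the translation step is the same in spirit.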
NASA Astrophysics Data System (ADS)
Bouaziz, Laurène; de Boer-Euser, Tanja; Brauer, Claudia; Drogue, Gilles; Fenicia, Fabrizio; Grelier, Benjamin; de Niel, Jan; Nossent, Jiri; Pereira, Fernando; Savenije, Hubert; Thirel, Guillaume; Willems, Patrick
2016-04-01
International collaboration between institutes and universities is a promising way to reach consensus on hydrological model development. Education, experience and expert knowledge of the hydrological community have resulted in the development of a great variety of model concepts, calibration methods and analysis techniques. Although comparison studies are very valuable for international cooperation, they often do not lead to very clear new insights regarding the relevance of the modelled processes. We hypothesise that this is partly caused by model complexity and the comparison methods used, which focus on a good overall performance instead of on specific events. We propose an approach that focuses on the evaluation of specific events. Eight international research groups calibrated their model for the Ourthe catchment in Belgium (1607 km2) and carried out a validation in time for the Ourthe (i.e. on two different periods, one of them in blind mode for the modellers) and a validation in space for nested and neighbouring catchments of the Meuse in a completely blind mode. For each model, the same protocol was followed and an ensemble of best performing parameter sets was selected. Signatures were first used to assess model performances in the different catchments during validation. Comparison of the models was then followed by evaluation of selected events, which include: low flows, high flows and the transition from low to high flows. While the models show rather similar performances based on general metrics (i.e. Nash-Sutcliffe Efficiency), clear differences can be observed for specific events. While most models are able to simulate high flows well, large differences are observed during low flows and in the ability to capture the first peaks after drier months. The transferability of model parameters to neighbouring and nested catchments is assessed as an additional measure in the model evaluation. 
This suggested approach helps to select, among competing model alternatives, the most suitable model for a specific purpose.
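The general metric mentioned above, the Nash-Sutcliffe Efficiency, compares model error against the variance of the observations. A short computation (the discharge values are illustrative, not from the Ourthe study):

```python
# NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
# NSE = 1 means a perfect fit; NSE = 0 means the model is no better
# than simply predicting the mean of the observations.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - num / den

obs = [2.0, 3.5, 10.0, 6.0, 2.5]  # illustrative discharges
sim = [2.2, 3.0, 9.0, 6.5, 2.0]
print(round(nse(obs, sim), 3))  # 0.959
```

Because the denominator is dominated by high-flow variance, two models can share a high NSE yet differ sharply on low flows, which is exactly why the study supplements it with event-based evaluation.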
Knowledge-based zonal grid generation for computational fluid dynamics
NASA Technical Reports Server (NTRS)
Andrews, Alison E.
1988-01-01
Automation of flow field zoning in two dimensions is an important step towards reducing the difficulty of three-dimensional grid generation in computational fluid dynamics. Using a knowledge-based approach makes sense, but problems arise which are caused by aspects of zoning involving perception, lack of expert consensus, and design processes. These obstacles are overcome by means of a simple shape and configuration language, a tunable zoning archetype, and a method of assembling plans from selected, predefined subplans. A demonstration system for knowledge-based two-dimensional flow field zoning has been successfully implemented and tested on representative aerodynamic configurations. The results show that this approach can produce flow field zonings that are acceptable to experts with differing evaluation criteria.
Stream-flow forecasting using extreme learning machines: A case study in a semi-arid region in Iraq
NASA Astrophysics Data System (ADS)
Yaseen, Zaher Mundher; Jaafar, Othman; Deo, Ravinesh C.; Kisi, Ozgur; Adamowski, Jan; Quilty, John; El-Shafie, Ahmed
2016-11-01
Monthly stream-flow forecasting can yield important information for hydrological applications including sustainable design of rural and urban water management systems, optimization of water resource allocations, water use, pricing and water quality assessment, and agriculture and irrigation operations. The motivation for exploring and developing expert predictive models is an ongoing endeavor for hydrological applications. In this study, the potential of a relatively new data-driven method, namely the extreme learning machine (ELM) method, was explored for forecasting monthly stream-flow discharge rates in the Tigris River, Iraq. The ELM algorithm is a single-layer feedforward neural network (SLFN) which randomly selects the input weights and hidden-layer biases and analytically determines the output weights of the network. Based on the partial autocorrelation functions of historical stream-flow data, a set of five input combinations with lagged stream-flow values was employed to establish the best forecasting model. A comparative investigation was conducted to evaluate the performance of the ELM against other data-driven models: support vector regression (SVR) and generalized regression neural network (GRNN). The forecasting metrics, defined as the correlation coefficient (r), Nash-Sutcliffe efficiency (ENS), Willmott's Index (WI), root-mean-square error (RMSE) and mean absolute error (MAE), computed between the observed and forecasted stream-flow data, are employed to assess the ELM model's effectiveness. The results revealed that the ELM model outperformed the SVR and the GRNN models across a number of statistical measures. In quantitative terms, superiority of ELM over SVR and GRNN models was exhibited by ENS = 0.578, 0.378 and 0.144, r = 0.799, 0.761 and 0.468 and WI = 0.853, 0.802 and 0.689, respectively, and the ELM model attained a lower RMSE value by approximately 21.3% (relative to SVR) and approximately 44.7% (relative to GRNN). 
Based on the findings of this study, several recommendations were suggested for further exploration of the ELM model in hydrological forecasting problems.
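A minimal sketch of the ELM idea described above: input weights and hidden biases are drawn at random, and only the output weights are solved analytically by least squares. The data are synthetic lagged values; nothing here reproduces the study's Tigris River data or model configuration:

```python
import numpy as np

rng = np.random.default_rng(42)

def elm_fit(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytic output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Synthetic "stream-flow" series: forecast x_t from five lagged values,
# mirroring the lagged-input design noted in the abstract.
t = np.arange(300, dtype=float)
flow = np.sin(2 * np.pi * t / 12) + 0.1 * rng.normal(size=t.size)
X = np.column_stack([flow[i:i - 5] for i in range(5)])  # lags t-5 .. t-1
y = flow[5:]
model = elm_fit(X[:250], y[:250])
rmse = np.sqrt(np.mean((elm_predict(model, X[250:]) - y[250:]) ** 2))
print(f"test RMSE: {rmse:.3f}")
```

Because only `beta` is fitted, training reduces to one linear solve, which is the speed advantage usually claimed for ELM over iteratively trained networks.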
A model for assessing water quality risk in catchments prone to wildfire
NASA Astrophysics Data System (ADS)
Langhans, Christoph; Smith, Hugh; Chong, Derek; Nyman, Petter; Lane, Patrick; Sheridan, Gary
2017-04-01
Post-fire debris flows can have erosion rates up to three orders of magnitude higher than background rates. They are major sources of fine suspended sediment, which is critical to the safety of water supply from forested catchments. Fire can cover parts or all of these large catchments and burn severity is often heterogeneous. The probability of spatial and temporal overlap of fire disturbance and rainfall events, and the susceptibility of hillslopes to severe erosion determine the risk to water quality. Here we present a model to calculate recurrence intervals of high magnitude sediment delivery from runoff-generated debris flows to a reservoir in a large catchment (>100 km2) accounting for heterogeneous burn conditions. Debris flow initiation was modelled with indicators of surface runoff and soil surface erodibility. Debris flow volume was calculated with an empirical model, and fine sediment delivery was calculated using simple, expert-based assumptions. In a Monte-Carlo simulation, wildfire was modelled with a fire spread model using historic data on weather and ignition probabilities for a forested catchment in central Victoria, Australia. Multiple high intensity storms covering the study catchment were simulated using Intensity-Frequency-Duration relationships, and the runoff indicator calculated with a runoff model for hillslopes. A sensitivity analysis showed that fine sediment is most sensitive to variables related to the texture of the source material, debris flow volume estimation, and the proportion of fine sediment transported to the reservoir. As a measure of indirect validation, denudation rates of 4.6 - 28.5 mm ka-1 were estimated and compared well to other studies in the region. From the results it was extrapolated that in the absence of fire management intervention the critical sediment concentrations in the studied reservoir could be exceeded in intervals of 18 - 124 years.
Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft
NASA Technical Reports Server (NTRS)
Schlierf, Roland; Stambolian, Damon B.; Miller, Darcy; Posanda, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderson, Gena; Barth, Tim
2010-01-01
The Constellation Program (CxP) Orion vehicle goes through several areas and stages of processing before it is launched at the Kennedy Space Center. In order to have efficient and effective processing, all of the activities need to be analyzed. This was accomplished by first developing a timeline of events that included each activity; each activity was then analyzed by operability experts and human factors experts with spacecraft processing experience. This paper's focus is to explain the results and the process for developing this human factors operability timeline analysis to improve the processing flow of Orion.
Richard S. Holthausen; Michael J. Wisdom; John Pierce; Daniel K. Edwards; Mary M. Rowland
1994-01-01
We used expert opinion to evaluate the predictive reliability of a habitat effectiveness model for elk in western Oregon and Washington. Twenty-five experts in elk ecology were asked to rate habitat quality for 16 example landscapes. Rankings and ratings of 21 experts were significantly correlated with model output. Expert opinion and model predictions differed for 4...
Utilizing DMAIC six sigma and evidence-based medicine to streamline diagnosis in chest pain.
Kumar, Sameer; Thomas, Kory M
2010-01-01
The purpose of this study was to quantify the difference between the current process flow model for a typical patient workup for chest pain and development of a new process flow model that incorporates DMAIC (define, measure, analyze, improve, control) Six Sigma and evidence-based medicine in a best practices model for diagnosis and treatment. The first stage, DMAIC Six Sigma, is used to highlight areas of variability and unnecessary tests in the current process flow for a patient presenting to the emergency department or physician's clinic with chest pain (also known as angina). The next stage, patient process flow, utilizes DMAIC results in the development of a simulated model that represents real-world variability in the diagnosis and treatment of a patient presenting with angina. The third and final stage is used to analyze the evidence-based output and quantify the factors that drive physician diagnosis accuracy and treatment, as well as review the potential for a broad national evidence-based database. Because of the collective expertise captured within the computer-oriented evidence-based model, the study has introduced an innovative approach to health care delivery by bringing expert-level care to any physician triaging a patient for chest pain anywhere in the world. Similar models can be created for other ailments as well, such as headache, gastrointestinal upset, and back pain. This updated way of looking at diagnosing patients stemming from an evidence-based best practice decision support model may improve workflow processes and cost savings across the health care continuum.
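The difference between the current and streamlined workup pathways can be illustrated with a tiny Monte-Carlo simulation of stage durations. The stages and their (median, sigma) lognormal parameters below are hypothetical numbers chosen for illustration, not values from the study.

```python
import math
import random

# Hypothetical workup stages as (median_minutes, lognormal_sigma).
CURRENT_PATH = [(15, 0.3), (45, 0.4), (60, 0.5), (90, 0.5)]   # triage, ECG, labs, redundant tests
STREAMLINED  = [(15, 0.3), (45, 0.4), (40, 0.3)]              # evidence-based path, redundancy removed

def pathway_minutes(stages, rng):
    """One simulated patient: total time is the sum of lognormally
    varying stage durations (DMAIC's 'measure' step, in miniature)."""
    return sum(rng.lognormvariate(math.log(median), sigma)
               for median, sigma in stages)

def mean_time(stages, n=20_000, seed=0):
    """Average total workup time over many simulated patients."""
    rng = random.Random(seed)
    return sum(pathway_minutes(stages, rng) for _ in range(n)) / n
```

Comparing `mean_time(CURRENT_PATH)` with `mean_time(STREAMLINED)` quantifies the improvement the same way the study's process-flow model does, with variability built into every stage.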
Regional Management of an Aquifer for Mining Under Fuzzy Environmental Objectives
NASA Astrophysics Data System (ADS)
Bogárdi, István; Bárdossy, András; Duckstein, Lucien
1983-12-01
A methodology is developed for the dynamic multiobjective management of a multipurpose regional aquifer. In a case study of bauxite mining in Western Hungary, ore deposits are often under the piezometric level of a karstic aquifer, while this same aquifer also provides recharge flows for thermal springs. N + 1 objectives are to be minimized, the first one being total discounted cost of control by dewatering or grouting; the other N objectives consist of the flow of thermal springs at N control points. However, there is no agreement among experts as to a set of numerical values that would constitute a "sound environment"; for this reason a fuzzy set analysis is used, and the N environmental objectives are combined into a single fuzzy membership function. The constraints include ore availability, various capacities, and the state transition function that describes the behavior of both piezometric head and underground flow. The model is linearized and solved as a biobjective dynamic program by using multiobjective compromise programming. A numerical example with N = 2 appears to lead to realistic control policies. Extension of the model to the nonlinear case is discussed.
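The fuzzy combination step can be sketched with a piecewise-linear membership function for each spring flow and the conservative min (intersection) operator over the N environmental objectives. The flow limits below are made-up numbers; the study elicited its membership shapes from expert judgment.

```python
def spring_membership(flow, q_bad, q_good):
    """Piecewise-linear fuzzy membership of a 'sound environment':
    0 at or below q_bad, 1 at or above q_good, linear in between
    (an illustrative shape only)."""
    if flow <= q_bad:
        return 0.0
    if flow >= q_good:
        return 1.0
    return (flow - q_bad) / (q_good - q_bad)

def combined_membership(flows, limits):
    """Combine the N environmental objectives into one fuzzy membership
    using the conservative min (intersection) operator."""
    return min(spring_membership(f, lo, hi)
               for f, (lo, hi) in zip(flows, limits))
```

The min operator makes the combined objective as good as its worst spring, which is why the N environmental objectives collapse into a single membership function for the biobjective program.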
An expert system for diagnostics and estimation of steam turbine components condition
NASA Astrophysics Data System (ADS)
Murmansky, B. E.; Aronson, K. E.; Brodov, Yu. M.
2017-11-01
The report describes a probabilistic expert system for diagnostics and state estimation of steam turbine technological subsystem components. The expert system is based on Bayes' theorem and makes it possible to troubleshoot equipment components using expert experience when baseline information on turbine operating indicators is lacking. Within a unified approach, the expert system solves the problems of diagnosing the turbine's steam flow path, bearings, thermal expansion system, regulatory system, condensing unit, and the systems of regenerative feed-water and hot-water heating. The knowledge base of the expert system for turbine unit rotors and bearings contains a description of 34 defects and 104 related diagnostic features that cause a change in its vibration state. The knowledge base for the condensing unit contains 12 hypotheses and 15 items of evidence (indications); procedures are also provided for estimating 20 state parameters. Similar knowledge bases containing diagnostic features and fault hypotheses are formulated for the other technological subsystems of the turbine unit. With the necessary initial information available, a number of problems can be solved within the expert system for the various technological subsystems of a steam turbine unit: for the steam flow path, correlation and regression analysis of the multifactor relationship between variations in vibration parameters and the regime parameters; for the thermal expansion system, evaluation of the force acting on the longitudinal keys depending on the temperature state of the turbine cylinder; for the condensing unit, evaluation of the separate effects of heat-exchange-surface contamination and of the presence of air in the condenser steam space on condenser thermal efficiency, as well as estimation of the schedule for condenser cleaning and tube-system replacement, and so forth.
When initial information is lacking, the expert system can still formulate a diagnosis by calculating the probabilities of fault hypotheses from the degree of expert confidence in the estimates of turbine component operating parameters.
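The Bayes update at the core of such a system can be sketched as a naive-Bayes computation over fault hypotheses. The fault names, features, and probabilities below are invented for illustration; they are not taken from the turbine knowledge base, and real diagnostic features need not be conditionally independent as assumed here.

```python
def diagnose(priors, likelihoods, observed):
    """Bayes update over fault hypotheses given observed diagnostic features.
    priors: {fault: P(fault)}; likelihoods: {fault: {feature: P(feature|fault)}}.
    Features are treated as conditionally independent (naive-Bayes sketch)."""
    posterior = {}
    for fault, prior in priors.items():
        p = prior
        for feature in observed:
            p *= likelihoods[fault].get(feature, 0.01)  # small floor for unmodelled features
        posterior[fault] = p
    z = sum(posterior.values())
    return {f: p / z for f, p in posterior.items()}
```

A feature strongly expected under one hypothesis and not the other shifts the posterior sharply, which is how expert confidence substitutes for missing baseline measurements.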
NASA Astrophysics Data System (ADS)
Szajnfarber, Zoe; Weigel, Annalisa L.
2013-03-01
This paper investigates the process through which new technical concepts are matured in the NASA innovation ecosystem. We propose an "epoch-shock" conceptualization as an alternative mental model to the traditional stage-gate view. The epoch-shock model is developed inductively, based on detailed empirical observations of the process, and validated, to the extent possible, through expert review. The paper concludes by illustrating how the new epoch-shock conceptualization could provide a useful basis for rethinking feasible interventions to improve innovation management in the space agency context. Where the more traditional stage-gate model leads to an emphasis on centralized flow control, the epoch-shock model acknowledges the decentralized, probabilistic nature of key interactions and highlights which aspects may be influenced.
Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985
NASA Technical Reports Server (NTRS)
1986-01-01
The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.
Models Used to Select Strategic Planning Experts for High Technology Productions
NASA Astrophysics Data System (ADS)
Zakharova, Alexandra A.; Grigorjeva, Antonina A.; Tseplit, Anna P.; Ozgogov, Evgenij V.
2016-04-01
The article deals with the problems and specific aspects in organizing works of experts involved in assessment of companies that manufacture complex high-technology products. A model is presented that is intended for evaluating competences of experts in individual functional areas of expertise. Experts are selected to build a group on the basis of tables used to determine a competence level. An expert selection model based on fuzzy logic is proposed and additional requirements for the expert group composition can be taken into account, with regard to the needed quality and competence related preferences of decision-makers. A Web-based information system model is developed for the interaction between experts and decision-makers when carrying out online examinations.
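A greatly simplified version of the competence-table step can be sketched as a weighted score over functional areas plus threshold-based group selection. The expert names, area weights, and threshold below are hypothetical; the actual model additionally encodes decision-maker preferences with fuzzy logic.

```python
def competence_score(ratings, weights):
    """Weighted average of fuzzy competence degrees (0..1) over the
    functional areas of expertise."""
    total = sum(weights.values())
    return sum(ratings[area] * w for area, w in weights.items()) / total

def select_group(experts, weights, threshold=0.6, size=3):
    """Keep the highest-scoring experts whose competence clears the
    threshold, mimicking selection from competence-level tables."""
    ranked = sorted(experts,
                    key=lambda name: competence_score(experts[name], weights),
                    reverse=True)
    return [n for n in ranked
            if competence_score(experts[n], weights) >= threshold][:size]
```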
USDA-ARS?s Scientific Manuscript database
The Chesapeake Stormwater Network hosted a workshop in July 2012 to discuss the potential nutrient reductions from emerging stormwater technologies including algal flow-way technologies (AFTs). Workshop participants recommended the Chesapeake Bay Program’s Water Quality Goal Implementation Team(WQ...
Translating expert system rules into Ada code with validation and verification
NASA Technical Reports Server (NTRS)
Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam
1991-01-01
The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code, and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system and detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is also discussed. This testing was performed automatically using Monte Carlo techniques based upon a constraint-based description of the required performance for the system.
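The run-time behavior of translated rules and the Monte-Carlo style of validation can be sketched with a toy forward-chaining engine plus a randomized property check. The rules and facts below are invented; the real system emitted Ada modules from Evidence Flow Graphs rather than interpreting rules as done here.

```python
import random

RULES = [  # toy rules in the spirit of an evidence-flow representation
    ({"temp_high", "pressure_low"}, "leak_suspected"),
    ({"leak_suspected", "flow_drop"}, "shutdown_advised"),
]

def infer(facts):
    """Forward-chain until no rule fires; the translated code modules
    perform the equivalent propagation at run time."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conds, concl in RULES:
            if conds <= facts and concl not in facts:
                facts.add(concl)
                changed = True
    return facts

def monte_carlo_check(n=1000, seed=0):
    """Randomized validation of a required property: shutdown_advised
    must never be concluded without leak_suspected also holding."""
    rng = random.Random(seed)
    base = ["temp_high", "pressure_low", "flow_drop"]
    for _ in range(n):
        sample = {f for f in base if rng.random() < 0.5}
        out = infer(sample)
        if "shutdown_advised" in out and "leak_suspected" not in out:
            return False
    return True
```

The check mirrors the paper's approach of testing the generated system automatically against a constraint-based description of required behavior.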
NASA Astrophysics Data System (ADS)
Grossmann, I.; Steyn, D. G.
2014-12-01
Global general circulation models typically cannot provide the detailed and accurate regional climate information required by stakeholders for climate adaptation efforts, given their limited capacity to resolve the regional topography and changes in local sea surface temperature, wind and circulation patterns. The study region in Northwest Costa Rica has a tropical wet-dry climate with a double-peak wet season. During the dry season the central Costa Rican mountains prevent tropical Atlantic moisture from reaching the region. Most of the annual precipitation is received following the northward migration of the ITCZ in May that allows the region to benefit from moist southwesterly flow from the tropical Pacific. The wet season begins with a short period of "early rains" and is interrupted by the mid-summer drought associated with the intensification and westward expansion of the North Atlantic subtropical high in late June. Model projections for the 21st century indicate a lengthening and intensification of the mid-summer drought and a weakening of the early rains on which current crop cultivation practices rely. We developed an expert elicitation to systematically address uncertainties in the available model projections of changes in the seasonal precipitation pattern. Our approach extends an elicitation approach developed previously at Carnegie Mellon University. Experts in the climate of the study region or Central American climate were asked to assess the mechanisms driving precipitation during each part of the season, uncertainties regarding these mechanisms, expected changes in each mechanism in a warming climate, and the capacity of current models to reproduce these processes. To avoid overconfidence bias, a step-by-step procedure was followed to estimate changes in the timing and intensity of precipitation during each part of the season. The questions drew upon interviews conducted with the region's stakeholders to assess their climate information needs. 
This study is part of the FuturAgua project funded by the Belmont Freshwater Security call. The expert opinions on expected changes in the seasonal precipitation pattern are being used to inform regional efforts to build drought resilience and to create and compare alternative water management strategies with the region's stakeholders.
#FluxFlow: Visual Analysis of Anomalous Information Spreading on Social Media.
Zhao, Jian; Cao, Nan; Wen, Zhen; Song, Yale; Lin, Yu-Ru; Collins, Christopher
2014-12-01
We present FluxFlow, an interactive visual analysis system for revealing and analyzing anomalous information spreading in social media. Every day, millions of messages are created, commented on, and shared by people on social media websites, such as Twitter and Facebook. This provides valuable data for researchers and practitioners in many application domains, such as marketing, to inform decision-making. Distilling valuable social signals from the huge crowd's messages, however, is challenging, due to the heterogeneous and dynamic crowd behaviors. The challenge lies in data analysts' ability to discern anomalous information behaviors, such as the spreading of rumors or misinformation, from more conventional patterns, such as popular topics and newsworthy events, in a timely fashion. FluxFlow incorporates advanced machine learning algorithms to detect anomalies, and offers a set of novel visualization designs for presenting the detected threads for deeper analysis. We evaluated FluxFlow with real datasets containing the Twitter feeds captured during significant events such as Hurricane Sandy. Through quantitative measurements of the algorithmic performance and qualitative interviews with domain experts, the results show that the back-end anomaly detection model is effective in identifying anomalous retweeting threads, and its front-end interactive visualizations are intuitive and useful for analysts to discover insights in data and comprehend the underlying analytical model.
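FluxFlow's detector is a learned model trained on rich thread features; as a far simpler stand-in, the flagging interface can be sketched with a robust modified z-score (median/MAD) on a single invented feature such as retweets per minute. This is illustrative only and is not the system's actual algorithm.

```python
def modified_z_scores(values):
    """Robust modified z-scores: deviations from the median scaled by
    the median absolute deviation (MAD), insensitive to the outliers
    we are trying to find."""
    xs = sorted(values)
    n = len(xs)
    median = (xs[n // 2] + xs[(n - 1) // 2]) / 2
    ds = sorted(abs(v - median) for v in values)
    mad = (ds[n // 2] + ds[(n - 1) // 2]) / 2
    if mad == 0:
        return [0.0] * n
    return [0.6745 * (v - median) / mad for v in values]

def flag_anomalous_threads(retweet_rates, threshold=3.5):
    """Flag threads whose spreading rate is anomalous relative to the crowd."""
    return [abs(z) > threshold for z in modified_z_scores(retweet_rates)]
```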
Influence of ionization on the Gupta and on the Park chemical models
NASA Astrophysics Data System (ADS)
Morsa, Luigi; Zuppardi, Gennaro
2014-12-01
This study is an extension of former works by the present authors, in which the influence of the chemical models by Gupta and by Park was evaluated on thermo-fluid-dynamic parameters in the flow field, including transport coefficients, related characteristic numbers and heat flux on two current capsules (EXPERT and Orion) during the high-altitude re-entry path. The results verified that the models, even though they compute different air compositions in the flow field, compute only slightly different compositions on the capsule surface, so the difference in heat flux is not very relevant. In those studies, ionization was neglected because the velocities of the capsules (about 5000 m/s for EXPERT and about 7600 m/s for Orion) were not high enough to produce meaningful ionization. The aim of the present work is to evaluate the influence of ionization, linked to the chemical models by Gupta and by Park, on both heat flux and thermo-fluid-dynamic parameters. The present computer tests were carried out with a direct simulation Monte Carlo code (DS2V) in the velocity interval 7600-12000 m/s, considering only the Orion capsule at an altitude of 85 km. The results confirmed the earlier finding: when ionization is not considered, the chemical models compute only a slightly different gas composition in the core of the shock wave and practically the same composition on the surface, and therefore the same heat flux. Conversely, when ionization is considered, the chemical models compute different compositions in the whole shock layer and on the surface, and therefore different heat fluxes. The analysis of the results relies on a qualitative and a quantitative evaluation of the effects of ionization on both chemical models. 
The main result of the study is that when ionization is taken into account, the Park model is more reactive than the Gupta model; consequently, the heat flux computed with the Park model is lower than that computed with the Gupta model. Being the more conservative choice, the Gupta model is therefore recommended for the design of a thermal protection system.
Mentoring staff members as patient safety leaders: the Clarian Safe Passage Program.
Rapala, Kathryn
2005-06-01
This article describes a second element of the Synergy Model of Patient Care implemented by Clarian Health Partners of Indiana. The Clarian Safe Passage Program is a unique approach to the promotion of patient safety. In this program, frontline staff nurses are trained to serve as Safe Passage nurses, who are unit-based safety experts. These nurses mentor each other and their peers in acquiring patient safety expertise and promoting a free flow of information to avert actual and potential errors in health care delivery.
Flow Characteristics Near to Stent Strut Configurations on Femoropopliteal Artery
NASA Astrophysics Data System (ADS)
Paisal, Muhammad Sufyan Amir; Fadhil Syed Adnan, Syed; Taib, Ishkrizat; Ismail, Al Emran; Kamil Abdullah, Mohammad; Nordin, Normayati; Seri, Suzairin Md; Darlis, Nofrizalidris
2017-08-01
Femoropopliteal artery stenting is a common procedure suggested by medical experts, especially for patients diagnosed with severe stenosis. Many researchers have reported that the growth of stenosis is significantly related to the geometry of the stent strut configuration. Different stent geometries produce different flow patterns and re-circulation in the stented femoropopliteal artery. The blood flow characteristics near the stent geometry are used to predict the likelihood of thrombosis and atherosclerosis forming, as well as of accelerated stenosis growth. Thus, this study aims to determine the flow characteristics near the stent strut configuration based on different hemodynamic parameters. Three-dimensional models of the stent and a simplified femoropopliteal artery are constructed using computer-aided design (CAD) software. Three different stent strut shapes (hexagon, circle and rectangle) are simulated using the computational fluid dynamics (CFD) method. A parametric study is then carried out to predict stent performance from the hemodynamic differences. The hemodynamic parameters considered are pressure, velocity, low wall shear stress (WSSlow) and wall shear stress (WSS). Flow re-circulation was observed for all simulated stent models, with the most severe vortices in the proximal region. The rectangular stent strut shape (Type P3) shows the lowest WSSlow and the highest WSS within the range of 4 dyne/cm2 to 70 dyne/cm2. Stent Type P3 also shows the best hemodynamic performance compared to the others. In conclusion, Type P3 has a favourable hemodynamic performance, predicting a lower probability of thrombosis and atherosclerosis formation as well as reduced restenosis growth.
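An order-of-magnitude check of arterial WSS can be sketched with the Poiseuille formula for fully developed flow; near stent struts the CFD values deviate from this baseline, which is the point of the study. The viscosity, flow rate, and radius below are generic textbook-style values, not the paper's data; the 4-70 dyne/cm2 band is the one quoted for the Type P3 strut.

```python
import math

def poiseuille_wss(mu_pa_s, q_m3_s, radius_m):
    """Wall shear stress for fully developed Poiseuille flow,
    tau = 4*mu*Q / (pi * R^3); a first-order baseline far from struts."""
    return 4.0 * mu_pa_s * q_m3_s / (math.pi * radius_m ** 3)

def to_dyne_cm2(tau_pa):
    """Unit conversion: 1 Pa = 10 dyne/cm^2."""
    return tau_pa * 10.0

def in_favourable_band(tau_dyne, lo=4.0, hi=70.0):
    """Check against the WSS band reported for the best-performing strut."""
    return lo <= tau_dyne <= hi
```

With blood viscosity around 0.0035 Pa s, a flow of 3 mL/s and a 3 mm lumen radius, the baseline WSS lands near the lower edge of the favourable band.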
Research Needs for Human Factors
1983-01-01
their relative merits. Until such comparisons are made, practitioners will continue to advocate their own products without a basis for choice among ... judgments among a group of experts; (2) formulating questions for experts in a way that is compatible with their mental structures or cognitive ... system. Typically the operators work in teams and control computers, which in turn mediate information flow among various automatic components. Other
Potential application of artificial intelligence concepts to aerodynamic simulation
NASA Technical Reports Server (NTRS)
Kutler, P.; Mehta, U. B.; Andrews, A.
1984-01-01
The concept of artificial intelligence as it applies to computational fluid dynamics simulation is investigated. How expert systems can be adapted to speed the numerical aerodynamic simulation process is also examined. A proposed expert grid generation system is briefly described which, given flow parameters, configuration geometry, and simulation constraints, uses knowledge about the discretization process to determine grid point coordinates, computational surface information, and zonal interface parameters.
NASA Astrophysics Data System (ADS)
Gomani, M. C.; Dietrich, O.; Lischeid, G.; Mahoo, H.; Mahay, F.; Mbilinyi, B.; Sarmett, J.
Sound decision making for water resources management has to be based on good knowledge of the dominant hydrological processes of a catchment. This information can only be obtained through establishing suitable hydrological monitoring networks. Research catchments are typically established without involving the key stakeholders, which results in instruments being installed at inappropriate places as well as at high risk of theft and vandalism. This paper presents an integrated participatory approach for establishing a hydrological monitoring network. We propose a framework with six steps beginning with (i) inception of idea; (ii) stakeholder identification; (iii) defining the scope of the network; (iv) installation; (v) monitoring; and (vi) feedback mechanism integrated within the participatory framework. The approach is illustrated using an example of the Ngerengere catchment in Tanzania. In applying the approach, the concept of establishing the Ngerengere catchment monitoring network was initiated in 2008 within the Resilient Agro-landscapes to Climate Change in Tanzania (ReACCT) research program. The main stakeholders included: local communities; Sokoine University of Agriculture; Wami Ruvu Basin Water Office and the ReACCT research team. The scope of the network was based on expert experience and lessons learnt from a literature review of similar projects elsewhere, integrated with local expert knowledge. The installation involved reconnaissance surveys, detailed surveys, and expert consultations to identify the best sites. First, a Digital Elevation Model, land use, and soil maps were used to identify potential monitoring sites. Local and expert knowledge was collected on flow regimes, plant species that indicate shallow groundwater, precipitation patterns, vegetation, and soil types. 
This information was integrated and used to select sites for installation of an automatic weather station, automatic rain gauges, river flow gauging stations, flow measurement sites and shallow groundwater wells. The network is now used to monitor hydro-meteorological parameters in collaboration with key stakeholders in the catchment. Preliminary results indicate that the network is working well. The benefits of this approach compared to conventional narrow scientific/technical approaches have been shown by gaining rapid insight into the hydrology of the catchment, identifying best sites for the instruments; and voluntary participation of stakeholders in installation, monitoring and safeguarding the installations. This approach has proved simple yet effective and yielded good results. Based on this experience gained in applying the approach in establishing the Ngerengere catchment monitoring network, we conclude that the integrated participatory approach helps to assimilate local and expert knowledge in catchments monitoring which consequently results in: (i) identifying best sites for the hydrologic monitoring; (ii) instilling the sense of ownership; (iii) providing security of the installed network; and (iv) minimizing costs for installation and monitoring.
Robust nonlinear system identification: Bayesian mixture of experts using the t-distribution
NASA Astrophysics Data System (ADS)
Baldacchino, Tara; Worden, Keith; Rowson, Jennifer
2017-02-01
A novel variational Bayesian mixture of experts model for robust regression of bifurcating and piece-wise continuous processes is introduced. The mixture of experts model is a powerful model which probabilistically splits the input space allowing different models to operate in the separate regions. However, current methods have no fail-safe against outliers. In this paper, a robust mixture of experts model is proposed which consists of Student-t mixture models at the gates and Student-t distributed experts, trained via Bayesian inference. The Student-t distribution has heavier tails than the Gaussian distribution, and so it is more robust to outliers, noise and non-normality in the data. Using both simulated data and real data obtained from the Z24 bridge this robust mixture of experts performs better than its Gaussian counterpart when outliers are present. In particular, it provides robustness to outliers in two forms: unbiased parameter regression models, and robustness to overfitting/complex models.
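The robustness the abstract attributes to the Student-t distribution can be demonstrated with the classic EM reweighting for a t location estimate, contrasted with the Gaussian maximum-likelihood estimate (the sample mean). This is a bare sketch with the scale fixed at 1 and a hand-picked nu, not the paper's variational Bayesian mixture.

```python
def t_location(data, nu=3.0, iters=50):
    """Location estimate under a Student-t likelihood via EM reweighting:
    each point gets weight (nu+1)/(nu + r^2), so large residuals are
    automatically down-weighted (scale fixed at 1 for brevity)."""
    mu = sum(data) / len(data)          # start from the Gaussian estimate
    for _ in range(iters):
        w = [(nu + 1.0) / (nu + (x - mu) ** 2) for x in data]
        mu = sum(wi * xi for wi, xi in zip(w, data)) / sum(w)
    return mu
```

On data with a single gross outlier, the Gaussian mean is dragged far from the bulk of the data while the t estimate stays put, which is the same failure mode the robust mixture of experts guards against.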
Fuller, Robert William; Wong, Tony E; Keller, Klaus
2017-01-01
The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
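The probabilistic-inversion step can be caricatured as accept/reject sampling: keep prior parameter draws whose model output is consistent with an expert-assessed plausible range. The linear "model", the prior bounds, and the expert range below are all invented stand-ins for the paper's AIS model and elicited assessments.

```python
import random

def simple_ais_model(sensitivity, warming_deg):
    """Toy stand-in for an AIS emulator: sea-level contribution (m)
    scaling linearly with warming (illustrative only)."""
    return sensitivity * warming_deg

def invert_expert_assessment(expert_lo, expert_hi, warming_deg=3.0,
                             n=50_000, seed=0):
    """Crude probabilistic inversion: retain parameter draws whose model
    output lies inside the expert-assessed plausible range. The retained
    sample approximates a prior consistent with the expert assessment."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n):
        s = rng.uniform(0.0, 1.0)      # broad initial prior on sensitivity
        if expert_lo <= simple_ais_model(s, warming_deg) <= expert_hi:
            kept.append(s)
    return kept
```

In the paper this inferred prior is then confronted with instrumental and paleoclimatic data in a full Bayesian inversion; the sketch stops at the first stage.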
NASA Astrophysics Data System (ADS)
Haeberli, Wilfried; Huggel, Christian; Kääb, Andreas; Zgraggen-Oswald, Sonja; Polkvoj, Alexander; Galushkin, Igor; Zotikov, Igor; Osokin, Nikolay
On 20 September 2002, an enormous rock/ice slide and subsequent mud-flow occurred on the northern slope of the Kazbek massif, Northern Ossetia, Russian Caucasus. It started on the north-northeast wall of Dzhimarai-Khokh (4780 m a.s.l.) and seriously affected the valley of Genaldon/Karmadon. Immediate governmental actions, available scientific information, first reconstructions, hazard assessments and monitoring activities as well as initial expert judgments/recommendations are documented in order to enable more detailed analyses and modelling of the event by the wider scientific community. Among the most remarkable aspects related to this event are (1) the relation between the recent event and somewhat smaller but quite similar events that occurred earlier in historical times (1835, 1902), (2) the interactions between unstable local geological structures and complex geothermal and hydraulic conditions in the starting zone with permafrost, cold to polythermal hanging glaciers and volcanic effects (hot springs) in close contact with each other, (3) the erosion and incorporation of a debris-covered valley glacier largely enhancing the sliding volume of rocks, ice, firn, snow, water and probably air to a total of about 100 × 106 m3, and (4) the astonishingly high flow velocities (up to 300 km h-1) and enormous length of travel path (18 km plus 15 km of debris/mud-flow). This extraordinary case illustrates that large catastrophic events in high mountain regions typically involve a multitude of factors and require integrated consideration of complex chains of processes, a task which must be undertaken by qualified groups of experts.
Development of the ARISTOTLE webware for cloud-based rarefied gas flow modeling
NASA Astrophysics Data System (ADS)
Deschenes, Timothy R.; Grot, Jonathan; Cline, Jason A.
2016-11-01
Rarefied gas dynamics are important for a wide variety of applications. An improvement in the ability of general users to predict these gas flows will enable optimization of current processes and discovery of future ones. Despite this potential, most rarefied simulation software is designed by and for experts in the community. This has resulted in low adoption of the methods outside of the immediate rarefied gas dynamics community. This paper outlines an ongoing effort to create a rarefied gas dynamics simulation tool that can be used by a general audience. The tool leverages a direct simulation Monte Carlo (DSMC) library that is available to the entire community and a web-based simulation process that will enable all users to take advantage of high performance computing capabilities. First, the DSMC library and simulation architecture are described. Then the DSMC library is used to predict a number of representative transient gas flows that are applicable to the rarefied gas dynamics community. The paper closes with a summary and future direction.
A Psychological Model for Aggregating Judgments of Magnitude
NASA Astrophysics Data System (ADS)
Merkle, Edgar C.; Steyvers, Mark
In this paper, we develop and illustrate a psychologically-motivated model for aggregating judgments of magnitude across experts. The model assumes that experts' judgments are perturbed from the truth by both systematic biases and random error, and it provides aggregated estimates that are implicitly based on the application of nonlinear weights to individual judgments. The model is also easily extended to situations where experts report multiple quantile judgments. We apply the model to expert judgments concerning flange leaks in a chemical plant, illustrating its use and comparing it to baseline measures.
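The systematic-bias-plus-noise structure described above can be sketched in log space: estimate each expert's multiplicative bias from seed questions with known answers, debias, and aggregate with a geometric mean. The calibration numbers below are hypothetical, and the paper's full model applies nonlinear weights and handles multiple quantile judgments, neither of which this sketch attempts.

```python
import math

def estimate_log_bias(calibration_judgments, known_truths):
    """Per-expert systematic bias in log space, estimated from seed
    questions whose true magnitudes are known."""
    return sum(math.log(j) - math.log(t)
               for j, t in zip(calibration_judgments, known_truths)) / len(known_truths)

def aggregate(judgments, log_biases):
    """Debias each expert's log judgment, average, and exponentiate:
    a geometric mean of bias-corrected magnitude judgments."""
    debiased = [math.log(j) - b for j, b in zip(judgments, log_biases)]
    return math.exp(sum(debiased) / len(debiased))
```

An expert who consistently doubles the truth on the seed questions gets a log-bias of ln 2, and the correction exactly cancels that tendency on new items.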
This report documents initial efforts to identify innovative strategies for managing the effects of wet-weather flow in an urban setting. It served as a communication tool and a starting point for discussion with experts. As such, the document is a compilation of literature rev...
Mechanical Power Flow Changes during Multijoint Movement Acquisition
ERIC Educational Resources Information Center
Kadota, Koji; Matsuo, Tomoyuki; Hashizume, Ken; Tezuka, Kazushi
2006-01-01
We investigated the differences in mechanical power flow in early and late practice stages during a cyclic movement consisting of upper arm circumduction to clarify the change in mechanical energy use with skill acquisition. Seven participants practiced the task every other day until their joint angular movements conformed to those of an expert.…
Expert elicitation of population-level effects of disturbance
Fleishman, Erica; Burgman, Mark; Runge, Michael C.; Schick, Robert S; Krauss, Scott; Popper, Arthur N.; Hawkins, Anthony
2016-01-01
Expert elicitation is a rigorous method for synthesizing expert knowledge to inform decision making and is reliable and practical when field data are limited. We evaluated the feasibility of applying expert elicitation to estimate population-level effects of disturbance on marine mammals. Diverse experts estimated parameters related to mortality and sublethal injury of North Atlantic right whales (Eubalaena glacialis). We are now eliciting expert knowledge on the movement of right whales among geographic regions to parameterize a spatial model of health. Expert elicitation complements methods such as simulation models or extrapolations from other species, sometimes with greater accuracy and less uncertainty.
Wong, Tony E.; Keller, Klaus
2017-01-01
The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections. PMID:29287095
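The fusing of a simple model with expert assessments can be illustrated with a toy probabilistic inversion via rejection sampling; the prior, model, target, and tolerance below are placeholders, not the AIS model or the elicited assessments:

```python
import random

def rejection_invert(prior_sample, model, target, tol, n=10000):
    # Keep parameter draws whose model output matches an expert-assessed
    # target within tol: a minimal stand-in for probabilistic inversion.
    accepted = []
    for _ in range(n):
        theta = prior_sample()
        if abs(model(theta) - target) <= tol:
            accepted.append(theta)
    return accepted
```

The accepted draws form an inferred prior on the parameter that is, by construction, consistent with the assessed target; a subsequent Bayesian update would then confront this prior with observational data.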
Expert models and modeling processes associated with a computer-modeling tool
NASA Astrophysics Data System (ADS)
Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.
2006-07-01
Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using a think-aloud technique and video recording, we captured their on-screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale behind their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found that the experts' modeling processes followed the linear sequence built into the modeling program, with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in the expert models were clustered and represented by specialized technical terms. Based on these findings, we make suggestions for improving model-based science teaching and learning using Model-It.
Lumb, A.M.; McCammon, R.B.; Kittle, J.L.
1994-01-01
Expert system software was developed to assist less experienced modelers with calibration of a watershed model and to facilitate the interaction between the modeler and the modeling process not provided by mathematical optimization. A prototype was developed with artificial intelligence software tools, a knowledge engineer, and two domain experts. The manual procedures used by the domain experts were identified and the prototype was then coded by the knowledge engineer. The expert system consists of a set of hierarchical rules designed to guide the calibration of the model through a systematic evaluation of model parameters. When the prototype was completed and tested, it was rewritten for portability and operational use and was named HSPEXP. The watershed model Hydrological Simulation Program--Fortran (HSPF) is used in the expert system. This report is the users manual for HSPEXP and contains a discussion of the concepts and detailed steps and examples for using the software. The system has been tested on watersheds in the States of Washington and Maryland, and the system correctly identified the model parameters to be adjusted and the adjustments led to improved calibration.
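The flavor of such hierarchical calibration rules can be sketched as follows; the HSPF parameter names (LZSN, INFILT) are real, but the threshold and adjustment rules here are illustrative stand-ins, not the published HSPEXP rule base:

```python
def calibration_advice(simulated, observed, tol=0.10):
    """Hypothetical rule in the spirit of HSPEXP: compare simulated and
    observed annual runoff volumes and suggest a parameter to adjust.
    The 10% tolerance and the rule directions are assumptions."""
    error = (simulated - observed) / observed
    if abs(error) <= tol:
        return "calibrated"
    if error > 0:
        return "runoff too high: increase LZSN (lower-zone storage)"
    return "runoff too low: decrease INFILT (infiltration capacity)"
```

A full system evaluates many such rules in a hierarchy (annual water balance first, then storm behavior, then low flows), which is what guides the modeler through a systematic evaluation of parameters.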
Expert anticipatory skill in striking sports: a review and a model.
Müller, Sean; Abernethy, Bruce
2012-06-01
Expert performers in striking sports can hit objects moving at high speed with incredible precision. Exceptionally well developed anticipation skills are necessary to cope with the severe constraints on interception. In this paper we provide a review of the empirical evidence regarding expert interception in striking sports and propose a preliminary model of expert anticipation. Central to the review and the model is the notion that the visual information used to guide the sequential phases of the striking action is systematically different between experts and nonexperts. Knowing the factors that contribute to expert anticipation, and how anticipation may guide skilled performance in striking sports, has practical implications for assessment and training across skill levels.
Visualizing landscape hydrology as a means of education - The water cycle in a box
NASA Astrophysics Data System (ADS)
Lehr, Christian; Rauneker, Philipp; Fahle, Marcus; Hohenbrink, Tobias; Böttcher, Steven; Natkhin, Marco; Thomas, Björn; Dannowski, Ralf; Schwien, Bernd; Lischeid, Gunnar
2016-04-01
We used an aquarium to construct a physical model of the water cycle. The model can be used to visualize the movement of water through the landscape, from precipitation and infiltration via surface and subsurface flow to discharge into the sea. The model consists of two aquifers divided by a loamy aquitard. The 'geological' setting enables us to establish confined groundwater conditions and to demonstrate the functioning of artesian wells. Furthermore, small experiments with colored water as a tracer can be performed to identify flow paths below the ground, simulate water supply problems such as pollution of drinking water wells by inflowing contaminated groundwater, or demonstrate changes in subsurface flow direction due to changes in the predominant pressure gradients. Hydrological basics such as the connectivity of streams, lakes and the surrounding groundwater, or the dependence of groundwater flow velocity on different substrates, can be directly visualized. We used the model as an instructive tool in education and for public relations. We presented the model to audiences ranging from primary school pupils and laypeople to hydrology students and university professors. The model was presented to the scientific community as part of the "Face of the Earth" exhibition at the EGU General Assembly 2014. Independent of the prior knowledge of the audience, the predominant reactions were very positive. The model often acted as an icebreaker to get a conversation on hydrological topics started. Because of the great interest, we prepared video material and a photo documentation on 1) the construction of the model and 2) the visualization of steady and dynamic hydrological situations. The videos will be published soon under a Creative Commons license and the collected material will be made accessible online. Accompanying documents will address professionals in hydrology as well as non-experts.
In the PICO session, we will present details about the construction of the model and its main features. Further, short videos of specific processes and experiments will be shown.
A network-oriented business modeling environment
NASA Astrophysics Data System (ADS)
Bisconti, Cristian; Storelli, Davide; Totaro, Salvatore; Arigliano, Francesco; Savarino, Vincenzo; Vicari, Claudia
The development of formal models related to the organizational aspects of an enterprise is fundamental when these aspects must be re-engineered and digitalized, especially when the enterprise is involved in the dynamics and value flows of a business network. Business modeling provides an opportunity to synthesize and make business processes, business rules and the structural aspects of an organization explicit, allowing business managers to control their complexity and guide an enterprise through effective decisional and strategic activities. This chapter discusses the main results of the TEKNE project in terms of software components that enable enterprises to configure, store, search and share models of any aspects of their business while leveraging standard and business-oriented technologies and languages to bridge the gap between the world of business people and IT experts and to foster effective business-to-business collaborations.
CONFIG: Qualitative simulation tool for analyzing behavior of engineering devices
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Basham, Bryan D.; Harris, Richard A.
1987-01-01
To design failure management expert systems, engineers mentally analyze the effects of failures and procedures as they propagate through device configurations. CONFIG is a generic device modeling tool for use in discrete event simulation, developed to support such analyses. CONFIG permits graphical modeling of device configurations and qualitative specification of local operating modes of device components. Computation requirements are reduced by focusing the level of component description on operating modes and failure modes, and by specifying qualitative ranges of variables relative to mode transition boundaries. Simulation processing occurs only when modes change or variables cross qualitative boundaries. Device models are built graphically, using components from libraries. Components are connected at ports by graphical relations that define data flow. The core of a component model is its state transition diagram, which specifies modes of operation and transitions among them.
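A minimal sketch of the mode-transition idea at the core of a component model, in which processing occurs only on qualitative events rather than at every time step; the component and event names are illustrative:

```python
class Valve:
    """Toy component with a state transition diagram: operating modes,
    failure modes, and the events that move between them."""

    def __init__(self):
        self.mode = "closed"
        # Transitions fire only on discrete qualitative events;
        # unlisted (mode, event) pairs leave the mode unchanged.
        self.transitions = {
            ("closed", "open_cmd"): "open",
            ("open", "close_cmd"): "closed",
            ("open", "stuck"): "failed_open",
        }

    def event(self, name):
        self.mode = self.transitions.get((self.mode, name), self.mode)
        return self.mode
```

A failure mode such as "failed_open" absorbs further commands, which is how the effect of a failure propagates through subsequent simulated procedures.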
Best Practices for Reduction of Uncertainty in CFD Results
NASA Technical Reports Server (NTRS)
Mendenhall, Michael R.; Childs, Robert E.; Morrison, Joseph H.
2003-01-01
This paper describes a proposed best-practices system that will present expert knowledge in the use of CFD. The best-practices system will include specific guidelines to assist the user in problem definition, input preparation, grid generation, code selection, parameter specification, and results interpretation. The goal of the system is to assist all CFD users in obtaining high quality CFD solutions with reduced uncertainty and at lower cost for a wide range of flow problems. The best-practices system will be implemented as a software product which includes an expert system made up of knowledge databases of expert information with specific guidelines for individual codes and algorithms. The process of acquiring expert knowledge is discussed, and help from the CFD community is solicited. Benefits and challenges associated with this project are examined.
What defines an Expert? - Uncertainty in the interpretation of seismic data
NASA Astrophysics Data System (ADS)
Bond, C. E.
2008-12-01
Studies focusing on the elicitation of information from experts are concentrated primarily in economics and world markets, medical practice and expert witness testimonies. Expert elicitation theory has been applied in the natural sciences, most notably in the prediction of fluid flow in hydrological studies. In the geological sciences, expert elicitation has been limited to theoretical analysis, with studies focusing on the elicitation element, gaining expert opinion rather than necessarily understanding the basis behind the expert view. In these cases experts are defined in a traditional sense, based for example on standing in the field, number of years of experience, number of peer-reviewed publications, or the expert's position in a company hierarchy or academia. Here, traditional indicators of expertise have been compared for their significance on effective seismic interpretation. Polytomous regression analysis has been used to assess the relative significance of length and type of experience on the outcome of a seismic interpretation exercise. Following the initial analysis, the techniques used by participants to interpret the seismic image were added as additional variables to the analysis. Specific technical skills and techniques were found to be more important for the effective geological interpretation of seismic data than the traditional indicators of expertise. The results of the seismic interpretation exercise, the techniques used to interpret the seismic data and the participants' prior experience have been combined and analysed to answer the question: who is, and what defines, an expert?
Harris, David J.; Vine, Samuel J.; Wilson, Mark R.; McGrath, John S.; LeBel, Marie-Eve
2017-01-01
Background: Observational learning plays an important role in surgical skills training, following the traditional model of learning from expertise. Recent findings have, however, highlighted the benefit of observing not only expert performance but also error-strewn performance. The aim of this study was to determine which model (novice vs. expert) would lead to the greatest benefits when learning robotically assisted surgical skills. Methods: 120 medical students with no prior experience of robotically assisted surgery completed a ring-carrying training task on three occasions: baseline, post-intervention and at one-week follow-up. The observation intervention consisted of a video model performing the ring-carrying task, with participants randomly assigned to view an expert model, a novice model, a mixed expert/novice model or no observation (control group). Participants were assessed for task performance and surgical instrument control. Results: There were significant group differences post-intervention, with expert and novice observation groups outperforming the control group, but there were no clear group differences at a retention test one week later. There was no difference in performance between the expert-observing and error-observing groups. Conclusions: Similar benefits were found when observing the traditional expert model or the error-strewn model, suggesting that viewing poor performance may be as beneficial as viewing expertise in the early acquisition of robotic surgical skills. Further work is required to understand, then inform, the optimal curriculum design when utilising observational learning in surgical training. PMID:29141046
Mapping and Eradication Prioritization Modeling of Red Sesbania ( Sesbania punicea) Populations
NASA Astrophysics Data System (ADS)
Robison, Ramona; Barve, Nita; Owens, Christina; Skurka Darin, Gina; DiTomaso, Joseph M.
2013-07-01
Red sesbania is an invasive South American shrub that has rapidly expanded its range along California waterways, emphasizing the need to prioritize eradication sites at a regional scale. To accomplish this, we updated baseline location data in summer 2010 using field surveys throughout the state. We collected relevant GPS attribute data for GIS analysis and eradication prioritization modeling. The regional survey identified upstream and downstream extents for each watershed, as well as outliers in urban areas. We employed the Weed Heuristics: Invasive Population Prioritization for Eradication Tool (WHIPPET) to prioritize red sesbania sites for eradication, and revised the WHIPPET model to consider directional propagule flow of a riparian species. WHIPPET prioritized small populations isolated from larger infestations, as well as outliers in residential areas. When we compared five experts' assessments of a stratified sample of the red sesbania populations to WHIPPET's prioritization results, there was a positive, but nonsignificant, correlation. The combination of WHIPPET and independent expert opinion suggests that small, isolated populations and upstream source populations should be the primary targets for eradication. Particular attention should be paid to these small populations in watersheds where red sesbania is a new introduction. The use of this model in conjunction with evaluation by the land manager may help prevent the establishment of new seed sources and protect uninfested riparian corridors and their adjacent watersheds.
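The prioritization logic can be sketched as a weighted score over population attributes; the criteria, weights, and 0-1 scalings below are assumptions for illustration, not the published WHIPPET model:

```python
def eradication_priority(pop, weights=None):
    """Illustrative weighted-sum score in the spirit of WHIPPET.
    pop values are assumed scaled to [0, 1]: relative_size (fraction of the
    regional infestation), isolation from larger infestations, and
    upstream_position in the watershed (1.0 = headwater source)."""
    w = weights or {"small": 0.4, "isolated": 0.3, "upstream": 0.3}
    return (w["small"] * (1.0 - pop["relative_size"])
            + w["isolated"] * pop["isolation"]
            + w["upstream"] * pop["upstream_position"])
```

Under this sketch a small, isolated, upstream source population scores far above a large downstream infestation, matching the qualitative conclusion that small upstream populations should be the primary eradication targets.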
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rizzo, Davinia B.; Blackburn, Mark R.
2018-03-30
As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine "what-if" scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions, specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence that the process may be an effective tool in harnessing expert knowledge for a BN model.
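The marginalization and "what-if" reasoning a BN supports can be shown with a toy two-node network; the nodes and all probabilities are illustrative placeholders, not elicited values from the paper:

```python
# Toy network: an expert-elicited prior on fixture-coupling quality and a
# conditional probability table (CPT) for 6DOF-test suitability given it.
p_coupling_good = 0.7                       # hypothetical expert prior
p_suitable_given = {True: 0.9, False: 0.3}  # CPT: P(suitable | coupling)

def p_suitable():
    # Marginalize over the unobserved coupling-quality node.
    return (p_coupling_good * p_suitable_given[True]
            + (1 - p_coupling_good) * p_suitable_given[False])

def p_coupling_given_suitable():
    # Bayes' rule: diagnose the driving factor given a suitable outcome.
    return p_coupling_good * p_suitable_given[True] / p_suitable()
```

Even at this scale, the network separates the decision quantity (the marginal suitability) from its driving factor (the posterior on coupling quality), which is the kind of supplementary information the paper describes.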
Declarative Business Process Modelling and the Generation of ERP Systems
NASA Astrophysics Data System (ADS)
Schultz-Møller, Nicholas Poul; Hølmer, Christian; Hansen, Michael R.
We present an approach to the construction of Enterprise Resource Planning (ERP) systems based on the Resources, Events and Agents (REA) ontology. This framework deals with processes involving the exchange and flow of resources in a declarative, graphically based manner, describing what the major entities are rather than how they engage in computations. We show how to develop a domain-specific language on the basis of REA, and a tool which can automatically generate running web applications. A main contribution is a proof of concept showing that business-domain experts can generate their own applications without worrying about implementation details.
K-Means Subject Matter Expert Refined Topic Model Methodology
2017-01-01
K-means Subject Matter Expert Refined Topic Model Methodology: Topic Model Estimation via K-means. Theodore T. Allen, Ph.D.; Zhenhuan Zhang. U.S. Army TRADOC Analysis Center-Monterey, 700 Dyer Road. January 2017. Contract number W9124N-15-P-0022.
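The k-means step at the core of such topic-model estimation can be sketched with plain Lloyd's algorithm; the 2-D points below stand in for document feature vectors, and the fixed iteration count is a simplification:

```python
def kmeans(points, centers, iters=10):
    # Plain Lloyd's algorithm: assign each point to its nearest center,
    # then move each center to the mean of its assigned points.
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else c
                   for cl, c in zip(clusters, centers)]
    return centers
```

In the topic-modeling setting, each resulting center is a cluster of documents that a subject matter expert can then inspect, label, and refine.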
Center of Excellence for Hypersonics Research
2012-01-25
detailed simulations of actual combustor configurations, and ultimately for the optimization of hypersonic air-breathing propulsion system flow paths... vehicle development programs. The Center engaged leading experts in experimental and computational analysis of hypersonic flows to provide research... advanced hypersonic vehicles and space access systems will require significant advances in the design methods and ground testing techniques to ensure
Postural control and perceptive configuration: influence of expertise in gymnastics.
Gautier, Geoffroy; Thouvarecq, Régis; Vuillerme, Nicolas
2008-07-01
The purpose of the present experiment was to investigate how postural adaptations to the perceptive configuration are modified by specific gymnastics experience. Two groups, one expert in gymnastics and the other non-expert, had to maintain an erect posture while optical flow was imposed as follows: 20 s motionless, 30 s approaching motion, and 20 s motionless. The centre of pressure and head displacements were analysed. The postural adaptations were characterised by the variability of movements for the flow conditions and by the postural latencies for the flow transitions. The results showed that the gymnasts tended to minimise their body movements and were more stationary (head) but not more stable (COP) than the non-gymnasts. These results suggest that gymnastics experience develops a specific postural adaptability relative to the perceptive configuration. We conclude that specific postural experience can be considered an intrinsic constraint, which leads to modification in the patterns of functional adaptation in the perceptive motor space.
Water Sensation During Passive Propulsion for Expert and Nonexpert Swimmers.
Kusanagi, Kenta; Sato, Daisuke; Hashimoto, Yasuhiro; Yamada, Norimasa
2017-06-01
This study determined whether expert swimmers, compared with nonexperts, have superior movement perception and physical sensations of propulsion in water. Expert (national-level competitors, n = 10) and nonexpert (able to swim 50 m in >3 styles, n = 10) swimmers estimated distance traveled in water with their eyes closed. Both groups indicated their subjective physical sensations in the water. For each of two trials, two-dimensional coordinates were obtained from video recordings using the two-dimensional direct linear transformation method for calculating changes in speed. The mean absolute error of the difference between the actual and estimated distance traveled in the water was significantly lower for expert swimmers (0.90 ± 0.71 m) compared with nonexpert swimmers (3.85 ± 0.84 m). Expert swimmers described the sensation of propulsion in water in cutaneous terms such as a "sense of flow" and a sensation of "skin resistance." Therefore, expert swimmers appear to have a superior sense of distance during their movement in the water compared with that of nonexpert swimmers. In addition, expert swimmers may have a better perception of movement in water. We propose that expert swimmers integrate cutaneous sensations and proprioceptive senses, enabling them to better perceive and estimate distance moved through water.
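The reported accuracy measure is a mean absolute error between actual and estimated distances across trials, which can be computed as:

```python
def mean_absolute_error(actual, estimated):
    # Mean absolute error (here in meters): the average unsigned difference
    # between actual and estimated distances over a swimmer's trials.
    return sum(abs(a - e) for a, e in zip(actual, estimated)) / len(actual)
```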
Model of critical diagnostic reasoning: achieving expert clinician performance.
Harjai, Prashant Kumar; Tiwari, Ruby
2009-01-01
Diagnostic reasoning refers to the analytical processes used to determine patient health problems. While the education curriculum and health care system focus on training nurse clinicians to accurately recognize and rescue clinical situations, assessments of non-expert nurses have yielded less than satisfactory data on diagnostic competency. The contrast between the expert and non-expert nurse clinician raises the important question of how differences in thinking may contribute to a large divergence in accurate diagnostic reasoning. This article identifies superior organization of one's knowledge base (using prototypes) and quick retrieval of pertinent information (using similarity recognition) as two reasons for the expert's superior diagnostic performance. A model of critical diagnostic reasoning, using prototypes and similarity recognition, is proposed and elucidated using case studies. This model serves as a starting point toward bridging the gap between clinical data and accurate problem identification, verification, and management, while providing a structure for a knowledge exchange between expert and non-expert clinicians.
A Model-Based Expert System for Space Power Distribution Diagnostics
NASA Technical Reports Server (NTRS)
Quinn, Todd M.; Schlegelmilch, Richard F.
1994-01-01
When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems that perform model-based diagnosis. A model-based diagnostic expert system for the Space Station Freedom electrical power distribution test bed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Originally, constraint suspension techniques were developed for digital systems. However, Marple provides the mechanisms for applying this approach to analog systems such as the test bed, as well. The expert system was developed using Marple and Lucid Common Lisp running on a Sun Sparc-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This report describes work completed to date and lessons learned while employing model-based diagnostics using constraint suspension within an analog system.
A comprehensive information technology system to support physician learning at the point of care.
Cook, David A; Sorensen, Kristi J; Nishimura, Rick A; Ommen, Steve R; Lloyd, Farrell J
2015-01-01
MayoExpert is a multifaceted information system integrated with the electronic medical record (EMR) across Mayo Clinic's multisite health system. It was developed as a technology-based solution to manage information, standardize clinical practice, and promote and document learning in clinical contexts. Features include urgent test result notifications; models illustrating expert-approved care processes; concise, expert-approved answers to frequently asked questions (FAQs); a directory of topic-specific experts; and a portfolio for provider licensure and credentialing. The authors evaluate MayoExpert's reach, effectiveness, adoption, implementation, and maintenance. Evaluation data sources included usage statistics, user surveys, and pilot studies. As of October 2013, MayoExpert was available at 94 clinical sites in 12 states and contained 1,368 clinical topics, answers to 7,640 FAQs, and 92 care process models. In 2012, MayoExpert was accessed at least once by 2,578/3,643 (71%) staff physicians, 900/1,374 (66%) midlevel providers, and 1,728/2,291 (75%) residents and fellows. In a 2013 survey of MayoExpert users with 536 respondents, all features were highly rated (≥67% favorable). More providers reported using MayoExpert to answer questions before/after than during patient visits (68% versus 36%). During November 2012 to April 2013, MayoExpert sent 1,660 notifications of new-onset atrial fibrillation and 1,590 notifications of prolonged QT. MayoExpert has become part of routine clinical and educational operations, and its care process models now define Mayo Clinic best practices. MayoExpert's infrastructure and content will continue to expand with improved templates and content organization, new care process models, additional notifications, better EMR integration, and improved support for credentialing activities.
Wei, Zhenglun Alan; Sonntag, Simon Johannes; Toma, Milan; Singh-Gryzbon, Shelly; Sun, Wei
2018-04-19
The governing international standard for the development of prosthetic heart valves is International Organization for Standardization (ISO) 5840. This standard requires the assessment of the thrombus potential of transcatheter heart valve substitutes using an integrated thrombus evaluation. Besides experimental flow field assessment and ex vivo flow testing, computational fluid dynamics is a critical component of this integrated approach. This position paper is intended to provide and discuss best practices for the setup of a computational model, numerical solving, post-processing, data evaluation and reporting, as it relates to transcatheter heart valve substitutes. This paper is not intended to be a review of current computational technology; instead, it represents the position of the ISO working group consisting of experts from academia and industry with regards to considerations for computational fluid dynamic assessment of transcatheter heart valve substitutes.
A Model of How Different Biology Experts Explain Molecular and Cellular Mechanisms
ERIC Educational Resources Information Center
Trujillo, Caleb M.; Anderson, Trevor R.; Pelaez, Nancy J.
2015-01-01
Constructing explanations is an essential skill for all science learners. The goal of this project was to model the key components of expert explanation of molecular and cellular mechanisms. As such, we asked: What is an appropriate model of the components of explanation used by biology experts to explain molecular and cellular mechanisms? Do…
Samson, M; Monnet, T; Bernard, A; Lacouture, P; David, L
2018-01-23
The propulsive forces generated by the hands and arms of swimmers have so far been determined essentially by quasi-steady approaches. This study aims to quantify the temporal dependence of the hydrodynamic forces for a simple translation movement: an impulsive start from rest. The study, carried out in unsteady numerical simulation, couples the calculation of the lift and the drag on an expert swimmer hand-forearm model with visualizations of the flow and flow vortex structure analysis. The results of these simulations show that the hand and forearm hydrodynamic forces should be studied from an unsteady approach because the quasi-steady model is inadequate. It also appears that the delayed stall effect generates higher circulatory forces during a short translation at high angle of attack than forces calculated under steady state conditions. During this phase the hand force coefficients are approximately twice as large as those of the forearm. The total force coefficients are highest for angles of attack between 40° and 60°. For the same angle of attack, the forces produced when the leading edge is the thumb side are slightly greater than those produced when the leading edge is the little finger side.
Development of S-ARIMA Model for Forecasting Demand in a Beverage Supply Chain
NASA Astrophysics Data System (ADS)
Mircetic, Dejan; Nikolicic, Svetlana; Maslaric, Marinko; Ralevic, Nebojsa; Debelic, Borna
2016-11-01
Demand forecasting is one of the key activities in planning the freight flows in supply chains, and accordingly it is essential for planning and scheduling logistic activities within the observed supply chain. Accurate demand forecasting models directly influence the decrease of logistics costs, since they provide an assessment of customer demand. Customer demand is a key component for planning all logistic processes in a supply chain, and therefore determining levels of customer demand is of great interest to supply chain managers. In this paper we deal with exactly this kind of problem, and we develop a seasonal Autoregressive Integrated Moving Average (SARIMA) model for forecasting demand patterns of a major product of an observed beverage company. The model is easy to understand, flexible to use and appropriate for assisting the expert in the decision-making process about consumer demand in particular periods.
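The seasonal-differencing idea at the core of such a model can be sketched in a few lines of pure Python. This is a minimal illustration of a SARIMA(1,0,0)(0,1,0)_m simplification (seasonal differencing at lag m plus a least-squares AR(1) on the differenced series); the seasonal period, data, and model orders below are assumptions for illustration, not the authors' fitted model.

```python
# Minimal sketch of a SARIMA-style forecast: seasonally difference the
# series at lag m, fit AR(1) on the differences, forecast, then invert
# the differencing. Illustrative only; a production model would be fit
# with a dedicated library such as statsmodels.

def seasonal_arima_forecast(series, m=12, steps=1):
    """Forecast `steps` values ahead for a series with seasonal period m."""
    # Seasonal differencing: d[t] = y[t] - y[t-m]
    d = [series[i] - series[i - m] for i in range(m, len(series))]
    # Least-squares AR(1) coefficient phi for d[t] = phi * d[t-1]
    num = sum(d[t] * d[t - 1] for t in range(1, len(d)))
    den = sum(x * x for x in d[:-1]) or 1.0
    phi = num / den
    history, diffs, out = list(series), list(d), []
    for _ in range(steps):
        next_d = phi * diffs[-1]          # AR(1) forecast of the difference
        nxt = history[-m] + next_d        # invert the seasonal difference
        history.append(nxt)
        diffs.append(next_d)
        out.append(nxt)
    return out
```

For a series with a stable seasonal pattern plus linear growth, the sketch reproduces the pattern one season ahead.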
Synthesising empirical results to improve predictions of post-wildfire runoff and erosion response
Shakesby, Richard A.; Moody, John A.; Martin, Deborah A.; Robichaud, Peter R.
2016-01-01
Advances in research into wildfire impacts on runoff and erosion have demonstrated increasing complexity of controlling factors and responses, which, combined with changing fire frequency, present challenges for modellers. We convened a conference attended by experts and practitioners in post-wildfire impacts, meteorology and related research, including modelling, to focus on priority research issues. The aim was to improve our understanding of controls and responses and the predictive capabilities of models. This conference led to the eight selected papers in this special issue. They address aspects of the distinctiveness in the controls and responses among wildfire regions, spatiotemporal rainfall variability, infiltration, runoff connectivity, debris flow formation and modelling applications. Here we summarise key findings from these papers and evaluate their contribution to improving understanding and prediction of post-wildfire runoff and erosion under changes in climate, human intervention and population pressure on wildfire-prone areas.
Modeling effects of climate change on Yakima River salmonid habitats
Hatten, James R.; Batt, Thomas R.; Connolly, Patrick J.; Maule, Alec G.
2014-01-01
We evaluated the potential effects of two climate change scenarios on salmonid habitats in the Yakima River by linking the outputs from a watershed model, a river operations model, a two-dimensional (2D) hydrodynamic model, and a geographic information system (GIS). The watershed model produced a discharge time series (hydrograph) in two study reaches under three climate scenarios: a baseline (1981–2005), a 1-°C increase in mean air temperature (plus one scenario), and a 2-°C increase (plus two scenario). A river operations model modified the discharge time series with Yakima River operational rules, a 2D model provided spatially explicit depth and velocity grids for two floodplain reaches, while an expert panel provided habitat criteria for four life stages of coho and fall Chinook salmon. We generated discharge-habitat functions for each salmonid life stage (e.g., spawning, rearing) in main stem and side channels, and habitat time series for baseline, plus one (P1) and plus two (P2) scenarios. The spatial and temporal patterns in salmonid habitats differed by reach, life stage, and climate scenario. Seventy-five percent of the 28 discharge-habitat responses exhibited a decrease in habitat quantity, with the P2 scenario producing the largest changes, followed by P1. Fry and spring/summer rearing habitats were the most sensitive to warming and flow modification for both species. Side channels generally produced more habitat than main stem and were more responsive to flow changes, demonstrating the importance of lateral connectivity in the floodplain. A discharge-habitat sensitivity analysis revealed that proactive management of regulated surface waters (i.e., increasing or decreasing flows) might lessen the impacts of climate change on salmonid habitats.
Thogmartin, Wayne E.; Sanders-Reed, Carol A.; Szymanski, Jennifer; Pruitt, Lori; Runge, Michael C.
2017-01-01
Demographic characteristics of bats are often insufficiently described for modeling populations. In data-poor situations, experts are often relied upon to characterize ecological systems. In concert with the development of a matrix model describing Indiana bat (Myotis sodalis) demography, we elicited estimates for parameterizing this model from 12 experts. We conducted this elicitation in two stages, requesting expert values for 12 demographic rates. These rates were adult and juvenile seasonal (winter, summer, fall) survival rates, pup survival in fall, and propensity and success at breeding. Experts were most in agreement about adult fall survival (3% coefficient of variation) and least in agreement about propensity of juveniles to breed (37% CV). The experts showed greater concordance for adult (mean CV = 6.2%) than for juvenile parameters (mean CV = 16.4%), and slightly more agreement for survival (mean CV = 9.8%) than for reproductive rates (mean CV = 15.1%). However, survival and reproduction were negatively and positively biased, respectively, relative to a stationary dynamic. Despite the species exhibiting near-stationary dynamics for two decades prior to the onset of a potential extinction-causing agent, white-nose syndrome, expert estimates indicated a population decline of -11% per year (95% CI = -2%, -20%); quasi-extinction was predicted within a century (mean = 61 years to QE, range = 32-97) by 10 of the 12 experts. Were we to use these expert estimates in our modeling efforts, we would have errantly trained our models to a rapidly declining demography unrepresentative of recent demographic behavior. While experts are sometimes the only source of information, a clear understanding of the temporal and spatial context of the information being elicited is necessary to guard against wayward predictions.
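The agreement statistic used throughout this abstract, the coefficient of variation (CV = standard deviation / mean) of one elicited rate across experts, is straightforward to compute. The survival values below are invented placeholders for illustration; the paper's elicited values are not reproduced here.

```python
# Sketch of the expert-agreement measure: coefficient of variation (%)
# across the experts' estimates of a single demographic rate.
import statistics

def expert_cv(estimates):
    """Coefficient of variation (%) of one elicited rate across experts."""
    return 100.0 * statistics.stdev(estimates) / statistics.mean(estimates)

# Hypothetical adult fall survival rates elicited from 12 experts;
# a low CV (a few percent) indicates strong agreement among them.
adult_fall = [0.95, 0.96, 0.94, 0.97, 0.95, 0.96,
              0.95, 0.94, 0.96, 0.95, 0.97, 0.94]
```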
NASA Astrophysics Data System (ADS)
Wesemann, Johannes; Burgholzer, Reinhard; Herrnegger, Mathew; Schulz, Karsten
2017-04-01
In recent years, a lot of research in hydrological modelling has been invested in improving the automatic calibration of rainfall-runoff models. This includes, for example, (1) the implementation of new optimisation methods, (2) the incorporation of new and different objective criteria and signatures in the optimisation and (3) the usage of auxiliary data sets apart from runoff. Nevertheless, in many applications manual calibration is still justifiable and frequently applied. The hydrologist performing the manual calibration can, with his or her expert knowledge, judge the hydrographs simultaneously in detail and in a holistic view. This integrated eye-ball verification procedure available to a human expert can be difficult to formulate in objective criteria, even when using a multi-criteria approach. Comparing the results of automatic and manual calibration is not straightforward. Automatic calibration often solely involves objective criteria such as the Nash-Sutcliffe efficiency coefficient or the Kling-Gupta efficiency as a benchmark during the calibration. Consequently, a comparison based on such measures is intrinsically biased towards automatic calibration. Additionally, objective criteria do not cover all aspects of a hydrograph, leaving questions concerning the quality of a simulation open. This contribution therefore seeks to examine the quality of manually and automatically calibrated hydrographs by interactively involving expert knowledge in the evaluation. Simulations have been performed for the Mur catchment in Austria with the rainfall-runoff model COSERO using two parameter sets evolved from a manual and an automatic calibration. A subset of the resulting hydrographs for observation and simulation, representing the typical flow conditions and events, will be evaluated in this study. 
In an interactive crowdsourcing approach experts attending the session can vote for their preferred simulated hydrograph without having information on the calibration method that produced the respective hydrograph. Therefore, the result of the poll can be seen as an additional quality criterion for the comparison of the two different approaches and help in the evaluation of the automatic calibration method.
NASA Astrophysics Data System (ADS)
Sandeep, N.; Animasaun, I. L.
2017-06-01
Within the last few decades, experts and scientists dealing with the flow of non-Newtonian fluids (most especially Casson fluid) have confirmed the existence of such flow on a stretchable surface with low heat energy (i.e. near absolute zero of temperature). This article presents the three-dimensional motion of such a fluid. The influence of a uniform space-dependent internal heat source on the intermolecular forces holding the molecules of the Casson fluid is investigated. It is assumed that the stagnation flow was induced by an external force (pressure gradient) together with an impulsive motion. Based on these assumptions, variable thermophysical properties are most suitable; hence a modified kinematic viscosity model is presented. The system of governing equations of the three-dimensional unsteady Casson fluid was non-dimensionalized using a suitable similarity transformation, which unravels the behaviour of the flow over the short period after start-up. The numerical solution of the corresponding boundary value problem (ODE) was obtained using a fourth-order Runge-Kutta method along with a shooting technique. When the magnitudes of the velocity ratio parameters are greater than unity, the intermolecular forces holding the molecules of the Casson fluid in both horizontal directions break down continuously with an increase in the Casson parameter, and this leads to an increase in the velocity profiles in both directions.
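The solution method named above, fourth-order Runge-Kutta integration combined with a shooting technique for a boundary value problem, can be sketched on a toy BVP. The Casson-fluid equations themselves are not reproduced; the sketch instead solves y'' = -y with y(0) = 0, y(1) = 1, whose exact solution is sin(x)/sin(1), so the correct initial slope is 1/sin(1).

```python
# Shooting method: guess the unknown initial slope y'(0), integrate the
# initial value problem with classical RK4, and bisect on the slope until
# the far boundary condition y(1) = 1 is met.

def rk4_step(f, y, t, h):
    """One classical fourth-order Runge-Kutta step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def shoot(slope, n=100):
    """Integrate y'' = -y from x=0 with y(0)=0, y'(0)=slope; return y(1)."""
    f = lambda t, y: [y[1], -y[0]]   # state vector [y, y']
    y, h = [0.0, slope], 1.0 / n
    for i in range(n):
        y = rk4_step(f, y, i * h, h)
    return y[0]

def shooting_method(target=1.0, lo=0.0, hi=5.0, tol=1e-10):
    """Bisection on the initial slope until y(1) hits the target boundary value."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if shoot(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

Bisection is used here for simplicity; shooting implementations often use a secant or Newton update on the slope instead.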
Kiefer, Stephan; Schäfer, Michael; Bransch, Marco; Brimmers, Peter; Bartolomé, Diego; Baños, Janie; Orr, James; Jones, Dave; Jara, Maximilian; Stockmann, Martin
2014-01-01
A personal health system platform for the management of patients with chronic liver disease that incorporates a novel approach to integrate decision support and guidance through care pathways for patients and their doctors is presented in this paper. The personal health system incorporates an integrated decision support engine that guides patients and doctors through the management of the disease by issuing tasks and providing recommendations to both the care team and the patient and by controlling the execution of a Care Flow Plan based on the results of tasks and the monitored health status of the patient. This Care Flow Plan represents a formal, business process based model of disease management designed off-line by domain experts on the basis of clinical guidelines, knowledge of care pathways and an organisational model for integrated, patient-centred care. In this way, remote monitoring and treatment are dynamically adapted to the patient's actual condition and clinical symptoms and allow flexible delivery of care with close integration of specialists, therapists and care-givers.
NASA Technical Reports Server (NTRS)
Stephan, Amy; Erikson, Carol A.
1991-01-01
As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. In developing expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert systems and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well-designed and thoroughly integrated application.
How much expert knowledge is it worth to put in conceptual hydrological models?
NASA Astrophysics Data System (ADS)
Antonetti, Manuel; Zappa, Massimiliano
2017-04-01
Both modellers and experimentalists agree on using expert knowledge to improve conceptual hydrological simulations on ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use very simplified (e.g. topography-based) mapping approaches and put most of their knowledge into constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed and qualitative knowledge about processes to obtain a spatial distribution of areas with different dominant runoff generation processes (DRPs) that is as realistic as possible, and to define plausible narrow value ranges for each model parameter. Since the modelling goal is, most of the time, exclusively to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfying results due to the equifinality of hydrological models, overfitting problems and the numerous uncertainty sources affecting runoff simulations. Therefore, to test to which extent expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' modelling framework, relying on parameter and process constraints defined on the basis of expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. Simulation results highlighted the potential added value of using all the expert knowledge available on a catchment. Also, combinations of event types and landscapes where even a simplified mapping approach can lead to satisfying results were identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to meteorological input data and catchment initial conditions.
NASA Astrophysics Data System (ADS)
Nijzink, R. C.; Samaniego, L.; Mai, J.; Kumar, R.; Thober, S.; Zink, M.; Schäfer, D.; Savenije, H. H. G.; Hrachowitz, M.
2015-12-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affect the partitioning of water and energy. However, it remains unclear to which extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated in the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge based model constraints reduces model uncertainty; and (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both, the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as overall measure for model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. 
The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 % respectively, compared to the base case of the unconstrained mHM. Most significant improvements in signature representations were, in particular, achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. Besides, it was shown that suitable semi-quantitative prior constraints in combination with the transfer function based regularization approach of mHM, can be beneficial for spatial model transferability as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Samaniego, Luis; Mai, Juliane; Kumar, Rohini; Thober, Stephan; Zink, Matthias; Schäfer, David; Savenije, Hubert H. G.; Hrachowitz, Markus
2016-03-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated into the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and whether (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. 
The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 %, respectively, compared to the base case of the unconstrained mHM. Most significant improvements in signature representations were, in particular, achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
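The overall performance measure used in both versions of this study, the Euclidean distance to the optimal model in a space of hydrological signatures, can be sketched directly. The signature names and values below are illustrative assumptions, not data from the study; signatures are assumed to be pre-normalized to comparable scales.

```python
# Sketch of the evaluation measure: Euclidean distance of a model run to
# the (hypothetical) optimal model across a set of normalized signatures.
# A distance of 0 would mean a perfect match on every signature.
import math

def euclidean_signature_distance(model_sig, optimal_sig):
    return math.sqrt(sum((model_sig[k] - optimal_sig[k]) ** 2
                         for k in optimal_sig))

# Invented signature values for illustration only:
optimal = {"runoff_ratio": 1.0, "baseflow_index": 1.0, "low_flow_q95": 1.0}
mhm     = {"runoff_ratio": 0.8, "baseflow_index": 0.7, "low_flow_q95": 0.6}
mhmtopo = {"runoff_ratio": 0.9, "baseflow_index": 0.8, "low_flow_q95": 0.7}
```

A smaller distance for mHMtopo than for mHM would correspond to the kind of average improvement reported when sub-grid heterogeneity is added.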
Acute asthma severity identification of expert system flow in emergency department
NASA Astrophysics Data System (ADS)
Sharif, Nurul Atikah Mohd; Ahmad, Norazura; Ahmad, Nazihah; Desa, Wan Laailatul Hanim Mat
2017-11-01
The integration of computerized systems in healthcare management helps smooth the documentation of patient records, gives ready access to knowledge and clinical practice guidelines, and supports decision making. Exploiting advances in artificial intelligence such as fuzzy logic and rule-based reasoning may improve the management of the emergency department under conditions of uncertainty and strengthen the adherence of medical practice to clinical guidelines. This paper presents details of the emergency department flow for acute asthma severity identification, with the embedding of the acute asthma severity identification expert system (AASIES). Currently, AASIES is still in a preliminary stage of system validation. However, it is hoped that the implementation of AASIES in asthma bay management can reduce the use of paper for manual documentation and be a pioneer for the development of a more complex decision support system to smooth ED management and make it more systematic.
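The kind of rule-based severity classification such a system embeds in the emergency-department flow can be sketched as a small decision function. The vital-sign thresholds and categories below are invented placeholders for illustration, not the clinical guideline values or rules used by AASIES.

```python
# Sketch of rule-based acute asthma severity identification. The rules
# fire in priority order: severe indicators first, then moderate, else mild.
# All thresholds are hypothetical placeholders, not clinical guidance.

def asthma_severity(resp_rate, heart_rate, spo2, speaks_in):
    """Classify severity from simple vital signs and speech pattern."""
    if spo2 < 90 or speaks_in == "words":
        return "severe"
    if resp_rate > 30 or heart_rate > 120 or speaks_in == "phrases":
        return "moderate"
    return "mild"
```

A fuzzy-logic variant would replace the hard thresholds with membership functions and combine rule activations, which is one of the AI techniques the abstract mentions.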
Ask the experts: the challenges and benefits of flow chemistry to optimize drug development.
Anderson, Neal; Gernaey, Krist V; Jamison, Timothy F; Kircher, Manfred; Wiles, Charlotte; Leadbeater, Nicholas E; Sandford, Graham; Richardson, Paul
2012-09-01
Against a backdrop of a struggling economic and regulatory climate, pharmaceutical companies have recently been forced to develop new ways to provide more efficient technology to meet the demands of a competitive drug industry. This issue, coupled with an increase in patent legislation and a rising generics market, makes these themes common issues in the growth of drug development. As a consequence, the importance of process chemistry and scale-up has never been more under the spotlight. Future Medicinal Chemistry wishes to share the thoughts and opinions of a variety of experts from this field, discussing issues concerning the use of flow chemistry to optimize drug development, the potential regulatory and environmental challenges faced with this, and whether the academic and industrial sectors could benefit from a more harmonized system relevant to process chemistry.
Rotary Wing Propulsion Specialists' Meeting, Williamsburg, VA, Nov. 13-15, 1990, Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-01-01
Topics presented include sound diffraction at a sharp trailing edge in a supersonic flow, the MTR390 turboshaft development program, a progress report on the electrostatic engine monitoring system, some corrosion-resistant magnesium alloys, handling severe inlet conditions in aircraft fuel pumps, and an overview of inlet protection systems for Army aircraft. Also presented are the advanced control system architecture for the T800 engine, an expert system to perform on-line controller restructuring for abrupt model changes, an enhanced APU for the H-60 series and SH-2G helicopters, and a linear theory of the North Atlantic blocking during January 1979.
Flow-Visualization Techniques Used at High Speed by Configuration Aerodynamics Wind-Tunnel-Test Team
NASA Technical Reports Server (NTRS)
Lamar, John E. (Editor)
2001-01-01
This paper summarizes a variety of optically based flow-visualization techniques used for high-speed research by the Configuration Aerodynamics Wind-Tunnel Test Team of the High-Speed Research Program during its tenure. The work of other national experts is included for completeness. Details of each technique with applications and status in various national wind tunnels are given.
Automatic building information model query generation
Jiang, Yufei; Yu, Nan; Ming, Jiang; ...
2015-12-01
Energy-efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges of data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, which can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.
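The core idea, compiling a declarative view definition into executable query code over a building model, can be sketched with a toy in-memory model. The dictionary "BIM model", the MVD structure, and the entity names below are invented for illustration; the real QueryGenerator targets IFC data held on a BIM server.

```python
# Sketch of MVD-driven query generation: an MVD-like view definition is
# compiled into a filter-and-project function over model elements.

def generate_query(mvd):
    """Compile a view definition into a query over a list of elements."""
    wanted = set(mvd["entities"])
    attrs = mvd.get("attributes", [])

    def query(model):
        # Keep elements of the requested entity types, projected onto
        # the requested attributes (or returned whole if none are given).
        return [{a: e.get(a) for a in attrs} if attrs else e
                for e in model if e["type"] in wanted]
    return query

# Hypothetical partial-model request: spaces and their zones, as might be
# needed for a multi-zone airflow analysis.
mvd = {"entities": ["IfcSpace"], "attributes": ["id", "zone"]}
model = [{"type": "IfcSpace", "id": "S1", "zone": "Z1"},
         {"type": "IfcWall",  "id": "W1", "zone": None},
         {"type": "IfcSpace", "id": "S2", "zone": "Z2"}]
```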
Web-Based Environment for Maintaining Legacy Software
NASA Technical Reports Server (NTRS)
Tigges, Michael; Thompson, Nelson; Orr, Mark; Fox, Richard
2007-01-01
Advanced Tool Integration Environment (ATIE) is the name of both a software system and a Web-based environment, created by the system, for maintaining an archive of legacy software and of the expertise involved in developing that software. ATIE can also be used in modifying legacy software and developing new software. The information that can be encapsulated in ATIE includes experts' documentation, input and output data of test cases, source code, and compilation scripts. All of this information is available within a common environment and retained in a database for ease of access and recovery by use of powerful search engines. ATIE also accommodates the embedding of supporting software that users require for their work, and even enables access to supporting commercial-off-the-shelf (COTS) software within the flow of the expert's work. The flow of work can be captured by saving the sequence of computer programs that the expert uses. A user gains access to ATIE via a Web browser. A modern Web-based graphical user interface promotes efficiency in the retrieval, execution, and modification of legacy code. Thus, ATIE saves time and money in the support of new and pre-existing programs.
A CLIPS-based expert system for the evaluation and selection of robots
NASA Technical Reports Server (NTRS)
Nour, Mohamed A.; Offodile, Felix O.; Madey, Gregory R.
1994-01-01
This paper describes the development of a prototype expert system for intelligent selection of robots for manufacturing operations. The paper first develops a comprehensive, three-stage process to model the robot selection problem. The decisions involved in this model easily lend themselves to an expert system application. A rule-based system, based on the selection model, is developed using the CLIPS expert system shell. Data about actual robots is used to test the performance of the prototype system. Further extensions to the rule-based system for data handling and interfacing capabilities are suggested.
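The rule-based screening stage of such a selection model can be sketched in a few lines. The rules, robot attributes, and values below are invented for illustration; the CLIPS prototype's knowledge base and its three-stage selection process are far richer than this filter.

```python
# Sketch of rule-based robot screening: each rule encodes one requirement,
# and a candidate robot survives only if every rule is satisfied.
# Attribute names and thresholds are hypothetical placeholders.

def select_robots(robots, req):
    rules = [
        lambda r: r["payload_kg"] >= req["payload_kg"],
        lambda r: r["reach_mm"] >= req["reach_mm"],
        lambda r: r["repeatability_mm"] <= req["repeatability_mm"],
    ]
    return [r["name"] for r in robots if all(rule(r) for rule in rules)]

robots = [
    {"name": "R1", "payload_kg": 10, "reach_mm": 1300, "repeatability_mm": 0.05},
    {"name": "R2", "payload_kg": 5,  "reach_mm": 900,  "repeatability_mm": 0.02},
]
requirements = {"payload_kg": 8, "reach_mm": 1200, "repeatability_mm": 0.1}
```

In CLIPS itself these checks would be written as production rules matching facts in working memory rather than as Python predicates.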
Turon, Clàudia; Comas, Joaquim; Torrens, Antonina; Molle, Pascal; Poch, Manel
2008-01-01
With the aim of improving effluent quality of waste stabilization ponds, different designs of vertical flow constructed wetlands and intermittent sand filters were tested on an experimental full-scale plant within the framework of a European project. The information extracted from this study was completed and updated with heuristic and bibliographic knowledge. The data and knowledge acquired were difficult to integrate into mathematical models because they involve qualitative information and expert reasoning. Therefore, it was decided to develop an environmental decision support system (EDSS-Filter-Design) as a tool to integrate mathematical models and knowledge-based techniques. This paper describes the development of this support tool, emphasizing the collection of data and knowledge and representation of this information by means of mathematical equations and a rule-based system. The developed support tool provides the main design characteristics of filters: (i) required surface, (ii) media type, and (iii) media depth. These design recommendations are based on wastewater characteristics, applied load, and required treatment level data provided by the user. The results of the EDSS-Filter-Design provide appropriate and useful information and guidelines on how to design filters, according to the expert criteria. The encapsulation of the information into a decision support system reduces the design period and provides a feasible, reasoned, and positively evaluated proposal.
Wahl, Jochen; Barleon, Lorenz; Morfeld, Peter; Lichtmeß, Andrea; Haas-Brähler, Sibylle; Pfeiffer, Norbert
2016-01-01
Purpose To develop an expert system for glaucoma screening in a working population based on a human expert procedure using images of the optic nerve head (ONH), visual field (frequency doubling technology, FDT) and intraocular pressure (IOP). Methods 4167 of 13037 (32%) employees of Evonik Industries between 40 and 65 years of age were screened. An experienced glaucoma expert (JW) assessed papilla parameters and evaluated all individual screening results. His classification into “no glaucoma”, “possible glaucoma” and “probable glaucoma” was defined as the “gold standard”. A screening model was developed and tested against the gold standard. This model took into account the assessment of the ONH. Values and relationships of CDR, IOP and the FDT were considered additionally, and a glaucoma score was generated. The structure of the screening model was specified a priori, whereas values of the parameters were chosen post hoc to optimize sensitivity and specificity of the algorithm. Simple screening models based on IOP and/or FDT were investigated for comparison. Results 111 persons (2.66%) were classified as glaucoma suspects by the expert, thereof 13 (0.31%) as probable and 98 (2.35%) as possible glaucoma suspects. Re-evaluation by the screening model revealed a sensitivity of 83.8% and a specificity of 99.6% for all glaucoma suspects. The positive predictive value of the model was 80.2%, the negative predictive value 99.6%. Simple screening models showed insufficient diagnostic accuracy. Conclusion Adjustment of ONH and symmetry parameters with respect to excavation and IOP in an expert system produced satisfactory diagnostic accuracy. This screening model appears applicable in such a working population with relatively low age and low glaucoma prevalence. Different experts should validate the model in different populations. PMID:27479301
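The accuracy figures quoted above (sensitivity, specificity, positive and negative predictive values) all follow from a standard confusion-matrix calculation. A minimal sketch in Python; the function name and the example counts are illustrative, not taken from the paper:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy measures from confusion-matrix counts:
    tp/fp = true/false positives, fn/tn = false/true negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true cases detected
        "specificity": tn / (tn + fp),  # fraction of non-cases correctly cleared
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

For example, `screening_metrics(90, 10, 10, 890)` (hypothetical counts) yields a sensitivity and PPV of 0.9 and a specificity and NPV of about 0.989, the same four quantities the abstract reports for its model.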
Crowd-assisted polyp annotation of virtual colonoscopy videos
NASA Astrophysics Data System (ADS)
Park, Ji Hwan; Nadeem, Saad; Marino, Joseph; Baker, Kevin; Barish, Matthew; Kaufman, Arie
2018-03-01
Virtual colonoscopy (VC) allows a radiologist to navigate through a 3D colon model reconstructed from a computed tomography scan of the abdomen, looking for polyps, the precursors of colon cancer. Polyps are seen as protrusions on the colon wall and haustral folds, visible in the VC fly-through videos. A complete review of the colon surface requires full navigation from the rectum to the cecum in antegrade and retrograde directions, which is a tedious task that takes an average of 30 minutes. Crowdsourcing is a technique for non-expert users to perform certain tasks, such as image or video annotation. In this work, we use crowdsourcing for the examination of complete VC fly-through videos for polyp annotation by non-experts. The motivation for this is to potentially help the radiologist reach a diagnosis in a shorter period of time, and to provide a stronger confirmation of the eventual diagnosis. The crowdsourcing interface includes an interactive tool for the crowd to annotate suspected polyps in the video with an enclosing box. Using our workflow, we achieve an overall polyps-per-patient sensitivity of 87.88% (95.65% for polyps >=5mm and 70% for polyps <5mm). We also demonstrate the efficacy and effectiveness of non-expert users in detecting and annotating polyps, and discuss their potential for aiding radiologists in VC examinations.
Jarrett, G. Lynn; Downs, Aimee C.; Grace-Jarrett, Patricia A.
1998-01-01
The Hydrological Simulation Program-FORTRAN (HSPF) was applied to an urban drainage basin in Jefferson County, Kentucky, to integrate the large amounts of information being collected on water quantity and quality into an analytical framework that could be used as a management and planning tool. Hydrologic response units were developed using geographic data and a K-means analysis to characterize important hydrologic and physical factors in the basin. The Hydrological Simulation Program-FORTRAN Expert System (HSPEXP) was used to calibrate the model parameters for the Middle Fork Beargrass Creek Basin against 3 years (June 1, 1991, to May 31, 1994) of 5-minute streamflow and precipitation time series, and 3 years of hourly pan-evaporation time series. The calibrated model parameters were applied to the South Fork Beargrass Creek Basin for confirmation. The confirmation results indicated that the model simulated the system within acceptable tolerances. The coefficient of determination and coefficient of model-fit efficiency between simulated and observed daily flows were 0.91 and 0.82, respectively, for model calibration and 0.88 and 0.77, respectively, for model confirmation. The model is most sensitive to estimates of the area of effective impervious land in the basin; the spatial distribution of rainfall; and the lower-zone evapotranspiration, lower-zone nominal storage, and infiltration-capacity parameters during recession and low-flow periods. The error contribution from these sources varies with season and antecedent conditions.
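The two goodness-of-fit statistics reported above, the coefficient of determination and the coefficient of model-fit efficiency (Nash-Sutcliffe efficiency), can be sketched as follows. Illustrative Python under standard definitions, not code from HSPEXP:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Coefficient of model-fit efficiency: 1 minus the ratio of the sum of
    squared simulation errors to the variance of the observations.
    1.0 is a perfect fit; values below 0 mean the mean of the observations
    predicts better than the model."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def coefficient_of_determination(obs, sim):
    """Squared Pearson correlation between observed and simulated flows."""
    r = np.corrcoef(np.asarray(obs, float), np.asarray(sim, float))[0, 1]
    return r ** 2
```

Both statistics equal 1.0 only for a perfect simulation; the abstract's 0.91/0.82 (calibration) versus 0.88/0.77 (confirmation) pattern is the usual mild drop when calibrated parameters are transferred to an independent basin.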
Efficient Geological Modelling of Large AEM Surveys
NASA Astrophysics Data System (ADS)
Bach, Torben; Martlev Pallesen, Tom; Jørgensen, Flemming; Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas
2014-05-01
Combining geological expert knowledge with geophysical observations into a final 3D geological model is, in most cases, not a straightforward process. It typically involves many types of data and requires an understanding of both the data and the geological target. When dealing with very large areas, such as modelling of large AEM surveys, the manual task for the geologist of correctly evaluating and properly utilising all the data available in the survey area becomes overwhelming. In the ERGO project (Efficient High-Resolution Geological Modelling) we address these issues and propose a new modelling methodology enabling fast and consistent modelling of very large areas. The vision of the project is to build a user-friendly expert system that enables the combination of very large amounts of geological and geophysical data with geological expert knowledge. This is done in an "auto-pilot" type functionality, named Smart Interpretation, designed to aid the geologist in the interpretation process. The core of the expert system is a statistical model that describes the relation between data and the geological interpretations made by a geological expert. This facilitates fast and consistent modelling of very large areas. It enables the construction of high-resolution models, as the system "learns" the geology of an area directly from interpretations made by a geological expert and instantly applies it to all hard data in the survey area, ensuring that all the data available are utilised in the geological model. Another feature is that the statistical model the system creates for one area can be used in another area with similar data and geology. This feature can aid an untrained geologist in building a geological model, guided by the experienced geologist's way of interpretation as quantified in the expert system's core statistical model.
In this project presentation we provide some examples of the problems we are aiming to address in the project, and show some preliminary results.
NASA Astrophysics Data System (ADS)
Akhtar, Taimoor; Shoemaker, Christine
2016-04-01
Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating that no single optimal parameterization exists. Hence, many experts prefer a manual approach to calibration, where the inherent multi-objective nature of the calibration problem is addressed through an interactive, subjective, time-intensive and complex decision-making process. Multi-objective optimization can be used to efficiently identify multiple plausible calibration alternatives and assist calibration experts during the parameter estimation process. However, there are key challenges to the use of multi-objective optimization in the parameter estimation process: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selecting one of the numerous calibration alternatives provided by multi-objective optimization is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration to specifically address these challenges. HAMS employs a 3-stage framework for parameter estimation. Stage 1 incorporates the use of an efficient surrogate multi-objective algorithm, GOMORS, for identification of numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS is embedded in Stages 2 and 3, where an interactive visual and metric-based analytics framework serves as a decision support tool for choosing a single calibration from the numerous alternatives identified in Stage 1. Stage 2 of HAMS provides a goodness-of-fit, metric-based interactive framework for identification of a small (typically fewer than 10), meaningful and diverse subset of the calibration alternatives obtained in Stage 1.
Stage 3 incorporates the use of an interactive visual analytics framework for decision support in selection of one parameter combination from the alternatives identified in Stage 2. HAMS is applied for calibration of flow parameters of a SWAT model, (Soil and Water Assessment Tool) designed to simulate flow in the Cannonsville watershed in upstate New York. Results from the application of HAMS to Cannonsville indicate that efficient multi-objective optimization and interactive visual and metric based analytics can bridge the gap between the effective use of both automatic and manual strategies for parameter estimation of computationally expensive watershed models.
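The calibration alternatives that Stage 1 hands to the analyst are the non-dominated (Pareto-optimal) parameterizations under the chosen criteria. A minimal non-domination filter, assuming every objective is to be minimized; this is illustrative only, since GOMORS itself is a surrogate-assisted optimizer whose internals are not shown here:

```python
def pareto_front(points):
    """Return the indices of non-dominated points.
    Each point is a tuple of objective values, all to be minimized.
    Point q dominates p if q is no worse in every objective and
    strictly better in at least one."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p)))
            and any(q[k] < p[k] for k in range(len(p)))
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front
```

With two error criteria, `pareto_front([(1, 2), (2, 1), (2, 2), (3, 3)])` keeps only the first two points: each of the last two is beaten on both criteria by some other candidate. The subset Stages 2 and 3 then narrow down lives on this front.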
What is an expert? A systems perspective on expertise.
Caley, Michael Julian; O'Leary, Rebecca A; Fisher, Rebecca; Low-Choy, Samantha; Johnson, Sandra; Mengersen, Kerrie
2014-02-01
Expert knowledge is a valuable source of information with a wide range of research applications. Despite the recent advances in defining expert knowledge, little attention has been given to how to view expertise as a system of interacting contributory factors for quantifying an individual's expertise. We present a systems approach to expertise that accounts for many contributing factors and their inter-relationships and allows quantification of an individual's expertise. A Bayesian network (BN) was chosen for this purpose. For illustration, we focused on taxonomic expertise. The model structure was developed in consultation with taxonomists. The relative importance of the factors within the network was determined by a second set of taxonomists (supra-experts) who also provided validation of the model structure. Model performance was assessed by applying the model to hypothetical career states of taxonomists designed to incorporate known differences in career states for model testing. The resulting BN model consisted of 18 primary nodes feeding through one to three higher-order nodes before converging on the target node (Taxonomic Expert). There was strong consistency among node weights provided by the supra-experts for some nodes, but not others. The higher-order nodes, "Quality of work" and "Total productivity", had the greatest weights. Sensitivity analysis indicated that although some factors had stronger influence in the outer nodes of the network, there was relatively equal influence of the factors leading directly into the target node. Despite the differences in the node weights provided by our supra-experts, there was good agreement among assessments of our hypothetical experts that accurately reflected differences we had specified. This systems approach provides a way of assessing the overall level of expertise of individuals, accounting for multiple contributory factors, and their interactions. 
Our approach is adaptable to other situations where it is desirable to understand components of expertise.
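In the network above, each higher-order node combines weighted contributions from its child nodes before converging on the target node. The actual model is a Bayesian network with full conditional probability tables; the sketch below is only a weighted-average stand-in, with made-up node names and weights, to illustrate how unequal weights (e.g., on "Quality of work" and "Total productivity") shift the target score:

```python
def weighted_node(child_scores, weights):
    """Combine child-node scores (each in [0, 1]) into a parent score using
    expert-assigned weights. Illustrative stand-in for BN aggregation."""
    total = sum(w * s for w, s in zip(weights, child_scores))
    return total / sum(weights)

# Hypothetical career state: strong quality of work, weaker productivity.
quality_of_work = weighted_node([0.9, 0.8], [1.0, 1.0])      # two primary nodes
total_productivity = weighted_node([0.4, 0.6], [1.0, 1.0])
# The heavier weights on these two higher-order nodes reflect the
# supra-experts' finding that they mattered most.
taxonomic_expert = weighted_node([quality_of_work, total_productivity],
                                 [3.0, 3.0])
```

Doubling the weight on one child pulls the parent score toward that child, which is the mechanism the sensitivity analysis in the abstract probes.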
Expert opinion on landslide susceptibility elicited by probabilistic inversion from scenario rankings
NASA Astrophysics Data System (ADS)
Lee, Katy; Dashwood, Claire; Lark, Murray
2016-04-01
For many natural hazards the opinion of experts, with experience in assessing susceptibility under different circumstances, is a valuable source of information on which to base risk assessments. This is particularly important where incomplete process understanding and limited data limit the scope to predict susceptibility by mechanistic or statistical modelling. The expert has a tacit model of a system, based on their understanding of processes and their field experience. This model may vary in quality, depending on the experience of the expert. There is considerable interest in how one may elicit expert understanding by a process which is transparent and robust, to provide a basis for decision support. One approach is to provide experts with a set of scenarios, and then to ask them to rank small overlapping subsets of these with respect to susceptibility. Methods of probabilistic inversion have been used to compute susceptibility scores for each scenario, implicit in the expert ranking. It is also possible to model these scores as functions of measurable properties of the scenarios. This approach has been used to assess the susceptibility of animal populations to invasive diseases, to assess risk to vulnerable marine environments, and to assess the risk in hypothetical novel technologies for food production. We will present the results of a study in which a group of geologists with varying degrees of expertise in assessing landslide hazards were asked to rank sets of hypothetical simplified scenarios with respect to landslide susceptibility. We examine the consistency of their rankings and the importance of different properties of the scenarios in the tacit susceptibility model that their rankings implied. Our results suggest that this is a promising approach to the problem of how experts can communicate their tacit model of uncertain systems to those who want to make use of their expertise.
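The elicitation design, ranking small overlapping subsets of scenarios, can be illustrated with a much simpler aggregation than the probabilistic inversion actually used: a Borda-style score counting how often each scenario is ranked above another. The function name and example labels are ours, and this sketch ignores the uncertainty quantification that makes the real method probabilistic:

```python
from collections import defaultdict

def rank_scores(rankings):
    """Aggregate overlapping subset rankings into per-scenario scores in [0, 1].
    Each ranking lists scenarios from most to least susceptible; a scenario's
    score is the fraction of its pairwise comparisons that it 'wins'."""
    wins = defaultdict(int)
    comparisons = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for i, scenario in enumerate(ranking):
            wins[scenario] += n - 1 - i          # beats everything ranked below
            comparisons[scenario] += n - 1       # takes part in n-1 comparisons
    return {s: wins[s] / comparisons[s] for s in comparisons}
```

With two overlapping subsets, `rank_scores([["A", "B", "C"], ["B", "C", "D"]])` gives A a score of 1.0, B 0.75, C 0.25 and D 0.0, recovering a single susceptibility ordering from rankings no expert ever made in full.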
Liu, Fengchen; Porco, Travis C; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K; Bailey, Robin L; Keenan, Jeremy D; Solomon, Anthony W; Emerson, Paul M; Gambhir, Manoj; Lietman, Thomas M
2015-08-01
Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts' opinions, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon's signed-rank statistic. Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher's information. Each individual expert's forecast was poorer than the sum of experts. Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although they would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. Clinicaltrials.gov NCT00792922.
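Of the three forecasters compared, the regression on square-root-transformed prevalence is the simplest to sketch: fit a line in the transformed space and back-transform the extrapolation. Illustrative Python with made-up data; the trial's actual regression specification (covariates, pooling across communities) is not given in the abstract:

```python
import numpy as np

def forecast_sqrt_regression(times, prevalences, t_future):
    """Fit a straight line to sqrt-transformed prevalence over time and
    back-transform the extrapolated value. The sqrt transform stabilizes
    variance and keeps the back-transformed forecast non-negative."""
    y = np.sqrt(np.asarray(prevalences, dtype=float))
    slope, intercept = np.polyfit(np.asarray(times, dtype=float), y, 1)
    y_hat = max(0.0, slope * t_future + intercept)  # clip before squaring
    return y_hat ** 2
```

For a community whose prevalence fell 0.25 → 0.16 → 0.09 over months 0, 6 and 12, the extrapolation to month 18 is 0.04, because the decline is exactly linear on the square-root scale.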
A principled approach to the measurement of situation awareness in commercial aviation
NASA Technical Reports Server (NTRS)
Tenney, Yvette J.; Adams, Marilyn Jager; Pew, Richard W.; Huggins, A. W. F.; Rogers, William H.
1992-01-01
The issue of how to support situation awareness among crews of modern commercial aircraft is becoming especially important with the introduction of automation in the form of sophisticated flight management computers and expert systems designed to assist the crew. In this paper, cognitive theories are discussed that have relevance for the definition and measurement of situation awareness. These theories suggest that comprehension of the flow of events is an active process that is limited by the modularity of attention and memory constraints, but can be enhanced by expert knowledge and strategies. Three implications of this perspective for assessing and improving situation awareness are considered: (1) Scenario variations are proposed that tax awareness by placing demands on attention; (2) Experimental tasks and probes are described for assessing the cognitive processes that underlie situation awareness; and (3) The use of computer-based human performance models to augment the measures of situation awareness derived from performance data is explored. Finally, two potential example applications of the proposed assessment techniques are described, one concerning spatial awareness using wide field of view displays and the other emphasizing fault management in aircraft systems.
Pedagogical applications of cognitive research on musical improvisation.
Biasutti, Michele
2015-01-01
This paper presents a model for the implementation of educational activities involving musical improvisation that is based on a review of the literature on the psychology of music. Psychology of music is a complex field of research in which quantitative and qualitative methods have been employed involving participants ranging from novices to expert performers. The cognitive research has been analyzed to propose a pedagogical approach to the development of processes rather than products that focus on an expert's use of improvisation. The intention is to delineate a reflective approach that goes beyond the mere instruction of some current practices of teaching improvisation in jazz pedagogy. The review highlights that improvisation is a complex, multidimensional act that involves creative and performance behaviors in real-time in addition to processes such as sensory and perceptual encoding, motor control, performance monitoring, and memory storage and recall. Educational applications for the following processes are outlined: anticipation, use of repertoire, emotive communication, feedback, and flow. These characteristics are discussed in relation to the design of a pedagogical approach to musical improvisation based on reflection and metacognition development.
Cooperating Expert Systems For Space Station Power Distribution Management
NASA Astrophysics Data System (ADS)
Nguyen, T. A.; Chiou, W. C.
1987-02-01
In a complex system such as the manned Space Station, it is deemed necessary that many expert systems perform tasks in a concurrent and cooperative manner. An important question that arises is: what cooperative-task-performing models are appropriate for multiple expert systems jointly performing tasks? The answer to this question will provide crucial automation design criteria for the Space Station complex systems architecture. Based on a client/server model for performing tasks, we have developed a system that acts as a front-end to support loosely coupled communications between expert systems running on multiple Symbolics machines. As an example, we use two ART*-based expert systems to demonstrate the concept of parallel symbolic manipulation for power distribution management and a dynamic load planner/scheduler in the simulated Space Station environment. This ongoing work will also explore other cooperative-task-performing models as alternatives and evaluate inter- and intra-expert-system communication mechanisms. It will serve as a testbed and a benchmarking tool for other Space Station expert subsystem communication and information exchange.
ERIC Educational Resources Information Center
Hankins, George.
1987-01-01
Describes the novice-to-expert model of human learning and compares it to recent advances in artificial intelligence and expert systems. Discusses some of the characteristics of experts, proposing connections between those characteristics, expert systems, and theories of left-right brain function. (TW)
Automatically calibrating admittances in KATE's autonomous launch operations model
NASA Technical Reports Server (NTRS)
Morgan, Steve
1992-01-01
This report documents a 1000-line Symbolics LISP program that automatically calibrates all 15 fluid admittances in KATE's Autonomous Launch Operations (ALO) model. (KATE is Kennedy Space Center's Knowledge-based Autonomous Test Engineer, a diagnosis and repair expert system created for use on the Space Shuttle's various fluid flow systems.) As a new KATE application, the calibrator described here breaks new ground for KSC's Artificial Intelligence Lab by allowing KATE to both control and measure the hardware she supervises. By automating a formerly manual process, the calibrator: (1) saves the ALO model builder untold amounts of labor; (2) enables quick repairs after workmen accidentally adjust ALO's hand valves; and (3) frees the modeler to pursue new KATE applications that previously were too complicated. Also reported are suggestions for enhancing the program: (1) to calibrate ALO's TV cameras, pumps, and sensor tolerances; and (2) to calibrate devices in other KATE models, such as the shuttle's LOX and Environment Control System (ECS).
NASA Technical Reports Server (NTRS)
Vorobiova, M. I.; Degteva, M. O.; Neta, M. O. (Principal Investigator)
1999-01-01
The Techa River (Southern Urals, Russia) was contaminated in 1949-1956 by liquid radioactive wastes from the Mayak complex, the first Russian facility for the production of plutonium. The measurements of environmental contamination were started in 1951. A simple model describing radionuclide transport along the free-flowing river and the accumulation of radionuclides by bottom sediments is presented. This model successfully correlates the rates of radionuclide releases as reconstructed by the Mayak experts, hydrological data, and available environmental monitoring data for the early period of contamination (1949-1951). The model was developed to reconstruct doses for people who lived in the riverside communities during the period of the releases and who were chronically exposed to external and internal irradiation. The model fills the data gaps and permits reconstruction of external gamma-exposure rates in air on the river bank and radionuclide concentrations in river water used for drinking and other household needs in 1949-1951.
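The kind of free-flowing-river transport the abstract describes, radionuclide carried downstream while a fraction is lost to bottom sediments, can be caricatured as first-order removal over travel time. This sketch is purely illustrative: the paper's actual model form, river segmentation and parameter values are not reproduced here, and the function name is ours:

```python
import math

def downstream_concentration(c0, distance_km, velocity_km_per_day,
                             loss_rate_per_day):
    """Concentration at a point downstream of a release, assuming steady flow
    and first-order removal (uptake by bottom sediments) over travel time."""
    travel_time_days = distance_km / velocity_km_per_day
    return c0 * math.exp(-loss_rate_per_day * travel_time_days)
```

With a one-day travel time and a loss rate of ln 2 per day, exactly half the released activity remains in the water column; fitting such rate constants to the 1951 monitoring data is what lets a model of this type fill gaps back to 1949.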
A feeling of flow: exploring junior scientists' experiences with dictation of scientific articles.
Spanager, Lene; Danielsen, Anne Kjaergaard; Pommergaard, Hans-Christian; Burcharth, Jakob; Rosenberg, Jacob
2013-08-10
Science involves publishing results, but many scientists do not master this. We introduced dictation as a method of producing a manuscript draft, participating in writing teams, and attending a writing retreat to junior scientists in our department. This study aimed to explore the scientists' experiences with this process. Four focus group interviews were conducted, comprising all participating scientists (n = 14). Each interview was transcribed verbatim and coded independently by two interviewers. The coding structure was discussed until consensus was reached, and from this the emergent themes were identified. Participants were 7 PhD students, 5 scholarship students and 2 clinical research nurses. Three main themes were identified: 'Preparing and then letting go' indicated that dictating worked best when properly prepared. 'The big dictation machine' described the benefits of writing teams, where junior scientists got feedback on both the content and structure of their papers. 'Barriers to and drivers for participation' described flow-like states that participants experienced during the dictation. Motivation and a high level of preparation were pivotal to being able to dictate a full article in one day. The descriptions of flow-like states seemed analogous to the theoretical model of flow, which is interesting, as flow is usually deemed a state reserved for skilled experts. Our findings suggest that other academic groups might benefit from using this approach, including dictation of manuscripts, to encourage participants' confidence in their writing skills.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul
This report details the impact exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.
DOT National Transportation Integrated Search
1974-10-01
The basic manual, published as the first volume of this report, is intended for use as a tool in predicting noise levels which will be generated by freely-flowing vehicle traffic along a highway of known characteristics. The first volume explains the...
Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.
Tute, Erik; Steiner, Jochen
2018-01-01
Literature describes a big potential for reuse of clinical patient data. A clinical data warehouse (CDWH) is a means for that. To support management and maintenance of processes extracting, transforming and loading (ETL) data into CDWHs as well as to ease reuse of metadata between regular IT-management, CDWH and secondary data users by providing a modeling approach. Expert survey and literature review to find requirements and existing modeling techniques. An ETL-modeling-technique was developed extending existing modeling techniques. Evaluation by exemplarily modeling existing ETL-process and a second expert survey. Nine experts participated in the first survey. Literature review yielded 15 included publications. Six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated. Seven experts participated in the evaluation. The developed approach can help in management and maintenance of ETL-processes and could serve as interface between regular IT-management, CDWH and secondary data users.
ERIC Educational Resources Information Center
Kirschenbaum, Daniel S.; Gierut, Kristen
2013-01-01
Objective: To compare and contrast 5 sets of expert recommendations about the treatment of childhood and adolescent obesity. Method: We reviewed 5 sets of recent expert recommendations: 2007 health care organizations' four stage model, 2007 Canadian clinical practice guidelines, 2008 Endocrine Society recommendations, 2009 seven step model, and…
Video Modeling by Experts with Video Feedback to Enhance Gymnastics Skills
ERIC Educational Resources Information Center
Boyer, Eva; Miltenberger, Raymond G.; Batsche, Catherine; Fogel, Victoria
2009-01-01
The effects of combining video modeling by experts with video feedback were analyzed with 4 female competitive gymnasts (7 to 10 years old) in a multiple baseline design across behaviors. During the intervention, after the gymnast performed a specific gymnastics skill, she viewed a video segment showing an expert gymnast performing the same skill…
Recognizing the importance of conversation between experts and non-experts in science communication
NASA Astrophysics Data System (ADS)
Rushlow, C. R.; Soderquist, B.; Cohn, T.; Eitel, K.
2016-12-01
Science communication is often perceived by scientists as the flow of information from experts to non-experts, and institutions have responded by providing science communication training that focuses on best practices for disseminating information. This unidirectional approach neglects a key component of science communication: scientists must understand the needs and values of the stakeholders for whom they are producing information, whether the stakeholders are community members, resource managers, or policy makers. We designed an activity for graduate students enrolled in a science communication class at the McCall Outdoor Science School both to alert them to this misconception and to give them an opportunity to rectify it. Over the course of 24 hours, we challenged students to have a conversation about climate change with someone they encountered in the community of McCall, ID. Using material from their conversations, students created a story in podcast or video form to share with the class. Through reflecting on this activity, students experienced a change in their perceptions of their identities as science communicators. Many students expressed an increased interest in listening to the stories of community members to learn more about the community's needs and values. We repeated the activity with early career scientists attending a climate workshop in McCall offered by the USGS Northwest Climate Science Center, focusing our evaluation on the science identity model of Carlone and Johnson (2007). Evaluations suggest that participants recognized their role as scientists not only in providing information, but also in listening to the values and needs of the people for whom they are working. We believe this understanding is fundamental to being a good science communicator and ensuring that science remains relevant to communities.
NASA Astrophysics Data System (ADS)
Lufri, L.; Fitri, R.; Yogica, R.
2018-04-01
The purpose of this study is to produce a learning model based on problem solving and meaningful learning standards, validated by expert assessment, for the course of Animal Development. This is development research that produces a product in the form of a learning model, consisting of the following sub-products: the syntax of the learning model and student worksheets. All of these products are standardized through expert validation. The research data are the validity levels of all sub-products, obtained using a questionnaire filled in by validators from various fields of expertise (field of study, learning strategy, Bahasa). Data were analysed using descriptive statistics. The results show that the problem solving and meaningful learning model has been produced. Sub-products declared appropriate by the experts include the syntax of the learning model and the student worksheet.
A Web Interface for Eco System Modeling
NASA Astrophysics Data System (ADS)
McHenry, K.; Kooper, R.; Serbin, S. P.; LeBauer, D. S.; Desai, A. R.; Dietze, M. C.
2012-12-01
We have developed the Predictive Ecosystem Analyzer (PEcAn) as an open-source scientific workflow system and ecoinformatics toolbox that manages the flow of information in and out of regional-scale terrestrial biosphere models, facilitates heterogeneous data assimilation, tracks data provenance, and enables more effective feedback between models and field research. The over-arching goal of PEcAn is to make otherwise complex analyses transparent, repeatable, and accessible to a diverse array of researchers, allowing both novice and expert users to focus on using the models to examine complex ecosystems rather than having to deal with complex computer system setup and configuration questions in order to run the models. Through the developed web interface we hide much of the data and model details and allow the user to simply select locations, ecosystem models, and desired data sources as inputs to the model. Novice users are guided by the web interface through setting up a model execution and plotting the results. At the same time expert users are given enough freedom to modify specific parameters before the model gets executed. This will become more important as more models are added to the PEcAn workflow and as more data becomes available when NEON comes online. On the backend we support the execution of potentially computationally expensive models on different High Performance Computers (HPC) and/or clusters. The system can be configured with a single XML file that gives it the flexibility needed for configuring and running the different models on different systems using a combination of information stored in a database as well as pointers to files on the hard disk. While the web interface usually creates this configuration file, expert users can still directly edit it to fine-tune the configuration.
Once a workflow is finished the web interface will allow for the easy creation of plots over result data while also allowing the user to download the results for further processing. The current workflow in the web interface is a simple linear workflow, but will be expanded to allow for more complex workflows. We are working with Kepler and Cyberintegrator to allow for these more complex workflows as well as collecting provenance of the workflow being executed. This provenance regarding model executions is stored in a database along with the derived results. All of this information is then accessible using the BETY database web frontend. The PEcAn interface.
Multicriteria decision model for retrofitting existing buildings
NASA Astrophysics Data System (ADS)
Bostenaru Dan, B.
2003-04-01
In this paper a model to decide which buildings from an urban area should be retrofitted is presented. The model has been placed among existing ones by choosing the decision rule, criterion weighting and decision support system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multiattributive and multiobjective decision methods and especially collaborative issues. Due to the participative character of the group decision problem "retrofitting existing buildings", the decision making model is based on interactivity. Buildings have been modeled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres such as architects, construction engineers and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively is shown by detailing the two phases, judgemental and computational: in this case site analysis, collection and evaluation of the unmodified data, and converting survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damages in case of an earthquake and are decisive for a better seismic behaviour in case of retrofitting. The paper describes the architectural and engineering characteristics as well as the structural damage for constructions of different building ages on the example of building types in Bucharest, Romania, in comprehensible and interdependent charts, based on field observation, reports from the 1977 earthquake and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities flow into the expert information contained in the system.
Kulczycki, Emanuel; Rozkosz, Ewa A
2017-01-01
This article discusses the Polish Journal Ranking, which is used in the research evaluation system in Poland. In 2015, the ranking, which represents all disciplines, allocated 17,437 journals into three lists: A, B, and C. The B list constitutes a ranking of Polish journals that are indexed neither in the Web of Science nor the European Reference Index for the Humanities. This ranking was built by evaluating journals in three dimensions: formal, bibliometric, and expert-based. We have analysed data on 2035 Polish journals from the B list. Our study aims to determine how the expert-based evaluation influenced the results of the final evaluation. In our study, we used structural equation modelling, which is regression-based, and we designed three pairs of theoretical models for three fields of science: (1) humanities, (2) social sciences, and (3) engineering, natural sciences, and medical sciences. Each pair consisted of the full model and the reduced model (i.e., the model without the expert-based evaluation). Our analysis revealed that the multidimensional evaluation of local journals should not rely only on the bibliometric indicators, which are based on the Web of Science or Scopus. Moreover, we have shown that the expert-based evaluation plays a major role in all fields of science. We conclude with recommendations that the formal evaluation should be reduced to verifiable parameters and that the expert-based evaluation should be based on common guidelines for the experts.
NASA Astrophysics Data System (ADS)
Cho, Adrian
2018-06-01
Philip Hopkins, a theoretical astrophysicist at the California Institute of Technology in Pasadena, likes to prank his colleagues. An expert in simulating the formation of galaxies, Hopkins sometimes begins his talks by projecting images of his creations next to photos of real galaxies and defying his audience to tell them apart. "We can even trick astronomers," Hopkins says. For decades, scientists have tried to simulate how the trillions of galaxies in the observable universe arose from clouds of gas after the big bang. But only in the past few years have the simulations begun to reproduce both the details of individual galaxies and their distribution of masses and shapes. As the fake universes improve, their role is also changing. Previously, information flowed one way: from the astronomers studying real galaxies to the modelers trying to simulate them. Now, insight is flowing the other way, too, with the models helping guide astronomers and astrophysicists. The models suggest that the earliest galaxies were oddly pickle-shaped, that wafer-thin spiral galaxies are surprisingly rugged in the face of collisions, and, perhaps most important, that galaxies must form stars far more slowly than astrophysicists expected. Progress is coming so fast, says Tiziana Di Matteo, a numerical cosmologist at Carnegie Mellon University in Pittsburgh, Pennsylvania, that "the whole thing has reached this little golden age."
The Shrinkage Model And Expert System Of Plastic Lens Formation
NASA Astrophysics Data System (ADS)
Chang, Rong-Seng
1988-06-01
Shrinkage causes both the appearance and dimension defects of an injected plastic lens. We have built a model of state equations, with the help of a finite element analysis program, to estimate the volume change (shrinkage and swelling) under combinations of injection variables such as pressure and temperature; a personal-computer expert system has then been built to make that knowledge conveniently available to the user in model design, process planning, process operation and other work. The domain knowledge is represented by an R-graph (Relationship-graph) model which states the relationships between variables and equations. This model can be compared with other models in the expert system: if the user has a better model to solve the shrinkage problem, the program evaluates it automatically and a learning file is triggered by the expert system to help the user update the knowledge base and replace the old model with the better one. Rubin's model and Gilmore's model have been input to the expert system. Conflicts are resolved using both user input and the deeper knowledge base. A cube prism and a convex lens are shown as examples in this paper. The program is written in the MULISP language on an IBM PC-AT. The natural language facility provides English explanations of know-why and know-how, and automatic English translation of the equation rules and production rules.
Information Retrieval Diary of an Expert Technical Translator.
ERIC Educational Resources Information Center
Cremmins, Edward T.
1984-01-01
Recommends use of entries from the information retrieval diary of Ted Crump, expert technical translator at the National Institute of Health, in the construction of computer models showing how expert translators solve problems of ambiguity in language. Expert and inexpert translation systems, eponyms, abbreviations, and alphabetic solutions are…
Girardi, Dominic; Küng, Josef; Kleiser, Raimund; Sonnberger, Michael; Csillag, Doris; Trenkler, Johannes; Holzinger, Andreas
2016-09-01
Established process models for knowledge discovery find the domain-expert in a customer-like and supervising role. In the field of biomedical research, it is necessary to move the domain-experts into the center of this process with far-reaching consequences for both their research output and the process itself. In this paper, we revise the established process models for knowledge discovery and propose a new process model for domain-expert-driven interactive knowledge discovery. Furthermore, we present a research infrastructure which is adapted to this new process model and demonstrate how the domain-expert can be deeply integrated even into the highly complex data-mining process and data-exploration tasks. We evaluated this approach in the medical domain for the case of cerebral aneurysms research.
A Step-Wise Approach to Elicit Triangular Distributions
NASA Technical Reports Server (NTRS)
Greenberg, Marc W.
2013-01-01
This paper adapts and combines known methods to demonstrate an expert judgment elicitation process that: 1) models the expert's inputs as a triangular distribution, 2) incorporates techniques to account for expert bias, and 3) is structured in a way that helps justify the expert's inputs. The paper shows one way of "extracting" expert opinion for estimating purposes. Nevertheless, as with most subjective methods, there are many ways to do this.
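As a toy illustration of the elicitation step, the sketch below (the cost figures are invented, not from the paper) turns an expert's minimum / most-likely / maximum judgments into a triangular distribution and checks its analytic mean against simulation:

```python
import random
import statistics

def elicit_triangular(low, mode, high):
    """Validate an expert's min / most-likely / max inputs."""
    if not (low <= mode <= high) or low == high:
        raise ValueError("require low <= mode <= high with low < high")
    return low, mode, high

def triangular_mean(low, mode, high):
    # Analytic mean of a triangular distribution.
    return (low + mode + high) / 3.0

params = elicit_triangular(10.0, 14.0, 25.0)   # hypothetical cost estimate ($M)
rng = random.Random(1)
draws = [rng.triangular(params[0], params[2], params[1]) for _ in range(100_000)]
print(round(triangular_mean(*params), 3))      # analytic mean
print(round(statistics.fmean(draws), 1))       # simulated mean, close to analytic
```

The validation step is where the paper's "justify the expert's inputs" idea would hook in; here it only enforces ordering.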
Nojima, Masanori; Tokunaga, Mutsumi; Nagamura, Fumitaka
2018-05-05
To investigate under what circumstances inappropriate use of 'multivariate analysis' is likely to occur and to identify the population that needs more support with medical statistics. The frequency of inappropriate regression model construction in multivariate analysis and related factors were investigated in observational medical research publications. The inappropriate algorithm of using only variables that were significant in univariate analysis was estimated to occur at 6.4% (95% CI 4.8% to 8.5%). This was observed in 1.1% of the publications with a medical statistics expert (hereinafter 'expert') as the first author, in 3.5% if an expert was included as coauthor and in 12.2% if experts were not involved. In the publications where the number of cases was 50 or less and the study did not include experts, inappropriate algorithm usage was observed with a high proportion of 20.2%. The OR of the involvement of experts for this outcome was 0.28 (95% CI 0.15 to 0.53). A further analysis showed that the involvement of experts and the prevalence of inappropriate multivariate analysis are associated at the nation level (R=-0.652). Based on the results of this study, the benefit of participation of medical statistics experts is obvious. Experts should be involved for proper confounding adjustment and interpretation of statistical models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
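An odds ratio of the kind reported above can be computed mechanically from a 2x2 table; the sketch below shows the standard Wald calculation, with illustrative counts that are not the study's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = cases/non-cases with the factor, c/d = cases/non-cases without it."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only: inappropriate-algorithm use in papers
# with vs. without a statistics expert involved.
or_, lo, hi = odds_ratio_ci(4, 146, 49, 351)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```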
Zhang, Zhi-hong; Dong, Hong-ye; Peng, Bo; Liu, Hong-fei; Li, Chun-lei; Liang, Min; Pan, Wei-san
2011-05-30
The purpose of this article was to build an expert system for the development and formulation of push-pull osmotic pump tablets (PPOP). Hundreds of PPOP formulations were studied, covering different poorly water-soluble drugs and pharmaceutically acceptable excipients. The knowledge base, including a database and a rule base, was built from the reported results of hundreds of PPOP formulations containing different poorly water-soluble drugs and pharmaceutical excipients, together with the experience available from other researchers. The prediction model of release behavior was built using a back propagation (BP) neural network, which is good at nonlinear mapping and learning. The formulation design model was established based on the prediction model of release behavior, which was the nucleus of the inference engine. Finally, the expert system program was constructed in VB.NET with SQL Server. Expert systems are one of the most popular areas of artificial intelligence, yet to date no expert system has been available for the formulation of controlled release dosage forms. Moreover, osmotic pump technology (OPT) is gradually maturing all over the world, so it is meaningful to apply an expert system to OPT. Famotidine, a water-insoluble drug, was chosen as the model drug to validate the applicability of the developed expert system. Copyright © 2011 Elsevier B.V. All rights reserved.
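A minimal sketch of the release-prediction idea, assuming a single-hidden-layer BP network trained on an invented release profile (the paper's actual network architecture and formulation data are not reproduced here):

```python
import math
import random

rng = random.Random(0)
sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

# Toy training set: cumulative fraction of drug released vs. normalized time
# (shape is hypothetical, standing in for measured PPOP release profiles).
data = [(t / 10.0, min(1.0, 0.12 * t)) for t in range(11)]

H = 4                                                              # hidden units
w1 = [[rng.uniform(-1, 1), rng.uniform(-1, 1)] for _ in range(H)]  # weight, bias
w2 = [rng.uniform(-1, 1) for _ in range(H + 1)]                    # weights + bias
lr = 0.5

def forward(t):
    h = [sigmoid(w * t + b) for w, b in w1]
    y = sigmoid(sum(wj * hj for wj, hj in zip(w2, h)) + w2[H])
    return h, y

def mse():
    return sum((forward(t)[1] - r) ** 2 for t, r in data) / len(data)

err_before = mse()
for _ in range(2000):                 # plain back-propagation, online updates
    for t, r in data:
        h, y = forward(t)
        dy = (y - r) * y * (1 - y)    # output delta
        for j in range(H):
            dh = dy * w2[j] * h[j] * (1 - h[j])
            w2[j] -= lr * dy * h[j]
            w1[j][0] -= lr * dh * t
            w1[j][1] -= lr * dh
        w2[H] -= lr * dy
print(f"MSE {err_before:.4f} -> {mse():.4f}")
```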
Fischer, Heidi J; Vergara, Ximena P; Yost, Michael; Silva, Michael; Lombardi, David A; Kheifets, Leeka
2017-01-01
Job exposure matrices (JEMs) are tools used to classify exposures for job titles based on general job tasks in the absence of individual level data. However, exposure uncertainty due to variations in worker practices, job conditions, and the quality of data has never been quantified systematically in a JEM. We describe a methodology for creating a JEM which defines occupational exposures on a continuous scale and utilizes elicitation methods to quantify exposure uncertainty by assigning exposures probability distributions with parameters determined through expert involvement. Experts use their knowledge to develop mathematical models using related exposure surrogate data in the absence of available occupational level data and to adjust model output against other similar occupations. Formal expert elicitation methods provided a consistent, efficient process to incorporate expert judgment into a large, consensus-based JEM. A population-based electric shock JEM was created using these methods, allowing for transparent estimates of exposure.
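A sketch of how exposures on a continuous scale might be encoded as probability distributions per job title; the job titles and lognormal parameters below are hypothetical placeholders for expert-elicited values, not entries from the actual JEM:

```python
import math
import random
import statistics

# Hypothetical elicited distributions (shocks per worker-year), encoded as
# lognormal(mu, sigma) parameters a panel of experts might agree on.
JEM = {
    "electrician":  {"mu": -1.0, "sigma": 0.8},
    "office clerk": {"mu": -4.0, "sigma": 0.5},
}

def sample_exposure(job, n=50_000, seed=7):
    """Draw exposure values for one job title from its elicited distribution."""
    p = JEM[job]
    rng = random.Random(seed)
    return [rng.lognormvariate(p["mu"], p["sigma"]) for _ in range(n)]

for job, p in JEM.items():
    xs = sample_exposure(job)
    # The median of lognormal(mu, sigma) is exp(mu).
    print(job, round(statistics.median(xs), 3), round(math.exp(p["mu"]), 3))
```

Storing distributions rather than point values is what lets a JEM like this carry exposure uncertainty through to downstream analyses.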
Evaluation of HardSys/HardDraw, An Expert System for Electromagnetic Interactions Modelling
1993-05-01
This report gives a description of HardSys/HardDraw, an expert system for the modelling of electromagnetic interactions in complex systems, and reviews the main concepts used in its design. The system consists of two main components: HardSys and HardDraw. HardSys is the advisor part of the expert system; it is knowledge-based, that is, it contains a database of models and properties for various types of …
NASA Astrophysics Data System (ADS)
Hrachowitz, Markus; Fovet, Ophelie; Ruiz, Laurent; Gascuel-Odoux, Chantal; Savenije, Hubert
2014-05-01
Hydrological models are frequently characterized by what is often considered to be adequate calibration performances. In many cases, however, these models experience a substantial uncertainty and performance decrease in validation periods, thus resulting in poor predictive power. Besides the likely presence of data errors, this observation can point towards wrong or insufficient representations of the underlying processes and their heterogeneity. In other words, right results are generated for the wrong reasons. Thus ways are sought to increase model consistency and to thereby satisfy the contrasting priorities of the need a) to increase model complexity and b) to limit model equifinality. In this study a stepwise model development approach is chosen to test the value of an exhaustive and systematic combined use of hydrological signatures, expert knowledge and readily available, yet anecdotal and rarely exploited, hydrological information for increasing model consistency towards generating the right answer for the right reasons. A simple 3-box, 7 parameter, conceptual HBV-type model, constrained by 4 calibration objective functions was able to adequately reproduce the hydrograph with comparatively high values for the 4 objective functions in the 5-year calibration period. However, closer inspection of the results showed a dramatic decrease of model performance in the 5-year validation period. In addition, assessing the model's skill to reproduce a range of 20 hydrological signatures including, amongst others, the flow duration curve, the autocorrelation function and the rising limb density, showed that it could not adequately reproduce the vast majority of these signatures, indicating a lack of model consistency. Subsequently model complexity was increased in a stepwise way to allow for more process heterogeneity. 
To limit model equifinality, increase in complexity was counter-balanced by a stepwise application of "realism constraints", inferred from expert knowledge (e.g. unsaturated storage capacity of hillslopes should exceed that of wetlands) and anecdotal hydrological information (e.g. long-term estimates of actual evaporation obtained from the Budyko framework and long-term estimates of baseflow contribution) to ensure that the model is well behaved with respect to the modeller's perception of the system. A total of 11 model set-ups with increased complexity and an increased number of realism constraints were tested. It could be shown that in spite of largely unchanged calibration performance, compared to the simplest set-up, the most complex model set-up (12 parameters, 8 constraints) exhibited significantly increased performance in the validation period while uncertainty did not increase. In addition, the most complex model was characterized by a substantially increased skill to reproduce all 20 signatures, indicating a more suitable representation of the system. The results suggest that a model, "well" constrained by 4 calibration objective functions may still be an inadequate representation of the system and that increasing model complexity, if counter-balanced by realism constraints, can indeed increase predictive performance of a model and its skill to reproduce a range of hydrological signatures, but that it does not necessarily result in increased uncertainty. The results also strongly illustrate the need to move away from automated model calibration towards a more general expert-knowledge driven strategy of constraining models if a certain level of model consistency is to be achieved.
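The idea of counter-balancing added complexity with realism constraints can be sketched as a rejection filter on sampled parameter sets; the parameter ranges and constraint values below are illustrative, not the study's:

```python
import random

rng = random.Random(42)

def sample_params():
    # Hypothetical parameter ranges for a conceptual bucket model.
    return {
        "s_hillslope":   rng.uniform(50, 400),   # unsaturated storage capacity [mm]
        "s_wetland":     rng.uniform(50, 400),
        "baseflow_frac": rng.uniform(0.0, 1.0),
    }

def realistic(p):
    # "Realism constraints" from expert knowledge / anecdotal information:
    # hillslope storage must exceed wetland storage, and the long-term
    # baseflow contribution must fall in an expert-estimated window.
    return p["s_hillslope"] > p["s_wetland"] and 0.2 <= p["baseflow_frac"] <= 0.5

candidates = [sample_params() for _ in range(10_000)]
feasible = [p for p in candidates if realistic(p)]
print(f"{len(feasible)}/{len(candidates)} parameter sets survive the constraints")
```

Only the surviving sets would then enter calibration, which is how added parameters need not translate into added equifinality.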
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moges, Edom; Demissie, Yonas; Li, Hong-Yi
2016-04-01
In most water resources applications, a single model structure might be inadequate to capture the dynamic multi-scale interactions among different hydrological processes. Calibrating single models for dynamic catchments, where multiple dominant processes exist, can result in displacement of errors from structure to parameters, which in turn leads to over-correction and biased predictions. An alternative to a single model structure is to develop local expert structures that are effective in representing the dominant components of the hydrologic process and adaptively integrate them based on an indicator variable. In this study, the Hierarchical Mixture of Experts (HME) framework is applied to integrate expert model structures representing the different components of the hydrologic process. Various signature diagnostic analyses are used to assess the presence of multiple dominant processes and the adequacy of a single model, as well as to identify the structures of the expert models. The approaches are applied for two distinct catchments, the Guadalupe River (Texas) and the French Broad River (North Carolina) from the Model Parameter Estimation Experiment (MOPEX), using different structures of the HBV model. The results show that the HME approach performs better than the single model for the Guadalupe catchment, where multiple dominant processes are witnessed through diagnostic measures. In contrast, the diagnostics and aggregated performance measures show that the French Broad catchment has a homogeneous response, making the single model adequate to capture it.
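A minimal sketch of the mixture-of-experts idea, assuming two invented expert models blended by a logistic gate on an indicator variable (catchment wetness); the functional forms are placeholders, not the study's HBV structures:

```python
import math

def expert_fast(rain):   # hypothetical expert for a flashy runoff response
    return 0.8 * rain

def expert_slow(rain):   # hypothetical expert for a damped runoff response
    return 0.3 * rain

def gate(wetness, a=10.0, b=0.5):
    """Logistic gate on the indicator variable; returns the fast expert's weight."""
    return 1.0 / (1.0 + math.exp(-a * (wetness - b)))

def hme_predict(rain, wetness):
    g = gate(wetness)
    return g * expert_fast(rain) + (1 - g) * expert_slow(rain)

print(round(hme_predict(10.0, 0.9), 3))  # wet catchment: close to the fast expert
print(round(hme_predict(10.0, 0.1), 3))  # dry catchment: close to the slow expert
```

In the full HME framework the gate parameters and expert parameters are fitted jointly, and the hierarchy can stack several such gates.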
TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0
NASA Technical Reports Server (NTRS)
Ortiz, C. J.
1994-01-01
The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K MS-DOS format diskette. TARGET was developed in 1991.
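The hierarchical decomposition TARGET captures can be pictured as a task tree flattened into an indented report; the task names below are invented for illustration:

```python
# A task is a name plus subtasks; TARGET-style decomposition breaks a job
# down into basic action kernels at the leaves (names are hypothetical).
task = ("replace pump", [
    ("isolate line", [("close valve A", []), ("verify zero flow", [])]),
    ("swap unit",    [("unbolt pump", []), ("fit new pump", [])]),
])

def report(node, depth=0, lines=None):
    """Flatten the task tree into an indented hierarchical report."""
    if lines is None:
        lines = []
    name, subtasks = node
    lines.append("  " * depth + name)
    for sub in subtasks:
        report(sub, depth + 1, lines)
    return lines

print("\n".join(report(task)))
```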
Rhetorical Consequences of the Computer Society: Expert Systems and Human Communication.
ERIC Educational Resources Information Center
Skopec, Eric Wm.
Expert systems are computer programs that solve selected problems by modelling domain-specific behaviors of human experts. These computer programs typically consist of an input/output system that feeds data into the computer and retrieves advice, an inference system using the reasoning and heuristic processes of human experts, and a knowledge…
a Study on Satellite Diagnostic Expert Systems Using Case-Based Approach
NASA Astrophysics Data System (ADS)
Park, Young-Tack; Kim, Jae-Hoon; Park, Hyun-Soo
1997-06-01
Many research efforts are under way to monitor and diagnose the diverse malfunctions of satellite systems as the complexity and number of satellites increase. Currently, much monitoring and diagnosis is carried out by human experts, but there is a need to automate the routine parts of this work. Hence, it is worthwhile to study expert systems that can perform routine work automatically, thereby allowing human experts to devote their expertise to the more critical and important areas of monitoring and diagnosis. In this paper, we employ artificial intelligence techniques to model human experts' knowledge and to reason over the constructed knowledge base. In particular, case-based approaches are used to construct a knowledge base that models human expert capabilities using previous typical exemplars. We have designed and implemented a prototype case-based system for diagnosing satellite malfunctions. The system remembers typical failure cases and diagnoses a current malfunction by indexing the case base. Diverse methods are used to build a user-friendly interface that allows human experts to build the knowledge base in an easy way.
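A minimal sketch of case-based retrieval for diagnosis, with invented telemetry features and cases; nearest-neighbour lookup stands in for whatever indexing scheme the prototype uses:

```python
# Each stored case pairs a telemetry snapshot with a confirmed diagnosis
# (feature names, values and diagnoses are illustrative, not from the paper).
CASES = [
    ({"bus_voltage": 26.0, "battery_temp": 35.0, "solar_current": 0.1}, "array drive stuck"),
    ({"bus_voltage": 22.0, "battery_temp": 55.0, "solar_current": 4.0}, "battery overheating"),
    ({"bus_voltage": 28.0, "battery_temp": 20.0, "solar_current": 4.2}, "nominal"),
]

def distance(a, b):
    """Euclidean distance between two telemetry snapshots."""
    return sum((a[k] - b[k]) ** 2 for k in a) ** 0.5

def diagnose(telemetry):
    """Retrieve the most similar stored case and reuse its diagnosis."""
    best = min(CASES, key=lambda case: distance(telemetry, case[0]))
    return best[1]

print(diagnose({"bus_voltage": 23.0, "battery_temp": 50.0, "solar_current": 3.8}))
```

A production case-based reasoner would also adapt the retrieved solution and retain the new case, closing the retrieve-reuse-revise-retain cycle.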
NASA Astrophysics Data System (ADS)
Furfaro, R.; Kargel, J. S.; Fink, W.; Bishop, M. P.
2010-12-01
Glaciers and ice sheets are among the largest unstable parts of the solid Earth. Generally, glaciers are devoid of resources (other than water), are dangerous, are unstable and no infrastructure is normally built directly on their surfaces. Areas down valley from large alpine glaciers are also commonly unstable due to landslide potential of moraines, debris flows, snow avalanches, outburst floods from glacier lakes, and other dynamical alpine processes; yet there exists much development and human occupation of some disaster-prone areas. Satellite remote sensing can be extremely effective in providing cost-effective and time-critical information. Space-based imagery can be used to monitor glacier outlines and their lakes, including processes such as iceberg calving and debris accumulation, as well as changing thicknesses and flow speeds. Such images can also be used to make preliminary identifications of specific hazardous spots and allows preliminary assessment of possible modes of future disaster occurrence. Autonomous assessment of glacier conditions and their potential for hazards would present a major advance and permit systematized analysis of more data than humans can assess. This technical leap will require the design and implementation of Artificial Intelligence (AI) algorithms specifically designed to mimic glacier experts' reasoning. Here, we introduce the theory of Fuzzy Cognitive Maps (FCM) as an AI tool for predicting and assessing natural hazards in alpine glacier environments. FCM techniques are employed to represent expert knowledge of glaciers physical processes. A cognitive model embedded in a fuzzy logic framework is constructed via the synergistic interaction between glaciologists and AI experts. 
To verify the effectiveness of the proposed AI methodology as applied to predicting hazards in glacier environments, we designed and implemented a FCM that addresses the challenging problem of autonomously assessing the Glacier Lake Outburst Flow Potential and Impound Water Upstream Flow Potential. The FCM is constructed using what is currently our understanding of how glacier lake outbursts occur, whereas the causal connection between concepts is defined to capture the expertise of glacier scientists. The proposed graph contains 27 nodes and a network of connections that represent the causal link between concepts. To test the developed FCM, we defined three scenarios representing glacier lake environmental conditions that either occurred or that are likely to occur in such highly dynamic environments. For each case, the FCM has been initialized using observables extracted from hypothesized remote sensing imagery. The map, which converges to a fixed point for all of the test scenarios within 15 iterations, shows reasoning consistent with that of glacier experts. The FCM-based cognitive approach has the potential to be the AI core of real-time operational hazards assessment and detection systems.
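The FCM update rule can be sketched in a few lines; the toy 4-concept map below stands in for the paper's 27-node graph, with invented weights and an invented initial state:

```python
import math

# Toy concepts: 0 lake volume, 1 moraine-dam weakness, 2 meltwater inflow,
# 3 outburst-flood potential. W[i][j] = causal weight of concept j on concept i.
W = [
    [0.0, 0.0, 0.7, 0.0],
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
    [0.6, 0.8, 0.0, 0.0],
]

def step(state):
    """One FCM iteration: squash each concept's weighted inputs plus its own value."""
    f = lambda x: 1.0 / (1.0 + math.exp(-x))
    return [f(sum(W[i][j] * state[j] for j in range(4)) + state[i])
            for i in range(4)]

state = [0.9, 0.8, 0.7, 0.0]   # activations set from (hypothesized) observables
for it in range(50):
    new = step(state)
    if max(abs(a - b) for a, b in zip(new, state)) < 1e-4:
        break                   # fixed point reached
    state = new
print(f"converged after {it} iterations, flood potential = {state[3]:.2f}")
```

As in the paper's scenarios, the map settles to a fixed point in a handful of iterations, and the converged activation of the hazard node is read off as the assessment.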
NASA Astrophysics Data System (ADS)
Pulido-Velazquez, Manuel; Macian-Sorribes, Hector; María Benlliure-Moreno, Jose; Fullana-Montoro, Juan
2015-04-01
Water resources systems in areas with a strong tradition of water use are complex to manage because of the many constraints that overlap in time and space, creating a complicated framework in which past, present and future collide. In addition, it is usual to find "hidden constraints" in system operations, which condition operation decisions while going unnoticed by anyone but the river managers and users. Becoming aware of those hidden constraints usually requires years of experience and a degree of involvement in that system's management operations normally beyond the possibilities of technicians. However, their impact on management decisions is strongly imprinted in the available historical data records. The purpose of this contribution is to present a methodology capable of assessing operating rules in complex water resources systems by combining historical records and expert criteria. Both sources are coupled using fuzzy logic. The procedure stages are: 1) organize preliminary expert-technician meetings to let the experts explain how they manage the system; 2) set up a fuzzy rule-based system (FRB) structure according to the way the system is managed; 3) use the available historical records to estimate the inputs' fuzzy numbers, to assign preliminary output values to the FRB rules and to train and validate these rules; 4) organize expert-technician meetings to discuss the rule structure and the quantification of the inputs, returning if required to the second stage; 5) once the FRB structure is accepted, refine and complete its output values with the aid of the experts through meetings, workshops or surveys; 6) combine the FRB with a Decision Support System (DSS) to simulate the effect of those management decisions; 7) compare its results with the ones offered by the historical records and/or simulation or optimization models; and 8) discuss the model performance with the stakeholders, returning, if required, to the fifth or the second stage. 
The proposed methodology has been applied to the Jucar River Basin (Spain). This basin has 3 reservoirs, 4 headwaters, 11 demands and 5 environmental flows, which together form a complex constraint set. After the preliminary meetings, one 81-rule FRB was created, using as inputs the system state variables at the start of the hydrologic year, and as outputs the target reservoir release schedule. The inputs' fuzzy numbers were estimated jointly using surveys. Fifteen years of historical records were used to train the system's outputs. The obtained FRB was then refined during additional expert-technician meetings. After that, the resulting FRB was introduced into a DSS simulating the effect of those management rules for different hydrological conditions. Three additional FRBs were created using: 1) exclusively the historical records; 2) a stochastic optimization model; and 3) a deterministic optimization model. The results proved to be consistent with expectations, with the stakeholders' FRB performance located between the data-driven simulation and the stochastic optimization FRBs, and reflect the stakeholders' major goals and concerns about the river management. ACKNOWLEDGEMENT: This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) funds.
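A minimal sketch of an FRB of this kind, with invented membership functions and rule outputs; a Sugeno-style weighted average stands in for whatever defuzzification the authors used:

```python
def tri(x, a, b, c):
    """Triangular membership with peak b; a == b or b == c give shoulder shapes."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return 1.0 if b == a else (x - a) / (b - a)
    return 1.0 if c == b else (c - x) / (c - b)

# Hypothetical input: storage at the start of the hydrologic year (fraction of capacity).
LOW, MED, HIGH = (0.0, 0.0, 0.5), (0.2, 0.5, 0.8), (0.5, 1.0, 1.0)

# Rule consequents: target annual release (hm3), illustrative values only.
RULES = [(LOW, 120.0), (MED, 260.0), (HIGH, 400.0)]

def release(storage):
    """Fire every rule, then defuzzify by membership-weighted average."""
    fired = [(tri(storage, *mf), out) for mf, out in RULES]
    total = sum(w for w, _ in fired)
    return sum(w * out for w, out in fired) / total

print(round(release(0.35), 1))   # partially fires LOW and MED
```

The real 81-rule FRB works the same way, only over several state variables at once, with the consequents trained on the historical records and refined by the experts.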
ICADS: A cooperative decision making model with CLIPS experts
NASA Technical Reports Server (NTRS)
Pohl, Jens; Myers, Leonard
1991-01-01
A cooperative decision making model is described which comprises six concurrently executing domain experts coordinated by a blackboard control expert. The focus application field is architectural design, and the domain experts represent consultants in the areas of daylighting, noise control, structural support, cost estimating, space planning, and climate responsiveness. Both the domain experts and the blackboard were implemented as production systems, using an enhanced version of the basic CLIPS package. Acting in unison as an Expert Design Advisor, the domain and control experts react to the evolving design solution progressively developed by the user in a 2-D CAD drawing environment. A Geometry Interpreter maps each drawing action taken by the user to real-world objects, such as spaces, walls, windows, and doors. These objects, endowed with geometric and nongeometric attributes, are stored as frames in a semantic network. Object descriptions are derived partly from the geometry of the drawing environment and partly from knowledge bases containing prototypical, generalized information about the building type and site conditions under consideration.
Reliability and performance evaluation of systems containing embedded rule-based expert systems
NASA Technical Reports Server (NTRS)
Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.
1989-01-01
A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge-base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
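A minimal, discrete-time flavor of the first-stage Markov reliability model can be sketched as follows; the three states and the transition probabilities are assumptions chosen for illustration, not values from the paper:

```python
# States: 0 = healthy, 1 = degraded (e.g., the expert system missed a fault),
# 2 = failed. One-step transition probabilities (rows sum to 1); illustrative only.
P = [
    [0.990, 0.008, 0.002],
    [0.000, 0.950, 0.050],
    [0.000, 0.000, 1.000],   # failure is absorbing
]

def reliability(p0, P, steps):
    """Probability the system is NOT in the failed state after `steps`
    transitions, starting from the state distribution p0."""
    state = p0[:]
    for _ in range(steps):
        state = [sum(state[i] * P[i][j] for i in range(len(P)))
                 for j in range(len(P))]
    return 1.0 - state[2]
```

The expert system's key performance parameters (for instance, the probability of missing a fault) would enter such a model through the transition probabilities, which is how knowledge-base uncertainty propagates to system-level reliability.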
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad
2016-05-01
Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision essentially embedded in expert-provided information. In order to solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference', the result of integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert-provided information, (2) it allows uncertainty and imprecision to be modeled distinctly, and (3) it presents a framework for fusing expert-provided information regarding the various inputs of the Bayesian inference algorithm. However, an important obstacle to employing fuzzy Bayesian inference in groundwater numerical modeling applications is the computational burden, as the required number of numerical model simulations often becomes prohibitively large and computationally infeasible. In this paper, a novel approach to accelerating the fuzzy Bayesian inference algorithm is proposed, based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. Then the proposed approach is applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf.
An expert elicitation methodology is developed and applied to the real-world test case in order to provide a road map for the use of fuzzy Bayesian inference in groundwater modeling applications.
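The screening idea, using a cheap surrogate posterior to decide which candidate parameter values merit an expensive model run, can be sketched as follows. The likelihood functions, the acceptance threshold and the flat prior below are toy stand-ins, not the study's SWI model:

```python
import random

def expensive_loglike(theta):
    # Stand-in for a costly numerical model run (e.g., an SWI simulation).
    return -0.5 * (theta - 2.0) ** 2

def surrogate_loglike(theta):
    # Cheap approximation of the expensive likelihood (deliberately a bit biased).
    return -0.5 * (theta - 2.1) ** 2

def screened_samples(n, threshold=-3.0, seed=0):
    """Draw prior samples, screen with the surrogate, and run the expensive
    model only on candidates the surrogate does not rule out."""
    rng = random.Random(seed)
    kept, expensive_calls = [], 0
    for _ in range(n):
        theta = rng.uniform(-5.0, 5.0)        # flat prior
        if surrogate_loglike(theta) < threshold:
            continue                           # screened out: no expensive run
        expensive_calls += 1
        if expensive_loglike(theta) >= threshold:
            kept.append(theta)
    return kept, expensive_calls
```

The saving is the gap between `n` prior draws and the number of expensive calls; in the paper the analogous screening cuts the required numerical simulations by roughly an order of magnitude.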
NASA Astrophysics Data System (ADS)
Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano
2015-04-01
Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk of pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in case of eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.
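The Monte Carlo step can be illustrated with a toy flat-topography version: sample a vent location and a runout distance per event, mark the invaded cells, and average over simulations. The grid size, runout range and circular footprint below are invented for the sketch; the actual study combines a probabilistic vent-opening map with a topography-aware flow model and expert-elicited uncertainties:

```python
import random

def invasion_probability(n_grid=21, n_sims=1000, seed=42):
    """Monte Carlo PDC invasion map on a toy flat caldera: per event, draw a
    vent cell and a runout radius, mark cells within the runout as invaded,
    then convert counts to per-cell invasion probabilities."""
    rng = random.Random(seed)
    counts = [[0] * n_grid for _ in range(n_grid)]
    for _ in range(n_sims):
        vx, vy = rng.randrange(n_grid), rng.randrange(n_grid)  # vent opening
        runout = rng.uniform(2.0, 6.0)                          # runout, in cells
        for y in range(n_grid):
            for x in range(n_grid):
                if (x - vx) ** 2 + (y - vy) ** 2 <= runout ** 2:
                    counts[y][x] += 1
    return [[c / n_sims for c in row] for row in counts]
```

With uniform vent positions, central cells are reachable from more vents than edge cells, which is the same mechanism that produces the central probability peaks reported for the caldera.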
NASA Astrophysics Data System (ADS)
Moser, M.
2009-04-01
The Gadeinerbach catchment in the district of Lungau, Salzburg, Austria, is prone to debris flows; large debris flow events date back to 1934 and 1953. In the upper catchment, large mass movements represent debris sources. A field study shows the debris potential, and the catchment resembles a "sleeping torrential giant". To design mitigation measures, a detailed risk management concept was developed, based on a risk assessment combining historical analysis, field study and numerical modeling on the alluvial fan. Human activities have partly altered the surface of the Gadeinerbach alluvial fan, but some important hazard indicators could nevertheless be found. Together with photo analysis of the large 1934 debris flow event, these hazard indicators characterize the catchment. With the help of these historical data sets (hazard indicators, sediment and debris volumes, etc.) it is possible to calibrate the available numerical models and to gain useful knowledge of their pros, cons and applicability. The results were used to simulate the design event and to derive mitigation measures: the most effective protection proved to be reducing the flow's high energy level to a lower one, in combination with a debris/bedload deposition area. Expert opinion, the study of historical data and field work are, alongside numerical simulation techniques, essential for work in natural hazard management.
Cui, Meng; Yang, Shuo; Yu, Tong; Yang, Ce; Gao, Yonghong; Zhu, Haiyan
2013-10-01
To design a model to capture information on the state and trends of knowledge creation, at both an individual and an organizational level, in order to enhance knowledge management. We designed a graph-theoretic knowledge model, the expert knowledge map (EKM), based on literature-based annotation. A case study in the domain of Traditional Chinese Medicine research was used to illustrate the usefulness of the model. The EKM successfully captured various aspects of knowledge and enhanced knowledge management within the case-study organization through the provision of knowledge graphs, expert graphs, and expert-knowledge biography. Our model could help to reveal the hot topics, trends, and products of the research done by an organization. It can potentially be used to facilitate knowledge learning, sharing and decision-making among researchers, academicians, students, and administrators of organizations.
Eliciting expert opinion for economic models: an applied example.
Leal, José; Wordsworth, Sarah; Legood, Rosa; Blair, Edward
2007-01-01
Expert opinion is considered a legitimate source of information for decision-analytic modeling where required data are unavailable. Our objective was to develop a practical computer-based tool for eliciting expert opinion about the shape of the uncertainty distribution around individual model parameters. We first developed a prepilot survey with departmental colleagues to test a number of alternative approaches to eliciting opinions on the shape of the uncertainty distribution around individual parameters. This information was used to develop a survey instrument for an applied clinical example. This involved eliciting opinions from experts to inform a number of parameters involving Bernoulli processes in an economic model evaluating DNA testing for families with a genetic disease, hypertrophic cardiomyopathy. The experts were cardiologists, clinical geneticists, and laboratory scientists working with cardiomyopathy patient populations and DNA testing. Our initial prepilot work suggested that the more complex elicitation techniques advocated in the literature were difficult to use in practice. In contrast, our approach achieved a reasonable response rate (50%), provided logical answers, and was generally rated as easy to use by respondents. The computer software user interface permitted graphical feedback throughout the elicitation process. The distributions obtained were incorporated into the model, enabling the use of probabilistic sensitivity analysis. There is clearly a gap in the literature between theoretical elicitation techniques and tools that can be used in applied decision-analytic models. The results of this methodological study are potentially valuable for other decision analysts deriving expert opinion.
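One common way to turn an elicited central value and spread for a Bernoulli parameter into a distribution suitable for probabilistic sensitivity analysis is a method-of-moments Beta fit, sketched below. The numbers in the example are hypothetical, not elicited values from the study:

```python
import random

def beta_from_elicitation(mean, sd):
    """Method-of-moments Beta fit to an elicited mean and standard deviation
    for a probability parameter (e.g., the chance a DNA test is informative)."""
    if not 0 < mean < 1 or sd ** 2 >= mean * (1 - mean):
        raise ValueError("inconsistent elicitation")
    k = mean * (1 - mean) / sd ** 2 - 1
    return mean * k, (1 - mean) * k

def psa_draws(mean, sd, n=1000, seed=1):
    """Monte Carlo draws for probabilistic sensitivity analysis."""
    a, b = beta_from_elicitation(mean, sd)
    rng = random.Random(seed)
    return [rng.betavariate(a, b) for _ in range(n)]
```

An elicited mean of 0.7 with a standard deviation of 0.1 maps to Beta(14, 6); feeding such draws through the economic model yields the probabilistic sensitivity analysis described in the abstract.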
Use of occupancy models to evaluate expert knowledge-based species-habitat relationships
Iglecia, Monica N.; Collazo, Jaime A.; McKerrow, Alexa
2012-01-01
Expert knowledge-based species-habitat relationships are used extensively to guide conservation planning, particularly when data are scarce. Purported relationships describe the initial state of knowledge, but are rarely tested. We assessed support in the data for suitability rankings of vegetation types based on expert knowledge for three terrestrial avian species in the South Atlantic Coastal Plain of the United States. Experts used published studies, natural history, survey data, and field experience to rank vegetation types as optimal, suitable, and marginal. We used single-season occupancy models, coupled with land cover and Breeding Bird Survey data, to examine the hypothesis that patterns of occupancy conformed to the species-habitat suitability rankings purported by experts. Purported habitat suitability was validated for two of the three species. As predicted for the Eastern Wood-Pewee (Contopus virens) and Brown-headed Nuthatch (Sitta pusilla), occupancy was strongly influenced by vegetation types classified as “optimal habitat” in each species' suitability rankings. Contrary to predictions, Red-headed Woodpecker (Melanerpes erythrocephalus) models that included vegetation types as covariates received support from the data similar to that of models without vegetation types. For all three species, occupancy was also related to sampling latitude. Our results suggest that covariates representing other habitat requirements might be necessary to model the occurrence of generalist species like the woodpecker. The modeling approach described herein provides a means to test expert knowledge-based species-habitat relationships and, hence, helps guide conservation planning.
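For reference, the building block of these analyses, the single-season occupancy likelihood for one site's detection history, can be written in a few lines (standard formulation, not code from the study):

```python
def occupancy_likelihood(history, psi, p):
    """Likelihood of one site's detection history under the single-season
    occupancy model: psi = probability the site is occupied, p = per-visit
    detection probability given occupancy."""
    detected = any(history)
    prob_given_occupied = 1.0
    for h in history:
        prob_given_occupied *= p if h else (1 - p)
    if detected:
        return psi * prob_given_occupied
    # Never detected: either occupied but always missed, or truly absent.
    return psi * prob_given_occupied + (1 - psi)
```

Covariates such as vegetation type or latitude enter by making `psi` (and possibly `p`) functions of site attributes, which is how the expert rankings were confronted with the survey data.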
Building groundwater modeling capacity in Mongolia
Valder, Joshua F.; Carter, Janet M.; Anderson, Mark T.; Davis, Kyle W.; Haynes, Michelle A.; Dorjsuren Dechinlhundev,
2016-06-16
Ulaanbaatar, the capital city of Mongolia (fig. 1), is dependent on groundwater for its municipal and industrial water supply. The population of Mongolia is about 3 million people, with about one-half of the population residing in or near Ulaanbaatar (World Population Review, 2016). Groundwater is drawn from a network of shallow wells in an alluvial aquifer along the Tuul River. Evidence indicates that current water use may not be sustainable from existing water sources, especially when factoring in the projected water demand from a rapidly growing urban population (Ministry of Environment and Green Development, 2013). In response, the Government of Mongolia Ministry of Environment, Green Development, and Tourism (MEGDT) and the Freshwater Institute, Mongolia, requested technical assistance on groundwater modeling through the U.S. Army Corps of Engineers (USACE) to the U.S. Geological Survey (USGS). Scientists from the USGS and USACE provided two workshops in 2015 to Mongolian hydrology experts on basic principles of groundwater modeling using the USGS groundwater modeling program MODFLOW-2005 (Harbaugh, 2005). The purpose of the workshops was to bring together representatives from the Government of Mongolia, local universities, technical experts, and other key stakeholders to build in-country capacity in hydrogeology and groundwater modeling. A preliminary steady-state groundwater-flow model was developed as part of the workshops to demonstrate groundwater modeling techniques for simulating groundwater conditions in alluvial deposits along the Tuul River in the vicinity of Ulaanbaatar. ModelMuse (Winston, 2009) was used as the graphical user interface for MODFLOW for training purposes during the workshops. Basic and advanced groundwater modeling concepts included in the workshops were groundwater principles; estimating hydraulic properties; developing model grids, data sets, and MODFLOW input files; and viewing and evaluating MODFLOW output files.
A key to success was developing in-country technical capacity and partnerships with the Mongolian University of Science and Technology; Freshwater Institute, Mongolia, a non-profit organization; United Nations Educational, Scientific and Cultural Organization (UNESCO); the Government of Mongolia; and the USACE.
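As a toy illustration of the finite-difference ideas such workshops introduce (a sketch of the underlying numerics, not MODFLOW itself): steady one-dimensional confined flow with uniform transmissivity and fixed-head boundaries reduces to Laplace's equation, which Jacobi iteration solves cell by cell:

```python
def steady_heads(n, h_left, h_right):
    """Solve 1-D steady-state confined flow with uniform transmissivity:
    d2h/dx2 = 0, discretized as h[i] = (h[i-1] + h[i+1]) / 2, by Jacobi
    iteration with fixed-head boundary cells."""
    h = [h_left] + [0.0] * (n - 2) + [h_right]
    for _ in range(20000):
        new = h[:]
        for i in range(1, n - 1):
            new[i] = 0.5 * (h[i - 1] + h[i + 1])
        converged = max(abs(a - b) for a, b in zip(new, h)) < 1e-10
        h = new
        if converged:
            break
    return h
```

For this configuration the exact solution is a linear head gradient between the two boundary values, a useful sanity check before moving to real grids, heterogeneous properties and MODFLOW input files.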
Suzuki, Shinya; Part, Florian; Matsufuji, Yasushi; Huber-Humer, Marion
2018-02-01
To date, construction materials that contain engineered nanomaterials (ENMs) are available on the market, but at the same time very little is known about their environmental fate. Therefore, this study aimed at modeling the potential fate of ENMs by using the example of the Japanese construction sector and by conducting a dynamic material flow analysis. Expert interviews and national reports revealed that about 3920-4660 tons of ENMs are annually used for construction materials in Japan. Nanoscale TiO2, SiO2, Al2O3 and carbon black have already been applied for decades in wall paints, road markings or concrete. The dynamic material flow model indicates that in 2016 about 95% of the ENMs used since their year of market penetration remained in buildings, whereas only 5% ended up in the Japanese waste management system or were diffusely released into the environment. Considering the current Japanese waste management system, ENMs were predicted to end up in recycled materials (40-47%) or in landfills (36-41%). It was estimated that only a small proportion was used in agriculture (5-7%, as ENM-containing sewage sludge) or was diffusely released into soils, surface waters or the atmosphere (5-19%). The results indicate that ENM release predominantly depends on the materials' specific applications and characteristics. The model also highlights the importance of adequate collection and treatment of ENM-containing wastes. In the future, similar dynamic flow models for other countries should consider, where available, historical data on ENM production (e.g. declaration reports published annually by relevant public authorities or associations), as such input data are very important for data reliability, decreasing uncertainties and continuously improving model accuracy.
In addition, more environmental monitoring studies that aim at the quantification of ENM release and inadvertent transfer, particularly triggered by waste treatment processes, would be needed in order to validate such models. Copyright © 2017 Elsevier Ltd. All rights reserved.
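The bookkeeping behind a dynamic material flow model can be reduced to a stock-and-flow recursion. The constant annual release fraction below is a simplifying assumption for illustration; real models like the one described use product-specific lifetime distributions and split the outflow across waste treatment paths:

```python
def dynamic_mfa(inflow, release_rate=0.01):
    """Toy dynamic material flow model: each year new ENM mass enters the
    in-use stock (buildings), and a fixed fraction of the stock leaves to
    end-of-life treatment. Returns (remaining stock, cumulative outflow)."""
    stock, end_of_life = 0.0, 0.0
    for annual_input in inflow:
        stock += annual_input
        released = stock * release_rate
        stock -= released
        end_of_life += released
    return stock, end_of_life
```

With long-lived products such as construction materials, even decades of inflow leave most of the mass in the stock, the same mechanism behind the study's finding that about 95% of historical ENM use remained in buildings in 2016.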
Informing Public Perceptions About Climate Change: A 'Mental Models' Approach.
Wong-Parodi, Gabrielle; Bruine de Bruin, Wändi
2017-10-01
As the specter of climate change looms on the horizon, people will face complex decisions about whether to support climate change policies and how to cope with climate change impacts on their lives. Without some grasp of the relevant science, they may find it hard to make informed decisions. Climate experts therefore face the ethical need to effectively communicate to non-expert audiences. Unfortunately, climate experts may inadvertently violate the maxims of effective communication, which require sharing communications that are truthful, brief, relevant, clear, and tested for effectiveness. Here, we discuss the 'mental models' approach towards developing communications, which aims to help experts to meet the maxims of effective communications, and to better inform the judgments and decisions of non-expert audiences.
Application of Satellite-Derived Atmospheric Motion Vectors for Estimating Mesoscale Flows.
NASA Astrophysics Data System (ADS)
Bedka, Kristopher M.; Mecikalski, John R.
2005-11-01
This study demonstrates methods to obtain high-density, satellite-derived atmospheric motion vectors (AMV) that contain both synoptic-scale and mesoscale flow components associated with and induced by cumuliform clouds through adjustments made to the University of Wisconsin—Madison Cooperative Institute for Meteorological Satellite Studies (UW-CIMSS) AMV processing algorithm. Operational AMV processing is geared toward the identification of synoptic-scale motions in geostrophic balance, which are useful in data assimilation applications. AMVs identified in the vicinity of deep convection are often rejected by quality-control checks used in the production of operational AMV datasets. Few users of these data have considered the use of AMVs with ageostrophic flow components, which often fail checks that assure both spatial coherence between neighboring AMVs and a strong correlation to an NWP-model first-guess wind field. The UW-CIMSS algorithm identifies coherent cloud and water vapor features (i.e., targets) that can be tracked within a sequence of geostationary visible (VIS) and infrared (IR) imagery. AMVs are derived through the combined use of satellite feature tracking and an NWP-model first guess. Reducing the impact of the NWP-model first guess on the final AMV field, in addition to adjusting the target selection and vector-editing schemes, is found to result in greater than a 20-fold increase in the number of AMVs obtained from the UW-CIMSS algorithm for one convective storm case examined here. Over a three-image sequence of Geostationary Operational Environmental Satellite (GOES)-12 VIS and IR data, 3516 AMVs are obtained, most of which contain flow components that deviate considerably from geostrophy. In comparison, 152 AMVs are derived when a tighter NWP-model constraint and no targeting adjustments were imposed, similar to settings used with operational AMV production algorithms. 
A detailed analysis reveals that many of these 3516 vectors contain low-level (100-70 kPa) convergent and midlevel (70-40 kPa) to upper-level (40-10 kPa) divergent motion components consistent with localized mesoscale flow patterns. The applicability of AMVs for estimating cloud-top cooling rates at the 1-km pixel scale is demonstrated, with excellent correspondence to rates identified by a human expert.
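The feature-tracking step at the heart of AMV derivation can be illustrated with toy block matching between two image frames; operational processing adds target selection, an NWP first guess and quality control far beyond this sketch:

```python
def best_shift(frame0, frame1, box, search=3):
    """Track a feature by minimizing the sum of squared differences between a
    target box in frame0 and shifted boxes in frame1 (toy block matching).
    box = (y0, x0, height, width); returns the best (dy, dx) shift."""
    y0, x0, h, w = box

    def ssd(dy, dx):
        return sum(
            (frame0[y0 + i][x0 + j] - frame1[y0 + dy + i][x0 + dx + j]) ** 2
            for i in range(h) for j in range(w)
        )

    shifts = [(dy, dx) for dy in range(-search, search + 1)
                       for dx in range(-search, search + 1)]
    return min(shifts, key=lambda s: ssd(*s))
```

Dividing the recovered pixel shift by the time between images gives the motion vector; relaxing how strongly such vectors are constrained toward the model first guess is what lets ageostrophic, convectively induced motions survive into the final AMV field.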
MOAB: a spatially explicit, individual-based expert system for creating animal foraging models
Carter, J.; Finn, John T.
1999-01-01
We describe the development, structure, and corroboration process of a simulation model of animal behavior (MOAB). MOAB can create spatially explicit, individual-based animal foraging models. Users can create or replicate heterogeneous landscape patterns, and place resources and individual animals of a given species on that landscape to simultaneously simulate the foraging behavior of multiple species. The heuristic rules for animal behavior are maintained in a user-modifiable expert system. MOAB can be used to explore hypotheses concerning the influence of landscape pattern on animal movement and foraging behavior. A red fox (Vulpes vulpes L.) foraging and nest predation model was created to test MOAB's capabilities. Foxes were simulated for 30-day periods using both expert system and random movement rules. Home range size, territory formation and other outcomes were compared against available simulation studies. A striped skunk (Mephitis mephitis L.) model also was developed. The expert system model proved superior to the stochastic one with respect to territory formation, general movement patterns and home range size.
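MOAB's contrast between expert-system and random movement rules can be miniaturized as follows; the gradient landscape, the greedy "rule" and the scoring are invented for this sketch, not taken from MOAB:

```python
import random

def forage(landscape, start, steps, rule_based, seed=0):
    """Toy forager on a grid: the rule-based walker moves to the richest
    neighbouring cell, the random walker picks any neighbour; either way the
    visited cell's resource is consumed. Returns total food eaten."""
    rng = random.Random(seed)
    y, x = start
    eaten = 0
    rows, cols = len(landscape), len(landscape[0])
    for _ in range(steps):
        nbrs = [(y + dy, x + dx) for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= y + dy < rows and 0 <= x + dx < cols]
        y, x = max(nbrs, key=lambda c: landscape[c[0]][c[1]]) if rule_based \
            else rng.choice(nbrs)
        eaten += landscape[y][x]
        landscape[y][x] = 0          # resource is consumed
    return eaten
```

On a landscape whose resources increase away from the start, the greedy rule climbs the gradient and harvests the richest reachable cells, while the random walker tends to revisit depleted ground, a crude analogue of comparing expert-system and stochastic movement rules.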
GANViz: A Visual Analytics Approach to Understand the Adversarial Game.
Wang, Junpeng; Gou, Liang; Yang, Hao; Shen, Han-Wei
2018-06-01
Generative models hold promise for learning data representations in an unsupervised fashion with deep learning. The Generative Adversarial Net (GAN) is one of the most popular frameworks in this arena. Despite the promising results from different types of GANs, in-depth understanding of the adversarial training process of these models remains a challenge for domain experts. The complexity and potentially long training process of the models make them hard to evaluate, interpret, and optimize. In this work, guided by practical needs from domain experts, we design and develop a visual analytics system, GANViz, aiming to help experts understand the adversarial process of GANs in depth. Specifically, GANViz evaluates the model performance of the two subnetworks of a GAN, provides evidence and interpretations of the models' performance, and empowers comparative analysis with that evidence. Through case studies with two real-world datasets, we demonstrate that GANViz can provide useful insight that helps domain experts understand, interpret, evaluate, and potentially improve GAN models.
Lee, Annisa Lai
2010-09-01
A popular perception holds that physicians prescribe requested drugs to patients influenced by mass-mediated direct-to-consumer prescription drug advertising. The phenomenon poses a serious challenge to the two-step flow model, which emphasizes the influence of opinion leaders on their followers and their legitimating power over the informing power of the mass media. This study investigates a 2002 Food and Drug Administration (FDA) survey and finds that patients searching for drug information through mass and hybrid media (newspapers' and magazines' small print, the Internet, and toll-free numbers) are more likely to seek information through interpersonal communication channels like health care providers. Patients using small print, toll-free numbers, their own physician, and other physicians are associated with influencing their physicians through various drug-requesting behaviors. But physicians only prescribe requested drugs to patients who are influenced by other health care providers, such as pharmacists and other physicians, not by the mass media. The influence of expert opinion leaders on drugs is so strong that patients would even switch from their own unyielding physicians who do not prescribe drugs as advised by the pharmacists. Physicians and patients alike are influenced more by other expert opinion leaders on drugs than by the mass media, and therefore still uphold the basic tenet of the two-step model.
Stevenson-Holt, Claire D; Watts, Kevin; Bellamy, Chloe C; Nevin, Owen T; Ramsey, Andrew D
2014-01-01
Least-cost models are widely used to study the functional connectivity of habitat within a varied landscape matrix. A critical step in the process is identifying resistance values for each land cover type based upon its facilitating or impeding impact on species movement. Ideally, resistance values would be parameterised with empirical data, but due to a shortage of such information, expert opinion is often used. However, the use of expert opinion is seen as subjective, human-centric and unreliable. This study derived resistance values from grey squirrel habitat suitability models (HSM) in order to compare the utility and validity of this approach with more traditional, expert-led methods. Models were built and tested with MaxEnt, using squirrel presence records and a categorical land cover map for Cumbria, UK. Predictions of the likelihood of squirrel occurrence within each land cover type were inverted, providing resistance values which were used to parameterise a least-cost model. The resulting habitat networks were measured and compared to those derived from a least-cost model built with previously collated information from experts. The expert-derived and HSM-inferred least-cost networks differ in precision. The HSM-informed networks were smaller and more fragmented because of the higher resistance values attributed to most habitats. These results are discussed in relation to the applicability of both approaches for conservation and management objectives, providing guidance to researchers and practitioners attempting to apply and interpret a least-cost approach to mapping ecological networks.
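Once resistance values exist (whether expert-assigned or inverted from HSM output), connectivity reduces to a least-cost graph search over the resistance surface. The sketch below is generic Dijkstra on a toy grid, not the study's GIS workflow:

```python
import heapq

def least_cost(resist, start, goal):
    """Dijkstra least-cost path on a resistance grid: moving into a cell
    costs that cell's resistance (4-neighbour moves). Cells are (y, x)."""
    rows, cols = len(resist), len(resist[0])
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, (y, x) = heapq.heappop(pq)
        if (y, x) == goal:
            return d
        if d > dist.get((y, x), float("inf")):
            continue                     # stale queue entry
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < rows and 0 <= nx < cols:
                nd = d + resist[ny][nx]
                if nd < dist.get((ny, nx), float("inf")):
                    dist[(ny, nx)] = nd
                    heapq.heappush(pq, (nd, (ny, nx)))
    return float("inf")
```

Raising the resistance of most habitats, as the HSM-derived values did relative to the expert values, inflates accumulated costs everywhere and shrinks the area reachable under any cost threshold, which is why the HSM-informed networks come out smaller and more fragmented.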
Ikegami, Tsuyoshi; Ganesh, Gowrishankar
2017-01-01
The question of how humans predict outcomes of observed motor actions by others is a fundamental problem in cognitive and social neuroscience. Previous theoretical studies have suggested that the brain uses parts of the forward model (used to estimate sensory outcomes of self-generated actions) to predict outcomes of observed actions. However, this hypothesis has remained controversial due to the lack of direct experimental evidence. To address this issue, we analyzed the behavior of darts experts in an understanding learning paradigm and utilized computational modeling to examine how outcome prediction of observed actions affected the participants' ability to estimate their own actions. We recruited darts experts because sports experts are known to have an accurate outcome estimation of their own actions as well as prediction of actions observed in others. We first show that learning to predict the outcomes of observed dart throws deteriorates an expert's abilities to both produce his own darts actions and estimate the outcome of his own throws (or self-estimation). Next, we introduce a state-space model to explain the trial-by-trial changes in the darts performance and self-estimation through our experiment. The model-based analysis reveals that the change in an expert's self-estimation is explained only by considering a change in the individual's forward model, showing that an improvement in an expert's ability to predict outcomes of observed actions affects the individual's forward model. These results suggest that parts of the same forward model are utilized in humans to both estimate outcomes of self-generated actions and predict outcomes of observed actions.
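The trial-by-trial state-space idea can be sketched as a scalar linear dynamical system; the retention factor, learning rate and target below are placeholder values for illustration, not the parameters fitted in the study:

```python
def simulate_learning(n_trials, a=0.9, b=0.3, target=1.0):
    """Toy linear state-space learning model: an internal estimate x is
    updated each trial from the error between the target and the current
    estimate, x[t+1] = a * x[t] + b * (target - x[t])."""
    x, trace = 0.0, []
    for _ in range(n_trials):
        x = a * x + b * (target - x)
        trace.append(x)
    return trace
```

With these values the estimate converges to the fixed point b * target / (1 - a + b) = 0.75; in the study's analysis, comparable hidden states are fitted to the experts' throw-by-throw performance and self-estimation, letting changes in the forward model be read off from the fitted dynamics.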
NASA Astrophysics Data System (ADS)
Klaar, Megan; Laize, Cedric; Maddock, Ian; Acreman, Mike; Tanner, Kath; Peet, Sarah
2014-05-01
A key challenge for environmental managers is the determination of environmental flows which allow a maximum yield of water resources to be taken from surface and sub-surface sources, whilst ensuring sufficient water remains in the environment to support biota and habitats. It has long been known that sensitivity to changes in water levels resulting from river and groundwater abstractions varies between rivers. Whilst assessment at the catchment scale is ideal for determining broad pressures on water resources and ecosystems, assessment of the sensitivity of reaches to changes in flow has previously been done on a site-by-site basis, often with the application of detailed but time-consuming techniques (e.g. PHABSIM). While this is appropriate for a limited number of sites, it is costly in terms of money and time and therefore not appropriate for application at the national level required by responsible licensing authorities. To address this need, the Environment Agency (England) is developing an operational tool to predict relationships between physical habitat and flow which may be applied by field staff to rapidly determine the sensitivity of physical habitat to flow alteration for use in water resource management planning. An initial model of river sensitivity to abstraction (defined as the change in physical habitat related to changes in river discharge) was developed using site characteristics and data from 66 individual PHABSIM surveys throughout the UK (Booker & Acreman, 2008). By applying multivariate multiple linear regression to these data to define habitat availability-flow curves, with predictor variables that can be collected at varying levels of resource intensity, the model (known as RAPHSA - Rapid Assessment of Physical Habitat Sensitivity to Abstraction) is able to take a risk-based approach to modeled certainty.
Site-specific information, gathered through desk-based study or a variable amount of field work, can be used to predict the shape of the habitat-flow curves, with the uncertainty of the estimates reducing as more information is collected. Creation of generalized physical habitat-discharge relationships by the model allows environmental managers to select the desired level of confidence in the modeled results, based on environmental risk and the level of resource investment available. Hence, resources can be better directed according to the level of certainty required at each site. This model is intended to provide managers with an alternative to the existing use of either expert opinion or resource-intensive site-specific investigations in determining local environmental flows. Here, we outline the potential use of this tool by the Environment Agency in routine operational and investigation-specific scenarios, using case studies to illustrate its use.
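The regression machinery behind such generalized habitat-discharge relationships reduces, in the simplest univariate case, to ordinary least squares; RAPHSA itself uses multivariate multiple regression over many site characteristics, so this is only a sketch of the idea:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b * x, e.g., weighted usable habitat
    area (y) against discharge (x) for one site. Returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b
```

The fitted slope is a crude measure of habitat sensitivity to abstraction: a steep habitat-flow curve means small discharge reductions cost a lot of habitat, while prediction intervals around the fit carry the risk-based confidence statements the tool is designed to report.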
Automating Risk Analysis of Software Design Models
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
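An identification tree of the kind described can be represented as nested property tests over data-flow-diagram elements; the node names and the threat label below are invented for illustration and are not AutSEC's actual trees:

```python
# Each internal node is (property, subtree_if_true, subtree_if_false);
# a string leaf names a potential threat, None means "no finding".
TREE = ("crosses_trust_boundary",
        ("encrypted",
         None,                 # encrypted across the boundary: no finding
         "eavesdropping"),     # unencrypted across the boundary
        None)                  # stays inside one trust zone

def find_threats(element, node=TREE):
    """Walk the identification tree for one data-flow element (a dict of
    boolean properties) and collect the threats it matches."""
    if node is None:
        return []
    if isinstance(node, str):
        return [node]
    prop, if_true, if_false = node
    return find_threats(element, if_true if element.get(prop) else if_false)
```

Mitigation trees would hang off the leaves in the same spirit, mapping each identified threat to candidate countermeasures ranked against the specification's cost constraints.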
NASA Astrophysics Data System (ADS)
Passaro, Perry David
Misconceptions can be thought of as naive approaches to problem solving that are perceptually appealing but incorrect and inconsistent with scientific evidence (Piaget, 1929). One type of misconception involves flow distributions within circuits. This concept is important because miners' conceptual errors about flow distribution changes within complex circuits may be partly responsible for fatal mine disasters. Based on the theory that misconceptions of flow distribution changes within circuits were responsible for underground mine disasters involving mine ventilation circuits, a series of studies was undertaken with mining engineering students and professional mining engineers, as well as mine foremen, mine supervisors, mine rescue members, mine maintenance personnel, mining researchers and working miners, to identify these conceptual errors and errors in mine ventilation procedures. Results indicate that misconceptions of flow distribution changes within circuits exist in over 70 percent of the subjects sampled. It is assumed that these misconceptions result in errors of judgment when miners must infer and change ventilation arrangements when two or more mine sections are connected. Furthermore, it is assumed that these misconceptions are pervasive in the mining industry and may be responsible for at least two mine ventilation disasters. The findings of this study are consistent with Piaget's (1929) model of figurative and operative knowledge, which holds that misconceptions are partly due to a lack of knowledge of dynamic transformations and of how to apply content information. Recommendations for future research include the development of an interactive expert system for training miners in ventilation arrangements.
Such a system would meet the educational recommendations made by Piaget (1973b) by involving a hands-on approach that allows discovery, interaction, the opportunity to make mistakes and to review the cognitive concepts on which the subject relied during his manipulation of the ventilation system.
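The circuit misconception at issue can be made concrete with a toy calculation (a linear-resistance analogy for brevity; real mine ventilation follows Atkinson's square-law pressure-flow relation): connecting a new section changes the airflow in every existing branch, not just the new one. The resistance values are invented.

```python
# Toy illustration (linear-resistance analogy, not Atkinson's square law
# used in real mine ventilation) of the misconception described above:
# adding a new parallel connection changes the flow split in ALL branches,
# not only the new one.

def parallel_flows(total_flow, resistances):
    """Split a fixed total flow across parallel branches: flow in each
    branch is inversely proportional to its resistance."""
    conductances = [1.0 / r for r in resistances]
    g = sum(conductances)
    return [total_flow * c / g for c in conductances]

before = parallel_flows(100.0, [2.0, 4.0])        # two mine sections
after = parallel_flows(100.0, [2.0, 4.0, 4.0])    # a third section is connected
print("before:", [round(q, 1) for q in before])   # flows in the original branches
print("after: ", [round(q, 1) for q in after])    # both original flows have changed
```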
NASA Astrophysics Data System (ADS)
Jahangeer, F.; Gupta, P. K.; Yadav, B. K.
2017-12-01
Due to the reducing availability of water resources and the growing competition for water between residential, industrial, and agricultural users, increasing irrigation efficiency, by methods such as drip irrigation, is a pressing concern for agricultural experts. An understanding of water and contaminant flow through the subsurface is needed for sustainable irrigation water management, pollution assessment, polluted-site remediation and groundwater recharge. In this study, the Windows-based computer software package HYDRUS-2D, which numerically simulates water and solute movement in two-dimensional, variably saturated porous media, was used to evaluate the distribution of water and nitrate in a sand tank. Laboratory and simulation experiments were conducted to evaluate the role of drainage, recharge flux, and infiltration on subsurface flow conditions and, subsequently, on nitrate movement in the subsurface. Water flow in the unsaturated zone was modeled by the Richards equation, which is highly nonlinear and whose parameters depend strongly on the moisture content and pressure head of the partially saturated zone. Three cases were considered: a) applying drainage and recharge fluxes to the study domains, b) transient infiltration in a vertical soil column, and c) nitrate transport in a 2D sand-tank setup. A single-porosity model was used for the simulation of water and nitrate flow in the study domain. The results indicate that the transient water table position falls significantly over time when a drainage flux is applied at the bottom, while applying a recharge flux raises the water table position in the study domains. Likewise, the water flow profile shows the water table elevation decreasing as water content increases in the vertical domain. Moreover, nitrate movement was dominated by advective flux and was strongly affected by the recharge flux in the vertical direction.
The findings of the study help to enhance understanding of sustainable soil-water resources management and agricultural practices.
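For orientation, one common two-dimensional form of the Richards equation referenced above (x horizontal, z vertical upward, with a sink term S for root water uptake) can be written as:

```latex
\frac{\partial \theta(h)}{\partial t} =
  \frac{\partial}{\partial x}\left[K(h)\,\frac{\partial h}{\partial x}\right] +
  \frac{\partial}{\partial z}\left[K(h)\left(\frac{\partial h}{\partial z} + 1\right)\right] - S(h)
```

where θ is the volumetric water content, h the pressure head, and K(h) the unsaturated hydraulic conductivity; the strong nonlinearity noted in the abstract enters through the constitutive relations θ(h) and K(h) (for example, van Genuchten-Mualem functions).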
Integrating Safety and Mission Assurance in Design
NASA Technical Reports Server (NTRS)
Cianciola, Chris; Crane, Kenneth
2008-01-01
This presentation describes how the Ares Projects are learning from the successes and failures of previous launch systems in order to maximize safety and reliability while maintaining fiscal responsibility. The Ares Projects are integrating Safety and Mission Assurance into design activities and embracing independent assessments by Quality experts in thorough reviews of designs and processes. Incorporating Lean thinking into the design process, Ares is also streamlining existing processes and future manufacturing flows which will yield savings during production. Understanding the value of early involvement of Quality experts, the Ares Projects are leading launch vehicle development into the 21st century.
ERIC Educational Resources Information Center
Berger, Roland; Hänze, Martin
2015-01-01
We assessed the impact of expert students' instructional quality on the academic performance of novice students in 12th-grade physics classes organized in an expert model of cooperative learning ("jigsaw classroom"). The instructional quality of 129 expert students was measured by a newly developed rating system. As expected, when…
Delegating Decisions to Experts
ERIC Educational Resources Information Center
Li, Hao; Suen, Wing
2004-01-01
We present a model of delegation with self-interested and privately informed experts. A team of experts with extreme but opposite biases is acceptable to a wide range of decision makers with diverse preferences, but the value of expertise from such a team is low. A decision maker wants to appoint experts who are less partisan than he is in order…
NASA Astrophysics Data System (ADS)
Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.
2016-11-01
Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of a lack of data on past events and causal factors, and because of the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
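The annual risk calculation from risk curves can be sketched as numerical integration of a loss-exceedance curve: expected annual loss is the area under losses plotted against annual exceedance probability. The curves below are invented for illustration, not the Fella River results.

```python
# Illustrative sketch (not the study's actual model): annualized risk from a
# risk curve, i.e. losses plotted against annual exceedance probability.
# Expected annual loss = area under the loss-exceedance curve (trapezoids).

def annual_risk(curve):
    """curve: list of (annual exceedance probability, loss) pairs sorted by
    decreasing probability. Returns expected annual loss."""
    total = 0.0
    for (p0, l0), (p1, l1) in zip(curve, curve[1:]):
        total += 0.5 * (l0 + l1) * (p0 - p1)
    return total

# Hypothetical debris-flow risk curves (EUR): min/average/max reflect the
# uncertainty bounds on probability, asset value and vulnerability.
curves = {
    "min": [(0.1, 0.0e6), (0.01, 1.0e6), (0.001, 3.0e6)],
    "avg": [(0.1, 0.5e6), (0.01, 2.0e6), (0.001, 6.0e6)],
    "max": [(0.1, 1.0e6), (0.01, 4.0e6), (0.001, 12.0e6)],
}
for label, curve in curves.items():
    print(f"{label}: expected annual loss ~ {annual_risk(curve):,.0f} EUR")
```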
García-Alonso, Carlos; Pérez-Naranjo, Leonor
2009-01-01
Introduction Knowledge management, based on information transfer between experts and analysts, is crucial for the validity and usability of data envelopment analysis (DEA). Aim To design and develop a methodology: i) to assess the technical efficiency of small health areas (SHA) in an uncertainty environment, and ii) to transfer information between experts and operational models, in both directions, to improve the experts' knowledge. Method A procedure derived from knowledge discovery from data (KDD) is used to select, interpret and weigh DEA inputs and outputs. Based on KDD results, an expert-driven Monte-Carlo DEA model has been designed to assess the technical efficiency of SHA in Andalusia. Results In terms of probability, SHA 29 is the most efficient, whereas SHA 22 is very inefficient. 73% of the analysed SHA have a probability of being efficient (Pe) >0.9, and 18% have Pe <0.5. Conclusions Expert knowledge is necessary to design and validate any operational model. KDD techniques make the transfer of information from experts to any operational model easy, and results obtained from the latter improve the experts' knowledge.
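The Monte-Carlo flavour of the efficiency assessment can be illustrated with a deliberately simplified sketch: instead of solving the DEA linear programs, each area gets a fixed-weight output/input ratio, the data are perturbed to mimic uncertainty, and Pe is estimated as the fraction of draws in which the area lands on (or very near) the frontier. All area names and figures are invented.

```python
# Simplified stand-in for the paper's Monte-Carlo DEA: each health area's
# "efficiency" is a fixed-weight output/input ratio normalized by the best
# area in each draw (full DEA would solve an LP per unit). Inputs are
# uncertain, so we estimate Pe = P(area is efficient) over many draws.
import random

random.seed(1)
# Hypothetical areas: (mean input cost, mean output activity)
areas = {"SHA-A": (100, 95), "SHA-B": (100, 70), "SHA-C": (120, 118)}

def prob_efficient(areas, draws=2000, noise=0.10, tol=0.999):
    wins = {name: 0 for name in areas}
    for _ in range(draws):
        ratios = {}
        for name, (inp, out) in areas.items():
            # perturb the data to mimic input/output uncertainty
            i = inp * random.gauss(1.0, noise)
            o = out * random.gauss(1.0, noise)
            ratios[name] = o / i
        best = max(ratios.values())
        for name, r in ratios.items():
            if r >= tol * best:   # on (or near) the efficient frontier
                wins[name] += 1
    return {name: wins[name] / draws for name in areas}

for name, pe in prob_efficient(areas).items():
    print(f"{name}: Pe = {pe:.2f}")
```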
ERIC Educational Resources Information Center
Neugebauer, Roger
2002-01-01
Discusses several strategies recommended by small business experts to help for-profit and non-profit child care centers survive a financial crisis. Strategies include: identifying the source of the problem, monitoring cash flow, reducing or deferring expenditures, expediting regular income and exploring new sources of income, patiently working…
2010-01-01
Background The communication literature currently focuses primarily on improving physicians' verbal and non-verbal behaviors during the medical interview. The Four Habits Model is a teaching and research framework for physician communication that is based on evidence linking specific communication behaviors with processes and outcomes of care. The Model conceptualizes basic communication tasks as "Habits" and describes the sequence of physician communication behaviors during the clinical encounter associated with improved outcomes. Using the Four Habits Model as a starting point, we asked communication experts to identify the verbal communication behaviors of patients that are important in outpatient encounters. Methods We conducted a 4-round Delphi process with 17 international experts in communication research, medical education, and health care delivery. All rounds were conducted via the internet. In round 1, experts reviewed a list of proposed patient verbal communication behaviors within the Four Habits Model framework. The proposed patient verbal communication behaviors were identified based on a review of the communication literature. The experts could: approve the proposed list; add new behaviors; or modify behaviors. In rounds 2, 3, and 4, they rated each behavior for its fit (agree or disagree) with a particular habit. After each round, we calculated the percent agreement for each behavior and provided these data in the next round. Behaviors receiving more than 70% of experts' votes (either agree or disagree) were considered as achieving consensus. Results Of the 14 originally-proposed patient verbal communication behaviors, the experts modified all but 2, and they added 20 behaviors to the Model in round 1. In round 2, they were presented with 59 behaviors and 14 options to remove specific behaviors for rating. After 3 rounds of rating, the experts retained 22 behaviors. 
This set included behaviors such as asking questions, expressing preferences, and summarizing information. Conclusion The process identified communication tasks and verbal communication behaviors for patients similar to those outlined for physicians in the Four Habits Model. This represents an important step in building a single model that can be applied to teaching patients and physicians the communication skills associated with improved satisfaction and positive outcomes of care. PMID:20403173
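The consensus rule described above (retain a behavior when more than 70% of experts cast the same vote) can be sketched directly; the behaviors and vote counts below are invented for illustration.

```python
# Sketch of the Delphi consensus rule described above: a behavior is
# retained when more than 70% of experts vote the same way (agree or
# disagree) on its fit with a Habit. Votes below are invented.

def consensus(votes, threshold=0.70):
    """votes: list of 'agree'/'disagree'. Returns (verdict, share) where
    verdict is the option whose share exceeds the threshold, else None
    (no consensus yet; the behavior goes to another rating round)."""
    for option in ("agree", "disagree"):
        share = votes.count(option) / len(votes)
        if share > threshold:
            return option, share
    return None, max(votes.count(o) / len(votes) for o in ("agree", "disagree"))

panel = {
    "asks clarifying questions":  ["agree"] * 15 + ["disagree"] * 2,
    "expresses preferences":      ["agree"] * 13 + ["disagree"] * 4,
    "interrupts the physician":   ["agree"] * 8  + ["disagree"] * 9,
}
for behavior, votes in panel.items():
    verdict, share = consensus(votes)
    print(f"{behavior}: {verdict or 'no consensus'} ({share:.0%})")
```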
Nodes on ropes: a comprehensive data and control flow for steering ensemble simulations.
Waser, Jürgen; Ribičić, Hrvoje; Fuchs, Raphael; Hirsch, Christian; Schindler, Benjamin; Blöschl, Günther; Gröller, M Eduard
2011-12-01
Flood disasters are the most common natural risk and tremendous efforts are spent to improve their simulation and management. However, simulation-based investigation of actions that can be taken in case of flood emergencies is rarely done. This is in part due to the lack of a comprehensive framework which integrates and facilitates these efforts. In this paper, we tackle several problems which are related to steering a flood simulation. One issue is related to uncertainty. We need to account for uncertain knowledge about the environment, such as levee-breach locations. Furthermore, the steering process has to reveal how these uncertainties in the boundary conditions affect the confidence in the simulation outcome. Another important problem is that the simulation setup is often hidden in a black-box. We expose system internals and show that simulation steering can be comprehensible at the same time. This is important because the domain expert needs to be able to modify the simulation setup in order to include local knowledge and experience. In the proposed solution, users steer parameter studies through the World Lines interface to account for input uncertainties. The transport of steering information to the underlying data-flow components is handled by a novel meta-flow. The meta-flow is an extension to a standard data-flow network, comprising additional nodes and ropes to abstract parameter control. The meta-flow has a visual representation to inform the user about which control operations happen. Finally, we present the idea to use the data-flow diagram itself for visualizing steering information and simulation results. We discuss a case-study in collaboration with a domain expert who proposes different actions to protect a virtual city from imminent flooding. The key to choosing the best response strategy is the ability to compare different regions of the parameter space while retaining an understanding of what is happening inside the data-flow system. 
© 2011 IEEE
Interictal epileptiform discharge characteristics underlying expert interrater agreement.
Bagheri, Elham; Dauwels, Justin; Dean, Brian C; Waters, Chad G; Westover, M Brandon; Halford, Jonathan J
2017-10-01
The presence of interictal epileptiform discharges (IED) in the electroencephalogram (EEG) is a key finding in the medical workup of a patient with suspected epilepsy. However, inter-rater agreement (IRA) regarding the presence of IED is imperfect, leading to incorrect and delayed diagnoses. An improved understanding of which IED attributes mediate expert IRA might help in developing automatic methods for IED detection able to emulate the abilities of experts. Therefore, using a set of IED scored by a large number of experts, we set out to determine which attributes of IED predict expert agreement regarding the presence of IED. IED were annotated on a 5-point scale by 18 clinical neurophysiologists within 200 30-s EEG segments from recordings of 200 patients. 5538 signal analysis features were extracted from the waveforms, including wavelet coefficients, morphological features, signal energy, nonlinear energy operator response, electrode location, and spectrogram features. Feature selection was performed by applying elastic net regression and support vector regression (SVR) was applied to predict expert opinion, with and without the feature selection procedure and with and without several types of signal normalization. Multiple types of features were useful for predicting expert annotations, but particular types of wavelet features performed best. Local EEG normalization also enhanced best model performance. As the size of the group of EEGers used to train the models was increased, the performance of the models leveled off at a group size of around 11. The features that best predict inter-rater agreement among experts regarding the presence of IED are wavelet features, using locally standardized EEG. Our models for predicting expert opinion based on EEGer's scores perform best with a large group of EEGers (more than 10). 
By examining a large group of EEG signal analysis features we found that wavelet features with certain wavelet basis functions performed best to identify IEDs. Local normalization also improves predictability, suggesting the importance of IED morphology over amplitude-based features. Although most IED detection studies in the past have used opinion from three or fewer experts, our study suggests a "wisdom of the crowd" effect, such that pooling over a larger number of expert opinions produces a better correlation between expert opinion and objectively quantifiable features of the EEG. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
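The reported "wisdom of the crowd" effect, where pooled expert opinion tracks the underlying signal better as the panel grows and then levels off, can be illustrated with a toy simulation: each expert's score is modeled as a latent true score plus independent rater noise, and we measure how the pooled (averaged) score of n experts correlates with the truth. All numbers are invented.

```python
# Toy simulation of the "wisdom of the crowd" effect reported above: the
# correlation between the truth and the average of n noisy expert scores
# rises with n and levels off (analytically r = 1/sqrt(1 + sigma^2/n)).
import random

random.seed(0)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

truth = [random.gauss(0, 1) for _ in range(500)]   # latent "IED-ness" per waveform

def pooled_correlation(n_experts, rater_noise=1.5):
    pooled = [t + sum(random.gauss(0, rater_noise) for _ in range(n_experts)) / n_experts
              for t in truth]
    return pearson(truth, pooled)

for n in (1, 3, 6, 11, 18):
    print(f"{n:2d} experts: r = {pooled_correlation(n):.2f}")
```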
Modeling climate change impacts on maize growth with the focus on plant internal water transport
NASA Astrophysics Data System (ADS)
Heinlein, Florian; Biernath, Christian; Klein, Christian; Thieme, Christoph; Priesack, Eckart
2015-04-01
Based on climate change experiments in chambers and on field measurements, the scientific community expects regional and global changes in crop biomass production and yields. In central Europe, one major aspect of climate change is the shift of precipitation towards the winter months and the increase of extreme events, e.g. heat stress and heavy precipitation, during the main growing season in summer. Numerous crop models have been developed to understand water uptake, water use, and transpiration rates of plants. We tested the ability of two existing canopy models (CERES-Maize and SPASS), embedded in the model environment Expert-N5.0, to simulate the water balance, water use efficiency and crop growth. Additionally, sap flow was measured using heat-ratio measurement devices at the stem base of individual plants. The models were tested against data on soil water contents, as well as on evaporation and transpiration rates of maize plants, which were grown on lysimeters at Helmholtz Zentrum München and in the field at the research station Scheyern, Germany, in summer 2013 and 2014. We present the simulation results and discuss observed shortcomings of the models. CERES-Maize and SPASS could simulate the measured dynamics of xylem sap flow. However, these models oversimplify plant water transport and thus cannot explain the underlying mechanisms. To overcome these shortcomings, we therefore propose a new model based on two coupled 1-D Richards equations, describing explicitly the plant and soil water transport. This model, which has previously been applied successfully to simulate the water flux of 94 individual beech trees in an old-growth forest, will lead to a more mechanistic representation of the soil-plant water-flow continuum. This xylem water flux model has now been implemented into the crop model SPASS and adjusted to simulate the water flux of single maize plants. The modified version is presented and explained.
Basic model input requirements are the plant above- and below-ground architectures. Shoot architectures were derived from terrestrial laser scanning. Root architectures of maize plants were generated using a simple L-system. Preliminary results will be presented together with simulation results from CERES-Maize and SPASS.
Consistent improvements in processor speed and computer access have substantially increased the use of computer modeling by experts and non-experts alike. Several new computer modeling packages operating under graphical operating systems (i.e. Microsoft Windows or Macintosh) m...
Building a Foreign Military Sales Construction Delivery Strategy Decision Support System
1991-09-01
DSS, formulates it into a computer model and produces solutions using information and expert heuristics. Using the Expert Systeic Process to Build a DSS...computer model . There are five stages in the development of an expert system. They are: 1) Identify and characterize the important aspects of the problem...and Steven A. Hidreth. U.S. Security Assistance: The Political Process. Massachusetts: Heath and Company, 1985. 19. Guirguis , Amir A., Program
Computer Aided Dosimetry and Verification of Exposure to Radiation
NASA Astrophysics Data System (ADS)
Waller, Edward; Stodilka, Robert Z.; Leach, Karen E.; Lalonde, Louise
2002-06-01
In the timeframe following the September 11th attacks on the United States, increased emphasis has been placed on Chemical, Biological, Radiological and Nuclear (CBRN) preparedness. Of prime importance is rapid field assessment of potential radiation exposure to Canadian Forces field personnel. This work set up a framework for generating an 'expert' computer system to assist field personnel in determining the extent of radiation insult to military personnel. Data were gathered by review of the available literature, discussions with medical and health physics personnel having hands-on experience dealing with radiation accident victims, and the experience of the principal investigator. Flow charts and generic data fusion algorithms were developed. Relationships between known exposure parameters, patient interview and history, clinical symptoms, clinical work-ups, physical dosimetry, biological dosimetry, and dose reconstruction as critical data indicators were investigated. The data obtained were examined in terms of information theory. A main goal was to determine how best to generate an adaptive model (i.e. how the prediction improves as more data become available). Consideration was given to the determination of predictive algorithms for health outcome. In addition, the concept of coding an expert medical treatment advisor system was developed. (U)
Expert judgement and uncertainty quantification for climate change
NASA Astrophysics Data System (ADS)
Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.
2016-05-01
Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.
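A toy version of the reweighting idea behind probabilistic inversion, under strong simplifying assumptions: Monte-Carlo samples from a numerical model are assigned new weights so that the weighted distribution reproduces an expert's 5%, 50% and 95% quantiles. Because the quantile constraints partition the samples into bins, a single proportional adjustment suffices here. The numbers are illustrative, not the Antarctic analyses.

```python
# Toy probabilistic inversion in the spirit described above: model samples
# are reweighted (a minimum-information adjustment) so that the weighted
# distribution matches an expert's elicited quantiles.
import random

random.seed(0)
samples = [random.gauss(40, 30) for _ in range(20000)]  # model ensemble (illustrative units)

# Expert judgement: 5%, 50%, 95% quantiles of the quantity of interest
expert_q = {0.05: 10.0, 0.50: 40.0, 0.95: 90.0}

def reweight(samples, expert_q):
    edges = sorted(expert_q.values())                 # bin edges from the quantiles
    probs = sorted(expert_q.keys())                   # cumulative targets
    targets = [probs[0]] + [b - a for a, b in zip(probs, probs[1:])] + [1 - probs[-1]]
    bins = [[] for _ in range(len(edges) + 1)]
    for i, x in enumerate(samples):
        k = sum(x > e for e in edges)                 # which inter-quantile bin
        bins[k].append(i)
    w = [0.0] * len(samples)
    for k, idx in enumerate(bins):                    # give each bin its target mass
        for i in idx:
            w[i] = targets[k] / len(idx)
    return w

w = reweight(samples, expert_q)
wmean = sum(wi * x for wi, x in zip(w, samples))
print(f"model mean {sum(samples)/len(samples):.1f} -> inverted mean {wmean:.1f}")
```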
Chung, Younjin; Salvador-Carulla, Luis; Salinas-Pérez, José A; Uriarte-Uriarte, Jose J; Iruin-Sanz, Alvaro; García-Alonso, Carlos R
2018-04-25
Decision-making in mental health systems should be supported by the evidence-informed knowledge transfer of data. Since mental health systems are inherently complex, involving interactions between its structures, processes and outcomes, decision support systems (DSS) need to be developed using advanced computational methods and visual tools to allow full system analysis, whilst incorporating domain experts in the analysis process. In this study, we use a DSS model developed for interactive data mining and domain expert collaboration in the analysis of complex mental health systems to improve system knowledge and evidence-informed policy planning. We combine an interactive visual data mining approach, the self-organising map network (SOMNet), with an operational expert knowledge approach, expert-based collaborative analysis (EbCA), to develop a DSS model. The SOMNet was applied to the analysis of healthcare patterns and indicators of three different regional mental health systems in Spain, comprising 106 small catchment areas and providing healthcare for over 9 million inhabitants. Based on the EbCA, the domain experts in the development team guided and evaluated the analytical processes and results. Another group of 13 domain experts in mental health systems planning and research evaluated the model based on the analytical information of the SOMNet approach for processing information and discovering knowledge in a real-world context. Through the evaluation, the domain experts assessed the feasibility and technology readiness level (TRL) of the DSS model. The SOMNet, combined with the EbCA, effectively processed evidence-based information when analysing system outliers, explaining global and local patterns, and refining key performance indicators with their analytical interpretations. The evaluation results showed that the DSS model was judged feasible by the domain experts and reached level 7 of the TRL (system prototype demonstration in operational environment).
This study supports the benefits of combining health systems engineering (SOMNet) and expert knowledge (EbCA) to analyse the complexity of health systems research. The use of the SOMNet approach contributes to the demonstration of DSS for mental health planning in practice.
1990-09-01
following two chapters. 28 V. COCOMO MODEL A. OVERVIEW The COCOMO model which stands for COnstructive COst MOdel was developed by Barry Boehm and is...estimation model which uses an expert system to automate the Intermediate COnstructive Cost Estimation MOdel (COCOMO), developed by Barry W. Boehm and...cost estimation model which uses an expert system to automate the Intermediate COnstructive Cost Estimation MOdel (COCOMO), developed by Barry W
The blackboard model - A framework for integrating multiple cooperating expert systems
NASA Technical Reports Server (NTRS)
Erickson, W. K.
1985-01-01
The use of an artificial intelligence (AI) architecture known as the blackboard model is examined as a framework for designing and building distributed systems requiring the integration of multiple cooperating expert systems (MCXS). Aerospace vehicles provide many examples of potential systems, ranging from commercial and military aircraft to spacecraft such as satellites, the Space Shuttle, and the Space Station. One such system, free-flying, spaceborne telerobots to be used in construction, servicing, inspection, and repair tasks around NASA's Space Station, is examined. The major difficulties found in designing and integrating the individual expert system components necessary to implement such a robot are outlined. The blackboard model, a general expert system architecture which seems to address many of the problems found in designing and building such a system, is discussed. A progress report on a prototype system under development called DBB (Distributed BlackBoard model) is given. The prototype will act as a testbed for investigating the feasibility, utility, and efficiency of MCXS-based designs developed under the blackboard model.
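A minimal, hypothetical sketch of the blackboard pattern discussed above: independent "expert" knowledge sources inspect a shared blackboard and contribute when their trigger condition holds, under a simple control shell. The telerobot task names are invented; this is not the DBB prototype.

```python
# Minimal blackboard sketch: independent knowledge sources watch a shared
# blackboard and contribute when their trigger condition holds, as in the
# MCXS architecture discussed above. Task names are illustrative only.

class Blackboard(dict):
    pass

def perception(bb):
    if "camera_frame" in bb and "fault_located" not in bb:
        bb["fault_located"] = "panel-3 latch"          # interpret sensor data

def planner(bb):
    if "fault_located" in bb and "plan" not in bb:
        bb["plan"] = ["approach panel-3", "open latch", "inspect"]

def executor(bb):
    if "plan" in bb and bb["plan"]:
        bb.setdefault("log", []).append(bb["plan"].pop(0))  # execute next step

def control_loop(bb, sources, max_cycles=20):
    """Simple control shell: fire every source whose condition matches
    until no source changes the blackboard."""
    for _ in range(max_cycles):
        before = repr(bb)
        for source in sources:
            source(bb)
        if repr(bb) == before:   # quiescent: no source contributed
            break
    return bb

bb = control_loop(Blackboard(camera_frame="..."), [perception, planner, executor])
print(bb["log"])
```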
Williams, Kent E; Voigt, Jeffrey R
2004-01-01
The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.
Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S
2002-02-01
This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. Appropriately, the Classical model was used to weight the experts' assessments in order to construct a single distribution per variable. In this model, the experts' quality was typically based on their performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol, in combination with the proposed elicitation and analysis techniques, resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
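A deliberately simplified sketch of performance-based weighting in the spirit of the Classical model: experts are scored on seed variables with known realizations, and their assessments of the variable of interest are combined with those scores as weights. The real Classical model scores statistical accuracy and information over full distributions; here the score is just the fraction of seed realizations falling inside each expert's 90% central interval, and medians rather than densities are combined. All variable names and numbers are invented.

```python
# Simplified performance-based expert weighting (Classical-model spirit).
# Seed variables have known realizations; experts are scored on them.

seeds = {"incubation days": 4.0, "herd prevalence %": 12.0, "transmission rate": 0.8}

# Each expert: seed variable -> (5% quantile, 95% quantile) of their assessment
experts = {
    "expert-A": {"incubation days": (2, 6), "herd prevalence %": (5, 20),
                 "transmission rate": (0.5, 1.2)},
    "expert-B": {"incubation days": (1, 2), "herd prevalence %": (10, 11),
                 "transmission rate": (2.0, 3.0)},
}

def weights(experts, seeds):
    """Score = fraction of seed realizations inside the expert's 90% interval;
    weights are the normalized scores."""
    raw = {}
    for name, intervals in experts.items():
        hits = sum(lo <= seeds[v] <= hi for v, (lo, hi) in intervals.items())
        raw[name] = hits / len(seeds)
    total = sum(raw.values()) or 1.0
    return {name: s / total for name, s in raw.items()}

w = weights(experts, seeds)
# Combine the experts' medians for the variable of interest as a weighted mean
target = {"expert-A": 9.0, "expert-B": 30.0}     # median "outbreak duration, days"
combined = sum(w[name] * target[name] for name in target)
print(w, f"-> combined median ~ {combined:.1f} days")
```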
NASA Astrophysics Data System (ADS)
Wegrzyński, Wojciech; Konecki, Marek
2018-01-01
This paper presents results of CFD and scale modelling of the flow of heat and smoke inside and outside of a compartment in the case of fire. Estimation of the mass flow out of a compartment is critical, as it is the boundary condition for further analyses of smoke exhaust from a building, including the performance of natural ventilation under wind conditions. Both the location of the fire and the size of the compartment were addressed as variables that may influence the mass and the temperature of the smoke leaving the room engulfed in fire. Results of the study show little to no influence of either the size of the compartment or the location of the fire on the mass flow of smoke exiting the room. At the same time, both parameters influence the temperature of the smoke: larger compartments showed lower average smoke-layer temperatures but higher maximum values. Results of this study may also be useful in determining worst-case scenarios for structural analysis, or in the investigation of the spread of fire through the compartment. Based on the results presented in this study, researchers can make an expert-judgement choice of fire location as a single scenario that is representative of a larger number of probable scenarios.
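As a rough cross-check on such CFD mass-flow estimates, fire engineering's classic ventilation-factor correlation (attributed to Kawagoe) approximates the ventilation-limited air inflow through an opening as m ≈ 0.5·A·√H in SI units. This is a standard first-order estimate from the fire-safety literature, not part of this paper's method:

```python
import math

def vent_limited_air_inflow(opening_area_m2, opening_height_m):
    """Kawagoe-style estimate: m_air ~ 0.5 * A * sqrt(H) in kg/s (SI units).
    A rough first-order sanity check on computed compartment flows."""
    return 0.5 * opening_area_m2 * math.sqrt(opening_height_m)
```

For a standard doorway (roughly 2 m2 of area, 2 m high) this gives on the order of 1.4 kg/s of air inflow.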
Perimal-Lewis, Lua; Teubner, David; Hakendorf, Paul; Horwood, Chris
2016-12-01
Effective and accurate use of routinely collected health data to produce Key Performance Indicator reporting depends on the underlying data quality. In this research, Process Mining methodology and tools were leveraged to assess the data quality of time-based Emergency Department data sourced from electronic health records. The research was conducted in close collaboration with domain experts to validate the process models. The hospital patient journey model was used to assess flow abnormalities which resulted from incorrect timestamp data used in time-based performance metrics. The research demonstrated process mining as a feasible methodology for assessing the data quality of time-based hospital performance metrics. The insight gained from this research enabled appropriate corrective actions to be put in place to address the data quality issues. © The Author(s) 2015.
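The kind of flow abnormality being checked, timestamps that contradict the expected patient-journey order, can be sketched directly. The event names below are hypothetical, not the hospital's actual process model:

```python
# Illustrative patient-journey order; real ED process models differ.
EXPECTED_ORDER = ["arrival", "triage", "seen_by_doctor", "disposition", "departure"]

def flow_anomalies(events):
    """events: {event_name: timestamp}. Returns consecutive event pairs
    whose timestamps contradict the expected patient-journey order."""
    anomalies = []
    for earlier, later in zip(EXPECTED_ORDER, EXPECTED_ORDER[1:]):
        if earlier in events and later in events and events[earlier] > events[later]:
            anomalies.append((earlier, later))
    return anomalies
```

Records flagged this way point at exactly the timestamps that would corrupt time-based performance metrics.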
Guerrin, F; Dumas, J
2001-02-01
This paper describes a qualitative model of the functioning of salmon redds (spawning areas of salmon) and its impact on mortality rates of early stages. For this, we use Qsim, a qualitative simulator, which appeared adequate for representing available qualitative knowledge of freshwater ecology experts (see Part I of this paper). Since the number of relevant variables was relatively large, it appeared necessary to decompose the model into two parts, corresponding to processes occurring at separate time-scales. A qualitative clock allows us to submit the simulation of salmon developmental stages to the calculation of accumulated daily temperatures (degree-days), according to the clock ticks and a water temperature regime set by the user. Therefore, this introduces some way of real-time dating and duration in a purely qualitative model. Simulating both sub-models, either separately or by means of alternate transitions, allows us to generate the evolutions of variables of interest, such as the mortality rates according to two factors (flow of oxygenated water and plugging of gravel interstices near the bed surface), under various scenarios.
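The degree-day clock can be sketched numerically: accumulate daily water temperatures until a developmental stage's threshold is reached. The paper's model is qualitative, so this is only a numeric caricature, and the threshold value below is invented:

```python
def days_to_stage(daily_temps, threshold_degree_days):
    """Accumulate daily water temperatures (degree-days) and return the day
    on which a developmental stage's threshold is reached, or None if the
    temperature regime never accumulates enough degree-days."""
    total = 0.0
    for day, temp in enumerate(daily_temps, start=1):
        total += temp
        if total >= threshold_degree_days:
            return day
    return None
```

Warmer regimes reach a given stage earlier, which is exactly the coupling between the water temperature regime and the clock ticks that the qualitative model encodes.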
Estimating Classifier Accuracy Using Noisy Expert Labels
estimators to real-world problems is limited. We apply the estimators to labels simulated from three models of the expert labeling process and also four real ... that conditional dependence between experts negatively impacts estimator performance. On two of the real datasets, the estimators clearly outperformed the
Learning classification models with soft-label information.
Nguyen, Quang; Valizadegan, Hamed; Hauskrecht, Milos
2014-01-01
Learning of classification models in medicine often relies on data labeled by a human expert. Since labeling of clinical data may be time-consuming, finding ways of alleviating the labeling costs is critical for our ability to automatically learn such models. In this paper we propose a new machine learning approach that is able to learn improved binary classification models more efficiently by refining the binary class information in the training phase with soft labels that reflect how strongly the human expert feels about the original class labels. Two types of methods that can learn improved binary classification models from soft labels are proposed. The first relies on probabilistic/numeric labels, the other on ordinal categorical labels. We study and demonstrate the benefits of these methods for learning an alerting model for heparin-induced thrombocytopenia. The experiments are conducted on the data of 377 patient instances labeled by three different human experts. The methods are compared using the area under the receiver operating characteristic curve (AUC) score. Our AUC results show that the new approach is capable of learning classification models more efficiently compared to traditional learning methods. The improvement in AUC is most remarkable when the number of examples we learn from is small. A new classification learning framework that lets us learn from auxiliary soft-label information provided by a human expert is a promising new direction for learning classification models from expert labels, reducing the time and cost needed to label data.
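The comparison metric, AUC, has a compact rank-based form (the Mann-Whitney identity). A minimal stdlib sketch of the evaluation step only, not of the authors' learning methods; the scores in the test are made up:

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a random positive outscores a random negative,
    counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A perfect ranking scores 1.0, a fully inverted one 0.0, and chance-level 0.5, which is why AUC is a natural yardstick for comparing hard-label and soft-label training.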
How Do Novice and Expert Learners Represent, Understand, and Discuss Geologic Time?
NASA Astrophysics Data System (ADS)
Layow, Erica Amanda
This dissertation examined the representations novice and expert learners constructed for the geologic timescale. Learners engaged in a three-part activity. The purpose was to compare novice learners' representations to those of expert learners. This provided insight into the similarities and differences between their strategies for event ordering and for assigning values and scale to the geologic timescale model, as well as into their language and practices in completing the model. The data comprised learner responses, analyzed qualitatively within an expert-novice theoretical framework grounded in phenomenography. These data highlighted learners' metacognitive thoughts that might not otherwise be shared through lectures or laboratory activities. Learners' responses were analyzed using a discourse framework that positioned learners as knowers. Novice and expert learners both excelled at ordering and discussing events before the Phanerozoic, but were challenged by events during the Phanerozoic. Novice learners had difficulty assigning values to events and establishing a scale for their models. Expert learners expressed difficulty with determining a scale because of the size of the model, yet eventually used anchor points and unitized the model to establish a scale. Despite challenges constructing their models, novice learners spoke confidently, using claims and few hedging phrases, indicating their confidence in the statements made. Experts used more hedges than novices; however, the hedging comments were made about more complex conceptions. Using both phenomenographic and discourse analysis approaches foregrounded learners' discussions of how they perceived geologic time and their ways of knowing and doing.
This research is intended to enhance the geoscience community's understanding of the ways novice and expert learners think and discuss conceptions of geologic time, including the events and values of time, and the strategies used to determine accuracy of scale. This knowledge will provide a base from which to support geoscience curriculum development at the university level, specifically to design activities that will not only engage and express learners' metacognitive scientific practices, but to encourage their construction of scientific identities and membership in the geoscience community.
Vandenberghe, V; Goethals, P L M; Van Griensven, A; Meirlaen, J; De Pauw, N; Vanrolleghem, P; Bauwens, W
2005-09-01
During the summer of 1999, two automated water quality measurement stations were installed along the Dender river in Belgium. The variables dissolved oxygen, temperature, conductivity, pH, rain intensity, flow, and solar radiation were measured continuously. In this paper these on-line measurement series are presented and interpreted, drawing also on additional measurements and ecological expert knowledge. The purpose was to demonstrate the variability in time and space of the aquatic processes and the consequences of conducting and interpreting discrete measurements for river quality assessment and management. The large fluctuations in the data illustrate the importance of continuous measurements for the complete description and modelling of the biological processes in the river.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haurykiewicz, John Paul; Dinehart, Timothy Grant; Parker, Robert Young
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices' current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered primarily through Badge Office Subject Matter Experts (SMEs) and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to the factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
NASA Astrophysics Data System (ADS)
Kahveci, E. E.; Taymaz, I.
2018-03-01
In this study, the effect of the mass flow rates of the reactant gases, one of the most important operational parameters of a polymer electrolyte membrane (PEM) fuel cell, on power density was experimentally investigated. The channel type is serpentine, and the single PEM fuel cell has an active area of 25 cm2. Design-Expert 8.0 (trial version) was used to investigate the effect of four variables on the response. Cell temperature, hydrogen mass flow rate, oxygen mass flow rate, and humidification temperature were selected as independent variables, and power density was used as the response to determine their combined effects. Cell and humidification temperatures were kept constant while the mass flow rates of the reactant gases were varied. The results show that power density increased with increasing hydrogen flow rate, whereas the oxygen flow rate had no significant effect on power density within the range of mass flow rates examined.
EX.MAIN. Expert System Model for Maintenance and Staff Training.
ERIC Educational Resources Information Center
Masturzi, Elio R.
EX.MAIN, a model for maintenance and staff training which combines knowledge based expert systems and computer based training, was developed jointly by the Department of Production Engineering of the University of Naples and CIRCUMVESUVIANA, the largest private railroad in Italy. It is a global model in the maintenance field which contains both…
Professional Education in Expert Search: A Content Model
ERIC Educational Resources Information Center
Smith, Catherine L.; Roseberry, Martha I.
2013-01-01
This paper presents a descriptive model of the subject matter taught in courses on expert search in ALA-accredited programs, answering the question: What is taught in formal professional education on search expertise? The model emerged from a grounded content analysis of 44 course descriptions and 16 syllabi, and was validated via a review of…
A comparison of two methods for expert elicitation in health technology assessments.
Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken
2016-07-26
When data needed to inform parameters in decision models are lacking, formal elicitation of expert judgement can be used to characterise parameter uncertainty. Although numerous methods for eliciting expert opinion as probability distributions exist, there is little research to suggest whether one method is more useful than any other method. This study had three objectives: (i) to obtain subjective probability distributions characterising parameter uncertainty in the context of a health technology assessment; (ii) to compare two elicitation methods by eliciting the same parameters in different ways; (iii) to collect subjective preferences of the experts for the different elicitation methods used. Twenty-seven clinical experts were invited to participate in an elicitation exercise to inform a published model-based cost-effectiveness analysis of alternative treatments for prostate cancer. Participants were individually asked to express their judgements as probability distributions using two different methods, the histogram and hybrid elicitation methods, presented in a random order. Individual distributions were mathematically aggregated across experts with and without weighting. The resulting combined distributions were used in the probabilistic analysis of the decision model, and mean incremental cost-effectiveness ratios (ICERs) and the expected values of perfect information (EVPI) were calculated for each method and compared with the original cost-effectiveness analysis. Scores on the ease of use of the two methods and the extent to which the probability distributions obtained from each method accurately reflected the expert's opinion were also recorded. Six experts completed the task. Mean ICERs from the probabilistic analysis ranged from £162,600 to £175,500 per quality-adjusted life year (QALY), depending on the elicitation and weighting methods used.
Compared to having no information, use of expert opinion decreased decision uncertainty: the EVPI value at the £30,000 per QALY threshold decreased by 74-86% from the original cost-effectiveness analysis. Experts indicated that the histogram method was easier to use, but attributed a perception of greater accuracy to the hybrid method. Inclusion of expert elicitation can decrease decision uncertainty. Here, the choice of method did not affect the overall cost-effectiveness conclusions, but researchers intending to use expert elicitation need to be aware of the impact different methods could have.
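The probabilistic step the abstract describes can be caricatured in a few lines: push paired incremental cost and QALY samples through the decision model and report the ratio of mean incremental costs to mean incremental QALYs. A toy sketch with invented numbers, not the published prostate-cancer model:

```python
def mean_icer(delta_costs, delta_qalys):
    """Mean incremental cost-effectiveness ratio over paired probabilistic
    sensitivity analysis (PSA) samples, taken as the ratio of means."""
    mean_cost = sum(delta_costs) / len(delta_costs)
    mean_qaly = sum(delta_qalys) / len(delta_qalys)
    return mean_cost / mean_qaly
```

Rerunning this with samples drawn from each elicitation method's pooled distributions is what produces the per-method ICER range the study reports.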
A Model of How Different Biology Experts Explain Molecular and Cellular Mechanisms
Trujillo, Caleb M.; Anderson, Trevor R.; Pelaez, Nancy J.
2015-01-01
Constructing explanations is an essential skill for all science learners. The goal of this project was to model the key components of expert explanation of molecular and cellular mechanisms. As such, we asked: What is an appropriate model of the components of explanation used by biology experts to explain molecular and cellular mechanisms? Do explanations made by experts from different biology subdisciplines at a university support the validity of this model? Guided by the modeling framework of R. S. Justi and J. K. Gilbert, the validity of an initial model was tested by asking seven biologists to explain a molecular mechanism of their choice. Data were collected from interviews, artifacts, and drawings, and then subjected to thematic analysis. We found that biologists explained the specific activities and organization of entities of the mechanism. In addition, they contextualized explanations according to their biological and social significance; integrated explanations with methods, instruments, and measurements; and used analogies and narrated stories. The derived Methods, Analogies, Context, and How themes informed the development of our final MACH model of mechanistic explanations. Future research will test the potential of the MACH model as a guiding framework for instruction to enhance the quality of student explanations. PMID:25999313
Expert mission planning and replanning scheduling system for NASA KSC payload operations
NASA Technical Reports Server (NTRS)
Pierce, Roger
1987-01-01
EMPRESS (Expert Mission Planning and REplanning Scheduling System) is an expert system created to assist payload mission planners at Kennedy in the long range planning and scheduling of horizontal payloads for space shuttle flights. Using the current flight manifest, these planners develop mission and payload schedules detailing all processing to be performed in the Operations and Checkout building at Kennedy. With the EMPRESS system, schedules are generated quickly using standard flows that represent the tasks and resources required to process a specific horizontal carrier. Resources can be tracked and resource conflicts can be determined and resolved interactively. Constraint relationships between tasks are maintained and can be enforced when a task is moved or rescheduled. The domain, structure, and functionality of the EMPRESS system are briefly described. The limitations of the EMPRESS system are described as well as improvements expected with the EMPRESS-2 development.
Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romañach, Stephanie; Watling, James I.; Mazzotti, Frank J.
2017-01-01
Climate envelope models are widely used to describe potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between these two approaches for contemporary predictions, and the spatial correlation, spatial overlap, and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method did, and there was low overlap in the variable sets (<40%) between the two methods. Despite these differences in variable sets (expert versus statistical), models had high performance metrics (>0.9 for the area under the curve (AUC) and >0.7 for the true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration between maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. Difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques. 
Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using statistical methods of variable selection is a useful first step, especially when there is a need to model a large number of species or expert knowledge of the species is limited. Expert input can then be used to refine models that seem unrealistic or for species that experts believe are particularly sensitive to change. It also emphasizes the importance of using multiple models to reduce uncertainty and improve map outputs for conservation planning. Where outputs overlap or show the same direction of change there is greater certainty in the predictions. Areas of disagreement can be used for learning by asking why the models do not agree, and may highlight areas where additional on-the-ground data collection could improve the models.
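Spatial overlap between two binary prediction maps can be computed as a Jaccard-style ratio of cells where both methods predict presence to cells where either does; the study's exact metric may differ, so treat this as an illustrative sketch:

```python
def spatial_overlap(map_a, map_b):
    """map_a, map_b: flattened binary presence/absence grids (same extent).
    Returns |A and B| / |A or B|, i.e. the fraction of all predicted-present
    cells on which the two methods agree; 1.0 if neither predicts presence."""
    both = sum(a and b for a, b in zip(map_a, map_b))
    either = sum(a or b for a, b in zip(map_a, map_b))
    return both / either if either else 1.0
```

Low values of this ratio, like the roughly 60% overlap reported, signal that the two variable-selection techniques place the species' envelope in substantially different areas.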
Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop
Sali, Andrej; Berman, Helen M.; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K.; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M.J.J.; Chiu, Wah; Dal Peraro, Matteo; Di Maio, Frank; Ferrin, Thomas E.; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L.; Meiler, Jens; Marti-Renom, Marc A.; Montelione, Gaetano T.; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J.; Saibil, Helen; Schröder, Gunnar F.; Schwieters, Charles D.; Seidel, Claus A. M.; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L.; Velankar, Sameer; Westbrook, John D.
2016-01-01
Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? PMID:26095030
NASA Technical Reports Server (NTRS)
Kawamura, K.; Beale, G. O.; Schaffer, J. D.; Hsieh, B. J.; Padalkar, S.; Rodriguez-Moscoso, J. J.
1985-01-01
The results of the first phase of Research on an Expert System for Database Operation of Simulation/Emulation Math Models are described. Techniques from artificial intelligence (AI) were brought to bear on task domains of interest to NASA Marshall Space Flight Center. One such domain is simulation of spacecraft attitude control systems. Two related software systems were developed and delivered to NASA. One was a generic simulation model for spacecraft attitude control, written in FORTRAN. The second was an expert system which understands the usage of a class of spacecraft attitude control simulation software and can assist the user in running the software. This NASA Expert Simulation System (NESS), written in LISP, contains general knowledge about digital simulation, specific knowledge about the simulation software, and self-knowledge.
NASA Technical Reports Server (NTRS)
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
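The simplest rule such a system automates is flagging points beyond the 3-sigma control limits of a Shewhart individuals chart. A minimal sketch, assuming a set of in-control observations is available to establish the limits:

```python
import statistics

def control_limits(in_control_samples):
    """3-sigma limits from in-control data (Shewhart individuals chart)."""
    mean = statistics.mean(in_control_samples)
    sigma = statistics.pstdev(in_control_samples)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples, lo, hi):
    """Indices of points beyond the control limits, the most basic of the
    abnormal-point patterns an interpreting expert system checks for."""
    return [i for i, x in enumerate(samples) if not (lo <= x <= hi)]
```

A full interpretation system layers further pattern rules (runs, trends, zone tests) on top of this same limit computation.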
A condition metric for Eucalyptus woodland derived from expert evaluations.
Sinclair, Steve J; Bruce, Matthew J; Griffioen, Peter; Dodd, Amanda; White, Matthew D
2018-02-01
The evaluation of ecosystem quality is important for land-management and land-use planning. Evaluation is unavoidably subjective, and robust metrics must be based on consensus and the structured use of observations. We devised a transparent and repeatable process for building and testing ecosystem metrics based on expert data. We gathered quantitative evaluation data on the quality of hypothetical grassy woodland sites from experts. We used these data to train a model (an ensemble of 30 bagged regression trees) capable of predicting the perceived quality of similar hypothetical woodlands based on a set of 13 site variables as inputs (e.g., cover of shrubs, richness of native forbs). These variables can be measured at any site and the model implemented in a spreadsheet as a metric of woodland quality. We also investigated the number of experts required to produce an opinion data set sufficient for the construction of a metric. The model produced evaluations similar to those provided by experts, as shown by assessing the model's quality scores of expert-evaluated test sites not used to train the model. We applied the metric to 13 woodland conservation reserves and asked managers of these sites to independently evaluate their quality. To assess metric performance, we compared the model's evaluation of site quality with the managers' evaluations through multidimensional scaling. The metric performed relatively well, plotting close to the center of the space defined by the evaluators. Given the method provides data-driven consensus and repeatability, which no single human evaluator can provide, we suggest it is a valuable tool for evaluating ecosystem quality in real-world contexts. We believe our approach is applicable to any ecosystem. © 2017 State of Victoria.
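The ensemble idea (30 bagged regression trees mapping site variables to perceived quality) can be sketched with a toy base learner. Here a one-split regression stump stands in for a full tree, the training rows are invented, and a single feature replaces the 13 site variables:

```python
import random
import statistics

def fit_stump(rows):
    """One-split regression tree on a single feature: predict the mean
    response on each side of the median split (a stand-in for full trees)."""
    xs = sorted(x for x, _ in rows)
    split = xs[len(xs) // 2]
    left = [y for x, y in rows if x <= split]
    right = [y for x, y in rows if x > split]
    l = statistics.mean(left)
    r = statistics.mean(right) if right else l
    return lambda x: l if x <= split else r

def bagged_ensemble(rows, n_trees=30, seed=0):
    """Fit each stump on a bootstrap resample and predict the average."""
    rng = random.Random(seed)
    trees = [fit_stump([rng.choice(rows) for _ in rows]) for _ in range(n_trees)]
    return lambda x: statistics.mean(t(x) for t in trees)
```

As in the paper's metric, the fitted ensemble can then be applied to any new site's variables to score its quality on the experts' scale.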
Blom, Lisa; Laflamme, Lucie; Mölsted Alvesson, Helle
2018-01-01
Image-based teleconsultation between medical experts and healthcare staff at remote emergency centres can improve the diagnosis of conditions which are challenging to assess. One such condition is burns. Knowledge is scarce regarding how medical experts perceive the influence of such teleconsultation on their roles and relations to colleagues at point of care. In this qualitative study, semi-structured interviews were conducted with 15 medical experts to explore their expectations of a newly developed App for burns diagnostics and care prior to its implementation. Purposive sampling included male and female physicians at different stages of their career, employed at different referral hospitals, all potential future tele-experts in remote teleconsultation using the App. Positioning theory was used to analyse the data. The experts are already facing changes in their diagnostic practices due to the informal use of open access applications like WhatsApp. Additional changes are expected when the new App is launched. Four positions of medical experts were identified in situations of diagnostic advice: two related to patient flow (clinical specialist and gatekeeper) and two related to point-of-care staff (educator and mentor). The experts move flexibly between the positions during diagnostic practices with remote colleagues. A new position relative to previous research on medical roles, the mentor, came to light in this setting. The App is expected to have an important educational impact, streamline the diagnostic process, improve both triage and referrals, and be a more secure option for remote diagnosis compared to current practices. Verbal communication is however expected to remain important for certain situations, in particular those related to the mentor position. The quality and security of referrals are expected to be improved through the App, but the medical experts see less potential for conveying moral support via the App during remote consultations. 
Experts' reflections on remote consultations highlight the embedded social and cultural dimensions of implementing new technology.
Information Needs of the Ceramic Industry; A Users-Need Study.
ERIC Educational Resources Information Center
Janning, Edward A.; And Others
This report examines the problems in the flow of scientific and technological information in the Ceramic Industry. The research methodology used involved a panel of experts which defined the functions performed by ceramists and their corresponding information needs, listed sources of information available to ceramists, and defined problems and…
Hypersonic and Supersonic Flow Roadmaps Using Bibliometrics and Database Tomography.
ERIC Educational Resources Information Center
Kostoff, R. N.; Eberhart, Henry J.; Toothman, Darrell Ray
1999-01-01
Database Tomography (DT) is a textual database-analysis system consisting of algorithms for extracting multiword phrase frequencies and proximities from a large textual database, to augment interpretative capabilities of the expert human analyst. Describes use of the DT process, supplemented by literature bibliometric analyses, to derive technical…
An object oriented generic controller using CLIPS
NASA Technical Reports Server (NTRS)
Nivens, Cody R.
1990-01-01
In today's applications, the need to separate code and data has driven the growth of object-oriented programming. This philosophy gives software engineers greater control over the environment of an application. Yet the use of object-oriented design does not remove the application's need for a deeper understanding of what the controller is doing; such understanding is only possible by using expert systems. Providing a controller that is capable of controlling an object using rule-based expertise would expedite the use of both object-oriented design and expert knowledge of the dynamics of an environment in modern controllers. This project presents a model of a controller that uses the CLIPS expert system and objects in C++ to create a generic controller. The polymorphic abilities of C++ allow for the design of a generic component stored in individual data files. Accompanying the component is a set of rules written in CLIPS which provide the following: the control of individual components, the input of sensory data from components, and the ability to find the status of a given component. Along with the data describing the application, a set of inference rules written in CLIPS allows the application to make use of sensory facts and status and control abilities. As a demonstration of this ability, the control of the environment of a house is provided. This demonstration includes the data files describing the rooms and their contents, such as devices, windows, and doors. The rules for the home cover the flow of people in the house and the owner's control of devices.
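The CLIPS-plus-objects pattern can be mimicked in a short Python sketch: components as objects, rules as condition/action pairs fired against a fact base. The names (hall_lamp, occupied, dark) are illustrative, and this is forward chaining in miniature, not CLIPS itself:

```python
class Component:
    """A controllable device, analogous to the generic C++ component."""
    def __init__(self, name, status="off"):
        self.name, self.status = name, status

def apply_rules(components, facts, rules):
    """Fire every rule whose condition matches the current facts; actions
    may change component status, mimicking CLIPS forward chaining.
    rules: list of (condition(facts) -> bool, action(components)) pairs."""
    fired = []
    for condition, action in rules:
        if condition(facts):
            action(components)
            fired.append((condition, action))
    return fired
```

For the house demonstration, a rule like "if a room is occupied and dark, turn on its lamp" becomes one condition/action pair over the sensory fact base.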
ERIC Educational Resources Information Center
Moallem, Mahnaz
1998-01-01
Examines an expert teacher's thinking and teaching processes in order to link them to instructional-design procedures. Findings suggest that there were fundamental differences between the teacher's thinking and teaching processes and microinstructional design models. (Author/AEF)
A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base
NASA Technical Reports Server (NTRS)
Kautzmann, Frank N., III
1988-01-01
Expert systems which support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes is being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. Such a study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it involves models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert systems. The broadest possible category of interacting expert systems is described, along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hanagud, S.
1975-01-01
The results of two questionnaires sent to engineering experts are statistically analyzed and compared with objective data from Saturn V design and testing. Engineers were asked how likely it was for structural failure to occur at load increments above and below analysts' stress limit predictions. They were requested to estimate the relative probabilities of different failure causes, and of failure at each load increment given a specific cause. Three mathematical models are constructed based on the experts' assessment of causes. The experts' overall assessment of prediction strength fits the Saturn V data better than the models do, but a model test option (T-3) based on the overall assessment gives more design change likelihood to overstrength structures than does an older standard test option. T-3 compares unfavorably with the standard option in a cost optimum structural design problem. The report reflects a need for subjective data when objective data are unavailable.
NASA Astrophysics Data System (ADS)
Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J.; Savenije, H. H. G.; Gascuel-Odoux, C.
2014-09-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus, ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study, the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by four calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce a suite of hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by "prior constraints," inferred from expert knowledge to ensure a model which behaves well with respect to the modeler's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model setup exhibited increased performance in the independent test period and skill to better reproduce all tested signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if counter-balanced by prior constraints, can significantly increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge-driven strategy of constraining models.
Two-Phase Flow Technology Developed and Demonstrated for the Vision for Exploration
NASA Technical Reports Server (NTRS)
Sankovic, John M.; McQuillen, John B.; Lekan, Jack F.
2005-01-01
NASA's vision for exploration will once again expand the bounds of human presence in the universe with planned missions to the Moon and Mars. To attain the numerous goals of this vision, NASA will need to develop technologies in several areas, including advanced power-generation and thermal-control systems for spacecraft and life support. The development of these systems will have to be demonstrated prior to implementation to ensure safe and reliable operation in reduced-gravity environments. The Two-Phase Flow Facility (TΦFFy) Project will provide the path to these enabling technologies for critical multiphase fluid products. The safety and reliability of future systems will be enhanced by addressing focused microgravity fluid physics issues associated with flow boiling, condensation, phase separation, and system stability, all of which are essential to exploration technology. The project, a multiyear effort initiated in 2004, will include concept development, normal-gravity testing (laboratories), reduced-gravity aircraft flight campaigns (NASA's KC-135 and C-9 aircraft), space-flight experimentation (International Space Station), and model development. The project will be implemented by a team from the NASA Glenn Research Center, QSS Group, Inc., ZIN Technologies, Inc., and the Extramural Strategic Research Team composed of experts from academia.
Example-based learning: effects of model expertise in relation to student expertise.
Boekhout, Paul; van Gog, Tamara; van de Wiel, Margje W J; Gerards-Last, Dorien; Geraets, Jacques
2010-12-01
Worked examples are very effective for novice learners. They typically present a written-out ideal (didactical) solution for learners to study. This study used worked examples of patient history taking in physiotherapy that presented a non-didactical solution (i.e., based on actual performance). The effects of model expertise (i.e., worked examples based on an advanced, third-year student model or an expert physiotherapist model) in relation to student expertise (i.e., first- or second-year) were investigated. One hundred and thirty-four physiotherapy students (61 first-year and 73 second-year) participated. The design was 2 × 2 factorial, with factors 'Student Expertise' (first-year vs. second-year) and 'Model Expertise' (expert vs. advanced student). Within expertise levels, students were randomly assigned to the Expert Example or the Advanced Student Example condition. All students studied two examples (content depending on their assigned condition) and then completed a retention and a transfer task. They rated their invested mental effort after each example and test task. Second-year students invested less mental effort in studying the examples, and in performing the retention and transfer tasks, than first-year students. They also performed better on the retention test, but not on the transfer test. In contrast to our hypothesis, there was no interaction between student expertise and model expertise: all students who had studied the Expert Examples performed better on the transfer test than students who had studied the Advanced Student Examples. This study suggests that when worked examples are based on actual performance, rather than an ideal procedure, expert models are to be preferred over advanced student models.
Temporal and contextual knowledge in model-based expert systems
NASA Technical Reports Server (NTRS)
Toth-Fejel, Tihamer; Heher, Dennis
1987-01-01
A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.
Mathematical modeling in realistic mathematics education
NASA Astrophysics Data System (ADS)
Riyanto, B.; Zulkardi; Putri, R. I. I.; Darmawijoyo
2017-12-01
The purpose of this paper is to produce mathematical modelling tasks in Realistic Mathematics Education for junior high school. This study used development research consisting of three stages, namely analysis, design, and evaluation. The success criterion of this study was a local instruction theory for school mathematical modelling learning that was valid and practical for students. The data were analyzed using descriptive methods as follows: (1) walkthrough analysis based on the expert comments in the expert review, to obtain a valid Hypothetical Learning Trajectory for mathematical modelling learning; (2) analysis of the results of the one-to-one and small-group reviews, to establish practicality. Based on the expert validation and the students' opinions and answers, the obtained mathematical modelling problems in Realistic Mathematics Education were valid and practical.
Braathen, Sverre; Sendstad, Ole Jakob
2004-08-01
Techniques for representing automatic decision-making behavior that approximates human experts in complex simulation-model experiments are of interest. Here, fuzzy logic (FL) and constraint satisfaction problem (CSP) methods are applied in a hybrid design of automatic decision making in simulation game models. The decision processes of a military headquarters are used as a model for the FL/CSP decision agents' choice of variables and rulebases. The hybrid decision-agent design is applied in two different types of simulation games to test the general applicability of the design. The first application is a two-sided zero-sum sequential resource allocation game with imperfect information, interpreted as an air campaign game. The second example is a network-flow stochastic board game designed to capture important aspects of land manoeuvre operations. The proposed design is shown to perform well even in this complex game with a very large (billion-size) action set. Training of the automatic FL/CSP decision agents against selected performance measures is also shown, and results are presented together with directions for future research.
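The FL/CSP division of labour can be illustrated with a toy allocation choice. Everything below (the membership function, the constraint, and the scoring) is a hypothetical example, not the agents' actual rulebase:

```python
# Sketch only: CSP step prunes infeasible actions; FL step scores the rest.

def tri(x, a, b, c):
    """Triangular fuzzy membership: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def pick_action(actions, constraints, score):
    """CSP step: keep actions satisfying every hard constraint;
    FL step: return the feasible action with the highest fuzzy score."""
    feasible = [a for a in actions if all(c(a) for c in constraints)]
    return max(feasible, key=score) if feasible else None

# Hypothetical: allocate 0-10 units, at most 6 available, prefer "about 4".
best = pick_action(range(11), [lambda n: n <= 6],
                   lambda n: tri(n, 0.0, 4.0, 8.0))
```

The constraint filter plays the role of the CSP solver (hard feasibility), while the membership score stands in for a fuzzy rulebase expressing a soft preference.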
Mental models of a water management system in a green building.
Kalantzis, Anastasia; Thatcher, Andrew; Sheridan, Craig
2016-11-01
This intergroup case study compared users' mental models with an expert design model of a water management system in a green building. The system incorporates a constructed wetland component and a rainwater collection pond that together recycle water for re-use in the building and its surroundings. The sample consisted of five building occupants and the cleaner (6 users) and two experts who were involved with the design of the water management system. Users' mental model descriptions and the experts' design model were derived from in-depth interviews combined with self-constructed (and verified) diagrams. Findings from the study suggest that there is considerable variability in the user mental models that could impact the efficient functioning of the water management system. Recommendations for improvements are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop.
Sali, Andrej; Berman, Helen M; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M J J; Chiu, Wah; Peraro, Matteo Dal; Di Maio, Frank; Ferrin, Thomas E; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L; Meiler, Jens; Marti-Renom, Marc A; Montelione, Gaetano T; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J; Saibil, Helen; Schröder, Gunnar F; Schwieters, Charles D; Seidel, Claus A M; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L; Velankar, Sameer; Westbrook, John D
2015-07-07
Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, on October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? Copyright © 2015 Elsevier Ltd. All rights reserved.
Human motion tracking by temporal-spatial local gaussian process experts.
Zhao, Xu; Fu, Yun; Liu, Yuncai
2011-04-01
Human pose estimation via motion tracking systems can be considered as a regression problem within a discriminative framework. It is always a challenging task to model the mapping from observation space to state space because of the high-dimensional characteristic of the multimodal conditional distribution. To build the mapping, existing techniques usually involve a large set of training samples in the learning process, yet remain limited in their capability to deal with multimodality. We propose, in this work, a novel online sparse Gaussian Process (GP) regression model to recover 3-D human motion in monocular videos. In particular, we exploit the fact that, for a given test input, the output is mainly determined by the training samples potentially residing in its local neighborhood, defined in the unified input-output space. This leads to a local mixture GP experts system composed of different local GP experts, each of which dominates a mapping behavior with a specific covariance function adapted to a local region. To handle the multimodality, we combine both temporal and spatial information and therefore obtain two categories of local experts. The temporal and spatial experts are integrated into a seamless hybrid system, which is automatically self-initialized and robust for visual tracking of nonlinear human motion. Learning and inference are extremely efficient as all the local experts are defined online within very small neighborhoods. Extensive experiments on two real-world databases, HumanEva and PEAR, demonstrate the effectiveness of our proposed model, which significantly improves on the performance of existing models.
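The local-expert idea can be sketched minimally: fit a small GP only on a test input's nearest training neighbours. This assumes an RBF covariance and a plain noisy-GP predictor; the paper's temporal-spatial expert mixture and online initialization are not reproduced.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """RBF (squared-exponential) covariance between row-vector sets a, b."""
    d = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d / ls ** 2)

def local_gp_predict(x_star, X, y, k=10, noise=1e-3):
    """Predict with a GP expert fit only on the k nearest training samples,
    i.e., a single 'local expert' defined in the test input's neighbourhood."""
    idx = np.argsort(np.sum((X - x_star) ** 2, axis=1))[:k]
    Xl, yl = X[idx], y[idx]
    K = rbf(Xl, Xl) + noise * np.eye(k)       # local kernel matrix + nugget
    k_star = rbf(x_star[None, :], Xl)[0]      # covariances to the test input
    return k_star @ np.linalg.solve(K, yl)    # GP posterior mean
```

Because each expert only inverts a k-by-k matrix, prediction cost is independent of the full training-set size, which is the efficiency argument made in the abstract.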
1991-09-01
[Garbled front matter: a list of figures ("Distribution System", "Architecture of an Expert System") and a list of tables ("Prototype Component Model"). Legible abstract fragments mention an expert system to properly process work requests in civil engineering (8:23); the Electric Power Research Institute (EPRI), a private organization; and the training level of shop technicians, whose resulting proficiency is important in all organizations.]
NASA Technical Reports Server (NTRS)
Kawamura, K.; Beale, G. O.; Schaffer, J. D.; Hsieh, B. J.; Padalkar, S.; Rodriguez-Moscoso, J. J.
1985-01-01
A reference manual is provided for NESS, a simulation expert system. The manual gives user information on starting and operating the NASA expert simulation system (NESS). This expert system provides an intelligent interface to a generic simulation program for spacecraft attitude control problems. A menu of the functions the system can perform is provided; control repeatedly returns to this menu after each user request is executed.
Robot environment expert system
NASA Technical Reports Server (NTRS)
Potter, J. L.
1985-01-01
The Robot Environment Expert System uses a hexadecimal tree (hextree) data structure to model a complex robot environment in which not only the robot arm moves, but the robot itself and other objects may move as well. The hextree model allows dynamic updating, collision avoidance, and path planning over time to avoid moving objects.
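One way to read the hextree is as a four-dimensional (x, y, z, t) analogue of an octree, with 2^4 = 16 children per node, so that occupancy can be queried over time for collision avoidance. The sketch below follows that reading; it is an illustrative guess at the structure, not the system's actual implementation.

```python
# Hypothetical hextree sketch: a 4-D (x, y, z, t) spatial-temporal subdivision.

class HexTreeNode:
    def __init__(self, lo, hi, depth=0, max_depth=4):
        self.lo, self.hi = lo, hi        # opposite corners of a 4-D box
        self.depth, self.max_depth = depth, max_depth
        self.occupied = False
        self.children = None             # 16 children, built lazily

    def _index(self, p):
        """Which of the 16 children contains point p."""
        mid = [(l + h) / 2 for l, h in zip(self.lo, self.hi)]
        return sum((1 << d) for d, (x, m) in enumerate(zip(p, mid)) if x >= m)

    def _child_box(self, i):
        mid = [(l + h) / 2 for l, h in zip(self.lo, self.hi)]
        lo = [m if (i >> d) & 1 else l for d, (l, m) in enumerate(zip(self.lo, mid))]
        hi = [h if (i >> d) & 1 else m for d, (m, h) in enumerate(zip(mid, self.hi))]
        return lo, hi

    def insert(self, point):
        """Mark the leaf cell containing a space-time point as occupied."""
        if self.depth == self.max_depth:
            self.occupied = True
            return
        if self.children is None:
            self.children = [HexTreeNode(*self._child_box(i), self.depth + 1,
                                         self.max_depth) for i in range(16)]
        self.children[self._index(point)].insert(point)

    def collides(self, point):
        """True if the leaf cell containing the point is occupied."""
        node = self
        while node.children is not None:
            node = node.children[node._index(point)]
        return node.occupied
```

Dynamic updating then amounts to inserting (or clearing) occupied cells as objects move, and path planning queries `collides` at the space-time points along a candidate trajectory.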
Extraction of Capillary Non-perfusion from Fundus Fluorescein Angiogram
NASA Astrophysics Data System (ADS)
Sivaswamy, Jayanthi; Agarwal, Amit; Chawla, Mayank; Rani, Alka; Das, Taraprasad
Capillary non-perfusion (CNP) is a condition in diabetic retinopathy in which blood ceases to flow to certain parts of the retina, potentially leading to blindness. This paper presents a solution for automatically detecting and segmenting CNP regions from fundus fluorescein angiograms (FFAs). CNPs are modelled as valleys, and a novel technique based on an extrema pyramid is presented for trough-based valley detection. The obtained valley points are used to segment the desired CNP regions by employing a variance-based region-growing scheme. The proposed algorithm has been tested on 40 images and validated against expert-marked ground truth. We present the results of this testing and validation and compare the segmentation performance against two other methods. The performance of the proposed algorithm is presented as a receiver operating characteristic (ROC) curve; the area under the curve is 0.842, and the distance of the ROC from the ideal point (0,1) is 0.31. The proposed method for CNP segmentation was found to outperform the watershed [1] and heat-flow [2] based methods.
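A minimal sketch of variance-based region growing, one ingredient named above. The threshold, 4-neighbourhood, and acceptance rule here are assumptions for illustration, not the paper's exact scheme, and the extrema-pyramid valley detector is not reproduced.

```python
import numpy as np
from collections import deque

def grow_region(img, seed, var_max=25.0):
    """Variance-constrained region growing from a (dark) valley seed:
    a 4-neighbour is absorbed only if the region's intensity variance
    stays below var_max after adding it."""
    h, w = img.shape
    region = {seed}
    vals = [float(img[seed])]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (r + dr, c + dc)
            if 0 <= nb[0] < h and 0 <= nb[1] < w and nb not in region:
                trial = vals + [float(img[nb])]
                if np.var(trial) <= var_max:    # acceptance test
                    region.add(nb)
                    vals = trial
                    queue.append(nb)
    return region
```

Seeded at a detected valley point, the region expands through similarly dark pixels and stops at the bright perfused surround, where absorbing a pixel would blow up the variance.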
Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models
2017-01-01
We describe a fully data-driven model that learns to perform a retrosynthetic reaction prediction task, which is treated as a sequence-to-sequence mapping problem. The end-to-end trained model has an encoder–decoder architecture consisting of two recurrent neural networks, an approach that has previously shown great success in solving other sequence-to-sequence prediction tasks such as machine translation. The model is trained on 50,000 experimental reaction examples from the United States patent literature, which span 10 broad reaction types that are commonly used by medicinal chemists. We find that our model performs comparably with a rule-based expert system baseline model, and also overcomes certain limitations associated with rule-based expert systems and with any machine learning approach that contains a rule-based expert system component. Our model provides an important first step toward solving the challenging problem of computational retrosynthetic analysis. PMID:29104927
An expert system for the selection of building elements during architectural design
NASA Astrophysics Data System (ADS)
Alibaba, Halil Zafer
This thesis explains the development stages of an expert system for the evaluation and selection of building elements during the early stages of architectural design. The expert system, called BES, was produced after two prototypes were established. BES was tested on professional architects from both academia and the practical construction market of Northern Cyprus, and is intended to be used by experienced and inexperienced architects alike. The model covers the selection of all main types of building elements, such as retaining walls, foundations, external walls, internal walls, floors, external stairs, internal stairs, roofs, external chimneys, internal chimneys, windows, external doors, and internal doors, together with their sub-type building elements. The selection is achieved via the SMART methodology, based on performance requirements, and the expert system shell Exsys Corvid version 1.2.14 is used to structure the expert system. Computers are very important in today's world for their ability to handle vast amounts of data. Access to the model through the Internet makes it international and a useful design aid for architects. In addition, the decision-making feature of this model provides a suitable selection among numerous alternatives. The thesis also describes the development of BES and the experience gained through its use, and discusses further development of the model.
Garrard, Lili; Price, Larry R; Bott, Marjorie J; Gajewski, Byron J
2016-10-01
Item response theory (IRT) models provide an appropriate alternative to the classical ordinal confirmatory factor analysis (CFA) during the development of patient-reported outcome measures (PROMs). Current literature has identified the assessment of IRT model fit as both challenging and underdeveloped (Sinharay & Johnson, 2003; Sinharay, Johnson, & Stern, 2006). This study evaluates the performance of Ordinal Bayesian Instrument Development (OBID), a Bayesian IRT model with a probit link function approach, through applications in two breast cancer-related instrument development studies. The primary focus is to investigate an appropriate method for comparing Bayesian IRT models in PROMs development. An exact Bayesian leave-one-out cross-validation (LOO-CV) approach (Vehtari & Lampinen, 2002) is implemented to assess prior selection for the item discrimination parameter in the IRT model and subject content experts' bias (in a statistical sense and not to be confused with psychometric bias as in differential item functioning) toward the estimation of item-to-domain correlations. Results support the utilization of content subject experts' information in establishing evidence for construct validity when sample size is small. However, the incorporation of subject experts' content information in the OBID approach can be sensitive to the level of expertise of the recruited experts. More stringent efforts need to be invested in the appropriate selection of subject experts to efficiently use the OBID approach and reduce potential bias during PROMs development.
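For reference, the probit-link IRT family that OBID builds on can be illustrated with the dichotomous two-parameter case. This is a sketch of the model family, not the OBID implementation; the parameter names are generic.

```python
from math import erf, sqrt

def probit_irt(theta, a, b):
    """Two-parameter probit IRT: probability of endorsing an item given
    latent trait theta, item discrimination a, and item difficulty b.
    Uses Phi(x) = 0.5 * (1 + erf(x / sqrt(2))) for the standard normal CDF."""
    return 0.5 * (1.0 + erf(a * (theta - b) / sqrt(2.0)))
```

The ordinal (graded) case used for patient-reported outcomes extends this by differencing cumulative probit curves across adjacent response-category thresholds; the discrimination parameter a is the one whose prior selection the LOO-CV comparison above assesses.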
An Open Source Framework for Coupled Hydro-Hydrogeo-Chemical Systems in Catchment Research
NASA Astrophysics Data System (ADS)
Delfs, J.; Sachse, A.; Gayler, S.; Grathwohl, P.; He, W.; Jang, E.; Kalbacher, T.; Klein, C.; Kolditz, O.; Maier, U.; Priesack, E.; Rink, K.; Selle, B.; Shao, H.; Singh, A. K.; Streck, T.; Sun, Y.; Wang, W.; Walther, M.
2013-12-01
This poster presents an open-source framework designed to assist water scientists in the study of catchment hydraulic functions with associated chemical processes, e.g., contaminant degradation and plant nutrient turnover. The model successfully calculates the feedbacks between surface water, subsurface water, and air in standard benchmarks. In specific model applications to heterogeneous catchments, subsurface water is driven by density variations and runs through double porous media. Software codes of water science are tightly coupled by iteration, namely the Storm Water Management Model (SWMM) for urban runoff, Expert-N for simulating water fluxes and nutrient turnover in agricultural and forested soils, and OpenGeoSys (OGS) for groundwater. The coupled model calculates flow of hydrostatic shallow water over the land surface with finite volume and finite difference methods. The flow equations for water in the porous subsurface are discretized in space with finite elements. Chemical components are transferred through 1D, 2D, or 3D watershed representations with advection-dispersion solvers or, as an alternative, random-walk particle tracking. A transport solver can additionally be run in sequence with a chemical solver, e.g., PHREEQ-C or BRNS. Besides coupled partial differential equations, the concept of hydrological response units is employed in simulations at regional scale with scarce data availability. In this case, a conceptual hydrological model, specifically the Jena Adaptable Modeling System (JAMS), passes groundwater recharge through a software interface into OGS, which solves the partial differential equations of groundwater flow. Most components of the modeling framework are open source and can be modified for individual purposes. Applications range from temperate climate regions in Germany (Ammer catchment and Hessian Ried) to arid regions in the Middle East (Oman and the Dead Sea).
Some of the presented examples originate from intensively monitored research sites of the WESS research centre and the monitoring initiative TERENO. Other examples originate from the IWAS project on integrated water resources management. The model applications are primarily concerned with groundwater resources, which are endangered by overexploitation, intrusion of saltwater, and nitrate loads.
Advanced Technology Training System on Motor-Operated Valves
NASA Technical Reports Server (NTRS)
Wiederholt, Bradley J.; Widjaja, T. Kiki; Yasutake, Joseph Y.; Isoda, Hachiro
1993-01-01
This paper describes how features from the field of Intelligent Tutoring Systems are applied to the Motor-Operated Valve (MOV) Advanced Technology Training System (ATTS). The MOV ATTS is a training system developed at Galaxy Scientific Corporation for the Central Research Institute of Electric Power Industry in Japan and the Electric Power Research Institute in the United States. The MOV ATTS combines traditional computer-based training approaches with system simulation, integrated expert systems, and student and expert modeling. The primary goal of the MOV ATTS is to reduce human errors that occur during MOV overhaul and repair. The MOV ATTS addresses this goal by providing basic operational information of the MOV, simulating MOV operation, providing troubleshooting practice of MOV failures, and tailoring this training to the needs of each individual student. The MOV ATTS integrates multiple expert models (functional and procedural) to provide advice and feedback to students. The integration also provides expert model validation support to developers. Student modeling is supported by two separate student models: one model registers and updates the student's current knowledge of basic MOV information, while another model logs the student's actions and errors during troubleshooting exercises. These two models are used to provide tailored feedback to the student during the MOV course.
Woodward, Andrea; Torregrosa, Alicia; Madej, Mary Ann; Reichmuth, Michael; Fong, Darren
2014-01-01
The system dynamics model described in this report is the result of a collaboration between U.S. Geological Survey (USGS) scientists and National Park Service (NPS) San Francisco Bay Area Network (SFAN) staff, whose goal was to develop a methodology to integrate inventory and monitoring data to better understand ecosystem dynamics and trends using salmon in Olema Creek, Marin County, California, as an example case. The SFAN began monitoring multiple life stages of coho salmon (Oncorhynchus kisutch) in Olema Creek during 2003 (Carlisle and others, 2013), building on previous monitoring of spawning fish and redds. They initiated water-quality and habitat monitoring, and had access to flow and weather data from other sources. This system dynamics model of the freshwater portion of the coho salmon life cycle in Olema Creek integrated 8 years of existing monitoring data, literature values, and expert opinion to investigate potential factors limiting survival and production, identify data gaps, and improve monitoring and restoration prescriptions. A system dynamics model is particularly effective when (1) data are insufficient in time series length and/or measured parameters for a statistical or mechanistic model, and (2) the model must be easily accessible by users who are not modelers. These characteristics helped us meet the following overarching goals for this model: Summarize and synthesize NPS monitoring data with data and information from other sources to describe factors and processes affecting freshwater survival of coho salmon in Olema Creek. Provide a model that can be easily manipulated to experiment with alternative values of model parameters and novel scenarios of environmental drivers. Although the model describes the ecological dynamics of Olema Creek, these dynamics are structurally similar to numerous other coastal streams along the California coast that also contain anadromous fish populations. 
The model developed for Olema can be used, at least as a starting point, for other watersheds. This report describes each of the model elements with sufficient detail to guide the primary target audience, the NPS resource specialist, to run the model, interpret the results, change the input data to explore hypotheses, and ultimately modify and improve the model. Running the model and interpreting the results does not require modeling expertise on the part of the user. Additional companion publications will highlight other aspects of the model, such as its development, the rationale behind the methodological approach, scenario testing, and discussions of its use. System dynamics models consist of three basic elements: stocks, flows, and converters. Stocks are measurable quantities that can change over time, such as animal populations. Flows are any processes or conditions that change the quantity in a stock over time (Ford, 1999), are expressed in the model as a rate of change, and are diagrammed as arrows to or from stocks. Converters are processes or conditions that change the rate of flows. A converter is connected to a flow with an arrow indicating that it alters the rate of change. Anything that influences the rate of change (such as different environmental conditions, other external factors, or feedbacks from other stocks or flows) is modeled as a converter. For example, the number of fish in a population is appropriately modeled as a stock. Mortality is modeled as a flow because it is a rate of change over time used to determine the number of fish in the population. The density-dependent effect on mortality is modeled as a converter because it influences the rate of mortality. Together, the flow and converter change the number, or stock, of juvenile coho. The instructions embedded in the stocks, flows, converters, and the sequence in which they are linked are processed by the simulation software with each completed sequence composing a model run.
At each modeled time step within the model run, the stock counts will go up, down, or stay the same based on the modeled flows and the influence of converters on those flows. The model includes a user-friendly interface to change model parameters, which allows park staff and others to conduct sensitivity analyses, incorporate future knowledge, and implement scenarios for various future conditions. The model structure incorporates place holders for relationships that we hypothesize are significant but data are currently lacking. Future climate scenarios project stream temperatures higher than any that have ever been recorded at Olema Creek. Exploring climate change impacts on coho survival is a high priority for park staff, therefore the model provides the user with the option to experiment with hypothesized effects and to incorporate effects based on future observations.
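The stock/flow/converter pattern described above can be sketched directly, using the juvenile-coho example: the stock is the fish count, mortality is a flow, and a density-dependent converter raises mortality as the stock grows. All numbers are illustrative only, not Olema Creek parameters.

```python
# Sketch of a stock-flow-converter simulation (Euler integration per step),
# in the style of system dynamics tools. Parameter values are invented.

def simulate(juveniles=1000.0, base_mortality=0.01, capacity=2000.0,
             dt=1.0, steps=100):
    history = [juveniles]
    for _ in range(steps):
        # Converter: density dependence scales the mortality rate.
        density_factor = 1.0 + juveniles / capacity
        # Flow: fish leaving the stock during this time step.
        deaths = base_mortality * density_factor * juveniles * dt
        # Stock update: the flow changes the stock at each time step.
        juveniles = max(juveniles - deaths, 0.0)
        history.append(juveniles)
    return history
```

Each pass through the loop is one modeled time step: the converter alters the flow rate, the flow alters the stock, and the stock feeds back into the converter at the next step.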
ERIC Educational Resources Information Center
Warner, Zachary B.
2013-01-01
This study compared an expert-based cognitive model of domain mastery with student-based cognitive models of task performance for Integrated Algebra. Interpretations of student test results are limited by experts' hypotheses of how students interact with the items. In reality, the cognitive processes that students use to solve each item may be…
ERIC Educational Resources Information Center
Roduta Roberts, Mary; Alves, Cecilia B.; Chu, Man-Wai; Thompson, Margaret; Bahry, Louise M.; Gotzmann, Andrea
2014-01-01
The purpose of this study was to evaluate the adequacy of three cognitive models, one developed by content experts and two generated from student verbal reports for explaining examinee performance on a grade 3 diagnostic mathematics test. For this study, the items were developed to directly measure the attributes in the cognitive model. The…
ERIC Educational Resources Information Center
Wu, Hsin-Kai
2010-01-01
The purposes of this article are to present the design of a technology-enhanced learning environment (Air Pollution Modeling Environment [APoME]) that was informed by a novice-expert analysis and to discuss high school students' development of modelling practices in the learning environment. APoME was designed to help high school students…
NASA Technical Reports Server (NTRS)
Ungar, Eugene K.; Richards, W. Lance
2015-01-01
The aircraft-based Stratospheric Observatory for Infrared Astronomy (SOFIA) is a platform for multiple infrared astronomical observation experiments. These experiments carry sensors cooled to liquid helium temperatures. The liquid helium supply is contained in large (i.e., 10 liters or more) vacuum-insulated dewars. Should the dewar vacuum insulation fail, the inrushing air will condense and freeze on the dewar wall, resulting in a large heat flux on the dewar's contents. The heat flux results in a rise in pressure and the actuation of the dewar pressure relief system. A previous NASA Engineering and Safety Center (NESC) assessment provided recommendations for the wall heat flux that would be expected from a loss of vacuum and detailed an appropriate method to use in calculating the maximum pressure that would occur in a loss of vacuum event. This method involved building a detailed supercritical helium compressible flow thermal/fluid model of the vent stack and exercising the model over the appropriate range of parameters. The experimenters designing science instruments for SOFIA are not experts in compressible supercritical flows and do not generally have access to the thermal/fluid modeling packages that are required to build detailed models of the vent stacks. Therefore, the SOFIA Program engaged the NESC to develop a simplified methodology to estimate the maximum pressure in a liquid helium dewar after the loss of vacuum insulation. The method would allow the university-based science instrument development teams to conservatively determine the cryostat's vent neck sizing during preliminary design of new SOFIA Science Instruments. This report details the development of the simplified method, the method itself, and the limits of its applicability. The simplified methodology provides an estimate of the dewar pressure after a loss of vacuum insulation that can be used for the initial design of the liquid helium dewar vent stacks. 
However, since it is not an exact tool, final verification of the dewar pressure vessel design requires a complete, detailed real-fluid compressible flow model of the vent stack. The wall heat flux resulting from a loss of vacuum insulation increases the dewar pressure, which actuates the pressure relief mechanism and results in high-speed flow through the dewar vent stack. At high pressures, the flow can be choked at the vent stack inlet, at the exit, or at an intermediate transition or restriction. During previous SOFIA analyses, it was observed that there was generally a readily identifiable section of the vent stack that would limit the flow, e.g., a small-diameter entrance or an orifice. It was also found that when the supercritical helium was approximated as an ideal gas at the dewar condition, the calculated mass flow rate based on choking at the limiting entrance or transition was less than the mass flow rate calculated using the detailed real-fluid model. Using this lower mass flow rate yields a conservative prediction of the dewar's wall heat flux capability. The simplified method of the current work was developed by building on this observation.
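The conservative ideal-gas choking approximation described above can be sketched numerically. The helium properties are standard; the vent geometry and dewar conditions below are illustrative assumptions, not SOFIA instrument values:

```python
import math

def choked_mass_flow(area_m2, p0_pa, t0_k, gamma=5.0 / 3.0, r_specific=2077.0):
    """Ideal-gas choked (sonic) mass flow through the limiting section.

    area_m2    -- flow area of the limiting entrance or restriction [m^2]
    p0_pa      -- stagnation (dewar) pressure [Pa]
    t0_k       -- stagnation temperature [K]
    gamma      -- ratio of specific heats (5/3 for monatomic helium)
    r_specific -- specific gas constant [J/(kg K)] (~2077 for helium)
    """
    crit = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return area_m2 * p0_pa * math.sqrt(gamma / (r_specific * t0_k)) * crit

# Illustrative case: 10 mm diameter vent neck, 2 bar dewar pressure, 6 K helium
area = math.pi * (0.010 / 2.0) ** 2
mdot = choked_mass_flow(area, 2.0e5, 6.0)  # conservative (low) mass flow [kg/s]
```

Because the ideal-gas choked flow underestimates the real-fluid flow rate, sizing the vent against this value is conservative in the sense described in the report.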
Mavandadi, Sam; Feng, Steve; Yu, Frank; Dimitrov, Stoyan; Nielsen-Saines, Karin; Prescott, William R; Ozcan, Aydogan
2012-01-01
We propose a methodology for digitally fusing diagnostic decisions made by multiple medical experts in order to improve the accuracy of diagnosis. Toward this goal, we report an experimental study involving nine experts, where each one was given more than 8,000 digital microscopic images of individual human red blood cells and asked to identify malaria-infected cells. The results of this experiment reveal that even highly trained medical experts are not always self-consistent in their diagnostic decisions and that there exists a fair level of disagreement among experts, even for binary decisions (i.e., infected vs. uninfected). To tackle this general medical diagnosis problem, we propose a probabilistic algorithm to fuse the decisions made by trained medical experts to robustly achieve higher levels of accuracy than individual experts making such decisions. By modelling the decisions of experts as a three-component mixture model and solving for the underlying parameters using the Expectation Maximisation algorithm, we demonstrate the efficacy of our approach, which significantly improves the overall diagnostic accuracy for malaria-infected cells. Additionally, we present a mathematical framework for performing 'slide-level' diagnosis by using individual 'cell-level' diagnosis data, shedding more light on the statistical rules that should govern routine practice in the examination of, e.g., thin blood smear samples. This framework could be generalized for various other tele-pathology needs, and can be used by trained experts within an efficient tele-medicine platform.
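The fusion step can be illustrated with a small EM sketch in the spirit of the paper's mixture approach. This binary, Dawid-Skene-style variant with simulated expert votes is an assumption for illustration, not the authors' exact three-component model:

```python
import numpy as np

def em_fuse(votes, n_iter=50):
    """EM fusion of binary expert votes (1 = infected), (n_items, n_experts)."""
    n, m = votes.shape
    pi = 0.5                   # prevalence of "infected"
    sens = np.full(m, 0.7)     # per-expert P(vote=1 | truly infected)
    spec = np.full(m, 0.7)     # per-expert P(vote=0 | truly uninfected)
    for _ in range(n_iter):
        # E-step: posterior P(z=1 | votes), assuming conditionally independent experts
        log_p1 = np.log(pi) + votes @ np.log(sens) + (1 - votes) @ np.log(1 - sens)
        log_p0 = np.log(1 - pi) + (1 - votes) @ np.log(spec) + votes @ np.log(1 - spec)
        w = 1.0 / (1.0 + np.exp(log_p0 - log_p1))
        # M-step: re-estimate prevalence and per-expert reliabilities
        pi = w.mean()
        sens = np.clip((votes * w[:, None]).sum(0) / w.sum(), 1e-3, 1 - 1e-3)
        spec = np.clip(((1 - votes) * (1 - w)[:, None]).sum(0) / (1 - w).sum(),
                       1e-3, 1 - 1e-3)
    return w, sens, spec

# Simulated panel: 9 experts of varying accuracy, 200 cells, 20% infected
rng = np.random.default_rng(0)
truth = (rng.random(200) < 0.2).astype(int)
acc = rng.uniform(0.75, 0.95, 9)
votes = np.where(rng.random((200, 9)) < acc, truth[:, None], 1 - truth[:, None])
post, sens_hat, spec_hat = em_fuse(votes)
fused = (post > 0.5).astype(int)
```

The estimated sensitivities and specificities implicitly down-weight less reliable experts, which is where the gain over simple majority voting comes from.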
Crossword expertise as recognitional decision making: an artificial intelligence approach
Thanasuan, Kejkaew; Mueller, Shane T.
2014-01-01
The skills required to solve crossword puzzles involve two important aspects of lexical memory: semantic information in the form of clues that indicate the meaning of the answer, and orthographic patterns that constrain the possibilities but may also provide hints to possible answers. Mueller and Thanasuan (2013) proposed a model accounting for the simple memory access processes involved in solving individual crossword clues, but expert solvers also bring additional skills and strategies to bear on solving complete puzzles. In this paper, we developed a computational model of crossword solving that incorporates strategic and other factors, and is capable of solving crossword puzzles in a human-like fashion, in order to understand the complete set of skills needed to solve a crossword puzzle. We compare our models to human expert and novice solvers to investigate how different strategic and structural factors in crossword play impact overall performance. Results reveal that expert crossword solving relies heavily on fluent semantic memory search and retrieval, which appear to allow experts to take better advantage of orthographic-route solutions, and that experts employ strategies that enable them to use orthographic information. Furthermore, other processes central to traditional AI models (error correction and backtracking) appear to be of less importance for human players. PMID:25309483
Hosseinzade, Zeinab; Pagsuyoin, Sheree A; Ponnambalam, Kumaraswamy; Monem, Mohammad J
2017-12-01
The stiff competition for water between agriculture and non-agricultural production sectors makes it necessary to have effective management of irrigation networks in farms. However, the process of selecting flow control structures in irrigation networks is highly complex and involves different levels of decision makers. In this paper, we apply multi-attribute decision making (MADM) methodology to develop a decision analysis (DA) framework for evaluating, ranking and selecting check and intake structures for irrigation canals. The DA framework consists of identifying relevant attributes for canal structures, developing a robust scoring system for alternatives, identifying a procedure for data quality control, and identifying a MADM model for the decision analysis. An application is illustrated through an analysis, for automation purposes, of the Qazvin irrigation network, one of the oldest and most complex irrigation networks in Iran. A survey questionnaire designed based on the decision framework was distributed to experts, managers, and operators of the Qazvin network and to experts from the Ministry of Power in Iran. Five check structures and four intake structures were evaluated. A decision matrix was generated from the average scores collected from the survey, and was subsequently solved using the TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) method. To identify the most critical structure attributes for the selection process, optimal attribute weights were calculated using the entropy method. For check structures, results show that the duckbill weir is the preferred structure while the pivot weir is the least preferred. Use of the duckbill weir can potentially address the problem with existing Amil gates where manual intervention is required to regulate water levels during periods of flow extremes. For intake structures, the Neyrpic® gate and constant head orifice are the most and least preferred alternatives, respectively. 
Some advantages of the Neyrpic® gate are ease of operation and capacity to measure discharge flows. Overall, the application to the Qazvin irrigation network demonstrates the utility of the proposed DA framework in selecting appropriate structures for regulating water flows in irrigation canals. This framework systematically aids the decision process by capturing decisions made at various levels (individual farmers to high-level management). It can be applied to other cases where a new irrigation network is being designed, or where changes in irrigation structures need to be identified to improve flow control in existing networks. Copyright © 2017 Elsevier B.V. All rights reserved.
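The TOPSIS ranking step used above can be sketched as follows; the decision matrix, weights, and attribute directions are made-up illustrative values, not the survey scores from the Qazvin study:

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives with TOPSIS.

    scores  -- (n_alternatives, n_attributes) decision matrix
    weights -- attribute weights summing to 1
    benefit -- boolean per attribute; True = larger is better
    Returns closeness coefficients in [0, 1]; higher = closer to the ideal.
    """
    norm = scores / np.linalg.norm(scores, axis=0)      # vector normalisation
    v = norm * weights                                  # weighted normalised matrix
    ideal = np.where(benefit, v.max(0), v.min(0))       # positive ideal solution
    anti = np.where(benefit, v.min(0), v.max(0))        # negative ideal solution
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)

# Hypothetical 5 check structures x 3 attributes (ease of operation,
# regulation accuracy, cost); cost is a "smaller is better" attribute
m = np.array([[7., 8., 3.],
              [5., 6., 2.],
              [8., 7., 6.],
              [4., 5., 4.],
              [6., 9., 5.]])
cc = topsis(m, np.array([0.4, 0.4, 0.2]), np.array([True, True, False]))
ranking = np.argsort(-cc)  # best alternative first
```

Entropy weighting, as used in the study, would replace the fixed weight vector with weights derived from the dispersion of each attribute column.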
Groff, Shannon C.; Loftin, Cynthia S.; Drummond, Frank; Bushmann, Sara; McGill, Brian J.
2016-01-01
Non-native honeybees historically have been managed for crop pollination; however, recent population declines draw attention to pollination services provided by native bees. We applied the InVEST Crop Pollination model, developed to predict native bee abundance from habitat resources, in Maine's wild blueberry crop landscape. We evaluated model performance with parameters informed by four approaches: 1) expert opinion; 2) sensitivity analysis; 3) sensitivity-analysis-informed model optimization; and 4) simulated annealing (uninformed) model optimization. Uninformed optimization improved model performance by 29% compared to the expert-opinion-informed model, while sensitivity-analysis-informed optimization improved model performance by 54%. This suggests that expert opinion may not result in the best parameter values for the InVEST model. The proportion of deciduous/mixed forest within 2000 m of a blueberry field also reliably predicted native bee abundance in blueberry fields; however, the InVEST model provides an efficient tool to estimate bee abundance beyond the field perimeter.
ERIC Educational Resources Information Center
Schellekens, Ad; Paas, Fred; Verbraeck, Alexander; van Merrienboer, Jeroen J. G.
2010-01-01
In a preceding case study, a process-focused demand-driven approach for organising flexible educational programmes in higher professional education (HPE) was developed. Operations management and instructional design contributed to designing a flexible educational model by means of discrete-event simulation. Educational experts validated the model…
The User Interface: A Hypertext Model Linking Art Objects and Related Information.
ERIC Educational Resources Information Center
Moline, Judi
This report presents a model combining the emerging technologies of hypertext and expert systems. Hypertext is relatively unexplored but promises an innovative approach to information retrieval. In contrast, expert systems have been used experimentally in many different application areas ranging from medical diagnosis to oil exploration. The…
Expert Supervisors' Priorities When Working with Easy and Challenging Supervisees
ERIC Educational Resources Information Center
Kemer, Gulsah; Borders, L. DiAnne; Yel, Nedim
2017-01-01
Using Kemer, Borders, and Willse's (2014) concept map as a conceptual model, the authors aimed to understand expert supervisors' priorities with their easy and challenging supervisees. Experts' priorities with easy and challenging supervisees were represented in different parts of the concept map, and they seemed to individualize their work with…
POTW Expert is a PC-based software program modeled after EPA's Handbook Retrofitting POTWs (EPA-625/6-89/020) (formerly, Handbook for Improving POTW Performance Using the Composite Correction Program Approach). POTW Expert assists POTW owners and operators, state and local regu...
Expert Anticipatory Skill in Striking Sports: A Review and a Model
ERIC Educational Resources Information Center
Muller, Sean; Abernethy, Bruce
2012-01-01
Expert performers in striking sports can hit objects moving at high speed with incredible precision. Exceptionally well developed anticipation skills are necessary to cope with the severe constraints on interception. In this paper, we provide a review of the empirical evidence regarding expert interception in striking sports and propose a…
SWAN: An expert system with natural language interface for tactical air capability assessment
NASA Technical Reports Server (NTRS)
Simmons, Robert M.
1987-01-01
SWAN is an expert system and natural language interface for assessing the war fighting capability of Air Force units in Europe. The expert system is an object oriented knowledge based simulation with an alternate worlds facility for performing what-if excursions. Responses from the system take the form of generated text, tables, or graphs. The natural language interface is an expert system in its own right, with a knowledge base and rules which understand how to access external databases, models, or expert systems. The distinguishing feature of the Air Force expert system is its use of meta-knowledge to generate explanations in the frame and procedure based environment.
Liu, Fengchen; Porco, Travis C.; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K.; Bailey, Robin L.; Keenan, Jeremy D.; Solomon, Anthony W.; Emerson, Paul M.; Gambhir, Manoj; Lietman, Thomas M.
2015-01-01
Background: Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. Methods: The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts' opinions, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon's signed-rank statistic. Findings: Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher's information. Each individual expert's forecast was poorer than the sum of experts. Interpretation: Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although they would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. PMID:26302380
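A stochastic SIS forecast of the kind compared in this study can be sketched as follows; the rate constants, treatment coverage, and community size are illustrative assumptions, not parameters fitted in the trials:

```python
import numpy as np

def sis_forecast(prev0, beta, gamma, coverage, efficacy,
                 months, n_pop=500, n_sims=1000, seed=1):
    """Stochastic SIS forecast of community infection prevalence.

    prev0      -- starting prevalence of infection
    beta,gamma -- monthly transmission and recovery rate constants
    Mass treatment with probability coverage*efficacy clears infections at month 0.
    Returns an array of simulated prevalences at the forecast horizon.
    """
    rng = np.random.default_rng(seed)
    i = rng.binomial(n_pop, prev0, n_sims)          # infected count per simulation
    i = i - rng.binomial(i, coverage * efficacy)    # mass azithromycin round
    for _ in range(months):
        s = n_pop - i
        new_inf = rng.binomial(s, 1 - np.exp(-beta * i / n_pop))
        recov = rng.binomial(i, 1 - np.exp(-gamma))
        i = i + new_inf - recov
    return i / n_pop

prev = sis_forecast(prev0=0.3, beta=0.4, gamma=0.5,
                    coverage=0.8, efficacy=0.95, months=6)
forecast_mean = prev.mean()  # point forecast; spread of `prev` gives uncertainty
```

Scoring such a forecast by the likelihood of the observed prevalence, as the trials did, amounts to evaluating the empirical distribution of `prev` at the observed value.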
Log-Linear Modeling of Agreement among Expert Exposure Assessors
Hunt, Phillip R.; Friesen, Melissa C.; Sama, Susan; Ryan, Louise; Milton, Donald
2015-01-01
Background: Evaluation of expert assessment of exposure depends, in the absence of a validation measurement, upon measures of agreement among the expert raters. Agreement is typically measured using Cohen's kappa statistic; however, there are some well-known limitations to this approach. We demonstrate an alternate method that uses log-linear models designed to model agreement. These models contain parameters that distinguish between exact agreement (diagonals of the agreement matrix) and non-exact associations (off-diagonals). In addition, they can incorporate covariates to examine whether agreement differs across strata. Methods: We applied these models to evaluate agreement among expert ratings of exposure to sensitizers (none, likely, high) in a study of occupational asthma. Results: Traditional analyses using weighted kappa suggested potential differences in agreement by blue-collar/white-collar jobs and office/non-office jobs, but not case/control status. However, the evaluation of the covariates and their interaction terms in log-linear models found no differences in agreement with these covariates and provided evidence that the differences observed using kappa were the result of marginal differences in the distribution of ratings rather than differences in agreement. Differences in agreement were predicted across the exposure scale, with the 'likely' (moderately exposed) category more difficult for the experts to differentiate from the highly exposed category than from the unexposed category. Conclusions: The log-linear models provided valuable information about patterns of agreement and the structure of the data that were not revealed in analyses using kappa. The models' lack of dependence on marginal distributions and the ease of evaluating covariates allow reliable detection of observational bias in exposure data. PMID:25748517
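For reference, the kappa baseline that the log-linear approach improves on is straightforward to compute; the two raters' scores below are hypothetical, not data from the asthma study:

```python
import numpy as np

def cohens_kappa(r1, r2, n_cat):
    """Cohen's kappa for two raters over categories 0..n_cat-1."""
    conf = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):
        conf[a, b] += 1
    n = conf.sum()
    po = np.trace(conf) / n                    # observed agreement (diagonal)
    pe = (conf.sum(1) @ conf.sum(0)) / n**2    # chance agreement from the margins
    return (po - pe) / (1 - pe)

# Two experts rating 10 jobs on the 3-point sensitizer scale (none/likely/high)
r1 = [0, 0, 1, 2, 2, 0, 1, 1, 0, 2]
r2 = [0, 0, 1, 2, 1, 0, 1, 2, 0, 2]
kappa = cohens_kappa(r1, r2, 3)
```

Note the dependence on the marginal totals in `pe`: two panels with identical agreement patterns but different rating distributions can yield different kappas, which is precisely the limitation the log-linear models avoid.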
Localized Smart-Interpretation
NASA Astrophysics Data System (ADS)
Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom
2014-05-01
The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that much of the information available from the geophysical surveys goes unexploited, which is a problem, because the resulting geological model does not fulfill its full potential and hence is less trustworthy. We suggest an approach to geological modeling that 1. allows all geophysical data to be considered when building the geological model, 2. is fast, and 3. allows quantification of the geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as the depth to the base of a groundwater reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretation, [m1, m2, ...]. This makes it possible to quantify, through f(d,m), how the geological expert performs interpretation. As the geological expert proceeds with interpreting, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases. 
When a model f(d,m) has successfully been inferred, we are able to simulate how the geological expert would perform an interpretation given some external information m, through f(d|m). We will demonstrate this method applied to geological interpretation and densely sampled airborne electromagnetic data. In short, our goal is to build a statistical model describing how a geological expert performs geological interpretation given some geophysical data. We then wish to use this statistical model to perform semi-automatic interpretation, wherever such geophysical data exist, in a manner consistent with the choices made by a geological expert. The benefits of such a statistical model are that 1. it provides a quantification of how a geological expert performs interpretation based on available diverse data, 2. all available geophysical information can be used, and 3. it allows much faster interpretation of large data sets.
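A minimal sketch of inferring f(d,m) from accumulated expert picks is given below, with an assumed linear form and synthetic data; the real system and data types (airborne EM soundings, inversion results) are of course much richer:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical training set: each row of M holds quantified information at a
# location the expert has interpreted (here an assumed resistivity value and
# surface elevation); d is the interpreted depth to the reservoir base there.
M = np.column_stack([rng.uniform(10, 100, 40),    # resistivity [ohm m]
                     rng.uniform(0, 50, 40)])     # elevation [m]
d = 5.0 + 0.3 * M[:, 0] - 0.1 * M[:, 1] + rng.normal(0, 1.0, 40)

# Fit f(d|m) as a linear model with an intercept; as the expert picks more
# points, rows are appended to M and d and the fit is simply recomputed,
# so the model sharpens as the interpretation session proceeds.
X = np.column_stack([np.ones(len(M)), M])
coef, *_ = np.linalg.lstsq(X, d, rcond=None)

def predict_depth(resistivity, elevation):
    """Simulate the expert's interpretation at an uninterpreted location."""
    return coef @ np.array([1.0, resistivity, elevation])

d_hat = predict_depth(55.0, 20.0)
```

Replacing the least-squares fit with a model that also returns a predictive variance would give the quantification of interpretation uncertainty that the abstract emphasises.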
Video Modeling by Experts with Video Feedback to Enhance Gymnastics Skills
Boyer, Eva; Miltenberger, Raymond G; Batsche, Catherine; Fogel, Victoria
2009-01-01
The effects of combining video modeling by experts with video feedback were analyzed with 4 female competitive gymnasts (7 to 10 years old) in a multiple baseline design across behaviors. During the intervention, after the gymnast performed a specific gymnastics skill, she viewed a video segment showing an expert gymnast performing the same skill and then viewed a video replay of her own performance of the skill. The results showed that all gymnasts demonstrated improved performance across three gymnastics skills following exposure to the intervention. PMID:20514194
NASA Astrophysics Data System (ADS)
Vacik, Harald; Huber, Patrick; Hujala, Teppo; Kurtilla, Mikko; Wolfslehner, Bernhard
2015-04-01
It is an integral element of the European understanding of sustainable forest management to foster the design and marketing of forest products, non-wood forest products (NWFPs) and services that go beyond the production of timber. Despite the relevance of NWFPs in Europe, forest management and planning methods have traditionally been tailored towards wood and wood products, because most forest management models and silviculture techniques were developed to ensure a sustained production of timber. Although several approaches exist which explicitly consider NWFPs as management objectives in forest planning, specific models are needed for the assessment of their production potential in different environmental contexts and for different management regimes. Empirical data supporting a comprehensive assessment of the potential of NWFPs are rare, thus making development of statistical models particularly problematic. However, the complex causal relationships between the sustained production of NWFPs, the available ecological resources, and the organizational and market potential of forest management regimes are well suited for knowledge-based expert models. Bayesian belief networks (BBNs) are a kind of probabilistic graphical model that has become very popular with practitioners and scientists, mainly due to the powerful probability theory involved, which makes BBNs suitable for dealing with a wide range of environmental problems. In this contribution we present the development of a Bayesian belief network to assess the potential of NWFPs for small-scale forest owners. A three-stage iterative process with stakeholder and expert participation was used to develop the Bayesian network within the frame of the StarTree Project. The group of participants varied across the stages of the modelling process. 
A core team, consisting of one technical expert and two domain experts, was responsible for the entire modelling process as well as for the first prototype of the network structure, including nodes and relationships. A top-level causal network was further decomposed into sub-level networks. Stakeholder participation, including a group of experts from different related subject areas, was used in model verification and validation. We demonstrate that BBNs can be used to transfer expert knowledge from science to practice and thus have the ability to contribute to improved problem understanding by non-expert decision makers for a sustainable production of NWFPs.
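The flavour of such a network can be conveyed with a toy two-parent example computed by enumeration; the node names and probabilities are invented for illustration and are not taken from the StarTree model:

```python
# Minimal two-parent Bayesian network: NWFP production potential depends on
# ecological resource availability and market access (all nodes binary).
p_resource = 0.6                      # P(resource = good)
p_market = 0.4                        # P(market = good)
# Conditional probability table: P(potential = high | resource, market)
cpt = {(True, True): 0.9, (True, False): 0.5,
       (False, True): 0.3, (False, False): 0.05}

def p_high_potential(market_evidence=None):
    """P(potential=high), optionally conditioned on an observed market state."""
    total = 0.0
    for r in (True, False):
        for m in (True, False):
            if market_evidence is not None and m != market_evidence:
                continue  # enumeration skips states inconsistent with evidence
            p_r = p_resource if r else 1 - p_resource
            p_m = p_market if m else 1 - p_market
            total += p_r * p_m * cpt[(r, m)]
    if market_evidence is not None:
        total /= p_market if market_evidence else 1 - p_market
    return total

prior = p_high_potential()          # marginal probability of high potential
posterior = p_high_potential(True)  # updated after observing good market access
```

In a real BBN tool the CPTs would be elicited from the domain experts in exactly the participatory process the abstract describes, and inference would handle many more nodes than this enumeration sketch.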
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; White, Amanda M.; Whitney, Paul D.
2013-06-04
The Multi-Source Signatures for Nuclear Programs project, part of Pacific Northwest National Laboratory's (PNNL) Signature Discovery Initiative, seeks to computationally capture expert assessment of multi-type information such as text, sensor output, imagery, or audio/video files, to assess nuclear activities through a series of Bayesian network (BN) models. These models incorporate knowledge from a diverse range of information sources in order to help assess a country's nuclear activities. The models span engineering topic areas, state-level indicators, and facility-specific characteristics. To illustrate the development, calibration, and use of BN models for multi-source assessment, we present a model that predicts a country's likelihood to participate in the international nuclear nonproliferation regime. We validate this model by examining the extent to which the model assists non-experts in arriving at conclusions similar to those provided by nuclear proliferation experts. We also describe the PNNL-developed software used throughout the lifecycle of the Bayesian network model development.
Staff exchange with Chemical Waste Management. Final project report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrer, B.J.; Barak, D.W.
1993-12-01
The original objective was transfer of PNL technology and expertise in computational chemistry and waste flow/treatment modeling to CWM. Identification and characterization of a broader portfolio of PNL's environmental remediation technologies with high potential for rapid application became the focus of the exchange, which included e-mail exchanges. Of the 14 technologies discussed, the following were identified as being of high interest to CWM: six-phase soil heating (in-situ heating), high-energy electrical corona, RAAS/ReOpt™ (remedial expert system), TEES™ (catalytic production of methane from biological wastes), and PST (a process for treating petroleum sludge). CWM's reorganization and downsizing reduced the potential benefits to industry, but a proposal for transfer and application of PST to Wheelabrator was made.
Leroux, Dorothée; Hezard, Nathalie; Lebreton, Aurélien; Bauters, Anne; Suchon, Pierre; de Maistre, Emmanuel; Biron, Christine; Huisse, Marie-Genevieve; Ternisien, Catherine; Voisin, Sophie; Gruel, Yves; Pouplard, Claire
2014-09-01
A rapid lateral flow immunoassay (LFIA) (STic Expert® HIT), recently developed for the diagnosis of heparin-induced thrombocytopenia (HIT), was evaluated in a prospective multicentre cohort of 334 consecutive patients. The risk of HIT was estimated by the 4Ts score as low, intermediate and high in 28.7%, 61.7% and 9.6% of patients, respectively. Definite HIT was diagnosed in 40 patients (12.0%) with positive results on both enzyme-linked immunosorbent assay (Asserachrom® HPIA IgG) and serotonin release assay. The inter-reader reproducibility of the results obtained was excellent (kappa ratio > 0.9). The negative predictive value of LFIA with plasma samples was 99.6%, with a negative likelihood ratio (LR) of 0.03, and was comparable to that of the particle gel immunoassay (H/PF4-PaGIA®) performed in 124 cases. Positive predictive value and positive LR were 44.4% and 5.87, respectively, and the results were similar for serum samples. The probability of HIT in intermediate-risk patients decreased from 11.2% to 0.4% when the LFIA result was negative and increased to 42.5% when it was positive. In conclusion, the STic Expert® HIT combined with the 4Ts score is a reliable tool to rule out the diagnosis of HIT. © 2014 John Wiley & Sons Ltd.
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
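The elicitation idea can be sketched with a simple Monte Carlo version of the data augmentation; the full method is a Bayesian hierarchical model, and the five mortality events and their probability vectors below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
causes = ["predation", "starvation", "vehicle"]

# Elicited observer beliefs: one probability vector per mortality event,
# e.g. "70% predation, 20% starvation, 10% vehicle" (hypothetical values)
elicited = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.5, 0.4, 0.1],
    [0.2, 0.1, 0.7],
    [0.9, 0.05, 0.05],
])

# Data augmentation by Monte Carlo: repeatedly sample a definite cause for each
# event from its elicited distribution and tally cause fractions; the spread
# across draws propagates the observer's uncertainty into the estimates.
draws = np.stack([
    np.bincount([rng.choice(3, p=p) for p in elicited], minlength=3)
    for _ in range(2000)
]) / len(elicited)

mean_frac = draws.mean(0)                             # cause-specific fractions
lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)    # uncertainty intervals
```

A traditional analysis would instead assign each event its single most likely cause, collapsing `elicited` to hard labels and, as the simulations in the paper show, understating the variability of the estimates.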
Elissen, Arianne M J; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk
2015-05-01
To support providers and commissioners in accurately assessing their local populations' health needs, this study produces an overview of Dutch predictive risk models for health care, focusing specifically on the type, combination and relevance of included determinants for achieving the Triple Aim (improved health, better care experience, and lower costs). We conducted a mixed-methods study combining document analyses, interviews and a Delphi study. Predictive risk models were identified based on a web search and expert input. Participating in the study were Dutch experts in predictive risk modelling (interviews; n=11) and experts in healthcare delivery, insurance and/or funding methodology (Delphi panel; n=15). Ten predictive risk models were analysed, comprising 17 unique determinants. Twelve were considered relevant by experts for estimating community health needs. Although some compositional similarities were identified between models, the combination and operationalisation of determinants varied considerably. Existing predictive risk models provide a good starting point, but optimally balancing resources and targeting interventions on the community level will likely require a more holistic approach to health needs assessment. Development of additional determinants, such as measures of people's lifestyle and social network, may require policies pushing the integration of routine data from different (healthcare) sources. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yun, S. H.; Chang, C.
2015-12-01
We performed numerical simulations with the VolcFlow model to determine the runout range of pyroclastic density currents generated on Mt. Baekdu by the collapse of an eruption column formed during an explosive Plinian eruption. Following expert advice from Dr. Karim Kelfoun, the developer of VolcFlow, we assumed that the most realistic way to simulate a sustained volcanic column is to modify the topography with a cone above the crater. We then set the radius and height of the cone, the volume of the pyroclastic flow, and the duration and simulation time according to the volcanic explosivity index (VEI). We also set the yield stress to 5,000 Pa, 10,000 Pa, and 15,000 Pa, and the basal friction angle to 3°, 5°, and 10°, respectively. The simulations yielded longest runout ranges of 2.3 km, 9.1 km, 14.4 km, 18.6 km, and 23.4 km for VEI 3 through VEI 7, respectively. These results provide important material for predicting the impact range of pyroclastic density currents and for minimizing the human and material damage that a future explosive eruption of Mt. Baekdu could cause. This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering Potential Volcanic Risk around Korea' [MPSS-NH-2015-81] from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
Utility of Squeeze Flow in the Food Industry
NASA Astrophysics Data System (ADS)
Huang, T. A.
2008-07-01
Squeeze flow for obtaining shear viscosity of Newtonian and non-Newtonian fluids has long been established in the literature. Rotational shear flows using a cone and plate, a set of parallel plates, or concentric cylinders all develop wall slip, shear fracture, or instability on food-related materials such as peanut butter or mayonnaise. Viscosity data obtained with any of these set-ups are suspect or potentially carry significant error, and are unreliable for supporting or predicting the textural differences perceived in consumer evaluation. An RMS-800 from Rheometrics Inc. was employed to conduct squeezing flow at constant speeds on a set of parallel plates. Viscosity data over a broad range of shear rates are compared between Hellmann's real mayonnaise (HRM) and light mayonnaise (HLM). The consistency and shear-thinning indices, as defined in the power-law model, were determined. HRM exhibits more pronounced shear thinning than HLM, yet the consistency of HRM is significantly higher. Sensory evaluation by a trained expert panel ranked the adhesiveness and cohesiveness of HLM as significantly higher. It appears that the degree of shear thinning is one of the key rheological parameters for predicting this difference in textural attributes. Error from non-parallelism between the two plates can significantly affect the accuracy of the measured viscosity, in particular the shear-thinning index; details are a subject for the next presentation. Nevertheless, the method proves fast, rugged, simple, and reliable, and can be developed as a QC tool.
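The power-law (Ostwald-de Waele) model referenced above relates apparent viscosity to shear rate through a consistency index K and a flow behavior (shear-thinning) index n. A minimal sketch, with illustrative parameter values (not the HRM/HLM measurements):

```python
def power_law_viscosity(shear_rate, K, n):
    """Apparent viscosity eta = K * gamma_dot**(n - 1).

    K : consistency index (Pa*s**n)
    n : flow behavior index (n < 1 indicates shear thinning)
    """
    return K * shear_rate ** (n - 1)

# Hypothetical parameters for a shear-thinning food paste.
K, n = 80.0, 0.35
low = power_law_viscosity(1.0, K, n)     # apparent viscosity at 1 1/s
high = power_law_viscosity(100.0, K, n)  # apparent viscosity at 100 1/s
# For n < 1 the apparent viscosity drops as shear rate increases.
```

A more shear-thinning material (smaller n) shows a steeper drop in apparent viscosity across the same shear-rate range, which is the behavior the abstract associates with HRM.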
Reusable rocket engine turbopump health monitoring system, part 3
NASA Technical Reports Server (NTRS)
Perry, John G.
1989-01-01
Degradation mechanisms and sensor identification/selection resulted in a list of degradation modes and a list of sensors utilized in the diagnosis of these degradation modes. The sensor list is divided into primary and secondary indicators of the corresponding degradation modes. The signal conditioning requirements are discussed, describing the methods of producing the Space Shuttle Main Engine (SSME) post-hot-fire test data to be utilized by the Health Monitoring System. Development of the diagnostic logic and algorithms is also presented. The knowledge engineering approach includes the knowledge acquisition effort, characterization of the expert's problem-solving strategy, conceptual definition of the form of the applicable knowledge base and rule base, and identification of an appropriate inferencing mechanism for the problem domain. The resulting logic flow graphs detail the diagnosis/prognosis procedure as followed by the experts. The nature and content of required support data and databases are also presented. The distinction between deep and shallow types of knowledge is identified. Computer coding of the Health Monitoring System is shown to follow the logical inferencing of the logic flow graphs/algorithms.
An expert system for planning and scheduling in a telerobotic environment
NASA Technical Reports Server (NTRS)
Ntuen, Celestine A.; Park, Eui H.
1991-01-01
A knowledge based approach to assigning tasks to multi-agents working cooperatively in jobs that require a telerobot in the loop was developed. The generality of the approach allows for such a concept to be applied in a nonteleoperational domain. The planning architecture known as the task oriented planner (TOP) uses the principle of flow mechanism and the concept of planning by deliberation to preserve and use knowledge about a particular task. The TOP is an open ended architecture developed with a NEXPERT expert system shell and its knowledge organization allows for indirect consultation at various levels of task abstraction. Considering that a telerobot operates in a hostile and nonstructured environment, task scheduling should respond to environmental changes. A general heuristic was developed for scheduling jobs with the TOP system. The technique is not to optimize a given scheduling criterion as in classical job and/or flow shop problems. For a teleoperation job schedule, criteria are situation dependent. A criterion selection is fuzzily embedded in the task-skill matrix computation. However, goal achievement with minimum expected risk to the human operator is emphasized.
Being an expert witness in geomorphology
NASA Astrophysics Data System (ADS)
Keller, Edward A.
2015-02-01
Gathering your own data and coming to your own conclusion through scientific research and discovery is the most important principle to remember when being an expert witness in geomorphology. You can only be questioned in deposition and trial in your area of expertise. You are qualified as an expert by education, knowledge, and experience. You will have absolutely nothing to fear from cross-examination if you are prepared and confident about your work. Being an expert witness requires good communication skills. When you make a presentation, speak clearly and avoid jargon, especially when addressing a jury. Keep in mind that when you take on a case that may eventually go to court as a lawsuit, the entire process, with appeals and so forth, can take several years. Therefore, being an expert may become a long-term commitment of your time and energy. You may be hired by either side in a dispute, but your job is the same - determine the scientific basis of the case and explain your scientific reasoning to the lawyers, the judge, and the jury. Your work, including pre-trial investigations, often determines what the case will be based on. The use of science in the discovery part of an investigation is demonstrated from a California case involving the Ventura River, where building of a flood control levee restricted flow to a narrower channel, increasing unit stream power as well as potential for bank erosion and landsliding.
Knowledge-based fault diagnosis system for refuse collection vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, CheeFai; Juffrizal, K.; Khalil, S. N.
The refuse collection vehicle is manufactured by a local vehicle body manufacturer. Currently, the company supplies six models of the waste compactor truck to the local authority as well as to waste management companies. The company faces difficulty acquiring knowledge from its expert when the expert is absent. To solve the problem, the knowledge from the expert can be stored in an expert system, which can provide the necessary support to the company when the expert is not available, and allows the implementation of the process and tools to be standardized and made more accurate. The knowledge input to the expert system is based on design guidelines and experience from the expert. This project highlights another application of the knowledge-based system (KBS) approach, in troubleshooting of the refuse collection vehicle production process. The main aim of the research is to develop a novel expert fault diagnosis system framework for the refuse collection vehicle.
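The core of a fault diagnosis KBS of the kind described above is a rule base mapping observed symptoms to candidate faults. A minimal sketch of that structure; the symptoms, rules, and fault names here are hypothetical illustrations, not the company's actual knowledge base:

```python
# Each rule: (set of required symptoms, diagnosed fault).
RULES = [
    ({"hydraulic_pressure_low", "pump_noise"}, "worn hydraulic pump"),
    ({"compactor_slow", "hydraulic_pressure_low"}, "hydraulic fluid leak"),
    ({"compactor_not_moving", "motor_running"}, "broken drive coupling"),
]

def diagnose(observed):
    """Return every fault whose symptom set is fully matched by the observations."""
    return [fault for symptoms, fault in RULES if symptoms <= observed]

faults = diagnose({"hydraulic_pressure_low", "pump_noise", "compactor_slow"})
```

A production system would add certainty factors, follow-up questions, and an explanation facility, but the symptom-to-fault matching above is the basic inference step such a KBS performs when the human expert is unavailable.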
NASA Technical Reports Server (NTRS)
Momoh, James; Chattopadhyay, Deb; Basheer, Omar Ali AL
1996-01-01
The space power system has two sources of energy: photo-voltaic blankets and batteries. The optimal power management problem on-board has two broad operations: off-line power scheduling to determine the load allocation schedule of the next several hours based on the forecast of load and solar power availability. The nature of this study puts less emphasis on speed requirement for computation and more importance on the optimality of the solution. The second category problem, on-line power rescheduling, is needed in the event of occurrence of a contingency to optimally reschedule the loads to minimize the 'unused' or 'wasted' energy while keeping the priority on certain type of load and minimum disturbance of the original optimal schedule determined in the first-stage off-line study. The computational performance of the on-line 'rescheduler' is an important criterion and plays a critical role in the selection of the appropriate tool. The Howard University Center for Energy Systems and Control has developed a hybrid optimization-expert systems based power management program. The pre-scheduler has been developed using a non-linear multi-objective optimization technique called the Outer Approximation method and implemented using the General Algebraic Modeling System (GAMS). The optimization model has the capability of dealing with multiple conflicting objectives viz. maximizing energy utilization, minimizing the variation of load over a day, etc. and incorporates several complex interaction between the loads in a space system. The rescheduling is performed using an expert system developed in PROLOG which utilizes a rule-base for reallocation of the loads in an emergency condition viz. shortage of power due to solar array failure, increase of base load, addition of new activity, repetition of old activity etc. 
Both modules handle decision making on battery charging and discharging, and on the allocation of loads, over a one-day time horizon divided into 10-minute intervals. The models have been extensively tested using a case study for Space Station Freedom, and results from that case study will be presented. Several future enhancements of the pre-scheduler and the 'rescheduler' have been outlined, including a graphic analyzer for the on-line module, probabilistic considerations, and inclusion of the spatial location of the loads and their connectivity using a direct current (DC) load flow model.
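The on-line rescheduling step described above, reallocating loads under a power shortfall while keeping priority on certain loads, can be illustrated with a simple greedy sketch. The actual system uses a PROLOG rule base; the priority scheme and load names here are hypothetical:

```python
def reschedule(loads, available_power):
    """Greedy reallocation: keep the highest-priority loads first until the
    available power for the interval is exhausted.

    loads : list of (name, power_demand_kW, priority), where a lower
            priority number means more critical.
    Returns (kept, shed) lists of load names.
    """
    kept, shed, remaining = [], [], available_power
    for name, demand, _ in sorted(loads, key=lambda l: l[2]):
        if demand <= remaining:
            kept.append(name)
            remaining -= demand
        else:
            shed.append(name)
    return kept, shed

# Hypothetical loads during a solar-array failure: (name, kW, priority).
loads = [("life_support", 4.0, 0), ("experiment_A", 3.0, 2),
         ("comms", 1.5, 1), ("experiment_B", 2.5, 3)]
kept, shed = reschedule(loads, available_power=6.0)
```

A rule-based rescheduler additionally minimizes disturbance to the original optimal schedule; a greedy pass like this captures only the priority-preservation aspect.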
Measuring novices' field mapping abilities using an in-class exercise based on expert task analysis
NASA Astrophysics Data System (ADS)
Caulkins, J. L.
2010-12-01
We are interested in developing a model of expert-like behavior for improving the teaching methods of undergraduate field geology. Our aim is to assist students in mastering the process of field mapping more efficiently and effectively and to improve their ability to think creatively in the field. To examine expert-mapping behavior, a cognitive task analysis was conducted with expert geologic mappers in an attempt to define the process of geologic mapping (i.e. to understand how experts carry out geological mapping). The task analysis indicates that expert mappers have a wealth of geologic scenarios at their disposal that they compare against examples seen in the field, experiences that most undergraduate mappers will not have had. While presenting students with many geological examples in class may increase their understanding of geologic processes, novices still struggle when presented with a novel field situation. Based on the task analysis, a short (45-minute) paper-map-based exercise was designed and tested with 14 pairs of 3rd year geology students. The exercise asks students to generate probable geologic models based on a series of four (4) data sets. Each data set represents a day’s worth of data; after the first “day,” new sheets simply include current and previously collected data (e.g. “Day 2” data set includes data from “Day 1” plus the new “Day 2” data). As the geologic complexity increases, students must adapt, reject or generate new geologic models in order to fit the growing data set. Preliminary results of the exercise indicate that students who produced more probable geologic models, and produced higher ratios of probable to improbable models, tended to go on to do better on the mapping exercises at the 3rd year field school. These results suggest that those students with more cognitively available geologic models may be more able to use these models in field settings than those who are unable to draw on these models for whatever reason. 
Giving students practice at generating geologic models to explain data may be useful in preparing our students for field mapping exercises.
Ultrasound detection of simulated intra-ocular foreign bodies by minimally trained personnel.
Sargsyan, Ashot E; Dulchavsky, Alexandria G; Adams, James; Melton, Shannon; Hamilton, Douglas R; Dulchavsky, Scott A
2008-01-01
To test the ability of non-expert ultrasound operators of divergent backgrounds to detect the presence, size, location, and composition of foreign bodies in an ocular model. High school students (N = 10) and NASA astronauts (N = 4) completed a brief ultrasound training session which focused on basic ultrasound principles and the detection of foreign bodies. The operators used portable ultrasound devices to detect foreign objects of varying location, size (0.5-2 mm), and material (glass, plastic, metal) in a gelatinous ocular model. Operator findings were compared to the known foreign-object parameters and to the findings of ultrasound experts (N = 2) to determine accuracy across and between groups. Ultrasound had high sensitivity (astronauts 85%, students 87%, and experts 100%) and specificity (astronauts 81%, students 83%, and experts 95%) for the detection of foreign bodies. All user groups were able to accurately detect the presence of foreign bodies in this model (astronauts 84%, students 81%, and experts 97%). Astronaut and student sensitivity results for material (64% vs. 48%), size (60% vs. 46%), and position (77% vs. 64%) were not statistically different. Experts' results for material (85%), size (90%), and position (98%) were higher; however, the small sample size precluded statistical conclusions. Ultrasound can be used by operators with varying training to detect the presence, location, and composition of intraocular foreign bodies with high sensitivity, specificity, and accuracy.
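Sensitivity and specificity figures like those reported above are computed from a 2x2 confusion matrix of operator calls against ground truth. A minimal sketch; the counts below are illustrative, not the study's raw data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one operator group (foreign body present/absent).
sens, spec = sensitivity_specificity(tp=87, fn=13, tn=83, fp=17)
# sens = 0.87 (fraction of real foreign bodies detected)
# spec = 0.83 (fraction of clean models correctly called clean)
```

Overall accuracy, as also reported in the abstract, would be (TP + TN) / (TP + TN + FP + FN) on the same counts.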
Gagnon, Denis; Plamondon, André; Larivière, Christian
2016-09-06
Expertise is a key factor modulating the risk of low back disorders (LBD). Through years of practice in the workplace, the typical expert acquires high-level specific skills and maintains a clean record of work-related injuries. Ergonomic observations of manual materials handling (MMH) tasks show that expert techniques differ from those of novices, leading to the idea that expert techniques are safer. Biomechanical studies of MMH tasks performed by experts and novices report mixed results for kinematic/kinetic variables, evoking a potential internal effect of expertise. In the context of series of box transfers simulated by actual workers, detailed internal loads predicted by a multiple-joint EMG-assisted optimization lumbar spine model are compared between experts and novices. The results confirmed that the distribution of internal moments is modulated by worker expertise. Experts flexed their lumbar spine less and exerted more active muscle force, while novices relied more on the passive resistance of the muscles and ligamentous spine. More specifically for novices, the passive contributions came from global extensor muscles, selected local extensor muscles, and passive structures of the lumbar spine (ligaments and discs). The distinctive distribution of internal forces was not accompanied by a similar effect on joint forces, these forces being dependent on external loading, which was equivalent between experts and novices. From a safety standpoint, the present results suggest that experts were more efficient than novices in partitioning internal moment contributions to balance net (external) loading. Thus, safer handling practices might be seen as a result of the experts' experience. Copyright © 2016 Elsevier Ltd. All rights reserved.
Expert Systems for Libraries at SCIL [Small Computers in Libraries]'88.
ERIC Educational Resources Information Center
Kochtanek, Thomas R.; And Others
1988-01-01
Six brief papers on expert systems for libraries cover (1) a knowledge-based approach to database design; (2) getting started in expert systems; (3) using public domain software to develop a business reference system; (4) a music cataloging inquiry system; (5) linguistic analysis of reference transactions; and (6) a model of a reference librarian.…
NASA Astrophysics Data System (ADS)
Wissmeier, L. C.; Barry, D. A.
2009-12-01
Computer simulations of water availability and quality play an important role in state-of-the-art water resources management. However, many of the most utilized software programs focus either on physical flow and transport phenomena (e.g., MODFLOW, MT3DMS, FEFLOW, HYDRUS) or on geochemical reactions (e.g., MINTEQ, PHREEQC, CHESS, ORCHESTRA). In recent years, several couplings between both genres of programs evolved in order to consider interactions between flow and biogeochemical reactivity (e.g., HP1, PHWAT). Software coupling procedures can be categorized as 'close couplings', where programs pass information via the memory stack at runtime, and 'remote couplings', where the information is exchanged at each time step via input/output files. The former generally involves modification of software codes, and therefore expert programming skills are required. We present a generic recipe for remotely coupling the PHREEQC geochemical modeling framework and flow and solute transport (FST) simulators. The iterative scheme relies on operator splitting with continuous re-initialization of PHREEQC and the FST of choice at each time step. Since PHREEQC calculates the geochemistry of aqueous solutions in contact with soil minerals, the procedure is primarily designed for couplings to FSTs for liquid-phase flow in natural environments. It requires the accessibility of initial conditions and numerical parameters such as time and space discretization in the input text file for the FST, and control of the FST via commands to the operating system (batch on Windows; bash/shell on Unix/Linux). The coupling procedure is based on PHREEQC's capability to save the state of a simulation with all solid, liquid and gaseous species as a PHREEQC input file by making use of the dump file option in the TRANSPORT keyword.
The output from one reaction calculation step is therefore reused as input for the following reaction step where changes in element amounts due to advection/dispersion are introduced as irreversible reactions. An example for the coupling of PHREEQC and MATLAB for the solution of unsaturated flow and transport is provided.
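The remote-coupling recipe above, operator splitting with re-initialization of each program at every time step via input/output files, has the following generic loop structure. This is a sketch only: `run_transport_step` and `run_phreeqc_step` stand in for the real file-based invocations of the FST and PHREEQC (writing input files, issuing OS commands, parsing dump/output files) and are implemented here as toy placeholders:

```python
def run_transport_step(state, dt):
    # Placeholder for the FST invocation: in the real coupling this writes
    # the FST input file, calls the solver via an OS command, and parses
    # its output. Here: a toy advection/dispersion loss per step.
    return {k: v * 0.9 for k, v in state.items()}

def run_phreeqc_step(state):
    # Placeholder for the PHREEQC reaction step: in the real coupling the
    # previous dump file is re-read and the element changes from transport
    # are introduced as irreversible reactions. Here: a toy solubility cap.
    state = dict(state)
    state["Ca"] = min(state["Ca"], 1.0)
    return state

def couple(state, n_steps, dt):
    """Operator splitting: a transport step, then a reaction step,
    re-initialized from the exchanged state at every time step."""
    for _ in range(n_steps):
        state = run_transport_step(state, dt)
        state = run_phreeqc_step(state)
    return state

# Toy element totals (mol) advanced over three coupling steps.
final = couple({"Ca": 2.0, "Cl": 4.0}, n_steps=3, dt=60.0)
```

The essential point the sketch illustrates is that neither code needs modification: the loop only reads and writes each program's native input/output files between steps.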
Value-Added Models: What the Experts Say
ERIC Educational Resources Information Center
Amrein-Beardsley, Audrey; Pivovarova, Margarita; Geiger, Tray J.
2016-01-01
Being an expert involves explaining how things are supposed to work, and, perhaps more important, why things might not work as supposed. In this study, researchers surveyed scholars with expertise in value-added models (VAMs) to solicit their opinions about the uses and potential of VAMs for teacher-level accountability purposes (for example, in…
AIRID: an application of the KAS/Prospector expert system builder to airplane identification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldridge, J.P.
1984-01-01
The Knowledge Acquisition System/Prospector expert system building tool developed by SRI International has been used to construct an expert system to identify aircraft on the basis of observables such as wing shape, engine number/location, fuselage shape, and tail assembly shape. Additional detailed features are allowed to influence the identification as other favorable features. Constraints on the observations imposed by bad weather and distant observations have been included as contexts to the models. Models for Soviet and US fighter aircraft have been included; inclusion of other types of aircraft such as bombers, transports, and reconnaissance craft is straightforward. Two models permit exploration of the interaction of semantic and taxonomic networks with the models. A full set of text data for fluid communication with the user has been included. The use of demons as triggered output responses to enhance utility to the user has been explored. This paper discusses the ease of building the expert system using this powerful tool and the problems encountered in the construction process.
Analysis of a mammography teaching program based on an affordance design model.
Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei
2006-12-01
The wide use of computer technology in education, particularly in mammogram reading, calls for e-learning evaluation, and the existing media-comparative studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria included content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods included interviews, protocol-based document analysis, and usability tests and inspection; some descriptive statistics were also calculated. The examination identified that the educational software, PBE, designed and programmed tools that the learner can use in the process of optimizing displays, scanning images, comparing different projections, marking regions of interest, constructing a descriptive report, assessing one's learning outcomes, and comparing one's decisions with the experts' decisions. Further, PBE provides resources for learners to construct their knowledge and skills, including a categorized image library, a term-searching function, and some teaching links. Users found it easy to navigate and carry out tasks, and reacted positively to PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with cognitive tools that support their perceptual problem-solving processes and extend their capabilities. Learners can internalize the mental models in mammogram reading through multiple perceptual triangulations, sensitization of related features, semantic description of mammogram findings, and expert-guided semantic report construction.
The design of these cognitive tools and the software interface matches findings and principles in human learning and instructional design. Working with PBE's case-based simulations and categorized gallery, learners can enrich their experience and transfer it to their jobs.
Complex fuzzy soft expert sets
NASA Astrophysics Data System (ADS)
Selvachandran, Ganeshsree; Hafeed, Nisren A.; Salleh, Abdul Razak
2017-04-01
Complex fuzzy sets and their accompanying theory, although in their infancy, have proven to be superior to classical type-1 fuzzy sets, due to their ability to represent time-periodic problem parameters and to capture the seasonality of the fuzziness that exists in the elements of a set. These are important characteristics that are pervasive in most real-world problems. However, two major problems are inherent in complex fuzzy sets: they lack a sufficient parameterization tool, and they have no mechanism to validate the values assigned to the membership functions of the elements in a set. To overcome these problems, we propose the notion of complex fuzzy soft expert sets, a hybrid model of complex fuzzy sets and soft expert sets. This model incorporates the advantages of complex fuzzy sets and soft sets, besides having the added advantage of allowing users to know the opinions of all the experts in a single model without any additional cumbersome operations. As such, this model effectively improves the accuracy of representation of problem parameters that are periodic in nature, besides having a higher level of computational efficiency compared to similar models in the literature.
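A complex fuzzy membership grade is commonly written as mu(x) = r(x) * e^(i*omega(x)), where the amplitude r(x) in [0, 1] carries the ordinary fuzzy grade and the phase omega(x) carries the periodic (e.g., seasonal) component. A minimal sketch of this representation; the grades are illustrative, and the union convention shown (max amplitude) is one of several used in the literature:

```python
import cmath

def complex_grade(amplitude, phase):
    """Complex fuzzy membership grade r * e^(i*omega), with r in [0, 1]."""
    if not 0.0 <= amplitude <= 1.0:
        raise ValueError("amplitude must lie in [0, 1]")
    return amplitude * cmath.exp(1j * phase)

# Hypothetical opinions of two experts on one element of a set.
g1 = complex_grade(0.8, cmath.pi / 4)
g2 = complex_grade(0.6, cmath.pi / 3)

# One common union convention takes the maximum amplitude; conventions for
# combining the phases vary between authors.
union_amp = max(abs(g1), abs(g2))
```

In a soft expert set, each such grade is additionally indexed by (parameter, expert, opinion), which is what lets a single model hold every expert's opinion without extra operations.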
Expert and non-expert knowledge in medical practice.
Nordin, I
2000-01-01
One problematic aspect of the rationality of medical practice concerns the relation between expert knowledge and non-expert knowledge. In medical practice it is important to match medical knowledge with the self-knowledge of the individual patient. This paper tries to study the problem of such matching by describing a model for technological paradigms and comparing it with an ideal of technological rationality. The professionalised experts tend to base their decisions and actions mostly on medical knowledge while the rationality of medicine also involves just as important elements of the personal evaluation and knowledge of the patients. Since both types of knowledge are necessary for rational decisions, the gap between the expert and the non-expert has to be bridged in some way. A solution to the problem is suggested in terms of pluralism, with the patient as ultimate decision-maker.
External Peer Review Team Report Underground Testing Area Subproject for Frenchman Flat, Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sam Marutzky
2010-09-01
An external peer review was conducted to review the groundwater models used in the corrective action investigation stage of the Underground Test Area (UGTA) subproject to forecast zones of potential contamination in 1,000 years for the Frenchman Flat area. The goal of the external peer review was to provide technical evaluation of the studies and to assist in assessing the readiness of the UGTA subproject to progress to monitoring activities for further model evaluation. The external peer review team consisted of six independent technical experts with expertise in geology, hydrogeology, groundwater modeling, and radiochemistry. The peer review team was tasked with addressing the following questions: 1. Are the modeling approaches, assumptions, and model results for Frenchman Flat consistent with the use of modeling studies as a decision tool for resolution of environmental and regulatory requirements? 2. Do the modeling results adequately account for uncertainty in models of flow and transport in the Frenchman Flat hydrological setting? a. Are the models of sufficient scale/resolution to adequately predict contaminant transport in the Frenchman Flat setting? b. Have all key processes been included in the model? c. Are the methods used to forecast contaminant boundaries from the transport modeling studies reasonable and appropriate? d. Are the assessments of uncertainty technically sound and consistent with state-of-the-art approaches currently used in the hydrological sciences? 3. Are the datasets and modeling results adequate for a transition to Corrective Action Unit monitoring studies, the next stage in the UGTA strategy for Frenchman Flat? The peer review team is of the opinion that, with some limitations, the modeling approaches, assumptions, and model results are consistent with the use of modeling studies for resolution of environmental and regulatory requirements.
The peer review team further finds that the modeling studies have accounted for uncertainty in models of flow and transport in the Frenchman Flat except for a few deficiencies described in the report. Finally, the peer review team concludes that the UGTA subproject has explored a wide range of variations in assumptions, methods, and data, and should proceed to the next stage with an emphasis on monitoring studies. The corrective action strategy, as described in the Federal Facility Agreement and Consent Order, states that the groundwater flow and transport models for each corrective action unit will consider, at a minimum, the following: • Alternative hydrostratigraphic framework models of the modeling domain. • Uncertainty in the radiological and hydrological source terms. • Alternative models of recharge. • Alternative boundary conditions and groundwater flows. • Multiple permissive sets of calibrated flow models. • Probabilistic simulations of transport using plausible sets of alternative framework and recharge models, and boundary and groundwater flows from calibrated flow models. • Ensembles of forecasts of contaminant boundaries. • Sensitivity and uncertainty analyses of model outputs. The peer review team finds that these minimum requirements have been met. While the groundwater modeling and uncertainty analyses have been quite detailed, the peer review team has identified several modeling-related issues that should be addressed in the next phase of the corrective action activities: • Evaluating and using water-level gradients from the pilot wells at the Area 5 Radioactive Waste Management Site in model calibration. • Re-evaluating the use of geochemical age-dating data to constrain model calibrations. • Developing water budgets for the alluvial and upper volcanic aquifer systems in Frenchman Flat. 
• Considering modeling approaches in which calculated groundwater flow directions near the water table are not predetermined by model boundary conditions and areas of recharge, all of which are very uncertain. • Evaluating local-scale variations in hydraulic conductivity on the calculated contaminant boundaries. • Evaluating the effects of non-steady-state flow conditions on calculated contaminant boundaries, including the effects of long-term declines in water levels, climatic change, and disruption of the groundwater system by potential earthquake faulting along either of the two major controlling fault zones in the flow system (the Cane Spring and Rock Valley faults). • Considering the use of less-complex modeling approaches. • Evaluating the large change in water levels in the vicinity of the Frenchman Flat playa and developing a conceptual model to explain these water-level changes. • Developing a long-term groundwater-level monitoring program for Frenchman Flat with regular monitoring of water levels at key monitoring wells. Despite these reservations, the peer review team strongly believes that the UGTA subproject should proceed to the next stage.
Immersive volume rendering of blood vessels
NASA Astrophysics Data System (ADS)
Long, Gregory; Kim, Han Suk; Marsden, Alison; Bazilevs, Yuri; Schulze, Jürgen P.
2012-03-01
In this paper, we present a novel method of visualizing flow in blood vessels. Our approach reads unstructured tetrahedral data, resamples it, and uses slice based 3D texture volume rendering. Due to the sparse structure of blood vessels, we utilize an octree to efficiently store the resampled data by discarding empty regions of the volume. We use animation to convey time series data, wireframe surface to give structure, and utilize the StarCAVE, a 3D virtual reality environment, to add a fully immersive element to the visualization. Our tool has great value in interdisciplinary work, helping scientists collaborate with clinicians, by improving the understanding of blood flow simulations. Full immersion in the flow field allows for a more intuitive understanding of the flow phenomena, and can be a great help to medical experts for treatment planning.
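The octree storage described above, discarding empty regions of the resampled volume, can be sketched as follows. This is a minimal pointer octree over a dense cubic array; the subdivision criterion and leaf size are illustrative choices, not the paper's implementation:

```python
import numpy as np

def build_octree(vol, origin=(0, 0, 0), min_size=4):
    """Recursively subdivide a cubic volume, pruning empty (all-zero) regions.

    Returns None for an empty region, ('leaf', origin, data) for a small
    non-empty block, or ('node', origin, children) otherwise.
    """
    if not vol.any():
        return None  # discard empty region entirely
    n = vol.shape[0]
    if n <= min_size:
        return ("leaf", origin, vol)
    h = n // 2
    children = []
    for dz in (0, h):
        for dy in (0, h):
            for dx in (0, h):
                sub = vol[dz:dz + h, dy:dy + h, dx:dx + h]
                child = build_octree(
                    sub, (origin[0] + dz, origin[1] + dy, origin[2] + dx), min_size)
                if child is not None:
                    children.append(child)
    return ("node", origin, children)

# Sparse toy volume: one occupied corner in an 8^3 grid, mimicking a
# blood vessel occupying a small fraction of the bounding box.
vol = np.zeros((8, 8, 8), dtype=np.float32)
vol[:2, :2, :2] = 1.0
tree = build_octree(vol)
```

For sparse vessel geometry most octants prune to `None`, so only the occupied bricks need to be uploaded as 3D textures for the slice-based renderer.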
NASA Astrophysics Data System (ADS)
Zilberman, Arkadi; Ben Asher, Jiftah; Kopeika, Norman S.
2016-10-01
Advances in remote sensing, in combination with sensor technology (both passive and active), enable growers to analyze an entire crop field as well as its local features. In particular, changes in actual evapotranspiration (ET) as a function of water availability can be measured remotely with infrared radiometers. Detecting crop water stress and ET and combining them with a soil water flow model enables rational irrigation timing and application amounts. Nutrient deficiency, and in particular nitrogen deficiency, causes substantial crop losses and needs to be identified immediately: the faster the detection and correction, the smaller the damage to crop yield. In the present work, a novel deterministic approach based on remote sensing data was used to retrieve ET. The algorithm can automatically provide timely, valuable information on plant and soil water status, which can improve the management of irrigated crops. The solution is capable of bridging between the Penman-Monteith ET model and the Richards soil water flow model, and this bridging can serve as a preliminary tool for an expert irrigation system. To support decisions regarding fertilizers, the greenness of plant canopies is assessed and quantified using spectral reflectance sensors and digital color imaging. Fertilization management can be provided on the basis of sampling and monitoring of crop nitrogen conditions using remote sensing techniques and translating the measured N concentration in the crop to kg/ha N application in the field.
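The Penman-Monteith side of the bridging above is conventionally expressed by the FAO-56 reference evapotranspiration equation. The abstract does not give the authors' exact algorithm, so the sketch below simply implements that standard equation; the example inputs are illustrative meteorological values.

```python
def fao56_reference_et(rn, g, t, u2, es, ea, delta, gamma):
    """FAO-56 Penman-Monteith reference evapotranspiration (mm/day).
    rn: net radiation (MJ m-2 day-1), g: soil heat flux (MJ m-2 day-1),
    t: mean air temperature (deg C), u2: wind speed at 2 m (m/s),
    es, ea: saturation and actual vapour pressure (kPa),
    delta: slope of the vapour pressure curve (kPa/deg C),
    gamma: psychrometric constant (kPa/deg C)."""
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t + 273.0)) * u2 * (es - ea)
    den = delta + gamma * (1.0 + 0.34 * u2)
    return num / den

# Illustrative daily inputs for a temperate site.
et0 = fao56_reference_et(rn=13.28, g=0.0, t=16.9, u2=2.078,
                         es=1.997, ea=1.409, delta=0.122, gamma=0.0666)
```

An irrigation decision layer would compare ET retrieved this way against soil water status from a Richards-equation model before recommending application amounts.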
Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield
NASA Technical Reports Server (NTRS)
Baurle, R. A.; Axdahl, E. L.
2017-01-01
Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
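One common nonintrusive pattern of the kind described above is to sample aleatoric inputs by Monte Carlo while sweeping epistemic parameters over an interval, with the flow solver treated as a black box. The sketch below uses a trivial stand-in function for the solver and hypothetical numbers; it illustrates the pattern, not the paper's specific method.

```python
import random
import statistics

def black_box_solver(mach, wall_temp):
    # Stand-in for the CFD solver: returns a hypothetical scalar quantity
    # of interest. Purely illustrative, not a real isolator model.
    return 2.0 * mach + 0.001 * wall_temp

random.seed(0)
results = []
for wall_temp in (250.0, 300.0, 350.0):      # epistemic: swept over an interval
    qoi = [black_box_solver(random.gauss(2.5, 0.05), wall_temp)  # aleatoric: sampled
           for _ in range(1000)]
    results.append((statistics.mean(qoi), statistics.stdev(qoi)))

# One (mean, stdev) pair per epistemic setting: the stdev reflects random
# input uncertainty; the spread across settings reflects lack of knowledge.
```

Because the solver is only ever called, never modified, the whole loop can be automated around any existing code, which is what makes the approach easy to fold into a design workflow.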
Cornell Mixing Zone Expert System
This page provides an overview of the Cornell Mixing Zone Expert System, a water quality modeling and decision support system designed for environmental impact assessment of mixing zones resulting from wastewater discharges from point sources.
NASA Astrophysics Data System (ADS)
Moresi, L.; May, D.; Peachey, T.; Enticott, C.; Abramson, D.; Robinson, T.
2004-12-01
Can you teach intuition? Obviously we think that this is possible (though it's still just a hunch). People undoubtedly develop intuition for non-linear systems through painstaking repetition of complex tasks until they have sufficient feedback to begin to "see" the emergent behaviour. The better the exploration of the system can be exposed, the quicker the potential for developing an intuitive understanding. We have spent some time considering how to incorporate the intuitive knowledge of field geologists into mechanical modeling of geological processes. Our solution has been to allow expert geologists to steer (via a GUI) a genetic algorithm inversion of a mechanical forward model towards "structures" or patterns which are plausible in nature. The expert knowledge is then captured by analysis of the individual model parameters which are constrained by the steering (and by analysis of those which are unconstrained). The same system can also be used in reverse to expose the influence of individual parameters to the non-expert who is trying to learn just what makes a good match between model and observation. The "distance" between models preferred by experts and those preferred by an individual can be shown graphically to provide feedback. The examples we choose are from numerical models of extensional basins. We will first try to give each person some background information on the scientific problem from the poster, and then we will let them loose on the numerical modeling tools with specific tasks to achieve. This will be an experiment in progress: we will later analyse how people use the GUI and whether there is really any significant difference between so-called experts and self-styled novices.
ERIC Educational Resources Information Center
Stolpe, Karin; Bjorklund, Lars
2012-01-01
This study aims to investigate two expert ecology teachers' ability to attend to essential details in a complex environment during a field excursion, as well as how they teach this ability to their students. In applying a cognitive dual-memory system model for learning, we also suggest a rationale for their behaviour. The model implies two…
A Cognitive Architecture for Human Performance Process Model Research
1992-11-01
individually defined, updatable world representation which is a description of the world as the operator knows it. It contains rules for decisions, an...operate it), and rules of engagement (knowledge about the operator’s expected behavior). The HPP model works in the following way. Information enters...based models depict the problem-solving processes of experts. The experts’ knowledge is represented in symbol structures, along with rules for
Process Mining for Individualized Behavior Modeling Using Wireless Tracking in Nursing Homes
Fernández-Llatas, Carlos; Benedi, José-Miguel; García-Gómez, Juan M.; Traver, Vicente
2013-01-01
The analysis of human behavior patterns is increasingly used for several research fields. The individualized modeling of behavior using classical techniques requires too much time and resources to be effective. A possible solution would be the use of pattern recognition techniques to automatically infer models to allow experts to understand individual behavior. However, traditional pattern recognition algorithms infer models that are not readily understood by human experts. This limits the capacity to benefit from the inferred models. Process mining technologies can infer models as workflows, specifically designed to be understood by experts, enabling them to detect specific behavior patterns in users. In this paper, the eMotiva process mining algorithms are presented. These algorithms filter, infer and visualize workflows. The workflows are inferred from the samples produced by an indoor location system that stores the location of a resident in a nursing home. The visualization tool is able to compare and highlight behavior patterns in order to facilitate expert understanding of human behavior. This tool was tested with nine real users that were monitored for a 25-week period. The results achieved suggest that the behavior of users is continuously evolving and changing and that this change can be measured, allowing for behavioral change detection. PMID:24225907
Knowledge Engineering as a Component of the Curriculum for Medical Cybernetists.
Karas, Sergey; Konev, Arthur
2017-01-01
According to a new state educational standard, students who have chosen medical cybernetics as their major must develop a knowledge engineering competency. Previously, in the course "Clinical cybernetics" while practicing project-based learning students were designing automated workstations for medical personnel using client-server technology. The purpose of the article is to give insight into the project of a new educational module "Knowledge engineering". Students will acquire expert knowledge by holding interviews and conducting surveys, and then they will formalize it. After that, students will form declarative expert knowledge in a network model and analyze the knowledge graph. Expert decision making methods will be applied in software on the basis of a production model of knowledge. Project implementation will result not only in the development of analytical competencies among students, but also creation of a practically useful expert system based on student models to support medical decisions. Nowadays, this module is being tested in the educational process.
Microcomputer-based classification of environmental data in municipal areas
NASA Astrophysics Data System (ADS)
Thiergärtner, H.
1995-10-01
Multivariate data-processing methods used in mineral resource identification can be used to classify urban regions. Using elements of expert systems, geographical information systems, and known classification and prognosis systems, it is possible to outline a single model that consists of resistant and temporary parts of a knowledge base, including graphical input and output treatment, and of resistant and temporary elements of a bank of methods and algorithms. Whereas decision rules created by experts are stored directly in expert systems, powerful classification rules in the form of resistant but latent (implicit) decision algorithms may be implemented in the suggested model. The latent functions are transformed into temporary explicit decision rules by learning processes depending on the actual task(s), parameter set(s), pixel selection(s), and expert control(s). This applies to both supervised and unsupervised classification of multivariately described pixel sets representing municipal subareas. The model is outlined briefly and illustrated by results obtained in a target area covering part of the city of Berlin, Germany.
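Unsupervised classification of multivariately described pixels, as mentioned above, can be illustrated with a minimal k-means clustering. This is a generic stand-in for the paper's (unspecified) decision algorithms; the two-variable pixel descriptors below are hypothetical environmental indicators.

```python
import math

def kmeans(pixels, k, iters=10):
    """Minimal k-means over multivariate pixel vectors. Deterministic,
    evenly spaced initialisation is adequate for this illustration."""
    centroids = [pixels[(i * len(pixels)) // k] for i in range(k)]
    labels = [0] * len(pixels)
    for _ in range(iters):
        # Assign each pixel to its nearest centroid.
        labels = [min(range(k), key=lambda j: math.dist(p, centroids[j]))
                  for p in pixels]
        # Recompute each centroid as the mean of its members.
        for j in range(k):
            members = [p for p, lab in zip(pixels, labels) if lab == j]
            if members:
                centroids[j] = tuple(sum(c) / len(members)
                                     for c in zip(*members))
    return centroids, labels

# Hypothetical two-variable pixel descriptors forming two clear subarea types.
pixels = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),
          (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]
centroids, labels = kmeans(pixels, k=2)
```

In the supervised variant, the cluster labels would instead come from expert-defined training pixels, with the same distance-based assignment step.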
Minimal Residual Disease Evaluation in Childhood Acute Lymphoblastic Leukemia: An Economic Analysis
Gajic-Veljanoski, O.; Pham, B.; Pechlivanoglou, P.; Krahn, M.; Higgins, Caroline; Bielecki, Joanna
2016-01-01
Background Minimal residual disease (MRD) testing by higher performance techniques such as flow cytometry and polymerase chain reaction (PCR) can be used to detect the proportion of remaining leukemic cells in bone marrow or peripheral blood during and after the first phases of chemotherapy in children with acute lymphoblastic leukemia (ALL). The results of MRD testing are used to reclassify these patients and guide changes in treatment according to their future risk of relapse. We conducted a systematic review of the economic literature, cost-effectiveness analysis, and budget-impact analysis to ascertain the cost-effectiveness and economic impact of MRD testing by flow cytometry for management of childhood precursor B-cell ALL in Ontario. Methods A systematic literature search (1998–2014) identified studies that examined the incremental cost-effectiveness of MRD testing by either flow cytometry or PCR. We developed a lifetime state-transition (Markov) microsimulation model to quantify the cost-effectiveness of MRD testing followed by risk-directed therapy relative to no MRD testing, and to estimate its marginal effect on health outcomes and on costs. Model input parameters were based on the literature, expert opinion, and data from the Pediatric Oncology Group of Ontario Networked Information System. Using predictions from our Markov model, we estimated the 1-year cost burden of MRD testing versus no testing and forecasted its economic impact over 3 and 5 years. Results In a base-case cost-effectiveness analysis, compared with no testing, MRD testing by flow cytometry at the end of induction and consolidation was associated with an increased discounted survival of 0.0958 quality-adjusted life-years (QALYs) and increased discounted costs of $4,180, yielding an incremental cost-effectiveness ratio (ICER) of $43,613/QALY gained. After accounting for parameter uncertainty, incremental cost-effectiveness of MRD testing was associated with an ICER of $50,249/QALY gained. 
In the budget-impact analysis, the 1-year cost expenditure for MRD testing by flow cytometry in newly diagnosed patients with precursor B-cell ALL was estimated at $340,760. We forecasted that the province would have to pay approximately $1.3 million over 3 years and $2.4 million over 5 years for MRD testing by flow cytometry in this population. Conclusions Compared with no testing, MRD testing by flow cytometry in newly diagnosed patients with precursor B-cell ALL represents good value for money at commonly used willingness-to-pay thresholds of $50,000/QALY and $100,000/QALY. PMID:27099644
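The ICER arithmetic behind the base-case result above can be reproduced directly from the incremental values reported in the abstract; the small difference from the published $43,613/QALY reflects rounding of the reported inputs.

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

# Incremental values from the abstract: $4,180 extra discounted cost and
# 0.0958 extra discounted QALYs for MRD testing versus no testing.
ratio = icer(4180.0, 0.0958)
```

Comparing the ratio against a willingness-to-pay threshold ($50,000 or $100,000 per QALY here) is what supports the "good value for money" conclusion.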
From Paris to Iowa and Back: Global Temperature Targets, Agricultural Impacts, and Producer Response
NASA Astrophysics Data System (ADS)
Anderson, C.; Hayhoe, K.; Terando, A. J.
2016-12-01
Traditionally, assessments such as those produced by the IPCC and USGCRP have been structured to provide a one-way flow of information from scientists to national and international policy makers. Because the Paris Agreement will ultimately require corresponding domestic policies, the traditional one-way information flow could be inadequate, since it lacks both direct participation and informed feedback from many of the important entities that influence domestic policy. We have engaged Iowa row crop producers in identifying impacts and the feasibility of adaptation under global warming of 1.0°C and 2.0°C. Our engagement seeks to create a decision-maker feedback loop within climate impacts assessment. We have engaged an expert panel by using yield data modeling as a first step to communicate vividly the potential yield impacts of global average temperature targets. This engagement included validation against historical global average temperature before presenting yield impacts under global mean surface temperature increases of 1.0°C and 2.0°C. The expert panel requested further analysis of targets at 0.25°C and 0.50°C increases and of possible impacts should they pursue adaptation by increasing maize plant population density and soil moisture storage. Several clear messages have emerged that can be voiced by Iowa agribusiness leaders to national and international decision-makers. While Iowa soybean agriculture may remain robust for the foreseeable future, the Paris Agreement is insufficient to protect Iowa maize production from substantial changes in productivity and volatility. These effects could be largely (though not entirely) mitigated by moving from the current +2°C to the "high ambition" +1.5°C target. The projected spring rainfall increase of 10% under +1°C would increase the cost of spring planting. The data model predicts a 5-day reduction in the average number of fieldwork days, which requires the addition of one half-time person or larger planting equipment. 
The current annual rate of increase in maize plant density will maintain the historical yield increase through +1°C, but by +2°C it is substantially reduced and results in unprecedented yield volatility. By increasing soil moisture during July, Iowa maize production can markedly reduce the impacts of +2°C.
Sustainment of Individual and Collective Future Combat Skills: Modeling and Research Methods
2010-01-01
expertise: Novice, Advanced Beginner, Competent, Proficient, and Expert. According to this conceptualization, tactical leaders develop cognitively...to equipment or containers. • Checklists, flowcharts, worksheets, decision tables, and system-fault tables. • Written instructions (e.g., on...novice; (2) advanced beginner; (3) competent; (4) proficient; and (5) expert. Going from novice to expert, each level of skill development reflects
Proceedings of the international conference on cybernetics and society
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
This book presents the papers given at a conference on artificial intelligence, expert systems and knowledge bases. Topics considered at the conference included automating expert system development, modeling expert systems, causal maps, data covariances, robot vision, image processing, multiprocessors, parallel processing, VLSI structures, man-machine systems, human factors engineering, cognitive decision analysis, natural language, computerized control systems, and cybernetics.
Visualization in hydrological and atmospheric modeling and observation
NASA Astrophysics Data System (ADS)
Helbig, C.; Rink, K.; Kolditz, O.
2013-12-01
In recent years, visualization of geoscientific and climate data has become increasingly important due to challenges such as climate change, flood prediction or the development of water management schemes for arid and semi-arid regions. Models for simulations based on such data often have a large number of heterogeneous input data sets, ranging from remote sensing data and geometric information (such as GPS data) to sensor data from specific observation sites. Data integration using such information is not straightforward, and a large number of potential problems may occur due to artifacts, inconsistencies between data sets, or errors based on incorrectly calibrated or stained measurement devices. Algorithms to automatically detect many such problems are often numerically expensive or difficult to parameterize. In contrast, combined visualization of various data sets is often a surprisingly efficient means for an expert to detect artifacts or inconsistencies as well as to discuss properties of the data. Therefore, the development of general visualization strategies for atmospheric or hydrological data will often support researchers during assessment and preprocessing of the data for model setup. When investigating specific phenomena, visualization is vital for assessing the progress of the ongoing simulation during runtime as well as evaluating the plausibility of the results. We propose a number of such strategies based on established visualization methods that:
- are applicable to a large range of different types of data sets,
- are computationally inexpensive enough to allow application to time-dependent data, and
- can be easily parameterized based on the specific focus of the research.
Examples include the highlighting of certain aspects of complex data sets using, for example, an application-dependent parameterization of glyphs, iso-surfaces or streamlines. 
In addition, we employ basic rendering techniques allowing affine transformations, changes in opacity as well as variation of transfer functions. We found that similar strategies can be applied for hydrological and atmospheric data such as the use of streamlines for visualization of wind or fluid flow or iso-surfaces as indicators of groundwater recharge levels in the subsurface or levels of humidity in the atmosphere. We applied these strategies for a wide range of hydrological and climate applications such as groundwater flow, distribution of chemicals in water bodies, development of convection cells in the atmosphere or heat flux on the earth's surface. Results have been evaluated in discussions with experts from hydrogeology and meteorology.
Cognitive Task Analysis of En Route Air Traffic Control: Model Extension and Validation.
ERIC Educational Resources Information Center
Redding, Richard E.; And Others
Phase II of a project extended data collection and analytic procedures to develop a model of expertise and skill development for en route air traffic control (ATC). New data were collected by recording the Dynamic Simulator (DYSIM) performance of five experts with a work overload problem. Expert controllers were interviewed in depth for mental…
ERIC Educational Resources Information Center
Miller, Joshua D.; Lynam, Donald R.
2008-01-01
Assessment of the "Diagnostic and Statistical Manual of Mental Disorders" (4th Ed.; "DSM-IV") personality disorders (PDs) using five-factor model (FFM) prototypes and counts has shown substantial promise, with a few exceptions. Miller, Reynolds, and Pilkonis suggested that the expert-generated FFM dependent prototype might be misspecified in…
Experts Discuss Models’ Role in Better Protecting U.S. Ports and Waterways
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manke, Kristin L.
2006-10-30
How can scientific models help save lives, property, and aquatic habitat during a terrorist attack on the nation's ports and rivers? Researchers at Pacific Northwest National Laboratory posed this question to scientific modeling and emergency response experts from across the country in a workshop held July 12-13, 2006, at Sequim, Wash.
ERIC Educational Resources Information Center
Purves, Barbara A.; Petersen, Jill; Puurveen, Gloria
2013-01-01
Purpose: In contrast to clinician-as-expert models, social models of clinical practice typically acknowledge people with aphasia as equal partners in intervention. Given this, there may be a place within speech-language pathology education for programs situating people with aphasia as experts. This paper describes an aphasia mentoring program that…
ERIC Educational Resources Information Center
Waight, Noemi; Liu, Xiufeng; Gregorius, Roberto Ma.
2015-01-01
This paper examined the nuances of the background process of design and development and follow up classroom implementation of computer-based models for high school chemistry. More specifically, the study examined the knowledge contributions of an interdisciplinary team of experts; points of tensions, negotiations and non-negotiable aspects of…
A Novice-Expert Study of Modeling Skills and Knowledge Structures about Air Quality
ERIC Educational Resources Information Center
Hsu, Ying-Shao; Lin, Li-Fen; Wu, Hsin-Kai; Lee, Dai-Ying; Hwang, Fu-Kwun
2012-01-01
This study compared modeling skills and knowledge structures of four groups as seen in their understanding of air quality. The four groups were: experts (atmospheric scientists), intermediates (upper-level graduate students in a different field), advanced novices (talented 11th and 12th graders), and novices (10th graders). It was found that when…
ERIC Educational Resources Information Center
Goldberg, Benjamin; Amburn, Charles; Ragusa, Charlie; Chen, Dar-Wei
2018-01-01
The U.S. Army is interested in extending the application of intelligent tutoring systems (ITS) beyond cognitive problem spaces and into psychomotor skill domains. In this paper, we present a methodology and validation procedure for creating expert model representations in the domain of rifle marksmanship. GIFT (Generalized Intelligent Framework…
The Puzzling Unidimensionality of DSM-5 Substance Use Disorder Diagnoses
MacCoun, Robert J.
2013-01-01
There is a perennial expert debate about the criteria to be included or excluded for the DSM diagnoses of substance use dependence. Yet analysts routinely report evidence for the unidimensionality of the resulting checklist. If in fact the checklist is unidimensional, the experts are wrong that the criteria are distinct, so either the experts are mistaken or the reported unidimensionality is spurious. I argue for the latter position, and suggest that the traditional reflective measurement model is inappropriate for the DSM; a formative measurement model would be a more accurate characterization of the institutional process by which the checklist is created, and a network or causal model would be a more appropriate foundation for a scientifically grounded diagnostic system. PMID:24324446
Bayesian methods in reliability
NASA Astrophysics Data System (ADS)
Sander, P.; Badoux, R.
1991-11-01
The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
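The gas pipeline leak-rate estimation mentioned above is a classic setting for a Gamma-Poisson conjugate update, where an expert-judgment prior is combined with observed event counts. The sketch below shows that standard calculation; the prior parameters and observed data are hypothetical, not taken from the proceedings.

```python
def gamma_poisson_update(alpha, beta, events, exposure):
    """Conjugate Bayesian update for a Poisson event rate.
    Prior: Gamma(alpha, beta) with beta as the rate parameter, so the
    prior mean rate is alpha/beta. Observing `events` over `exposure`
    time gives posterior Gamma(alpha + events, beta + exposure)."""
    a_post = alpha + events
    b_post = beta + exposure
    return a_post, b_post, a_post / b_post

# Expert judgment encoded as Gamma(2, 1000): prior mean 0.002 events/hour
# (hypothetical). Then 3 events are observed over 5000 operating hours.
a, b, mean_rate = gamma_poisson_update(2.0, 1000.0, 3, 5000.0)
```

The posterior mean rate (5/6000 ≈ 0.00083 events/hour) sits between the expert prior and the raw observed rate, weighted by the relative exposure, which is exactly how expert judgment and test data are blended in Bayesian reliability.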
Yilmaz Soylu, Meryem; Bruning, Roger H
2016-01-01
This study examined differences in self-regulation among college-age expert, moderately expert, and non-expert video game players in playing video games for fun. Winne's model of self-regulation (Winne, 2001) guided the study. The main assumption of this study was that expert video game players used more processes of self-regulation than the less-expert players. We surveyed 143 college students about their game playing frequency, habits, and use of self-regulation. Data analysis indicated that while playing recreational video games, expert gamers self-regulated more than moderately expert and non-expert players, and moderately expert players used more processes of self-regulation than non-experts. Semi-structured interviews also were conducted with selected participants at each of the expertise levels. Qualitative follow-up analyses revealed five themes: (1) characteristics of expert video gamers, (2) conditions for playing a video game, (3) figuring out a game, (4) how gamers act, and (5) game context. Overall, findings indicated that playing a video game is a highly self-regulated activity and that becoming an expert video game player mobilizes multiple sets of self-regulation related skills and processes. These findings are seen as promising for educators desiring to encourage student self-regulation, because they indicate the possibility of supporting students via recreational video games by recognizing that their play includes processes of self-regulation.
2006-01-01
experts. Fig. 1 shows the synthesis flow for the NFG. It converts the Design Specification described by Scilab [18], a MATLAB-like software, into HDL...Tampere, Finland, pp. 118–123, Aug. 2005. [18] Scilab 3.0, INRIA-ENPC, France, http://scilabsoft.inria.fr/ [19] M. J. Schulte and J. E. Stine
1989-01-26
introduction, review and prospects." AUTOCARTO 8, pp. 510-519. [VOY 10] VOYER: "Moteurs de systemes experts." Eyrolles editions, 61 Bd. St.-Germain, 75005... [Figure 3. General Flow Chart of the System]
van Mil, Anke C C M; Greyling, Arno; Zock, Peter L; Geleijnse, Johanna M; Hopman, Maria T; Mensink, Ronald P; Reesink, Koen D; Green, Daniel J; Ghiadoni, Lorenzo; Thijssen, Dick H
2016-09-01
Brachial artery flow-mediated dilation (FMD) is a popular technique to examine endothelial function in humans. Identifying volunteer and methodological factors related to variation in FMD is important to improve measurement accuracy and applicability. Volunteer-related and methodology-related parameters were collected in 672 volunteers from eight affiliated centres worldwide who underwent repeated measures of FMD. All centres adopted contemporary expert-consensus guidelines for FMD assessment. After calculating the coefficient of variation (%) of the FMD for each individual, we constructed quartiles (n = 168 per quartile). Based on two regression models (volunteer-related factors and methodology-related factors), statistically significant components of these two models were added to a final regression model (calculated as β-coefficient and R). This allowed us to identify factors that independently contributed to the variation in FMD%. The median coefficient of variation was 17.5%, with healthy volunteers demonstrating a coefficient of variation of 9.3%. Regression models revealed age (β = 0.248, P < 0.001), hypertension (β = 0.104, P < 0.001), dyslipidemia (β = 0.331, P < 0.001), time between measurements (β = 0.318, P < 0.001), lab experience (β = -0.133, P < 0.001) and baseline FMD% (β = 0.082, P < 0.05) as contributors to the coefficient of variation. After including all significant factors in the final model, we found that time between measurements, hypertension, baseline FMD% and lab experience with FMD independently predicted brachial artery variability (total R = 0.202). Although FMD% showed good reproducibility, larger variation was observed in conditions with longer time between measurements, hypertension, less experience and lower baseline FMD%. Accounting for these factors may improve FMD% variability.
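The per-individual coefficient of variation used above is conventionally the within-subject standard deviation expressed as a percentage of the mean. A minimal sketch of that calculation follows, using the simple mean/SD form (the study's exact formula may differ) and hypothetical repeated FMD% values:

```python
import statistics

def coefficient_of_variation(measurements):
    """Within-subject coefficient of variation (%) of repeated measurements:
    sample standard deviation divided by the mean, times 100."""
    mean = statistics.mean(measurements)
    sd = statistics.stdev(measurements)
    return 100.0 * sd / mean

# Hypothetical repeated FMD% values for one volunteer.
cv = coefficient_of_variation([6.1, 5.2, 7.0])
```

Computing this per volunteer and then ranking the values is what allows the quartile construction (n = 168 per quartile) described in the abstract.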
Space Suit CO2 Washout During Intravehicular Activity
NASA Technical Reports Server (NTRS)
Augustine, Phillip M.; Navarro, Moses; Conger, Bruce; Sargusingh, Miriam M.
2010-01-01
Space suit carbon dioxide (CO2) washout refers to the removal of CO2 gas from the oral-nasal area of a suited astronaut's (or crewmember's) helmet using the suit's ventilation system. Inadequate washout of gases can result in diminished mental/cognitive abilities as well as headaches and light-headedness. In addition to general discomfort, these ailments can impair an astronaut's ability to perform mission-critical tasks ranging from flying the space vehicle to performing lunar extravehicular activities (EVAs). During design development for NASA's Constellation Program (CxP), conflicting requirements arose between the volume of air flow that the new Orion manned space vehicle is allocated to provide to the suited crewmember and the amount of air required to achieve CO2 washout in a space suit. Historically, space suits receive 6.0 actual cubic feet per minute (acfm) of air flow, which has adequately washed out CO2 for EVAs. For CxP, the Orion vehicle will provide 4.5 acfm of air flow to the suit. A group of subject matter experts (SMEs) among the EVA Systems community came to an early consensus that 4.5 acfm may be acceptable for low metabolic rate activities. However, this value appears very risky for high metabolic rates, hence the need for further analysis and testing. An analysis was performed to validate the 4.5 acfm value and to determine if adequate CO2 washout can be achieved with the new suit helmet design concepts. The analysis included computational fluid dynamic (CFD) modeling cases, which modeled the air flow and breathing characteristics of a human wearing suit helmets. Helmet testing was performed at the National Institute of Occupational Safety and Health (NIOSH) in Pittsburgh, Pennsylvania, to provide a gross-level validation of the CFD models. Although there was not a direct data correlation between the helmet testing and the CFD modeling, the testing data showed trends that are very similar to the CFD modeling.
Overall, the analysis yielded results that were better than anticipated, with a few unexpected findings that could not easily be explained. Results indicate that 4.5 acfm is acceptable for CO2 washout and helmet design. This paper summarizes the results of this CO2 washout study.
Toward a theory of distributed word expert natural language parsing
NASA Technical Reports Server (NTRS)
Rieger, C.; Small, S.
1981-01-01
An approach to natural language meaning-based parsing in which the unit of linguistic knowledge is the word rather than the rewrite rule is described. In the word expert parser, knowledge about language is distributed across a population of procedural experts, each representing a word of the language, and each an expert at diagnosing that word's intended usage in context. The parser is structured around a coroutine control environment in which the generator-like word experts ask questions and exchange information in coming to collective agreement on sentence meaning. The word expert theory is advanced as a better cognitive model of human language expertise than the traditional rule-based approach. The technical discussion is organized around examples taken from the prototype LISP system which implements parts of the theory.
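The coroutine exchange described above can be caricatured with generator-based "experts". This is only a toy illustration of the control flow, not the LISP prototype; the word, its senses, and the single context question are all invented:

```python
def word_expert(word, senses):
    # A toy "word expert" coroutine: it suspends with a question about
    # context and, when resumed with the answer, returns its diagnosis
    # of the word's intended sense.
    physical = yield f"{word}: is the surrounding context physical?"
    return senses["physical"] if physical else senses["abstract"]

def parse_word(expert, oracle):
    # Drive one expert coroutine, answering its question via `oracle`
    # (which stands in for the other experts it would converse with).
    question = next(expert)
    try:
        expert.send(oracle(question))
    except StopIteration as done:
        return done.value

sense = parse_word(
    word_expert("deep", {"physical": "having depth", "abstract": "profound"}),
    lambda q: True,  # pretend the context is physical
)
```

In the actual theory, many such experts run concurrently and exchange information until they reach collective agreement on sentence meaning.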
Group prioritisation with unknown expert weights in incomplete linguistic context
NASA Astrophysics Data System (ADS)
Cheng, Dong; Cheng, Faxin; Zhou, Zhili; Wang, Juan
2017-09-01
In this paper, we study a group prioritisation problem in situations where the expert weights are completely unknown and their judgement preferences are linguistic and incomplete. Starting from the theory of relative entropy (RE) and multiplicative consistency, an optimisation model is provided for deriving an individual priority vector without estimating the missing value(s) of an incomplete linguistic preference relation. In order to address the unknown expert weights in the group aggregating process, we define two new kinds of expert weight indicators based on RE: proximity entropy weight and similarity entropy weight. Furthermore, a dynamic-adjusting algorithm (DAA) is proposed to obtain an objective expert weight vector and capture the dynamic properties involved in it. Unlike the extant literature on group prioritisation, the proposed RE approach does not require pre-allocation of expert weights and can solve incomplete preference relations. An interesting finding is that once all the experts express their preference relations, the final expert weight vector derived from the DAA is fixed irrespective of the initial settings of expert weights. Finally, an application example is presented to validate the effectiveness and robustness of the RE approach.
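As a rough illustration of entropy-based expert weighting (not the paper's exact proximity and similarity indicators, which are defined over linguistic preference relations), one can weight each expert inversely to the relative entropy between their priority vector and the group average:

```python
import math

def kl_divergence(p, q):
    """Relative entropy (Kullback-Leibler divergence) between two
    discrete probability vectors."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def similarity_entropy_weights(priority_vectors):
    """Weight each expert inversely to how far their priority vector
    diverges from the group average (a hypothetical simplification of
    the RE-based indicators described above)."""
    m, n = len(priority_vectors), len(priority_vectors[0])
    avg = [sum(v[i] for v in priority_vectors) / m for i in range(n)]
    sims = [1.0 / (1.0 + kl_divergence(v, avg)) for v in priority_vectors]
    total = sum(sims)
    return [s / total for s in sims]
```

Experts whose priorities sit close to the group consensus receive larger weights; outliers are discounted rather than excluded.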
MacDonald, Kath; Irvine, Lindesay; Smith, Margaret Coulter
2015-12-01
To explore how young 'expert patients' living with Cystic Fibrosis and the healthcare professionals with whom they interact perceive partnership and negotiate care. Modern healthcare policy encourages partnership, engagement and self-management of long-term conditions. This philosophy is congruent with the model adopted in the care of those with Cystic Fibrosis, where self-management, trust and mutual respect are perceived to be integral to the development of the ongoing patient/professional relationship. Self-management is associated with the term 'expert patient': an individual with a long-term condition whose knowledge and skills are valued and used in partnership with healthcare professionals. However, the term 'expert patient' is debated in the literature, as are the motivations for its use and the assumptions implicit in the term. A qualitative exploratory design informed by Interpretivism and Symbolic Interactionism was conducted. Thirty-four consultations were observed and 23 semi-structured interviews conducted with 10 patients, 2 carers and 12 healthcare professionals. Data were analysed thematically using the five stages of 'Framework', a matrix-based qualitative data analysis approach, and were subject to peer review and respondent validation. The study received full ethical approval. Three main themes emerged: experiences of partnership, attributes of the expert patient and constructions of illness. Sub-themes of the 'ceremonial order of the clinic', negotiation and trust in relationships and perceptions of the expert patient are presented. The model of consultation may be a barrier to person-centred care. Healthcare professionals show leniency in negotiations, but do not always trust patients' accounts. The term 'expert patient' is unpopular and remains contested. Gaining insight into structures and processes that enable or inhibit partnership can lead to a collaborative approach to service redesign and a revision of the consultation model.
© 2015 John Wiley & Sons Ltd.
Sparse distributed memory: understanding the speed and robustness of expert memory
Brogliato, Marcelo S.; Chada, Daniel M.; Linhares, Alexandre
2014-01-01
How can experts, sometimes in exacting detail, almost immediately and very precisely recall memory items from a vast repertoire? The problem in which we will be interested concerns models of theoretical neuroscience that could explain the speed and robustness of an expert's recollection. The approach is based on Sparse Distributed Memory, which has been shown to be neuroscientifically and psychologically plausible in a number of ways. A crucial characteristic concerns the limits of human recollection, the "tip-of-the-tongue" memory event, which is found at a non-linearity in the model. We expand the theoretical framework, deriving an optimization formula to solve this non-linearity. Numerical results demonstrate how a higher frequency of rehearsal, through work or study, immediately increases the robustness and speed associated with expert memory. PMID:24808842
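Kanerva's Sparse Distributed Memory, on which the model is based, can be sketched in a few lines: random binary "hard locations" activate within a Hamming radius of a query address, writes increment or decrement per-bit counters at the active locations, and reads sum and threshold those counters. The parameters below are arbitrary toy values; real instances use far larger dimensions:

```python
import random

class SparseDistributedMemory:
    """Minimal binary SDM sketch (a simplification of Kanerva's model)."""

    def __init__(self, n_locations, dim, radius, seed=0):
        rng = random.Random(seed)
        self.dim = dim
        self.radius = radius
        self.addresses = [[rng.randint(0, 1) for _ in range(dim)]
                          for _ in range(n_locations)]
        self.counters = [[0] * dim for _ in range(n_locations)]

    def _active(self, address):
        # Hard locations within the Hamming radius of the query address.
        return [i for i, a in enumerate(self.addresses)
                if sum(x != y for x, y in zip(a, address)) <= self.radius]

    def write(self, address, data):
        for i in self._active(address):
            for j, bit in enumerate(data):
                self.counters[i][j] += 1 if bit else -1

    def read(self, address):
        sums = [0] * self.dim
        for i in self._active(address):
            for j in range(self.dim):
                sums[j] += self.counters[i][j]
        return [1 if s > 0 else 0 for s in sums]
```

The non-linearity discussed above arises in how retrieval quality degrades with distance from a stored item and with memory load; rehearsal corresponds to repeated writes of the same item, which strengthens its counters.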
Use of an expert system data analysis manager for space shuttle main engine test evaluation
NASA Technical Reports Server (NTRS)
Abernethy, Ken
1988-01-01
The ability to articulate, collect, and automate the application of the expertise needed for the analysis of space shuttle main engine (SSME) test data would be of great benefit to NASA liquid rocket engine experts. This paper describes a project whose goal is to build a rule-based expert system which incorporates such expertise. Experiential expertise, collected directly from the experts currently involved in SSME data analysis, is used to build a rule base to identify engine anomalies similar to those analyzed previously. Additionally, an alternate method of expertise capture is being explored. This method would generate rules inductively based on calculations made using a theoretical model of the SSME's operation. The latter rules would be capable of diagnosing anomalies which may not have appeared before, but whose effects can be predicted by the theoretical model.
NASA Astrophysics Data System (ADS)
Arcand, K.; Megan, W.; DePasquale, J.; Jubett, A.; Edmonds, P.; DiVona, K.
2017-09-01
Three-dimensional (3D) modelling is more than just good fun; it offers a new vehicle to represent and understand scientific data and gives experts and non-experts alike the ability to manipulate models and gain new perspectives on data. This article explores the use of 3D modelling and printing in astronomy and astronomy communication and looks at some of the practical challenges, and solutions, to using 3D modelling, visualisation and printing in this way.
Wusor II: A Computer Aided Instruction Program with Student Modelling Capabilities.
1977-06-01
The Wumpus Advisor. The Expert. Chapter 3: Overview of the Expert. From the work which was done on the expert of Wusor I, it was... network are numbered and represent caves. Circled numbers represent caves which have been visited by the player. To the top right of each visited cave is a marker for whether or not any warnings were sensed. ("U" indicates that a warning was sensed, and a "NW" means that a warning was not
Sicard, M; Perrot, N; Leclercq-Perlat, M-N; Baudrit, C; Corrieu, G
2011-01-01
Modeling the cheese ripening process remains a challenge because of its complexity. We still lack the knowledge necessary to understand the interactions that take place at different levels of scale during the process. However, information may be gathered from expert knowledge. Combining this expertise with knowledge extracted from experimental databases may allow a better understanding of the entire ripening process. The aim of this study was to elicit expert knowledge and to check its validity to assess the evolution of organoleptic quality during a dynamic food process: Camembert cheese ripening. Experiments on a pilot scale were carried out at different temperatures and relative humidities to obtain contrasting ripening kinetics. During these experiments, macroscopic evolution was evaluated from an expert's point of view and instrumental measurements were carried out to simultaneously monitor microbiological, physicochemical, and biochemical kinetics. A correlation of 76% was established between the microbiological, physicochemical, and biochemical data and the sensory phases measured according to expert knowledge, highlighting the validity of the experts' measurements. In the future, it is hoped that this expert knowledge may be integrated into food process models to build better decision-aid systems that will make it possible to preserve organoleptic qualities by linking them to other phenomena at the microscopic level. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Aylward, C.M.; Murdoch, J.D.; Donovan, Therese M.; Kilpatrick, C.W.; Bernier, C.; Katz, J.
2018-01-01
The American marten Martes americana is a species of conservation concern in the northeastern United States due to widespread declines from over‐harvesting and habitat loss. Little information exists on current marten distribution and how landscape characteristics shape patterns of occupancy across the region, which could help develop effective recovery strategies. The rarity of marten and lack of historical distribution records are also problematic for region‐wide conservation planning. Expert opinion can provide a source of information for estimating species–landscape relationships and is especially useful when empirical data are sparse. We created a survey to elicit expert opinion and build a model that describes marten occupancy in the northeastern United States as a function of landscape conditions. We elicited opinions from 18 marten experts that included wildlife managers, trappers and researchers. Each expert estimated occupancy probability at 30 sites in their geographic region of expertise. We then fit the response data with a set of 58 models that incorporated the effects of covariates related to forest characteristics, climate, anthropogenic impacts and competition at two spatial scales (1.5 and 5 km radii), and used model selection techniques to determine the best model in the set. Three top models had strong empirical support, which we model averaged based on AIC weights. The final model included effects of five covariates at the 5‐km scale: percent canopy cover (positive), percent spruce‐fir land cover (positive), winter temperature (negative), elevation (positive) and road density (negative). A receiver operating characteristic curve indicated that the model performed well based on recent occurrence records. We mapped distribution across the region and used circuit theory to estimate movement corridors between isolated core populations.
The results demonstrate the effectiveness of expert‐opinion data at modeling occupancy for rare species and provide tools for planning marten recovery in the northeastern United States.
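Model averaging by AIC weights, as applied to the three top models, follows a standard formula: each model's weight is its relative likelihood exp(-Δ/2), normalized over the candidate set. The AIC scores below are made up for illustration:

```python
import math

def aic_weights(aic_scores):
    """Akaike weights: relative likelihood of each model given its AIC
    score, normalized to sum to one."""
    best = min(aic_scores)
    rel = [math.exp(-0.5 * (a - best)) for a in aic_scores]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three competing top models
w = aic_weights([100.0, 101.2, 102.5])
```

Covariate effects from each model are then combined as a weighted average using these weights.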
Induced seismicity hazard and risk by enhanced geothermal systems: an expert elicitation approach
NASA Astrophysics Data System (ADS)
Trutnevyte, Evelina; Azevedo, Inês L.
2018-03-01
Induced seismicity is a concern for multiple geoenergy applications, including low-carbon enhanced geothermal systems (EGS). We present the results of an international expert elicitation (n = 14) on EGS induced seismicity hazard and risk. Using a hypothetical scenario of an EGS plant and its geological context, we show that expert best-guess estimates of annualized exceedance probabilities of an M ≥ 3 event range from 0.2% to 95% during reservoir stimulation and from 0.2% to 100% during operation. Best-guess annualized exceedance probabilities of an M ≥ 5 event span from 0.002% to 2% during stimulation and from 0.003% to 3% during operation. Assuming that tectonic M7 events could occur, some experts do not exclude induced (triggered) events of up to M7 as well. If an induced M = 3 event happens at 5 km depth beneath a town with 10 000 inhabitants, most experts estimate a 50% probability that the loss is contained within 500 000 USD without any injuries or fatalities. In the case of an induced M = 5 event, there is a 50% chance that the loss is below 50 million USD, with the most likely outcome of 50 injuries and one fatality or none. As we observe a vast diversity in quantitative expert judgements and underlying mental models, we conclude with implications for induced seismicity risk governance. That is, we suggest documenting individual expert judgements in induced seismicity elicitations before proceeding to consensual judgements, convening larger expert panels in order not to cherry-pick the experts, and aiming for multi-organization, multi-model assessments of EGS induced seismicity hazard and risk.
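To relate the elicited annualized exceedance probabilities to a project lifetime, a common simplifying assumption (not made explicitly in the elicitation) is independence across years:

```python
def exceedance_over_period(annual_prob, years):
    """Probability of at least one exceedance in `years`, assuming
    each year is an independent trial with probability `annual_prob`."""
    return 1.0 - (1.0 - annual_prob) ** years

# e.g. the low end of the elicited M >= 5 range (0.002% per year)
# accumulated over a 30-year plant lifetime
p_lifetime = exceedance_over_period(0.002, 30)
```

Note the example plugs in 0.002 (0.2%) rather than 0.002%; the elicited values span several orders of magnitude, so lifetime risk is dominated by the upper end of each expert's range.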
Estimating unknown parameters in haemophilia using expert judgement elicitation.
Fischer, K; Lewandowski, D; Janssen, M P
2013-09-01
The increasing attention to healthcare costs and treatment efficiency has led to an increasing demand for quantitative data concerning patient and treatment characteristics in haemophilia. However, most of these data are difficult to obtain. The aim of this study was to use expert judgement elicitation (EJE) to estimate currently unavailable key parameters for treatment models in severe haemophilia A. Using a formal expert elicitation procedure, 19 international experts provided information on (i) natural bleeding frequency according to age and onset of bleeding, (ii) treatment of bleeds, (iii) time needed to control bleeding after starting secondary prophylaxis, (iv) dose requirements for secondary prophylaxis according to onset of bleeding, and (v) life-expectancy. For each parameter experts provided their quantitative estimates (median, P10, P90), which were combined using a graphical method. In addition, information was obtained concerning key decision parameters of haemophilia treatment. There was most agreement between experts regarding bleeding frequencies for patients treated on demand with an average onset of joint bleeding (1.7 years): median 12 joint bleeds per year (95% confidence interval 0.9-36) for patients ≤ 18 years, and 11 (0.8-61) for adult patients. Less agreement was observed concerning the estimated effective dose for secondary prophylaxis in adults: median 2000 IU every other day. The majority (63%) of experts expected that a single minor joint bleed could cause irreversible damage, and would accept up to three minor joint bleeds or one trauma-related joint bleed annually on prophylaxis. Expert judgement elicitation allowed structured capturing of quantitative expert estimates. It generated novel data to be used in computer modelling, clinical care, and trial design. © 2013 John Wiley & Sons Ltd.
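The study combined each expert's (P10, median, P90) estimates with a graphical method. A crude numerical stand-in, shown purely for illustration, is an equal-weight linear opinion pool that averages the triplets across experts:

```python
def pool_quantiles(expert_estimates):
    """Equal-weight linear pooling of (P10, median, P90) triplets from
    several experts. A simplification: the study itself combined the
    estimates graphically, not by averaging."""
    n = len(expert_estimates)
    return tuple(sum(e[i] for e in expert_estimates) / n for i in range(3))

# Two hypothetical experts' (P10, median, P90) bleed-frequency estimates
pooled = pool_quantiles([(1, 10, 30), (3, 14, 42)])
```

More principled alternatives fit a distribution to each expert's quantiles and mix the distributions, which preserves between-expert disagreement rather than averaging it away.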
Karst Groundwater Hydrologic Analyses Based on Aerial Thermography
NASA Technical Reports Server (NTRS)
Campbell, C. Warren; Keith, A. G.
2000-01-01
On February 23, 1999, thermal imagery of Marshall Space Flight Center, Alabama was collected using an airborne thermal camera. Ground resolution was 1 m. Approximately 40 km² of thermal imagery in and around Marshall Space Flight Center (MSFC) was analyzed to determine the location of springs for groundwater monitoring. Subsequently, forty-five springs were located, ranging in flow from a few ml/sec to approximately 280 liter/sec. Groundwater temperatures are usually near the mean annual surface air temperature. On thermography collected during the winter, springs show up as very warm spots. Many of the new springs were submerged in lakes, streams, or swamps; consequently, flow measurements were difficult. Without estimates of discharge, the impacts of contaminated discharge on surface streams would be difficult to evaluate. An approach to obtaining an estimate was developed using the Environmental Protection Agency (EPA) Cornell Mixing Zone Expert System (CORMIX). The thermography was queried to obtain a temperature profile down the center of the surface plume. The spring discharge was modeled with CORMIX, and the flow adjusted until the surface temperature profile was matched. The presence of volatile compounds in some of the new springs also allowed MSFC to unravel the natural system of solution cavities of the karst aquifer. Sampling results also showed that two springs on either side of a large creek had the same water source, so that groundwater was able to pass beneath the creek.
Interactive Inverse Groundwater Modeling - Addressing User Fatigue
NASA Astrophysics Data System (ADS)
Singh, A.; Minsker, B. S.
2006-12-01
This paper builds on ongoing research on developing an interactive and multi-objective framework to solve the groundwater inverse problem. In this work we solve the classic groundwater inverse problem of estimating a spatially continuous conductivity field, given field measurements of hydraulic heads. The proposed framework is based on an interactive multi-objective genetic algorithm (IMOGA) that not only considers quantitative measures such as calibration error and degree of regularization, but also takes into account expert knowledge about the structure of the underlying conductivity field expressed as subjective rankings of potential conductivity fields by the expert. The IMOGA converges to the optimal Pareto front representing the best trade-off among the qualitative as well as quantitative objectives. However, since the IMOGA is a population-based iterative search it requires the user to evaluate hundreds of solutions. This leads to the problem of 'user fatigue'. We propose a two-step methodology to combat user fatigue in such interactive systems. The first step is choosing only a few highly representative solutions to be shown to the expert for ranking. Spatial clustering is used to group the search space based on the similarity of the conductivity fields. Sampling is then carried out from different clusters to improve the diversity of solutions shown to the user. Once the expert has ranked representative solutions from each cluster a machine learning model is used to 'learn user preference' and extrapolate these for the solutions not ranked by the expert. We investigate different machine learning models such as Decision Trees, Bayesian learning model, and instance based weighting to model user preference. In addition, we also investigate ways to improve the performance of these models by providing information about the spatial structure of the conductivity fields (which is what the expert bases his or her rank on).
Results are shown for each of these machine learning models and the advantages and disadvantages for each approach are discussed. These results indicate that using the proposed two-step methodology leads to significant reduction in user-fatigue without deteriorating the solution quality of the IMOGA.
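The first step, showing the expert only a few representative candidate fields, can be mimicked with a much simpler scheme than the spatial clustering used in the paper: bin candidate fields by a scalar summary and present one field per bin. Everything here (the summary statistic, bin count, data) is hypothetical:

```python
def pick_representatives(fields, n_bins):
    """Reduce the expert's ranking burden: bin candidate conductivity
    fields by their mean value (a crude stand-in for spatial clustering)
    and keep the first field encountered in each bin."""
    summaries = [sum(f) / len(f) for f in fields]
    lo, hi = min(summaries), max(summaries)
    width = (hi - lo) / n_bins or 1.0  # guard against identical summaries
    reps = {}
    for field, s in zip(fields, summaries):
        b = min(int((s - lo) / width), n_bins - 1)
        reps.setdefault(b, field)  # first representative per bin
    return list(reps.values())

# Four candidate fields (flattened), reduced to three representatives
reps = pick_representatives([[1, 1], [1.1, 0.9], [5, 5], [9, 9]], 3)
```

The expert's rankings of these representatives would then train the preference model that scores the remaining, unranked candidates.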
Developing and Testing a Model to Predict Outcomes of Organizational Change
Gustafson, David H; Sainfort, François; Eichler, Mary; Adams, Laura; Bisognano, Maureen; Steudel, Harold
2003-01-01
Objective To test the effectiveness of a Bayesian model employing subjective probability estimates for predicting success and failure of health care improvement projects. Data Sources Experts' subjective assessment data for model development and independent retrospective data on 221 healthcare improvement projects in the United States, Canada, and the Netherlands collected between 1996 and 2000 for validation. Methods A panel of theoretical and practical experts and literature in organizational change were used to identify factors predicting the outcome of improvement efforts. A Bayesian model was developed to estimate probability of successful change using subjective estimates of likelihood ratios and prior odds elicited from the panel of experts. A subsequent retrospective empirical analysis of change efforts in 198 health care organizations was performed to validate the model. Logistic regression and ROC analysis were used to evaluate the model's performance using three alternative definitions of success. Data Collection For the model development, experts' subjective assessments were elicited using an integrative group process. For the validation study, a staff person intimately involved in each improvement project responded to a written survey asking questions about model factors and project outcomes. Results Logistic regression chi-square statistics and areas under the ROC curve demonstrated a high level of model performance in predicting success. Chi-square statistics were significant at the 0.001 level and areas under the ROC curve were greater than 0.84. Conclusions A subjective Bayesian model was effective in predicting the outcome of actual improvement projects. Additional prospective evaluations as well as testing the impact of this model as an intervention are warranted. PMID:12785571
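The core of such a subjective Bayesian model is odds multiplication: the prior odds of success are multiplied by an elicited likelihood ratio for each observed factor, and the resulting posterior odds are converted to a probability. The numeric values below are invented, not the panel's elicited values:

```python
def posterior_probability(prior_odds, likelihood_ratios):
    """Naive-Bayes style update: prior odds of success times one
    likelihood ratio per organizational-change factor, converted from
    odds back to a probability."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical project: even prior odds, two favourable factors
# (LR > 1) and one unfavourable factor (LR < 1)
p_success = posterior_probability(1.0, [2.0, 3.0, 0.5])
```

This multiplicative form assumes the factors are conditionally independent given the outcome, which is the usual simplification in elicited Bayesian scoring models.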
Aerodynamics of the EXPERT Re-Entry Ballistic Vehicle
NASA Astrophysics Data System (ADS)
Kharitonov, A. M.; Adamov, N. P.; Mazhul, I. I.; Vasenyov, L. G.; Zvegintsev, V. I.; Muylaert, J. M.
2009-01-01
Experimental studies of the EXPERT reentry capsule have been performed in ITAM SB RAS wind tunnels since 2002, under the consecutive ISTC projects No. 2109 and No. 3151 and the currently ongoing project No. 3550. The results of earlier studies in ITAM wind tunnels can be found in [1-4]. The present paper describes new data obtained for the EXPERT model.
Natural and Artificial Intelligence in Neurosurgery: A Systematic Review.
Senders, Joeky T; Arnaout, Omar; Karhade, Aditya V; Dasenbrock, Hormuzdiyar H; Gormley, William B; Broekman, Marike L; Smith, Timothy R
2017-09-07
Machine learning (ML) is a domain of artificial intelligence that allows computer algorithms to learn from experience without being explicitly programmed. To summarize neurosurgical applications of ML where it has been compared to clinical expertise, here referred to as "natural intelligence." A systematic search was performed in the PubMed and Embase databases as of August 2016 to review all studies comparing the performance of various ML approaches with that of clinical experts in the neurosurgical literature. Twenty-three studies were identified that used ML algorithms for diagnosis, presurgical planning, or outcome prediction in neurosurgical patients. Compared to clinical experts, ML models demonstrated a median absolute improvement in accuracy and area under the receiver operating characteristic curve of 13% (interquartile range 4-21%) and 0.14 (interquartile range 0.07-0.21), respectively. In 29 (58%) of the 50 outcome measures for which a P-value was provided or calculated, ML models outperformed clinical experts (P < .05). In 18 of 50 (36%), no difference was seen between ML and expert performance (P > .05), while in 3 of 50 (6%) clinical experts outperformed ML models (P < .05). All 4 studies that compared clinicians assisted by ML models vs clinicians alone demonstrated a better performance in the first group. We conclude that ML models have the potential to augment the decision-making capacity of clinicians in neurosurgical applications; however, significant hurdles remain associated with creating, validating, and deploying ML models in the clinical setting. Shifting from the preconceptions of a human-vs-machine to a human-and-machine paradigm could be essential to overcome these hurdles. Published by Oxford University Press on behalf of Congress of Neurological Surgeons 2017.
Prognostic modelling options for remaining useful life estimation by industry
NASA Astrophysics Data System (ADS)
Sikorska, J. Z.; Hodkiewicz, M.; Ma, L.
2011-07-01
Over recent years a significant amount of research has been undertaken to develop prognostic models that can be used to predict the remaining useful life of engineering assets. Implementations by industry have only had limited success. By design, models are subject to specific assumptions and approximations, some of which are mathematical, while others relate to practical implementation issues such as the amount of data required to validate and verify a proposed model. Therefore, appropriate model selection for successful practical implementation requires not only a mathematical understanding of each model type, but also an appreciation of how a particular business intends to utilise a model and its outputs. This paper discusses business issues that need to be considered when selecting an appropriate modelling approach for trial. It also presents classification tables and process flow diagrams to assist industry and research personnel select appropriate prognostic models for predicting the remaining useful life of engineering assets within their specific business environment. The paper then explores the strengths and weaknesses of the main prognostics model classes to establish what makes them better suited to certain applications than to others and summarises how each have been applied to engineering prognostics. Consequently, this paper should provide a starting point for young researchers first considering options for remaining useful life prediction. The models described in this paper are Knowledge-based (expert and fuzzy), Life expectancy (stochastic and statistical), Artificial Neural Networks, and Physical models.
Heuristic Model Of The Composite Quality Index Of Environmental Assessment
NASA Astrophysics Data System (ADS)
Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.
2017-01-01
The goal of the paper is to present a heuristic model of the composite environmental quality index based on the integrated application of elements of utility theory, multidimensional scaling, expert evaluation and decision-making. The composite index is synthesized in linear-quadratic form; it provides better agreement between the assessment results and the preferences of experts and decision-makers.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-20
... Transportation, Pipeline and Hazardous Materials Safety Administration (PHMSA) invite interested parties to...] Expert Forum on the Use of Performance-Based Regulatory Models in the U.S. Oil and Gas Industry, Offshore... and gas industry. The meeting will take place at the College of the Mainland, and hosted by the Gulf...
Merzagora, Isabella; Amadasi, Alberto; Blandino, Alberto; Travaini, Guido
In recent times Italy has been experiencing massive migration flows; therefore, attention to the issue of crimes committed by foreigners is increasing. But within trials, how do experts deal with foreigners when evaluating their criminal liability? Do the evaluations performed take cultural diversity into account? The present study arose from these questions and examined a total of 86 expert reports on the criminal liability of foreign persons (16 females and 70 males). Examinees were declared indictable in 31 cases (36%), totally mentally ill in 40 cases (45%) and of diminished liability in 15 cases (17%); when liability was excluded, examinees were diagnosed in 11 cases with mood disorders, in 23 cases with personality disorders, in 4 cases with adaptation disorders and post-traumatic stress disorder, and in 10 cases with other diagnoses (in some cases more than one diagnosis was present). None of the reports used the section of the DSM concerning "cultural framing". Tests were used in 48 assessments (56% of cases), often with more than one test per examinee: 39 Rorschach tests, 14 Raven tests, 8 Minnesota Multiphasic Personality Inventory (MMPI) tests, 4 Wechsler Adult Intelligence Scale (WAIS) level tests, and 8 Thematic Apperception Tests. When subjects were diagnosed with a mental disorder and diminished liability, 42 (79%) were also deemed socially dangerous. Results highlight the importance of the relationship between the expert and the foreigner. Many factors ought to be critically considered by experts dealing with foreigners, such as cultural awareness, knowledge of verbal communication, critical consideration of meanings and diagnosis, knowledge of the foreigner's personal story, and the presence of tests with inexact information and cultural fallacies. Copyright © 2018 Elsevier Ltd. All rights reserved.
Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock
This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015, and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.
Kowalewski, Karl-Friedrich; Hendrie, Jonathan D; Schmidt, Mona W; Garrow, Carly R; Bruckner, Thomas; Proctor, Tanja; Paul, Sai; Adigüzel, Davud; Bodenstedt, Sebastian; Erben, Andreas; Kenngott, Hannes; Erben, Young; Speidel, Stefanie; Müller-Stich, Beat P; Nickel, Felix
2017-05-01
Training and assessment outside of the operating room is crucial for minimally invasive surgery due to steep learning curves. Thus, we have developed and validated the sensor- and expert model-based laparoscopic training system, the iSurgeon. Participants of different experience levels (novice, intermediate, expert) performed four standardized laparoscopic knots. Instruments and surgeons' joint motions were tracked with an NDI Polaris camera and Microsoft Kinect v1. With frame-by-frame image analysis, the key steps of suturing and knot tying were identified and registered with motion data. Construct validity, concurrent validity, and test-retest reliability were analyzed. The Objective Structured Assessment of Technical Skills (OSATS) was used as the gold standard for concurrent validity. The system showed construct validity by discriminating between experience levels on parameters such as time (novice = 442.9 ± 238.5 s; intermediate = 190.1 ± 50.3 s; expert = 115.1 ± 29.1 s; p < 0.001), total path length (novice = 18,817 ± 10,318 mm; intermediate = 9995 ± 3286 mm; expert = 7265 ± 2232 mm; p < 0.001), average speed (novice = 42.9 ± 8.3 mm/s; intermediate = 52.7 ± 11.2 mm/s; expert = 63.6 ± 12.9 mm/s; p < 0.001), angular path (novice = 20,573 ± 12,611°; intermediate = 8652 ± 2692°; expert = 5654 ± 1746°; p < 0.001), number of movements (novice = 2197 ± 1405; intermediate = 987 ± 367; expert = 743 ± 238; p < 0.001), number of movements per second (novice = 5.0 ± 1.4; intermediate = 5.2 ± 1.5; expert = 6.6 ± 1.6; p = 0.025), and joint angle range (for different axes and joints all p < 0.001). Concurrent validity of OSATS and iSurgeon parameters was established. Test-retest reliability was given for 7 out of 8 parameters. The key steps "wrapping the thread around the instrument" and "needle positioning" were most difficult to learn. Validity and reliability of the self-developed sensor- and expert model-based laparoscopic training system "iSurgeon" were established.
Using multiple parameters proved more reliable than single metric parameters. Wrapping of the thread around the instrument and needle positioning were identified as difficult key steps for laparoscopic suturing and knot tying. The iSurgeon could generate automated real-time feedback based on expert models, which may result in shorter learning curves for laparoscopic tasks. Our next steps will be the implementation and evaluation of full procedural training in an experimental model.
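Two of the motion parameters reported above, total path length and average speed, can be computed from tracked instrument positions as in the following minimal sketch. This is an illustration only, not the iSurgeon implementation; the sample points and sampling rate are invented.

```python
import math

# Illustrative computation (not the iSurgeon code) of total path length
# and average speed from a sequence of tracked 3D instrument positions
# sampled at a fixed interval.
def motion_metrics(positions, dt):
    """positions: list of (x, y, z) in mm; dt: sampling interval in s."""
    # Total path length: sum of straight-line distances between samples.
    path = sum(math.dist(p, q) for p, q in zip(positions, positions[1:]))
    duration = dt * (len(positions) - 1)
    return path, path / duration if duration > 0 else 0.0

# A straight 30 mm move sampled in three 10 mm steps at 10 Hz:
pts = [(0, 0, 0), (10, 0, 0), (20, 0, 0), (30, 0, 0)]
path, speed = motion_metrics(pts, dt=0.1)
# path is 30.0 mm; speed is approximately 100 mm/s
```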
NASA Astrophysics Data System (ADS)
Avolio, G.; Corso Radu, A.; Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.
2012-12-01
The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment is a very complex distributed computing system, composed of more than 20000 applications running on more than 2000 computers. The TDAQ Controls system has to guarantee the smooth and synchronous operation of all the TDAQ components and has to provide the means to minimize the downtime of the system caused by runtime failures. During data-taking runs, streams of information messages sent or published by running applications are the main sources of knowledge about the correctness of running operations. The huge flow of operational monitoring data produced is constantly monitored by experts in order to detect problems or misbehaviours. Given the scale of the system and the rates of data to be analyzed, automation of the system's functionality in the areas of operational monitoring, system verification, error detection and recovery is a strong requirement. To accomplish its objective, the Controls system includes high-level components based on advanced software technologies, namely a rule-based Expert System and Complex Event Processing engines. The chosen techniques make it possible to formalize, store and reuse the knowledge of experts, and thus to assist the shifters in the ATLAS control room during data-taking activities.
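The kind of message-stream rule that a Complex Event Processing engine formalizes can be sketched as a sliding-window burst detector. This is a minimal illustration only; the application name, window length, and threshold below are invented and are not part of the ATLAS system.

```python
from collections import deque

# Hedged sketch of a CEP-style rule: raise an alert when an application
# emits `threshold` or more ERROR messages within a sliding time window.
def make_burst_detector(window_s=10.0, threshold=3):
    events = deque()  # (timestamp, app) pairs currently inside the window

    def on_message(timestamp, app, severity):
        if severity != "ERROR":
            return None
        events.append((timestamp, app))
        # Drop events that have fallen out of the window.
        while events and timestamp - events[0][0] > window_s:
            events.popleft()
        count = sum(1 for _, a in events if a == app)
        if count >= threshold:
            return f"alert: {app} emitted {count} errors in {window_s}s"
        return None

    return on_message

detector = make_burst_detector(window_s=10.0, threshold=3)
assert detector(0.0, "hltpu-42", "ERROR") is None  # 1st error: no alert
assert detector(2.0, "hltpu-42", "ERROR") is None  # 2nd error: no alert
alert = detector(4.0, "hltpu-42", "ERROR")         # 3rd error in window
```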
An Expert Map of Gambling Risk Perception.
Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul
2015-12-01
The purpose of the current study was to investigate the moderating or mediating role played by risk perception in decision-making, gambling behaviour, and disordered gambling aetiology. Eleven gambling expert clinicians and researchers completed a semi-structured interview derived from mental models and grounded theory methodologies. Expert interview data were used to construct a comprehensive expert mental model 'map' detailing risk-perception related factors contributing to harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and emergent themes. Findings indicated that experts considered that idiosyncratic beliefs among gamblers result in overall underestimates of risk and loss, insufficient prioritization of needs, and poor planning and implementation of risk-management strategies. Additional contextual factors influencing the use of risk information (reinforcement and learning; mental states, environmental cues, ambivalence; and socio-cultural and biological variables) acted to shape risk perceptions and increase vulnerability to harm or disordered gambling. It was concluded that understanding the nature, extent and processes by which risk perception predisposes an individual to maintain gambling despite adverse consequences can guide the content of preventative, educational responsible-gambling campaigns.
A water balance approach to enhance national (GB) Daily Landslide Hazard Assessments
NASA Astrophysics Data System (ADS)
Dijkstra, Tom; Reeves, Helen; Freeborough, Katy; Dashwood, Claire; Pennington, Catherine; Jordan, Hannah; Hobbs, Peter; Richardson, Jennifer; Banks, Vanessa; Cole, Steven; Wells, Steven; Moore, Robert
2017-04-01
The British Geological Survey (BGS) is a member of the Natural Hazards Partnership (NHP) and delivers a national (GB) daily landslide hazard assessment (DLHA). The DLHA is based largely on 'expert' driven evaluations of the likelihood of landslides in response to antecedent ground conditions, adverse weather and reported landslide events. It concentrates on shallow translational slides and debris flows - events that most frequently have societal consequences by disrupting transport infrastructure and affecting buildings. Considerable experience with the issuing of DLHAs has been gained since 2012. However, it remains very difficult to appropriately assess changing ground conditions throughout GB even when good quality precipitation forecasts are available. Soil moisture sensors are available, but the network is sparse and not yet capable of covering GB to the detail required to underpin the forecasts. Therefore, we developed an approach where temporal and spatial variations in soil moisture can be obtained from a water balance model, representing processes in the near-surface and configured on a relatively coarse grid of 1 km2. Model outputs are not intended to be relevant to the slope scale. The assumption is that the likelihood of landslides being triggered by rainfall is dependent upon the soil moisture conditions of the near-surface, in combination with how much rain is forecast to occur for the following day. These variables form the basis for establishing thresholds to guide the issuing of DLHA and early warnings. The main aim is to obtain an insight into regional patterns of change and threshold exceedance. The BGS water balance model is still in its infancy and it requires substantial work to fine-tune and validate it. To test the performance of the BGS model we focused on an analysis of Scottish landslides (2004-2015) comprising translational slides and debris flows where the BGS model is conditionally evaluated against the Grid-to-Grid (G2G) Model. 
G2G is a physical-conceptual distributed hydrological model developed by the Centre for Ecology & Hydrology, also an NHP member. G2G is especially suited to simulating river flows over ungauged areas and has the capability to forecast fluvial river flows at any location across a gridded model domain. This is achieved by using spatial datasets on landscape properties - terrain, land-cover, soil and geology - in combination with gridded time-series of rainfall to shape a rainfall pattern into a river flow response over the model domain. G2G is operational on a 1 km2 grid over GB and outputs soil moisture estimates that take some account of terrain slope in the water balance calculation. This research is part of an evolutionary process in which the capability to establish the likelihood of landslides will develop as datasets become increasingly detailed (and accessible) and the representation of hydrogeological and geotechnical processes continues to improve.
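The threshold logic described above, combining modelled near-surface soil moisture with the next day's rainfall forecast per grid cell, can be sketched schematically. The function, names, and threshold values below are illustrative assumptions, not the BGS or G2G implementation.

```python
# Schematic of the per-cell threshold test described in the text: a 1 km^2
# cell is flagged when modelled near-surface wetness and forecast rainfall
# jointly exceed a calibrated threshold. All numbers are invented.
def landslide_alert(soil_moisture_frac, forecast_rain_mm,
                    wet_threshold=0.85, rain_threshold_mm=30.0):
    """soil_moisture_frac: modelled near-surface wetness, 0..1
    forecast_rain_mm: next-day rainfall forecast for the cell."""
    # Wetter ground needs less rain to trigger shallow failures, so the
    # rain threshold is scaled down as the cell approaches saturation.
    effective_threshold = rain_threshold_mm * (1.0 - soil_moisture_frac)
    return (soil_moisture_frac >= wet_threshold
            and forecast_rain_mm >= effective_threshold)

assert landslide_alert(0.9, 10.0)      # wet cell, modest rain -> flagged
assert not landslide_alert(0.5, 10.0)  # dry cell -> not flagged
```

In practice such thresholds would be calibrated against a landslide inventory, as in the Scottish 2004-2015 evaluation mentioned above.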
Venugopal, Divya; Rafi, Aboobacker Mohamed; Innah, Susheela Jacob; Puthayath, Bibin T.
2017-01-01
BACKGROUND: Process Excellence is a value-based approach that focuses on standardizing work processes by eliminating non-value-added processes, identifying process-improvement methodologies, and maximizing the capacity and expertise of the staff. AIM AND OBJECTIVES: To evaluate the utility of Process Excellence tools in improving donor flow management in a tertiary care hospital by studying the current state of donor movement within the blood bank and providing recommendations for eliminating wait times and improving the process and workflow. MATERIALS AND METHODS: The work was done in two phases. The first phase comprised on-site observations with the help of an expert trained in Process Excellence methodology, who observed and documented various aspects of donor flow, donor turnaround time, total staff details and operator process flow. The second phase comprised the constitution of a team to analyse the data collected. The analysed data, along with the recommendations, were presented before an expert hospital committee and the management. RESULTS: Our analysis put forward our strengths and identified potential problems. Donor wait time was reduced by 50% after the lean intervention, owing to better donor management and reorganization of the infrastructure of the donor area. Receptionist tracking showed that the staff spent 62% of their total time walking and 22% on other non-value-added activities. Defining duties for each staff member reduced the time they spent on non-value-added activities. Implementation of a token system, generation of a unique identification code for donors, and bar-code labelling of the tubes and bags are among the other recommendations. CONCLUSION: Process Excellence is not a programme; it is a culture that transforms an organization and improves its quality and efficiency through new attitudes, elimination of waste and reduction in costs. PMID:28970681
Introducing Managers to Expert Systems.
ERIC Educational Resources Information Center
Finlay, Paul N.; And Others
1991-01-01
Describes a short course to expose managers to expert systems, consisting of (1) introductory lecture; (2) supervised computer tutorial; (3) lecture and discussion about knowledge structuring and modeling; and (4) small group work on a case study using computers. (SK)
The ATLAS Tier-0: Overview and operational experience
NASA Astrophysics Data System (ADS)
Elsing, Markus; Goossens, Luc; Nairz, Armin; Negri, Guido
2010-04-01
Within the ATLAS hierarchical, multi-tier computing infrastructure, the Tier-0 centre at CERN is mainly responsible for prompt processing of the raw data coming from the online DAQ system, for archiving the raw and derived data on tape, for registering the data with the relevant catalogues, and for distributing them to the associated Tier-1 centres. The Tier-0 is already fully functional. It has successfully participated in all cosmic and commissioning data taking since May 2007, and was ramped up to its foreseen full size, performance and throughput for the cosmic (and short single-beam) run periods between July and October 2008. Data and work flows for collision data taking were exercised in several "Full Dress Rehearsals" (FDRs) in the course of 2008. The transition from an expert-based to a shifter-based system was successfully made in July 2008. This article will give an overview of the Tier-0 system, its data and work flows, and its operations model. It will review the operational experience gained in cosmic, commissioning, and FDR exercises during the past year, and give an outlook on planned developments and the evolution of the system towards first collision data taking, now expected in late Autumn 2009.
Evaluation of a procedure to assess the adverse effects of illicit drugs.
van Amsterdam, J G C; Best, W; Opperhuizen, A; de Wolff, F A
2004-02-01
The assessment procedure for new synthetic illicit drugs that are not documented in the UN treaty on psychotropic drugs was evaluated using a modified Electre model. Drugs were evaluated by an expert panel via the open Delphi approach, in which the written scores on 16 items, covering medical, health, legal, and criminalistic aspects of the drugs, were discussed. After this face-to-face discussion the drugs were scored again. Taking the assessment of ketamine as an example, it appeared that each expert used his or her own scale to score, and that policymakers did not score differently from experts trained in the medical-biological field. Of the five drugs evaluated by the panel, p-methoxy-metamphetamine (PMMA), gamma-hydroxybutyric acid (GHB), and 4-methylthio-amphetamine (MTA) were assessed as more adverse than ketamine and psilocine- and psilocybine-containing mushrooms. Whereas some experts slightly adjusted their opinion on ketamine and PMMA during the assessment procedure, the opinion on mushrooms was not affected by the discussion held between the two scoring rounds. All experts ranked the five drugs in a similar way on the adverse-effect scale (i.e., the concordance scale of the Electre model), indicating unanimity within the expert panel with respect to the risk classification of these drugs of abuse.
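The concordance step of an Electre-style outranking model can be sketched as follows. The scores and weights below are invented for illustration and are not the panel's actual data.

```python
# Minimal sketch of the concordance computation in an Electre-style
# outranking model: C[a][b] is the weighted share of criteria on which
# alternative a is rated at least as adverse as alternative b.
def concordance(scores, weights):
    """scores[a]: list of expert scores for alternative a per criterion
    (higher = more adverse); weights: one weight per criterion."""
    total = sum(weights)
    names = list(scores)
    C = {}
    for a in names:
        for b in names:
            if a == b:
                continue
            w = sum(wi for sa, sb, wi in
                    zip(scores[a], scores[b], weights) if sa >= sb)
            C[(a, b)] = w / total
    return C

# Invented scores on three criteria (e.g. medical, health, legal harm):
scores = {"PMMA": [5, 4, 4], "ketamine": [3, 3, 4], "mushrooms": [2, 2, 1]}
C = concordance(scores, weights=[2, 1, 1])
# PMMA is at least as adverse as ketamine on every criterion:
assert C[("PMMA", "ketamine")] == 1.0
```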
NASA Astrophysics Data System (ADS)
Mo, Yunjeong
The purpose of this research is to support the development of an intelligent Decision Support System (DSS) by integrating quantitative information with expert knowledge in order to facilitate effective retrofit decision-making. To achieve this goal, the Energy Retrofit Decision Process Framework is analyzed. Expert system shell software, a retrofit measure cost database, and energy simulation software are needed for developing the DSS; Exsys Corvid, the NREM database and BEopt were chosen for implementing an integration model. This integration model demonstrates the holistic function of a residential energy retrofit system for existing homes, by providing a prioritized list of retrofit measures with cost information, energy simulation and expert advice. The users, such as homeowners and energy auditors, can acquire all of the necessary retrofit information from this unified system without having to explore several separate systems. The integration model plays the role of a prototype for the finalized intelligent decision support system. It implements all of the necessary functions for the finalized DSS, including integration of the database, energy simulation and expert knowledge.
Renewable energy education and industrial arts: linking knowledge producers with knowledge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foley, R.L.
This study introduces renewable energy technology into the industrial arts programs in the State of New Hampshire by providing the following information for decision making: (1) a broad-based perspective on renewable energy technology; (2) the selection of an educational change model; (3) data from a needs analysis; (4) an initial screening of potential teacher-trainers. The Wolf-Welsh Linkage Model was selected as the knowledge production/utilization model for bridging the knowledge gap between renewable energy experts and industrial arts teachers. Ninety-six renewable energy experts were identified by a three-step peer nomination process (92% response rate). The experts stressed the conceptual foundations, economic justifications, and the scientific and quantitative basics of renewable energy technology. The teachers focused on wood-burning technology, educational strategies, and the more popular alternative energy sources such as windpower, hydropower, photovoltaics, and biomass. The most emphatic contribution of the needs analysis was the experts' and teachers' shared perception that residential/commercial building design, retrofitting, and construction is the single most important practical, technical area for the application of renewable energy technology.
Peer Review Documents Related to the Evaluation of ...
BMDS is one of the Agency's premier tools for risk assessment; the validity and reliability of its statistical models are therefore of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS application and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.
Rural Policies for the 1990s. Rural Studies Series.
ERIC Educational Resources Information Center
Flora, Cornelia B., Ed.; Christenson, James A., Ed.
Written by some of the foremost experts on rural America, this book focuses on policy-relevant research on the problems of rural areas. In each chapter, rural policy needs are identified by examining the flow of events and rural sociology of the 1980s. Chapters are: (1) "Critical Times for Rural America: The Challenge for Rural Policy in the…
Network approaches for expert decisions in sports.
Glöckner, Andreas; Heinen, Thomas; Johnson, Joseph G; Raab, Markus
2012-04-01
This paper focuses on a model comparison to explain choices based on gaze behavior via simulation procedures. We tested two classes of models, a parallel constraint satisfaction (PCS) artificial neural network model and an accumulator model, in a handball decision-making task from a lab experiment. Both models predict action in an option-generation task in which options can be chosen from the perspective of a playmaker in handball (i.e., passing to another player or shooting at the goal). Model simulations are based on a dataset of generated options together with gaze behavior measurements from 74 expert handball players for 22 pieces of video footage. We implemented both classes of models as deterministic vs. probabilistic models, including and excluding fitted parameters. Results indicated that both classes of models can fit and predict participants' initially generated options based on gaze behavior data, and that overall, the classes of models performed about equally well. Early fixations were particularly predictive of choices. We conclude that the analysis of complex environments via network approaches can be successfully applied to the field of experts' decision making in sports and provides perspectives for further theoretical developments. Copyright © 2011 Elsevier B.V. All rights reserved.
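The accumulator-model idea tested above can be illustrated with a toy sketch: evidence for each generated option grows with fixations on it, and the first option whose accumulator crosses a threshold is chosen. The fixation sequence, gain, and threshold below are invented, not the study's fitted parameters.

```python
# Toy deterministic accumulator model driven by a fixation sequence.
def accumulate_choice(fixations, options, gain=1.0, threshold=3.0):
    """fixations: time-ordered list of fixated option labels.
    Returns the first option whose evidence crosses the threshold."""
    evidence = {o: 0.0 for o in options}
    for o in fixations:
        evidence[o] += gain  # each fixation adds evidence for its option
        if evidence[o] >= threshold:
            return o
    # No accumulator crossed the threshold: fall back to max evidence.
    return max(evidence, key=evidence.get)

# Early, repeated fixations on one option drive the predicted choice:
seq = ["pass_left", "shoot", "pass_left", "pass_left", "shoot"]
assert accumulate_choice(seq, ["pass_left", "shoot"]) == "pass_left"
```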
The numerical modelling and process simulation for the fault diagnosis of rotary kiln incinerator.
Roh, S D; Kim, S W; Cho, W S
2001-10-01
Numerical modelling and process simulation for the fault diagnosis of a rotary kiln incinerator were accomplished. In the numerical modelling, two models were applied: a combustion chamber model, including the mass and energy balance equations for the two combustion chambers, and a 3D thermal model. The combustion chamber model predicts the temperature within the kiln, the flue gas composition, the flux, and the heat of combustion. Using the combustion chamber model and the 3D thermal model, production rules for the process simulation can be obtained through analysis of the interrelation between control and operation variables. The process simulation of the kiln is operated with the production rules for automatic operation. The process simulation aims to provide fundamental solutions to problems in the incineration process by introducing an online expert control system that provides integrity in process control and management. Knowledge-based expert control systems use symbolic logic and heuristic rules to find solutions for various types of problems. The system was implemented as a hybrid intelligent expert control system by connecting it with the process control systems, giving it the capability of process diagnosis, analysis and control.
Mahalingam, S; Awad, Z; Tolley, N S; Khemani, S
2016-08-01
The objective of this study was to identify and investigate the face and content validity of ventilation tube insertion (VTI) training models described in the literature. A review of the literature was carried out to identify articles describing VTI simulators. Feasible models were replicated and assessed by a group of experts in a postgraduate simulation centre. Experts were defined as surgeons who had performed at least 100 VTIs on patients. Seventeen experts participated, ensuring sufficient statistical power for the analysis. A standardised 18-item Likert-scale questionnaire was used. This addressed face validity (realism), global and task-specific content (suitability of the model for teaching) and curriculum recommendation. The search revealed eleven models, of which only five had associated validity data. Five models were found to be feasible to replicate. None of the tested models achieved face or global content validity. Only one model achieved task-specific validity, and hence there was no agreement on curriculum recommendation. The quality of current simulation models is moderate and there is room for improvement. There is a need for new models to be developed, or existing ones to be refined, in order to construct a more realistic training platform for VTI simulation. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Gavarieva, K. N.; Simonova, L. A.; Pankratov, D. L.; Gavariev, R. V.
2017-09-01
The article considers the main component of an expert system for the die casting (casting under pressure) process, which consists of algorithms united into logical models. The characteristics of the system, which present data on the state of the controlled object, are described. A sequence of logically interconnected steps is developed that makes it possible to increase the quality of the produced castings.
ERIC Educational Resources Information Center
Khlaisang, Jintavee
2010-01-01
The purpose of this study was to investigate proper website and courseware for e-learning in higher education. Methods used in this study included the data collection, the analysis surveys, the experts' in-depth interview, and the experts' focus group. Results indicated that there were 16 components for website, as well as 16 components for…
Proceedings: USACERL/ASCE First Joint Conference on Expert Systems, 29-30 June 1988
1989-01-01
A methodology of inductive shallow modeling was developed. Inductive systems may become powerful shallow modeling tools applicable to a large class of… Analysis was conducted using a statistical package, Trajectories. Four different types of relationships were analyzed: linear, logarithmic, power, and…
NASA Astrophysics Data System (ADS)
Krause, Lee S.; Burns, Carla L.
2000-06-01
This paper discusses the research currently in progress to develop the Conceptual Federation Object Model Design Tool. The objective of the Conceptual FOM (C-FOM) Design Tool effort is to provide domain and subject matter experts, such as scenario developers, with automated support for understanding and utilizing available HLA simulation and other simulation assets during HLA Federation development. The C-FOM Design Tool will import Simulation Object Models from HLA reuse repositories, such as the MSSR, to populate the domain space that will contain all the objects and their supported interactions. In addition, the C-FOM tool will support the conversion of non-HLA legacy models into HLA- compliant models by applying proven abstraction techniques against the legacy models. Domain experts will be able to build scenarios based on the domain objects and interactions in both a text and graphical form and export a minimal FOM. The ability for domain and subject matter experts to effectively access HLA and non-HLA assets is critical to the long-term acceptance of the HLA initiative.
[Study on Information Extraction of Clinic Expert Information from Hospital Portals].
Zhang, Yuanpeng; Dong, Jiancheng; Qian, Danmin; Geng, Xingyun; Wu, Huiqun; Wang, Li
2015-12-01
Clinic expert information provides important references for residents in need of hospital care. Usually, such information is hidden in the deep web and cannot be directly indexed by search engines. To extract clinic expert information from the deep web, the first challenge is to make a judgment on search forms. This paper proposes a novel method based on a domain model, which is a tree structure constructed from the attributes of search interfaces. With this model, search interfaces can be classified into a domain and filled in with domain keywords. Another challenge is to extract information from the returned web pages indexed by search interfaces. To filter out the noise information on a web page, a block importance model is proposed. The experimental results indicated that the domain model yielded a precision 10.83% higher than that of the rule-based method, whereas the block importance model yielded an F₁ measure 10.5% higher than that of the XPath method.
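The form-judgment step can be sketched in a simplified way: here the domain model is flattened to a set of attribute terms, and a search interface is assigned to the domain whose attributes best cover its field labels. The attribute sets and the Jaccard-overlap scoring below are illustrative assumptions, not the paper's tree-based method.

```python
# Hedged sketch of form classification against a (flattened) domain model:
# assign a search form to the domain with the highest Jaccard overlap
# between the form's field labels and the domain's attribute terms.
def classify_form(field_labels, domain_attributes):
    """Return (best_domain, score) by Jaccard overlap of label sets."""
    fields = set(field_labels)
    best, best_score = None, 0.0
    for domain, attrs in domain_attributes.items():
        attrs = set(attrs)
        score = len(fields & attrs) / len(fields | attrs)
        if score > best_score:
            best, best_score = domain, score
    return best, best_score

# Invented attribute sets for two candidate domains:
domains = {
    "clinic_expert": ["name", "department", "title", "specialty"],
    "drug": ["name", "dosage", "manufacturer"],
}
best, score = classify_form(["name", "department", "specialty"], domains)
assert best == "clinic_expert"
```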
Three CLIPS-based expert systems for solving engineering problems
NASA Technical Reports Server (NTRS)
Parkinson, W. J.; Luger, G. F.; Bretz, R. E.
1990-01-01
We have written three expert systems using the CLIPS PC-based expert system shell. These three expert systems are rule based and relatively small, with the largest containing slightly fewer than 200 rules. The first is an expert assistant written to help users of the ASPEN computer code choose the proper thermodynamic package to use with their particular vapor-liquid equilibrium problem. The second was designed to help petroleum engineers choose the proper enhanced oil recovery method for a given reservoir; the effectiveness of each technique is highly dependent upon the reservoir conditions. The third is a combination consultant and control system, designed specifically for silicon carbide whisker growth. Silicon carbide whiskers are an extremely strong product used to make ceramic and metal composites. The manufacture of whiskers is a very complicated process which, to date, has defied a good mathematical model. The process was run by experts who had gained their expertise by trial and error. A system of rules was devised by these experts both for procedure setup and for process control. In this paper we discuss the design, development and evaluation of the three CLIPS-based programs.
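The kind of logic the second system encodes can be sketched as a small forward rule base (shown here in Python rather than CLIPS). The reservoir properties and thresholds are invented for illustration and are not the actual rules of the system.

```python
# Rule-based sketch: match reservoir conditions against an ordered rule
# list to recommend an enhanced-oil-recovery method. Thresholds invented.
RULES = [
    # (condition on reservoir facts, recommendation)
    (lambda r: r["oil_gravity_api"] < 20 and r["depth_ft"] < 3000,
     "steam flooding"),
    (lambda r: r["oil_gravity_api"] >= 25 and r["depth_ft"] > 2500,
     "CO2 miscible flooding"),
    (lambda r: True, "waterflooding (default)"),  # catch-all rule
]

def recommend(reservoir):
    """Fire the first rule whose condition matches the reservoir facts."""
    for condition, method in RULES:
        if condition(reservoir):
            return method

heavy_shallow = {"oil_gravity_api": 14, "depth_ft": 1500}
assert recommend(heavy_shallow) == "steam flooding"
```

A real CLIPS implementation would express each entry as a `defrule` matching facts in working memory; the ordered-list dispatch here is only a compact approximation.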
Thalén, M; Sundberg, J
2001-01-01
The voice is apparently used in quite different manners in different styles of singing. Some of these differences concern the voice source, which varies considerably with loudness, pitch, and mode of phonation. We attempt to describe voice source differences between Classical, Pop, Jazz and Blues styles of singing as produced in a triad melody pattern by a professional female singer in soft, middle and loud phonation. An expert panel was asked to identify these triads as examples of either Classical, Pop, Jazz or Blues. The voice source was analysed by inverse filtering. Subglottal pressure Ps, closed quotient QClosed, glottal compliance (ratio between the air volume contained in a voice pulse and Ps), and the level difference between the two lowest source spectrum partials were analysed in the styles and in four modes of phonation: breathy, flow, neutral, and pressed. The same expert panel rated the degree of pressedness in the entire material. Averages across pitch were calculated for each mode and style and related to their total range of variation in the subject. The glottogram data showed a high correlation with the ratings of pressedness. Based on these correlations a pressedness factor was computed from the glottogram data. A phonation map was constructed with the axes representing mean adduction factor and mean Ps, respectively. In this map Classical was similar to flow phonation, Pop and Jazz to neutral and flow phonation, and Blues to pressed phonation.
Using cognitive task analysis to create a teaching protocol for bovine dystocia.
Read, Emma K; Baillie, Sarah
2013-01-01
When learning skilled techniques and procedures, students face many challenges. Learning is easier when detailed instructions are available, but experts often find it difficult to articulate all of the steps involved in a task or relate to the learner as a novice. This problem is further compounded when the technique is internal and unsighted (e.g., obstetrical procedures). Using expert bovine practitioners and a life-size model cow and calf, the steps and decision making involved in performing correction of two different dystocia presentations (anterior leg back and breech) were deconstructed using cognitive task analysis (CTA). Video cameras were positioned to capture movement inside and outside the cow model while the experts were asked to first perform the technique as they would in a real situation and then perform the procedure again as if articulating the steps to a novice learner. The audio segments were transcribed and, together with the video components, analyzed to create a list of steps for each expert. Consensus was achieved between experts during individual interviews followed by a group discussion. A "gold standard" list or teaching protocol was created for each malpresentation. CTA was useful in defining the technical and cognitive steps required to both perform and teach the tasks effectively. Differences between experts highlight the need for consensus before teaching the skill. In addition, the study identified several different, yet effective, techniques and provided information that could allow experts to consider other approaches they might use when their own technique fails.
ERIC Educational Resources Information Center
Cocking, Rodney R.; Mestre, Jose P.
The focus of this paper is on cognitive science as a model for understanding the application of human skills toward effective problem-solving. Sections include: (1) "Introduction" (discussing information processing framework, expert-novice distinctions, schema theory, and learning process); (2) "Application: The Expert-Novice…
Development and validation of a mass casualty conceptual model.
Culley, Joan M; Effken, Judith A
2010-03-01
To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
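The consensus criteria quoted above (interquartile range of no more than 1 scale point, change in the response distribution under 15% between rounds, and at least 70% agreement) can be sketched per item as follows. How "stability" and "percent agreement" are operationalized here, as the fraction of panelists who changed their rating and the fraction rating at or above a threshold, is an assumption for illustration.

```python
# Sketch of per-item Delphi consensus checks on a 7-point scale.
# The operationalization of stability and agreement is an assumption.
from statistics import quantiles

def iqr(scores):
    q1, _, q3 = quantiles(scores, n=4, method="inclusive")
    return q3 - q1

def stability(round1, round2):
    """Fraction of panelists whose rating changed between rounds."""
    changed = sum(1 for a, b in zip(round1, round2) if a != b)
    return changed / len(round1)

def agreement(scores, threshold=6):
    """Fraction of panelists rating the item at or above `threshold`."""
    return sum(1 for s in scores if s >= threshold) / len(scores)

def consensus(round1, round2):
    # Criteria from the abstract: IQR <= 1 scale point, <15% change
    # between rounds, and >=70% agreement.
    return (iqr(round2) <= 1
            and stability(round1, round2) < 0.15
            and agreement(round2) >= 0.70)
```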
Streamlining Transportation Corridor Planning Processes: Freight and Traffic Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franzese, Oscar
2010-08-01
The traffic investigation is one of the most important parts of an Environmental Impact Statement (EIS) for projects involving the construction of new roadway facilities and/or the improvement of existing ones. The focus of the traffic analysis is the determination of the anticipated traffic flow characteristics of the proposed project through analytical methods that can be grouped under the umbrella of capacity analysis methodologies. In general, the main traffic parameter used in EISs to describe the quality of traffic flow is the Level of Service (LOS). The current state of the practice in traffic investigations for EISs has two main shortcomings. The first is related to the information necessary to conduct the traffic analysis, and specifically to the lack of integration among the different transportation models and the sources of information that, in general, reside in GIS databases. A discussion of the benefits of integrating CRS&SI technologies and the transportation models used in the EIS traffic investigation is included. The second shortcoming is in the presentation of the results, in terms of appearance and formatting as well as content. The presentation of traffic results (current and proposed) is discussed. This chapter also addresses the need for additional data, in terms of content and coverage. Regarding the former, other traffic parameters (e.g., delays) that are more meaningful to non-transportation experts than LOS, as well as additional information (e.g., freight flows) that can impact traffic conditions and safety, are discussed. Spatial information technologies can decrease the negative effects of, and even eliminate, these shortcomings by making the relevant information that is input to the models more complete and readily available, and by providing the means to communicate the results more clearly and efficiently.
The benefits that the application and use of CRS&SI technologies can provide to improve and expedite the traffic investigation part of the EIS process are presented.
NASA Astrophysics Data System (ADS)
Pohlmann, K. F.; Zhu, J.; Ye, M.; Carroll, R. W.; Chapman, J. B.; Russell, C. E.; Shafer, D. S.
2006-12-01
Yucca Mountain (YM), Nevada has been recommended as a deep geological repository for the disposal of spent fuel and high-level radioactive waste. If YM is licensed as a repository by the Nuclear Regulatory Commission, it will be important to identify the potential for radionuclides to migrate from underground nuclear testing areas located on the Nevada Test Site (NTS) to the hydraulically downgradient repository area to ensure that monitoring does not incorrectly attribute repository failure to radionuclides originating from other sources. In this study, we use the Death Valley Regional Flow System (DVRFS) model developed by the U.S. Geological Survey to investigate potential groundwater migration pathways and associated travel times from the NTS to the proposed YM repository area. Using results from the calibrated DVRFS model and the particle tracking post-processing package MODPATH we modeled three-dimensional groundwater advective pathways in the NTS and YM region. Our study focuses on evaluating the potential for groundwater pathways between the NTS and YM withdrawal area and whether travel times for advective flow along these pathways coincide with the prospective monitoring time frame at the proposed repository. We include uncertainty in effective porosity as this is a critical variable in the determination of time for radionuclides to travel from the NTS region to the YM withdrawal area. Uncertainty in porosity is quantified through evaluation of existing site data and expert judgment and is incorporated in the model through Monte Carlo simulation. Since porosity information is limited for this region, the uncertainty is quite large and this is reflected in the results as a large range in simulated groundwater travel times.
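Because advective travel time along a fixed pathway scales linearly with effective porosity (t = L·n/q for path length L, Darcy flux q, and porosity n), a broad porosity distribution maps directly onto a broad travel-time range, as the abstract reports. A minimal Monte Carlo sketch of that propagation follows; the path length, flux, and porosity bounds are invented for illustration, not DVRFS values.

```python
import random

def travel_time_yr(path_length_m, darcy_flux_m_per_yr, porosity):
    # Seepage velocity v = q / n, so advective travel time t = L * n / q.
    return path_length_m * porosity / darcy_flux_m_per_yr

def monte_carlo_travel_times(path_length_m, darcy_flux_m_per_yr,
                             porosity_low, porosity_high, n_samples, seed=0):
    """Sample porosity uniformly (an illustrative choice) and propagate
    the uncertainty to travel time along one pathway."""
    rng = random.Random(seed)
    return [travel_time_yr(path_length_m, darcy_flux_m_per_yr,
                           rng.uniform(porosity_low, porosity_high))
            for _ in range(n_samples)]
```

Even this one-pathway toy shows the effect noted in the study: a wide porosity range produces an order-of-magnitude spread in simulated travel times.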
NASA Astrophysics Data System (ADS)
Lawley, Russell; Barron, Mark; Lee, Katy
2014-05-01
Uncertainty in geological linework: communicating the expert's tacit model to the data user(s) by expert elicitation. R. Lawley, M. Barron and K. Lee. NERC - British Geological Survey, Environmental Science Centre, Keyworth, Nottingham, UK, NG12 5GG The boundaries mapped in traditional field geological survey are subject to a wide range of inherent uncertainties. A map at a survey-scale of 1:10,000 is created by a combination of terrain interpretation, direct observations from boreholes and exposures (often sparsely distributed), and indirect interpretation of proxy variables such as soil properties, vegetation and remotely sensed images. A critical factor influencing the quality of the final map is the skill and experience of the surveyor to bring this information together in a coherent conceptual model. The users of geological data comprising or based on mapped boundaries are increasingly aware of these uncertainties, and want to know how to manage them. The growth of 3D modelling, which takes 2D surveys as a starting point, adds urgency to the need for a better understanding of survey uncertainties; particularly where 2D mapping of variable vintage has been compiled into a national coverage. Previous attempts to apply confidence on the basis of metrics such as data density, survey age or survey techniques have proved useful for isolating single, critical, factors but do not generally succeed in evaluating geological mapping 'in the round', because they cannot account for the 'conceptual' skill set of the surveyor. The British Geological Survey (BGS) is using expert elicitation methods to gain a better understanding of uncertainties within the national geological map of Great Britain. The expert elicitation approach starts with the assumption that experienced surveyors have an intuitive sense of the uncertainty of the boundaries that they map, based on a tacit model of geology and its complexity and the nature of the surveying process. 
The objective of elicitation is to extract this model in a usable, quantitative form by a robust and transparent procedure. At BGS, expert elicitation is being used to evaluate the uncertainty of mapped boundaries in different common mapping scenarios, with a view to building a 'collective' understanding of the challenges each scenario presents. For example, a 'sharp contact (at surface) between highly contrasting sedimentary rocks' represents one level of survey challenge that should be accurately met by all surveyors, even novices. In contrast, a 'transitional boundary defined by localised facies-variation' may require much more experience to resolve (without recourse to significantly more sampling). We will describe the initial phase of this exercise, in which uncertainty models were elicited for mapped boundaries in six contrasting scenarios. Each scenario was presented to a panel of experts with varied expertise and career histories. In five cases it was possible to arrive at a consensus model; in the sixth, experts with different experience took different views of the nature of the mapping problem. We will discuss our experience of the use of elicitation methodology and the implications of our results for further work at the BGS to quantify uncertainty in map products. In particular, we will consider the value of elicitation as a means to capture the expertise of individuals as they retire and as the composition of the organisation's staff changes in response to management and policy decisions.
Matin, Ivan; Hadzistevic, Miodrag; Vukelic, Djordje; Potran, Michal; Brajlih, Tomaz
2017-07-01
Nowadays, integrated CAD/CAE systems are the favored solution for designing simulation models for casting metal substructures of metal-ceramic crowns. Authors worldwide have used different approaches to solve these problems with expert systems. Despite substantial research progress, expert systems for simulation model design and manufacturing have insufficiently considered the specifics of casting in dentistry, especially the need for further CAD, RE, and CAE support for the estimation of casting parameters and the control of the casting machine. The novel expert system performs the following: CAD modeling of the simulation model for casting, fast modeling of the gate design, CAD eligibility and castability checks of the model, estimation and running of the program code for the casting machine, and reduction of the manufacturing time of the metal substructure. The authors propose an integration method using a common data model approach, blackboard architecture, rule-based reasoning, and an iterative redesign method. Arithmetic mean roughness values were determined with a constant Gaussian low-pass filter (cut-off length of 2.5 mm) according to ISO 4287 using a Mahr MARSURF PS1. The dimensional deviation between the designed model and the manufactured cast was determined using a Zeiss Contura G2 coordinate measuring machine and GOM Inspect software. The ES yields castings with roughness grade number N7. The dimensional deviation between the simulation model of the metal substructure and the manufactured cast is 0.018 mm. The arithmetic mean roughness values measured on the cast substructure range from 1.935 µm to 2.778 µm. The developed expert system with its integrated database is fully applicable to the observed hardware and software. The arithmetic mean roughness and dimensional deviation values indicate that the cast substructures have a surface quality that is more than sufficient for direct porcelain veneering.
The manufacture of the substructure shows that the proposed ES improves the design process while reducing the manufacturing time. Copyright © 2017 Elsevier B.V. All rights reserved.
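The roughness evaluation cited above follows ISO 4287: filter the measured profile with a Gaussian low-pass filter (2.5 mm cutoff) to obtain the waviness, subtract it to leave the roughness profile, and average the absolute deviations to get Ra. A minimal sketch on a synthetic profile; the edge renormalization is a simplification of the standard's end-effect treatment.

```python
import math

def gaussian_kernel(cutoff_mm, dx_mm):
    # ISO 16610-21 Gaussian weighting function; alpha makes the filter
    # transmit 50% at the cutoff wavelength.
    alpha = math.sqrt(math.log(2) / math.pi)
    half = int(2 * cutoff_mm / dx_mm)
    w = [math.exp(-math.pi * ((i * dx_mm) / (alpha * cutoff_mm)) ** 2)
         for i in range(-half, half + 1)]
    s = sum(w)
    return [v / s for v in w]

def ra(profile, dx_mm, cutoff_mm=2.5):
    """Arithmetic mean roughness: profile minus Gaussian-filtered waviness."""
    k = gaussian_kernel(cutoff_mm, dx_mm)
    half = len(k) // 2
    n = len(profile)
    roughness = []
    for i in range(n):
        num = den = 0.0
        for j, w in enumerate(k):
            idx = i + j - half
            if 0 <= idx < n:  # renormalize at the profile edges
                num += w * profile[idx]
                den += w
        roughness.append(profile[i] - num / den)
    return sum(abs(r) for r in roughness) / n
```

For a pure sine of amplitude A with wavelength far below the cutoff, Ra approaches 2A/π, a useful sanity check on the implementation.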
System of experts for intelligent data management (SEIDAM)
NASA Technical Reports Server (NTRS)
Goodenough, David G.; Iisaka, Joji; Fung, KO
1993-01-01
A proposal to conduct research and development on a system of expert systems for intelligent data management (SEIDAM) is being developed. CCRS has much expertise in developing systems for integrating geographic information with space and aircraft remote sensing data and in managing large archives of remotely sensed data. SEIDAM will be composed of expert systems grouped in three levels. At the lowest level, the expert systems will manage and integrate data from diverse sources, taking account of symbolic representation differences and varying accuracies. Existing software can be controlled by these expert systems, without rewriting existing software into an Artificial Intelligence (AI) language. At the second level, SEIDAM will take the interpreted data (symbolic and numerical) and combine these with data models. At the top level, SEIDAM will respond to user goals for predictive outcomes given existing data. The SEIDAM Project will address the research areas of expert systems, data management, storage and retrieval, and user access and interfaces.
System of Experts for Intelligent Data Management (SEIDAM)
NASA Technical Reports Server (NTRS)
Goodenough, David G.; Iisaka, Joji; Fung, KO
1992-01-01
It is proposed to conduct research and development on a system of expert systems for intelligent data management (SEIDAM). CCRS has much expertise in developing systems for integrating geographic information with space and aircraft remote sensing data and in managing large archives of remotely sensed data. SEIDAM will be composed of expert systems grouped in three levels. At the lowest level, the expert systems will manage and integrate data from diverse sources, taking account of symbolic representation differences and varying accuracies. Existing software can be controlled by these expert systems, without rewriting existing software into an Artificial Intelligence (AI) language. At the second level, SEIDAM will take the interpreted data (symbolic and numerical) and combine these with data models. At the top level, SEIDAM will respond to user goals for predictive outcomes given existing data. The SEIDAM Project will address the research areas of expert systems, data management, storage and retrieval, and user access and interfaces.
TES: A modular systems approach to expert system development for real-time space applications
NASA Technical Reports Server (NTRS)
Cacace, Ralph; England, Brenda
1988-01-01
A major goal of the Space Station era is to reduce reliance on support from ground based experts. The development of software programs using expert systems technology is one means of reaching this goal without requiring crew members to become intimately familiar with the many complex spacecraft subsystems. Development of an expert systems program requires a validation of the software with actual flight hardware. By combining accurate hardware and software modelling techniques with a modular systems approach to expert systems development, the validation of these software programs can be successfully completed with minimum risk and effort. The TIMES Expert System (TES) is an application that monitors and evaluates real time data to perform fault detection and fault isolation tasks as they would otherwise be carried out by a knowledgeable designer. The development process and primary features of TES, a modular systems approach, and the lessons learned are discussed.
An SSME High Pressure Oxidizer Turbopump diagnostic system using G2 real-time expert system
NASA Technical Reports Server (NTRS)
Guo, Ten-Huei
1991-01-01
An expert system which diagnoses various seal leakage faults in the High Pressure Oxidizer Turbopump of the SSME was developed using G2 real-time expert system. Three major functions of the software were implemented: model-based data generation, real-time expert system reasoning, and real-time input/output communication. This system is proposed as one module of a complete diagnostic system for the SSME. Diagnosis of a fault is defined as the determination of its type, severity, and likelihood. Since fault diagnosis is often accomplished through the use of heuristic human knowledge, an expert system based approach has been adopted as a paradigm to develop this diagnostic system. To implement this approach, a software shell which can be easily programmed to emulate the human decision process, the G2 Real-Time Expert System, was selected. Lessons learned from this implementation are discussed.
[Expertise test in the new Civil Prosecution Law (Law 1/2000)].
Laborda Calvo, E
2004-12-01
The expert test was the object of many controversies under the previous Civil Prosecution Law (CPL), from the way experts were appointed to the difficulties in receiving payment. The new CPL takes the social process as its model and provides civil justice with an agile procedure with full guarantees. The CPL gives the expert test greater breadth and new scope; it should be introduced at the time of the lawsuit and examined openly. Experts must defend their arguments and submit to the objections of the opposing party. The expert test thus becomes a mixed documentary and personal test. The law also modifies the way experts are appointed and the terms of acceptance, which may condition the allocation of funds in the amount considered necessary. Objection is limited to judicially appointed experts, who may be removed, although the reason for removal must be justified.
An SSME high pressure oxidizer turbopump diagnostic system using G2(TM) real-time expert system
NASA Technical Reports Server (NTRS)
Guo, Ten-Huei
1991-01-01
An expert system which diagnoses various seal leakage faults in the High Pressure Oxidizer Turbopump of the SSME was developed using G2(TM) real-time expert system. Three major functions of the software were implemented: model-based data generation, real-time expert system reasoning, and real-time input/output communication. This system is proposed as one module of a complete diagnostic system for Space Shuttle Main Engine. Diagnosis of a fault is defined as the determination of its type, severity, and likelihood. Since fault diagnosis is often accomplished through the use of heuristic human knowledge, an expert system based approach was adopted as a paradigm to develop this diagnostic system. To implement this approach, a software shell which can be easily programmed to emulate the human decision process, the G2 Real-Time Expert System, was selected. Lessons learned from this implementation are discussed.
Virtual building environments (VBE) - Applying information modeling to buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bazjanac, Vladimir
2004-06-21
A Virtual Building Environment (VBE) is a "place" where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software applications operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up, and characteristics of operation. It reports on the VBE Initiative and the benefits from a couple of early VBE projects.
Riches, S F; Payne, G S; Morgan, V A; Dearnaley, D; Morgan, S; Partridge, M; Livni, N; Ogden, C; deSouza, N M
2015-05-01
The objectives are to determine the optimal combination of MR parameters for discriminating tumour within the prostate using linear discriminant analysis (LDA) and to compare model accuracy with that of an experienced radiologist. Multiparameter MRI data were acquired in 24 patients before prostatectomy. Tumour outlines from whole-mount histology, the T2-defined peripheral zone (PZ), and the central gland (CG) were superimposed onto slice-matched parametric maps. T2, apparent diffusion coefficient, initial area under the gadolinium curve, vascular parameters (K(trans), Kep, Ve), and (choline+polyamines+creatine)/citrate were compared between tumour and non-tumour tissues. Receiver operating characteristic (ROC) curves determined sensitivity and specificity at spectroscopic voxel resolution and per lesion, and LDA determined the optimal multiparametric model for identifying tumours. Accuracy was compared with an expert observer. Tumours were significantly different from PZ and CG for all parameters (all p < 0.001). The area under the ROC curve for discriminating tumour from non-tumour was significantly greater (p < 0.001) for the multiparametric model than for individual parameters; at 90% specificity, sensitivity was 41% (MRSI voxel resolution) and 59% per lesion. At this specificity, an expert observer achieved 28% and 49% sensitivity, respectively. The model was more accurate when parameters from all techniques were included and performed better than an expert observer evaluating these data. • The combined model increases diagnostic accuracy in prostate cancer compared with individual parameters • The optimal combined model includes parameters from diffusion, spectroscopy, perfusion, and anatomical MRI • The computed model improves tumour detection compared to an expert viewing parametric maps.
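A minimal sketch of the modelling step described above: project multiparameter features onto a Fisher linear discriminant (pooled within-class scatter) and summarize discrimination with the area under the ROC curve via the rank-sum identity. The study combined more parameters with standard LDA software; the two-feature data here are invented for illustration.

```python
def fisher_lda_2d(class0, class1):
    """Fisher discriminant direction for two 2-D classes (pooled scatter)."""
    def mean2(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
    def scatter(pts, m):
        sxx = sum((p[0] - m[0]) ** 2 for p in pts)
        syy = sum((p[1] - m[1]) ** 2 for p in pts)
        sxy = sum((p[0] - m[0]) * (p[1] - m[1]) for p in pts)
        return sxx, syy, sxy
    m0, m1 = mean2(class0), mean2(class1)
    a0, a1 = scatter(class0, m0), scatter(class1, m1)
    sxx, syy, sxy = a0[0] + a1[0], a0[1] + a1[1], a0[2] + a1[2]
    det = sxx * syy - sxy * sxy
    dx, dy = m1[0] - m0[0], m1[1] - m0[1]
    # w = S^-1 (m1 - m0), using the closed-form 2x2 inverse
    return ((syy * dx - sxy * dy) / det, (sxx * dy - sxy * dx) / det)

def auc(scores0, scores1):
    """Area under the ROC curve via the Mann-Whitney identity:
    P(score from class 1 > score from class 0), ties counted half."""
    wins = sum((b > a) + 0.5 * (b == a) for a in scores0 for b in scores1)
    return wins / (len(scores0) * len(scores1))
```

Projecting each case onto w gives a single discriminant score per voxel or lesion, from which the ROC curve and sensitivity at fixed specificity follow.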
A model of how different biology experts explain molecular and cellular mechanisms.
Trujillo, Caleb M; Anderson, Trevor R; Pelaez, Nancy J
2015-01-01
Constructing explanations is an essential skill for all science learners. The goal of this project was to model the key components of expert explanation of molecular and cellular mechanisms. As such, we asked: What is an appropriate model of the components of explanation used by biology experts to explain molecular and cellular mechanisms? Do explanations made by experts from different biology subdisciplines at a university support the validity of this model? Guided by the modeling framework of R. S. Justi and J. K. Gilbert, the validity of an initial model was tested by asking seven biologists to explain a molecular mechanism of their choice. Data were collected from interviews, artifacts, and drawings, and then subjected to thematic analysis. We found that biologists explained the specific activities and organization of entities of the mechanism. In addition, they contextualized explanations according to their biological and social significance; integrated explanations with methods, instruments, and measurements; and used analogies and narrated stories. The derived methods, analogies, context, and how themes informed the development of our final MACH model of mechanistic explanations. Future research will test the potential of the MACH model as a guiding framework for instruction to enhance the quality of student explanations. © 2015 C. M. Trujillo et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Johnson, Robin R.; Stone, Bradly T.; Miranda, Carrie M.; Vila, Bryan; James, Lois; James, Stephen M.; Rubio, Roberto F.; Berka, Chris
2014-01-01
Objective: To demonstrate that psychophysiology may have applications for objective assessment of expertise development in deadly force judgment and decision making (DFJDM). Background: Modern training techniques focus on improving decision-making skills with participative assessment between trainees and subject matter experts, primarily through subjective observation; objective metrics need to be developed. The current proof-of-concept study explored the potential for psychophysiological metrics in deadly force judgment contexts. Method: Twenty-four participants (novice, expert) were recruited. All wore a wireless electroencephalography (EEG) device to collect psychophysiological data during high-fidelity deadly force judgment and decision-making simulations using a modified Glock firearm. Participants were exposed to 27 video scenarios, one-third of which would have justified use of deadly force. Pass/fail was determined by whether the participant used deadly force appropriately. Results: Experts had a significantly higher pass rate compared to novices (p < 0.05). Multiple metrics were shown to distinguish novices from experts. Hierarchical regression analyses indicate that psychophysiological variables are able to explain 72% of the variability in expert performance, but only 37% in novices. Discriminant function analysis (DFA) using psychophysiological metrics was able to discern between experts and novices with 72.6% accuracy. Conclusion: While limited by the small sample size, the results suggest that psychophysiology may be developed for use as an objective measure of expertise in DFJDM. Specifically, discriminant function measures may have the potential to objectively identify expert skill acquisition. Application: Psychophysiological metrics may create a performance model with the potential to optimize simulator-based DFJDM training.
These performance models could be used for trainee feedback, and/or by the instructor to assess performance objectively. PMID:25100966
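The "variance explained" figures above are coefficients of determination (R²) from regressing performance on psychophysiological predictors. A single-predictor sketch of that computation follows; the study's hierarchical models entered multiple predictors in blocks, which this simplification does not reproduce.

```python
def r_squared(x, y):
    """Coefficient of determination for simple least-squares regression:
    the fraction of variance in y explained by the fitted line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot
```

An R² of 0.72 for experts versus 0.37 for novices then means the predictors track expert performance far more tightly, exactly the contrast the abstract reports.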
Breakfast barriers and opportunities for children living in a Dutch disadvantaged neighbourhood.
van Kleef, Ellen; Vingerhoeds, Monique H; Vrijhof, Milou; van Trijp, Hans C M
2016-12-01
The objective of this study was to explore parents', children's, and experts' beliefs and experiences about breakfast motivation, opportunity, and ability and elicit their thoughts on effective interventions to encourage healthy breakfast consumption. The setting was a disadvantaged neighbourhood in Rotterdam, the Netherlands. Focus groups with mothers and children and semi-structured individual interviews with experts were conducted. Interview guides were developed based on the motivation, opportunity, and ability consumer psychology model. Thirty-two mothers of primary school children participated in five group discussions, eight focus groups were conducted with 44 children, and nine experts participated in interviews. Data from expert interviews and group discussions were coded and thematically analysed. The following themes emerged from the focus groups: (1) generally high motivation to have breakfast, (2) improved performance at school is key motivator, (3) limited time hinders breakfast, and (4) lack of nutritional knowledge about high quality breakfast. Experts mentioned lack of effort, knowledge, and time; financial constraints; and environmental issues (food availability) as barriers to breakfasting healthily. Several ways to encourage healthy breakfasting habits were identified: (1) involvement of both children and parents, (2) role models inspiring change, and (3) interactive educational approaches. Experts perceived more problems and challenges in achieving healthy breakfast habits than did mothers and children. Lack of opportunity (according to the children and experts) and ability (according to the experts) were identified, although the motivation to eat a healthy breakfast was present. Predominant barriers are lack of time and nutritional knowledge. Overall, findings suggest educational and social marketing approaches as interventions to encourage healthy breakfast consumption. Copyright © 2016 Elsevier Ltd. All rights reserved.
Intelligent control of mixed-culture bioprocesses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoner, D.L.; Larsen, E.D.; Miller, K.S.
A hierarchical control system is being developed and applied to a mixed-culture bioprocess in a continuous stirred tank reactor. A bioreactor, with its inherent complexity and non-linear behavior, is an interesting yet difficult application for control theory. The bottom level of the hierarchy was implemented as a number of integrated set-point controls and data acquisition modules. Within the second level was a diagnostic system that used expert knowledge to determine the operational status of the sensors, actuators, and control modules. A diagnostic program was successfully implemented for the detection of stirrer malfunctions, and to monitor liquid delivery rates and recalibrate the pumps when deviations from desired flow rates occurred. The highest control level was a supervisory shell that was developed using expert knowledge and the history of the reactor operation to determine the set points required to meet a set of production criteria. At this stage the supervisory shell analyzed the data to determine the state of the system. In future implementations, this shell will determine the set points required to optimize a cost function using expert knowledge and adaptive learning techniques.
DOT National Transportation Integrated Search
2012-06-01
A small team of university-based transportation system experts and simulation experts has been : assembled to develop, test, and apply an approach to assessing road infrastructure capacity using : micro traffic simulation supported by publically avai...
Expert Panels, Consumers, and Chemistry.
ERIC Educational Resources Information Center
Rehfeldt, Thomas K.
2000-01-01
Studied the attributes, properties, and consumer acceptance of antiperspirant products through responses of 400 consumers (consumer data), expert panel data, and analytical data about the products. Results show how the Rasch model can provide the tool necessary to combine data from several sources. (SLD)
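The Rasch model referenced above places respondents and product attributes on a common logit scale, which is what allows data from consumers, expert panels, and analytical measurements to be combined. A minimal sketch of its core probability function and a closed-form difficulty estimate; this is illustrative, not the study's actual calibration over 400 consumers.

```python
import math

def rasch_probability(ability, difficulty):
    """P(endorse) under the dichotomous Rasch model: a logistic function
    of the ability-difficulty difference on the logit scale."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def difficulty_from_rate(endorse_rate, mean_ability=0.0):
    """Closed-form difficulty estimate assuming all respondents share
    mean_ability (an illustrative simplification)."""
    return mean_ability - math.log(endorse_rate / (1.0 - endorse_rate))
```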
Comments of statistical issue in numerical modeling for underground nuclear test monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicholson, W.L.; Anderson, K.K.
1993-03-01
The Symposium concluded with prepared summaries by four experts in the disciplines involved. These experts made no mention of statistics or the statistical content of the issues. The first author contributed an extemporaneous statement at the Symposium because there are important issues associated with conducting and evaluating numerical modeling that are familiar to statisticians and often treated successfully by them. This note expands upon those extemporaneous remarks. Statistical ideas may be helpful in resolving some numerical modeling issues. Specifically, we comment first on the role of statistical design/analysis in the quantification process to answer the question "what do we know about the numerical modeling of underground nuclear tests?" and second on the peculiar nature of uncertainty analysis for situations involving numerical modeling. The simulations described in the workshop, though associated with topic areas, were basically sets of examples. Each simulation was tuned toward agreeing with either empirical evidence or an expert's opinion of what the empirical evidence would be. While the discussions were reasonable, whether the embellishments were correct or a forced fitting of reality is unclear and illustrates that "simulation is easy." We also suggest that these examples of simulation are typical and that the questions concerning legitimacy and the role of knowing the reality are fair, in general, with respect to simulation. The answers will help us understand why "prediction is difficult."
A real-time expert system for self-repairing flight control
NASA Technical Reports Server (NTRS)
Gaither, S. A.; Agarwal, A. K.; Shah, S. C.; Duke, E. L.
1989-01-01
An integrated environment for specifying, prototyping, and implementing a self-repairing flight-control (SRFC) strategy is described. At an interactive workstation, the user can select paradigms such as rule-based expert systems, state-transition diagrams, and signal-flow graphs and hierarchically nest them, assign timing and priority attributes, establish blackboard-type communication, and specify concurrent execution on single or multiple processors. High-fidelity nonlinear simulations of aircraft and SRFC systems can be performed off-line, with the possibility of changing SRFC rules, inference strategies, and other heuristics to correct for control deficiencies. Finally, the off-line-generated SRFC can be transformed into highly optimized application-specific real-time C-language code. An application of this environment to the design of aircraft fault detection, isolation, and accommodation algorithms is presented in detail.
Software life cycle methodologies and environments
NASA Technical Reports Server (NTRS)
Fridge, Ernest
1991-01-01
Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology in two forms. Environments include: the Engineering Script Language/Parts Composition System (ESL/PCS) application generator; an Intelligent User Interface for cost avoidance in setting up operational computer runs; Framework, a programmable platform for defining process and software development work flow control; a process for bringing CASE technology into an organization's culture; and the CLIPS/CLIPS Ada language for developing expert systems. Methodologies include a method for developing fault-tolerant, distributed systems and a method for developing systems for common-sense reasoning and for solving expert system problems when only approximate truths are known.
Cirrus: Inducing Subject Models from Protocol Data
1988-08-16
Protocol analysis is used routinely by psychologists and other behavioral scientists, and more recently, by knowledge engineers who wish to embed the...knowledge of human experts in an expert system. However, protocol analysis is notoriously difficult and time consuming. Several systems have been developed to...formal trace of it (a problem behavior graph). The system, however, did not produce an abstract model of the subject. Bhaskar and Simon (1977) avoided the
An intelligent training system for space shuttle flight controllers
NASA Technical Reports Server (NTRS)
Loftin, R. Bowen; Wang, Lui; Baffes, Paul; Hua, Grace
1988-01-01
An autonomous intelligent training system which integrates expert system technology with training/teaching methodologies is described. The system was designed to train Mission Control Center (MCC) Flight Dynamics Officers (FDOs) to deploy a certain type of satellite from the Space Shuttle. The Payload-assist module Deploy/Intelligent Computer-Aided Training (PD/ICAT) system consists of five components: a user interface, a domain expert, a training session manager, a trainee model, and a training scenario generator. The interface provides the trainee with information on the characteristics of the current training session and with on-line help. The domain expert (DeplEx for Deploy Expert) contains the rules and procedural knowledge needed by the FDO to carry out the satellite deploy. The DeplEx also contains mal-rules which permit the identification and diagnosis of common errors made by the trainee. The training session manager (TSM) examines the actions of the trainee and compares them with the actions of DeplEx in order to determine appropriate responses. A trainee model is developed for each individual using the system. The model includes a history of the trainee's interactions with the training system and provides evaluative data on the trainee's current skill level. A training scenario generator (TSG) designs appropriate training exercises for each trainee based on the trainee model and the training goals. All of the expert system components of PD/ICAT communicate via a common blackboard. The PD/ICAT is currently being tested. Ultimately, this project will serve as a vehicle for developing a general architecture for intelligent training systems together with a software environment for creating such systems.
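The blackboard communication mentioned above can be sketched minimally: components never call each other directly, but read and post shared entries. All class and key names below are illustrative, not taken from the PD/ICAT implementation.

```python
# Minimal blackboard pattern: a shared store plus one component that reacts
# to what other components have posted. Names are hypothetical.

class Blackboard:
    """Shared key-value store that all expert-system components read and post to."""
    def __init__(self):
        self.entries = {}

    def post(self, key, value):
        self.entries[key] = value

    def read(self, key, default=None):
        return self.entries.get(key, default)


class TrainingSessionManager:
    """Compares trainee and expert actions and posts an appropriate response."""
    def run(self, bb):
        if bb.read("trainee_action") == bb.read("expert_action"):
            bb.post("response", "correct")
        else:
            bb.post("response", "hint")


bb = Blackboard()
bb.post("expert_action", "deploy_satellite")
bb.post("trainee_action", "deploy_satellite")
TrainingSessionManager().run(bb)
```

The design choice the pattern buys is loose coupling: the domain expert, session manager, and scenario generator can be developed and replaced independently as long as they agree on the blackboard's entry keys.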
Jeffery, Kathleen A; Pelaez, Nancy; Anderson, Trevor R
2018-01-01
To keep biochemistry instruction current and relevant, it is crucial to expose students to cutting-edge scientific research and how experts reason about processes governed by thermodynamics and kinetics such as protein folding and dynamics. This study focuses on how experts explain their research into this topic with the intention of informing instruction. Previous research has modeled how expert biologists incorporate research methods, social or biological context, and analogies when they talk about their research on mechanisms. We used this model as a guiding framework to collect and analyze interview data from four experts. The similarities and differences that emerged from analysis indicate that all experts integrated theoretical knowledge with their research context, methods, and analogies when they explained how phenomena operate, in particular by mapping phenomena to mathematical models; they explored different processes depending on their explanatory aims, but readily transitioned between different perspectives and explanatory models; and they explained thermodynamic and kinetic concepts of relevance to protein folding in different ways that aligned with their particular research methods. We discuss how these findings have important implications for teaching and future educational research. © 2018 K. A. Jeffery et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
EDNA: Expert fault digraph analysis using CLIPS
NASA Technical Reports Server (NTRS)
Dixit, Vishweshwar V.
1990-01-01
Traditionally fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find. Available algorithms tend to be complicated and slow. On the other hand, the tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques. The tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a digraph (cyclic) into trees (CLP, LP) is a viable approach to blend the advantages of the representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge. An expert system, to capture the engineering knowledge, is essential. We propose an approach here, namely, expert network analysis. We combine the digraph representation and tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge. Mixed analysis, with only some nodes carrying probabilities, is possible. The tool provides a graphics interface for input, query, and update. With the combined approach it is expected to be a valuable tool in the design process as well as in the capture of final design knowledge.
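The core obstacle named above, directed cycles in the fault digraph, can be removed by condensing each strongly connected component into a single node, after which tree-style algorithms apply. A minimal sketch using Kosaraju's algorithm follows; the example graph is invented and not taken from the paper.

```python
# Condense directed cycles (strongly connected components) of a fault digraph
# into single nodes, producing an acyclic condensation. Kosaraju's two-pass
# DFS is used; node names are hypothetical.

def condense(graph):
    """graph: {node: set of successor nodes}. Returns (comp, dag), where comp
    maps each node to its component's representative and dag is acyclic."""
    nodes = list(graph)
    # First pass: record DFS finish order on the original graph.
    seen, order = set(), []
    def dfs(u):
        seen.add(u)
        for v in graph.get(u, ()):
            if v not in seen:
                dfs(v)
        order.append(u)
    for u in nodes:
        if u not in seen:
            dfs(u)
    # Second pass: DFS over reversed edges in reverse finish order.
    rev = {u: set() for u in nodes}
    for u in nodes:
        for v in graph.get(u, ()):
            rev[v].add(u)
    comp = {}
    for root in reversed(order):
        if root in comp:
            continue
        stack = [root]
        while stack:
            u = stack.pop()
            if u not in comp:
                comp[u] = root
                stack.extend(v for v in rev[u] if v not in comp)
    # Build the condensed (acyclic) graph over component representatives.
    dag = {comp[u]: set() for u in nodes}
    for u in nodes:
        for v in graph.get(u, ()):
            if comp[u] != comp[v]:
                dag[comp[u]].add(comp[v])
    return comp, dag

# Fault digraph with a feedback cycle B <-> C feeding top event A.
g = {"A": set(), "B": {"A", "C"}, "C": {"B"}, "D": {"B"}}
comp, dag = condense(g)
```

After condensation, the cycle {B, C} behaves as one node, so probabilistic tree analysis can proceed on the acyclic result.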
Kalpathy-Cramer, Jayashree; Campbell, J Peter; Erdogmus, Deniz; Tian, Peng; Kedarisetti, Dharanish; Moleta, Chace; Reynolds, James D; Hutcheson, Kelly; Shapiro, Michael J; Repka, Michael X; Ferrone, Philip; Drenser, Kimberly; Horowitz, Jason; Sonmez, Kemal; Swan, Ryan; Ostmo, Susan; Jonas, Karyn E; Chan, R V Paul; Chiang, Michael F
2016-11-01
To determine expert agreement on relative retinopathy of prematurity (ROP) disease severity and whether computer-based image analysis can model relative disease severity, and to propose consideration of a more continuous severity score for ROP. We developed 2 databases of clinical images of varying disease severity (100 images and 34 images) as part of the Imaging and Informatics in ROP (i-ROP) cohort study and recruited expert physician, nonexpert physician, and nonphysician graders to classify and perform pairwise comparisons on both databases. Six participating expert ROP clinician-scientists, each with a minimum of 10 years of clinical ROP experience and 5 ROP publications, and 5 image graders (3 physicians and 2 nonphysician graders) who analyzed images that were obtained during routine ROP screening in neonatal intensive care units. Images in both databases were ranked by average disease classification (classification ranking), by pairwise comparison using the Elo rating method (comparison ranking), and by correlation with the i-ROP computer-based image analysis system. Interexpert agreement (weighted κ statistic) compared with the correlation coefficient (CC) between experts on pairwise comparisons and correlation between expert rankings and computer-based image analysis modeling. There was variable interexpert agreement on diagnostic classification of disease (plus, preplus, or normal) among the 6 experts (mean weighted κ, 0.27; range, 0.06-0.63), but good correlation between experts on comparison ranking of disease severity (mean CC, 0.84; range, 0.74-0.93) on the set of 34 images. Comparison ranking provided a severity ranking that was in good agreement with ranking obtained by classification ranking (CC, 0.92). Comparison ranking on the larger dataset by both expert and nonexpert graders demonstrated good correlation (mean CC, 0.97; range, 0.95-0.98). The i-ROP system was able to model this continuous severity with good correlation (CC, 0.86). 
Experts diagnose plus disease on a continuum, with poor absolute agreement on classification but good relative agreement on disease severity. These results suggest that the use of pairwise rankings and a continuous severity score, such as that provided by the i-ROP system, may improve agreement on disease severity in the future. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
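The comparison ranking above is based on the Elo rating method, which turns pairwise "which image is more severe" judgments into a continuous severity ordering. The sketch below uses standard Elo parameters (logistic expected score, K = 32); the i-ROP study's exact settings are not specified here.

```python
# Minimal Elo-style severity ranking from pairwise comparisons (illustrative).

def expected_score(r_a, r_b):
    """Probability that item A is judged more severe than item B."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def update(r_a, r_b, a_wins, k=32.0):
    """Update both ratings after one pairwise comparison."""
    e_a = expected_score(r_a, r_b)
    s_a = 1.0 if a_wins else 0.0
    return r_a + k * (s_a - e_a), r_b + k * ((1.0 - s_a) - (1.0 - e_a))

def rank_by_elo(n_items, comparisons, start=1500.0):
    """comparisons: iterable of (winner_index, loser_index) pairs."""
    ratings = [start] * n_items
    for w, l in comparisons:
        ratings[w], ratings[l] = update(ratings[w], ratings[l], True)
    return ratings

# Example: image 0 judged more severe than 1 and 2; image 1 more severe than 2.
ratings = rank_by_elo(3, [(0, 1), (0, 2), (1, 2)])
```

Sorting images by the resulting ratings yields the continuous comparison ranking that the study correlated with classification ranking and the i-ROP system's output.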
Osamor, Victor C; Azeta, Ambrose A; Ajulo, Oluseyi O
2014-12-01
Over 1.5-2 million tuberculosis deaths occur annually. Medical professionals are faced with a lot of challenges in delivering good health-care with unassisted automation in hospitals where there are several patients who need the doctor's attention. To automate the pre-laboratory screening process against tuberculosis infection to aid diagnosis and make it fast and accessible to the public via the Internet. The expert system we have built is designed to also take care of people who do not have access to medical experts, but would want to check their medical status. A rule-based approach has been used, and the Unified Modeling Language and the client-server architecture technique were applied to model the system and to develop it as a web-based expert system for tuberculosis diagnosis. Algorithmic rules in the Tuberculosis-Diagnosis Expert System necessitate decision coverage where tuberculosis is either suspected or not suspected. The architecture consists of a rule base, knowledge base, and patient database. These units interact with the inference engine, which receives patients' data through the Internet via a user interface. We present the architecture of the Tuberculosis-Diagnosis Expert System and its implementation. We evaluated it for usability to determine the level of effectiveness, efficiency and user satisfaction. The result of the usability evaluation reveals that the system has a usability of 4.08 on a scale of 5. This is an indication of a more-than-average system performance. Several existing expert systems have been developed for the purpose of supporting different medical diagnoses, but none is designed to translate tuberculosis patients' symptomatic data for online pre-laboratory screening. Our Tuberculosis-Diagnosis Expert System is an effective solution for the implementation of the needed web-based expert system diagnosis. © The Author(s) 2013.
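A rule-based screening step of the kind described above can be sketched as rules that fire on symptom facts and drive a suspected/not-suspected decision. The rules, symptom names, and threshold below are invented for illustration and are not taken from the paper.

```python
# Illustrative rule-based pre-screening: every rule whose conditions hold
# contributes its weight; the total decides the screening outcome.
# All rules and weights here are hypothetical.

RULES = [
    # (set of symptoms that must all be present, weight of the rule)
    ({"cough_over_2_weeks", "fever"}, 2),
    ({"night_sweats", "weight_loss"}, 2),
    ({"chest_pain"}, 1),
]

def screen(symptoms, threshold=2):
    """Fire every rule whose conditions are satisfied; compare total weight
    against the decision threshold."""
    score = sum(w for conds, w in RULES if conds <= set(symptoms))
    return "suspected" if score >= threshold else "not suspected"

result = screen({"cough_over_2_weeks", "fever", "chest_pain"})
```

In a deployed system the rule base and patient database would live server-side, with the inference step invoked through the web user interface, matching the client-server architecture the abstract describes.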
NASA Technical Reports Server (NTRS)
Happell, Nadine; Miksell, Steve; Carlisle, Candace
1989-01-01
A major barrier in taking expert systems from prototype to operational status involves instilling end user confidence in the operational system. Different software life cycle models are examined, and the advantages and disadvantages of each when applied to expert system development are explored. The Fault Isolation Expert System for Tracking and data relay satellite system Applications (FIESTA) is presented as a case study of development of an expert system. The end user confidence necessary for operational use of this system is accentuated by the fact that it will handle real-time data in a secure environment, allowing little tolerance for errors. How FIESTA is dealing with transition problems as it moves from an off-line standalone prototype to an on-line real-time system is discussed.
NASA Astrophysics Data System (ADS)
Berger, Roland; Hänze, Martin
2015-01-01
We assessed the impact of expert students' instructional quality on the academic performance of novice students in 12th-grade physics classes organized in an expert model of cooperative learning ('jigsaw classroom'). The instructional quality of 129 expert students was measured by a newly developed rating system. As expected, when aggregating across all four subtopics taught, regression analysis revealed that academic performance of novice students increases with the quality of expert students' instruction. The difficulty of subtopics, however, moderates this effect: higher instructional quality of more difficult subtopics did not lead to better academic performance of novice students. We interpret this finding in the light of Cognitive Load Theory. Demanding tasks caused high intrinsic cognitive load and hindered the novice students' learning.
Characterizing Forest Change Using Community-Based Monitoring Data and Landsat Time Series
DeVries, Ben; Pratihast, Arun Kumar; Verbesselt, Jan; Kooistra, Lammert; Herold, Martin
2016-01-01
Increasing awareness of the issue of deforestation and degradation in the tropics has resulted in efforts to monitor forest resources in tropical countries. Advances in satellite-based remote sensing and ground-based technologies have allowed for monitoring of forests with high spatial, temporal and thematic detail. Despite these advances, there is a need to engage communities in monitoring activities and include these stakeholders in national forest monitoring systems. In this study, we analyzed activity data (deforestation and forest degradation) collected by local forest experts over a 3-year period in an Afro-montane forest area in southwestern Ethiopia and corresponding Landsat Time Series (LTS). Local expert data included forest change attributes, geo-location and photo evidence recorded using mobile phones with integrated GPS and photo capabilities. We also assembled LTS using all available data from all spectral bands and a suite of additional indices and temporal metrics based on time series trajectory analysis. We predicted deforestation, degradation or stable forests using random forest models trained with data from local experts and LTS spectral-temporal metrics as model covariates. Resulting models predicted deforestation and degradation with an out of bag (OOB) error estimate of 29% overall, and 26% and 31% for the deforestation and degradation classes, respectively. By dividing the local expert data into training and operational phases corresponding to local monitoring activities, we found that forest change models improved as more local expert data were used. Finally, we produced maps of deforestation and degradation using the most important spectral bands. The results in this study represent some of the first to combine local expert based forest change data and dense LTS, demonstrating the complementary value of both continuous data streams. 
Our results underpin the utility of both datasets and provide a useful foundation for integrated forest monitoring systems relying on data streams from diverse sources. PMID:27018852
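The classifier described above, a random forest whose accuracy is estimated from out-of-bag (OOB) samples, can be sketched with scikit-learn. The synthetic features below merely stand in for the study's Landsat spectral-temporal metrics, and the labels for its deforestation/degradation classes.

```python
# Random forest with out-of-bag (OOB) error estimation, in the spirit of the
# forest-change classifier above. Data are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 600
X = rng.normal(size=(n, 5))                    # stand-ins for spectral metrics
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # stable forest vs. change

clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
clf.fit(X, y)

oob_error = 1.0 - clf.oob_score_           # OOB error, as reported in the study
importances = clf.feature_importances_     # analogous to ranking spectral bands
```

The OOB estimate comes free with bagging: each tree is scored on the training samples it never saw, so no separate validation split of the local expert data is needed, and `feature_importances_` plays the role of identifying the most important spectral bands.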
Expert Opinions on Nutrition Issues in Clinical Dentistry.
ERIC Educational Resources Information Center
Palmer, Carole A.; And Others
1990-01-01
A survey of 79 experts in dental nutrition sought consensus on the appropriate scope of nutrition in clinical dentistry. Results support the need for greater attention to nutrition issues in dental schools and better models for nutrition interventions in dental practice. (Author/MSE)
Expert Maintenance Advisor Development for Navy Shipboard Systems
1994-01-01
Friesen, Melissa C.; Wheeler, David C.; Vermeulen, Roel; Locke, Sarah J.; Zaebst, Dennis D.; Koutros, Stella; Pronk, Anjoeka; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Malats, Nuria; Schwenn, Molly; Johnson, Alison; Armenti, Karla R.; Rothman, Nathanial; Stewart, Patricia A.; Kogevinas, Manolis; Silverman, Debra T.
2016-01-01
Objectives: To efficiently and reproducibly assess occupational diesel exhaust exposure in a Spanish case-control study, we examined the utility of applying decision rules that had been extracted from expert estimates and questionnaire response patterns using classification tree (CT) models from a similar US study. Methods: First, previously extracted CT decision rules were used to obtain initial ordinal (0–3) estimates of the probability, intensity, and frequency of occupational exposure to diesel exhaust for the 10 182 jobs reported in a Spanish case-control study of bladder cancer. Second, two experts reviewed the CT estimates for 350 jobs randomly selected from strata based on each CT rule's agreement with the expert ratings in the original study [agreement rate, from 0 (no agreement) to 1 (perfect agreement)]. Their agreement with each other and with the CT estimates was calculated using weighted kappa (κw) and guided our choice of jobs for subsequent expert review. Third, an expert review comprised all jobs with lower confidence (low-to-moderate agreement rates or discordant assignments, n = 931) and a subset of jobs with a moderate to high CT probability rating and with moderately high agreement rates (n = 511). Logistic regression was used to examine the likelihood that an expert provided a different estimate than the CT estimate based on the CT rule agreement rates, the CT ordinal rating, and the availability of a module with diesel-related questions. Results: Agreement between estimates made by two experts and between estimates made by each of the experts and the CT estimates was very high for jobs with estimates that were determined by rules with high CT agreement rates (κw: 0.81–0.90). For jobs with estimates based on rules with lower agreement rates, moderate agreement was observed between the two experts (κw: 0.42–0.67) and poor-to-moderate agreement was observed between the experts and the CT estimates (κw: 0.09–0.57).
In total, the expert review of 1442 jobs changed 156 probability estimates, 128 intensity estimates, and 614 frequency estimates. The expert was more likely to provide a different estimate when the CT rule agreement rate was <0.8, when the CT ordinal ratings were low to moderate, or when a module with diesel questions was available. Conclusions: Our reliability assessment provided important insight into where to prioritize additional expert review; as a result, only 14% of the jobs underwent expert review, substantially reducing the exposure assessment burden. Overall, we found that we could efficiently, reproducibly, and reliably apply CT decision rules from one study to assess exposure in another study. PMID:26732820
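The weighted kappa (κw) used above to compare ordinal (0–3) ratings is available directly in scikit-learn. The ratings below are invented for illustration; the study does not report per-job ratings, and its choice of linear versus quadratic weights is not specified here.

```python
# Weighted kappa between two experts' ordinal exposure ratings (0-3).
# Ratings are hypothetical; only the computation is of interest.
from sklearn.metrics import cohen_kappa_score

expert_a = [0, 1, 2, 3, 2, 1, 0, 3, 2, 1]
expert_b = [0, 1, 2, 3, 1, 1, 0, 2, 2, 1]

kappa_linear = cohen_kappa_score(expert_a, expert_b, weights="linear")
kappa_quadratic = cohen_kappa_score(expert_a, expert_b, weights="quadratic")
```

Weighting matters for ordinal scales: a disagreement of one category (2 vs. 1) is penalized less than a disagreement of three (3 vs. 0), so κw rewards near-agreement in a way unweighted kappa does not.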
2011 Naval Energy Forum. Volume 2
2011-10-14
Prompt comprehension in UNIX command production.
Doane, S M; McNamara, D S; Kintsch, W; Polson, P G; Clawson, D M
1992-07-01
We hypothesize that a cognitive analysis based on the construction-integration theory of comprehension (Kintsch, 1988) can predict what is difficult about generating complex composite commands in the UNIX operating system. We provide empirical support for assumptions of the Doane, Kintsch, and Polson (1989, 1990) construction-integration model for generating complex commands in UNIX. We asked users whose UNIX experience varied to produce complex UNIX commands, and then provided help prompts whenever the commands that they produced were erroneous. The help prompts were designed to assist subjects with respect to both the knowledge and the memory processes that our UNIX modeling efforts have suggested are lacking in less expert users. It appears that experts respond to different prompts than do novices. Expert performance is helped by the presentation of abstract information, whereas novice and intermediate performance is modified by presentation of concrete information. Second, while presentation of specific prompts helps less expert subjects, they do not provide sufficient information to obtain correct performance. Our analyses suggest that information about the ordering of commands is required to help the less expert with both knowledge and memory load problems in a manner consistent with skill acquisition theories.
Vandermoere, Frédéric
2008-04-01
This case study examines the hazard and risk perception and the need for decontamination according to people exposed to soil pollution. Using an ecological-symbolic approach (ESA), a multidisciplinary model is developed that draws upon psychological and sociological perspectives on risk perception and includes ecological variables by using data from experts' risk assessments. The results show that hazard perception is best predicted by objective knowledge, subjective knowledge, estimated knowledge of experts, and the assessed risks. However, experts' risk assessments induce an increase in hazard perception only when residents know the urgency of decontamination. Risk perception is best predicted by trust in the risk management. Additionally, need for decontamination relates to hazard perception, risk perception, estimated knowledge of experts, and thoughts about sustainability. In contrast to the knowledge deficit model, objective and subjective knowledge did not significantly relate to risk perception and need for decontamination. The results suggest that residents can make a distinction between hazards in terms of the seriousness of contamination on the one hand, and human health risks on the other hand. Moreover, next to the importance of social determinants of environmental risk perception, this study shows that the output of experts' risk assessments, or the objective risks, can create a hazard awareness rather than an alarming risk consciousness, despite residents' distrust of scientific knowledge.
The expert-generalist: a contradiction whose time has come.
Fins, Joseph J
2015-08-01
The author suggests the creation of expert-generalists to help provide the additional cost-effective access to care necessitated by increased insurance coverage under the Affordable Care Act. Expert-generalists, a concept drawn from an extant Canadian model, would be a cohort of primary care physicians who obtain additional training in a subspecialty area, which would widen their practice portfolio and bring enhanced infrastructure to primary care settings. Expanding the reach of primary care into the realm of more advanced subspecialty practice could be a way to enhance both access to and quality of care in a cost-effective fashion, in part because the educational framework for additional training already exists. Trainees could opt for an extra year of training after traditional residency or return to training after years in practice. Properly trained, an expert-generalist would benefit both the quality of the patient experience and the bottom line by expertly triaging patients to determine who will truly benefit from specialty consultations, decreasing specialists' engagement with cases that do not require their higher-tier care. The author considers the merits of this proposal, as well as potential objections and implementation challenges. It is suggested that this model be adopted incrementally, using demonstration projects that could assess the impact of an expert-generalist initiative on the physician workforce and on patients' access to quality primary and specialty care.
2016-02-01
We live in an age that increasingly calls for national or regional management of global risks. This article discusses the contributions that expert elicitation can bring to efforts to manage global risks and identifies challenges faced in conducting expert elicitation at this scale. In doing so it draws on lessons learned from conducting an expert elicitation as part of the World Health Organization's (WHO) initiative to estimate the global burden of foodborne disease; a study commissioned by the Foodborne Disease Epidemiology Reference Group (FERG). Expert elicitation is designed to fill gaps in data and research using structured, transparent methods. Such gaps are a significant challenge for global risk modeling. Experience with the WHO FERG expert elicitation shows that it is feasible to conduct an expert elicitation at a global scale, but that challenges do arise, including: defining an informative, yet feasible geographical structure for the elicitation; defining what constitutes expertise in a global setting; structuring international, multidisciplinary expert panels; and managing demands on experts' time in the elicitation. This article was written as part of a workshop, "Methods for Research Synthesis: A Cross-Disciplinary Approach" held at the Harvard Center for Risk Analysis on October 13, 2013. © 2016 Society for Risk Analysis.
Use (and abuse) of expert elicitation in support of decision making for public policy
Morgan, M. Granger
2014-01-01
The elicitation of scientific and technical judgments from experts, in the form of subjective probability distributions, can be a valuable addition to other forms of evidence in support of public policy decision making. This paper explores when it is sensible to perform such elicitation and how that can best be done. A number of key issues are discussed, including topics on which there are, and are not, experts who have knowledge that provides a basis for making informed predictive judgments; the inadequacy of only using qualitative uncertainty language; the role of cognitive heuristics and of overconfidence; the choice of experts; the development, refinement, and iterative testing of elicitation protocols that are designed to help experts to consider systematically all relevant knowledge when they make their judgments; the treatment of uncertainty about model functional form; diversity of expert opinion; and when it does or does not make sense to combine judgments from different experts. Although it may be tempting to view expert elicitation as a low-cost, low-effort alternative to conducting serious research and analysis, it is neither. Rather, expert elicitation should build on and use the best available research and analysis and be undertaken only when, given those, the state of knowledge will remain insufficient to support timely informed assessment and decision making. PMID:24821779
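When combining judgments from different experts does make sense, the simplest aggregation rule in the elicitation literature is the linear opinion pool, a weighted average of the experts' subjective probability distributions. The sketch below is generic and not specific to the paper above; the distributions and weights are invented.

```python
# Linear opinion pool: combine experts' subjective probability distributions
# by weighted averaging. Distributions and weights are hypothetical.

def linear_pool(distributions, weights=None):
    """distributions: list of dicts mapping outcome -> probability.
    Returns the weighted-average distribution (equal weights by default)."""
    n = len(distributions)
    weights = weights or [1.0 / n] * n
    outcomes = {o for d in distributions for o in d}
    return {o: sum(w * d.get(o, 0.0) for w, d in zip(weights, distributions))
            for o in outcomes}

experts = [
    {"low": 0.7, "medium": 0.2, "high": 0.1},
    {"low": 0.3, "medium": 0.4, "high": 0.3},
]
pooled = linear_pool(experts)
```

The pool preserves diversity of opinion in the spread of the combined distribution, which is one reason the choice between combining judgments and reporting them separately is itself a substantive decision, as the paper argues.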
Control of standing balance while using construction stilts: comparison of expert and novice users.
Noble, Jeremy W; Singer, Jonathan C; Prentice, Stephen D
2016-01-01
This study examined the control of standing balance while wearing construction stilts. Motion capture data were collected from nine expert stilt users and nine novices. Three standing conditions were analysed: ground, 60 cm stilts and an elevated platform. Each task was also performed with the head extended as a vestibular perturbation. Both expert and novice groups exhibited lower displacement of the whole body centre of mass and centre of pressure on construction stilts. Differences between the groups were only noted in the elevated condition with no stilts, where the expert group had lower levels of medial-lateral displacement of the centre of pressure. The postural manipulation revealed that the expert group had superior balance to the novice group. Conditions where stilts were worn showed lower levels of correspondence to the inverted pendulum model. Under normal conditions, both expert and novice groups were able to control their balance while wearing construction stilts. This work investigated the effects of experience on the control of balance while using construction stilts. Under normal conditions, expert and novice stilt users were able to control their balance while wearing construction stilts. Differences between the expert and novice users were revealed when the balance task was made more difficult, with the experts showing superior balance in these situations.
Elicitation of neurological knowledge with argument-based machine learning.
Groznik, Vida; Guid, Matej; Sadikov, Aleksander; Možina, Martin; Georgiev, Dejan; Kragelj, Veronika; Ribarič, Samo; Pirtošek, Zvezdan; Bratko, Ivan
2013-02-01
The paper describes the use of an expert's knowledge in practice and the efficiency of a recently developed technique called argument-based machine learning (ABML) in the knowledge elicitation process. We are developing a neurological decision support system to help neurologists differentiate between three types of tremors: Parkinsonian, essential, and mixed tremor (comorbidity). The system is intended to act as a second opinion for the neurologists and, most importantly, to help them reduce the number of patients in the "gray area" that require a very costly further examination (DaTSCAN). We strive to elicit comprehensible and medically meaningful knowledge in such a way that it does not come at the cost of diagnostic accuracy. To alleviate the difficult problem of knowledge elicitation from data and domain experts, we used ABML. ABML guides the expert to explain critical special cases which cannot be handled automatically by machine learning. This very efficiently reduces the expert's workload and combines the expert's knowledge with learning data. 122 patients were enrolled into the study. The classification accuracy of the final model was 91%. Equally important, the initial and final models were also evaluated for their comprehensibility by the neurologists. All 13 rules of the final model were deemed appropriate, enabling the system to support its decisions with good explanations. The paper demonstrates ABML's advantage in combining machine learning and expert knowledge. The accuracy of the system is very high with respect to the current state of the art in clinical practice, and the system's knowledge base is assessed to be very consistent from a medical point of view. This opens up the possibility of using the system also as a teaching tool. Copyright © 2012 Elsevier B.V. All rights reserved.
Wheeler, David C.; Burstyn, Igor; Vermeulen, Roel; Yu, Kai; Shortreed, Susan M.; Pronk, Anjoeka; Stewart, Patricia A.; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Schwenn, Molly; Johnson, Alison; Silverman, Debra T.; Friesen, Melissa C.
2014-01-01
Objectives Evaluating occupational exposures in population-based case-control studies often requires exposure assessors to review each study participants' reported occupational information job-by-job to derive exposure estimates. Although such assessments likely have underlying decision rules, they usually lack transparency, are time-consuming and have uncertain reliability and validity. We aimed to identify the underlying rules to enable documentation, review, and future use of these expert-based exposure decisions. Methods Classification and regression trees (CART, predictions from a single tree) and random forests (predictions from many trees) were used to identify the underlying rules from the questionnaire responses and an expert's exposure assignments for occupational diesel exhaust exposure for several metrics: binary exposure probability and ordinal exposure probability, intensity, and frequency. Data were split into training (n=10,488 jobs), testing (n=2,247), and validation (n=2,248) data sets. Results The CART and random forest models' predictions agreed with 92–94% of the expert's binary probability assignments. For ordinal probability, intensity, and frequency metrics, the two models extracted decision rules more successfully for unexposed and highly exposed jobs (86–90% and 57–85%, respectively) than for low or medium exposed jobs (7–71%). Conclusions CART and random forest models extracted decision rules and accurately predicted an expert's exposure decisions for the majority of jobs and identified questionnaire response patterns that would require further expert review if the rules were applied to other jobs in the same or different study. This approach makes the exposure assessment process in case-control studies more transparent and creates a mechanism to efficiently replicate exposure decisions in future studies. PMID:23155187
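The CART approach described above amounts to searching for the questionnaire responses that best separate the expert's exposure assignments. A minimal pure-Python sketch of that idea, using a Gini-impurity decision stump on toy job data (the feature names, jobs, and labels are invented for illustration, not taken from the study):

```python
# Sketch of CART-style rule extraction: pick the single binary questionnaire
# response that best splits jobs by the expert's exposure label (Gini impurity).
# Feature names and toy data are illustrative assumptions.

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_stump(rows, labels):
    """rows: dicts of binary questionnaire answers; labels: 0/1 expert exposure."""
    n = len(labels)
    best = None
    for feature in rows[0]:
        left = [y for r, y in zip(rows, labels) if r[feature] == 1]
        right = [y for r, y in zip(rows, labels) if r[feature] == 0]
        impurity = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
        if best is None or impurity < best[1]:
            best = (feature, impurity)
    return best[0]

# Toy jobs: the hypothetical expert labels truck-driving jobs as diesel-exposed.
jobs = [
    {"drives_truck": 1, "works_indoors": 0},
    {"drives_truck": 1, "works_indoors": 1},
    {"drives_truck": 0, "works_indoors": 1},
    {"drives_truck": 0, "works_indoors": 0},
]
exposed = [1, 1, 0, 0]
rule = best_stump(jobs, exposed)
# rule == "drives_truck": the extracted split matches the expert's logic
```

A full CART model recurses this split on each branch; a random forest repeats it on bootstrap samples and averages the trees' votes.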
Ensemble Learning Method for Outlier Detection and its Application to Astronomical Light Curves
NASA Astrophysics Data System (ADS)
Nun, Isadora; Protopapas, Pavlos; Sim, Brandon; Chen, Wesley
2016-09-01
Outlier detection is necessary for automated data analysis, with applications spanning almost every domain from financial markets to epidemiology to fraud detection. We introduce a novel mixture-of-experts outlier detection model, which uses a dynamically trained, weighted network of five distinct outlier detection methods. After dimensionality reduction, individual outlier detection methods score each data point for “outlierness” in this new feature space. Our model then uses dynamically trained parameters to weigh the scores of each method, producing a final outlier score. We find that the mixture-of-experts model performs, on average, better than any single expert model in identifying both artificially generated and manually picked outliers. This mixture model is applied to a data set of astronomical light curves, after dimensionality reduction via time series feature extraction. Our model was tested using three fields from the MACHO catalog and generated a list of anomalous candidates. We confirm that the outliers detected using this method belong to rare classes, like Novae, He-burning, and red giant stars; other outlier light curves identified have no available information associated with them. To elucidate their nature, we created a website containing the light-curve data and information about these objects. Users can attempt to classify the light curves, give conjectures about their identities, and sign up for follow-up messages about the progress made on identifying these objects. This user-submitted data can be used to further train our mixture-of-experts model. Our code is publicly available to all who are interested.
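The weighted-network idea can be sketched in a few lines: each detector produces normalised outlierness scores, and a weighted sum combines them. The two toy detectors and fixed weights below are illustrative stand-ins for the five methods and the dynamically trained weights described in the abstract:

```python
# Sketch of a weighted mixture of outlier detectors. Detectors and weights
# are illustrative, not the ones used in the paper.
import statistics

def zscore_detector(data):
    """Outlierness as distance from the mean in standard deviations."""
    mu, sd = statistics.mean(data), statistics.stdev(data)
    return [abs(x - mu) / sd for x in data]

def median_detector(data):
    """Outlierness as distance from the median in MAD units (more robust)."""
    med = statistics.median(data)
    mad = statistics.median([abs(x - med) for x in data]) or 1.0
    return [abs(x - med) / mad for x in data]

def mixture_scores(data, detectors, weights):
    total = sum(weights)
    weights = [w / total for w in weights]
    all_scores = [det(data) for det in detectors]
    # Normalise each detector's scores to [0, 1] before weighting.
    normed = []
    for scores in all_scores:
        hi = max(scores) or 1.0
        normed.append([s / hi for s in scores])
    return [sum(w * n[i] for w, n in zip(weights, normed))
            for i in range(len(data))]

data = [10.1, 9.9, 10.0, 10.2, 9.8, 25.0]   # last point is an obvious outlier
scores = mixture_scores(data, [zscore_detector, median_detector], [0.6, 0.4])
# the last point receives the highest combined outlier score
```

In the paper's setting the weights are learned from how well each expert method recovers known outliers, rather than fixed by hand as here.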
Key properties of expert movement systems in sport: an ecological dynamics perspective.
Seifert, Ludovic; Button, Chris; Davids, Keith
2013-03-01
This paper identifies key properties of expertise in sport predicated on the performer-environment relationship. Weaknesses of traditional approaches to expert performance, which uniquely focus on the performer and the environment separately, are highlighted by an ecological dynamics perspective. Key properties of expert movement systems include 'multi- and meta-stability', 'adaptive variability', 'redundancy', 'degeneracy' and the 'attunement to affordances'. Empirical research on these expert system properties indicates that skill acquisition does not emerge from the internal representation of declarative and procedural knowledge, or the imitation of expert behaviours to linearly reduce a perceived 'gap' separating movements of beginners and a putative expert model. Rather, expert performance corresponds with the ongoing co-adaptation of an individual's behaviours to dynamically changing, interacting constraints, individually perceived and encountered. The functional role of adaptive movement variability is essential to expert performance in many different sports (involving individuals and teams; ball games and outdoor activities; land and aquatic environments). These key properties signify that, in sport performance, although basic movement patterns need to be acquired by developing athletes, there exists no ideal movement template towards which all learners should aspire, since relatively unique functional movement solutions emerge from the interaction of key constraints.
Expertise facilitates the transfer of anticipation skill across domains.
Rosalie, Simon M; Müller, Sean
2014-02-01
It is unclear whether perceptual-motor skill transfer is based upon similarity between the learning and transfer domains per identical elements theory, or facilitated by an understanding of underlying principles in accordance with general principle theory. Here, the predictions of identical elements theory, general principle theory, and aspects of a recently proposed model for the transfer of perceptual-motor skill with respect to expertise in the learning and transfer domains are examined. The capabilities of expert karate athletes, near-expert karate athletes, and novices to anticipate and respond to stimulus skills derived from taekwondo and Australian football were investigated in ecologically valid contexts using an in situ temporal occlusion paradigm and complex whole-body perceptual-motor skills. Results indicated that the karate experts and near-experts are as capable of using visual information to anticipate and guide motor skill responses as domain experts and near-experts in the taekwondo transfer domain, but only karate experts could perform like domain experts in the Australian football transfer domain. Findings suggest that transfer of anticipation skill is based upon expertise and an understanding of principles but may be supplemented by similarities that exist between the stimulus and response elements of the learning and transfer domains.
Vezzani, Annamaria; Dingledine, Raymond; Rossetti, Andrea O
2016-01-01
Status epilepticus (SE) is a life-threatening neurological emergency often refractory to available treatment options. It is a very heterogeneous condition in terms of clinical presentation and causes, which besides genetic, vascular and other structural causes also include CNS or severe systemic infections, sudden withdrawal from benzodiazepines or anticonvulsants and rare autoimmune etiologies. Treatment of SE is essentially based on expert opinions and antiepileptic drug treatment per se seems to have no major impact on prognosis. There is, therefore, urgent need of novel therapies that rely upon a better understanding of the basic mechanisms underlying this clinical condition. Accumulating evidence in animal models highlights that inflammation ensuing in the brain during SE may play a determinant role in ongoing seizures and their long-term detrimental consequences, independent of an infection or auto-immune cause; this evidence encourages reconsideration of the treatment flow in SE patients. PMID:26312647
A review of carbide fuel corrosion for nuclear thermal propulsion applications
NASA Astrophysics Data System (ADS)
Pelaccio, Dennis G.; El-Genk, Mohamed S.; Butt, Darryl P.
1993-10-01
At the operation conditions of interest in nuclear thermal propulsion reactors, carbide materials have been known to exhibit a number of life limiting phenomena. These include the formation of liquid, loss by vaporization, creep and corresponding gas flow restrictions, and local corrosion and fuel structure degradation due to excessive mechanical and/or thermal loading. In addition, the radiation environment in the reactor core can produce a substantial change in its local physical properties, which can produce high thermal stresses and corresponding stress fractures (cracking). Time-temperature history and cyclic operation of the nuclear reactor can also accelerate some of these processes. The University of New Mexico's Institute for Space Nuclear Power Studies, under NASA sponsorship has recently initiated a study to model the complicated hydrogen corrosion process. In support of this effort, an extensive review of the open literature was performed, and a technical expert workshop was conducted. This paper summarizes the results of this review.
A Review of Carbide Fuel Corrosion for Nuclear Thermal Propulsion Applications
NASA Astrophysics Data System (ADS)
Pelaccio, Dennis G.; El-Genk, Mohamed S.; Butt, Darryl P.
1994-07-01
At the operation conditions of interest in nuclear thermal propulsion reactors, carbide materials have been known to exhibit a number of life limiting phenomena. These include the formation of liquid, loss by vaporization, creep and corresponding gas flow restrictions, and local corrosion and fuel structure degradation due to excessive mechanical and/or thermal loading. In addition, the radiation environment in the reactor core can produce a substantial change in its local physical properties, which can produce high thermal stresses and corresponding stress fractures (cracking). Time-temperature history and cyclic operation of the nuclear reactor can also accelerate some of these processes. The University of New Mexico's Institute for Space Nuclear Power Studies, under NASA sponsorship has recently initiated a study to model the complicated hydrogen corrosion process. In support of this effort, an extensive review of the open literature was performed, and a technical expert workshop was conducted. This paper summarizes the results of this review.
Oñate, James A; Guskiewicz, Kevin M; Marshall, Stephen W; Giuliani, Carol; Yu, Bing; Garrett, William E
2005-06-01
Anterior cruciate ligament injury prevention programs have used videotapes of jump-landing technique as a key instructional component to improve landing performance. It was hypothesized that all videotape feedback model groups would increase knee flexion angles at initial contact and overall knee flexion motion and decrease peak vertical ground reaction forces and peak proximal anterior tibial shear forces to a greater extent than would a nonfeedback group. The secondary hypothesis was that videotape feedback using the combination of the expert and self models would create the greatest change in each variable. Controlled laboratory study. Knee kinematics and kinetics of college-aged recreational athletes randomly placed in 3 different videotape feedback model groups (expert only, self only, combination of expert and self) and a nonfeedback group were collected while participants performed a basketball jump-landing task on 3 testing occasions. All feedback groups significantly increased knee angular displacement flexion angles [F(6,70) = 8.03, P = .001] and decreased peak vertical ground reaction forces [F(6,78) = 2.68, P = .021] during performance and retention tests. The self and combination groups significantly increased knee angular displacement flexion angles more than the control group did; the expert model group did not change significantly more than the control group did. All feedback groups and the nonfeedback group significantly reduced peak vertical forces across performance and retention tests. There were no statistically significant changes in knee flexion angle at initial ground contact (P = .111) or peak proximal anterior tibial shear forces (P = .509) for both testing sessions for each group. The use of self or combination videotape feedback is most useful for increasing knee angular displacement flexion angles and reducing peak vertical forces during landing.
The use of self or combination modeling is more effective than is expert-only modeling for the implementation of instructional programs aimed at reducing the risk of jump-landing anterior cruciate ligament injuries.
Supervised interpretation of echocardiograms with a psychological model of expert supervision
NASA Astrophysics Data System (ADS)
Revankar, Shriram V.; Sher, David B.; Shalin, Valerie L.; Ramamurthy, Maya
1993-07-01
We have developed a collaborative scheme that facilitates active human supervision of the binary segmentation of an echocardiogram. The scheme complements the reliability of a human expert with the precision of segmentation algorithms. In the developed system, an expert user compares the computer-generated segmentation with the original image in a user-friendly graphics environment and interactively indicates the incorrectly classified regions either by pointing or by circling. The precise boundaries of the indicated regions are computed by studying the original image properties in that region, together with a human visual attention distribution map obtained from published psychological and psychophysical research. We use the developed system to extract contours of heart chambers from a sequence of two-dimensional echocardiograms. We are currently extending this method to incorporate a richer set of inputs from the human supervisor, to facilitate multi-class labelling of image regions depending on their functionality. We are also integrating into our system the knowledge-related constraints that cardiologists use, to improve the capabilities of the existing system. This extension involves developing a psychological model of expert reasoning, functional and relational models of typical views in echocardiograms, and corresponding interface modifications to map the suggested actions to image processing algorithms.
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Basham, Bryan D.
1989-01-01
CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.
The corporate university: a model for sustaining an expert workforce in the human services.
Gould, Karen E
2005-05-01
The human service industry has become a complex industry in which agencies must respond to the demands of the marketplace. To respond to these demands, agencies must develop and maintain their knowledge capital by offering an extensive array of learning opportunities related to their business goals. The corporate university, a contemporary educational model designed to maintain an expert workforce, allows agencies to meet this need effectively.
A Summary of the Foundation Research Program, Fiscal Year 1985.
1986-05-12
system in the domain of actuarial science. Publication: T. R. Sivasankaran and M. Jarke, "Coupling Expert Systems and Actuarial Pricing Models... Actuarial Pricing Models," Workshop on Coupling Symbolic and Numerical Computing in Expert Systems, Bellevue, Washington, August 1985. Title: Application...Ramjets", AIAA-85-1177, AIAA/SAE/ASME/ASEE 21st Joint Propulsion Conference, July 8-10, 1985. A. Gany and D. W. Netzer, "Fuel Performance Evaluation
Use of the Delphi method in resolving complex water resources issues
Taylor, J.G.; Ryder, S.D.
2003-01-01
The tri-state river basins, shared by Georgia, Alabama, and Florida, are being modeled by the U.S. Fish and Wildlife Service and the U.S. Army Corps of Engineers to help facilitate agreement in an acrimonious water dispute among these different state governments. Modeling of such basin reservoir operations requires parallel understanding of several river system components: hydropower production, flood control, municipal and industrial water use, navigation, and reservoir fisheries requirements. The Delphi method, using repetitive surveying of experts, was applied to determine fisheries' water and lake-level requirements on 25 reservoirs in these interstate basins. The Delphi technique allowed the needs and requirements of fish populations to be brought into the modeling effort on equal footing with other water supply and demand components. When the subject matter is concisely defined and limited, this technique can rapidly assess expert opinion on any natural resource issue, and even move expert opinion toward greater agreement.
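The Delphi mechanism the study relies on, repeated survey rounds in which experts see the panel's aggregate and revise their estimates, can be sketched with a toy simulation. The revision fraction and the lake-level numbers are illustrative assumptions, not data from the tri-state basins:

```python
# Toy Delphi simulation: each round, experts revise toward the panel median,
# which narrows the spread of opinion. The pull factor is an assumption.

def delphi_round(estimates, pull=0.5):
    """One Delphi round: each expert moves a fraction `pull` toward the median."""
    ordered = sorted(estimates)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2 else
              (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    return [e + pull * (median - e) for e in estimates]

# Hypothetical initial estimates of a minimum lake level (metres) for a fishery.
estimates = [281.0, 284.0, 286.0, 290.0, 295.0]
for _ in range(3):
    estimates = delphi_round(estimates)
spread = max(estimates) - min(estimates)
# three rounds shrink the range from 14 m to 14 * 0.5**3 = 1.75 m
```

Real Delphi rounds feed back reasons as well as statistics, so convergence is driven by argument rather than by a mechanical pull, but the shrinking-spread dynamic is the same.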
A Dual Hesitant Fuzzy Multigranulation Rough Set over Two-Universe Model for Medical Diagnoses
Zhang, Chao; Li, Deyu; Yan, Yan
2015-01-01
In medical science, disease diagnosis is one of the difficult tasks for medical experts, who must contend with large amounts of uncertain medical information. Moreover, different medical experts may express views of the medical knowledge base that differ slightly from one another's. Thus, to solve the problems of uncertain data analysis and group decision making in disease diagnoses, we propose a new rough set model, called the dual hesitant fuzzy multigranulation rough set over two universes, obtained by combining the dual hesitant fuzzy set and multigranulation rough set theories. In the framework of our study, both the definition and some basic properties of the proposed model are presented. Finally, we give a general approach which is applied to a decision making problem in disease diagnoses, and the effectiveness of the approach is demonstrated by a numerical example. PMID:26858772
Thermal Expert System (TEXSYS): Systems autonomy demonstration project, volume 2. Results
NASA Technical Reports Server (NTRS)
Glass, B. J. (Editor)
1992-01-01
The Systems Autonomy Demonstration Project (SADP) produced a knowledge-based real-time control system for control and fault detection, isolation, and recovery (FDIR) of a prototype two-phase Space Station Freedom external active thermal control system (EATCS). The Thermal Expert System (TEXSYS) was demonstrated in recent tests to be capable of reliable fault anticipation and detection, as well as ordinary control of the thermal bus. Performance requirements were addressed by adopting a hierarchical symbolic control approach-layering model-based expert system software on a conventional, numerical data acquisition and control system. The model-based reasoning capabilities of TEXSYS were shown to be advantageous over typical rule-based expert systems, particularly for detection of unforeseen faults and sensor failures. Volume 1 gives a project overview and testing highlights. Volume 2 provides detail on the EATCS testbed, test operations, and online test results. Appendix A is a test archive, while Appendix B is a compendium of design and user manuals for the TEXSYS software.
Thermal Expert System (TEXSYS): Systems autonomy demonstration project, volume 1. Overview
NASA Technical Reports Server (NTRS)
Glass, B. J. (Editor)
1992-01-01
The Systems Autonomy Demonstration Project (SADP) produced a knowledge-based real-time control system for control and fault detection, isolation, and recovery (FDIR) of a prototype two-phase Space Station Freedom external active thermal control system (EATCS). The Thermal Expert System (TEXSYS) was demonstrated in recent tests to be capable of reliable fault anticipation and detection, as well as ordinary control of the thermal bus. Performance requirements were addressed by adopting a hierarchical symbolic control approach-layering model-based expert system software on a conventional, numerical data acquisition and control system. The model-based reasoning capabilities of TEXSYS were shown to be advantageous over typical rule-based expert systems, particularly for detection of unforeseen faults and sensor failures. Volume 1 gives a project overview and testing highlights. Volume 2 provides detail on the EATCS test bed, test operations, and online test results. Appendix A is a test archive, while Appendix B is a compendium of design and user manuals for the TEXSYS software.
Aerothermodynamics of expert ballistic vehicle at hypersonic speeds
NASA Astrophysics Data System (ADS)
Kharitonov, A. M.; Adamov, N. P.; Chirkashenko, V. F.; Mazhul, I. I.; Shpak, S. I.; Shiplyuk, A. N.; Vasenyov, L. G.; Zvegintsev, V. I.; Muylaert, J. M.
2012-01-01
The European EXPErimental Re-entry Test bed (EXPERT) vehicle is intended for studying various basic phenomena, such as the boundary-layer transition on blunted bodies, real-gas effects during shock wave/boundary layer interaction, and the effect of surface catalycity. Another task is to develop methods for recalculating the results of wind-tunnel experiments to flight conditions. The EXPERT program involves large-scale preflight research, in particular, various calculations with the use of advanced numerical methods, experimental studies of the models in various wind tunnels, and comparative analysis of the data obtained for possible extrapolation to in-flight conditions. The experimental studies are performed in various aerodynamic centers of Europe and Russia under contracts with ESA-ESTEC. In particular, extensive experiments are performed at the Von Karman Institute for Fluid Dynamics (VKI, Belgium) and also at the DLR aerospace center in Germany. At ITAM SB RAS, experimental studies of the EXPERT model characteristics were performed under ISTC Projects 2109, 3151, and 3550, in the T-313 supersonic wind tunnel and the AT-303 hypersonic wind tunnel.
Enhancements to the Engine Data Interpretation System (EDIS)
NASA Technical Reports Server (NTRS)
Hofmann, Martin O.
1993-01-01
The Engine Data Interpretation System (EDIS) expert system project assists the data review personnel at NASA/MSFC in performing post-test data analysis and engine diagnosis of the Space Shuttle Main Engine (SSME). EDIS uses knowledge of the engine, its components, and simple thermodynamic principles instead of, and in addition to, heuristic rules gathered from the engine experts. EDIS reasons in cooperation with human experts, following roughly the pattern of logic exhibited by human experts. EDIS concentrates on steady-state static faults, such as small leaks, and component degradations, such as pump efficiencies. The objective of this contract was to complete the set of engine component models, integrate heuristic rules into EDIS, integrate the Power Balance Model into EDIS, and investigate modification of the qualitative reasoning mechanisms to allow 'fuzzy' value classification. The result of this contract is an operational version of EDIS. EDIS will become a module of the Post-Test Diagnostic System (PTDS) and will, in this context, provide system-level diagnostic capabilities which integrate component-specific findings provided by other modules.
Motion generation of robotic surgical tasks: learning from expert demonstrations.
Reiley, Carol E; Plaku, Erion; Hager, Gregory D
2010-01-01
Robotic surgical assistants offer the possibility of automating portions of a task that are time-consuming and tedious in order to reduce the cognitive workload of a surgeon. This paper proposes using programming by demonstration to build generative models and generate smooth trajectories that capture the underlying structure of the motion data recorded from expert demonstrations. Specifically, motion data from Intuitive Surgical's da Vinci Surgical System of a panel of expert surgeons performing three surgical tasks are recorded. The trials are decomposed into subtasks, or surgemes, which are then temporally aligned through dynamic time warping. Next, a Gaussian Mixture Model (GMM) encodes the experts' underlying motion structure. Gaussian Mixture Regression (GMR) is then used to extract a smooth reference trajectory for reproducing the task. The approach is evaluated through an automated skill assessment measurement. Results suggest that this approach provides a means to (i) extract important features of the task, (ii) create a metric to evaluate robot imitative performance, and (iii) generate smoother trajectories for reproduction of three common medical tasks.
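The GMR step mentioned above has a closed form: given mixture components fitted over (time, position) pairs, the smooth reference trajectory is the conditional mean of position given time. A minimal numpy sketch, with two hand-set 2D components standing in for an EM-fitted model (all parameter values are illustrative):

```python
# Sketch of Gaussian Mixture Regression: E[x | t] under a 2D (t, x) mixture.
# The two components below are hand-set stand-ins for an EM-fitted GMM.
import numpy as np

# Each component: (weight, mean over [t, x], covariance over (t, x)).
components = [
    (0.5, np.array([0.25, 0.0]), np.array([[0.02, 0.01], [0.01, 0.02]])),
    (0.5, np.array([0.75, 1.0]), np.array([[0.02, 0.01], [0.01, 0.02]])),
]

def gmr(t):
    """Conditional mean E[x | t] under the mixture (standard GMR formula)."""
    weights, means = [], []
    for pi, mu, cov in components:
        var_t = cov[0, 0]
        # Responsibility of this component for t (Gaussian density in t).
        dens = pi * np.exp(-0.5 * (t - mu[0]) ** 2 / var_t) / np.sqrt(2 * np.pi * var_t)
        # Conditional mean of x given t for this component.
        cond = mu[1] + cov[1, 0] / var_t * (t - mu[0])
        weights.append(dens)
        means.append(cond)
    weights = np.array(weights) / np.sum(weights)
    return float(np.dot(weights, means))

trajectory = [gmr(t) for t in np.linspace(0.0, 1.0, 5)]
# the trajectory rises smoothly from near 0 toward 1 as t goes 0 -> 1
```

In the paper the input dimension is the DTW-aligned time index and the output is the multi-dimensional tool pose; the formula generalises component-wise.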
Transport and fate of engineered silver nanoparticles in aquifer media
NASA Astrophysics Data System (ADS)
Adrian, Y.; Schneidewind, U.; Azzam, R.
2016-12-01
Engineered silver nanoparticles (AgNPs) are used in various consumer and medical products due to their antimicrobial properties. Their transport behavior in the environment is still under investigation. Previous studies have focused on the transport of AgNPs in test systems with pure quartz sand or topsoil materials, but studies investigating aquifer material are rare. However, the protection of groundwater resources is an important part of protecting human health and assuring future economic activities. Therefore, expert knowledge regarding the transport, behavior and fate of engineered nanoparticles as potential contaminants in aquifers is essential. The transport and retention behavior of two commercially available engineered AgNPs (one stabilized with a polymer and one with a surfactant) in natural silicate-dominated aquifer material was investigated in saturated laboratory columns. For the experiments, a mean grain size diameter of 0.7 mm was chosen, with varying silt and clay contents to investigate their effect on the transport behavior of the AgNPs. Typical flow velocities were chosen to represent natural conditions. Particle concentration in the effluent was measured using ICP-MS, and the finite element code HYDRUS-1D was used to model the transport and retention processes. The size of the silver nanoparticles in the effluent was analyzed using Flow Field-Flow Fractionation. The results show that the silt and clay contents as well as the stabilization of the AgNPs control the transport and retention of AgNPs. Increasing breakthrough was observed with decreasing clay and silt content.
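The kind of 1D model HYDRUS-1D fits to such column data combines advection, dispersion, and first-order attachment of particles to grain surfaces. A minimal explicit finite-difference sketch of that equation (parameter values are illustrative, not the fitted values from the study):

```python
# Explicit upwind finite-difference sketch of 1D advection-dispersion transport
# with first-order attachment (a simplified stand-in for HYDRUS-1D's solver).
# All parameter values are illustrative assumptions.

def simulate(nz=50, nt=2000, L=0.1, T=600.0,
             v=2e-4, D=1e-6, katt=1e-3, c0=1.0):
    """Column of length L [m], duration T [s], velocity v [m/s], dispersion D
    [m^2/s], attachment rate katt [1/s], inlet concentration c0 (relative)."""
    dz, dt = L / nz, T / nt
    c = [0.0] * nz
    for _ in range(nt):
        new = c[:]
        for i in range(1, nz - 1):
            adv = -v * (c[i] - c[i - 1]) / dz                      # upwind advection
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dz ** 2  # dispersion
            new[i] = c[i] + dt * (adv + disp - katt * c[i])        # attachment sink
        new[0] = c0        # constant-concentration inlet
        new[-1] = new[-2]  # zero-gradient outlet
        c = new
    return c

profile = simulate()
# attachment makes the relative effluent concentration lower than the inlet's
```

Fitting `katt` (and, in more complete formulations, a blocking or straining term) to the measured breakthrough curves is how the column experiments quantify the effect of silt and clay content.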
Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G
2012-01-01
Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component instructional design (4C/ID) model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.
High-level user interfaces for transfer function design with semantics.
Salama, Christof Rezk; Keller, Maik; Kohlmann, Peter
2006-01-01
Many sophisticated techniques for the visualization of volumetric data such as medical data have been published. While existing techniques are mature from a technical point of view, managing the complexity of visual parameters is still difficult for non-expert users. To this end, this paper presents new ideas to facilitate the specification of optical properties for direct volume rendering. We introduce an additional level of abstraction for parametric models of transfer functions. The proposed framework allows visualization experts to design high-level transfer function models which can intuitively be used by non-expert users. The results are user interfaces which provide semantic information for specialized visualization problems. The proposed method is based on principal component analysis as well as on concepts borrowed from computer animation.
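The PCA-based abstraction described above can be sketched as follows: treat each expert-designed transfer-function preset as a parameter vector, and expose movement along the first principal component as a single semantic slider for non-expert users. The preset values and parameter meanings below are hypothetical, purely for illustration.

```python
import numpy as np

# Hypothetical: each row is one expert-designed transfer-function preset
# (e.g. opacity-ramp centre, ramp width, three colour weights) for the
# same visualization task.
presets = np.array([
    [0.30, 0.10, 0.9, 0.2, 0.1],
    [0.35, 0.12, 0.8, 0.3, 0.1],
    [0.40, 0.15, 0.7, 0.4, 0.2],
    [0.45, 0.18, 0.6, 0.5, 0.2],
])

mean = presets.mean(axis=0)
# Principal components of the expert variation via SVD of centred data.
_, _, vt = np.linalg.svd(presets - mean, full_matrices=False)
axis = vt[0]                      # dominant mode of expert variation

def slider_to_preset(t):
    """Map one semantic slider value t (roughly -1..1) to a full
    transfer-function parameter vector along the dominant mode."""
    return mean + t * axis

print(slider_to_preset(0.0))      # the mean expert preset
```

The non-expert then manipulates one meaningful degree of freedom instead of five raw parameters; additional components could drive further sliders.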
Variability in perceived satisfaction of reservoir management objectives
Owen, W.J.; Gates, T.K.; Flug, M.
1997-01-01
Fuzzy set theory provides a useful model to address imprecision in interpreting linguistically described objectives for reservoir management. Fuzzy membership functions can be used to represent degrees of objective satisfaction for different values of management variables. However, lack of background information, differing experiences and qualifications, and complex interactions of influencing factors can contribute to significant variability among membership functions derived from surveys of multiple experts. In the present study, probabilistic membership functions are used to model variability in experts' perceptions of satisfaction of objectives for hydropower generation, fish habitat, kayaking, rafting, and scenery preservation on the Green River through operations of Flaming Gorge Dam. Degree of variability in experts' perceptions differed among objectives but resulted in substantial uncertainty in estimation of optimal reservoir releases.
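The membership-function idea above can be sketched with a triangular fuzzy membership grade for one objective, elicited from several experts whose breakpoints differ; the spread of grades at a given release captures the inter-expert variability the abstract describes. All breakpoint values here are hypothetical.

```python
import numpy as np

def tri_membership(x, lo, peak, hi):
    """Triangular fuzzy membership: 0 at lo and hi, 1 at peak."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

# Hypothetical elicited breakpoints (m^3/s) from four experts for
# "release satisfactory for kayaking".
experts = [(20, 45, 80), (25, 50, 90), (15, 40, 70), (20, 55, 95)]

flow = 60.0
grades = np.array([tri_membership(flow, *e) for e in experts])
print(f"mean satisfaction {grades.mean():.2f}, spread (std) {grades.std():.2f}")
```

The standard deviation across experts is a simple summary of the variability that, in the study's terms, would propagate into uncertainty about optimal releases.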
NASA Astrophysics Data System (ADS)
Lewis, Elizabeth; Kilsby, Chris; Fowler, Hayley
2014-05-01
The impact of climate change on hydrological systems requires further quantification in order to inform water management. This study intends to conduct such analysis using hydrological models. Such models are of varying forms, of which conceptual, lumped parameter models and physically-based models are two important types. The majority of hydrological studies use conceptual models calibrated against measured river flow time series in order to represent catchment behaviour. This method often shows impressive results for specific problems in gauged catchments. However, the results may not be robust under non-stationary conditions such as climate change, as physical processes and relationships subject to change are not accounted for explicitly. Moreover, conceptual models are less readily applicable to ungauged catchments, in which hydrological predictions are also required. As such, the physically based, spatially distributed model SHETRAN is used in this study to develop a robust and reliable framework for modelling historic and future behaviour of gauged and ungauged catchments across the whole of Great Britain. In order to achieve this, a large array of data completely covering Great Britain for the period 1960-2006 has been collated and efficiently stored ready for model input. The data processed include a DEM, rainfall, potential evapotranspiration (PE), and maps of geology, soil and land cover. A desire to make the modelling system easy for others to work with led to the development of a user-friendly graphical interface. This allows non-experts to set up and run a catchment model in a few seconds, a process that can normally take weeks or months. The quality and reliability of the extensive dataset for modelling hydrological processes has also been evaluated. One aspect of this has been an assessment of error and uncertainty in rainfall input data, as well as the effects of temporal resolution in precipitation inputs on model calibration. 
SHETRAN has been updated to accept gridded rainfall inputs, and UKCP09 gridded daily rainfall data has been disaggregated using hourly records to analyse the implications of using realistic sub-daily variability. Furthermore, the development of a comprehensive dataset and computationally efficient means of setting up and running catchment models has allowed for examination of how a robust parameter scheme may be derived. This analysis has been based on collective parameterisation of multiple catchments in contrasting hydrological settings and subject to varied processes. 350 gauged catchments all over the UK have been simulated, and a robust set of parameters is being sought by examining the full range of hydrological processes and calibrating to a highly diverse flow data series. The modelling system will be used to generate flow time series based on historical input data and also downscaled Regional Climate Model (RCM) forecasts using the UKCP09 Weather Generator. This will allow for analysis of flow frequency and associated future changes, which cannot be determined from the instrumental record or from lumped parameter model outputs calibrated only to historical catchment behaviour. This work will be based on the existing and functional modelling system described following some further improvements to calibration, particularly regarding simulation of groundwater-dominated catchments.
Ares I Reaction Control System Propellant Feedline Decontamination Modeling
NASA Technical Reports Server (NTRS)
Pasch, James J.
2010-01-01
The objective of the work presented here is to quantify the effects of purge gas temperature, pressure, and mass flow rate on hydrazine (Hz) decontamination rates of the Ares I Roll Control System and Reaction Control System. A survey of experts in this field revealed the absence of any decontamination rate prediction models. Three basic decontamination methods were identified for analysis and modeling. These include low pressure eduction, high flow rate purge, and pulse purge. For each method, an approach to predict the Hz mass transfer rate, as a function of system pressure, temperature, and purge gas mass flow rate, is developed based on the applicable physics. The models show that low pressure eduction is two orders of magnitude more effective than the high velocity purge, which in turn is two orders of magnitude more effective than the pure diffusion component of pulse purging of deadheads. Eduction subjects the system to low pressure conditions that promote the extraction of Hz vapors. At 120 F, Hz is saturated at approximately 1 psia. At lower pressures and 120 F, Hz will boil, which is an extremely efficient means to remove liquid Hz. The Hz boiling rate is predicted by equating the rate at which energy is added to the saturated liquid Hz through heaters at the tube outer wall with the energy removed from the liquid through evaporation. Boil-off fluxes were predicted by iterating through the range of local pressures, with limits set by the minimum allowed pressure of 0.2 psia and the maximum allowed wall temperature of 120 F established by the heaters, which gives a saturation pressure of approximately 1.0 psia. Figure 1 shows the resulting boil-off fluxes as a function of local eduction pressure. As depicted in figure 1, the flux is a strong inverse function of eduction pressure, so minimizing the eduction pressure maximizes the boil-off flux. 
Also, higher outer wall temperatures lead to higher boil-off fluxes and allow for boil-off over a greater range of eduction pressures.
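The energy balance described above (heater power into the saturated liquid equals latent heat carried away by evaporation) reduces to a boil-off mass flux of m'' = q''/h_fg. A minimal sketch follows; the latent-heat value is an assumed round number for hydrazine and the heater flux is hypothetical, not figures from the analysis.

```python
# Energy balance at the tube wall: heater heat flux into the saturated
# liquid equals the latent heat removed by evaporation, so the boil-off
# mass flux is m'' = q'' / h_fg.  The property value below is an assumed
# round number for hydrazine, not a design figure.
H_FG_HYDRAZINE = 1.4e6   # J/kg, assumed latent heat of vaporization

def boiloff_mass_flux(heater_flux_w_m2):
    """Boil-off mass flux (kg/s per m^2) for a given wall heat flux."""
    return heater_flux_w_m2 / H_FG_HYDRAZINE

print(f"{boiloff_mass_flux(500.0):.2e} kg/s/m^2 at 500 W/m^2")
```

Consistent with the abstract, anything that raises the deliverable wall heat flux (e.g. a higher allowed wall temperature) raises the boil-off flux proportionally in this simple balance.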
Kang, Tae-Woo; Cynn, Heon-Seock
2017-01-01
The International Classification of Functioning, Disability, and Health (ICF) provides models for functions and disabilities. The ICF is presented as a frame that enables organizing physical therapists' clinical practice for application. The purpose of the present study was to describe processes through which stroke patients are assessed and treated based on the ICF model. The patient was a 65-year-old female diagnosed with right cerebral artery infarction with left hemiparesis. Progressive interventions were applied, such as those aiming at sitting and standing for the first two weeks, gait intervention for the third and fourth weeks, and those aiming at sitting from a standing position for the fifth and sixth weeks. The ICF model provides rehabilitation experts with a frame that enables them to accurately identify and understand their patients' problems. The ICF model helps the experts understand not only their patients' body structure, function, activity, and participation, but also their problems related to personal and environmental factors. The experts could efficiently make decisions and provide optimum treatment at clinics using the ICF model.
Expert elicitation, uncertainty, and the value of information in controlling invasive species
Johnson, Fred A.; Smith, Brian J.; Bonneau, Mathieu; Martin, Julien; Romagosa, Christina; Mazzotti, Frank J.; Waddle, J. Hardin; Reed, Robert; Eckles, Jennifer Kettevrlin; Vitt, Laurie J.
2017-01-01
We illustrate the utility of expert elicitation, explicit recognition of uncertainty, and the value of information for directing management and research efforts for invasive species, using tegu lizards (Salvator merianae) in southern Florida as a case study. We posited a post-birth-pulse matrix model in which four age classes of tegus are recognized: hatchlings, 1 year-olds, 2 year-olds, and 3 + year-olds. This matrix model was parameterized using a 3-point elicitation process to obtain estimates of tegu demographic rates in southern Florida from 10 herpetology experts. We fit statistical distributions for each parameter and for each expert, then drew and pooled a large number of replicate samples from these to form a distribution for each demographic parameter. Using these distributions, as well as the observed correlations among elicited values, we generated a large sample of matrix population models to infer how the tegu population would respond to control efforts. We used the concepts of Pareto efficiency and stochastic dominance to conclude that targeting older age classes at relatively high rates appears to have the best chance of minimizing tegu abundance and control costs. We conclude that expert opinion combined with an explicit consideration of uncertainty can be valuable in conducting an initial assessment of what control strategy, effort, and monetary resources are needed to reduce and eventually eliminate the invader. Scientists, in turn, can use the value of information to focus research in a way that not only increases the efficacy of control, but minimizes costs as well.
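A stage-structured matrix model of the kind described can be sketched as a 4-class Leslie-type projection, with control entering as an extra per-class removal rate; comparing asymptotic growth rates shows the effect of targeting older classes. All demographic rates below are hypothetical placeholders, not the elicited values from the study.

```python
import numpy as np

def tegu_matrix(s, f, removal=(0, 0, 0, 0)):
    """Post-birth-pulse Leslie-type matrix for 4 classes
    (hatchling, 1 yr, 2 yr, 3+ yr).  s: annual survival per class,
    f: fertility of the two oldest classes, removal: extra per-class
    removal rate imposed by control.  Structure is one plausible choice."""
    s_eff = [si * (1 - ri) for si, ri in zip(s, removal)]
    A = np.zeros((4, 4))
    A[0, 2] = f[0] * s_eff[2]     # reproduction after surviving the year
    A[0, 3] = f[1] * s_eff[3]
    A[1, 0] = s_eff[0]            # hatchling -> 1 yr
    A[2, 1] = s_eff[1]            # 1 yr -> 2 yr
    A[3, 2] = s_eff[2]            # 2 yr -> 3+
    A[3, 3] = s_eff[3]            # 3+ class is self-renewing
    return A

def growth_rate(A):
    """Asymptotic population growth rate (dominant eigenvalue modulus)."""
    return max(abs(np.linalg.eigvals(A)))

# Hypothetical demographic rates for illustration only.
s, f = [0.4, 0.6, 0.7, 0.8], [8.0, 12.0]
lam_none = growth_rate(tegu_matrix(s, f))
lam_old  = growth_rate(tegu_matrix(s, f, removal=(0, 0, 0.5, 0.5)))
print(f"lambda, no control: {lam_none:.2f}; targeting older classes: {lam_old:.2f}")
```

In the study, a large sample of such matrices (one per pooled parameter draw) would be projected to propagate elicitation uncertainty into the control comparison.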
Aviation Safety Risk Modeling: Lessons Learned From Multiple Knowledge Elicitation Sessions
NASA Technical Reports Server (NTRS)
Luxhoj, J. T.; Ancel, E.; Green, L. L.; Shih, A. T.; Jones, S. M.; Reveley, M. S.
2014-01-01
Aviation safety risk modeling has elements of both art and science. In a complex domain, such as the National Airspace System (NAS), it is essential that knowledge elicitation (KE) sessions with domain experts be performed to facilitate the making of plausible inferences about the possible impacts of future technologies and procedures. This study discusses lessons learned throughout the multiple KE sessions held with domain experts to construct probabilistic safety risk models for a Loss of Control Accident Framework (LOCAF), FLightdeck Automation Problems (FLAP), and Runway Incursion (RI) mishap scenarios. The intent of these safety risk models is to support a portfolio analysis of NASA's Aviation Safety Program (AvSP). These models use the flexible, probabilistic approach of Bayesian Belief Networks (BBNs) and influence diagrams to model the complex interactions of aviation system risk factors. Each KE session had a different set of experts with diverse expertise, such as pilot, air traffic controller, certification, and/or human factors knowledge that was elicited to construct a composite, systems-level risk model. There were numerous "lessons learned" from these KE sessions that deal with behavioral aggregation, conditional probability modeling, object-oriented construction, interpretation of the safety risk results, and model verification/validation that are presented in this paper.
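The BBN machinery referred to above can be illustrated at toy scale: a single risk node with two parent factors, marginalized by enumerating the parents. All conditional probabilities here are invented for illustration, not elicited values from the KE sessions.

```python
from itertools import product

# Hypothetical CPTs for a toy loss-of-control fragment:
# P(LOC | automation_problem, crew_distraction).  All numbers are
# invented for illustration, not elicited values.
p_auto = 0.05                       # P(automation problem)
p_dist = 0.10                       # P(crew distraction)
p_loc = {(True, True): 0.30, (True, False): 0.08,
         (False, True): 0.05, (False, False): 0.001}

def marginal_loc():
    """Marginal P(LOC) by enumerating the two parent variables."""
    total = 0.0
    for a, d in product([True, False], repeat=2):
        pa = p_auto if a else 1 - p_auto
        pd = p_dist if d else 1 - p_dist
        total += pa * pd * p_loc[(a, d)]
    return total

print(f"P(loss of control) = {marginal_loc():.4f}")
```

Real aviation risk BBNs have many more nodes and use specialized inference engines, but the elicited quantities are exactly these conditional probability tables.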
Expertise finding in bibliographic network: topic dominance learning approach.
Neshati, Mahmood; Hashemi, Seyyed Hadi; Beigy, Hamid
2014-12-01
The expert finding problem in bibliographic networks has received increased interest in recent years. This problem concerns finding relevant researchers for a given topic. Motivated by the observation that rarely do all coauthors contribute to a paper equally, in this paper we propose two discriminative methods for identifying the leading authors contributing to a scientific publication. Specifically, we recast the problem of expert finding in a bibliographic network as that of finding the leading experts in a research group, which is easier to solve. We identify three feature groups that can discriminate relevant experts from other authors of a document. Experimental results on a real dataset, and on a synthetic one gathered from the Microsoft Academic search engine, show that the proposed model significantly improves the performance of expert finding in terms of all common information retrieval evaluation metrics.
Transgenic Rat Models for Breast Cancer Research
1996-10-01
colleagues, Dr. Henry Pitot, an expert in hepatocarcinogenesis, and Dr. Michael Gould, an expert in breast cancer. Through our initial attempts at...974-978. 29. Dragan, Y.P. and H.C. Pitot. 1992. The role of the stages of initiation and promotion in phenotypic diversity during hepatocarcinogenesis
Decision support system and medical liability.
Allaërt, F. A.; Dusserre, L.
1992-01-01
Expert systems, which are set to become an essential tool in medicine, are evolving in the sophistication of both their knowledge representation and the types of reasoning models used. The more efficient they are, the more often they will be used, and professional liability will be involved. After a short survey of the configuration and working of expert systems, the authors study the liabilities of those who build and those who use expert systems with regard to various dysfunctions. Expert systems must be regarded only as support for human decisions and should not possess any authority themselves; doctors must therefore keep in mind that the responsibility remains their own, and retain their judgment and critical sense. Other professionals could nevertheless be held liable if they participated in building the expert system. The different liabilities and the burden of proof are discussed in relation to possible dysfunctions. In any case, the final proof lies within the expert system itself, through re-computation of the data. PMID:1482972
German, Ramaris E; Adler, Abby; Frankel, Sarah A; Stirman, Shannon Wiltsey; Pinedo, Paola; Evans, Arthur C; Beck, Aaron T; Creed, Torrey A
2018-03-01
Use of expert-led workshops plus consultation has been established as an effective strategy for training community mental health (CMH) clinicians in evidence-based practices (EBPs). Because of high rates of staff turnover, this strategy inadequately addresses the need to maintain capacity to deliver EBPs. This study examined knowledge, competency, and retention outcomes of a two-phase model developed to build capacity for an EBP in CMH programs. In the first phase, an initial training cohort in each CMH program participated in in-person workshops followed by expert-led consultation (in-person, expert-led [IPEL] phase) (N=214 clinicians). After this cohort completed training, new staff members participated in Web-based training (in place of in-person workshops), followed by peer-led consultation with the initial cohort (Web-based, trained-peer [WBTP] phase) (N=148). Tests of noninferiority assessed whether WBTP was not inferior to IPEL at increasing clinician cognitive-behavioral therapy (CBT) competency, as measured by the Cognitive Therapy Rating Scale. WBTP was not inferior to IPEL at developing clinician competency. Hierarchical linear models showed no significant differences in CBT knowledge acquisition between the two phases. Survival analyses indicated that WBTP trainees were less likely than IPEL trainees to complete training. In terms of time required from experts, WBTP required 8% of the resources of IPEL. After an initial investment to build in-house CBT expertise, CMH programs were able to use a WBTP model to broaden their own capacity for high-fidelity CBT. IPEL followed by WBTP offers an effective alternative to build EBP capacity in CMH programs, rather than reliance on external experts.
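The noninferiority tests mentioned above can be sketched with the standard confidence-interval approach: the new training arm is declared noninferior if the lower confidence bound on (new minus reference) exceeds the negative of a prespecified margin. The summary statistics and margin below are hypothetical, not the study's data.

```python
def noninferior(mean_new, mean_ref, se_diff, margin, z=1.645):
    """One-sided noninferiority check at ~95% confidence: the new arm
    is noninferior if the lower confidence bound for the difference
    (new - reference) lies above -margin."""
    lower = (mean_new - mean_ref) - z * se_diff
    return lower > -margin

# Hypothetical Cognitive Therapy Rating Scale summaries (0-66 scale)
# and a hypothetical noninferiority margin of 4 points.
print(noninferior(mean_new=41.0, mean_ref=42.0, se_diff=1.2, margin=4.0))
```

The key design choice in such a test is the margin, which must be justified clinically before the data are seen; the z value here assumes a one-sided 5% level.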
Expert review on poliovirus immunity and transmission.
Duintjer Tebbens, Radboud J; Pallansch, Mark A; Chumakov, Konstantin M; Halsey, Neal A; Hovi, Tapani; Minor, Philip D; Modlin, John F; Patriarca, Peter A; Sutter, Roland W; Wright, Peter F; Wassilak, Steven G F; Cochi, Stephen L; Kim, Jong-Hoon; Thompson, Kimberly M
2013-04-01
Successfully managing risks to achieve wild polioviruses (WPVs) eradication and address the complexities of oral poliovirus vaccine (OPV) cessation to stop all cases of paralytic poliomyelitis depends strongly on our collective understanding of poliovirus immunity and transmission. With increased shifting from OPV to inactivated poliovirus vaccine (IPV), numerous risk management choices motivate the need to understand the tradeoffs and uncertainties and to develop models to help inform decisions. The U.S. Centers for Disease Control and Prevention hosted a meeting of international experts in April 2010 to review the available literature relevant to poliovirus immunity and transmission. This expert review evaluates 66 OPV challenge studies and other evidence to support the development of quantitative models of poliovirus transmission and potential outbreaks. This review focuses on characterization of immunity as a function of exposure history in terms of susceptibility to excretion, duration of excretion, and concentration of excreted virus. We also discuss the evidence of waning of host immunity to poliovirus transmission, the relationship between the concentration of poliovirus excreted and infectiousness, the importance of different transmission routes, and the differences in transmissibility between OPV and WPV. We discuss the limitations of the available evidence for use in polio risk models, and conclude that despite the relatively large number of studies on immunity, very limited data exist to directly support quantification of model inputs related to transmission. Given the limitations in the evidence, we identify the need for expert input to derive quantitative model inputs from the existing data. © 2012 Society for Risk Analysis.
Pathway enrichment based on text mining and its validation on carotenoid and vitamin A metabolism.
Waagmeester, Andra; Pezik, Piotr; Coort, Susan; Tourniaire, Franck; Evelo, Chris; Rebholz-Schuhmann, Dietrich
2009-10-01
Carotenoid metabolism is relevant to the prevention of various diseases. Although the main actors in this metabolic pathway are known, our understanding of the pathway is still incomplete. The information on the carotenoids is scattered in the large and growing body of scientific literature. We designed a text-mining work flow to enrich existing pathways. It has been validated on the vitamin A pathway, which is a well-studied part of the carotenoid metabolism. In this study we used the vitamin A metabolism pathway as it has been described by an expert team on carotenoid metabolism from the European network of excellence in Nutrigenomics (NuGO). This work flow uses an initial set of publications cited in a review paper (1,191 publications), enlarges this corpus with Medline abstracts (13,579 documents), and then extracts the key terminology from all relevant publications. Domain experts validated the intermediate and final results of our text-mining work flow. With our approach we were able to enrich the pathway representing vitamin A metabolism. We found 37 new and relevant terms from a total of 89,086 terms, which have been qualified for inclusion in the analyzed pathway. These 37 terms have been assessed manually and as a result 13 new terms were then added as entities to the pathway. Another 14 entities belonged to other pathways, which could form the link of these pathways with the vitamin A pathway. The remaining 10 terms were classified as biomarkers or nutrients. Automatic literature analysis improves the enrichment of pathways with entities already described in the scientific literature.
[Assessment of an educational technology in the string literature about breastfeeding].
de Oliveira, Paula Marciana Pinheiro; Pagliuca, Lorita Marlena Freitag
2013-02-01
The goal of this study was to assess educational technology in the string literature about breastfeeding. The study was conducted between March and September 2009 by breastfeeding experts and experts on string literature. A psychometric model was adopted as the theoretical-methodological framework. For data collection, an instrument was used to assess the content about breastfeeding and the string literature rules. The analysis was based on comparisons of the notes and critical reflections of experts. Ethical guidelines were followed during the study. After the assessments, the educational technology was adjusted until all of the experts agreed. The assessment of educational technology can reduce obstacles to information dissemination and can lead to improvements in quality of life.
The digital storytelling process: A comparative analysis from various experts
NASA Astrophysics Data System (ADS)
Hussain, Hashiroh; Shiratuddin, Norshuhada
2016-08-01
Digital Storytelling (DST) is a method of delivering information to the audience. It combines narrative and digital media content infused with multimedia elements. In order for educators (i.e., the designers) to create a compelling digital story, there are sets of processes introduced by experts. Nevertheless, the experts suggest a variety of processes to guide them, some of which are redundant. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be implemented in other multimedia materials that use the concept of DST.
Why Bother to Calibrate? Model Consistency and the Value of Prior Information
NASA Astrophysics Data System (ADS)
Hrachowitz, Markus; Fovet, Ophelie; Ruiz, Laurent; Euser, Tanja; Gharari, Shervan; Nijzink, Remko; Savenije, Hubert; Gascuel-Odoux, Chantal
2015-04-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by 4 calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce 20 hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by using prior information about the system to impose "prior constraints", inferred from expert knowledge and to ensure a model which behaves well with respect to the modeller's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model set-up exhibited increased performance in the independent test period and skill to reproduce all 20 signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if efficiently counter-balanced by available prior constraints, can increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge driven strategy of constraining models.
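A hydrological signature of the kind used above to test model consistency can be sketched concretely; one common example is the slope of the flow duration curve between two exceedance probabilities, a measure of flow-regime flashiness. The flow series below is synthetic, purely for illustration.

```python
import numpy as np

def fdc_slope(flows, lo=0.33, hi=0.66):
    """Slope of the flow duration curve between two exceedance
    probabilities, in log space -- a common hydrological signature
    of flow-regime flashiness."""
    q = np.sort(np.asarray(flows, dtype=float))[::-1]        # descending
    exceed = (np.arange(len(q)) + 1) / (len(q) + 1)          # increasing
    q_lo = np.interp(lo, exceed, q)
    q_hi = np.interp(hi, exceed, q)
    return (np.log(q_lo) - np.log(q_hi)) / (hi - lo)

rng = np.random.default_rng(0)
flows = np.exp(rng.normal(1.0, 0.8, size=365))   # synthetic daily flows
print(f"FDC slope: {fdc_slope(flows):.2f}")
```

A consistent model should reproduce signatures like this one in an independent period even when they were not used as calibration objectives, which is the diagnostic the study exploits.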
Why Bother and Calibrate? Model Consistency and the Value of Prior Information.
NASA Astrophysics Data System (ADS)
Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J. E.; Savenije, H.; Gascuel-Odoux, C.
2014-12-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by 4 calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce 20 hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by using prior information about the system to impose "prior constraints", inferred from expert knowledge and to ensure a model which behaves well with respect to the modeller's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model set-up exhibited increased performance in the independent test period and skill to reproduce all 20 signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if efficiently counter-balanced by available prior constraints, can increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge driven strategy of constraining models.
Marín, Víctor H; Delgado, Luisa E; Bachmann, Pamela
2008-09-01
The use of brainstorming techniques for the generation of conceptual models, as the basis for the integrated management of physical-ecological-social systems (PHES-systems), is tested and discussed. The methodology is applied in the analysis of the Aysén fjord and watershed (southern Chilean coast). Results show that the proposed methods can be used effectively in management scenarios characterized by a highly hierarchical membership of experts and non-experts.
An Expert System for Managing Storage Space Constraints Aboard United States Naval Vessels
1991-12-01
...concludes that the use of an expert system would provide valuable assistance to the afloat Supply Officer and recommends further research to establish the... APPLICABLE FORECASTING MODELS; APPLICABLE OPERATIONS RESEARCH MODELS; AN EXPERT SYSTEM: VARIABLES TO CONSIDER
Practical problems in aggregating expert opinions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Booker, J.M.; Picard, R.R.; Meyer, M.A.
1993-11-01
Expert opinion is data given by a qualified person in response to a technical question. In these analyses, expert opinion provides information where other data are either sparse or non-existent. Improvements in forecasting result from the advantageous addition of expert opinion to observed data in many areas, such as meteorology and econometrics. More generally, analyses of large, complex systems often involve experts on various components of the system supplying input to a decision process; applications include such wide-ranging areas as nuclear reactor safety, management science, and seismology. For large or complex applications, no single expert may be knowledgeable enough about the entire application. In other problems, decision makers may find it comforting that a consensus or aggregation of opinions is usually better than a single opinion. Many risk and reliability studies require a single estimate for modeling, analysis, reporting, and decision making purposes. For problems with large uncertainties, the strategy of combining as diverse a set of experts as possible hedges against underestimation of that uncertainty. Decision makers are frequently faced with the task of selecting the experts and combining their opinions. However, the aggregation is often the responsibility of an analyst. Whether the decision maker or the analyst does the aggregation, the input for it, such as providing weights for experts or estimating other parameters, is imperfect owing to a lack of omniscience. Aggregation methods for expert opinions have existed for over thirty years; yet many of the difficulties with their use remain unresolved. The bulk of these problem areas are summarized in the sections that follow: sensitivities of results to assumptions, weights for experts, correlation of experts, and handling uncertainties. The purpose of this paper is to discuss the sources of these problems and describe their effects on aggregation.
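Two standard aggregation schemes touched on above are the linear and logarithmic opinion pools; both depend on the expert weights the paper identifies as a persistent difficulty. This sketch uses invented distributions and weights, purely for illustration.

```python
import numpy as np

def linear_pool(probs, weights):
    """Weighted linear opinion pool: a convex combination of the
    experts' probability distributions."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return w @ np.asarray(probs, dtype=float)

def log_pool(probs, weights):
    """Logarithmic opinion pool: weighted geometric mean of the
    distributions, renormalized to sum to one."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    pooled = np.prod(np.asarray(probs, dtype=float) ** w[:, None], axis=0)
    return pooled / pooled.sum()

# Hypothetical: three experts' distributions over three failure-mode
# classes; the weights are themselves a judgment call.
probs = [[0.7, 0.2, 0.1], [0.5, 0.3, 0.2], [0.6, 0.3, 0.1]]
weights = [2.0, 1.0, 1.0]
print(linear_pool(probs, weights))
print(log_pool(probs, weights))
```

The two pools behave differently under dependence among experts and extreme opinions, which is one reason the choice of aggregation rule, not just the weights, affects the final estimate.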
NASA Astrophysics Data System (ADS)
Moradi, M.; Delavar, M. R.; Moshiri, B.; Khamespanah, F.
2014-10-01
As one of the most frightening disasters, earthquakes frequently cause huge damage to buildings, facilities and human beings. Although predicting the characteristics of an earthquake seems impossible, its losses and damage are predictable in advance. Seismic loss estimation models evaluate the extent to which urban areas are vulnerable to earthquakes. Many factors contribute to the vulnerability of urban areas against earthquakes, including the age and height of buildings, the quality of materials, the density of population and the location of flammable facilities. Therefore, seismic vulnerability assessment is a multi-criteria problem. A number of multi-criteria decision making models have been proposed based on a single expert's judgment. The main objective of this paper is to propose a model which facilitates group multi-criteria decision making based on the concept of majority voting. The main idea of majority voting is providing a computational tool to measure the degree to which different experts support each other's opinions and to make a decision based on this measure. The applicability of this model is examined in the Tehran metropolitan area, which is located in a seismically active region. The results indicate that excluding experts who receive lower degrees of support from the others enables decision makers to avoid extreme strategies. Moreover, a computational method is proposed to calculate the degree of optimism in the experts' opinions.
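The majority-voting idea above can be sketched as follows: measure the support each expert's criterion weights receive from the others, drop experts below a support threshold, and aggregate the rest. The support measure (1 minus mean absolute weight difference), the criterion set, the weights, and the threshold are all illustrative choices, not the paper's formulation.

```python
import numpy as np

def support_degrees(weight_vectors):
    """Mean pairwise support each expert receives from the others,
    measured here as 1 minus the mean absolute difference between
    criterion-weight vectors (one simple choice of support measure)."""
    W = np.asarray(weight_vectors, dtype=float)
    n = len(W)
    sup = np.zeros(n)
    for i in range(n):
        others = [1 - np.abs(W[i] - W[j]).mean() for j in range(n) if j != i]
        sup[i] = np.mean(others)
    return sup

# Hypothetical criterion weights (building age, height, materials,
# population density) from four experts; the last expert is an outlier.
W = [[0.40, 0.30, 0.20, 0.10],
     [0.35, 0.30, 0.25, 0.10],
     [0.40, 0.25, 0.20, 0.15],
     [0.05, 0.05, 0.10, 0.80]]
sup = support_degrees(W)
keep = sup >= 0.8                       # support threshold: a tunable choice
consensus = np.mean(np.asarray(W, dtype=float)[keep], axis=0)
print("support:", np.round(sup, 2))
print("consensus weights:", np.round(consensus, 2))
```

Excluding the low-support outlier pulls the consensus weights toward the majority view, which is the mechanism the paper credits with avoiding extreme strategies.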
1980 Directory of Experts on Organization and Management of Construction/CIB W-65 Commission.
1981-02-02
[Scanned text largely illegible. Recoverable fragments refer to the CIB W-65 Commission on Organization and Management of Construction, a biographic information form, and topic entries such as Construction Cost Analysis, Cost-Flow Curves in Construction, Unproductive Time in Building Operations, and Video Tape Recording for Site Studies.]
Time distortion for expert and novice online game players.
Rau, Pei-Luen Patrick; Peng, Shu-Yun; Yang, Chin-Chow
2006-08-01
Online game addiction is a new mental disorder that is difficult to describe because of its comprehensive nature. Many online game players have problems controlling their playing time: they cannot stop playing a game that they enjoy. This research surveyed the literature on "flow" and time distortion theory, and a time distortion experiment was conducted. Sixty-four children, teenagers, and young adults were recruited to investigate the effects of player skill and playing time on online game break-off. The playing experience and degree of time distortion were measured and analyzed. The results showed that both novice and expert online game players were subject to time distortion, and the participants had difficulty breaking off from the game without intrusion by others in the real world. This research also suggests eight questions for self-evaluation of online game addiction.
Hospital-based expert model for health technology procurement planning in hospitals.
Miniati, R; Cecconi, G; Frosini, F; Dori, F; Regolini, J; Iadanza, E; Biffi Gentili, G
2014-01-01
Although technology innovation in healthcare has brought major improvements in the level of care and patient quality of life in recent years, hospital complexity and management costs have risen accordingly. Planning for medical equipment procurement within hospitals is therefore increasingly important in order to sustainably provide appropriate technology for both routine activity and innovative procedures. To support hospital decision makers in technology procurement planning, an expert model was designed as reported in this paper. It combines the most widely used approaches for technology evaluation, taking into consideration Health Technology Assessment (HTA) and the Medical Equipment Replacement Model (MERM). The design phases included an initial definition of prioritization algorithms, a weighting process based on experts' interviews, and a final model validation step comprising both statistical testing and comparison with real decisions. In conclusion, the designed model provides a semi-automated tool that uses multidisciplinary information to prioritize different requests for technology acquisition in hospitals. Validation outcomes improved the model's accuracy and led to different "user profiles" according to the specific needs of decision makers.
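The prioritization step in a model of this kind is typically a weighted sum over expert-weighted criteria. The sketch below shows only that generic pattern; the criteria, weights, and requests are invented placeholders, not the HTA/MERM algorithms themselves.

```python
# Sketch of weighted-criteria prioritization of acquisition requests
# (criteria, weights, and scores are illustrative assumptions).

CRITERIA = ["clinical_benefit", "equipment_age", "downtime", "cost_impact"]
WEIGHTS = {"clinical_benefit": 0.4, "equipment_age": 0.25,
           "downtime": 0.2, "cost_impact": 0.15}

def priority_score(request):
    """Weighted sum of normalized criterion scores in [0, 1]."""
    return sum(WEIGHTS[c] * request[c] for c in CRITERIA)

def rank_requests(requests):
    """Return request ids ordered from highest to lowest priority."""
    return sorted(requests, key=lambda r: priority_score(requests[r]),
                  reverse=True)

requests = {
    "new MRI coil": {"clinical_benefit": 0.9, "equipment_age": 0.2,
                     "downtime": 0.1, "cost_impact": 0.3},
    "replace ventilator": {"clinical_benefit": 0.8, "equipment_age": 0.9,
                           "downtime": 0.7, "cost_impact": 0.6},
}
ranking = rank_requests(requests)
```

In the paper's terms, the interview-derived weights would replace the placeholder `WEIGHTS`, and different "user profiles" would correspond to different weight sets.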
Influence of Professional Affiliation on Expert’s View on Welfare Measures
Rousing, Tine; Forkman, Björn
2017-01-01
Simple Summary Animal welfare can be assessed from different ethical points of view, which may vary from one individual to another. This is often addressed by including different stakeholders' opinions in the process of adding up welfare benefits and/or welfare risks. However, in order to obtain the most reliable results, these expert panels should be balanced, since experts' professional affiliations can influence their judgment on different welfare aspects, as shown in the present study. Abstract The present study investigates the influence of expert affiliation on the weighting procedures within animal welfare assessments. Experts are often gathered from different backgrounds with differing approaches to animal welfare, posing a potential pitfall if affiliation groups are not balanced in numbers of experts. At two time points (2012 and 2016), dairy cattle and swine experts from four different stakeholder groups, namely researchers (RES), production advisors (CONS), practicing veterinarians (VET) and animal welfare control officers (AWC), were asked to weigh eight different welfare criteria: Hunger, Thirst, Resting comfort, Ease of movement, Injuries, Disease, Human-animal bond and Emotional state. A total of 54 dairy cattle experts (RES = 15%, CONS = 22%, VET = 35%, AWC = 28%) and 34 swine experts (RES = 24%, CONS = 35%, AWC = 41%) participated. Between- and within-group differences in the prioritization of criteria were assessed. AWC cattle experts differed consistently from the other cattle expert groups, but the difference was significant only for the criterion Hunger (p = 0.04), with a tendency towards significance for the criterion Thirst (p = 0.06). No significant differences were found between expert groups among swine experts. Inter-expert differences were more pronounced for both species. 
The results highlight the challenges of using expert weightings in aggregated welfare assessment models, as the choice of expert affiliation may play a confounding role in the final aggregation due to different prioritization of criteria. PMID:29140262
Friesen, Melissa C.; Shortreed, Susan M.; Wheeler, David C.; Burstyn, Igor; Vermeulen, Roel; Pronk, Anjoeka; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Schwenn, Molly; Johnson, Alison; Armenti, Karla R.; Silverman, Debra T.; Yu, Kai
2015-01-01
Objectives: Rule-based expert exposure assessment based on questionnaire response patterns in population-based studies improves the transparency of the decisions. The number of unique response patterns, however, can be nearly equal to the number of jobs. An expert may reduce the number of patterns that need assessment using expert opinion, but each expert may identify different patterns of responses that identify an exposure scenario. Here, hierarchical clustering methods are proposed as a systematic data reduction step to reproducibly identify similar questionnaire response patterns prior to obtaining expert estimates. As a proof-of-concept, we used hierarchical clustering methods to identify groups of jobs (clusters) with similar responses to diesel exhaust-related questions and then evaluated whether the jobs within a cluster had similar (previously assessed) estimates of occupational diesel exhaust exposure. Methods: Using the New England Bladder Cancer Study as a case study, we applied hierarchical cluster models to the diesel-related variables extracted from the occupational history and job- and industry-specific questionnaires (modules). Cluster models were separately developed for two subsets: (i) 5395 jobs with ≥1 variable extracted from the occupational history indicating a potential diesel exposure scenario, but without a module with diesel-related questions; and (ii) 5929 jobs with both occupational history and module responses to diesel-relevant questions. For each subset, we varied the numbers of clusters extracted from the cluster tree developed for each model from 100 to 1000 groups of jobs. Using previously made estimates of the probability (ordinal), intensity (µg m−3 respirable elemental carbon), and frequency (hours per week) of occupational exposure to diesel exhaust, we examined the similarity of the exposure estimates for jobs within the same cluster in two ways. 
First, the clusters' homogeneity (defined as >75% of jobs sharing the same estimate) was examined using a dichotomized probability estimate (<5 versus ≥5%; <50 versus ≥50%). Second, for the ordinal probability metric and the continuous intensity and frequency metrics, we calculated the intraclass correlation coefficients (ICCs) between each job's estimate and the mean estimate for all jobs within the cluster. Results: Within-cluster homogeneity increased when more clusters were used. For example, ≥80% of the clusters were homogeneous when 500 clusters were used. Similarly, ICCs were generally above 0.7 when ≥200 clusters were used, indicating minimal within-cluster variability. The most within-cluster variability was observed for the frequency metric (ICCs from 0.4 to 0.8). We estimated that using an expert to assign exposure at the cluster level and then review each job in non-homogeneous clusters would require ~2000 decisions per expert, in contrast to evaluating 4255 unique questionnaire patterns or 14983 individual jobs. Conclusions: This proof-of-concept shows that using cluster models as a data reduction step to identify jobs with similar response patterns prior to obtaining expert ratings has the potential to aid rule-based assessment by systematically reducing the number of exposure decisions needed. While promising, additional research is needed to quantify the actual reduction in exposure decisions and the resulting homogeneity of exposure estimates within clusters for an exposure assessment effort that obtains cluster-level expert assessments as part of the assessment process. PMID:25477475
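The homogeneity check used above (a cluster counts as homogeneous when more than 75% of its jobs share the same estimate) can be sketched directly; the job-to-cluster assignments and estimates below are invented placeholders standing in for the output of the hierarchical clustering step.

```python
# Sketch of the >75% cluster-homogeneity criterion (data are invented).
from collections import Counter, defaultdict

def homogeneous(estimates, cutoff=0.75):
    """True if the most common estimate covers more than `cutoff` of jobs."""
    most_common = Counter(estimates).most_common(1)[0][1]
    return most_common / len(estimates) > cutoff

def fraction_homogeneous(cluster_of_job, estimate_of_job, cutoff=0.75):
    """Fraction of clusters whose jobs share a dominant estimate."""
    clusters = defaultdict(list)
    for job, cl in cluster_of_job.items():
        clusters[cl].append(estimate_of_job[job])
    flags = [homogeneous(ests, cutoff) for ests in clusters.values()]
    return sum(flags) / len(flags)

cluster_of_job = {"j1": 0, "j2": 0, "j3": 0, "j4": 0,
                  "j5": 1, "j6": 1, "j7": 1, "j8": 1}
estimate_of_job = {"j1": "low", "j2": "low", "j3": "low", "j4": "low",
                   "j5": "low", "j6": "high", "j7": "high", "j8": "low"}
frac = fraction_homogeneous(cluster_of_job, estimate_of_job)
```

Non-homogeneous clusters are the ones the expert would then review job by job, which is where the ~2000-decision estimate in the abstract comes from.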
Friesen, Melissa C; Shortreed, Susan M; Wheeler, David C; Burstyn, Igor; Vermeulen, Roel; Pronk, Anjoeka; Colt, Joanne S; Baris, Dalsu; Karagas, Margaret R; Schwenn, Molly; Johnson, Alison; Armenti, Karla R; Silverman, Debra T; Yu, Kai
2015-05-01
Rule-based expert exposure assessment based on questionnaire response patterns in population-based studies improves the transparency of the decisions. The number of unique response patterns, however, can be nearly equal to the number of jobs. An expert may reduce the number of patterns that need assessment using expert opinion, but each expert may identify different patterns of responses that identify an exposure scenario. Here, hierarchical clustering methods are proposed as a systematic data reduction step to reproducibly identify similar questionnaire response patterns prior to obtaining expert estimates. As a proof-of-concept, we used hierarchical clustering methods to identify groups of jobs (clusters) with similar responses to diesel exhaust-related questions and then evaluated whether the jobs within a cluster had similar (previously assessed) estimates of occupational diesel exhaust exposure. Using the New England Bladder Cancer Study as a case study, we applied hierarchical cluster models to the diesel-related variables extracted from the occupational history and job- and industry-specific questionnaires (modules). Cluster models were separately developed for two subsets: (i) 5395 jobs with ≥1 variable extracted from the occupational history indicating a potential diesel exposure scenario, but without a module with diesel-related questions; and (ii) 5929 jobs with both occupational history and module responses to diesel-relevant questions. For each subset, we varied the numbers of clusters extracted from the cluster tree developed for each model from 100 to 1000 groups of jobs. Using previously made estimates of the probability (ordinal), intensity (µg m(-3) respirable elemental carbon), and frequency (hours per week) of occupational exposure to diesel exhaust, we examined the similarity of the exposure estimates for jobs within the same cluster in two ways. 
First, the clusters' homogeneity (defined as >75% of jobs sharing the same estimate) was examined using a dichotomized probability estimate (<5 versus ≥5%; <50 versus ≥50%). Second, for the ordinal probability metric and the continuous intensity and frequency metrics, we calculated the intraclass correlation coefficients (ICCs) between each job's estimate and the mean estimate for all jobs within the cluster. Within-cluster homogeneity increased when more clusters were used. For example, ≥80% of the clusters were homogeneous when 500 clusters were used. Similarly, ICCs were generally above 0.7 when ≥200 clusters were used, indicating minimal within-cluster variability. The most within-cluster variability was observed for the frequency metric (ICCs from 0.4 to 0.8). We estimated that using an expert to assign exposure at the cluster level and then review each job in non-homogeneous clusters would require ~2000 decisions per expert, in contrast to evaluating 4255 unique questionnaire patterns or 14983 individual jobs. This proof-of-concept shows that using cluster models as a data reduction step to identify jobs with similar response patterns prior to obtaining expert ratings has the potential to aid rule-based assessment by systematically reducing the number of exposure decisions needed. While promising, additional research is needed to quantify the actual reduction in exposure decisions and the resulting homogeneity of exposure estimates within clusters for an exposure assessment effort that obtains cluster-level expert assessments as part of the assessment process. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2014.
Mignan, A; Broccardo, M; Wiemer, S; Giardini, D
2017-10-19
The rise in the frequency of anthropogenic earthquakes due to deep fluid injections is posing serious economic, societal, and legal challenges to many geo-energy and waste-disposal projects. Existing tools to assess such problems are still inherently heuristic and mostly based on expert elicitation (so-called clinical judgment). We propose, as a complementary approach, an adaptive traffic light system (ATLS) that is a function of a statistical model of induced seismicity. It offers an actuarial judgment of the risk, based on a mapping between earthquake magnitude and risk. Using data from six underground reservoir stimulation experiments, mostly from Enhanced Geothermal Systems, we illustrate how such a data-driven adaptive forecasting system could guarantee a risk-based safety target. The proposed model, which includes a linear relationship between seismicity rate and flow rate, as well as a normal diffusion process for post-injection seismicity, is first confirmed to be representative of the data. Being integrable, the model yields a closed-form ATLS solution that is both transparent and robust. Although simulations verify that the safety target is consistently ensured when the ATLS is applied, the model from which the simulations are generated is validated on a limited dataset, hence still requiring further tests in additional fluid injection environments.
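The core idea above, a seismicity rate proportional to flow rate combined with Gutenberg-Richter scaling in magnitude, can be sketched as a simple forecast-then-decide rule. All parameter values below are illustrative assumptions, not the paper's calibrated values, and the two-state "light" is a drastic simplification of an ATLS.

```python
# Sketch of an actuarial traffic-light rule: forecast the expected
# number of events above a safety magnitude, assuming
# rate(t) = 10**(a_fb - b*m) * Q(t), and stop when a target is exceeded.
# (Parameter values are illustrative assumptions.)

def expected_events(a_fb, b, m_min, flow_rates, dt):
    """Expected events with magnitude >= m_min over the injection."""
    rate_per_volume = 10 ** (a_fb - b * m_min)   # events per m^3 injected
    injected_volume = sum(q * dt for q in flow_rates)
    return rate_per_volume * injected_volume

def traffic_light(a_fb, b, m_safe, flow_rates, dt, target=0.1):
    """'red' once the forecast count above m_safe exceeds the target."""
    n = expected_events(a_fb, b, m_safe, flow_rates, dt)
    return "red" if n > target else "green"

# Hypothetical stimulation: constant 30 L/s for 6 hours.
q = [0.03] * 6          # m^3/s, averaged per hourly step
dt = 3600.0             # seconds per step
light = traffic_light(a_fb=-1.5, b=1.2, m_safe=2.0, flow_rates=q, dt=dt)
```

Because the forecast is a closed-form function of injected volume, the safe injection volume for a given target can be solved for directly, which is the transparency the abstract emphasizes.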
NASA Astrophysics Data System (ADS)
Warsta, L.; Karvonen, T.
2017-12-01
There are currently 25 shooting and training areas in Finland managed by The Finnish Defence Forces (FDF), where military activities can cause contamination of open waters and groundwater reservoirs. In the YMPYRÄ project, a computer software framework is being developed that combines existing open environmental data and proprietary information collected by FDF with computational models to investigate current and prevent future environmental problems. A data centric philosophy is followed in the development of the system, i.e. the models are updated and extended to handle available data from different areas. The results generated by the models are summarized as easily understandable flow and risk maps that can be opened in GIS programs and used in environmental assessments by experts. Substances investigated with the system include explosives and metals such as lead, and both surface and groundwater dominated areas can be simulated. The YMPYRÄ framework is composed of a three dimensional soil and groundwater flow model, several solute transport models and an uncertainty assessment system. Solute transport models in the framework include particle based, stream tube and finite volume based approaches. The models can be used to simulate solute dissolution from source area, transport in the unsaturated layers to groundwater and finally migration in groundwater to water extraction wells and springs. The models can be used to simulate advection, dispersion, equilibrium adsorption on soil particles, solubility and dissolution from solute phase and dendritic solute decay chains. Correct numerical solutions were confirmed by comparing results to analytical 1D and 2D solutions and by comparing the numerical solutions to each other. 
The particle-based and stream-tube solute transport models were useful complements to the traditional finite-volume approach, which in certain circumstances produced numerical dispersion due to the piecewise solution of the governing equations on computational grids, and whose iterative solutions were computationally intensive and in some cases unstable. The YMPYRÄ framework is being developed by the consulting companies WaterHope, Gain Oy, and SITO Oy, and is funded by FDF.
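One standard verification step mentioned above is comparison against analytical 1D solutions. A common benchmark for continuous injection into a semi-infinite 1D column (assumed here; the abstract does not name which solution was used) is the Ogata-Banks solution: C(x,t)/C0 = 0.5*[erfc((x-vt)/(2*sqrt(D*t))) + exp(v*x/D)*erfc((x+vt)/(2*sqrt(D*t)))].

```python
# Ogata-Banks analytical solution for 1D advection-dispersion with a
# continuous inlet concentration c0 (a typical verification benchmark;
# parameter values below are illustrative).
import math

def ogata_banks(x, t, v, D, c0=1.0):
    """Concentration at distance x (m), time t (s), with pore velocity
    v (m/s) and dispersion coefficient D (m^2/s)."""
    s = 2.0 * math.sqrt(D * t)
    term1 = math.erfc((x - v * t) / s)
    term2 = math.exp(v * x / D) * math.erfc((x + v * t) / s)
    return 0.5 * c0 * (term1 + term2)

# Near the inlet, long after breakthrough, concentration approaches c0;
# far downstream ahead of the plume it approaches 0.
c_near = ogata_banks(x=0.1, t=1e6, v=1e-4, D=1e-3)
c_far = ogata_banks(x=50.0, t=1e4, v=1e-4, D=1e-3)
```

A numerical transport code that matches this profile at several (x, t) points without spurious smearing of the front is free of the numerical dispersion the abstract describes.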
Mac Nally, Ralph; Thomson, James R.; Kimmerer, Wim J.; Feyrer, Frederick; Newman, Ken B.; Sih, Andy; Bennett, William A.; Brown, Larry; Fleishman, Erica; Culberson, Steven D.; Castillo, Gonzalo
2010-01-01
Four species of pelagic fish of particular management concern in the upper San Francisco Estuary, California, USA, have declined precipitously since ca. 2002: delta smelt (Hypomesus transpacificus), longfin smelt (Spirinchus thaleichthys), striped bass (Morone saxatilis), and threadfin shad (Dorosoma petenense). The estuary has been monitored since the late 1960s with extensive collection of data on the fishes, their pelagic prey, phytoplankton biomass, invasive species, and physical factors. We used multivariate autoregressive (MAR) modeling to discern the main factors responsible for the declines. An expert-elicited model was built to describe the system. Fifty-four relationships were built into the model, only one of which was of uncertain direction a priori. Twenty-eight of the proposed relationships were strongly supported by or consistent with the data, while 26 were close to zero (not supported by the data but not contrary to expectations). The position of the 2‰ isohaline (a measure of the physical response of the estuary to freshwater flow) and increased water clarity over the period of analyses were two factors affecting multiple declining taxa (including fishes and the fishes' main zooplankton prey). Our results were relatively robust with respect to the form of stock–recruitment model used and to inclusion of subsidiary covariates but may be enhanced by using detailed state–space models that describe more fully the life-history dynamics of the declining species.
Mac Nally, Ralph; Thomson, James R; Kimmerer, Wim J; Feyrer, Frederick; Newman, Ken B; Sih, Andy; Bennett, William A; Brown, Larry; Fleishman, Erica; Culberson, Steven D; Castillo, Gonzalo
2010-07-01
Four species of pelagic fish of particular management concern in the upper San Francisco Estuary, California, USA, have declined precipitously since ca. 2002: delta smelt (Hypomesus transpacificus), longfin smelt (Spirinchus thaleichthys), striped bass (Morone saxatilis), and threadfin shad (Dorosoma petenense). The estuary has been monitored since the late 1960s with extensive collection of data on the fishes, their pelagic prey, phytoplankton biomass, invasive species, and physical factors. We used multivariate autoregressive (MAR) modeling to discern the main factors responsible for the declines. An expert-elicited model was built to describe the system. Fifty-four relationships were built into the model, only one of which was of uncertain direction a priori. Twenty-eight of the proposed relationships were strongly supported by or consistent with the data, while 26 were close to zero (not supported by the data but not contrary to expectations). The position of the 2‰ isohaline (a measure of the physical response of the estuary to freshwater flow) and increased water clarity over the period of analyses were two factors affecting multiple declining taxa (including fishes and the fishes' main zooplankton prey). Our results were relatively robust with respect to the form of stock-recruitment model used and to inclusion of subsidiary covariates but may be enhanced by using detailed state-space models that describe more fully the life-history dynamics of the declining species.
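A first-order MAR model of the kind used above has the form x_{t+1} = B x_t + c + noise, where the off-diagonal entries of B encode the expert-elicited interactions. The sketch below fits such a model by ordinary least squares on synthetic data; the actual study used a far richer, expert-structured formulation, and all numbers here are invented.

```python
# Sketch of fitting a MAR(1) model, x_{t+1} = B x_t + c + noise,
# by least squares on synthetic data (values are illustrative).
import numpy as np

rng = np.random.default_rng(0)
B_true = np.array([[0.8, -0.2],
                   [0.1,  0.6]])        # interaction matrix (e.g. fish, prey)
c_true = np.array([0.5, 1.0])

# Simulate a stationary two-variable series.
T = 500
x = np.zeros((T, 2))
for t in range(T - 1):
    x[t + 1] = B_true @ x[t] + c_true + rng.normal(0, 0.05, 2)

# Least-squares fit: regress x_{t+1} on [x_t, 1].
X = np.hstack([x[:-1], np.ones((T - 1, 1))])
Y = x[1:]
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
B_hat, c_hat = coef[:2].T, coef[2]
```

In the study's setting, near-zero fitted entries of B correspond to the 26 proposed relationships that the data did not support.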
Comparing Habitat Suitability and Connectivity Modeling Methods for Conserving Pronghorn Migrations
Poor, Erin E.; Loucks, Colby; Jakes, Andrew; Urban, Dean L.
2012-01-01
Terrestrial long-distance migrations are declining globally: in North America, nearly 75% have been lost. Yet there has been limited research comparing habitat suitability and connectivity models to identify migration corridors across increasingly fragmented landscapes. Here we use pronghorn (Antilocapra americana) migrations in prairie habitat to compare two types of models that identify habitat suitability: maximum entropy (Maxent) and expert-based (Analytic Hierarchy Process). We used distance to wells, distance to water, NDVI, land cover, distance to roads, terrain shape and fence presence to parameterize the models. We then used the output of these models as cost surfaces to compare two common connectivity models, least-cost modeling (LCM) and circuit theory. Using pronghorn movement data from spring and fall migrations, we identified potential migration corridors by combining each habitat suitability model with each connectivity model. The best performing model combination was Maxent with LCM corridors across both seasons. Maxent out-performed expert-based habitat suitability models for both spring and fall migrations. However, expert-based corridors can perform relatively well and are a cost-effective alternative if species location data are unavailable. Corridors created using LCM out-performed circuit theory, as measured by the number of pronghorn GPS locations present within the corridors. We suggest the use of a tiered approach using different corridor widths for prioritizing conservation and mitigation actions, such as fence removal or conservation easements. PMID:23166656
Comparing habitat suitability and connectivity modeling methods for conserving pronghorn migrations.
Poor, Erin E; Loucks, Colby; Jakes, Andrew; Urban, Dean L
2012-01-01
Terrestrial long-distance migrations are declining globally: in North America, nearly 75% have been lost. Yet there has been limited research comparing habitat suitability and connectivity models to identify migration corridors across increasingly fragmented landscapes. Here we use pronghorn (Antilocapra americana) migrations in prairie habitat to compare two types of models that identify habitat suitability: maximum entropy (Maxent) and expert-based (Analytic Hierarchy Process). We used distance to wells, distance to water, NDVI, land cover, distance to roads, terrain shape and fence presence to parameterize the models. We then used the output of these models as cost surfaces to compare two common connectivity models, least-cost modeling (LCM) and circuit theory. Using pronghorn movement data from spring and fall migrations, we identified potential migration corridors by combining each habitat suitability model with each connectivity model. The best performing model combination was Maxent with LCM corridors across both seasons. Maxent out-performed expert-based habitat suitability models for both spring and fall migrations. However, expert-based corridors can perform relatively well and are a cost-effective alternative if species location data are unavailable. Corridors created using LCM out-performed circuit theory, as measured by the number of pronghorn GPS locations present within the corridors. We suggest the use of a tiered approach using different corridor widths for prioritizing conservation and mitigation actions, such as fence removal or conservation easements.
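The least-cost modeling (LCM) step compared above can be sketched as Dijkstra's algorithm over a resistance grid, where the accumulated cost of the cheapest path defines the corridor backbone. The grid values below are invented; in the study the cost surface came from the habitat suitability models.

```python
# Sketch of least-cost path accumulation over a resistance grid
# (grid values are illustrative placeholders).
import heapq

def least_cost(grid, start, goal):
    """Minimum accumulated cost of a 4-connected path, counting the
    cost of every cell entered (start cell included)."""
    rows, cols = len(grid), len(grid[0])
    best = {start: grid[start[0]][start[1]]}
    pq = [(best[start], start)]
    while pq:
        cost, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return cost
        if cost > best[(r, c)]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nxt = cost + grid[nr][nc]
                if nxt < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nxt
                    heapq.heappush(pq, (nxt, (nr, nc)))
    return float("inf")

# 1 = suitable habitat, 9 = high resistance (e.g. roads, fences).
grid = [[1, 9, 1],
        [1, 9, 1],
        [1, 1, 1]]
cost = least_cost(grid, (0, 0), (0, 2))
```

Widening the set of cells whose accumulated cost falls within a chosen margin of this minimum yields corridors of different widths, the basis of the tiered prioritization approach the authors suggest.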
Extension of Companion Modeling Using Classification Learning
NASA Astrophysics Data System (ADS)
Torii, Daisuke; Bousquet, François; Ishida, Toru
Companion Modeling is a methodology for refining initial models for understanding reality through a role-playing game (RPG) and a multiagent simulation. In this research, we propose a novel agent model construction methodology in which classification learning is applied to the RPG log data in Companion Modeling. This methodology enables a systematic model construction that handles multiple parameters, independently of the modeler's ability. There are three problems in applying classification learning to the RPG log data: 1) It is difficult to gather enough data for the number of features because the cost of gathering data is high. 2) Noise can affect the learning results because the amount of data may be insufficient. 3) The learning results should be explainable as a human decision making model and should be recognized by the expert as reflecting reality. We realized an agent model construction system using the following two approaches: 1) Using a feature selection method, the feature subset that has the best prediction accuracy is identified. In this process, the important features chosen by the expert are always included. 2) The expert eliminates irrelevant features from the learning results after evaluating the learning model through a visualization of the results. Finally, using the RPG log data from the Companion Modeling of agricultural economics in northeastern Thailand, we confirm the capability of this methodology.
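The first approach above, searching feature subsets for the best prediction accuracy while always keeping the expert-mandated features, can be sketched as a wrapper search. The toy data and the leave-one-out 1-nearest-neighbour scorer below are illustrative assumptions, not the paper's learner.

```python
# Sketch of wrapper feature selection with forced expert features
# (scorer and data are illustrative assumptions).
from itertools import combinations

def loo_1nn_accuracy(rows, labels, feats):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier
    restricted to the given feature indices."""
    hits = 0
    for i, row in enumerate(rows):
        best, pred = float("inf"), None
        for j, other in enumerate(rows):
            if j == i:
                continue
            d = sum((row[f] - other[f]) ** 2 for f in feats)
            if d < best:
                best, pred = d, labels[j]
        hits += pred == labels[i]
    return hits / len(rows)

def select_features(rows, labels, n_features, forced):
    """Best subset (by LOO accuracy) that contains all forced features."""
    optional = [f for f in range(n_features) if f not in forced]
    best_score, best_set = -1.0, None
    for k in range(len(optional) + 1):
        for extra in combinations(optional, k):
            feats = tuple(sorted(forced + list(extra)))
            score = loo_1nn_accuracy(rows, labels, feats)
            if score > best_score:
                best_score, best_set = score, feats
    return best_set, best_score

# Feature 0 is informative, feature 1 is expert-mandated, feature 2 is noise.
rows = [(0.0, 0.5, 9.1), (0.1, 0.4, 1.2), (0.9, 0.6, 5.5),
        (1.0, 0.5, 0.3), (0.05, 0.55, 7.7), (0.95, 0.45, 3.3)]
labels = ["A", "A", "B", "B", "A", "B"]
subset, acc = select_features(rows, labels, 3, forced=[1])
```

The exhaustive search is only viable for small feature sets; the expert's second role in the paper, pruning irrelevant features after visual inspection, further shrinks that search space.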
Mathematical modeling of the thermal and hydrodynamic structure of the cooling reservoir
NASA Astrophysics Data System (ADS)
Saminskiy, G.; Debolskaya, E.
2012-04-01
The hydrothermal conditions of a cooling reservoir are determined by heat and mass transfer from the water surface to the atmosphere and by heat transfer processes within the water mass of the reservoir itself. As power plant capacities grow, with a corresponding increase in the volume of heated water and the use of deep lakes and reservoirs as coolers, there is a need to develop new, more accurate numerical simulation methods and to improve the application of existing ones. Calculations of the hydrothermal regime must take into account the effects of wind, density (buoyancy) forces, and other characteristics of the cooling reservoir. For practical problems it is important to know not only the average temperature but also its distribution over area and depth. A successful solution can be achieved through mathematical modeling of the general systems of transport equations and a correct formulation of the problem based on appropriate initial data. The purpose of this work is to apply the software package GETM to simulate the hydrothermal regime of a cooling reservoir, estimating the three-dimensional structure of transfer processes, the effects of wind, and friction at the water surface. Three-dimensional models are rarely applied, especially for far-field problems; when such models are required, experts in the field must develop and apply them. The primary physical processes included are surface heat transfer; short-wave and long-wave radiation and penetration; convective mixing; wind- and flow-induced mixing; entrainment of ambient water by pumped-storage inflows; and inflow density stratification as affected by temperature and by dissolved and suspended solids. The model forcing data consist of the system bathymetry developed into the model grid, the boundary-condition flows and temperatures, the tributary flows and temperatures, and the system meteorology. The Ivankovskoe reservoir is a valley-type reservoir (Tver region, Russia). 
It is used as a cooling reservoir for the Konakovskaya power plant, which discharges its heated water into the Moshkovichevsky Bay. The thermal and hydrodynamic structure of the Moshkovichevsky Bay is of particular interest as the object of direct influence of the heated water discharge. To study the effect of thermal discharge into the Ivankovskoe reservoir, a model of the Moshkovichevsky Bay, the area subject to the largest thermal pollution, was built. The calculation grid step is 25 meters. For further verification of the model, field investigations were conducted in August-September 2011. The modeling results satisfactorily describe the thermal and hydrodynamic structure of the Moshkovichevsky Bay.
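The dominant surface heat transfer process above can be sketched with the classic equilibrium-temperature linearization, in which the net surface flux H = K*(Te - Ts) relaxes the water temperature Ts toward an equilibrium temperature Te. The values below are illustrative assumptions and a drastic simplification of GETM's full heat-flux formulation.

```python
# Sketch of equilibrium-temperature surface heat exchange for a mixed
# layer, H = K*(Te - Ts) (parameter values are illustrative).

RHO_W = 1000.0      # water density, kg/m^3
CP_W = 4186.0       # specific heat of water, J/(kg K)

def step_temperature(ts, te, k, depth, dt):
    """Advance mixed-layer temperature by one explicit Euler step.
    k: exchange coefficient, W/(m^2 K); depth: mixed-layer depth, m."""
    flux = k * (te - ts)                      # W/m^2 into the water
    return ts + flux * dt / (RHO_W * CP_W * depth)

# A heated discharge (Ts = 30 C) cooling toward Te = 18 C over 10 days.
ts, te, k, depth, dt = 30.0, 18.0, 40.0, 3.0, 3600.0
for _ in range(240):                          # 240 hourly steps
    ts = step_temperature(ts, te, k, depth, dt)
```

The implied relaxation time, rho*cp*depth/k, is a few days for these values, which is why a heated bay like the Moshkovichevsky stays measurably warmer than the equilibrium temperature while the discharge persists.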
Educator of the Court: The Role of the Expert Witness in Cases Involving Autism Spectrum Disorder
Berryessa, Colleen M.
2017-01-01
The role of the expert witness in legal contexts is to educate fact finders of the court who may have no background in the expert’s area. This role can be especially difficult for those who assist in cases involving individuals with Autism Spectrum Disorder (ASD). As expert assistance on ASD is crucial to ensuring just outcomes for individuals diagnosed with ASD, knowledge on how expert witnesses perceive and approach their roles, and what factors may influence these perceptions, is essential. This qualitative research utilizes semi-structured interviews with a sample of expert witnesses in cases involving ASD, analyzed using a grounded-theory constant comparative analytic approach. Data reveal that experts appear to view their roles in court as reconstructionists, educators, myth-dispellers, and most of all, communicators, actively using their testimony to fill these roles in cases. These results also allow for the development of a model that illustrates two areas that coalesce to affect how experts view their roles in court: (1) personal experiences of experts in cases in which they have been involved; and (2) influences outside experts’ personal experiences, such as their general opinions or observations regarding ASD and its relationship to the criminal justice system. PMID:28943746
Zamparo, Paola; Zorzi, Elena; Marcantoni, Sara; Cesari, Paola
2015-01-01
The aim of this study was to compare experts to naïve practitioners in rating the beauty and the technical quality of a Tai Chi sequence observed in video-clips (of high and middle level performances). Our hypotheses are: i) movement evaluation will correlate with the level of skill expressed in the kinematics of the observed action, but ii) only experts will be able to unravel the technical component from the aesthetic component of the observed action. The judgments delivered indicate that both expert and non-expert observers are able to discern a good from a mediocre performance; however, as expected, only experts discriminate the technical from the aesthetic component of the action evaluated, and do this independently of the level of skill shown by the model (high or middle level performances). Furthermore, the judgments delivered were strongly related to the kinematic variables measured in the observed model, indicating that observers rely on specific movement kinematics (e.g. movement amplitude, jerk and duration) for action evaluation. These results provide evidence of the complementary functional role of visual and motor action representation in movement evaluation and underline the role of expertise in judging the aesthetic quality of movements.
2015-01-01
The aim of this study was to compare experts to naïve practitioners in rating the beauty and the technical quality of a Tai Chi sequence observed in video-clips (of high and middle level performances). Our hypotheses are: i) movement evaluation will correlate with the level of skill expressed in the kinematics of the observed action, but ii) only experts will be able to unravel the technical component from the aesthetic component of the observed action. The judgments delivered indicate that both expert and non-expert observers are able to discern a good from a mediocre performance; however, as expected, only experts discriminate the technical from the aesthetic component of the action evaluated, and do this independently of the level of skill shown by the model (high or middle level performances). Furthermore, the judgments delivered were strongly related to the kinematic variables measured in the observed model, indicating that observers rely on specific movement kinematics (e.g. movement amplitude, jerk and duration) for action evaluation. These results provide evidence of the complementary functional role of visual and motor action representation in movement evaluation and underline the role of expertise in judging the aesthetic quality of movements. PMID:26047473
Expert systems for automated maintenance of a Mars oxygen production system
NASA Technical Reports Server (NTRS)
Ash, Robert L.; Huang, Jen-Kuang; Ho, Ming-Tsang
1989-01-01
A prototype expert system was developed for maintaining autonomous operation of a Mars oxygen production system. Normal operating conditions and failure modes are tested and identified according to the desired criteria. Several failure detection and isolation schemes, using forward chaining, backward chaining, and knowledge- and rule-based reasoning, are devised to perform several housekeeping functions. These functions include self-health checkout, an emergency shutdown program, fault detection, and conventional control activities. An effort was made to derive the dynamic model of the system using the bond-graph technique in order to develop a model-based failure detection and isolation scheme based on state estimation. Finally, computer simulations and experimental results demonstrate the feasibility of the expert system, and a preliminary reliability analysis of the oxygen production system is also provided.
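The forward-chaining portion of such a failure detection scheme can be sketched as a simple rule loop. The sensor names, thresholds, and derived faults below are illustrative assumptions, not taken from the actual Mars oxygen production system.

```python
# Minimal forward-chaining sketch of rule-based failure detection.
# Sensor names, thresholds, and actions are illustrative assumptions.

RULES = [
    # (condition over current facts, fact to assert when it fires)
    (lambda f: f.get("cell_temp_K", 0) > 1100, "overtemperature"),
    (lambda f: f.get("o2_flow_sccm", 1) < 5, "low_o2_output"),
    (lambda f: "overtemperature" in f and "low_o2_output" in f, "cell_failure"),
    (lambda f: "cell_failure" in f, "emergency_shutdown"),
]

def forward_chain(facts):
    """Repeatedly fire rules until no new facts are derived."""
    facts = dict(facts)
    changed = True
    while changed:
        changed = False
        for cond, conclusion in RULES:
            if conclusion not in facts and cond(facts):
                facts[conclusion] = True
                changed = True
    return facts

state = forward_chain({"cell_temp_K": 1150, "o2_flow_sccm": 2})
```

A backward-chaining variant would instead start from a goal fact (e.g. `"emergency_shutdown"`) and recursively check which rule conclusions support it.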
NASA Astrophysics Data System (ADS)
Kuzma, H. A.; Boyle, K.; Pullman, S.; Reagan, M. T.; Moridis, G. J.; Blasingame, T. A.; Rector, J. W.; Nikolaou, M.
2010-12-01
A Self Teaching Expert System (SeTES) is being developed for the analysis, design and prediction of gas production from shales. An expert system is a computer program designed to answer questions or clarify uncertainties that its designers did not necessarily envision and that would otherwise have to be addressed by consultation with one or more human experts. Modern developments in computer learning, data mining, database management, web integration and cheap computing power are bringing the promise of expert systems to fruition. SeTES is a partial successor to Prospector, a system to aid in the identification and evaluation of mineral deposits developed by Stanford University and the USGS in the late 1970s, and one of the most famous early expert systems. Instead of the text dialogue used in early systems, the web user interface of SeTES helps a non-expert user to articulate, clarify and reason about a problem by navigating through a series of interactive wizards. The wizards identify potential solutions to queries by retrieving and combining relevant records from a database. Inferences, decisions and predictions are made from incomplete and noisy inputs using a series of probabilistic models (Bayesian networks) which incorporate records from the database, physical laws and empirical knowledge in the form of prior probability distributions. The database is mainly populated with empirical measurements; however, an automatic algorithm supplements sparse data with synthetic data obtained through physical modeling. This is the mechanism by which SeTES teaches itself. SeTES' predictive power is expected to grow as users contribute more data into the system. Samples are appropriately weighted to favor high-quality empirical data over low-quality or synthetic data. Finally, a set of data visualization tools digests the output measurements into graphical form.
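The core inference step, combining prior knowledge with incomplete, noisy observations, can be sketched as a toy discrete Bayes update. The variables and probabilities below are illustrative assumptions; a real Bayesian network like the one described would factor over many more variables and their dependencies.

```python
# Toy probabilistic inference in the spirit of SeTES: combine a prior
# with possibly missing, noisy observations via Bayes' rule.
# Hypothesis names, evidence labels, and probabilities are made up.

PRIOR = {"productive": 0.3, "unproductive": 0.7}

# P(observed evidence label | hypothesis)
LIKELIHOOD = {
    "high_TOC": {"productive": 0.8, "unproductive": 0.2},
    "low_TOC":  {"productive": 0.2, "unproductive": 0.8},
    "brittle":  {"productive": 0.7, "unproductive": 0.4},
    "ductile":  {"productive": 0.3, "unproductive": 0.6},
}

def posterior(observations):
    """Bayes update; observations may be incomplete (missing -> prior holds)."""
    scores = dict(PRIOR)
    for obs in observations:
        for h in scores:
            scores[h] *= LIKELIHOOD[obs][h]
    z = sum(scores.values())
    return {h: p / z for h, p in scores.items()}
```

With no observations the posterior simply equals the prior, which is how missing inputs fall back on prior knowledge.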
Home oxygen therapy: re-thinking the role of devices.
Melani, Andrea S; Sestini, Piersante; Rottoli, Paola
2018-03-01
A range of devices is available for delivering and monitoring home oxygen therapy (HOT). Guidelines do not give indications for the choice of the delivery device but recommend the use of an ambulatory system in subjects on HOT whilst walking. Areas covered: We provide a clinical overview of HOT and review traditional and newer delivery and monitoring devices for HOT. Despite relevant technological advancements, clinicians, faced with many challenges when they prescribe oxygen therapy, often stay with the traditional devices and continuous-flow delivery of oxygen that they are familiar with. Some self-filling, delivery-less devices could increase users' level of independence, with an ecological advantage and, perhaps, reduced cost. Some newer portable oxygen concentrators are becoming available, but more work is needed to understand their performance in different diseases and clinical settings. Pulse oximetry has gained wide diffusion worldwide, and some models permit long-term monitoring. Some closed-loop portable monitoring devices are also able to adjust oxygen flow automatically in accordance with the different needs of everyday life. This might help to improve adherence and the practice of proper oxygen titration, which has often been omitted because it is difficult to perform and time-consuming. Expert commentary: Prescribing physicians should know the characteristics of newer devices and use technological advancements to improve the practice of HOT.
Chanona, J; Ribes, J; Seco, A; Ferrer, J
2006-01-01
This paper presents a model- and knowledge-based algorithm for optimising primary sludge fermentation process design and operation. Fermentation is a recently adopted method for obtaining the volatile fatty acids (VFA) needed to improve biological nutrient removal processes directly from raw wastewater. The proposed algorithm consists of a heuristic reasoning procedure based on expert knowledge of the process. Only the effluent VFA and the sludge blanket height (SBH) have to be set as design criteria, and the optimisation algorithm obtains the minimum return sludge and waste sludge flow rates that fulfil those design criteria. A pilot plant fed with municipal raw wastewater was operated in order to obtain experimental results supporting the groundwork of the developed algorithm. The experimental results indicate that when the SBH was increased, a higher solids retention time was obtained in the settler and VFA production increased. Higher recirculation flow rates resulted in higher VFA production too. Finally, the developed algorithm has been tested by simulating different design conditions, with very good results. It was able to find the optimal operating conditions in all cases in which the preset design conditions could be achieved. Furthermore, this is a general algorithm that can be applied to any fermentation-elutriation scheme, with or without a fermentation reactor.
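The search for the minimum flow rates that still satisfy the VFA and SBH design criteria can be sketched as a greedy heuristic loop. The surrogate process model below is a made-up stand-in for the actual fermenter model, and all coefficients, units, and starting values are illustrative.

```python
# Illustrative heuristic design-optimisation loop: decrease the two flow
# rates stepwise while a toy surrogate process model still meets the VFA
# and sludge-blanket-height (SBH) criteria. Coefficients are made up.

def simulate(recycle_q, waste_q):
    """Toy surrogate: VFA rises with recycle flow; SBH falls with waste flow."""
    vfa = 50 + 0.4 * recycle_q - 0.1 * waste_q
    sbh = 3.0 - 0.02 * waste_q + 0.005 * recycle_q
    return vfa, sbh

def optimise(vfa_target, sbh_max, step=5.0):
    """Greedy reduction of both flow rates subject to the design criteria."""
    recycle_q, waste_q = 200.0, 100.0  # deliberately generous starting point
    while True:
        improved = False
        for dq_r, dq_w in ((-step, 0.0), (0.0, -step)):
            r, w = recycle_q + dq_r, waste_q + dq_w
            if r < 0 or w < 0:
                continue
            vfa, sbh = simulate(r, w)
            if vfa >= vfa_target and sbh <= sbh_max:
                recycle_q, waste_q = r, w   # accept the smaller flows
                improved = True
                break
        if not improved:
            return recycle_q, waste_q      # no further reduction is feasible
```

The paper's heuristic reasoning encodes expert process knowledge rather than a blind greedy descent, but the accept-while-criteria-hold structure is the same idea.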
Comparison of small diameter stone baskets in an in vitro caliceal and ureteral model.
Korman, Emily; Hendlin, Kari; Chotikawanich, Ekkarin; Monga, Manoj
2011-01-01
Three small-diameter (<1.5F) stone baskets have recently been introduced. Our objective was to evaluate the stone capture rate of these baskets in an in vitro ureteral model and an in vitro caliceal model using novice, resident, and expert operators. Sacred Heart Medical Halo™ (1.5F), Cook N-Circle® Nitinol Tipless Stone Extractor (1.5F), and Boston Scientific OptiFlex® (1.3F) stone baskets were tested in an in vitro ureteral and a caliceal model by three novices, three residents, and three experts. The caliceal model consisted of a 7-cm length of 10-mm O.D. plastic tubing with a convex base. Each operator was timed during removal of a 3-mm calculus from each model, with three repetitions for each basket. Data were analyzed by single-factor analysis of variance tests and t tests assuming unequal variances. In the ureteral model, the Halo had the fastest average rate of stone extraction for experts and novices (0:02 ± 0:01 and 0:08 ± 0:04 min, respectively), as well as the overall fastest average stone extraction rate (0:08 ± 0:06 min). No statistically significant differences in extraction times between baskets were identified in the resident group. In the novice group, the Halo stone extraction rate was significantly faster than the OptiFlex (P=0.029). In the expert group, the OptiFlex had statistically significantly slower average extraction rates compared with the Halo (P=0.005) and the N-Circle (P=0.017). In the caliceal model, no statistically significant differences were noted. While no significant differences were noted in extraction times for the caliceal model, the extraction times for the ureteral model were slowest with the OptiFlex basket. Other variables important in selecting the appropriate basket include operator preference, clinical setting, and cost.
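The t tests assuming unequal variances are Welch's t tests. A stdlib-only sketch of the statistic, applied to made-up extraction times (not the study's data):

```python
import math

def welch_t(a, b):
    """Welch's t statistic and effective degrees of freedom for two
    samples with unequal variances (used to compare extraction times)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Made-up extraction times in minutes, purely for illustration:
halo = [0.02, 0.03, 0.01]
optiflex = [0.20, 0.25, 0.18]
t, df = welch_t(halo, optiflex)
```

The P value would then come from the t distribution with `df` degrees of freedom (e.g. via `scipy.stats`).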
Drach-Zahavy, Anat; Broyer, Chaya; Dagan, Efrat
2017-09-01
Shared mental models are crucial for constructing mutual understanding of the patient's condition during a clinical handover. Yet scant research, if any, has empirically explored the mental models of the parties involved in a clinical handover. This study aimed to examine the similarities among the mental models of incoming and outgoing nurses, and to test their accuracy by comparing them with the mental models of expert nurses. A cross-sectional study, exploring nurses' mental models via the concept mapping technique, covered 40 clinical handovers. Data were collected via concept mapping of the incoming, outgoing, and expert nurses' mental models (a total of 120 concept maps). Similarity and accuracy indexes for concepts and associations were calculated to compare the different maps. About one fifth of the concepts emerged in both outgoing and incoming nurses' concept maps (concept similarity=23%±10.6). Concept accuracy indexes were 35%±18.8 for incoming and 62%±19.6 for outgoing nurses' maps. Although incoming nurses absorbed fewer concepts and associations (23% and 12%, respectively), they partially closed the gap (35% and 22%, respectively) relative to the expert nurses' maps. The correlations between concept similarity and both incoming and outgoing nurses' concept accuracy were significant (r=0.43, p<0.01; r=0.68, p<0.01, respectively). Finally, in 90% of the maps, outgoing nurses added information concerning the processes enacted during the shift, beyond the expert nurses' gold standard. Two seemingly contradictory processes in the handover were identified: "information loss", captured by the low similarity indexes between the mental models of incoming and outgoing nurses; and "information restoration", based on the accuracy indexes of the incoming nurses' mental models. Based on mental model theory, we propose possible explanations for these processes and derive implications for how to improve a clinical handover. Copyright © 2017 Elsevier Ltd. All rights reserved.
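The similarity and accuracy indexes can be sketched as set-overlap measures. The abstract does not give the exact formulas, so a Jaccard-style overlap is assumed here, and the concept names are illustrative.

```python
# Sketch of concept-similarity and concept-accuracy indexes between
# concept maps, assuming Jaccard-style overlap (the paper's exact
# formulas are not given in the abstract). Concept names are made up.

def similarity(map_a, map_b):
    """Proportion of concepts shared between two concept maps."""
    return len(map_a & map_b) / len(map_a | map_b)

def accuracy(map_nurse, map_expert):
    """Proportion of the expert's concepts captured by the nurse."""
    return len(map_nurse & map_expert) / len(map_expert)

outgoing = {"diagnosis", "meds", "vitals", "family", "mobility"}
incoming = {"diagnosis", "meds", "allergies"}
expert = {"diagnosis", "meds", "vitals", "allergies"}
```

The same machinery applies to associations by treating each association as a (concept, concept) pair in the sets.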
Expert system application for prioritizing preventive actions for shift work: shift expert.
Esen, Hatice; Hatipoğlu, Tuğçen; Cihan, Ahmet; Fiğlali, Nilgün
2017-09-19
Shift patterns, work hours, work arrangements and worker motivations have increasingly become key factors for job performance. The main objective of this article is to design an expert system that identifies the negative effects of shift work and prioritizes mitigation efforts according to their importance in preventing these negative effects. The proposed expert system will be referred to as the shift expert. A thorough literature review is conducted to determine the effects of shift work on workers. Our work indicates that shift work is linked to demographic variables, sleepiness and fatigue, health and well-being, and social and domestic conditions. These parameters constitute the sections of a questionnaire designed to focus on 26 important issues related to shift work. The shift expert is then constructed to provide prevention advice at the individual and organizational levels, and it prioritizes this advice using a fuzzy analytic hierarchy process model, which considers comparison matrices provided by users during the prioritization process. An empirical study of 61 workers working on three rotating shifts is performed. After administering the questionnaires, the collected data are analyzed statistically, and then the shift expert produces individual and organizational recommendations for these workers.
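The prioritization step can be sketched with a crisp AHP approximation; the paper uses a fuzzy AHP, which adds fuzzy-number arithmetic on top of this basic scheme. The comparison matrix and action names below are illustrative.

```python
# Simplified crisp-AHP sketch: derive priority weights for preventive
# actions from a pairwise comparison matrix using the normalized-
# column-average approximation of the principal eigenvector.
# The matrix values and action names are illustrative.

def ahp_weights(matrix):
    n = len(matrix)
    # Normalize each column to sum to 1, then average across each row.
    col_sums = [sum(matrix[r][c] for r in range(n)) for c in range(n)]
    return [
        sum(matrix[r][c] / col_sums[c] for c in range(n)) / n
        for r in range(n)
    ]

# Pairwise importance (Saaty 1-9 scale) of three preventive actions:
comparisons = [
    [1.0, 3.0, 5.0],    # sleep-hygiene advice
    [1/3, 1.0, 2.0],    # schedule rotation change
    [1/5, 1/2, 1.0],    # health screening
]
weights = ahp_weights(comparisons)
```

A fuzzy AHP would replace each crisp entry with a triangular fuzzy number reflecting judgment uncertainty, then defuzzify the resulting weights.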
Brain mechanisms of persuasion: how 'expert power' modulates memory and attitudes.
Klucharev, Vasily; Smidts, Ale; Fernández, Guillén
2008-12-01
Human behaviour is affected by various forms of persuasion. The general persuasive effect of high expertise of the communicator, often referred to as 'expert power', is well documented. We found that a single exposure to a combination of an expert and an object leads to a long-lasting positive effect on memory for and attitude towards the object. Using functional magnetic resonance imaging, we probed the neural processes predicting these behavioural effects. Expert context was associated with distributed left-lateralized brain activity in prefrontal and temporal cortices related to active semantic elaboration. Furthermore, experts enhanced subsequent memory effects in the medial temporal lobe (i.e. in hippocampus and parahippocampal gyrus) involved in memory formation. Experts also affected subsequent attitude effects in the caudate nucleus involved in trustful behaviour, reward processing and learning. These results may suggest that the persuasive effect of experts is mediated by modulation of caudate activity resulting in a re-evaluation of the object in terms of its perceived value. Results extend our view of the functional role of the dorsal striatum in social interaction and enable us to make the first steps toward a neuroscientific model of persuasion.
Creating a test blueprint for a progress testing program: A paired-comparisons approach.
von Bergmann, HsingChi; Childs, Ruth A
2018-03-01
Creating a new testing program requires the development of a test blueprint that determines how the items on each test form are distributed across possible content areas and practice domains. To achieve validity, the categories of a blueprint are typically based on the judgments of content experts. How experts' judgments are elicited and combined is therefore important to the quality of the resulting test blueprints. Content experts in dentistry participated in a day-long faculty-wide workshop to discuss, refine, and confirm the categories and their relative weights. After reaching agreement on the categories and their definitions, the experts judged the relative importance of category pairs, registering their judgments anonymously using iClicker, an audience response system. Judgments were combined in two ways: a simple calculation that could be performed during the workshop, and a multidimensional scaling of the judgments performed later. The content experts were able to produce a set of relative weights using this approach. The multidimensional scaling yielded a three-dimensional model with the potential to provide deeper insights into the basis of the experts' judgments. The approach developed and demonstrated in this study can be applied across academic disciplines to elicit and combine content experts' judgments for the development of test blueprints.
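The "simple calculation" for combining paired comparisons is not specified in the abstract; one plausible version tallies how often each category wins a comparison and normalizes the tallies into weights. The category names and votes below are illustrative.

```python
# Sketch of combining paired-comparison judgments into relative blueprint
# weights by win-proportion tallying. This is one plausible "simple
# calculation"; the paper's actual formula is not given in the abstract.

from collections import Counter

def blueprint_weights(judgments):
    """judgments: list of (winner, loser) pairs from expert clicker votes."""
    wins = Counter(winner for winner, _ in judgments)
    total = sum(wins.values())
    return {category: count / total for category, count in wins.items()}

votes = [
    ("diagnosis", "prevention"), ("diagnosis", "ethics"),
    ("prevention", "ethics"), ("diagnosis", "prevention"),
    ("ethics", "prevention"),
]
weights = blueprint_weights(votes)
```

Note that a category never chosen as a winner gets no weight under this tally, which is one reason a scaling method (like the multidimensional scaling the study also used) can be more informative.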