Process for selecting engineering tools: applied to selecting a SysML tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Spain, Mark J.; Post, Debra S.; Taylor, Jeffrey L.
2011-02-01
Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and users could use the process to select most engineering tools and software applications.
COUNCIL FOR REGULATORY ENVIRONMENTAL MODELING (CREM) PILOT WATER QUALITY MODEL SELECTION TOOL
EPA's Council for Regulatory Environmental Modeling (CREM) is currently supporting the development of a pilot model selection tool that is intended to help the states and the regions implement the total maximum daily load (TMDL) program. This tool will be implemented within the ...
Link, William; Sauer, John R.
2016-01-01
The analysis of ecological data has changed in two important ways over the last 15 years. The development and easy availability of Bayesian computational methods has allowed and encouraged the fitting of complex hierarchical models. At the same time, there has been increasing emphasis on acknowledging and accounting for model uncertainty. Unfortunately, the ability to fit complex models has outstripped the development of tools for model selection and model evaluation: familiar model selection tools such as Akaike's information criterion and the deviance information criterion are widely known to be inadequate for hierarchical models. In addition, little attention has been paid to the evaluation of model adequacy in context of hierarchical modeling, i.e., to the evaluation of fit for a single model. In this paper, we describe Bayesian cross-validation, which provides tools for model selection and evaluation. We describe the Bayesian predictive information criterion and a Bayesian approximation to the BPIC known as the Watanabe-Akaike information criterion. We illustrate the use of these tools for model selection, and the use of Bayesian cross-validation as a tool for model evaluation, using three large data sets from the North American Breeding Bird Survey.
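As a concrete illustration of the criterion described above (not code from the paper), the following sketch computes WAIC from an array of pointwise log-likelihoods drawn from a posterior sample; the array `log_lik` is a hypothetical input, e.g. produced by an MCMC fit of a hierarchical model.

```python
import numpy as np
from scipy.special import logsumexp

def waic(log_lik):
    """Watanabe-Akaike information criterion from an (S x N) array of pointwise
    log-likelihoods evaluated at S posterior draws for N observations."""
    S = log_lik.shape[0]
    # log pointwise predictive density: log of the posterior-mean likelihood per observation
    lppd = np.sum(logsumexp(log_lik, axis=0) - np.log(S))
    # effective number of parameters: posterior variance of the pointwise log-likelihood
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

# hypothetical usage: compare two fitted hierarchical models
# waic_a = waic(log_lik_model_a); waic_b = waic(log_lik_model_b)
# the model with the smaller WAIC value is preferred
```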
ERIC Educational Resources Information Center
Baxa, Julie; Christ, Tanya
2018-01-01
Selecting and integrating the use of digital texts/tools in literacy lessons are complex tasks. The DigiLit framework provides a succinct model to guide planning, reflection, coaching, and formative evaluation of teachers' successful digital text/tool selection and integration for literacy lessons. For digital text/tool selection, teachers need to…
Nelson, Carl A; Miller, David J; Oleynikov, Dmitry
2008-01-01
As modular systems come into the forefront of robotic telesurgery, streamlining the process of selecting surgical tools becomes an important consideration. This paper presents a method for optimal queuing of tools in modular surgical tool systems, based on patterns in tool-use sequences, in order to minimize time spent changing tools. The solution approach is to model the set of tools as a graph, with tool-change frequency expressed as edge weights in the graph, and to solve the Traveling Salesman Problem for the graph. In a set of simulations, this method has shown superior performance at optimizing tool arrangements for streamlining surgical procedures.
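A small illustrative sketch of the queuing idea described above, under hypothetical tool names and change frequencies: tools are graph nodes, tool-change frequencies are edge weights, and the cyclic ordering is found by exhaustive search (adequate for small modular tool sets); the exact objective used in the paper may differ.

```python
import itertools

# hypothetical tool-change frequency matrix (symmetric); freq[i][j] = number of
# times tool i was exchanged for tool j in recorded procedures
tools = ["grasper", "scissors", "needle_driver", "hook"]
freq = [
    [0, 8, 3, 1],
    [8, 0, 5, 2],
    [3, 5, 0, 4],
    [1, 2, 4, 0],
]

def tour_weight(order):
    """Total change frequency along a cyclic arrangement of tools."""
    return sum(freq[order[i]][order[(i + 1) % len(order)]] for i in range(len(order)))

# exhaustive search over orderings; tool 0 is fixed to remove the rotational
# symmetry of the cycle
best = max(itertools.permutations(range(1, len(tools))),
           key=lambda p: tour_weight((0,) + p))
best_order = (0,) + best
print([tools[i] for i in best_order], tour_weight(best_order))
```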
PyCoTools: A Python Toolbox for COPASI.
Welsh, Ciaran M; Fullard, Nicola; Proctor, Carole J; Martinez-Guimera, Alvaro; Isfort, Robert J; Bascom, Charles C; Tasseff, Ryan; Przyborski, Stefan A; Shanley, Daryl P
2018-05-22
COPASI is an open source software package for constructing, simulating and analysing dynamic models of biochemical networks. COPASI is primarily intended to be used with a graphical user interface but often it is desirable to be able to access COPASI features programmatically, with a high level interface. PyCoTools is a Python package aimed at providing a high level interface to COPASI tasks with an emphasis on model calibration. PyCoTools enables the construction of COPASI models and the execution of a subset of COPASI tasks including time courses, parameter scans and parameter estimations. Additional 'composite' tasks which use COPASI tasks as building blocks are available for increasing parameter estimation throughput, performing identifiability analysis and performing model selection. PyCoTools supports exploratory data analysis on parameter estimation data to assist with troubleshooting model calibrations. We demonstrate PyCoTools by posing a model selection problem designed to showcase PyCoTools within a realistic scenario. The aim of the model selection problem is to test the feasibility of three alternative hypotheses in explaining experimental data derived from neonatal dermal fibroblasts in response to TGF-β over time. PyCoTools is used to critically analyse the parameter estimations and propose strategies for model improvement. PyCoTools can be downloaded from the Python Package Index (PyPI) using the command 'pip install pycotools' or directly from GitHub (https://github.com/CiaranWelsh/pycotools). Documentation at http://pycotools.readthedocs.io. Supplementary data are available at Bioinformatics.
33 CFR 385.33 - Revisions to models and analytical tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...
Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P
2011-05-19
There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.
Lessons learned in deploying software estimation technology and tools
NASA Technical Reports Server (NTRS)
Panlilio-Yap, Nikki; Ho, Danny
1994-01-01
Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
Chen, Juan; Snow, Jacqueline C; Culham, Jody C; Goodale, Melvyn A
2018-04-01
Images of tools induce stronger activation than images of nontools in a left-lateralized network that includes ventral-stream areas implicated in tool identification and dorsal-stream areas implicated in tool manipulation. Importantly, however, graspable tools tend to be elongated rather than stubby, and so the tool-selective responses in some of these areas may, to some extent, reflect sensitivity to elongation rather than "toolness" per se. Using functional magnetic resonance imaging, we investigated the role of elongation in driving tool-specific activation in the 2 streams and their interconnections. We showed that in some "tool-selective" areas, the coding of toolness and elongation coexisted, but in others, elongation and toolness were coded independently. Psychophysiological interaction analysis revealed that toolness, but not elongation, had a strong modulation of the connectivity between the ventral and dorsal streams. Dynamic causal modeling revealed that viewing tools (either elongated or stubby) increased the connectivity from the ventral- to the dorsal-stream tool-selective areas, but only viewing elongated tools increased the reciprocal connectivity between these areas. Overall, these data disentangle how toolness and elongation affect the activation and connectivity of the tool network and help to resolve recent controversies regarding the relative contribution of "toolness" versus elongation in driving dorsal-stream "tool-selective" areas.
The Commander’s Emergency Response Program: A Model for Future Implementation
2010-04-07
The INVEST-E methodology serves as a tool for commanders and their designated practitioners to properly select projects, increasing the effectiveness of CERP funds.
Method for automation of tool preproduction
NASA Astrophysics Data System (ADS)
Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.
2018-03-01
The primary objective of tool production is to create or select a tool design that secures high process efficiency, tool availability, and quality of the machined surfaces with a minimum expenditure of means and resources. Selecting the appropriate tool from the set of variants takes considerable time for the personnel engaged in tool preparation. Software has been developed to solve this problem: it helps to create, systematize, and comparatively analyse tool designs in order to identify the rational variant under given production conditions. Systematization and selection of the rational tool design are carried out in accordance with the developed modeling technology and comparative design analysis. Applying the software makes it possible to reduce the design period by 80-85% and obtain a significant annual saving.
Sperstad, Iver Bakken; Stålhane, Magnus; Dinwoodie, Iain; ...
2017-09-23
Optimising the operation and maintenance (O&M) and logistics strategy of offshore wind farms implies the decision problem of selecting the vessel fleet for O&M. Different strategic decision support tools can be applied to this problem, but much uncertainty remains regarding both input data and modelling assumptions. Our paper aims to investigate and ultimately reduce this uncertainty by comparing four simulation tools, one mathematical optimisation tool and one analytic spreadsheet-based tool applied to select the O&M access vessel fleet that minimizes the total O&M cost of a reference wind farm. The comparison shows that the tools generally agree on the optimal vessel fleet, but only partially agree on the relative ranking of the different vessel fleets in terms of total O&M cost. The robustness of the vessel fleet selection to various input data assumptions was tested, and the ranking was found to be particularly sensitive to the vessels' limiting significant wave height for turbine access. This is also the parameter with the greatest discrepancy between the tools, implying that accurate quantification and modelling of this parameter is crucial. The ranking is moderately sensitive to turbine failure rates and vessel day rates but less sensitive to electricity price and vessel transit speed.
Posada, David
2006-01-01
ModelTest server is a web-based application for the selection of models of nucleotide substitution using the program ModelTest. The server takes as input a text file with likelihood scores for the set of candidate models. Models can be selected with hierarchical likelihood ratio tests, or with the Akaike or Bayesian information criteria. The output includes several statistics for the assessment of model selection uncertainty, for model averaging or to estimate the relative importance of model parameters. The server can be accessed at . PMID:16845102
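For illustration (independent of the ModelTest implementation), the following sketch computes AIC and BIC for candidate substitution models from their likelihood scores and parameter counts; the models, scores and alignment length are hypothetical.

```python
import math

# hypothetical candidate substitution models: (name, log-likelihood, free parameters)
candidates = [("JC69", -5230.4, 0), ("HKY85", -5101.7, 4), ("GTR+G", -5089.2, 9)]
n_sites = 1200  # alignment length, used as the sample size for BIC

def aic(lnL, k):
    return -2.0 * lnL + 2.0 * k

def bic(lnL, k, n):
    return -2.0 * lnL + k * math.log(n)

for name, lnL, k in candidates:
    print(f"{name:8s} AIC={aic(lnL, k):10.2f}  BIC={bic(lnL, k, n_sites):10.2f}")

# the model with the smallest criterion value is selected
```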
Development of an Environment for Software Reliability Model Selection
1992-09-01
Attention is now directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling. Hardware can be repaired by spare modules, which is not the case for software; preventive maintenance is very important.
NASA Astrophysics Data System (ADS)
El Naqa, I.; Suneja, G.; Lindsay, P. E.; Hope, A. J.; Alaly, J. R.; Vicic, M.; Bradley, J. D.; Apte, A.; Deasy, J. O.
2006-11-01
Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complications models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc) and clinical factors (age, gender, stage, etc) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristics curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.
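As a hedged sketch of the kind of analytical dose-response fitting DREES offers (not the Matlab implementation itself), the following fits a logistic NTCP model to hypothetical mean-dose/complication data by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize

# hypothetical patient data: mean organ dose (Gy) and observed complication (0/1)
dose = np.array([12.0, 18.5, 22.0, 25.0, 30.0, 34.0, 40.0, 45.0, 52.0, 60.0])
event = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

def ntcp(d, td50, gamma50):
    """Logistic NTCP model: probability 0.5 at td50, normalized slope gamma50."""
    return 1.0 / (1.0 + np.exp(4.0 * gamma50 * (1.0 - d / td50)))

def neg_log_likelihood(params):
    td50, gamma50 = params
    p = np.clip(ntcp(dose, td50, gamma50), 1e-9, 1 - 1e-9)
    return -np.sum(event * np.log(p) + (1 - event) * np.log(1 - p))

fit = minimize(neg_log_likelihood, x0=[30.0, 1.0], method="Nelder-Mead")
td50_hat, gamma50_hat = fit.x
print(f"TD50 ~ {td50_hat:.1f} Gy, gamma50 ~ {gamma50_hat:.2f}")
```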
Development of materials for the rapid manufacture of die cast tooling
NASA Astrophysics Data System (ADS)
Hardro, Peter Jason
The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process. These rapidly produced tools are intended to be superior to traditional tooling production methods by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling might be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was highlighted. Physical testing was conducted in order to grade the processability of each of the material systems and to optimize the manufacturing process for the down-selected material system. Sample specimens were produced and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected and die casting dies were produced from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best. The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely, to reduce tooling cost, shorten tooling creation time, and reduce the man-hours needed for tool creation. Identifying the appropriate time to use RP tooling, however, appears to be the most important aspect in achieving successful implementation.
Elementary Teachers' Selection and Use of Visual Models
NASA Astrophysics Data System (ADS)
Lee, Tammy D.; Gail Jones, M.
2018-02-01
As science grows in complexity, science teachers face an increasing challenge of helping students interpret models that represent complex science systems. Little is known about how teachers select and use models when planning lessons. This mixed methods study investigated the pedagogical approaches and visual models used by elementary in-service and preservice teachers in the development of a science lesson about a complex system (e.g., water cycle). Sixty-seven elementary in-service and 69 elementary preservice teachers completed a card sort task designed to document the types of visual models (e.g., images) that teachers choose when planning science instruction. Quantitative and qualitative analyses were conducted to analyze the card sort task. Semistructured interviews were conducted with a subsample of teachers to elicit the rationale for image selection. Results from this study showed that both experienced in-service teachers and novice preservice teachers tended to select similar models and use similar rationales for images to be used in lessons. Teachers tended to select models that were aesthetically pleasing and simple in design and illustrated specific elements of the water cycle. The results also showed that teachers were not likely to select images that represented the less obvious dimensions of the water cycle. Furthermore, teachers selected visual models more as a pedagogical tool to illustrate specific elements of the water cycle and less often as a tool to promote student learning related to complex systems.
Application of simulation models for the optimization of business processes
NASA Astrophysics Data System (ADS)
Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří
2016-06-01
The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially in solving the optimization of signal flow in a security company. Simul8 software was selected as the modeling tool; it supports process modeling based on discrete event simulation and enables the creation of a visual model of production and distribution processes.
USDA-ARS?s Scientific Manuscript database
Currently, sugarcane selection begins at the seedling stage with visual selection for cane yield and other yield-related traits. Although subjective and inefficient, visual selection remains the primary method for selection. Visual selection is inefficient because of the confounding effect of genoty...
Aboushanab, Tamer; AlSanad, Saud
2018-06-08
Cupping therapy is a popular treatment in various countries and regions, including Saudi Arabia. Cupping therapy is regulated in Saudi Arabia by the National Center for Complementary and Alternative Medicine (NCCAM), Ministry of Health. The authors recommend that this quality model for selecting patients in cupping clinics - first version (QMSPCC-1) - be used routinely as part of clinical practice and quality management in cupping clinics. The aim of the quality model is to ensure the safety of patients and to introduce and facilitate quality and auditing processes in cupping therapy clinics. Clinical evaluation of this tool is recommended. Continued development, re-evaluation and reassessment of this tool are important. Copyright © 2018. Published by Elsevier B.V.
The Green Tool represents infiltration-based stormwater control practices. It allows modelers to select a BMP type, channel shape and BMP unit dimensions, outflow control devices, and infiltration method. The program generates an HSPF-formatted FTABLE.
Evaluating models of healthcare delivery using the Model of Care Evaluation Tool (MCET).
Hudspeth, Randall S; Vogt, Marjorie; Wysocki, Ken; Pittman, Oralea; Smith, Susan; Cooke, Cindy; Dello Stritto, Rita; Hoyt, Karen Sue; Merritt, T Jeanne
2016-08-01
Our aim was to provide the outcome of a structured Model of Care (MoC) Evaluation Tool (MCET), developed by an FAANP Best-practices Workgroup, that can be used to guide the evaluation of existing MoCs being considered for use in clinical practice. Multiple MoCs are available, but deciding which model of health care delivery to use can be confusing. This five-component tool provides a structured assessment approach to model selection and has universal application. A literature review using CINAHL, PubMed, Ovid, and EBSCO was conducted. The MCET evaluation process includes five sequential components with a feedback loop from component 5 back to component 3 for reevaluation of any refinements. The components are as follows: (1) Background, (2) Selection of an MoC, (3) Implementation, (4) Evaluation, and (5) Sustainability and Future Refinement. This practical resource considers an evidence-based approach to use in determining the best model to implement based on need, stakeholder considerations, and feasibility. ©2015 American Association of Nurse Practitioners.
Model weights and the foundations of multimodel inference
Link, W.A.; Barker, R.J.
2006-01-01
Statistical thinking in wildlife biology and ecology has been profoundly influenced by the introduction of AIC (Akaike's information criterion) as a tool for model selection and as a basis for model averaging. In this paper, we advocate the Bayesian paradigm as a broader framework for multimodel inference, one in which model averaging and model selection are naturally linked, and in which the performance of AIC-based tools is naturally evaluated. Prior model weights implicitly associated with the use of AIC are seen to highly favor complex models: in some cases, all but the most highly parameterized models in the model set are virtually ignored a priori. We suggest the usefulness of the weighted BIC (Bayesian information criterion) as a computationally simple alternative to AIC, based on explicit selection of prior model probabilities rather than acceptance of default priors associated with AIC. We note, however, that both procedures are only approximate to the use of exact Bayes factors. We discuss and illustrate technical difficulties associated with Bayes factors, and suggest approaches to avoiding these difficulties in the context of model selection for a logistic regression. Our example highlights the predisposition of AIC weighting to favor complex models and suggests a need for caution in using the BIC for computing approximate posterior model weights.
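A brief sketch, with hypothetical criterion values, of the two weighting schemes contrasted above: Akaike weights computed from AIC differences, and approximate posterior model probabilities computed from BIC under equal prior model probabilities.

```python
import numpy as np

def ic_weights(ic_values):
    """Convert information-criterion values (AIC or BIC) into normalized model weights."""
    ic = np.asarray(ic_values, dtype=float)
    delta = ic - ic.min()          # differences relative to the best model
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# hypothetical AIC and BIC values for three candidate logistic regressions
aic_values = [212.4, 210.1, 215.8]
bic_values = [220.9, 224.3, 218.6]

print("Akaike weights:          ", np.round(ic_weights(aic_values), 3))
print("approx. posterior probs: ", np.round(ic_weights(bic_values), 3))
# note that the two weightings can rank the same candidate models differently
```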
Multi-category micro-milling tool wear monitoring with continuous hidden Markov models
NASA Astrophysics Data System (ADS)
Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon
2009-02-01
In-process monitoring of tool conditions is important in micro-machining due to the high precision requirement and high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous hidden Markov models (HMMs) are adapted for modeling of the tool wear process in micro-milling and estimation of the tool wear state given the cutting force features. For a noise-robust approach, the HMM outputs are passed through a median filter to suppress spurious state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on the tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.
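A minimal sketch of the overall scheme, not the authors' implementation: a Gaussian-emission HMM (via the third-party hmmlearn package, an assumption here) is fitted to simulated cutting-force features, and the decoded state sequence is median-filtered to suppress noise-induced state jumps.

```python
import numpy as np
from scipy.signal import medfilt
from hmmlearn import hmm  # assumed third-party library, not the authors' code

# hypothetical cutting-force feature matrix: one row per sampling window,
# columns e.g. mean force, RMS force, and a wavelet-band energy
rng = np.random.default_rng(0)
features = np.vstack([rng.normal(loc=m, scale=0.3, size=(100, 3))
                      for m in (1.0, 2.0, 3.5)])  # simulated initial/moderate/severe wear

model = hmm.GaussianHMM(n_components=3, covariance_type="full", n_iter=50, random_state=0)
model.fit(features)
raw_states = model.predict(features)

# median filtering suppresses spurious single-window state jumps caused by noise
smoothed_states = medfilt(raw_states.astype(float), kernel_size=5).astype(int)
print(smoothed_states[:20])
```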
GAMES II Project: a general architecture for medical knowledge-based systems.
Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M
1994-10-01
GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems ordinarily occurring in KBS development. GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians are assessing the medical value of such a framework and its related tools by using it in a practical application.
Gross, Douglas P; Armijo-Olivo, Susan; Shaw, William S; Williams-Whitt, Kelly; Shaw, Nicola T; Hartvigsen, Jan; Qin, Ziling; Ha, Christine; Woodhouse, Linda J; Steenstra, Ivan A
2016-09-01
Purpose We aimed to identify and inventory clinical decision support (CDS) tools for helping front-line staff select interventions for patients with musculoskeletal (MSK) disorders. Methods We used Arksey and O'Malley's scoping review framework which progresses through five stages: (1) identifying the research question; (2) identifying relevant studies; (3) selecting studies for analysis; (4) charting the data; and (5) collating, summarizing and reporting results. We considered computer-based, and other available tools, such as algorithms, care pathways, rules and models. Since this research crosses multiple disciplines, we searched health care, computing science and business databases. Results Our search resulted in 4605 manuscripts. Titles and abstracts were screened for relevance. The reliability of the screening process was high with an average percentage of agreement of 92.3 %. Of the located articles, 123 were considered relevant. Within this literature, there were 43 CDS tools located. These were classified into 3 main areas: computer-based tools/questionnaires (n = 8, 19 %), treatment algorithms/models (n = 14, 33 %), and clinical prediction rules/classification systems (n = 21, 49 %). Each of these areas and the associated evidence are described. The state of evidentiary support for CDS tools is still preliminary and lacks external validation, head-to-head comparisons, or evidence of generalizability across different populations and settings. Conclusions CDS tools, especially those employing rapidly advancing computer technologies, are under development and of potential interest to health care providers, case management organizations and funders of care. Based on the results of this scoping review, we conclude that these tools, models and systems should be subjected to further validation before they can be recommended for large-scale implementation for managing patients with MSK disorders.
System Architecture Modeling for Technology Portfolio Management using ATLAS
NASA Technical Reports Server (NTRS)
Thompson, Robert W.; O'Neil, Daniel A.
2006-01-01
Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD), in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system level requirements. Second, the modeler identifies technologies of interest whose impact on the SEA is to be assessed. Third, the system modeling team creates models of architecture elements (e.g. launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.
IT vendor selection model by using structural equation model & analytical hierarchy process
NASA Astrophysics Data System (ADS)
Maitra, Sarit; Dominic, P. D. D.
2012-11-01
Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can dwarf an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process with better decision making. The proposed model provides a suitable tool for assisting decision makers and managers to make the right decisions and select the most suitable vendor. This paper proposes a hybrid model based on Structural Equation Model (SEM) and Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, a what-if analysis technique will be used for model validation purposes.
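To illustrate the AHP component of such a hybrid model, the following sketch derives criterion weights from a hypothetical pairwise comparison matrix using the principal-eigenvector method and checks consistency; the criteria and judgments are invented for the example.

```python
import numpy as np

# hypothetical pairwise comparison matrix for four vendor-selection criteria
# (cost, quality, delivery, service) on Saaty's 1-9 scale
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalized criterion weights

# consistency check: a consistency ratio below ~0.1 is conventionally acceptable
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random consistency index
print("criterion weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))
```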
Pre-selection and assessment of green organic solvents by clustering chemometric tools.
Tobiszewski, Marek; Nedyalkova, Miroslava; Madurga, Sergio; Pena-Pereira, Francisco; Namieśnik, Jacek; Simeonov, Vasil
2018-01-01
The study presents the result of the application of chemometric tools for selection of physicochemical parameters of solvents for predicting missing variables - bioconcentration factors, water-octanol and octanol-air partitioning constants. EPI Suite software was successfully applied to predict missing values for solvents commonly considered as "green". Values for logBCF, logK OW and logK OA were modelled for 43 rather nonpolar solvents and 69 polar ones. Application of multivariate statistics was also proved to be useful in the assessment of the obtained modelling results. The presented approach can be one of the first steps and support tools in the assessment of chemicals in terms of their greenness. Copyright © 2017 Elsevier Inc. All rights reserved.
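As an illustration of the clustering step (not the authors' workflow), the following sketch standardizes a small hypothetical table of solvent descriptors (logKOW, logKOA, logBCF) and groups the solvents with k-means.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# hypothetical solvent descriptor table: rows are solvents, columns are
# logKOW, logKOA and logBCF (the properties modelled in the study)
descriptors = np.array([
    [-0.24, 2.9, 0.5],   # polar, ethanol-like
    [ 0.07, 3.3, 0.5],
    [ 3.40, 4.5, 2.1],   # more lipophilic, nonpolar solvent
    [ 3.10, 4.2, 1.9],
    [-1.35, 5.0, 0.5],
])

X = StandardScaler().fit_transform(descriptors)            # put descriptors on one scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # cluster membership separating polar from nonpolar candidates
```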
Newton, Paul; Chandler, Val; Morris-Thomson, Trish; Sayer, Jane; Burke, Linda
2015-01-01
To map current selection and recruitment processes for newly qualified nurses and to explore the advantages and limitations of current selection and recruitment processes. The need to improve current selection and recruitment practices for newly qualified nurses is highlighted in health policy internationally. A cross-sectional, sequential-explanatory mixed-method design with 4 components was used: (1) literature review of selection and recruitment of newly qualified nurses; (2) literature review of a public sector profession's selection and recruitment processes; (3) survey mapping existing selection and recruitment processes for newly qualified nurses; and (4) qualitative study about recruiters' selection and recruitment processes. Literature searches on the selection and recruitment of newly qualified candidates in teaching and nursing (2005-2013) were conducted. Cross-sectional, mixed-method data were collected using a survey instrument from thirty-one (n = 31) individuals in health providers in London who had responsibility for the selection and recruitment of newly qualified nurses. Of the providers who took part, six (n = 6) were purposively selected to be interviewed qualitatively. Issues of supply and demand in the workforce, rather than selection and recruitment tools, predominated in the literature reviews. Examples of tools to measure values, attitudes and skills were found in the nursing literature. The mapping exercise found that providers used many selection and recruitment tools; some providers combined tools to streamline the process and assure the quality of candidates. Most providers had processes which addressed the issue of quality in the selection and recruitment of newly qualified nurses. The 'assessment centre model', which providers were adopting, allowed for multiple levels of assessment and streamlined recruitment. There is a need to validate the efficacy of the selection tools. © 2014 John Wiley & Sons Ltd.
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, based on signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
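A hedged sketch of the pipeline stages named above (noise-reduced features, attribute selection, SVM classification) using scikit-learn and synthetic feature vectors; it is not the model reported in the paper.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# hypothetical training data: feature vectors extracted from smoothed
# inspection-tool signals, labelled weld (1) / no weld (0)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 12))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

clf = make_pipeline(
    StandardScaler(),                      # scale features after noise reduction
    SelectKBest(f_classif, k=6),           # attribute-selection step
    SVC(kernel="rbf", C=1.0, gamma="scale"),
)
scores = cross_val_score(clf, X, y, cv=5)  # cross-validated recognition accuracy
print(scores.mean())
```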
Mixed conditional logistic regression for habitat selection studies.
Duchesne, Thierry; Fortin, Daniel; Courbin, Nicolas
2010-05-01
1. Resource selection functions (RSFs) are becoming a dominant tool in habitat selection studies. RSF coefficients can be estimated with unconditional (standard) and conditional logistic regressions. While the advantage of mixed-effects models is recognized for standard logistic regression, mixed conditional logistic regression remains largely overlooked in ecological studies. 2. We demonstrate the significance of mixed conditional logistic regression for habitat selection studies. First, we use spatially explicit models to illustrate how mixed-effects RSFs can be useful in the presence of inter-individual heterogeneity in selection and when the assumption of independence from irrelevant alternatives (IIA) is violated. The IIA hypothesis states that the strength of preference for habitat type A over habitat type B does not depend on the other habitat types also available. Secondly, we demonstrate the significance of mixed-effects models to evaluate habitat selection of free-ranging bison Bison bison. 3. When movement rules were homogeneous among individuals and the IIA assumption was respected, fixed-effects RSFs adequately described habitat selection by simulated animals. In situations violating the inter-individual homogeneity and IIA assumptions, however, RSFs were best estimated with mixed-effects regressions, and fixed-effects models could even provide faulty conclusions. 4. Mixed-effects models indicate that bison did not select farmlands, but exhibited strong inter-individual variations in their response to farmlands. Less than half of the bison preferred farmlands over forests. Conversely, the fixed-effect model simply suggested an overall selection for farmlands. 5. Conditional logistic regression is recognized as a powerful approach to evaluate habitat selection when resource availability changes. This regression is increasingly used in ecological studies, but almost exclusively in the context of fixed-effects models. Fitness maximization can imply differences in trade-offs among individuals, which can yield inter-individual differences in selection and lead to departure from IIA. These situations are best modelled with mixed-effects models. Mixed-effects conditional logistic regression should become a valuable tool for ecological research.
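For readers unfamiliar with the conditional form, the following minimal sketch fits a fixed-effects conditional logistic RSF by maximising the stratum-wise likelihood on simulated used/available data; the mixed-effects extension discussed in the paper would add individual-level random coefficients, which this sketch omits.

```python
import numpy as np
from scipy.optimize import minimize

# hypothetical matched-design data: each stratum pairs one used location with
# several available locations; x holds habitat covariates (e.g. forest, farmland)
rng = np.random.default_rng(2)
n_strata, n_alt, n_cov = 50, 5, 2
X = rng.normal(size=(n_strata, n_alt, n_cov))
true_beta = np.array([1.0, -0.5])
util = X @ true_beta + rng.gumbel(size=(n_strata, n_alt))
used = util.argmax(axis=1)                       # index of the "selected" alternative

def neg_log_lik(beta):
    score = X @ beta                             # selection strength of each alternative
    log_den = np.log(np.exp(score).sum(axis=1))  # stratum-wise normalisation
    return -(score[np.arange(n_strata), used] - log_den).sum()

fit = minimize(neg_log_lik, x0=np.zeros(n_cov), method="BFGS")
print("estimated RSF coefficients:", np.round(fit.x, 2))
```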
NASA Astrophysics Data System (ADS)
Nerguizian, Vahe; Rafaf, Mustapha
2004-08-01
This article describes and provides valuable information for companies and universities with strategies to start fabricating MEMS for RF/Microwave and millimeter wave applications. The present work shows the infrastructure developed for RF/Microwave and millimeter wave MEMS platforms, which helps with the identification, evaluation and selection of design tools and fabrication foundries, taking packaging and testing into account. The selected and implemented simple infrastructure models, based on surface and bulk micromachining, yield inexpensive and innovative approaches for distributed choices of MEMS operating tools. For different educational or industrial institution needs, these models may be modified for specific resource changes using a carefully analyzed iteration process. The inputs of the project are evaluation and selection criteria and information sources such as financial, technical, availability, accessibility, simplicity, versatility and practical considerations. The outputs of the project are the selection of different MEMS design tools or software (solid modeling, electrostatic/electromagnetic and others, compatible with existing standard RF/Microwave design tools) and different MEMS manufacturing foundries. Typical RF/Microwave and millimeter wave MEMS solutions are introduced on the platform during the evaluation and development phases of the project for the validation of realistic results and operational decision making choices. The challenges encountered during the investigation and development steps are identified and the dynamic behavior of the infrastructure is emphasized. The inputs (resources) and the outputs (demonstrated solutions) are presented in tables and flow chart mode diagrams.
NASA Astrophysics Data System (ADS)
Darmon, David
2018-03-01
In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models
NASA Astrophysics Data System (ADS)
Zaitchik, B. F.; Berhane, F.; Tadesse, T.
2015-12-01
We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS
Botvinick, Matthew M.; Buxbaum, Laurel J.; Bylsma, Lauren M.; Jax, Steven A.
2014-01-01
The act of reaching for and acting upon an object involves two forms of selection: selection of the object as a target, and selection of the action to be performed. While these two forms of selection are logically dissociable, and are evidently subserved by separable neural pathways, they must also be closely coordinated. We examine the nature of this coordination by developing and analyzing a computational model of object and action selection first proposed by Ward [Ward, R. (1999). Interactions between perception and action systems: a model for selective action. In G. W. Humphreys, J. Duncan, & A. Treisman (Eds.), Attention, Space and Action: Studies in Cognitive Neuroscience. Oxford: Oxford University Press]. An interesting tenet of this account, which we explore in detail, is that the interplay between object and action selection depends critically on top-down inputs representing the current task set or plan of action. A concrete manifestation of this, established through a series of simulations, is that the impact of distractor objects on reaching times can vary depending on the nature of the current action plan. In order to test the model's predictions in this regard, we conducted two experiments, one involving direct object manipulation, the other involving tool-use. In both experiments we observed the specific interaction between task set and distractor type predicted by the model. Our findings provide support for the computational model, and more broadly for an interactive account of object and action selection. PMID:19100758
Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin
2012-05-30
This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool functions based on the combination of existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool may be implemented with a wide range of decision maker's preferences. The tool's user-friendly interface helps guide the decision maker through the sequential steps for site selection, those steps namely being constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to increase the trade-off between ease-of-use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. For demonstrating the robustness of the new tool, a case study was planned and executed at Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key features, which benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to the selection process for the best locations of the MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multicriteria analysis tool has already been implemented within the GIS based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
Szaleniec, Maciej
2012-01-01
Artificial Neural Networks (ANNs) are introduced as robust and versatile tools in quantitative structure-activity relationship (QSAR) modeling. Their application to the modeling of enzyme reactivity is discussed, along with methodological issues. Methods of input variable selection, optimization of network internal structure, data set division and model validation are discussed. The application of ANNs in the modeling of enzyme activity over the last 20 years is briefly recounted. The discussed methodology is exemplified by the case of ethylbenzene dehydrogenase (EBDH). Intelligent Problem Solver and genetic algorithms are applied for input vector selection, whereas k-means clustering is used to partition the data into training and test cases. The obtained models exhibit high correlation between the predicted and experimental values (R² > 0.9). Sensitivity analyses and study of the response curves are used as tools for the physicochemical interpretation of the models in terms of the EBDH reaction mechanism. Neural networks are shown to be a versatile tool for the construction of robust QSAR models that can be applied to a range of aspects important in drug design and the prediction of biological activity.
Gaussian process regression for tool wear prediction
NASA Astrophysics Data System (ADS)
Kong, Dongdong; Chen, Yongjie; Li, Ning
2018-05-01
To realize and accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique based on integrated radial basis function based kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR) for accurately monitoring the in-process tool wear parameters (flank wear width) in real time. KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for feature fusion. The tool wear predictive value and the corresponding confidence interval are both provided by utilizing the GPR model. Besides, GPR performs better than artificial neural networks (ANN) and support vector machines (SVM) in prediction accuracy since the Gaussian noise can be modeled quantitatively in the GPR model. However, the existence of noise will seriously affect the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove the noise and weaken its negative effects so as to make the confidence interval greatly compressed and more smoothed, which is conducive to monitoring the tool wear accurately. Moreover, the selection of the kernel parameter in KPCA_IRBF can easily be carried out in a much larger selectable region in comparison with the conventional KPCA_RBF technique, which helps to improve the efficiency of model construction. Ten sets of cutting tests were conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately by utilizing the presented tool wear assessment technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
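A minimal sketch of the GPR step with scikit-learn (the KPCA_IRBF feature-fusion stage is replaced here by a single synthetic input), showing how the predictive mean and a confidence interval for flank wear are obtained together.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# hypothetical input (e.g. cumulative cutting time) and measured flank wear width (mm)
rng = np.random.default_rng(3)
X = np.sort(rng.uniform(0, 10, size=(40, 1)), axis=0)
y = 0.03 * X.ravel() + 0.02 * np.sin(X.ravel()) + rng.normal(scale=0.01, size=40)

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-4)   # noise modeled explicitly
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = np.linspace(0, 10, 5).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)
for t, m, s in zip(X_new.ravel(), mean, std):
    print(f"t={t:4.1f}  wear={m:.3f} mm  95% CI ±{1.96 * s:.3f}")
```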
Approach to in-process tool wear monitoring in drilling: Application of Kalman filter theory
NASA Astrophysics Data System (ADS)
He, Ning; Zhang, Youzhen; Pan, Liangxian
1993-05-01
The two parameters often used in adaptive control, tool wear and wear rate, are important factors affecting machinability. In this paper, an attempt is made to use modern cybernetics to solve the in-process tool wear monitoring problem by applying Kalman filter theory to monitor drill wear quantitatively. Based on the experimental results, a dynamic model, a measuring model and a measurement conversion model suitable for the Kalman filter are established. It is proved that the monitoring system possesses complete observability but does not possess complete controllability. A discriminant for selecting the characteristic parameters is put forward. The thrust force Fz is selected as the characteristic parameter for monitoring the tool wear by this discriminant. The in-process Kalman filter drill wear monitoring system, composed of a force sensor, microphotography and a microcomputer, is established. The results obtained by the Kalman filter, the common indirect measuring method and the real drill wear measured with the aid of microphotography are compared. The results show that the Kalman filter has high measurement precision and satisfies the real-time requirement.
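A simple one-dimensional Kalman filter sketch in the spirit of the approach described, with all models and values hypothetical: the wear state follows a random-walk dynamic model and is updated from noisy wear estimates derived from the thrust force Fz.

```python
import numpy as np

def kalman_wear(z, q=1e-4, r=4e-3, x0=0.0, p0=1.0):
    """One-dimensional Kalman filter for drill wear.
    z:  sequence of wear measurements converted from thrust force Fz
    q:  process noise variance (wear-growth uncertainty per step)
    r:  measurement noise variance."""
    x, p, estimates = x0, p0, []
    for zk in z:
        # predict: wear modelled as a slowly drifting random walk
        p = p + q
        # update with the force-derived measurement
        k = p / (p + r)              # Kalman gain
        x = x + k * (zk - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(4)
true_wear = np.linspace(0.0, 0.3, 60)                    # mm, simulated wear progression
measured = true_wear + rng.normal(scale=0.05, size=60)   # noisy force-derived estimates
print(np.round(kalman_wear(measured)[-5:], 3))           # smoothed wear estimates
```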
NASA Technical Reports Server (NTRS)
Nguyen, Lac; Kenney, Patrick J.
1993-01-01
Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs - most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command-line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool, an X-Windows based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. In conjunction with its existing interfaces and predefined suite of behaviors, future development plans for VIVID will be described. These include incorporation of dual-user virtual environment enhancements, tool expansion, and additional behaviors.
When product designers use perceptually based color tools
NASA Astrophysics Data System (ADS)
Bender, Walter R.
1998-07-01
Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to give guidance to their selection of seasonal palettes for use in production of the private-label merchandise of a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.
When product designers use perceptually based color tools
NASA Astrophysics Data System (ADS)
Bender, Walter R.
2001-01-01
Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to guide their selection of seasonal palettes in the production of the private-label merchandise in a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.
Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana
2016-01-01
The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking of BPSS based on their technical characteristics by employing DEX and qualitative to quantitative (QQ) methodology. Consequently, the decision expert feeds the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible by adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo
2018-02-19
The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.
2018-01-01
Background The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. Methods On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. Results All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. Conclusion These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. PMID:29441756
Force Sensor Based Tool Condition Monitoring Using a Heterogeneous Ensemble Learning Model
Wang, Guofeng; Yang, Yinwei; Li, Zhimeng
2014-01-01
Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which the harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, the homogeneous ensemble learning model and majority voting strategy are also adopted to make a comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability. PMID:25405514
Force sensor based tool condition monitoring using a heterogeneous ensemble learning model.
Wang, Guofeng; Yang, Yinwei; Li, Zhimeng
2014-11-14
Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which the harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, the homogeneous ensemble learning model and majority voting strategy are also adopted to make a comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
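A rough sketch of the stacking idea is given below. It uses scikit-learn stand-ins (an SVM plus a multilayer perceptron in place of the RBF network, with the HMM omitted) and synthetic features, so it illustrates the ensemble strategy rather than reproducing the paper's classifiers or force-signal features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for extracted force features and three tool wear states.
X, y = make_classification(n_samples=300, n_features=12, n_classes=3,
                           n_informative=6, random_state=0)

base_learners = [SVC(probability=True, random_state=0),
                 MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)]

# Out-of-fold class probabilities from each base learner become meta-features
# for the second-level (stacking) classifier.
meta_features = np.hstack([
    cross_val_predict(clf, X, y, cv=5, method="predict_proba")
    for clf in base_learners
])

meta_learner = LogisticRegression(max_iter=1000).fit(meta_features, y)

# Refit base learners on all data for deployment; report training-set accuracy.
for clf in base_learners:
    clf.fit(X, y)
print(meta_learner.score(meta_features, y))
```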
Input variable selection and calibration data selection for storm water quality regression models.
Sun, Siao; Bertrand-Krajewski, Jean-Luc
2013-01-01
Storm water quality models are useful tools in storm water management. Interest has been growing in analyzing existing data for developing models for urban storm water quality evaluations. It is important to select appropriate model inputs when many candidate explanatory variables are available. Model calibration and verification are essential steps in any storm water quality modeling. This study investigates input variable selection and calibration data selection in storm water quality regression models. The two selection problems interact with each other. A procedure is developed to carry out the two selection tasks in sequence. The procedure first selects model input variables using a cross-validation method. An appropriate number of variables are identified as model inputs to ensure that a model is neither overfitted nor underfitted. Based on the model input selection results, calibration data selection is studied. Uncertainty of model performances due to calibration data selection is investigated with a random selection method. A cluster-based approach is applied to enhance model calibration practice based on the principle of selecting representative data for calibration. The comparison between results from the cluster selection method and random selection shows that the former can significantly improve performances of calibrated models. It is found that the information content in calibration data is important in addition to the size of calibration data.
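A minimal sketch of cross-validated input variable selection is shown below, assuming synthetic data and plain least-squares regression; it illustrates the general principle (stop adding variables when cross-validated performance no longer improves) rather than the authors' specific procedure.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Greedy forward selection: keep the candidate variable that most improves
# cross-validated R^2 and stop when no candidate helps any more.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 6))                                   # candidate explanatory variables
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(0, 0.5, 80)     # synthetic "pollutant load"

selected, best_score = [], -np.inf
while len(selected) < X.shape[1]:
    scores = {}
    for j in range(X.shape[1]):
        if j in selected:
            continue
        cols = selected + [j]
        scores[j] = cross_val_score(LinearRegression(), X[:, cols], y, cv=5).mean()
    j_best = max(scores, key=scores.get)
    if scores[j_best] <= best_score:      # no improvement: adding more would overfit
        break
    selected.append(j_best)
    best_score = scores[j_best]

print("selected inputs:", selected, "CV R^2:", round(best_score, 3))
```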
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
Consequences of Base Time for Redundant Signals Experiments
Townsend, James T.; Honey, Christopher
2007-01-01
We report analytical and computational investigations into the effects of base time on the diagnosticity of two popular theoretical tools in the redundant signals literature: (1) the race model inequality and (2) the capacity coefficient. We show analytically and without distributional assumptions that the presence of base time decreases the sensitivity of both of these measures to model violations. We further use simulations to investigate the statistical power of model selection tools based on the race model inequality, both with and without base time. Base time decreases statistical power and biases the race model test toward conservatism. The magnitude of this biasing effect increases as we increase the proportion of total reaction time variance contributed by base time. We marshal empirical evidence to suggest that the proportion of reaction time variance contributed by base time is relatively small, and that the effects of base time on the diagnosticity of our model-selection tools are therefore likely to be minor. However, uncertainty remains concerning the magnitude and even the definition of base time. Experimentalists should continue to be alert to situations in which base time may contribute a large proportion of the total reaction time variance. PMID:18670591
Sutton, Steven C; Hu, Mingxiu
2006-05-05
Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of 5 steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process and typically a few models at most are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criterion [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. Several real data sets representing a broad range of release profiles are used to illustrate the process and to demonstrate the advantages of this automated process over the traditional approach. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the model selection criterion AIC generally selected the best model. We believe that the approach we proposed may be a rapid tool to determine which IVIVC model (if any) is the most applicable.
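The kind of data-based criterion the study evaluates can be illustrated with a small curve-fitting comparison: two candidate release models are fitted to a synthetic dissolution profile and ranked by an AIC computed from residual sums of squares. The data, models and constants are placeholders, not the authors' simulation program.

```python
import numpy as np
from scipy.optimize import curve_fit

# Made-up dissolution profile: fraction of drug released versus time (hours).
t = np.array([0.5, 1, 2, 4, 6, 8, 12])
frac = np.array([0.12, 0.22, 0.40, 0.63, 0.76, 0.85, 0.94])

def first_order(t, k):                 # 1 - exp(-k t)
    return 1.0 - np.exp(-k * t)

def weibull(t, a, b):                  # 1 - exp(-(t/a)^b)
    return 1.0 - np.exp(-(t / a) ** b)

def aic(model, params, t, y):
    # AIC from residual sum of squares: n*ln(RSS/n) + 2k (least-squares form)
    rss = np.sum((y - model(t, *params)) ** 2)
    n, k = len(y), len(params)
    return n * np.log(rss / n) + 2 * k

p1, _ = curve_fit(first_order, t, frac, p0=[0.2])
p2, _ = curve_fit(weibull, t, frac, p0=[3.0, 1.0])
print("first-order AIC:", round(aic(first_order, p1, t, frac), 2))
print("Weibull AIC:    ", round(aic(weibull, p2, t, frac), 2))
```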
Optimization and Simulation of SLM Process for High Density H13 Tool Steel Parts
NASA Astrophysics Data System (ADS)
Laakso, Petri; Riipinen, Tuomas; Laukkanen, Anssi; Andersson, Tom; Jokinen, Antero; Revuelta, Alejandro; Ruusuvuori, Kimmo
This paper demonstrates the successful printing and optimization of processing parameters of high-strength H13 tool steel by Selective Laser Melting (SLM). D-Optimal Design of Experiments (DOE) approach is used for parameter optimization of laser power, scanning speed and hatch width. With 50 test samples (1×1×1cm) we establish parameter windows for these three parameters in relation to part density. The calculated numerical model is found to be in good agreement with the density data obtained from the samples using image analysis. A thermomechanical finite element simulation model is constructed of the SLM process and validated by comparing the calculated densities retrieved from the model with the experimentally determined densities. With the simulation tool one can explore the effect of different parameters on density before making any printed samples. Establishing a parameter window provides the user with freedom for parameter selection such as choosing parameters that result in fastest print speed.
CalFitter: a web server for analysis of protein thermal denaturation data.
Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri
2018-05-14
Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely-used experimental techniques, such as calorimetry or spectroscopy, critically depend on a proper data analysis. Currently, there are only separate data analysis tools available for each type of experiment with a limited model selection. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. CalFitter web server is available free at https://loschmidt.chemi.muni.cz/calfitter/.
O'Donnell, Michael S.; Aldridge, Cameron L.; Doherty, Kevin E.; Fedy, Bradley C.
2015-01-01
We deliver all products described herein as online geographic information system data for visualization and downloading. We outline the data properties for each model and their data inputs, describe the process of selecting appropriate data products for multifarious applications, describe all data products and software, provide newly derived model composites, and discuss how land managers may use the models to inform future sage-grouse studies and potentially refine conservation efforts. The models, software tools, and associated opportunities for novel applications of these products should provide a suite of additional, but not exclusive, tools for assessing Wyoming Greater Sage-grouse habitats, which land managers, conservationists, and scientists can apply to myriad applications.
Granato, Gregory E.; Jones, Susan C.
2014-01-01
In cooperation with FHWA, the U.S. Geological Survey developed the stochastic empirical loading and dilution model (SELDM) to supersede the 1990 FHWA runoff quality model. The SELDM tool is designed to transform disparate and complex scientific data into meaningful information about the adverse risks of runoff on receiving waters, the potential need for mitigation measures, and the potential effectiveness of such measures for reducing such risks. The SELDM tool is easy to use because much of the information and data needed to run it are embedded in the model and obtained by defining the site location and five simple basin properties. Information and data from thousands of sites across the country were compiled to facilitate the use of the SELDM tool. A case study illustrates how to use the SELDM tool for conducting the types of sensitivity analyses needed to properly assess water quality risks. For example, the use of deterministic values to model upstream stormflows instead of representative variations in prestorm flow and runoff may substantially overestimate the proportion of highway runoff in downstream flows. Also, the risks for total phosphorus excursions are substantially affected by the selected criteria and the modeling methods used. For example, if a single deterministic concentration is used rather than a stochastic population of values to model upstream concentrations, then the percentage of water quality excursions in the downstream receiving waters may depend entirely on the selected upstream concentration.
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model
2016-01-01
The omnipresent need for optimisation requires constant improvements of companies’ business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and “what-if” scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking of BPSS based on their technical characteristics by employing DEX and qualitative to quantitative (QQ) methodology. Consequently, the decision expert feeds the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible by adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694
2009-04-01
Capabilities. Construction Engineering Research Laboratory. Kathy L. Simunich, Timothy K. Perkins, David M. Bailey, David Brown, and ... inversion height in convective condition is estimated with a one-dimensional model of the atmospheric boundary layer based on the Driedonks slab model ... tool and its capabilities. Installation geospatial data, in CAD format, were obtained for select buildings, roads, and topographic features in
Soós, Reka; Whiteman, Andrew D; Wilson, David C; Briciu, Cosmin; Nürnberger, Sofia; Oelz, Barbara; Gunsilius, Ellen; Schwehn, Ekkehard
2017-08-01
This is the second of two papers reporting the results of a major study considering 'operator models' for municipal solid waste management (MSWM) in emerging and developing countries. Part A documents the evidence base, while Part B presents a four-step decision support system for selecting an appropriate operator model in a particular local situation. Step 1 focuses on understanding local problems and framework conditions; Step 2 on formulating and prioritising local objectives; and Step 3 on assessing capacities and conditions, and thus identifying strengths and weaknesses, which underpin selection of the operator model. Step 4A addresses three generic questions, including public versus private operation, inter-municipal co-operation and integration of services. For steps 1-4A, checklists have been developed as decision support tools. Step 4B helps choose locally appropriate models from an evidence-based set of 42 common operator models ( coms); decision support tools here are a detailed catalogue of the coms, setting out advantages and disadvantages of each, and a decision-making flowchart. The decision-making process is iterative, repeating steps 2-4 as required. The advantages of a more formal process include avoiding pre-selection of a particular com known to and favoured by one decision maker, and also its assistance in identifying the possible weaknesses and aspects to consider in the selection and design of operator models. To make the best of whichever operator models are selected, key issues which need to be addressed include the capacity of the public authority as 'client', management in general and financial management in particular.
Agent-based modeling as a tool for program design and evaluation.
Lawlor, Jennifer A; McGirr, Sara
2017-12-01
Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
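As a flavor of what such a model can look like in code, the sketch below implements a deliberately tiny agent-based simulation of program uptake; the scenario, parameters and adoption rule are hypothetical and are not drawn from the examples cited in the article.

```python
import random

# Toy agent-based model: agents adopt a program behavior with a probability that
# rises with the share of adopters in the population, letting an evaluator explore
# how uptake might respond to changes in the encouragement (base) rate.
random.seed(1)

class Agent:
    def __init__(self):
        self.adopted = False

def simulate(n_agents=200, steps=50, base_rate=0.01, peer_weight=0.15):
    agents = [Agent() for _ in range(n_agents)]
    agents[0].adopted = True                      # seed the intervention with one adopter
    history = []
    for _ in range(steps):
        share = sum(a.adopted for a in agents) / n_agents
        for a in agents:
            if not a.adopted and random.random() < base_rate + peer_weight * share:
                a.adopted = True
        history.append(share)
    return history

print("final adoption share:", round(simulate()[-1], 2))
```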
Kashani-Amin, Elaheh; Tabatabaei-Malazy, Ozra; Sakhteman, Amirhossein; Larijani, Bagher; Ebrahim-Habibi, Azadeh
2018-02-27
Prediction of proteins' secondary structure is one of the major steps in the generation of homology models. These models provide structural information which is used to design suitable ligands for potential medicinal targets. However, selecting a proper tool among multiple secondary structure prediction (SSP) options is challenging. The current study is an insight into currently favored methods and tools, within various contexts. A systematic review was performed for comprehensive access to recent (2013-2016) studies which used or recommended protein SSP tools. Three databases, Web of Science, PubMed and Scopus, were systematically searched and 99 out of 209 studies were finally found eligible to extract data. Four categories of applications for 59 retrieved SSP tools were: (I) prediction of structural features of a given sequence, (II) evaluation of a method, (III) providing input for a new SSP method and (IV) integrating a SSP tool as a component for a program. PSIPRED was found to be the most popular tool in all four categories. JPred and tools utilizing the PHD (Profile network from HeiDelberg) method occupied second and third places of popularity in categories I and II. JPred was found only in the first two categories, while PHD was present in three fields. This study provides a comprehensive insight into the recent usage of SSP tools, which could be helpful for selecting a proper tool. Copyright © Bentham Science Publishers.
Mysara, Mohamed; Elhefnawi, Mahmoud; Garibaldi, Jonathan M
2012-06-01
The investigation of small interfering RNA (siRNA) and its posttranscriptional gene-regulation has become an extremely important research topic, both for fundamental reasons and for potential longer-term therapeutic benefits. Several factors affect the functionality of siRNA including positional preferences, target accessibility and other thermodynamic features. State-of-the-art tools aim to optimize the selection of target siRNAs by identifying those that may have high experimental inhibition. Such tools implement artificial neural network models, such as Biopredsi and ThermoComposition21, and linear regression models, such as DSIR, i-Score and Scales, among others. However, all these models have limitations in performance. In this work, a new neural-network-trained siRNA scoring/efficacy prediction model was developed by combining two existing scoring algorithms (ThermoComposition21 and i-Score), together with the whole stacking energy (ΔG), in a multi-layer artificial neural network. These three parameters were chosen after a comparative combinatorial study between five well known tools. Our developed model, 'MysiRNA', was trained on 2431 siRNA records and tested using three further datasets. MysiRNA was compared with 11 alternative existing scoring tools in an evaluation study assessing predicted against experimental siRNA efficiency, where it achieved the highest performance both in terms of correlation coefficient (R(2)=0.600) and receiver operating characteristics analysis (AUC=0.808), improving the prediction accuracy by up to 18% with respect to sensitivity and specificity of the best available tools. MysiRNA is a novel, freely accessible model capable of predicting siRNA inhibition efficiency with improved specificity and sensitivity. This multiclassifier approach could help improve the performance of prediction in several bioinformatics areas. The MysiRNA model, part of the MysiRNA-Designer package [1], is expected to play a key role in siRNA selection and evaluation. Copyright © 2012 Elsevier Inc. All rights reserved.
DOT National Transportation Integrated Search
2014-01-01
Historically, project-level decisions for the selection of highway features to promote safety were : based on either engineering judgment or adherence to accepted national guidance. These tools have allowed : highway designers to produce facilities t...
DOT National Transportation Integrated Search
2014-01-01
Historically, project-level decisions for the selection of highway features to promote safety were based on either engineering judgment or adherence to accepted national guidance. These tools have allowed highway designers to produce facilities that ...
Selection of higher order regression models in the analysis of multi-factorial transcription data.
Prazeres da Costa, Olivia; Hoffman, Arthur; Rey, Johannes W; Mansmann, Ulrich; Buch, Thorsten; Tresch, Achim
2014-01-01
Many studies examine gene expression data that has been obtained under the influence of multiple factors, such as genetic background, environmental conditions, or exposure to diseases. The interplay of multiple factors may lead to effect modification and confounding. Higher order linear regression models can account for these effects. We present a new methodology for linear model selection and apply it to microarray data of bone marrow-derived macrophages. This experiment investigates the influence of three variable factors: the genetic background of the mice from which the macrophages were obtained, Yersinia enterocolitica infection (two strains, and a mock control), and treatment/non-treatment with interferon-γ. We set up four different linear regression models in a hierarchical order. We introduce the eruption plot as a new practical tool for model selection complementary to global testing. It visually compares the size and significance of effect estimates between two nested models. Using this methodology we were able to select the most appropriate model by keeping only relevant factors showing additional explanatory power. Application to experimental data allowed us to qualify the interaction of factors as either neutral (no interaction), alleviating (co-occurring effects are weaker than expected from the single effects), or aggravating (stronger than expected). We find a biologically meaningful gene cluster of putative C2TA target genes that appear to be co-regulated with MHC class II genes. We introduced the eruption plot as a tool for visual model comparison to identify relevant higher order interactions in the analysis of expression data obtained under the influence of multiple factors. We conclude that model selection in higher order linear regression models should generally be performed for the analysis of multi-factorial microarray data.
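To make the idea of comparing nested higher-order regression models concrete, the sketch below fits an additive model and an interaction model to simulated two-factor expression data and compares them with an F-test. The data, factor names and effect size are invented; this is not the authors' pipeline or their eruption-plot method.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated expression of one gene under two factors (genotype, infection).
rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "genotype":  rng.choice(["wt", "ko"], n),
    "infection": rng.choice(["mock", "yersinia"], n),
})
# Build in an "aggravating" interaction: the combined effect exceeds the single effects.
effect = ((df.genotype == "ko") & (df.infection == "yersinia")) * 1.2
df["expr"] = rng.normal(0, 0.5, n) + effect

additive    = smf.ols("expr ~ genotype + infection", data=df).fit()
interaction = smf.ols("expr ~ genotype * infection", data=df).fit()

# F-test between the nested models: a small p-value favours the higher-order model.
print(anova_lm(additive, interaction))
```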
NASA Astrophysics Data System (ADS)
Khan, Akhtar; Maity, Kalipada
2018-03-01
This paper explores some of the vital machinability characteristics of commercially pure titanium (CP-Ti) grade 2. Experiments were conducted based on Taguchi’s L9 orthogonal array. The selected material was machined on a heavy duty lathe (Model: HMT NH26) using uncoated carbide inserts in a dry cutting environment. The selected inserts were designated by ISO as SNMG 120408 (Model: K313) and manufactured by Kennametal. These inserts were rigidly mounted on a right-handed tool holder PSBNR 2020K12. Cutting speed, feed rate and depth of cut were selected as the three input variables, whereas tool wear (VBc) and surface roughness (Ra) were the responses of primary interest. In order to confirm an appreciable machinability of the work part, an optimal parametric combination was attained with the help of the grey relational analysis (GRA) approach. Finally, a mathematical model was developed to exhibit the accuracy and acceptability of the proposed methodology using multiple regression equations. The results indicated that the suggested model is capable of predicting the overall grey relational grade within an acceptable range.
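The grey relational grade computation at the core of GRA can be sketched as below; the response values, the smaller-is-better assumption for both responses and the distinguishing coefficient of 0.5 are illustrative stand-ins, not the paper's measurements.

```python
import numpy as np

# Illustrative grey relational analysis: rows are L9 runs, columns are the two
# smaller-is-better responses [tool wear VBc, surface roughness Ra].
responses = np.array([
    [0.21, 1.8], [0.18, 1.5], [0.25, 2.1],
    [0.15, 1.2], [0.19, 1.6], [0.23, 1.9],
    [0.14, 1.1], [0.17, 1.4], [0.22, 2.0],
])

# Smaller-is-better normalisation to [0, 1].
norm = (responses.max(axis=0) - responses) / (responses.max(axis=0) - responses.min(axis=0))

delta = 1.0 - norm                       # deviation from the ideal sequence
zeta = 0.5                               # distinguishing coefficient (common choice)
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

grade = grc.mean(axis=1)                 # grey relational grade per experimental run
print("best run:", int(grade.argmax()) + 1, "grade:", round(grade.max(), 3))
```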
Variation simulation for compliant sheet metal assemblies with applications
NASA Astrophysics Data System (ADS)
Long, Yufeng
Sheet metals are widely used in discrete products, such as automobiles, aircraft, furniture and electronics appliances, due to their good manufacturability and low cost. A typical automotive body assembly consists of more than 300 parts welded together in more than 200 assembly fixture stations. Such an assembly system is usually quite complex, and takes a long time to develop. As the automotive customer demands products of increasing quality in a shorter time, engineers in automotive industry turn to computer-aided engineering (CAE) tools for help. Computers are an invaluable resource for engineers, not only to simplify and automate the design process, but also to share design specifications with manufacturing groups so that production systems can be tooled up quickly and efficiently. Therefore, it is beneficial to develop computerized simulation and evaluation tools for development of automotive body assembly systems. It is a well-known fact that assembly architectures (joints, fixtures, and assembly lines) have a profound impact on dimensional quality of compliant sheet metal assemblies. To evaluate sheet metal assembly architectures, a special dimensional analysis tool needs to be developed for predicting dimensional variation of the assembly. Then, the corresponding systematic tools can be established to help engineers select the assembly architectures. In this dissertation, a unified variation model is developed to predict variation in compliant sheet metal assemblies by considering fixture-induced rigid-body motion, deformation and springback. Based on the unified variation model, variation propagation models in multiple assembly stations with various configurations are established. To evaluate the dimensional capability of assembly architectures, quantitative indices are proposed based on the sensitivity matrix, which are independent of the variation level of the process. Examples are given to demonstrate their applications in selecting robust assembly architectures, and some useful guidelines for selection of assembly architectures are summarized. In addition, to enhance the fault diagnosis, a systematic methodology is proposed for selection of measurement configurations. Specifically, principles involved in selecting measurements are generalized first; then, the corresponding quantitative indices are developed to evaluate the measurement configurations, and finally, examples are presented.
Computing Linear Mathematical Models Of Aircraft
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.
1991-01-01
Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides user with powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of aerodynamics of aircraft. Intended for use as software tool to drive linear analysis of stability and design of control laws for aircraft. Capable of both extracting such linearized engine effects as net thrust, torque, and gyroscopic effects, and including these effects in linear model of system. Designed to provide easy selection of state, control, and observation variables used in particular model. Also provides flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.
Kaserer, Teresa; Temml, Veronika; Kutil, Zsofia; Vanek, Tomas; Landa, Premysl; Schuster, Daniela
2015-01-01
Computational methods can be applied in drug development for the identification of novel lead candidates, but also for the prediction of pharmacokinetic properties and potential adverse effects, thereby aiding to prioritize and identify the most promising compounds. In principle, several techniques are available for this purpose; however, which one is the most suitable for a specific research objective still requires further investigation. Within this study, the performance of several programs, representing common virtual screening methods, was compared in a prospective manner. First, we selected top-ranked virtual screening hits from the three methods pharmacophore modeling, shape-based modeling, and docking. For comparison, these hits were then additionally predicted by external pharmacophore- and 2D similarity-based bioactivity profiling tools. Subsequently, the biological activities of the selected hits were assessed in vitro, which allowed for evaluating and comparing the prospective performance of the applied tools. Although all methods performed well, considerable differences were observed concerning hit rates, true positive and true negative hits, and hitlist composition. Our results suggest that a rational selection of the applied method represents a powerful strategy to maximize the success of a research project, tightly linked to its aims. We employed cyclooxygenase as an application example, however, the focus of this study lay on highlighting the differences in the virtual screening tool performances and not in the identification of novel COX-inhibitors. Copyright © 2015 The Authors. Published by Elsevier Masson SAS. All rights reserved.
A Selection of Composites Simulation Practices at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.
2007-01-01
One of the major areas of study at NASA Langley Research Center is the development of technologies that support the use of advanced composite materials in aerospace applications. Amongst the supporting technologies are analysis tools used to simulate the behavior of these materials. This presentation will discuss a number of examples of analysis tools and simulation practices conducted at NASA Langley. The presentation will include examples of damage tolerance analyses for both interlaminar and intralaminar failure modes. Tools for modeling interlaminar failure modes include fracture mechanics and cohesive methods, whilst tools for modeling intralaminar failure involve the development of various progressive failure analyses. Other examples of analyses developed at NASA Langley include a thermo-mechanical model of an orthotropic material and the simulation of delamination growth in z-pin reinforced laminates.
2014-01-01
computational and empirical dosimetric tools [31]. For the computational dosimetry, we employed finite-difference time-domain (FDTD) modeling techniques to ... temperature-time data collected for a well exposed to THz radiation using finite-difference time-domain (FDTD) modeling techniques and thermocouples ... Alteration in the expression of such genes underscores the signif...
77 FR 13607 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-07
... Transformation Grants: Use of System Dynamic Modeling and Economic Analysis in Select Communities--New--National... community interventions. Using a system dynamics approach, CDC also plans to conduct simulation modeling... the development of analytic tools for system dynamics modeling under more limited conditions. The...
User Guide for the Anvil Threat Corridor Forecast Tool V2.4 for AWIPS
NASA Technical Reports Server (NTRS)
Barett, Joe H., III; Bauman, William H., III
2008-01-01
The Anvil Tool GUI allows users to select a Data Type, toggle the map refresh on/off, place labels, and choose the Profiler Type (source of the KSC 50 MHz profiler data), the Date-Time of the data, the Center of Plot, and the Station (location of the RAOB or 50 MHz profiler). If the Data Type is Models, the user selects a Fcst Hour (forecast hour) instead of Station. There are menus for User Profiles, Circle Label Options, and Frame Label Options. Labels can be placed near the center circle of the plot and/or at a specified distance and direction from the center of the circle (Center of Plot). The default selection for the map refresh is "ON". When the user creates a new Anvil Tool map with Refresh Map "ON", the plot is automatically displayed in the AWIPS frame. If another Anvil Tool map is already displayed and the user does not change the existing map number shown at the bottom of the GUI, the new Anvil Tool map will overwrite the old one. If the user turns the Refresh Map "OFF", the new Anvil Tool map is created but not automatically displayed. The user can still display the Anvil Tool map through the Maps dropdown menu, as shown in Figure 4.
Modeling and Analysis of Space Based Transceivers
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben
2005-01-01
This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.
Modeling and Analysis of Space Based Transceivers
NASA Technical Reports Server (NTRS)
Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.
2007-01-01
This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.
NASA Astrophysics Data System (ADS)
Kant Garg, Girish; Garg, Suman; Sangwan, K. S.
2018-04-01
The manufacturing sector has a huge energy demand, and the machine tools used in this sector have very low energy efficiency. Selection of the optimum machining parameters for machine tools is significant for energy saving and for reducing environmental emissions. In this work an empirical model is developed to minimize the power consumption using response surface methodology. The experiments are performed on a lathe machine tool during the turning of AISI 6061 Aluminum with coated tungsten inserts. The relationship between the power consumption and machining parameters is adequately modeled. This model is used for the formulation of a minimum power consumption criterion as a function of optimal machining parameters using the desirability function approach. The influence of machining parameters on the energy consumption has been determined using analysis of variance. The developed empirical model is validated using confirmation experiments. The results indicate that the developed model is effective and has potential to be adopted by the industry for minimum power consumption of machine tools.
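As a hedged illustration of the general RSM idea (not the paper's fitted model or data), the sketch below fits a second-order polynomial surface for power consumption over cutting speed, feed and depth of cut, and then searches it for a low-power setting; all coefficients and ranges are synthetic.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic experiments: columns are [cutting speed, feed rate, depth of cut].
rng = np.random.default_rng(0)
X = rng.uniform([60, 0.05, 0.5], [180, 0.25, 2.0], size=(27, 3))
power = (0.9 * X[:, 0] + 300 * X[:, 1] + 45 * X[:, 2]
         + 0.8 * X[:, 0] * X[:, 2] + rng.normal(0, 5, 27))       # synthetic response (W)

# Second-order response surface: quadratic and interaction terms plus a linear fit.
rsm = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
rsm.fit(X, power)

# Crude search over the parameter window for the lowest predicted power.
candidates = rng.uniform([60, 0.05, 0.5], [180, 0.25, 2.0], size=(500, 3))
best = candidates[rsm.predict(candidates).argmin()]
print("low-power setting [speed, feed, depth]:", np.round(best, 3))
```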
ASTEC and MODEL: Controls software development at Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.
1993-01-01
The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at the Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC has been under development for the last three years and is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN. An upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 90's and how it relates to ASTEC.
RCrane: semi-automated RNA model building.
Keating, Kevin S; Pyle, Anna Marie
2012-08-01
RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.
NASA Astrophysics Data System (ADS)
Maringanti, Chetan; Chaubey, Indrajeet; Popp, Jennie
2009-06-01
Best management practices (BMPs) are effective in reducing the transport of agricultural nonpoint source pollutants to receiving water bodies. However, selection of BMPs for placement in a watershed requires optimization of the available resources to obtain maximum possible pollution reduction. In this study, an optimization methodology is developed to select and place BMPs in a watershed to provide solutions that are both economically and ecologically effective. This novel approach develops and utilizes a BMP tool, a database that stores the pollution reduction and cost information of different BMPs under consideration. The BMP tool replaces the dynamic linkage of the distributed parameter watershed model during optimization and therefore reduces the computation time considerably. Total pollutant load from the watershed, and net cost increase from the baseline, were the two objective functions minimized during the optimization process. The optimization model, consisting of a multiobjective genetic algorithm (NSGA-II) in combination with a watershed simulation tool (Soil and Water Assessment Tool (SWAT)), was developed and tested for nonpoint source pollution control in the L'Anguille River watershed located in eastern Arkansas. The optimized solutions provided a trade-off between the two objective functions for sediment, phosphorus, and nitrogen reduction. The results indicated that buffer strips were very effective in controlling the nonpoint source pollutants from leaving the croplands. The optimized BMP plans resulted in potential reductions of 33%, 32%, and 13% in sediment, phosphorus, and nitrogen loads, respectively, from the watershed.
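The trade-off surface that the optimization reports can be illustrated independently of SWAT with a minimal Pareto filter over hypothetical BMP plans, as sketched below; the plan scores are random placeholders and the snippet omits the genetic-algorithm machinery (selection, crossover, mutation) of NSGA-II.

```python
import random

# Hypothetical candidate BMP plans scored on two minimised objectives:
# (net cost increase, remaining pollutant load). Keep only non-dominated plans.
random.seed(0)
plans = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(50)]

def dominates(a, b):
    # a dominates b if it is no worse in both objectives and differs somewhere
    return a[0] <= b[0] and a[1] <= b[1] and a != b

pareto_front = [p for p in plans
                if not any(dominates(q, p) for q in plans)]

# On the front, a cheaper plan always leaves a higher pollutant load, and vice versa.
for cost, load in sorted(pareto_front):
    print(f"cost {cost:5.1f}  pollutant load {load:5.1f}")
```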
Use of Cost-Utility Decision Models in Business Education.
ERIC Educational Resources Information Center
Lewis, Darrell R.
1989-01-01
Explains how cost-utility analysis can be applied to the selection of curriculum and instructional methods. Describes the use of multiattribute utility models of decision making as a tool for more informed judgment in educational administration. (SK)
MISFITS: evaluating the goodness of fit between a phylogenetic model and an alignment.
Nguyen, Minh Anh Thi; Klaere, Steffen; von Haeseler, Arndt
2011-01-01
As models of sequence evolution become more and more complicated, many criteria for model selection have been proposed, and tools are available to select the best model for an alignment under a particular criterion. However, in many instances the selected model fails to explain the data adequately as reflected by large deviations between observed pattern frequencies and the corresponding expectation. We present MISFITS, an approach to evaluate the goodness of fit (http://www.cibiv.at/software/misfits). MISFITS introduces a minimum number of "extra substitutions" on the inferred tree to provide a biologically motivated explanation why the alignment may deviate from expectation. These extra substitutions plus the evolutionary model then fully explain the alignment. We illustrate the method on several examples and then give a survey about the goodness of fit of the selected models to the alignments in the PANDIT database.
Model Uncertainty and Bayesian Model Averaged Benchmark Dose Estimation for Continuous Data
The benchmark dose (BMD) approach has gained acceptance as a valuable risk assessment tool, but risk assessors still face significant challenges associated with selecting an appropriate BMD/BMDL estimate from the results of a set of acceptable dose-response models. Current approa...
Overview of SDCM - The Spacecraft Design and Cost Model
NASA Technical Reports Server (NTRS)
Ferebee, Melvin J.; Farmer, Jeffery T.; Andersen, Gregory C.; Flamm, Jeffery D.; Badi, Deborah M.
1988-01-01
The Spacecraft Design and Cost Model (SDCM) is a computer-aided design and analysis tool for synthesizing spacecraft configurations, integrating their subsystems, and generating information concerning on-orbit servicing and costs. SDCM uses a bottom-up method in which the cost and performance parameters for subsystem components are first calculated; the model then sums the contributions from individual components in order to obtain an estimate of sizes and costs for each candidate configuration within a selected spacecraft system. An optimum spacecraft configuration can then be selected.
Measures and limits of models of fixation selection.
Wilming, Niklas; Betz, Torsten; Kietzmann, Tim C; König, Peter
2011-01-01
Models of fixation selection are a central tool in the quest to understand how the human mind selects relevant information. Using this tool in the evaluation of competing claims often requires comparing different models' relative performance in predicting eye movements. However, studies use a wide variety of performance measures with markedly different properties, which makes a comparison difficult. We make three main contributions to this line of research: First we argue for a set of desirable properties, review commonly used measures, and conclude that no single measure unites all desirable properties. However, the area under the ROC curve (a classification measure) and the KL-divergence (a distance measure of probability distributions) combine many desirable properties and allow a meaningful comparison of critical model performance. We give an analytical proof of the linearity of the ROC measure with respect to averaging over subjects and demonstrate an appropriate correction of entropy-based measures like KL-divergence for small sample sizes in the context of eye-tracking data. Second, we provide a lower bound and an upper bound of these measures, based on image-independent properties of fixation data and between-subject consistency, respectively. Based on these bounds it is possible to give a reference frame to judge the predictive power of a model of fixation selection. We provide open-source python code to compute the reference frame. Third, we show that the upper, between-subject consistency bound holds only for models that predict averages of subject populations. Departing from this, we show that incorporating subject-specific viewing behavior can generate predictions which surpass that upper bound. Taken together, these findings lay out the required information that allow a well-founded judgment of the quality of any model of fixation selection and should therefore be reported when a new model is introduced.
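For illustration, the two recommended measures can be computed on toy data as sketched below; the saliency map, fixation map and smoothing constant are invented, and the snippet is not the authors' published code or their proposed small-sample correction.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# AUC treats fixated vs. non-fixated locations as a classification problem scored
# by the model's saliency values; KL-divergence compares the model's fixation
# density with the empirical one, with a small epsilon guarding the logarithm.
rng = np.random.default_rng(0)
saliency = rng.random((20, 20))                              # model prediction per location
fixated = (saliency + rng.normal(0, 0.3, (20, 20))) > 0.8    # toy ground-truth fixations

auc = roc_auc_score(fixated.ravel().astype(int), saliency.ravel())

def kl_divergence(p, q, eps=1e-12):
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

empirical_density = fixated.astype(float) + 1e-3             # crudely smoothed empirical map
print("AUC:", round(auc, 3),
      "KL:", round(kl_divergence(empirical_density, saliency), 3))
```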
Visual analytics in cheminformatics: user-supervised descriptor selection for QSAR methods.
Martínez, María Jimena; Ponzoni, Ignacio; Díaz, Mónica F; Vazquez, Gustavo E; Soto, Axel J
2015-01-01
The design of QSAR/QSPR models is a challenging problem, where the selection of the most relevant descriptors constitutes a key step of the process. Several feature selection methods that address this step concentrate on statistical associations among descriptors and target properties, whereas chemical knowledge is left out of the analysis. For this reason, the interpretability and generality of the QSAR/QSPR models obtained by these feature selection methods are drastically affected. Therefore, an approach for integrating domain experts' knowledge in the selection process is needed to increase confidence in the final set of descriptors. In this paper, a software tool named Visual and Interactive DEscriptor ANalysis (VIDEAN), which combines statistical methods with interactive visualizations for choosing a set of descriptors for predicting a target property, is proposed. Domain expertise can be added to the feature selection process by means of an interactive visual exploration of data, and aided by statistical tools and metrics based on information theory. Coordinated visual representations are presented for capturing different relationships and interactions among descriptors, target properties and candidate subsets of descriptors. The competencies of the proposed software were assessed through different scenarios. These scenarios reveal how an expert can use this tool to choose one subset of descriptors from a group of candidate subsets or how to modify existing descriptor subsets and even incorporate new descriptors according to his or her own knowledge of the target property. The reported experiences showed the suitability of our software for selecting sets of descriptors with low cardinality, high interpretability, low redundancy and high statistical performance in a visual exploratory way. Therefore, it is possible to conclude that the resulting tool allows the integration of a chemist's expertise in the descriptor selection process with a low cognitive effort, in contrast with the alternative of using an ad-hoc manual analysis of the selected descriptors. Graphical abstract: VIDEAN allows the visual analysis of candidate subsets of descriptors for QSAR/QSPR. In the two panels on the top, users can interactively explore numerical correlations as well as co-occurrences in the candidate subsets through two interactive graphs.
Kling, Teresia; Johansson, Patrik; Sanchez, José; Marinescu, Voichita D.; Jörnsten, Rebecka; Nelander, Sven
2015-01-01
Statistical network modeling techniques are increasingly important tools to analyze cancer genomics data. However, current tools and resources are not designed to work across multiple diagnoses and technical platforms, thus limiting their applicability to comprehensive pan-cancer datasets such as The Cancer Genome Atlas (TCGA). To address this, we describe a new data driven modeling method, based on generalized Sparse Inverse Covariance Selection (SICS). The method integrates genetic, epigenetic and transcriptional data from multiple cancers, to define links that are present in multiple cancers, a subset of cancers, or a single cancer. It is shown to be statistically robust and effective at detecting direct pathway links in data from TCGA. To facilitate interpretation of the results, we introduce a publicly accessible tool (cancerlandscapes.org), in which the derived networks are explored as interactive web content, linked to several pathway and pharmacological databases. To evaluate the performance of the method, we constructed a model for eight TCGA cancers, using data from 3900 patients. The model rediscovered known mechanisms and contained interesting predictions. Possible applications include prediction of regulatory relationships, comparison of network modules across multiple forms of cancer and identification of drug targets. PMID:25953855
An Excel[TM] Model of a Radioactive Series
ERIC Educational Resources Information Center
Andrews, D. G. H.
2009-01-01
A computer model of the decay of a radioactive series, written in Visual Basic in Excel[TM], is presented. The model is based on the random selection of cells in an array. The results compare well with the theoretical equations. The model is a useful tool in teaching this aspect of radioactivity. (Contains 4 figures.)
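A minimal Python analogue of the spreadsheet approach, assuming illustrative per-step decay probabilities rather than the constants of any particular series:

```python
# Each "cell" is an atom of the parent species; at every step a random draw
# decides whether it decays to the daughter, which may in turn decay.
# Decay probabilities are hypothetical, not tied to a specific series.
import random

N = 10_000                                   # atoms of the parent
lam_parent, lam_daughter = 0.02, 0.05        # per-step decay probabilities (assumed)
parent, daughter, stable = N, 0, 0

for step in range(200):
    decayed_p = sum(random.random() < lam_parent for _ in range(parent))
    decayed_d = sum(random.random() < lam_daughter for _ in range(daughter))
    parent -= decayed_p
    daughter += decayed_p - decayed_d
    stable += decayed_d
    if step % 50 == 0:
        print(step, parent, daughter, stable)
```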
Mueller, Martina; Wagner, Carol L; Annibale, David J; Knapp, Rebecca G; Hulsey, Thomas C; Almeida, Jonas S
2006-03-01
Approximately 30% of intubated preterm infants with respiratory distress syndrome (RDS) will fail attempted extubation, requiring reintubation and mechanical ventilation. Although ventilator technology and monitoring of premature infants have improved over time, optimal extubation remains challenging. Furthermore, extubation decisions for premature infants require complex informational processing, techniques implicitly learned through clinical practice. Computer-aided decision-support tools would benefit inexperienced clinicians, especially during peak neonatal intensive care unit (NICU) census. A five-step procedure was developed to identify predictive variables. Clinical expert (CE) thought processes comprised one model. Variables from that model were used to develop two mathematical models for the decision-support tool: an artificial neural network (ANN) and a multivariate logistic regression model (MLR). The ranking of the variables in the three models was compared using the Wilcoxon Signed Rank Test. The best performing model was used in a web-based decision-support tool with a user interface implemented in Hypertext Markup Language (HTML) and the mathematical model employing the ANN. CEs identified 51 potentially predictive variables for extubation decisions for an infant on mechanical ventilation. Comparisons of the three models showed a significant difference between the ANN and the CE (p = 0.0006). Of the original 51 potentially predictive variables, the 13 most predictive variables were used to develop an ANN as a web-based decision tool. The ANN processes user-provided data and returns a prediction score between 0 and 1 and a novelty index. The user then selects the most appropriate threshold for categorizing the prediction as a success or failure. Furthermore, the novelty index, indicating the similarity of the test case to the training cases, allows the user to assess the confidence level of the prediction with regard to how much the new data differ from the data originally used for the development of the prediction tool. State-of-the-art machine-learning methods can be employed for the development of sophisticated tools to aid clinicians' decisions. We identified numerous variables considered relevant for extubation decisions for mechanically ventilated premature infants with RDS. We then developed a web-based decision-support tool for clinicians which can be made widely available and potentially improve patient care worldwide.
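A hedged sketch of the kind of ANN versus logistic regression comparison described, using scikit-learn on synthetic data in place of the 13 clinical variables, which are not reproduced here:

```python
# Illustrative comparison of an ANN and multivariate logistic regression on
# synthetic data standing in for the clinical predictors (hypothetical data).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=13, n_informative=6, random_state=0)

for name, clf in [("MLR", LogisticRegression(max_iter=1000)),
                  ("ANN", MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.2f}")
```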
Moise, Leonard; Gutierrez, Andres; Kibria, Farzana; Martin, Rebecca; Tassone, Ryan; Liu, Rui; Terry, Frances; Martin, Bill; De Groot, Anne S
2015-01-01
Computational vaccine design, also known as computational vaccinology, encompasses epitope mapping, antigen selection and immunogen design using computational tools. The iVAX toolkit is an integrated set of tools that has been in development since 1998 by De Groot and Martin. It comprises a suite of immunoinformatics algorithms for triaging candidate antigens, selecting immunogenic and conserved T cell epitopes, eliminating regulatory T cell epitopes, and optimizing antigens for immunogenicity and protection against disease. iVAX has been applied to vaccine development programs for emerging infectious diseases, cancer antigens and biodefense targets. Several iVAX vaccine design projects have had success in pre-clinical studies in animal models and are progressing toward clinical studies. The toolkit now incorporates a range of immunoinformatics tools for infectious disease and cancer immunotherapy vaccine design. This article will provide a guide to the iVAX approach to computational vaccinology.
Modeling PPP Economic Benefits for Lunar ISRU
NASA Astrophysics Data System (ADS)
Blair, B.
2017-10-01
A new tool is needed for selecting the PPP strategy that could maximize the rate of lunar commercialization by attracting private capital into the development of critical infrastructure and robust capability. A PPP model under development for NASA-ESO will be described.
Model Educational Specifications for Technology in Schools.
ERIC Educational Resources Information Center
Maryland State Dept. of Education, College Park. Office of Administration and Finance.
This description of the Model Edspec, which can be used by itself or in conjunction with the "Format Guide of Educational Specifications," serves as a comprehensive planning tool for the selection and application of technology. The model is designed to assist schools in implementing the facilities development process, thereby making…
NASA Astrophysics Data System (ADS)
Okokpujie, Imhade Princess; Ikumapayi, Omolayo M.; Okonkwo, Ugochukwu C.; Salawu, Enesi Y.; Afolalu, Sunday A.; Dirisu, Joseph O.; Nwoke, Obinna N.; Ajayi, Oluseyi O.
2017-12-01
In modern machining operations, managing tool life is one of the most demanding tasks in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS tools in end milling of aluminium 6061 alloy. Experiments were carried out to investigate tool wear as a function of the machining parameters and to develop a mathematical model using response surface methodology. The machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using a central composite design (CCD) in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of spindle speed 2500 rpm, feed rate 200 mm/min, axial depth of cut 20 mm, and radial depth of cut 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable range for tool wear prediction.
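A minimal sketch of a second-order response-surface fit of tool wear against the four machining parameters, with invented data in place of the paper's CCD measurements:

```python
# Second-order (quadratic) response-surface fit of tool wear versus spindle
# speed N, feed rate f, axial depth a and radial depth r. Data and
# coefficients below are made up for illustration, not the study's results.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.uniform([1000, 100, 5, 0.5], [3000, 300, 25, 2.0], size=(31, 4))   # N, f, a, r
wear = 0.3 - 1e-5 * X[:, 0] + 5e-4 * X[:, 1] + rng.normal(scale=0.01, size=31)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, wear)
print("R^2 on the fitted data:", round(rsm.score(X, wear), 3))
```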
3D FEM Simulation of Flank Wear in Turning
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio
2011-05-01
This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces and power consumption is important to reduce the global process costs. Adhesion, abrasion, erosion, diffusion, corrosion and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, …. In some cases these wear mechanisms are described by analytical models as a function of process variables (temperature, pressure and sliding velocity along the cutting surface). These analytical models are suitable for implementation in FEM codes and can be utilized to simulate tool wear. In the present paper a commercial 3D FEM software has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was improved by means of a suitable subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation goes on. Since, for the considered tool-workpiece material pair, the main wear-generating phenomena are abrasion and diffusion, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and Takeyama and Murata's model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate the tool wear development.
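For orientation, a Usui-type wear-rate law of the form dW/dt = C1 * p * Vs * exp(-C2/T) can be sketched as follows; the constants and local quantities are placeholders, not the calibrated values of the study:

```python
# Usui-type wear-rate law evaluated at one tool surface node and accumulated
# over time increments. C1, C2 and the local pressure, sliding velocity and
# temperature are hypothetical placeholders.
import math

def usui_wear_rate(contact_pressure, sliding_velocity, temperature_K,
                   C1=1e-8, C2=5000.0):
    """Incremental wear rate at a tool surface node."""
    return C1 * contact_pressure * sliding_velocity * math.exp(-C2 / temperature_K)

wear = 0.0
for _ in range(1000):
    wear += usui_wear_rate(8e8, 2.5, 1100.0) * 1e-4    # dt = 1e-4 s
print(f"accumulated flank wear (arbitrary units): {wear:.3e}")
```

In an FEM customization like the one described, a value of this kind would be used to displace the tool surface nodes between simulation steps.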
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling feasible in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
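As a toy illustration of the marginal-likelihood comparison underlying Bayesian model selection (objective 3), the following compares a fixed-mean model against a model with a Gaussian prior on the mean; the data and both models are deliberately trivial:

```python
# Toy Bayesian model comparison by marginal likelihood (evidence), the kind of
# quantity that surrogate-based methods aim to make affordable at scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.normal(loc=0.4, scale=1.0, size=50)

grid = np.linspace(-3, 3, 601)
loglik = np.array([stats.norm(mu, 1.0).logpdf(data).sum() for mu in grid])
prior = stats.norm(0.0, 1.0).pdf(grid)
evidence_m1 = np.sum(np.exp(loglik) * prior) * (grid[1] - grid[0])   # 1-D quadrature
log_m1 = np.log(evidence_m1)                 # M1: mu ~ N(0, 1)
log_m0 = stats.norm(0.0, 1.0).logpdf(data).sum()   # M0: mu fixed at 0
print("log Bayes factor (M1 vs M0):", round(log_m1 - log_m0, 2))
```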
Tools & Resources | Efficient Windows Collaborative
Window Selection Tool Mobile App: use the Window Selection Tool Mobile App for new window selection. LBNL's RESFEN is used for calculating the heating and
NASA Astrophysics Data System (ADS)
Ranatunga, T.
2017-12-01
Modeling the fate and transport of fecal bacteria in a watershed is a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered as the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria source loading and in-stream conditions of a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used in estimating the bacteria source loading from land categories. Major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads from the SELECT model were input to the SWAT model to simulate bacteria transport over land and in-stream. The calibrated SWAT model was then used to estimate in-stream indicator bacteria concentrations for future years based on regional land use, population and household forecasts (up to 2040). Based on the reductions required to meet the water quality standards in-stream, the corresponding required source load reductions were estimated.
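A minimal sketch of the first-order die-off applied to an in-stream bacteria load, the kind of process SWAT resolves after SELECT supplies the source loads; the rate constant, load and flow are hypothetical:

```python
# First-order in-stream die-off of an indicator bacteria load.
# Load, flow and decay rate are invented for illustration.
import math

def instream_concentration(load_cfu_per_day, flow_m3_per_day, k_per_day, days):
    c0 = load_cfu_per_day / flow_m3_per_day          # initial concentration, cfu/m^3
    return c0 * math.exp(-k_per_day * days)          # exponential die-off

print(instream_concentration(1e12, 5e4, 0.5, days=3))
```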
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
Optimization of multi-environment trials for genomic selection based on crop models.
Rincent, R; Kuhn, E; Monod, H; Oury, F-X; Rousset, M; Allard, V; Le Gouis, J
2017-08-01
We propose a statistical criterion to optimize multi-environment trials to predict genotype × environment interactions more efficiently, by combining crop growth models and genomic selection models. Genotype × environment interactions (GEI) are common in plant multi-environment trials (METs). In this context, models developed for genomic selection (GS), which refers to the use of genome-wide information for predicting breeding values of selection candidates, need to be adapted. One promising way to increase prediction accuracy in various environments is to combine ecophysiological and genetic modelling thanks to crop growth models (CGM) incorporating genetic parameters. The efficiency of this approach relies on the quality of the parameter estimates, which depends on the environments composing the MET used for calibration. The objective of this study was to determine a method to optimize the set of environments composing the MET for estimating genetic parameters in this context. A criterion called OptiMET was defined for this purpose, and was evaluated on simulated and real data, with the example of wheat phenology. The MET defined with OptiMET allowed estimating the genetic parameters with lower error, leading to higher QTL detection power and higher prediction accuracies. METs defined with OptiMET were on average more efficient than random METs composed of twice as many environments, in terms of the quality of the parameter estimates. OptiMET is thus a valuable tool to determine optimal experimental conditions to best exploit METs and the phenotyping tools that are currently developed.
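The OptiMET criterion itself is defined in the paper; as a generic illustration of the underlying idea of choosing environments that improve parameter estimation, the sketch below selects the subset of candidate environments maximizing a D-optimality proxy, which is not the paper's criterion:

```python
# Generic environment-subset selection: pick the subset whose design matrix
# maximizes det(X^T X), a D-optimality proxy. Candidate environments and the
# sensitivity matrix are simulated placeholders.
import itertools
import numpy as np

rng = np.random.default_rng(4)
X_env = rng.normal(size=(10, 3))     # 10 candidate environments x 3 genetic parameters

best = max(itertools.combinations(range(10), 4),
           key=lambda s: np.linalg.det(X_env[list(s)].T @ X_env[list(s)]))
print("selected environments:", best)
```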
Perriman, Noelyn; Davis, Deborah
2016-06-01
The objective of this systematic integrative review is to identify, summarise and communicate the findings of research relating to tools that measure maternal satisfaction with continuity of maternity care models. In so doing the most appropriate, reliable and valid tool that can be used to measure maternal satisfaction with continuity of maternity care will be determined. A systematic integrative review of published and unpublished literature was undertaken using selected databases. Research papers were included if they measured maternal satisfaction in a continuity model of maternity care, were published in English after 1999 and if they included (or made available) the instrument used to measure satisfaction. Six hundred and thirty two unique papers were identified and after applying the selection criteria, four papers were included in the review. Three of these originated in Australia and one in Canada. The primary focus of all papers was not on the development of a tool to measure maternal satisfaction but on the comparison of outcomes in different models of care. The instruments developed varied in terms of the degree to which they were tested for validity and reliability. Women's satisfaction with maternity services is an important measure of quality. Most satisfaction surveys in maternity appear to reflect fragmented models of care though continuity of care models are increasing in line with the evidence demonstrating their effectiveness. It is important that robust tools are developed for this context and that there is some consistency in the way this is measured and reported for the purposes of benchmarking and quality improvement. Copyright © 2016 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
Distributed-parameter watershed models are often utilized for evaluating the effectiveness of sediment and nutrient abatement strategies through the traditional {calibrate→ validate→ predict} approach. The applicability of the method is limited due to modeling approximations. In ...
NASA Technical Reports Server (NTRS)
Hall, Laverne
1995-01-01
Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.
Empirical flow parameters : a tool for hydraulic model validity
Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.
2013-01-01
The objectives of this project were (1) To determine and present from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1 and to produce empirical distributions of the various flow parameters to provide a methodology to "check if model results are way off!"; (2) To produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas to provide a secondary way to compare such values to a conventional hydraulic modeling approach. (3.) To present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
The Environmental Geophysics website features geophysical methods, terms and references; forward and inverse geophysical models for download; and a decision support tool to guide geophysical method selection for a variety of environmental applications.
Implementation of Structured Inquiry Based Model Learning toward Students' Understanding of Geometry
ERIC Educational Resources Information Center
Salim, Kalbin; Tiawa, Dayang Hjh
2015-01-01
The purpose of this study is the implementation of a structured inquiry learning model in the instruction of geometry. The study used a quasi-experimental design with two sample classes selected from a population of ten classes using a cluster random sampling technique. The data collection tool consists of a test item…
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
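The frequency ratio, one of the three BSA techniques the tool automates, reduces to a simple ratio of relative frequencies per conditioning-factor class; a sketch with hypothetical class counts:

```python
# Frequency ratio for one factor class: the share of hazard occurrences in the
# class divided by the share of the study area covered by the class.
def frequency_ratio(hazard_pixels_in_class, total_hazard_pixels,
                    class_pixels, total_pixels):
    return ((hazard_pixels_in_class / total_hazard_pixels) /
            (class_pixels / total_pixels))

# Example: a slope class covering 12% of the area but holding 30% of landslides.
print(round(frequency_ratio(300, 1000, 12_000, 100_000), 2))   # -> 2.5
```

A tool like BSM repeats this computation over every class of every conditioning factor and sums the resulting ratios per pixel to build the susceptibility map.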
Developing Instructional Applications at the Secondary Level. The Computer as a Tool.
ERIC Educational Resources Information Center
McManus, Jack; And Others
Case studies are presented for seven Los Angeles area (California) high schools that worked with Pepperdine University in the IBM/ETS (International Business Machines/Educational Testing Service) Model Schools program, a project which provided training for selected secondary school teachers in the use of personal computers and selected software as…
An Analysis of the EPA Report on Pipeline Renewal Decision Making Tools and Approaches
Few DSS are commercially available for technology selection as most utilities make decisions based on in-house and consultant expertise (Matthews et al., 2011). This review presents some of the models proposed over the past 15 years for selecting technologies in the U.S. and wor...
Estimating time since infection in early homogeneous HIV-1 samples using a poisson model
2010-01-01
Background: The occurrence of a genetic bottleneck in HIV sexual or mother-to-infant transmission has been well documented. This results in a majority of new infections being homogeneous, i.e., initiated by a single genetic strain. Early after infection, prior to the onset of the host immune response, the viral population grows exponentially. In this simple setting, an approach for estimating evolutionary and demographic parameters based on comparison of diversity measures is a feasible alternative to the existing Bayesian methods (e.g., BEAST), which are instead based on the simulation of genealogies. Results: We have devised a web tool that analyzes genetic diversity in acutely infected HIV-1 patients by comparing it to a model of neutral growth. More specifically, we consider a homogeneous infection (i.e., initiated by a unique genetic strain) prior to the onset of host-induced selection, where we can assume a random accumulation of mutations. Previously, we have shown that such a model successfully describes about 80% of sexual HIV-1 transmissions provided the samples are drawn early enough in the infection. Violation of the model is an indicator of either heterogeneous infections or the initiation of selection. Conclusions: When the underlying assumptions of our model (homogeneous infection prior to selection and fast exponential growth) are met, we are under a very particular scenario for which we can use a forward approach (instead of backwards in time as provided by coalescent methods). This allows for more computationally efficient methods to derive the time since the most recent common ancestor. Furthermore, the tool performs statistical tests on the Hamming distance frequency distribution, and outputs summary statistics (mean of the best fitting Poisson distribution, goodness of fit p-value, etc). The tool runs within minutes and can readily accommodate the tens of thousands of sequences generated through new ultradeep pyrosequencing technologies. The tool is available on the LANL website. PMID:20973976
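A hedged sketch of the core calculation: fit a Poisson mean to the pairwise Hamming distance distribution and convert it to time since the most recent common ancestor. The mutation rate, generation time, and toy 10-base alignment are placeholders; real alignments span thousands of sites, so the absolute numbers printed here are not meaningful:

```python
# Pairwise Hamming distances -> Poisson mean -> generations since the MRCA.
# mu, gen_days and the toy alignment are assumed values for illustration only.
import itertools
import numpy as np

seqs = ["ACGTACGTAC", "ACGTACGTAT", "ACGAACGTAC", "ACGTACGTAC"]   # toy alignment
dists = [sum(a != b for a, b in zip(s1, s2))
         for s1, s2 in itertools.combinations(seqs, 2)]

lam = np.mean(dists)                       # Poisson MLE of the mean pairwise distance
mu = 2.16e-5                               # mutations per site per generation (assumed)
gen_days = 1.8                             # generation time in days (assumed)
generations = lam / (2 * mu * len(seqs[0]))
print(f"mean Hamming distance {lam:.2f}; ~{generations * gen_days:.0f} days since MRCA")
```

The real tool additionally tests goodness of fit of the Poisson model, which is what flags heterogeneous infections or the onset of selection.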
Model Identification of Integrated ARMA Processes
ERIC Educational Resources Information Center
Stadnytska, Tetiana; Braun, Simone; Werner, Joachim
2008-01-01
This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
Demographic Modelling in Weed Biocontrol
USDA-ARS?s Scientific Manuscript database
Demographic matrix modeling of plant populations can be a powerful tool to identify key life stage transitions that contribute the most to population growth of an invasive plant and hence should be targeted for disruption. Therefore, this approach has the potential to guide the pre-release selection...
Surface roughness model based on force sensors for the prediction of the tool wear.
de Agustina, Beatriz; Rubio, Eva María; Sebastián, Miguel Ángel
2014-04-04
In this study, a methodology has been developed with the objective of evaluating the surface roughness obtained during turning processes by measuring the signals detected by a force sensor under the same cutting conditions. In this way, the surface quality achieved along the process is correlated to several parameters of the cutting forces (thrust forces, feed forces and cutting forces), so the effect that the tool wear causes on the surface roughness is evaluated. In a first step, the best cutting conditions (cutting parameters and radius of tool) for a certain quality surface requirement were found for pieces of UNS A97075. Next, with this selection a model of surface roughness based on the cutting forces was developed for different states of wear that simulate the behaviour of the tool throughout its life. The validation of this model reveals that it was effective for approximately 70% of the surface roughness values obtained.
Selected Tether Applications Cost Model
NASA Technical Reports Server (NTRS)
Keeley, Michael G.
1988-01-01
Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.
Selection tool for foodborne norovirus outbreaks.
Verhoef, Linda P B; Kroneman, Annelies; van Duynhoven, Yvonne; Boshuizen, Hendriek; van Pelt, Wilfrid; Koopmans, Marion
2009-01-01
Detection of pathogens in the food chain is limited mainly to bacteria, and the globalization of the food industry enables international viral foodborne outbreaks to occur. Outbreaks from 2002 through 2006 recorded in a European norovirus surveillance database were investigated for virologic and epidemiologic indicators of food relatedness. The resulting validated multivariate logistic regression model comparing foodborne (n = 224) and person-to-person (n = 654) outbreaks was used to create a practical web-based tool that can be limited to epidemiologic parameters for nongenotyping countries. Non-genogroup-II.4 outbreaks, higher numbers of cases, and outbreaks in restaurants or households characterized foodborne outbreaks (sensitivity = 0.80, specificity = 0.86) and reduced the percentage of outbreaks requiring source-tracing to 31%. The selection tool enabled prospectively focused follow-up. Use of this tool is likely to improve data quality and strain typing in current surveillance systems, which is necessary for identification of potential international foodborne outbreaks.
Jensen, Jacob S; Egebo, Max; Meyer, Anne S
2008-05-28
Fast tannin measurement is receiving increased interest, as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from the spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions which would allow quantification of tannins from the spectra using partial least-squares regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable-size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm(-1)), which were therefore concluded to be particularly important for tannin quantification. The spectral regions identified from the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) as compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.
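A simplified sketch of one backward-elimination step over spectral intervals with PLS, using scikit-learn and synthetic spectra in place of the wine calibration data; interval widths and component count are assumptions:

```python
# One pass of backward interval elimination with PLS: drop the interval whose
# removal most improves cross-validated error. Spectra are simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(60, 100))                                  # 60 samples x 100 wavenumbers
y = X[:, 40:50].sum(axis=1) + rng.normal(scale=0.5, size=60)    # tannin-like response

intervals = [list(range(i, i + 10)) for i in range(0, 100, 10)] # ten 10-point intervals

def cv_rmse(cols):
    score = cross_val_score(PLSRegression(n_components=3), X[:, cols], y,
                            cv=5, scoring="neg_root_mean_squared_error")
    return -score.mean()

kept = list(range(10))
baseline = cv_rmse([c for i in kept for c in intervals[i]])
drops = {i: cv_rmse([c for j in kept if j != i for c in intervals[j]]) for i in kept}
worst = min(drops, key=drops.get)           # interval whose removal helps most
print(f"baseline RMSE {baseline:.3f}; dropping interval {worst} -> {drops[worst]:.3f}")
```

The published method iterates steps like this with intervals of changing size until no removal improves the cross-validated error.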
NASA Astrophysics Data System (ADS)
Granados, Xavier; Sánchez, Àlvar; López-López, Josep
2012-10-01
The development of superconducting applications and superconducting engineering requires the support of consistent tools which can provide models for obtaining a good understanding of the behaviour of the systems and predict novel features. These models aim to compute the behaviour of the superconducting systems, design superconducting devices and systems, and understand and test the behaviour of the superconducting parts. 50 years ago, in 1962, Charles Bean provided the superconducting community with a model efficient enough to allow the computation of the response of a superconductor to external magnetic fields and currents flowing through it in an understandable way: the so-called critical-state model. Since then, in addition to the pioneering critical-state approach, other tools have been devised for designing operative superconducting systems, allowing integration of the superconducting design in nearly standard electromagnetic computer-aided design systems by modelling the superconducting parts with consideration of time-dependent processes. In April 2012, Barcelona hosted the 3rd International Workshop on Numerical Modelling of High Temperature Superconductors (HTS), the third in a series of workshops started in Lausanne in 2010 and followed by Cambridge in 2011. The workshop reflected the state-of-the-art and the new initiatives of HTS modelling, considering mathematical, physical and technological aspects within a wide and interdisciplinary scope. Superconductor Science and Technology is now publishing a selection of papers from the workshop, chosen for their high quality. The selection comprises seven papers covering mathematical, physical and technological topics which contribute to an improvement in the development of procedures, understanding of phenomena and development of applications. We hope that they provide a perspective on the relevance and growth that the modelling of HTS superconductors has achieved in the past 25 years.
Thermomechanical conditions and stresses on the friction stir welding tool
NASA Astrophysics Data System (ADS)
Atthipalli, Gowtam
Friction stir welding has been commercially used as a joining process for aluminum and other soft materials. However, the use of this process in joining hard alloys is still developing, primarily because of the lack of cost-effective, long-lasting tools. Here I have developed numerical models to understand the thermomechanical conditions experienced by the FSW tool and to improve its reusability. A heat transfer and visco-plastic flow model is used to calculate the torque and traverse force on the tool during FSW. The computed values of torque and traverse force are validated using the experimental results for FSW of AA7075, AA2524, AA6061 and Ti-6Al-4V alloys. The computed torque components are used to determine the optimum tool shoulder diameter based on the maximum use of torque and maximum grip of the tool on the plasticized workpiece material. The estimation of the optimum tool shoulder diameter for FSW of AA6061 and AA7075 was verified with experimental results. The computed values of traverse force and torque are used to calculate the maximum shear stress on the tool pin to determine its load-bearing ability. The load-bearing ability calculations are used to explain the failure of an H13 steel tool during welding of AA7075 and of commercially pure tungsten during welding of L80 steel. Artificial neural network (ANN) models are developed to predict the important FSW output parameters as functions of selected input parameters. These ANNs consider tool shoulder radius, pin radius, pin length, welding velocity, tool rotational speed and axial pressure as input parameters. The total torque, sliding torque, sticking torque, peak temperature, traverse force, maximum shear stress and bending stress are considered as the outputs of the ANN models. These output parameters are selected since they define the thermomechanical conditions around the tool during FSW. The developed ANN models are used to understand the effect of various input parameters on the total torque and traverse force during FSW of AA7075 and 1018 mild steel. The ANN models are also used to determine the tool safety factor for a wide range of input parameters. A numerical model is developed to calculate the strain and strain rates along the streamlines during FSW. The strain and strain rate values are calculated for FSW of AA2524. Three simplified models are also developed for quick estimation of output parameters such as the material velocity field, torque and peak temperature. The material velocity fields are computed by adopting an analytical method for calculating velocities for the flow of an incompressible fluid between two discs, where one is rotating and the other is stationary. The peak temperature is estimated based on a non-dimensional correlation with dimensionless heat input. The dimensionless heat input is computed using known welding parameters and material properties. The torque is computed using an analytical function based on the shear strength of the workpiece material. These simplified models are shown to be able to predict these output parameters successfully.
NASA Astrophysics Data System (ADS)
Pound, M. W.; Wolfire, M. G.; Amarnath, N. S.
2003-12-01
The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS, located at http://dustem.astro.umd.edu) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 5 years. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. We are adding new functionality to DIRT to support new missions like SIRTF and SOFIA. A new Instrument module allows for plotting of the model points convolved with the spatial and spectral responses of the selected instrument. This lets users better fit data from specific instruments. Currently, we have implemented modules for the Infrared Array Camera (IRAC) and Multiband Imaging Photometer (MIPS) on SIRTF.
Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena
2018-01-01
The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results.
Modeling and modification of medical 3D objects. The benefit of using a haptic modeling tool.
Kling-Petersen, T; Rydmark, M
2000-01-01
The Computer Laboratory of the medical faculty in Goteborg (Mednet) has since the end of 1998 been one of a limited number of participants in the development of a new modeling tool together with SensAble Technologies Inc [http://www.sensable.com/]. The software, called SensAble FreeForm, was officially released at Siggraph in September 1999. Briefly, the software mimics the modeling techniques traditionally used by clay artists. An imported model or a user-defined block of "clay" can be modified using different tools such as a ball, square block, scraper, etc., via the use of a SensAble Technologies PHANToM haptic arm. The model will deform in 3D as a result of touching the "clay" with any selected tool, and the amount of deformation is proportional to the force applied. By getting instantaneous haptic as well as visual feedback, precise and intuitive changes are easily made. While SensAble FreeForm lacks several of the features normally associated with a 3D modeling program (such as text handling, application of surface and bump maps, high-end rendering engines, etc.), its strength lies in the ability to rapidly create non-geometric 3D models. For medical use, very few anatomically correct models are created from scratch. However, FreeForm's tools enable advanced modification of reconstructed or 3D-scanned models. One of the main problems with 3D laser scanning of medical specimens is that the technique usually leaves holes or gaps in the dataset corresponding to areas in shadow, such as orifices, deep grooves, etc. By using FreeForm's different tools, these defects are easily corrected and gaps are filled in. Similarly, traditional 3D reconstruction (based on serial sections, etc.) often shows artifacts as a result of the triangulation and/or tessellation processes. These artifacts usually manifest as unnatural ridges or uneven areas ("the accordion effect"). FreeForm contains a smoothing algorithm that enables the user to select an area to be modified and subsequently apply any given amount of smoothing to the object. While the final objects need to be exported for further 3D graphic manipulation, FreeForm addresses one of the most time-consuming problems of 3D modeling: modification and creation of non-geometric 3D objects.
Developing a MATLAB(registered)-Based Tool for Visualization and Transformation
NASA Technical Reports Server (NTRS)
Anderton, Blake J.
2003-01-01
An important step in the structural design and development of spacecraft is the experimental identification of a structure's modal characteristics, such as its natural frequencies and modes of vibration. These characteristics are vital to developing a representative model of any given structure or analyzing the range of input frequencies that can be handled by a particular structure. When setting up such a representative model of a structure, careful measurements using precision equipment (such as accelerometers and instrumented hammers) must be made on many individual points of the structure in question. The coordinate location of each data point is used to construct a wireframe geometric model of the structure. Response measurements obtained from the accelerometers are used to generate the modal shapes of the particular structure. Graphically, this is displayed as a combination of the ways a structure will ideally respond to a specified force input. Two types of models of the tested structure are often used in modal analysis: an analytic model showing expected behavior of the structure, and an experimental model showing measured results due to observed phenomena. To evaluate the results from the experimental model, a comparison of analytic and experimental results must be made between the two models. However, comparisons between these two models become difficult when the two coordinate orientations differ in a manner such that results are displayed in an unclear fashion. Such a problem points to the need for a tool that not only communicates a graphical image of a structure's wireframe geometry based on various measurement locations (called nodes), but also allows for a type of transformation of the image's coordinate geometry so that a model's coordinate orientation is made to match the orientation of another model. Such a tool should also be designed so that it is able to construct coordinate geometry based on many different listings of node locations and is able to transform the wireframe coordinate orientation to match almost any possible orientation (i.e. it should not be a problem-specific application) if it is to be of much value in modal analysis. Also, since universal files are used to store modal parameters and wireframe geometry, the tool must be able to read and extract information from universal files and use these files to exchange model data. The purpose of this project is to develop such a tool as a computer graphical user interface (GUI) capable of performing the following tasks: 1) Browsing for a particular universal file within the computer directory and displaying the name of this file to the screen; 2) Plotting each of the nodes within the universal file in a useful, descriptive, and easily understood figure; 3) Reading the node numbers from the selected file and listing these node numbers to the user for selection in an easily accessible format; 4) Allowing for user selection of a new model orientation defined by three selected nodes; and 5) Allowing the user to specify a directory to which the transformed model's node locations will be saved, and saving the transformed node locations to the specified file.
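The core transformation, building an orthonormal frame from three selected nodes and re-expressing all node coordinates in it, can be sketched as follows (shown in Python rather than MATLAB, with hypothetical node data):

```python
# Build an orthonormal frame from three nodes (origin, point on the x-axis,
# point in the xy plane) and re-express all node coordinates in that frame.
import numpy as np

def frame_from_nodes(p0, p1, p2):
    x = p1 - p0
    x = x / np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])       # columns are the new axes

nodes = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [2, 3, 1]])   # hypothetical nodes
R = frame_from_nodes(nodes[0], nodes[1], nodes[2])
transformed = (nodes - nodes[0]) @ R        # coordinates in the new orientation
print(transformed)
```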
Selecting Tools to Model Integer and Binomial Multiplication
ERIC Educational Resources Information Center
Pratt, Sarah Smitherman; Eddy, Colleen M.
2017-01-01
Mathematics teachers frequently provide concrete manipulatives to students during instruction; however, the rationale for using certain manipulatives in conjunction with concepts may not be explored. This article focuses on area models that are currently used in classrooms to provide concrete examples of integer and binomial multiplication. The…
Before the U.S. Environmental Protection Agency issued the 1988 Guidelines for Estimating Exposures, it published proposed guidelines in the Federal Register for public review and comment. The guidelines are intended to give risk analysts a basic framework and the tools they need ...
University Macro Analytic Simulation Model.
ERIC Educational Resources Information Center
Baron, Robert; Gulko, Warren
The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model, and an operational alternative can then be selected on the basis of the most desirable projected outcome. UMASS uses readily…
Chun, Ting Sie; Malek, M A; Ismail, Amelia Ritahani
2015-01-01
The development of effluent removal prediction is crucial in providing a planning tool necessary for the future development and construction of a septic sludge treatment plant (SSTP), especially in developing countries. In order to investigate the expected functionality against the required standard, the effluent quality of an SSTP, namely biological oxygen demand, chemical oxygen demand and total suspended solids, was modelled using an artificial intelligence approach. In this paper, we adopt the clonal selection algorithm (CSA) to set up a prediction model, with a well-established method, the least-squares support vector machine (LS-SVM), as a baseline model. The test results of the case study showed that the CSA-based SSTP model predicted well and provided performance as satisfactory as the LS-SVM model. The CSA approach requires fewer control and training parameters for model simulation compared with the LS-SVM approach. The ability of the CSA approach to handle limited data samples, non-linear sample functions and multidimensional pattern recognition makes it a powerful tool for modelling the prediction of effluent removals in an SSTP.
Associations Between the Big Five Personality Traits and a Medical School Admission Interview.
Lourinho, Isabel; Moreira, André; Mota-Cardoso, Rui; Severo, Milton; Ferreira, Maria Amélia
2016-12-30
Personality has become popular in medical student selection. However, little research exists on the association between the big five personality traits and existing medical school selection tools. Our aim was to study which personality traits were selected for by a medical school admission interview. One hundred ninety-four graduate applicants who had applied to the Faculty of Medicine of the University of Porto through the graduate entry approach, after being ranked on previous achievement, were interviewed between the academic years of 2011 and 2013. Of these, 181 (93.3%) answered the NEO Five-Factor Inventory, which assesses the high-order personality traits of openness to experience, conscientiousness, extraversion, agreeableness and neuroticism. The admission interview corresponded to the second phase of the seriation process. Every applicant was interviewed and scored by three interviewers on seven dimensions assessed by a Likert scale (1-10). The interview score was the sum of the dimensions. A linear mixed effects model and the respective regression coefficients were used to estimate the association between personality traits and each interviewer's score. Final models were adjusted for gender, interviewers and previous achievement. Openness to experience (Beta = 0.18; CI 95%: 0.05; 0.30) had the strongest association with interview score, followed by the interaction effect between the extraversion and conscientiousness traits (Beta = 0.14; CI 95%: 0.02; 0.25). Applicants also scored higher when their gender was opposite to the interviewer's. Previous achievement and interview score had no association. Our admission interview selected different personality traits when compared to other selection tools. Medical schools should be aware of the implications of the adopted selection tools on the personality of admitted medical students, because this can help in providing beneficial interventions.
Great apes select tools on the basis of their rigidity.
Manrique, Héctor Marín; Gross, Alexandra Nam-Mi; Call, Josep
2010-10-01
Wild chimpanzees select tools according to their rigidity. However, little is known about whether choices are solely based on familiarity with the materials or knowledge about tool properties. Furthermore, it is unclear whether tool manipulation is required prior to selection or whether observation alone can suffice. We investigated whether chimpanzees (Pan troglodytes) (n = 9), bonobos (Pan paniscus) (n = 4), orangutans (Pongo pygmaeus) (n = 6), and gorillas (Gorilla gorilla) (n = 2) selected new tools on the basis of their rigidity. Subjects faced an out-of-reach reward and a choice of three tools differing in color, diameter, material, and rigidity. We used 10 different 3-tool sets (1 rigid, 2 flexible). Subjects were unfamiliar with the tools and needed to select and use the rigid tool to retrieve the reward. Experiment 1 showed that subjects chose the rigid tool from the first trial with a 90% success rate. Experiments 2a and 2b addressed the role of manipulation and observation in tool selection. Subjects performed equally well in conditions in which they could manipulate the tools themselves or saw the experimenter manipulate the tools but decreased their performance if they could only visually inspect the tools. Experiment 3 showed that subjects could select flexible tools (as opposed to rigid ones) to meet new task demands. We conclude that great apes spontaneously selected unfamiliar rigid or flexible tools even after gathering minimal observational information. 2010 APA, all rights reserved
Shi, Zhenyu; Liu, Zhanqiang; Li, Yuchao; Qiao, Yang
2017-01-01
Cutting tool geometry must be carefully considered in micro-cutting because it has a significant effect on the topography and accuracy of the machined surface, particularly since the uncut chip thickness is comparable to the cutting edge radius. The objective of this paper was to clarify the mechanism by which cutting tool geometry influences surface topography in the micro-milling process. Four different cutting tools, two two-fluted end milling tools with helix angles of 15° and 30° and two three-fluted end milling tools with helix angles of 15° and 30°, were investigated by combining theoretical modeling with experimental research. The tool geometry was mathematically modeled through coordinate translation and transformation to bring all three cutting edges at the cutting tool tip into the same coordinate system. Swept mechanisms, minimum uncut chip thickness, and cutting tool run-out were considered in modeling the surface roughness parameters (the height of surface roughness Rz and the average surface roughness Ra) based on the established mathematical model. A set of cutting experiments was carried out using the four differently shaped cutting tools. It was found that the sweeping volume of the cutting tool increases with the decrease of both the cutting tool helix angle and the flute number. Coarser machined surface roughness and more non-uniform surface topography are generated as the sweeping volume increases. The outcome of this research should bring about new methodologies for micro-end milling tool design and manufacturing. The machined surface roughness can be improved by appropriately selecting the tool geometrical parameters. PMID:28772479
NASA Technical Reports Server (NTRS)
Tijidjian, Raffi P.
2010-01-01
The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. To reduce the time each TEAMS modeler spends on the manual process of preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report covering all of the components. Rules can be automatically validated against the model, with a generated report containing the resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. Such an automated tool would have a significant impact on the V&V process.
Targeting Antibacterial Agents by Using Drug-Carrying Filamentous Bacteriophages
Yacoby, Iftach; Shamis, Marina; Bar, Hagit; Shabat, Doron; Benhar, Itai
2006-01-01
Bacteriophages have been used for more than a century for (unconventional) therapy of bacterial infections, for half a century as tools in genetic research, for 2 decades as tools for discovery of specific target-binding proteins, and for nearly a decade as tools for vaccination or as gene delivery vehicles. Here we present a novel application of filamentous bacteriophages (phages) as targeted drug carriers for the eradication of (pathogenic) bacteria. The phages are genetically modified to display a targeting moiety on their surface and are used to deliver a large payload of a cytotoxic drug to the target bacteria. The drug is linked to the phages by means of chemical conjugation through a labile linker subject to controlled release. In the conjugated state, the drug is in fact a prodrug devoid of cytotoxic activity and is activated following its dissociation from the phage at the target site in a temporally and spatially controlled manner. Our model target was Staphylococcus aureus, and the model drug was the antibiotic chloramphenicol. We demonstrated the potential of using filamentous phages as universal drug carriers for targetable cells involved in disease. Our approach replaces the selectivity of the drug itself with target selectivity borne by the targeting moiety, which may allow the reintroduction of nonspecific drugs that have thus far been excluded from antibacterial use (because of toxicity or low selectivity). Reintroduction of such drugs into the arsenal of useful tools may help to combat emerging bacterial antibiotic resistance. PMID:16723570
A farm-level precision land management framework based on integer programming
Li, Qi; Hu, Guiping; Jubery, Talukder Zaki; Ganapathysubramanian, Baskar
2017-01-01
Farmland management involves several planning and decision-making tasks, including seed selection and irrigation management. A farm-level precision farmland management model based on mixed integer linear programming is proposed in this study. Optimal decisions are designed for pre-season planning of crops and irrigation water allocation. The model captures the effect of the size and shape of the decision scale as well as special irrigation patterns. The authors illustrate the model with a case study on a farm in the state of California in the U.S. and show that the model can capture the impact of precision farm management on profitability. The results show that a threefold increase in annual net profit could be achieved by carefully choosing irrigation and seed selection. Although farmers could increase profits by applying precision management to seed or irrigation alone, the profit increase is more significant if precision management is applied to seed and irrigation simultaneously. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits as well as a quantitative tool to explore the impact of precision agriculture. PMID:28346499
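A minimal sketch of how such a seed/irrigation decision could be posed as a mixed integer linear program is shown below, using the open-source PuLP modeler. The plots, profits and water figures are invented placeholders, and the formulation is far simpler than the paper's model, which also handles decision scale and irrigation patterns.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

plots = ["P1", "P2", "P3"]
seeds = ["corn", "wheat"]

# Invented per-plot profits ($), rainfed vs irrigated (irrigation boosts yield).
profit_rainfed = {("P1", "corn"): 500, ("P1", "wheat"): 450,
                  ("P2", "corn"): 480, ("P2", "wheat"): 470,
                  ("P3", "corn"): 300, ("P3", "wheat"): 420}
profit_irrigated = {k: round(v * 1.8) for k, v in profit_rainfed.items()}
water_per_plot = 5.0   # ML used if a plot is irrigated (placeholder)
water_budget = 8.0     # ML available for the season (placeholder)

prob = LpProblem("precision_farm_plan", LpMaximize)
plant = {(p, s): LpVariable(f"plant_{p}_{s}", cat=LpBinary) for p in plots for s in seeds}
irr = {p: LpVariable(f"irrigate_{p}", cat=LpBinary) for p in plots}
# x[(p,s)] = 1 if plot p is planted with seed s AND irrigated (linearised product).
x = {(p, s): LpVariable(f"both_{p}_{s}", cat=LpBinary) for p in plots for s in seeds}

# Objective: rainfed profit when not irrigated, boosted profit when irrigated.
prob += lpSum(profit_rainfed[p, s] * (plant[p, s] - x[p, s]) +
              profit_irrigated[p, s] * x[p, s]
              for p in plots for s in seeds)

for p in plots:
    prob += lpSum(plant[p, s] for s in seeds) == 1   # exactly one seed per plot
    for s in seeds:
        prob += x[p, s] <= plant[p, s]               # linearisation of
        prob += x[p, s] <= irr[p]                    #   plant * irrigate
        prob += x[p, s] >= plant[p, s] + irr[p] - 1

# Seasonal irrigation water limit.
prob += lpSum(water_per_plot * irr[p] for p in plots) <= water_budget

prob.solve()
for p in plots:
    chosen = [s for s in seeds if plant[p, s].value() > 0.5][0]
    print(p, chosen, "irrigated" if irr[p].value() > 0.5 else "rainfed")
```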
Second Generation Crop Yield Models Review
NASA Technical Reports Server (NTRS)
Hodges, T. (Principal Investigator)
1982-01-01
Second generation yield models, including crop growth simulation models and plant process models, may be suitable for large area crop yield forecasting in the yield model development project. Subjective and objective criteria for model selection are defined and models which might be selected are reviewed. Models may be selected to provide submodels as input to other models; for further development and testing; or for immediate testing as forecasting tools. A plant process model may range in complexity from several dozen submodels simulating (1) energy, carbohydrates, and minerals; (2) change in biomass of various organs; and (3) initiation and development of plant organs, to a few submodels simulating key physiological processes. The most complex models cannot be used directly in large area forecasting but may provide submodels which can be simplified for inclusion into simpler plant process models. Both published and unpublished models which may be used for development or testing are reviewed. Several other models, currently under development, may become available at a later date.
Relating MBSE to Spacecraft Development: A NASA Pathfinder
NASA Technical Reports Server (NTRS)
Othon, Bill
2016-01-01
The NASA Engineering and Safety Center (NESC) has sponsored a Pathfinder Study to investigate how Model Based Systems Engineering (MBSE) and Model Based Engineering (MBE) techniques can be applied by NASA spacecraft development projects. The objectives of this Pathfinder Study included analyzing both the products of the modeling activity and the process and tool chain through which the spacecraft design activities are executed. Several aspects of MBSE methodology and process were explored. Adoption and consistent use of the MBSE methodology within an existing development environment can be difficult. The Pathfinder Team evaluated the possibility that an "MBSE Template" could be developed as both a teaching tool and a baseline that future NASA projects could leverage. Elements of this template include spacecraft system component libraries, data dictionaries and ontology specifications, as well as software services that do work on the models themselves. The Pathfinder Study also evaluated the tool chain aspects of development. Two chains were considered: 1. The Development tool chain, through which SysML model development was performed and controlled, and 2. The Analysis tool chain, through which both static and dynamic system analysis is performed. Of particular interest was the ability to exchange data between SysML and other engineering tools such as CAD and Dynamic Simulation tools. For this study, the team selected a Mars Lander vehicle as the element to be designed. The paper will discuss what system models were developed, how data was captured and exchanged, and what analyses were conducted.
Modeling Selection and Extinction Mechanisms of Biological Systems
NASA Astrophysics Data System (ADS)
Amirjanov, Adil
In this paper, the behavior of a genetic algorithm is modeled to enhance its applicability as a modeling tool for biological systems. A new description model for the selection mechanism is introduced which operates on a portion of the individuals of the population. The extinction and recolonization mechanism is modeled, and solving the dynamics analytically shows that genetic drift in a population with extinction/recolonization is doubled. A mathematical analysis of the interaction between the selection and extinction/recolonization processes is carried out to assess the dynamics of the macroscopic statistical properties of the population. Computer simulations confirm that the theoretical predictions of the described models are good approximations. A mathematical model of GA dynamics describing anti-predator vigilance in an animal group was also examined against a known analytical solution of the problem, and the two showed good agreement in identifying the evolutionarily stable strategies.
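To give a concrete feel for why extinction/recolonization amplifies drift, here is a minimal Wright-Fisher-style simulation. The population size, extinction probability and refounding rule are arbitrary assumptions, and the sketch does not reproduce the paper's GA formalism; it only shows that the extra bottleneck raises the variance of allele frequencies across replicate populations.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(pop_size=100, generations=200, p0=0.5, extinction_prob=0.0, replicates=2000):
    """Return the final allele-frequency variance across replicate populations."""
    p = np.full(replicates, p0)
    for _ in range(generations):
        # Binomial resampling of gene copies = genetic drift.
        p = rng.binomial(pop_size, p) / pop_size
        # Extinction/recolonization: a wiped-out population is refounded from
        # only two gene copies, which adds an extra bottleneck (more drift).
        hit = rng.random(replicates) < extinction_prob
        if hit.any():
            p[hit] = rng.binomial(2, p[hit]) / 2.0
    return p.var()

print("variance, drift only      :", simulate(extinction_prob=0.0))
print("variance, with extinction :", simulate(extinction_prob=0.05))
```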
CASE tools and UML: state of the ART.
Agarwal, S
2001-05-01
With increasing need for automated tools to assist complex systems development, software design methods are becoming popular. This article analyzes the state of art in computer-aided software engineering (CASE) tools and unified modeling language (UML), focusing on their evolution, merits, and industry usage. It identifies managerial issues for the tools' adoption and recommends an action plan to select and implement them. While CASE and UML offer inherent advantages like cheaper, shorter, and efficient development cycles, they suffer from poor user satisfaction. The critical success factors for their implementation include, among others, management and staff commitment, proper corporate infrastructure, and user training.
Monocular tool control, eye dominance, and laterality in New Caledonian crows.
Martinho, Antone; Burns, Zackory T; von Bayern, Auguste M P; Kacelnik, Alex
2014-12-15
Tool use, though rare, is taxonomically widespread, but morphological adaptations for tool use are virtually unknown. We focus on the New Caledonian crow (NCC, Corvus moneduloides), which displays some of the most innovative tool-related behavior among nonhumans. One of their major food sources is larvae extracted from burrows with sticks held diagonally in the bill, oriented with individual, but not species-wide, laterality. Among possible behavioral and anatomical adaptations for tool use, NCCs possess unusually wide binocular visual fields (up to 60°), suggesting that extreme binocular vision may facilitate tool use. Here, we establish that during natural extractions, tool tips can only be viewed by the contralateral eye. Thus, maintaining binocular view of tool tips is unlikely to have selected for wide binocular fields; the selective factor is more likely to have been to allow each eye to see far enough across the midsagittal line to view the tool's tip monocularly. Consequently, we tested the hypothesis that tool side preference follows eye preference and found that eye dominance does predict tool laterality across individuals. This contrasts with humans' species-wide motor laterality and uncorrelated motor-visual laterality, possibly because bill-held tools are viewed monocularly and move in concert with eyes, whereas hand-held tools are visible to both eyes and allow independent combinations of eye preference and handedness. This difference may affect other models of coordination between vision and mechanical control, not necessarily involving tools. Copyright © 2014 Elsevier Ltd. All rights reserved.
Assessment of Spacecraft Systems Integration Using the Electric Propulsion Interactions Code (EPIC)
NASA Technical Reports Server (NTRS)
Mikellides, Ioannis G.; Kuharski, Robert A.; Mandell, Myron J.; Gardner, Barbara M.; Kauffman, William J. (Technical Monitor)
2002-01-01
SAIC is currently developing the Electric Propulsion Interactions Code 'EPIC', an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of interactions between its subsystems and the plume from an electric thruster. EPIC unites different computer tools to address the complexity associated with the interaction processes. This paper describes the overall architecture and capability of EPIC including the physics and algorithms that comprise its various components. Results from selected modeling efforts of different spacecraft-thruster systems are also presented.
Tenenhaus-Aziza, Fanny; Ellouze, Mariem
2015-02-01
The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth. Copyright © 2014 Elsevier Ltd. All rights reserved.
Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, James C.
2016-10-01
Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful addition to the metabolic engineering toolkit, and that such models can yield actionable insights. Key concepts are developed, and deliverable publications and results are presented.
Yan, Qiang; Fong, Stephen S.
2017-01-01
Metabolic diversity in microorganisms can provide the basis for creating novel biochemical products. However, most metabolic engineering projects utilize a handful of established model organisms and thus, a challenge for harnessing the potential of novel microbial functions is the ability to either heterologously express novel genes or directly utilize non-model organisms. Genetic manipulation of non-model microorganisms is still challenging due to organism-specific nuances that hinder universal molecular genetic tools and translatable knowledge of intracellular biochemical pathways and regulatory mechanisms. However, in the past several years, unprecedented progress has been made in synthetic biology, molecular genetics tools development, applications of omics data techniques, and computational tools that can aid in developing non-model hosts in a systematic manner. In this review, we focus on concerns and approaches related to working with non-model microorganisms including developing molecular genetics tools such as shuttle vectors, selectable markers, and expression systems. In addition, we will discuss: (1) current techniques in controlling gene expression (transcriptional/translational level), (2) advances in site-specific genome engineering tools [homologous recombination (HR) and clustered regularly interspaced short palindromic repeats (CRISPR)], and (3) advances in genome-scale metabolic models (GSMMs) in guiding design of non-model species. Application of these principles to metabolic engineering strategies for consolidated bioprocessing (CBP) will be discussed along with some brief comments on foreseeable future prospects. PMID:29123506
Osiurak, François; Granjon, Marine; Bonnevie, Isabelle; Brogniart, Joël; Mechtouff, Laura; Benoit, Amandine; Nighoghossian, Norbert; Lesourd, Mathieu
2018-05-01
Recent evidence indicates that some left brain-damaged (LBD) patients have difficulties using familiar tools because of an inability to reason about physical object properties. A fundamental issue is to understand the residual capacity of those LBD patients in tool selection. Three LBD patients with tool use disorders, three right brain-damaged (RBD) patients, and six matched healthy controls performed a novel tool selection task, consisting of extracting a target from a box by selecting the relevant tool from among eight, four, or two tools. Three criteria were manipulated to create relevant and irrelevant tools (size, rigidity, shape). LBD patients selected a greater number of irrelevant tools and had more difficulty solving the task compared to RBD patients and controls. All participants committed more errors when selecting relevant tools based on rigidity and shape than on size. In some LBD patients, the difficulties persisted even in the 2-Choice condition. Our findings confirm that tool use disorders result from impaired technical reasoning, leading patients to encounter difficulties in selecting tools based on their physical properties. We also go further by showing that these difficulties can decrease as the choice is reduced, at least for some properties, opening new avenues for rehabilitation programs. (JINS, 2018, 24, 524-529).
A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation
NASA Technical Reports Server (NTRS)
Hyman, Cody
2011-01-01
Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementation and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched, and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.
A standard-enabled workflow for synthetic biology.
Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach
2017-06-15
A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.
NASA Astrophysics Data System (ADS)
Ranatunga, T.
2016-12-01
Modeling the fate and transport of fecal bacteria in a watershed is generally a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria (E. coli) source loading and in-stream conditions in a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used to estimate the bacteria source loading from land categories. The major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads were input to the SWAT model in order to simulate transport through the land and in-stream conditions. The calibrated SWAT model was then used to estimate in-stream indicator bacteria concentrations for future years based on H-GAC's regional land use, population and household projections (up to 2040). Based on the in-stream reductions required to meet the water quality standards, the corresponding required source load reductions were estimated.
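For readers unfamiliar with SELECT-style accounting, the sketch below shows the basic potential-load arithmetic: animal counts multiplied by per-head production rates, summed by source category. All counts and rates are hypothetical placeholders, not values from this study or from TCEQ guidance.

```python
DAILY_ECOLI_PER_HEAD = {       # cfu/animal/day (hypothetical placeholder rates)
    "cattle": 1.0e10,
    "horses": 4.0e8,
    "sheep_goat": 1.2e10,
    "feral_hogs": 4.5e9,
    "deer": 3.5e8,
    "dogs": 5.0e9,
}

animal_counts = {              # animals in the subwatershed (hypothetical)
    "cattle": 1200, "horses": 80, "sheep_goat": 150,
    "feral_hogs": 300, "deer": 500, "dogs": 900,
}

def potential_daily_load(counts, rates):
    """Potential E. coli load (cfu/day) for each source category."""
    return {src: counts[src] * rates[src] for src in counts}

loads = potential_daily_load(animal_counts, DAILY_ECOLI_PER_HEAD)
total = sum(loads.values())
for src, load in sorted(loads.items(), key=lambda kv: -kv[1]):
    print(f"{src:>10s}: {load:.2e} cfu/day ({100 * load / total:.1f}%)")
```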
Yoo, Tae Keun; Kim, Sung Kean; Kim, Deok Won; Choi, Joon Yul; Lee, Wan Hyung; Oh, Ein; Park, Eun-Cheol
2013-11-01
A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of identifying the risk of osteoporosis in postmenopausal women more accurately than conventional clinical decision tools. We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Examination Surveys. The training data set was used to construct models using popular machine learning algorithms such as support vector machines (SVM), random forests, artificial neural networks (ANN), and logistic regression (LR), applied to simple survey data. The machine learning models were compared to four conventional clinical decision tools: the osteoporosis self-assessment tool (OST), the osteoporosis risk assessment instrument (ORAI), the simple calculated osteoporosis risk estimation (SCORE), and the osteoporosis index of risk (OSIRIS). SVM had a significantly better area under the curve (AUC) of the receiver operating characteristic than ANN, LR, OST, ORAI, SCORE, and OSIRIS for the training set. SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0% at total hip, femoral neck, or lumbar spine for the testing set. The significant factors selected by SVM were age, height, weight, body mass index, duration of menopause, duration of breast feeding, estrogen therapy, hyperlipidemia, hypertension, osteoarthritis, and diabetes mellitus. Considering various predictors associated with low bone density, the machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
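The sketch below illustrates the core comparison in this kind of study: fitting an SVM and benchmarking its ROC AUC against a simple OST-style score. The data are synthetic and only two predictors are used, so none of the numbers correspond to the reported results.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 1000
age = rng.uniform(50, 85, n)
weight = rng.normal(60, 10, n)
# Synthetic outcome loosely driven by age and weight (low bone density more
# likely in older, lighter women), purely for illustration.
risk = 0.08 * (age - 50) - 0.05 * (weight - 60) + rng.normal(0, 1, n)
y = (risk > np.quantile(risk, 0.7)).astype(int)
X = np.column_stack([age, weight])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(probability=True)).fit(X_tr, y_tr)
auc_svm = roc_auc_score(y_te, svm.predict_proba(X_te)[:, 1])

# OST-style index: 0.2 * (weight_kg - age_years); lower values mean higher risk,
# so the negated score is used as the risk ranking for AUC.
ost = 0.2 * (X_te[:, 1] - X_te[:, 0])
auc_ost = roc_auc_score(y_te, -ost)

print(f"SVM AUC: {auc_svm:.3f}   OST-style AUC: {auc_ost:.3f}")
```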
Influenza virus drug resistance: a time-sampled population genetics perspective.
Foll, Matthieu; Poh, Yu-Ping; Renzette, Nicholas; Ferrer-Admetlla, Anna; Bank, Claudia; Shim, Hyunjin; Malaspinas, Anna-Sapfo; Ewing, Gregory; Liu, Ping; Wegmann, Daniel; Caffrey, Daniel R; Zeldovich, Konstantin B; Bolon, Daniel N; Wang, Jennifer P; Kowalik, Timothy F; Schiffer, Celia A; Finberg, Robert W; Jensen, Jeffrey D
2014-02-01
The challenge of distinguishing genetic drift from selection remains a central focus of population genetics. Time-sampled data may provide a powerful tool for distinguishing these processes, and we here propose approximate Bayesian, maximum likelihood, and analytical methods for the inference of demography and selection from time course data. Utilizing these novel statistical and computational tools, we evaluate whole-genome datasets of an influenza A H1N1 strain in the presence and absence of oseltamivir (an inhibitor of neuraminidase) collected at thirteen time points. Results reveal a striking consistency amongst the three estimation procedures developed, showing strongly increased selection pressure in the presence of drug treatment. Importantly, these approaches re-identify the known oseltamivir resistance site, successfully validating the approaches used. Enticingly, a number of previously unknown variants have also been identified as being positively selected. Results are interpreted in the light of Fisher's Geometric Model, allowing for a quantification of the increased distance to optimum exerted by the presence of drug, and theoretical predictions regarding the distribution of beneficial fitness effects of contending mutations are empirically tested. Further, given the fit to expectations of the Geometric Model, results suggest the ability to predict certain aspects of viral evolution in response to changing host environments and novel selective pressures.
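As a toy illustration of the basic signal such methods exploit, the sketch below simulates a time-sampled allele-frequency trajectory under positive selection and recovers the selection coefficient from the slope of the logit frequency over time. This deterministic shortcut ignores drift and demography and is not the approximate Bayesian or maximum-likelihood machinery developed in the paper; all values are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)
true_s = 0.08                              # selection coefficient per generation
generations = np.arange(0, 130, 10)        # thirteen sampling time points
depth = 500                                # sequencing depth per time point

# Deterministic logistic growth of the beneficial allele, plus sampling noise.
p0 = 0.02
p = p0 * np.exp(true_s * generations) / (1 - p0 + p0 * np.exp(true_s * generations))
observed = rng.binomial(depth, p) / depth

# Under simple haploid selection, logit(p_t) increases linearly at rate ~s.
mask = (observed > 0) & (observed < 1)
logit = np.log(observed[mask] / (1 - observed[mask]))
s_hat, intercept = np.polyfit(generations[mask], logit, 1)

print(f"true s = {true_s:.3f}, estimated s = {s_hat:.3f}")
```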
Seyssel, Kevin; Suter, Michel; Pattou, François; Caiazzo, Robert; Verkindt, Helene; Raverdy, Violeta; Jolivet, Mathieu; Disse, Emmanuel; Robert, Maud; Giusti, Vittorio
2018-06-19
Different factors, such as age, gender, preoperative weight, but also the patient's motivation, are known to impact outcomes after Roux-en-Y gastric bypass (RYGBP). Weight loss prediction is helpful to define realistic expectations and maintain motivation during follow-up, but also to select good candidates for surgery and limit failures. Therefore, developing a realistic predictive tool appears interesting. A Swiss cohort (n = 444) that underwent RYGBP was used, with multiple linear regression models, to predict weight loss up to 60 months after surgery from age, height, gender and weight at baseline. We then applied our model to two French cohorts and compared the predicted weight to the weight finally reached. The accuracy of our model was assessed using the root mean square error (RMSE). Mean weight loss was 43.6 ± 13.0 and 40.8 ± 15.4 kg at 12 and 60 months, respectively. The model was reliable for predicting weight loss (0.37 < R² < 0.48), with RMSE between 5.0 and 12.2 kg. High preoperative weight and young age were positively correlated with weight loss, as was male gender. Correlations between predicted weight and real weight were highly significant in both validation cohorts (R ≥ 0.7 and P < 0.01), and RMSE increased throughout follow-up from 6.2 to 15.4 kg. Our statistical model to predict weight loss outcomes after RYGBP seems accurate. It could be a valuable tool to define realistic weight loss expectations and to improve patient selection and outcomes during follow-up. Further research is needed to demonstrate the value of this model in improving patients' motivation and results and limiting failures.
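A minimal sketch of this kind of baseline-covariate regression and its RMSE evaluation is shown below. The cohort is simulated and the coefficients are invented, so it only mirrors the modeling approach, not the published model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
n = 444
age = rng.uniform(20, 65, n)
height = rng.normal(1.68, 0.09, n)
male = rng.integers(0, 2, n)
weight0 = rng.normal(125, 20, n)
# Synthetic outcome: heavier, younger, male patients lose more (illustrative only).
loss12 = 0.35 * (weight0 - 125) - 0.15 * (age - 40) + 4 * male + 43 + rng.normal(0, 6, n)

X = np.column_stack([age, height, male, weight0])
X_tr, X_te, y_tr, y_te = train_test_split(X, loss12, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"R^2 on held-out set: {model.score(X_te, y_te):.2f}, RMSE: {rmse:.1f} kg")
```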
NASA Astrophysics Data System (ADS)
Kong, Wenwen; Liu, Fei; Zhang, Chu; Bao, Yidan; Yu, Jiajia; He, Yong
2014-01-01
Tomatoes are cultivated around the world, and gray mold is one of their most prominent and destructive diseases. An early detection method can decrease losses caused by plant diseases and prevent their spread. The activity of peroxidase (POD) is a very important indicator of disease stress in plants. The objective of this study was to examine the possibility of fast detection of POD activity in tomato leaves infected with Botrytis cinerea using hyperspectral imaging data. Five pre-treatment methods were investigated. Genetic algorithm-partial least squares (GA-PLS) was applied to select optimal wavelengths. A new fast learning neural algorithm named the extreme learning machine (ELM) was employed as the multivariate analytical tool in this study. Twenty-one optimal wavelengths were selected by GA-PLS and used as inputs to three calibration models. The optimal prediction result was achieved by the ELM model with the selected wavelengths, with r and RMSEP in validation of 0.8647 and 465.9880, respectively. The results indicated that hyperspectral imaging could be considered a valuable tool for POD activity prediction. The selected wavelengths could be potential resources for instrument development.
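Because the ELM is the least familiar piece of this pipeline, a compact implementation sketch is given below: a fixed random hidden layer followed by a least-squares solve for the output weights. The spectra and target values are synthetic, and the GA-PLS wavelength selection step is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)

class ELMRegressor:
    """Minimal extreme learning machine: random hidden weights are never
    trained; only the output weights are fitted by least squares."""

    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Output weights via the Moore-Penrose pseudo-inverse (least squares).
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Synthetic stand-in: 21 "selected wavelengths" as inputs, POD activity as target.
X = rng.normal(size=(200, 21))
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=200)

elm = ELMRegressor(n_hidden=40).fit(X[:150], y[:150])
pred = elm.predict(X[150:])
r = np.corrcoef(pred, y[150:])[0, 1]
print(f"validation correlation r = {r:.3f}")
```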
EFS: an ensemble feature selection tool implemented as R-package and web-application.
Neumann, Ursula; Genze, Nikita; Heider, Dominik
2017-01-01
Feature selection methods aim at identifying a subset of features that improve the prediction performance of subsequent classification models and thereby also simplify their interpretability. Preceding studies demonstrated that single feature selection methods can have specific biases, whereas an ensemble feature selection has the advantage to alleviate and compensate for these biases. The software EFS (Ensemble Feature Selection) makes use of multiple feature selection methods and combines their normalized outputs to a quantitative ensemble importance. Currently, eight different feature selection methods have been integrated in EFS, which can be used separately or combined in an ensemble. EFS identifies relevant features while compensating specific biases of single methods due to an ensemble approach. Thereby, EFS can improve the prediction accuracy and interpretability in subsequent binary classification models. EFS can be downloaded as an R-package from CRAN or used via a web application at http://EFS.heiderlab.de.
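The ensemble idea can be mimicked in a few lines outside the R package: run several importance measures, min-max normalize each, and average them into one ensemble importance per feature. The sketch below is a hypothetical Python analogue, not the EFS implementation, and the choice of methods and dataset is purely illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import mutual_info_classif

X, y = make_classification(n_samples=300, n_features=10, n_informative=4, random_state=0)

def normalize(scores):
    """Min-max normalize absolute importance scores to [0, 1]."""
    scores = np.abs(np.asarray(scores, dtype=float))
    return (scores - scores.min()) / (scores.max() - scores.min() + 1e-12)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
lr = LogisticRegression(max_iter=1000).fit(X, y)

importances = np.vstack([
    normalize(rf.feature_importances_),           # impurity-based importance
    normalize(lr.coef_.ravel()),                  # magnitude of linear weights
    normalize(mutual_info_classif(X, y, random_state=0)),
])
ensemble = importances.mean(axis=0)               # quantitative ensemble importance

ranking = np.argsort(ensemble)[::-1]
print("features ranked by ensemble importance:", ranking)
```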
Sauterey, Boris; Ward, Ben A; Follows, Michael J; Bowler, Chris; Claessen, David
2015-01-01
The functional and taxonomic biogeography of marine microbial systems reflects the current state of an evolving system. Current models of marine microbial systems and biogeochemical cycles do not reflect this fundamental organizing principle. Here, we investigate the evolutionary adaptive potential of marine microbial systems under environmental change and introduce explicit Darwinian adaptation into an ocean modelling framework, simulating evolving phytoplankton communities in space and time. To this end, we adopt tools from adaptive dynamics theory, evaluating the fitness of invading mutants over annual timescales, replacing the resident if a fitter mutant arises. Using the evolutionary framework, we examine how community assembly, specifically the emergence of phytoplankton cell size diversity, reflects the combined effects of bottom-up and top-down controls. When compared with a species-selection approach, based on the paradigm that "Everything is everywhere, but the environment selects", we show that (i) the selected optimal trait values are similar; (ii) the patterns emerging from the adaptive model are more robust, but (iii) the two methods lead to different predictions in terms of emergent diversity. We demonstrate that explicitly evolutionary approaches to modelling marine microbial populations and functionality are feasible and practical in time-varying, space-resolving settings and provide a new tool for exploring evolutionary interactions on a range of timescales in the ocean.
Katki, Hormuzd A; Kovalchik, Stephanie A; Petito, Lucia C; Cheung, Li C; Jacobs, Eric; Jemal, Ahmedin; Berg, Christine D; Chaturvedi, Anil K
2018-05-15
Lung cancer screening guidelines recommend using individualized risk models to refer ever-smokers for screening. However, different models select different screening populations. The performance of each model in selecting ever-smokers for screening is unknown. To compare the U.S. screening populations selected by 9 lung cancer risk models (the Bach model; the Spitz model; the Liverpool Lung Project [LLP] model; the LLP Incidence Risk Model [LLPi]; the Hoggart model; the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial Model 2012 [PLCOM2012]; the Pittsburgh Predictor; the Lung Cancer Risk Assessment Tool [LCRAT]; and the Lung Cancer Death Risk Assessment Tool [LCDRAT]) and to examine their predictive performance in 2 cohorts. Population-based prospective studies. United States. Models selected U.S. screening populations by using data from the National Health Interview Survey from 2010 to 2012. Model performance was evaluated using data from 337 388 ever-smokers in the National Institutes of Health-AARP Diet and Health Study and 72 338 ever-smokers in the CPS-II (Cancer Prevention Study II) Nutrition Survey cohort. Model calibration (ratio of model-predicted to observed cases [expected-observed ratio]) and discrimination (area under the curve [AUC]). At a 5-year risk threshold of 2.0%, the models chose U.S. screening populations ranging from 7.6 million to 26 million ever-smokers. These disagreements occurred because, in both validation cohorts, 4 models (the Bach model, PLCOM2012, LCRAT, and LCDRAT) were well-calibrated (expected-observed ratio range, 0.92 to 1.12) and had higher AUCs (range, 0.75 to 0.79) than 5 models that generally overestimated risk (expected-observed ratio range, 0.83 to 3.69) and had lower AUCs (range, 0.62 to 0.75). The 4 best-performing models also had the highest sensitivity at a fixed specificity (and vice versa) and similar discrimination at a fixed risk threshold. These models showed better agreement on size of the screening population (7.6 million to 10.9 million) and achieved consensus on 73% of persons chosen. No consensus on risk thresholds for screening. The 9 lung cancer risk models chose widely differing U.S. screening populations. However, 4 models (the Bach model, PLCOM2012, LCRAT, and LCDRAT) most accurately predicted risk and performed best in selecting ever-smokers for screening. Intramural Research Program of the National Institutes of Health/National Cancer Institute.
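The two headline validation metrics, the expected/observed calibration ratio and the ROC AUC, are straightforward to compute once a model's predicted risks are in hand. The sketch below does so on simulated data with a deliberately miscalibrated hypothetical model; it does not use any of the nine published models.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 50_000
predicted_risk = rng.beta(1, 40, n)        # hypothetical model's 5-year risks
# Simulate observed cases; the "truth" is deliberately 20% lower than the
# prediction, so the model should look mildly over-calibrated.
observed = rng.random(n) < 0.8 * predicted_risk

expected_cases = predicted_risk.sum()
observed_cases = observed.sum()
eo_ratio = expected_cases / observed_cases
auc = roc_auc_score(observed, predicted_risk)

print(f"E/O ratio = {eo_ratio:.2f} (values > 1 indicate risk over-estimation)")
print(f"AUC = {auc:.2f}")
```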
Jung, Won-Mo; Park, In-Soo; Lee, Ye-Seul; Kim, Chang-Eop; Lee, Hyangsook; Hahm, Dae-Hyun; Park, Hi-Joon; Jang, Bo-Hyoung; Chae, Younbyoung
2018-04-12
Comprehension of the medical diagnoses of doctors and treatment of diseases is important to understand the underlying principle in selecting appropriate acupoints. The pattern recognition process that pertains to symptoms and diseases and informs acupuncture treatment in a clinical setting was explored. A total of 232 clinical records were collected using a Charting Language program. The relationship between symptom information and selected acupoints was trained using an artificial neural network (ANN). A total of 11 hidden nodes with the highest average precision score were selected through a tenfold cross-validation. Our ANN model could predict the selected acupoints based on symptom and disease information with an average precision score of 0.865 (precision, 0.911; recall, 0.811). This model is a useful tool for diagnostic classification or pattern recognition and for the prediction and modeling of acupuncture treatment based on clinical data obtained in a real-world setting. The relationship between symptoms and selected acupoints could be systematically characterized through knowledge discovery processes, such as pattern identification.
A System for Integrated Reliability and Safety Analyses
NASA Technical Reports Server (NTRS)
Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Coumeri, Marc; Scheidler, Peter, Jr.; Bonesteel, Charles
1999-01-01
We present an integrated reliability and aviation safety analysis tool. The reliability models for selected infrastructure components of the air traffic control system are described. The results of this model are used to evaluate the likelihood of seeing outcomes predicted by simulations with failures injected. We discuss the design of the simulation model, and the user interface to the integrated toolset.
Why Quantify Uncertainty in Ecosystem Studies: Obligation versus Discovery Tool?
NASA Astrophysics Data System (ADS)
Harmon, M. E.
2016-12-01
There are multiple motivations for quantifying uncertainty in ecosystem studies. One is as an obligation; the other is as a tool useful in moving ecosystem science toward discovery. While reporting uncertainty should become a routine expectation, a more convincing motivation involves discovery. By clarifying what is known and to what degree it is known, uncertainty analyses can point the way toward improvements in measurements, sampling designs, and models. While some of these improvements (e.g., better sampling designs) may lead to incremental gains, those involving models (particularly model selection) may require large gains in knowledge. To be fully harnessed as a discovery tool, attitudes toward uncertainty may have to change: rather than viewing uncertainty as a negative assessment of what was done, it should be viewed as a positive, helpful assessment of what remains to be done.
Dissecting children's observational learning of complex actions through selective video displays.
Flynn, Emma; Whiten, Andrew
2013-10-01
Children can learn how to use complex objects by watching others, yet the relative importance of different elements they may observe, such as the interactions of the individual parts of the apparatus, a model's movements, and desirable outcomes, remains unclear. In total, 140 3-year-olds and 140 5-year-olds participated in a study where they observed a video showing tools being used to extract a reward item from a complex puzzle box. Conditions varied according to the elements that could be seen in the video: (a) the whole display, including the model's hands, the tools, and the box; (b) the tools and the box but not the model's hands; (c) the model's hands and the tools but not the box; (d) only the end state with the box opened; and (e) no demonstration. Children's later attempts at the task were coded to establish whether they imitated the hierarchically organized sequence of the model's actions, the action details, and/or the outcome. Children's successful retrieval of the reward from the box and the replication of hierarchical sequence information were reduced in all but the whole display condition. Only once children had attempted the task and witnessed a second demonstration did the display focused on the tools and box prove to be better for hierarchical sequence information than the display focused on the tools and hands only. Copyright © 2013 Elsevier Inc. All rights reserved.
Ko, Gene M; Garg, Rajni; Bailey, Barbara A; Kumar, Sunil
2016-01-01
Quantitative structure-activity relationship (QSAR) models can be used as a predictive tool for virtual screening of chemical libraries to identify novel drug candidates. The aims of this paper were to report the results of a study performed on descriptor selection, QSAR model development, and virtual screening for identifying novel HIV-1 integrase inhibitor drug candidates. First, three evolutionary algorithms were compared for descriptor selection: differential evolution-binary particle swarm optimization (DE-BPSO), binary particle swarm optimization, and genetic algorithms. Next, three QSAR models were developed from an ensemble of multiple linear regression, partial least squares, and extremely randomized trees models. A comparison of the performances of the three evolutionary algorithms showed that DE-BPSO has a significant improvement over the other two algorithms. The QSAR models developed in this study were used in consensus as a predictive tool for virtual screening of the NCI Open Database, containing 265,242 compounds, to identify potential novel HIV-1 integrase inhibitors. Six compounds were predicted to be highly active (pIC50 > 6) by each of the three models. The use of a hybrid evolutionary algorithm (DE-BPSO) for descriptor selection and QSAR model development in drug design is a novel approach. Consensus modeling may provide better predictivity by taking into account a broader range of chemical properties within the data set conducive to inhibition that may be missed by an individual model. The six compounds identified provide novel drug candidate leads in the design of next-generation HIV-1 integrase inhibitors targeting drug-resistant mutant viruses.
González-Ferrer, Arturo; Valcárcel, M Ángel; Cuesta, Martín; Cháfer, Joan; Runkle, Isabelle
2017-07-01
Hyponatremia is the most common type of electrolyte imbalance, occurring when serum sodium is below threshold levels, typically 135 mmol/L. Electrolyte balance has been identified as one of the most challenging subjects for medical students, but also as one of the most relevant areas to learn about according to physicians and researchers. We present a computer-interpretable guideline (CIG) model that will be used for medical training to teach how to improve the diagnosis of hyponatremia by applying an expert consensus document (ECD). We used the PROForma set of tools to develop the model, using an iterative process involving two knowledge engineers (a computer science Ph.D. and a preventive medicine specialist) and two expert endocrinologists. We also carried out an initial validation of the model and a qualitative post-analysis of the results of a retrospective study (N=65 patients), comparing the consensus diagnosis of two experts with the output of the tool. The model includes over two hundred "for", "against" and "neutral" arguments that are selectively triggered depending on the input values of more than forty patient-state variables. We share the methodology followed in the development process and the initial validation results, which achieved a high ratio of 61/65 agreements with the consensus diagnosis, with a kappa value of K=0.86 for overall agreement and K=0.80 for first-ranked agreement. Hospital care professionals involved in the project showed high expectations of using this tool for training, but the process to follow for a successful diagnosis and application is not trivial, as reported in this manuscript. Secondary benefits of using these tools include improving research knowledge and existing clinical practice guidelines (CPGs) or ECDs. Beyond point-of-care clinical decision support, knowledge-based decision support systems are very attractive as a training tool, helping selected professionals to better understand difficult diseases that are underdiagnosed and/or incorrectly managed. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta
2016-06-01
With the increasing trend towards automation in the modern manufacturing industry, human intervention in routine, repetitive and data-specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce human intervention in the selection of the optimal cutting tool and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of an appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician/cutting tool expert based on his knowledge base or an extensive search through a huge cutting tool database. The proposed approach replaces the existing practice of physically searching for tools in data books/tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence based techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence-based optimal tool selection strategy was developed and implemented using MathWorks Matlab Version 7.11.0. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for the selection of an appropriate cutting tool and the optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.
Surface Roughness Model Based on Force Sensors for the Prediction of the Tool Wear
de Agustina, Beatriz; Rubio, Eva María; Sebastián, Miguel Ángel
2014-01-01
In this study, a methodology has been developed with the objective of evaluating the surface roughness obtained during turning processes by measuring the signals detected by a force sensor under the same cutting conditions. In this way, the surface quality achieved along the process is correlated to several parameters of the cutting forces (thrust forces, feed forces and cutting forces), so the effect that the tool wear causes on the surface roughness is evaluated. In a first step, the best cutting conditions (cutting parameters and radius of tool) for a certain quality surface requirement were found for pieces of UNS A97075. Next, with this selection a model of surface roughness based on the cutting forces was developed for different states of wear that simulate the behaviour of the tool throughout its life. The validation of this model reveals that it was effective for approximately 70% of the surface roughness values obtained. PMID:24714391
NASA Astrophysics Data System (ADS)
Li, Zhanjie; Yu, Jingshan; Xu, Xinyi; Sun, Wenchao; Pang, Bo; Yue, Jiajia
2018-06-01
Hydrological models are important and effective tools for detecting complex hydrological processes. Different models have different strengths when capturing the various aspects of hydrological processes. Relying on a single model usually leads to simulation uncertainties. Ensemble approaches, based on multi-model hydrological simulations, can improve application performance over single models. In this study, the upper Yalongjiang River Basin was selected for a case study. Three commonly used hydrological models (SWAT, VIC, and BTOPMC) were selected and used for independent simulations with the same input and initial values. Then, the BP neural network method was employed to combine the results from the three models. The results show that the accuracy of BP ensemble simulation is better than that of the single models.
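A minimal sketch of the combination step is shown below: a small neural network (standing in for the BP network) is trained to map three individual model simulations onto observed discharge, and skill is compared with the Nash-Sutcliffe efficiency. The streamflow series are synthetic stand-ins, not SWAT/VIC/BTOPMC output.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n_days = 1000
observed = rng.gamma(2.0, 30.0, n_days)              # "true" discharge (m3/s)
# Each model is imagined as the truth plus its own bias and noise.
swat = observed * 1.10 + rng.normal(0, 8, n_days)
vic = observed * 0.85 + rng.normal(0, 10, n_days)
btopmc = observed * 1.00 + rng.normal(0, 15, n_days)

X = np.column_stack([swat, vic, btopmc])
X_tr, X_te, y_tr, y_te = train_test_split(X, observed, test_size=0.3, random_state=0)

ens = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X_tr, y_tr)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, a common hydrological skill score (1 is perfect)."""
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

print("NSE, single models    :", [round(nse(y_te, X_te[:, i]), 3) for i in range(3)])
print("NSE, neural ensemble  :", round(nse(y_te, ens.predict(X_te)), 3))
```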
RSAT: regulatory sequence analysis tools.
Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques
2008-07-01
The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundreds of researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool
NASA Astrophysics Data System (ADS)
Guo, Qianjian; Fan, Shuo; Xu, Rufeng; Cheng, Xiang; Zhao, Guoyong; Yang, Jianguo
2017-05-01
To address the problems of low machining accuracy and uncontrollable thermal errors in NC machine tools, spindle thermal error measurement, modeling and compensation of a two-turntable five-axis machine tool are researched. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced for the selection of the temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; a new ABC-NN (artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of the ABC-NN model, an experimental system is developed, and the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measurement results of spindle thermal errors. The experimental results show that the prediction accuracy of the ABC-NN model is higher than that of LSR and ANN, and the residual error is smaller than 3 μm, so the new modeling method is feasible. The proposed research provides guidance for compensating thermal errors and improving the machining accuracy of NC machine tools.
Prognostic and Prediction Tools in Bladder Cancer: A Comprehensive Review of the Literature.
Kluth, Luis A; Black, Peter C; Bochner, Bernard H; Catto, James; Lerner, Seth P; Stenzl, Arnulf; Sylvester, Richard; Vickers, Andrew J; Xylinas, Evanguelos; Shariat, Shahrokh F
2015-08-01
This review focuses on risk assessment and prediction tools for bladder cancer (BCa). To review the current knowledge on risk assessment and prediction tools to enhance clinical decision making and counseling of patients with BCa. A literature search in English was performed using PubMed in July 2013. Relevant risk assessment and prediction tools for BCa were selected. More than 1600 publications were retrieved. Special attention was given to studies that investigated the clinical benefit of a prediction tool. Most prediction tools for BCa focus on the prediction of disease recurrence and progression in non-muscle-invasive bladder cancer or disease recurrence and survival after radical cystectomy. Although these tools are helpful, recent prediction tools aim to address a specific clinical problem, such as the prediction of organ-confined disease and lymph node metastasis to help identify patients who might benefit from neoadjuvant chemotherapy. Although a large number of prediction tools have been reported in recent years, many of them lack external validation. Few studies have investigated the clinical utility of any given model as measured by its ability to improve clinical decision making. There is a need for novel biomarkers to improve the accuracy and utility of prediction tools for BCa. Decision tools hold the promise of facilitating the shared decision process, potentially improving clinical outcomes for BCa patients. Prediction models need external validation and assessment of clinical utility before they can be incorporated into routine clinical care. We looked at models that aim to predict outcomes for patients with bladder cancer (BCa). We found a large number of prediction models that hold the promise of facilitating treatment decisions for patients with BCa. However, many models are missing confirmation in a different patient cohort, and only a few studies have tested the clinical utility of any given model as measured by its ability to improve clinical decision making. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Distributed Space Mission Design for Earth Observation Using Model-Based Performance Evaluation
NASA Technical Reports Server (NTRS)
Nag, Sreeja; LeMoigne-Stewart, Jacqueline; Cervantes, Ben; DeWeck, Oliver
2015-01-01
Distributed Space Missions (DSMs) are gaining momentum in their application to earth observation missions owing to their unique ability to increase observation sampling in multiple dimensions. DSM design is a complex problem with many design variables, multiple objectives determining performance and cost, and emergent, often unexpected, behaviors. There are very few open-access tools available to explore the tradespace of variables, minimize cost and maximize performance for pre-defined science goals, and thereby select the most optimal design. This paper presents a software tool that can generate multiple DSM architectures based on pre-defined design variable ranges and size those architectures in terms of predefined science and cost metrics. The tool will help a user select Pareto optimal DSM designs based on design of experiments techniques. The tool will be applied to some earth observation examples to demonstrate its applicability in making some key decisions between different performance metrics and cost metrics early in the design lifecycle.
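Selecting the Pareto-optimal subset of a tradespace is a simple filtering operation once each architecture has been sized. The sketch below illustrates it for one performance metric and one cost metric on randomly generated placeholder candidates; real tradespaces typically involve several metrics of each kind.

```python
import numpy as np

rng = np.random.default_rng(7)
performance = rng.uniform(0, 1, 50)    # e.g., coverage metric: higher is better
cost = rng.uniform(10, 100, 50)        # e.g., lifecycle cost: lower is better

def pareto_mask(perf, cost):
    """True for designs not dominated by any other design
    (no other design is at least as good on both axes and strictly better on one)."""
    mask = np.ones(len(perf), dtype=bool)
    for i in range(len(perf)):
        dominated_by = (perf >= perf[i]) & (cost <= cost[i]) & \
                       ((perf > perf[i]) | (cost < cost[i]))
        if dominated_by.any():
            mask[i] = False
    return mask

front = pareto_mask(performance, cost)
print(f"{front.sum()} of {len(front)} candidate architectures are Pareto optimal")
```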
Engine System Model Development for Nuclear Thermal Propulsion
NASA Technical Reports Server (NTRS)
Nelson, Karl W.; Simpson, Steven P.
2006-01-01
In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-Derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided including sensitivities. Before selecting an engine design, all figures of merit must be considered including the overall impacts on the vehicle and mission. Evaluations based on key figures of merit of these results and results with the new reactor model will be performed. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.
NASA Technical Reports Server (NTRS)
Fabinsky, Beth
2006-01-01
WISE, the Wide Field Infrared Survey Explorer, is scheduled for launch in June 2010. The mission operations system for WISE requires a software modeling tool to help plan, integrate and simulate all spacecraft pointing and verify that no attitude constraints are violated. In the course of developing the requirements for this tool, an investigation was conducted into the design of similar tools for other space-based telescopes. This paper summarizes the ground software and processes used to plan and validate pointing for a selection of space telescopes; with this information as background, the design for WISE is presented.
Travel Demand Model Development and Application Guidelines (Rev.)
DOT National Transportation Integrated Search
1995-06-01
There is a new challenge confronting state and regional agencies -- the : selection and development of appropriate analysis tools for application to the : planning problems presented by the 1990 Federal Clear Air Act Amendments (CAAA), : the 1991 Fed...
Tools for assessing fall risk in the elderly: a systematic review and meta-analysis.
Park, Seong-Hi
2018-01-01
The prevention of falls among the elderly is arguably one of the most important public health issues in today's aging society. The aim of this study was to assess which tools best predict the risk of falls in the elderly. Electronic searches were performed using Medline, EMBASE, the Cochrane Library, CINAHL, etc., using the following keywords: "fall risk assessment", "elderly fall screening", and "elderly mobility scale". The QUADAS-2 was applied to assess the internal validity of the diagnostic studies. Selected studies were meta-analyzed with MetaDisc 1.4. A total of 33 studies were eligible out of the 2,321 studies retrieved from selected databases. Twenty-six assessment tools for fall risk were used in the selected articles, and they tended to vary based on the setting. The fall risk assessment tools currently used for the elderly did not show sufficiently high predictive validity for differentiating high and low fall risks. The Berg Balance scale and Mobility Interaction Fall chart showed stable and high specificity, while the Downton Fall Risk Index, Hendrich II Fall Risk Model, St. Thomas's Risk Assessment Tool in Falling elderly inpatients, Timed Up and Go test, and Tinetti Balance scale showed the opposite results. We concluded that rather than a single measure, two assessment tools used together would better evaluate the characteristics of falls by the elderly that can occur due to a multitude of factors and maximize the advantages of each for predicting the occurrence of falls.
Stein, Mart Lambertus; Rudge, James W; Coker, Richard; van der Weijden, Charlie; Krumkamp, Ralf; Hanvoravongchai, Piya; Chavez, Irwin; Putthasri, Weerasak; Phommasack, Bounlay; Adisasmito, Wiku; Touch, Sok; Sat, Le Minh; Hsu, Yu-Chen; Kretzschmar, Mirjam; Timen, Aura
2012-10-12
Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulation exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, the feasibility of health systems to implement response measures or interventions described in plans and trained in exercises depends on the available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel© and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications of resource gaps or surpluses, and of the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and the consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources. The AsiaFluCap Simulator is freely available software (http://www.cdprg.org) which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness, to provide evidence-based and illustrative information on health care resource capacities during future pandemics. The tool can inform both preparedness plans and simulation exercises and can help increase the general understanding of dynamics in resource capacities during a pandemic. The combination of a mathematical model with multiple resources and the linkage to GIS for creating maps makes the tool unique compared to other available software.
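As a rough illustration of the abstract's core idea (an epidemiological model driving health care resource demand), the following Python sketch couples a toy SIR model to a single resource, hospital beds. It is not the AsiaFluCap Simulator, which is an Excel-based tool covering 28 resources and region-specific data; all parameter values and the bed capacity below are invented for the example.

```python
import numpy as np

def sir_resource_demand(pop, beta, gamma, hosp_frac, beds_available, days=200):
    """Toy SIR model whose infectious curve drives hospital-bed demand.

    Illustrative only: the real AsiaFluCap Simulator tracks 28 resources
    and region-specific parameters inside an Excel workbook.
    """
    s, i, r = pop - 1.0, 1.0, 0.0
    gaps = []
    for day in range(days):
        new_inf = beta * s * i / pop
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        demand = hosp_frac * i               # beds needed on this day
        gaps.append(max(0.0, demand - beds_available))
    return np.array(gaps)

# Example: a region of 1 million people with 500 acute-care beds.
gap = sir_resource_demand(pop=1e6, beta=0.35, gamma=0.2,
                          hosp_frac=0.02, beds_available=500)
print(f"peak bed shortfall: {gap.max():.0f}, days with shortage: {(gap > 0).sum()}")
```

Comparing the gap series across scenarios (e.g., different beta values for mild versus severe pandemics) mirrors the simulator's comparative analysis of resource strategies, in a deliberately minimal form.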
ERIC Educational Resources Information Center
Sheehan, Kathleen M.
2017-01-01
A model-based approach for matching language learners to texts of appropriate difficulty is described. Results are communicated to test takers via a targeted reading range expressed on the reporting scale of an automated text complexity measurement tool (ATCMT). Test takers can use this feedback to select reading materials that are well matched to…
Mbeutcha, Aurélie; Mathieu, Romain; Rouprêt, Morgan; Gust, Kilian M; Briganti, Alberto; Karakiewicz, Pierre I; Shariat, Shahrokh F
2016-10-01
In the context of customized patient care for upper tract urothelial carcinoma (UTUC), decision-making could be facilitated by risk assessment and prediction tools. The aim of this study was to provide a critical overview of existing predictive models and to review emerging promising prognostic factors for UTUC. A literature search of articles published in English from January 2000 to June 2016 was performed using PubMed. Studies on risk group stratification models and predictive tools in UTUC were selected, together with studies on predictive factors and biomarkers associated with advanced-stage UTUC and oncological outcomes after surgery. Various predictive tools have been described for advanced-stage UTUC assessment, disease recurrence and cancer-specific survival (CSS). Most of these models are based on well-established prognostic factors such as tumor stage, grade and lymph node (LN) metastasis, but some also integrate newly described prognostic factors and biomarkers. These new prediction tools seem to reach a high level of accuracy, but they lack external validation and decision-making analysis. The combinations of patient-, pathology- and surgery-related factors together with novel biomarkers have led to promising predictive tools for oncological outcomes in UTUC. However, external validation of these predictive models is a prerequisite before their introduction into daily practice. New models predicting response to therapy are urgently needed to allow accurate and safe individualized management in this heterogeneous disease.
NASA Astrophysics Data System (ADS)
Chen, Hui; Tan, Chao; Lin, Zan; Wu, Tong
2018-01-01
Milk is among the most popular nutrient sources worldwide and is of great interest due to its beneficial medicinal properties. The feasibility of classifying milk powder samples with respect to their brands and of determining protein concentration is investigated by NIR spectroscopy along with chemometrics. Two datasets were prepared for the experiment. One contains 179 samples of four brands for classification and the other contains 30 samples for quantitative analysis. Principal component analysis (PCA) was used for exploratory analysis. Based on an effective model-independent variable selection method, i.e., minimal-redundancy maximal-relevance (MRMR), only 18 variables were selected to construct a partial least-squares discriminant analysis (PLS-DA) model. On the test set, the PLS-DA model based on the selected variable set was compared with the full-spectrum PLS-DA model, both of which achieved 100% accuracy. In quantitative analysis, the partial least-squares regression (PLSR) model constructed from the selected subset of 260 variables significantly outperforms the full-spectrum model. It seems that the combination of NIR spectroscopy, MRMR and PLS-DA or PLSR is a powerful tool for classifying different brands of milk and determining the protein content.
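The workflow described above (variable selection followed by PLS-DA classification) can be sketched in Python on synthetic data. Note that scikit-learn does not ship MRMR; the mutual-information ranking below is a simpler stand-in for the redundancy-aware selection the authors used (dedicated MRMR packages such as pymrmr exist), and PLS-DA is implemented here, as is common, as PLS regression on one-hot class labels.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelBinarizer

rng = np.random.default_rng(0)
X = rng.normal(size=(179, 700))          # stand-in for NIR spectra (179 samples)
y = rng.integers(0, 4, size=179)         # four milk-powder brands

# Rank variables by relevance to the class label; true MRMR also penalizes
# redundancy between variables, which this simple ranking ignores.
mi = mutual_info_classif(X, y, random_state=0)
selected = np.argsort(mi)[::-1][:18]     # keep 18 variables, as in the study

X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, test_size=0.3,
                                          stratify=y, random_state=0)

# PLS-DA: PLS regression on one-hot class indicators, predict the largest score.
Y_tr = LabelBinarizer().fit_transform(y_tr)
pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
y_pred = pls.predict(X_te).argmax(axis=1)
print("accuracy:", (y_pred == y_te).mean())
```

On real spectra the selected wavelengths would carry chemical meaning; with the random data above the accuracy is only a sanity check that the pipeline runs.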
General MACOS Interface for Modeling and Analysis for Controlled Optical Systems
NASA Technical Reports Server (NTRS)
Sigrist, Norbert; Basinger, Scott A.; Redding, David C.
2012-01-01
The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.
GAPIT: genome association and prediction integrated tool.
Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu
2012-09-15
Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
Development of a component design tool for metal hydride heat pumps
NASA Astrophysics Data System (ADS)
Waters, Essene L.
Given current demands for more efficient and environmentally friendly energy sources, hydrogen based energy systems are an increasingly popular field of interest. Within the field, metal hydrides have become a prominent focus of research due to their large hydrogen storage capacity and relative system simplicity and safety. Metal hydride heat pumps constitute one such application, in which heat and hydrogen are transferred to and from metal hydrides. While a significant amount of work has been done to study such systems, the scope of materials selection has been quite limited. Typical studies compare only a few metal hydride materials and provide limited justification for the choice of those few. In this work, a metal hydride component design tool has been developed to enable the targeted down-selection of an extensive database of metal hydrides to identify the most promising materials for use in metal hydride thermal systems. The material database contains over 300 metal hydrides with various physical and thermodynamic properties included for each material. Sub-models for equilibrium pressure, thermophysical data, and default properties are used to predict the behavior of each material within the given system. For a given thermal system, this tool can be used to identify optimal materials out of over 100,000 possible hydride combinations. The selection tool described herein has been applied to a stationary combined heat and power system containing a high-temperature proton exchange membrane (PEM) fuel cell, a hot water tank, and two metal hydride beds used as a heat pump. A variety of factors can be used to select materials including efficiency, maximum and minimum system pressures, pressure difference, coefficient of performance (COP), and COP sensitivity. The targeted down-selection of metal hydrides for this system focuses on the system's COP for each potential pair. The values of COP and COP sensitivity have been used to identify pairs of highest interest for use in this application. The metal hydride component design tool developed in this work selects between metal hydride materials on an unprecedented scale. It can be easily applied to other hydrogen-based thermal systems, making it a powerful and versatile tool.
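The kind of database down-selection described above can be illustrated with the van 't Hoff relation, which links a hydride's reaction enthalpy and entropy to its equilibrium pressure. The sketch below screens hypothetical hydride pairs for a heat-pump temperature lift; the property values, pressure window, and pairing rule are illustrative assumptions, not the dissertation's database or its COP-based ranking.

```python
import itertools
import numpy as np

R = 8.314  # J/(mol K)

# Hypothetical hydride entries: (name, dH [J/mol H2], dS [J/(mol H2 K)]),
# absorption convention (both negative for exothermic hydriding).
hydrides = [
    ("LaNi5",        -30800.0, -108.0),
    ("TiFe",         -28100.0, -106.0),
    ("MmNi4.5Al0.5", -27000.0, -104.0),
    ("ZrMn2",        -38000.0, -121.0),
]

def p_eq(dH, dS, T):
    """Van 't Hoff equilibrium pressure (bar): ln P = dH/(R T) - dS/R."""
    return np.exp(dH / (R * T) - dS / R)

def screen_pairs(t_high, t_low, p_min=0.5, p_max=30.0):
    """Keep pairs whose driving pressures stay in a practical window.

    A crude stand-in for the full design tool, which also evaluates COP,
    COP sensitivity and thermophysical sub-models for >100,000 pairs.
    """
    viable = []
    for (n1, h1, s1), (n2, h2, s2) in itertools.permutations(hydrides, 2):
        p_desorb = p_eq(h1, s1, t_high)   # high-temperature hydride releases H2
        p_absorb = p_eq(h2, s2, t_low)    # low-temperature hydride absorbs H2
        if p_min < p_absorb < p_desorb < p_max:
            viable.append((n1, n2, p_desorb, p_absorb))
    return viable

for pair in screen_pairs(t_high=393.0, t_low=298.0):
    print("{} -> {}: P_des={:.1f} bar, P_abs={:.1f} bar".format(*pair))
```

The real tool replaces the simple pressure-window filter with figures of merit such as COP and its sensitivity, but the structure (loop over candidate pairs, evaluate, rank) is the same.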
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa
This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.
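Although the prototype's actual model is not described in detail here, the underlying idea (minimize the total cost of a personnel mix subject to capability requirements) can be sketched as a small linear program. The cost figures, capability matrix, and requirement levels below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: headcount of [military, DoD civilian, contractor].
cost_per_person = np.array([120_000.0, 95_000.0, 150_000.0])   # illustrative annual costs

# Each capability row must be covered; entries are "capability units per person".
# Rows: logistics, base operations, security (illustrative).
A_cap = np.array([
    [1.0, 1.0, 1.2],
    [0.8, 1.0, 1.0],
    [1.0, 0.2, 0.9],
])
required = np.array([500.0, 300.0, 200.0])

# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so flip signs
# to express "coverage >= requirement".
res = linprog(c=cost_per_person,
              A_ub=-A_cap, b_ub=-required,
              bounds=[(0, None)] * 3, method="highs")

print("optimal mix (mil, civ, ctr):", np.round(res.x, 1))
print("total cost: ${:,.0f}".format(res.fun))
```

A production tool would add risk constraints, scenario-specific requirements and integer headcounts, but the cost-minimizing structure is the same.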
Conducting field studies for testing pesticide leaching models
Smith, Charles N.; Parrish, Rudolph S.; Brown, David S.
1990-01-01
A variety of predictive models are being applied to evaluate the transport and transformation of pesticides in the environment. These include well-known models such as the Pesticide Root Zone Model (PRZM), the Risk of Unsaturated-Saturated Transport and Transformation Interactions for Chemical Concentrations Model (RUSTIC) and the Groundwater Loading Effects of Agricultural Management Systems Model (GLEAMS). The potentially large impacts of using these models as tools for developing pesticide management strategies and regulatory decisions necessitate the development of sound model validation protocols. This paper offers guidance on many of the theoretical and practical problems encountered in the design and implementation of field-scale model validation studies. Recommendations are provided for site selection and characterization, test compound selection, data needs, measurement techniques, statistical design considerations and sampling techniques. A strategy is provided for quantitatively testing models using field measurements.
Magnetically multiplexed heating of single domain nanoparticles
NASA Astrophysics Data System (ADS)
Christiansen, M. G.; Senko, A. W.; Chen, R.; Romero, G.; Anikeeva, P.
2014-05-01
Selective hysteretic heating of multiple collocated types of single domain magnetic nanoparticles (SDMNPs) by alternating magnetic fields (AMFs) may offer a useful tool for biomedical applications. The possibility of "magnetothermal multiplexing" has not yet been realized, in part due to prevalent use of linear response theory to model SDMNP heating in AMFs. Dynamic hysteresis modeling suggests that specific driving conditions play an underappreciated role in determining optimal material selection strategies for high heat dissipation. Motivated by this observation, magnetothermal multiplexing is theoretically predicted and empirically demonstrated by selecting SDMNPs with properties that suggest optimal hysteretic heat dissipation at dissimilar AMF driving conditions. This form of multiplexing could effectively offer multiple channels for minimally invasive biological signaling applications.
Perlis, Roy H
2013-07-01
Early identification of depressed individuals at high risk for treatment resistance could be helpful in selecting the optimal setting and intensity of care. At present, validated tools to facilitate this risk stratification are rarely used in psychiatric practice. Data were drawn from the first two treatment levels of a multicenter antidepressant effectiveness study in major depressive disorder, the STAR*D (Sequenced Treatment Alternatives to Relieve Depression) cohort. This cohort was divided into training, testing, and validation subsets. Only clinical or sociodemographic variables available by or readily amenable to self-report were considered. Multivariate models were developed to discriminate individuals reaching remission with a first or second pharmacological treatment trial from those not reaching remission despite two trials. A logistic regression model achieved an area under the receiver operating characteristic curve exceeding .71 in training, testing, and validation cohorts and maintained good calibration across cohorts. Performance of three alternative machine learning models (a naïve Bayes classifier, a support vector machine, and a random forest) was less consistent. Similar performance was observed between more and less severe depression, men and women, and primary versus specialty care sites. A web-based calculator was developed that implements this tool and provides graphical estimates of risk. Risk for treatment resistance among outpatients with major depressive disorder can be estimated with a simple model incorporating baseline sociodemographic and clinical features. Future studies should examine the performance of this model in other clinical populations and its utility in treatment selection or clinical trial design. Copyright © 2013 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
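A minimal sketch of this kind of risk-stratification model (logistic regression evaluated by ROC AUC and calibration) is shown below on synthetic data; the predictors and effect sizes are invented and are not the STAR*D variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(1)
n = 2000
# Synthetic self-report style predictors (illustrative, not the study's items).
X = rng.normal(size=(n, 6))
logit = 0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.3 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # 1 = treatment resistant

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = LogisticRegression().fit(X_tr, y_tr)
prob = clf.predict_proba(X_te)[:, 1]

print("AUC:", round(roc_auc_score(y_te, prob), 3))
observed, predicted = calibration_curve(y_te, prob, n_bins=5)
print("calibration (predicted vs. observed):", np.round(predicted, 2), np.round(observed, 2))
```

The same fit/evaluate split underlies the web calculator described in the abstract; only the feature set and the cohort differ.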
Liu, Xinxue; Wong, Angela; Kadri, Sudarshan R; Corovic, Andrej; O'Donovan, Maria; Lao-Sirieix, Pierre; Lovat, Laurence B; Burnham, Rodney W; Fitzgerald, Rebecca C
2014-01-01
Barrett's esophagus (BE) occurs as a consequence of reflux and is a risk factor for esophageal adenocarcinoma. The current "gold standard" for diagnosing BE is endoscopy, which remains prohibitively expensive and impractical as a population screening tool. We aimed to develop a pre-screening tool to aid decision making for diagnostic referrals. A prospective (training) cohort of 1603 patients attending for endoscopy was used for identification of risk factors to develop a risk prediction model. Factors associated with BE in the univariate analysis were selected to develop prediction models that were validated in an independent, external cohort of 477 non-BE patients referred for endoscopy with symptoms of reflux or dyspepsia. Two prediction models were developed separately, one for columnar lined epithelium (CLE) of any length and one using a stricter definition of intestinal metaplasia (IM) with segments ≥ 2 cm, with areas under the ROC curves (AUC) of 0.72 (95%CI: 0.67-0.77) and 0.81 (95%CI: 0.76-0.86), respectively. The two prediction models included demographics (age, sex), symptoms (heartburn, acid reflux, chest pain, abdominal pain) and medication for "stomach" symptoms. These two models were validated in the independent cohort with AUCs of 0.61 (95%CI: 0.54-0.68) and 0.64 (95%CI: 0.52-0.77) for CLE and IM ≥ 2 cm, respectively. We have identified and validated two prediction models for CLE and IM ≥ 2 cm. Both models have fair prediction accuracy and can select out around 20% of individuals unlikely to benefit from investigation for Barrett's esophagus. Such prediction models have the potential to generate useful cost-savings for BE screening among the symptomatic population.
DOT National Transportation Integrated Search
2017-08-01
Central to the effective design of work zones is being able to understand how drivers behave as they approach and enter a work zone area. States use simulation tools in modeling freeway work zones to predict work zone impacts and to select optimal de...
NASA Astrophysics Data System (ADS)
Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan
2015-04-01
Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
NASA Astrophysics Data System (ADS)
Coarfa, Violeta Florentina
2007-12-01
Air toxics, also called hazardous air pollutants (HAPs), pose a serious threat to human health and the environment. Their study is important in the Houston area, where point sources, mostly located along the Ship Channel, together with mobile and area sources, contribute to large emissions of such toxic pollutants. Previous studies carried out in this area found dangerous levels of different HAPs in the atmosphere. This thesis presents several studies that were performed for the aromatic and non-aromatic air toxics in the Houston-Galveston area (HGA). For these studies we developed several tools: (1) a refined chemical mechanism, which explicitly represents 18 aromatic air toxics that were lumped under two model species by the previous version, based on their reactivity with the hydroxyl radical; (2) an engineering version of an existing air toxics photochemical model that enables us to perform much faster long-term simulations than the original model, leading to an 8-9 times improvement in running time across different computing platforms; (3) a combined emission inventory based on the available emission databases. Using the developed tools, we quantified the mobile source impact on a few selected air toxics, and analyzed the temporal and spatial variation of selected aromatic and non-aromatic air toxics in a few regions within the Houston area; these regions were characterized by different emissions and environmental conditions.
Christin, Zachary; Bagstad, Kenneth J.; Verdone, Michael
2016-01-01
Restoring degraded forests and agricultural lands has become a global conservation priority. A growing number of tools can quantify ecosystem service tradeoffs associated with forest restoration. This evolving “tools landscape” presents a dilemma: more tools are available, but selecting appropriate tools has become more challenging. We present a Restoration Ecosystem Service Tool Selector (RESTS) framework that describes key characteristics of 13 ecosystem service assessment tools. Analysts enter information about their decision context, services to be analyzed, and desired outputs. Tools are filtered and presented based on five evaluative criteria: scalability, cost, time requirements, handling of uncertainty, and applicability to benefit-cost analysis. RESTS uses a spreadsheet interface but a web-based interface is planned. Given the rapid evolution of ecosystem services science, RESTS provides an adaptable framework to guide forest restoration decision makers toward tools that can help quantify ecosystem services in support of restoration.
The Shuttle Cost and Price model
NASA Technical Reports Server (NTRS)
Leary, Katherine; Stone, Barbara
1983-01-01
The Shuttle Cost and Price (SCP) model was developed as a tool to assist in evaluating major aspects of Shuttle operations that have direct and indirect economic consequences. It incorporates the major aspects of NASA Pricing Policy and corresponds to the NASA definition of STS operating costs. An overview of the SCP model is presented and the cost model portion of SCP is described in detail. Selected recent applications of the SCP model to NASA Pricing Policy issues are presented.
Harmony Search as a Powerful Tool for Feature Selection in QSPR Study of the Drugs Lipophilicity.
Bahadori, Behnoosh; Atabati, Morteza
2017-01-01
Lipophilicity represents one of the most studied and most frequently used fundamental physicochemical properties. In the present work, the harmony search (HS) algorithm is suggested for feature selection in quantitative structure-property relationship (QSPR) modeling to predict the lipophilicity of neutral, acidic, basic and amphoteric drugs that were determined by UHPLC. Harmony search is a music-based metaheuristic optimization algorithm, inspired by the observation that the aim of music is to search for a perfect state of harmony. Semi-empirical quantum-chemical calculations at the AM1 level were used to find the optimum 3D geometry of the studied molecules, and descriptors (1497 descriptors) were calculated by the Dragon software. The descriptors selected by the harmony search algorithm (9 descriptors) were applied for model development using multiple linear regression (MLR). In comparison with other feature selection methods such as the genetic algorithm and simulated annealing, the harmony search algorithm gave better results. The root mean square errors (RMSE) with and without leave-one-out cross-validation (LOOCV) were 0.417 and 0.302, respectively. The results were compared with those obtained from the genetic algorithm and simulated annealing methods, showing that HS is a helpful tool for feature selection with fine performance. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
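A compact, simplified sketch of harmony-search-style feature selection for an MLR model is shown below; it copies or randomizes whole descriptor subsets and "pitch-adjusts" by swapping one descriptor, which is a caricature of the per-dimension improvisation used in full harmony search. The data are synthetic stand-ins for Dragon descriptors, and all parameter values are illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(80, 200))                 # stand-in for Dragon descriptors
y = X[:, :5] @ np.array([1.2, -0.8, 0.5, 0.3, -0.4]) + rng.normal(0, 0.3, 80)

N_KEEP, HMS, HMCR, PAR, ITERS = 9, 20, 0.9, 0.3, 300

def rmse(mask):
    """Cross-validated RMSE of an MLR model on the selected descriptors."""
    cols = np.flatnonzero(mask)
    scores = cross_val_score(LinearRegression(), X[:, cols], y,
                             scoring="neg_root_mean_squared_error", cv=5)
    return -scores.mean()

def random_mask():
    mask = np.zeros(X.shape[1], dtype=bool)
    mask[rng.choice(X.shape[1], N_KEEP, replace=False)] = True
    return mask

# Harmony memory: a population of candidate descriptor subsets.
memory = [random_mask() for _ in range(HMS)]
fitness = [rmse(m) for m in memory]

for _ in range(ITERS):
    new = np.zeros(X.shape[1], dtype=bool)
    # Improvise: take a subset from memory (prob. HMCR) or a random subset,
    # then "pitch adjust" by swapping one descriptor with probability PAR.
    donor = memory[rng.integers(HMS)]
    pool = np.flatnonzero(donor) if rng.random() < HMCR else rng.permutation(X.shape[1])[:N_KEEP]
    new[pool[:N_KEEP]] = True
    if rng.random() < PAR:
        on, off = np.flatnonzero(new), np.flatnonzero(~new)
        new[rng.choice(on)] = False
        new[rng.choice(off)] = True
    f = rmse(new)
    worst = int(np.argmax(fitness))
    if f < fitness[worst]:                      # replace the worst harmony
        memory[worst], fitness[worst] = new, f

best = memory[int(np.argmin(fitness))]
print("best CV RMSE:", round(min(fitness), 3), "descriptors:", np.flatnonzero(best))
```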
Bayesian model selection applied to artificial neural networks used for water resources modeling
NASA Astrophysics Data System (ADS)
Kingston, Greer B.; Maier, Holger R.; Lambert, Martin F.
2008-04-01
Artificial neural networks (ANNs) have proven to be extremely valuable tools in the field of water resources engineering. However, one of the most difficult tasks in developing an ANN is determining the optimum level of complexity required to model a given problem, as there is no formal systematic model selection method. This paper presents a Bayesian model selection (BMS) method for ANNs that provides an objective approach for comparing models of varying complexity in order to select the most appropriate ANN structure. The approach uses Markov Chain Monte Carlo posterior simulations to estimate the evidence in favor of competing models and, in this study, three known methods for doing this are compared in terms of their suitability for being incorporated into the proposed BMS framework for ANNs. However, it is acknowledged that it can be particularly difficult to accurately estimate the evidence of ANN models. Therefore, the proposed BMS approach for ANNs incorporates a further check of the evidence results by inspecting the marginal posterior distributions of the hidden-to-output layer weights, which unambiguously indicate any redundancies in the hidden layer nodes. The fact that this check is available is one of the greatest advantages of the proposed approach over conventional model selection methods, which do not provide such a test and instead rely on the modeler's subjective choice of selection criterion. The advantages of a total Bayesian approach to ANN development, including training and model selection, are demonstrated on two synthetic and one real world water resources case study.
Flynn-Evans, Erin E.; Lockley, Steven W.
2016-01-01
Study Objectives: There is currently no questionnaire-based pre-screening tool available to detect non-24-hour sleep-wake rhythm disorder (N24HSWD) among blind patients. Our goal was to develop such a tool, derived from gold standard, objective hormonal measures of circadian entrainment status, for the detection of N24HSWD among those with visual impairment. Methods: We evaluated the contribution of 40 variables in their ability to predict N24HSWD among 127 blind women, classified using urinary 6-sulfatoxymelatonin period, an objective marker of circadian entrainment status in this population. We subjected the 40 candidate predictors to 1,000 bootstrapped iterations of a logistic regression forward selection model to predict N24HSWD, with model inclusion set at the p < 0.05 level. We removed any predictors that were not selected at least 1% of the time in the 1,000 bootstrapped models and applied a second round of 1,000 bootstrapped logistic regression forward selection models to the remaining 23 candidate predictors. We included all questions that were selected at least 10% of the time in the final model. We subjected the selected predictors to a final logistic regression model to predict N24HSWD over 1,000 bootstrapped models to calculate the concordance statistic and adjusted optimism of the final model. We used this information to generate a predictive model and determined the sensitivity and specificity of the model. Finally, we applied the model to a cohort of 1,262 blind women who completed the survey, but did not collect urine samples. Results: The final model consisted of eight questions. The concordance statistic, adjusted for bootstrapping, was 0.85. The positive predictive value was 88%, and the negative predictive value was 79%. Applying this model to our larger dataset of women, we found that 61% of those without light perception, and 27% with some degree of light perception, would be referred for further screening for N24HSWD. Conclusions: Our model has predictive utility sufficient to serve as a pre-screening questionnaire for N24HSWD among the blind. Citation: Flynn-Evans EE, Lockley SW. A pre-screening questionnaire to predict non-24-hour sleep-wake rhythm disorder (N24HSWD) among the blind. J Clin Sleep Med 2016;12(5):703–710. PMID:26951421
CalSimHydro Tool - A Web-based interactive tool for the CalSim 3.0 Hydrology Preprocessor
NASA Astrophysics Data System (ADS)
Li, P.; Stough, T.; Vu, Q.; Granger, S. L.; Jones, D. J.; Ferreira, I.; Chen, Z.
2011-12-01
CalSimHydro, the CalSim 3.0 Hydrology Preprocessor, is an application designed to automate the various steps in the computation of hydrologic inputs for CalSim 3.0, a water resources planning model developed jointly by the California State Department of Water Resources and the United States Bureau of Reclamation, Mid-Pacific Region. CalSimHydro consists of a five-step FORTRAN-based program that runs the individual models in succession, passing information from one model to the next and aggregating data as required by each model. The final product of CalSimHydro is an updated CalSim 3.0 state variable (SV) DSS input file. CalSimHydro consists of (1) a Rainfall-Runoff Model to compute monthly infiltration, (2) a Soil moisture and demand calculator (IDC) that estimates surface runoff, deep percolation, and water demands for natural vegetation cover and various crops other than rice, (3) a Rice Water Use Model to compute the water demands, deep percolation, irrigation return flow, and runoff from precipitation for the rice fields, (4) a Refuge Water Use Model that simulates the ponding operations for managed wetlands, and (5) a Data Aggregation and Transfer Module to aggregate the outputs from the above modules and transfer them to the CalSim SV input file. In this presentation, we describe a web-based user interface for CalSimHydro using the Google Earth Plug-In. The CalSimHydro tool allows users to: (1) interact with geo-referenced layers of the Water Budget Areas (WBA) and Demand Units (DU) displayed over the Sacramento Valley; (2) view the input parameters of the hydrology preprocessor for a selected WBA or DU in a time series plot or a tabular form; (3) edit the values of the input parameters in the table or by downloading a spreadsheet of the selected parameter in a selected time range; (4) run the CalSimHydro modules on the backend server and notify the user when the job is done; (5) visualize the model output and compare it with a base run result; and (6) download the output SV file to be used to run CalSim 3.0. The CalSimHydro tool streamlines the complicated steps to configure and run the hydrology preprocessor by providing a user-friendly visual interface and back-end services to validate user inputs and manage the model execution. It is a powerful addition to the new CalSim 3.0 system.
User's manual for LINEAR, a FORTRAN program to derive linear aircraft models
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Patterson, Brian P.; Antoniewicz, Robert F.
1987-01-01
This report documents a FORTRAN program that provides a powerful and flexible tool for the linearization of aircraft models. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.
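LINEAR itself is a FORTRAN program, but the numerical idea (extracting state and control matrices from nonlinear equations of motion about an operating point) can be illustrated in a few lines of Python using central differences. The toy dynamics below are invented for the example and are not LINEAR's aerodynamic model.

```python
import numpy as np

def nonlinear_dynamics(x, u):
    """Toy longitudinal model: x = [airspeed, pitch rate], u = [elevator].
    Stands in for the user-supplied nonlinear aerodynamic model."""
    v, q = x
    de = u[0]
    return np.array([-0.02 * v + 4.0 * q - 0.1 * de,
                     -0.5 * q - 0.8 * de + 1e-4 * v * v])

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference Jacobians: x_dot ~= A (x - x0) + B (u - u0) near the trim point."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

A, B = linearize(nonlinear_dynamics, x0=np.array([100.0, 0.0]), u0=np.array([0.0]))
print("A =\n", A.round(4), "\nB =\n", B.round(4))
```

The ability to choose which states, controls and observations enter the matrices, as LINEAR allows, amounts to selecting which rows and columns of these Jacobians are retained.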
NASA Astrophysics Data System (ADS)
Kuznetsova, M. M.; Liu, Y. H.; Rastaetter, L.; Pembroke, A. D.; Chen, L. J.; Hesse, M.; Glocer, A.; Komar, C. M.; Dorelli, J.; Roytershteyn, V.
2016-12-01
The presentation will provide an overview of new tools, services and models implemented at the Community Coordinated Modeling Center (CCMC) to facilitate MMS dayside results analysis. We will provide updates on the implementation of Particle-in-Cell (PIC) simulations at the CCMC and opportunities for on-line visualization and analysis of results of PIC simulations of asymmetric magnetic reconnection for different guide fields and boundary conditions. Fields, plasma parameters, and particle distribution moments, as well as particle distribution functions calculated in selected regions of the vicinity of reconnection sites, can be analyzed through the web-based interactive visualization system. In addition, there are options to request distribution functions in user-selected regions of interest, to fly through simulated magnetic reconnection configurations, and to view a map of distributions to facilitate comparisons with observations. A broad collection of global magnetosphere models hosted at the CCMC provides an opportunity to put MMS observations and local PIC simulations into global context. We recently implemented the RECON-X post-processing tool (Glocer et al, 2016) which allows users to determine the location of the separator surface around closed field lines and between open field lines and solar wind field lines. The tool also finds the separatrix line where the two surfaces touch and the positions of magnetic nulls. The surfaces and the separatrix line can be visualized relative to satellite positions in the dayside magnetosphere using an interactive HTML-5 visualization for each time step processed. To validate global magnetosphere models' capability to simulate the locations of dayside magnetosphere boundaries, we will analyze the proximity of MMS to simulated separatrix locations for a set of MMS diffusion region crossing events.
NASA Technical Reports Server (NTRS)
Allen, Cheryl L.
1991-01-01
Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the Spacecraft Design and Cost Model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The Knowledge Engineering System (KES) expert system development tool was used to implement a smarter equipment selection algorithm than that which is currently achievable through the use of a standard data base system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are described, and once the details of the design are characterized, an example of its implementation is demonstrated.
NASA Astrophysics Data System (ADS)
Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin
2016-12-01
This paper presents an online method for estimating cutting error by analyzing internal sensor readings. The internal sensors of the numerical control (NC) machine tool are selected to avoid installation problems. A mathematical model for estimating the cutting error is proposed to compute the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on the cutting theory of gears. In order to verify the effectiveness of the proposed model, it was tested in simulations and experiments on a gear generating grinding process. The cutting error of the gear was estimated and the factors which induce cutting error were analyzed. The simulations and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the work-piece during the machining process.
Karakülah, G.; Dicle, O.; Sökmen, S.; Çelikoğlu, C.C.
2015-01-01
Background: The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. Objective: The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. Methods: The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. Later, a web-based decision support tool named corRECTreatment was developed. The consistency between the treatment recommendations of expert opinion and those of the decision support tool was examined. Two surgeons were requested to recommend a treatment and an overall survival value for the treatment for 20 different cases that we selected and turned into scenarios from among the most common and rare treatment options in the patient data set. Results: In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). Depending on the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. Conclusions: The decision model and corRECTreatment, developed by applying these on real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and facilitate them in making projections about treatment options. PMID:25848413
Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C
2015-01-01
The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. Later, a web-based decision support tool named corRECTreatment was developed. The consistency between the treatment recommendations of expert opinion and those of the decision support tool was examined. Two surgeons were requested to recommend a treatment and an overall survival value for the treatment for 20 different cases that we selected and turned into scenarios from among the most common and rare treatment options in the patient data set. In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). Depending on the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying these on real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and facilitate them in making projections about treatment options.
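The AHP step used in corRECTreatment (deriving criterion priorities from a pairwise-comparison matrix and checking its consistency) follows Saaty's standard eigenvector method, sketched below in Python. The 3x3 example matrix is illustrative only, not one of the study's 8- or 10-criterion matrices.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Principal-eigenvector priorities and consistency ratio for an AHP
    pairwise-comparison matrix (Saaty's method)."""
    pairwise = np.asarray(pairwise, dtype=float)
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                      # normalized priority weights
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)         # consistency index
    # Saaty's random index for matrix sizes 1..10
    ri = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49][n - 1]
    cr = ci / ri if ri > 0 else 0.0      # consistency ratio; < 0.1 is acceptable
    return w, cr

# Illustrative 3-criterion comparison matrix.
M = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, cr = ahp_priorities(M)
print("priorities:", weights.round(3), "consistency ratio:", round(cr, 3))
```

The resulting weights are what a decision tree, as in the study, would then use to order or split on criteria; the consistency ratio below 0.1 corresponds to the check reported in the abstract.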
Which benefits in the use of a modeling platform : The VSoil example.
NASA Astrophysics Data System (ADS)
Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Maron, Pierre-Alain; Moitrier, Nicolas; Nouguier, Cedric; Moitrier, Nathalie; Beudez, Nicolas
2015-04-01
In the environmental community, the need to couple models and the associated knowledge has emerged recently. The development of a coupling tool or of a modeling platform is mainly driven by the necessity to create models accounting for multiple processes and to take into account the feedback between these processes. Models focusing on a restricted number of processes exist, and the coupling of these numerical tools thus appeared as an efficient and rapid means to fill the identified gaps. Several tools have been proposed: OMS3 (David et al. 2013); the CSDMS framework (Peckham et al. 2013); the OpenMI project developed within the frame of the European Community (OpenMI, 2011). However, what we should expect from a modeling platform could be more ambitious than only coupling existing numerical codes. We believe that we need to share easily not only our numerical representations but also the attached knowledge. We need to rapidly and easily develop complex models in order to have tools that can respond to current issues on soil functioning and soil evolution within the frame of global change. We also need to share in a common frame our visions of soil functioning at various scales, on the one hand to strengthen our collaborations, and, on the other hand, to make them visible to the other communities working on environmental issues. The presentation will briefly introduce the VSoil platform. The platform is able to manipulate concepts and numerical representations of soil processes. The tool helps in assembling modules to create a model and automatically generates an executable code and a GUI. The potentialities of the tool will be illustrated on a few selected cases.
A visualization method for teaching the geometric design of highways
DOT National Transportation Integrated Search
2000-04-11
In this project the authors employed state-of-the-art technology for developing visualization tools for teaching highway design. Specifically, the authors used photolog images as the basis for developing dynamic 3-D models of selected geometric eleme...
EPA Presentation Regarding the Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) Tool
This page contains a selection of the presentations that EPA has given publicly about its work on the Midterm Evaluation (MTE). It highlights EPA's benchmarking and modeling activities relating to light-duty greenhouse gas (GHG) emissions.
Effects-based strategy development through center of gravity and target system analysis
NASA Astrophysics Data System (ADS)
White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen
2003-09-01
This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.
A Flexible Statechart-to-Model-Checker Translator
NASA Technical Reports Server (NTRS)
Rouquette, Nicolas; Dunphy, Julia; Feather, Martin S.
2000-01-01
Many current-day software design tools offer some variant of statechart notation for system specification. We, like others, have built an automatic translator from (a subset of) statecharts to a model checker, for use in validating behavioral requirements. Our translator is designed to be flexible. This allows us to quickly adjust the translator to variants of statechart semantics, including problem-specific notational conventions that designers employ. Our system demonstration will be of interest to the following two communities: (1) Potential end-users: Our demonstration will show translation from statecharts created in a commercial UML tool (Rational Rose) to Promela, the input language of Holzmann's model checker SPIN. The translation is accomplished automatically. To accommodate the major variants of statechart semantics, our tool offers user-selectable choices among semantic alternatives. Options for customized semantic variants are also made available. The net result is an easy-to-use tool that operates on a wide range of statechart diagrams to automate the pathway to model-checking input. (2) Other researchers: Our translator embodies, in one tool, ideas and approaches drawn from several sources. Solutions to the major challenges of statechart-to-model-checker translation (e.g., determining which transition(s) will fire, handling of concurrent activities) are addressed in a uniform, fully mechanized setting. The way in which the underlying architecture of the translator itself facilitates flexible and customizable translation will also be evident.
Simulation environment and graphical visualization environment: a COPD use-case
2014-01-01
Background: Today, many different tools are developed to execute and visualize physiological models that represent the human physiology. Most of these tools run models written in very specific programming languages, which in turn simplifies the communication among models. Nevertheless, not all of these tools are able to run models written in different programming languages. In addition, interoperability between such models remains an unresolved issue. Results: In this paper we present a simulation environment that allows, first, the execution of models developed in different programming languages and, second, the communication of parameters to interconnect these models. This simulation environment, developed within the Synergy-COPD project, aims at helping and supporting bio-researchers and medical students understand the internal mechanisms of the human body through the use of physiological models. This tool is composed of a graphical visualization environment, which is a web interface through which the user can interact with the models, and a simulation workflow management system composed of a control module and a data warehouse manager. The control module monitors the correct functioning of the whole system. The data warehouse manager is responsible for managing the stored information and supporting its flow among the different modules. This simulation environment has been validated with the integration of three models: two deterministic, i.e., based on linear and differential equations, and one probabilistic, i.e., based on probability theory. These models were selected based on the disease under study in this project, i.e., chronic obstructive pulmonary disease. Conclusion: It has been shown that the simulation environment presented here allows the user to research and study the internal mechanisms of human physiology by the use of models via a graphical visualization environment. A new tool for bio-researchers is ready for deployment in various use-case scenarios. PMID:25471327
A material based approach to creating wear resistant surfaces for hot forging
NASA Astrophysics Data System (ADS)
Babu, Sailesh
Tools and dies used in metal forming are characterized by extremely high temperatures at the interface, high local pressures and large metal-to-metal sliding. These harsh conditions result in accelerated wear of tooling. Lubrication of tools, done to improve metal flow, drastically quenches the surface layers of the tools and compounds the tool failure problem. This phenomenon becomes a serious issue when forged parts are complex and are expected to meet tight tolerances. Unpredictable and hence uncontrolled wear and degradation of tooling result in poor part quality and premature tool failure, which in turn result in high scrap, shop downtime, poor efficiency and high cost. The objective of this dissertation is to develop a computer-based methodology for analyzing the requirements of hot forging tooling to resist wear and plastic deformation, and for predicting the life cycle of forge tooling. Development of such a system is complicated by the fact that wear and degradation of tooling are influenced not only by the die material used but also by numerous process controls such as lubricant, dilution ratio, forging temperature, equipment used and tool geometries, among others. Phenomenological models available in the literature give us a good rule of thumb for selecting materials but do not provide a way to evaluate their performance in the field. Once a material is chosen, there are no proven approaches to creating surfaces out of these materials. Coating approaches like PVD and CVD cannot generate the thick coatings necessary to withstand the conditions of hot forging. Welding cannot generate complex surfaces without several secondary operations like heat treating and machining. If careful procedures are not followed, welds crack and seldom survive forging loads. There is a strong need for an approach to selectively, reliably and precisely deposit a material of choice on an existing surface such that it exhibits not only good tribological properties but also good adhesion to the substrate. The dissertation outlines the development of a new cyclic contact test designed to recreate the intermittent tempering seen in hot forging. This test has been used to validate the use of tempering parameters in modeling the in-service softening of tool steel surfaces. The dissertation also outlines an industrial case study, conducted at a forging company, to validate the wear model. It further outlines efforts at Ohio State University to deposit nickel aluminide on an AISI H13 substrate using Laser Engineered Net Shaping (LENS). Results are reported from an array of experiments conducted using a LENS 750 machine at various power levels, table speeds and hatch spacings. Results pertaining to bond quality, surface finish, compositional gradients and hardness are provided. Also, a thermal-based finite element numerical model that was used to simulate the LENS process is presented, along with some demonstrated results.
Anomaly Detection for Next-Generation Space Launch Ground Operations
NASA Technical Reports Server (NTRS)
Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.
2010-01-01
NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.
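In the spirit of the data-driven approach described above, though not IMS itself, the following sketch learns "in-family" behavior by clustering nominal telemetry and flags new samples whose distance to the nearest cluster exceeds what was seen during training. The telemetry channels, distributions, and threshold percentile are invented for the example.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Nominal GSE-style telemetry: e.g., [tank pressure, valve temperature, flow rate].
nominal = rng.normal(loc=[300.0, 20.0, 5.0], scale=[5.0, 1.0, 0.2], size=(5000, 3))

# "Training": summarize nominal behavior as a set of cluster centers,
# and record how far nominal points typically sit from their nearest center.
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(nominal)
nominal_dist = np.min(km.transform(nominal), axis=1)
threshold = np.percentile(nominal_dist, 99.5)

def deviation(sample):
    """Distance from the nearest nominal cluster center; larger = more out-of-family."""
    return float(np.min(km.transform(np.atleast_2d(sample)), axis=1)[0])

for s in ([301.0, 20.3, 5.1], [340.0, 26.0, 3.0]):   # in-family, then off-nominal
    d = deviation(s)
    print(s, "deviation =", round(d, 2), "ANOMALY" if d > threshold else "nominal")
```

The key property, shared with the approach in the abstract, is that no rules or physics models are written by hand; the notion of "nominal" is learned entirely from archived data.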
Computational tool for simulation of power and refrigeration cycles
NASA Astrophysics Data System (ADS)
Córdoba Tuta, E.; Reyes Orozco, M.
2016-07-01
Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles allows modeling the optimal changes for best performance. There is also a big boom in research on the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration, and in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and to select the working fluid would be helpful, because sources of heat from cogeneration are very different and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed using the multiplatform environment Qt; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles and the selection of the type of fluid (thermodynamic properties are calculated through the CoolProp library), calculates the plant efficiency, identifies the fractions of flow in each branch and finally generates a very educational report in PDF format via the LaTeX tool.
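CoolProp also exposes its property routines to Python through PropsSI, so the kind of cycle calculation the tool performs can be sketched directly. The ideal Rankine cycle below, with illustrative boiler and condenser pressures, is an example of the sort of analysis described, not code from the authors' C++ tool.

```python
from CoolProp.CoolProp import PropsSI   # pip install coolprop

# Ideal Rankine cycle with water: boiler 40 bar / 500 C, condenser 0.08 bar.
p_hi, p_lo, T_turbine_in = 40e5, 0.08e5, 773.15

# State 1 -> 2: isentropic pump; state 3 -> 4: isentropic turbine.
h1 = PropsSI("H", "P", p_lo, "Q", 0, "Water")             # saturated liquid at condenser
s1 = PropsSI("S", "P", p_lo, "Q", 0, "Water")
h2 = PropsSI("H", "P", p_hi, "S", s1, "Water")             # after pump
h3 = PropsSI("H", "P", p_hi, "T", T_turbine_in, "Water")   # superheated steam
s3 = PropsSI("S", "P", p_hi, "T", T_turbine_in, "Water")
h4 = PropsSI("H", "P", p_lo, "S", s3, "Water")             # turbine exhaust

w_net = (h3 - h4) - (h2 - h1)   # specific net work, J/kg
q_in = h3 - h2                  # specific heat input, J/kg
print("thermal efficiency: {:.1%}".format(w_net / q_in))
```

Swapping "Water" for a refrigerant string supported by CoolProp is essentially the working-fluid selection step the abstract mentions for ORC design.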
Procedure for the Selection and Validation of a Calibration Model I-Description and Application.
Desharnais, Brigitte; Camirand-Lemyre, Félix; Mireault, Pascal; Skinner, Cameron D
2017-05-01
Calibration model selection is required for all quantitative methods in toxicology and more broadly in bioanalysis. This typically involves selecting the equation order (quadratic or linear) and weighting factor correctly modelizing the data. A mis-selection of the calibration model will generate lower quality control (QC) accuracy, with an error up to 154%. Unfortunately, simple tools to perform this selection and tests to validate the resulting model are lacking. We present a stepwise, analyst-independent scheme for selection and validation of calibration models. The success rate of this scheme is on average 40% higher than a traditional "fit and check the QCs accuracy" method of selecting the calibration model. Moreover, the process was completely automated through a script (available in Supplemental Data 3) running in RStudio (free, open-source software). The need for weighting was assessed through an F-test using the variances of the upper limit of quantification and lower limit of quantification replicate measurements. When weighting was required, the choice between 1/x and 1/x2 was determined by calculating which option generated the smallest spread of weighted normalized variances. Finally, model order was selected through a partial F-test. The chosen calibration model was validated through Cramer-von Mises or Kolmogorov-Smirnov normality testing of the standardized residuals. Performance of the different tests was assessed using 50 simulated data sets per possible calibration model (e.g., linear-no weight, quadratic-no weight, linear-1/x, etc.). This first of two papers describes the tests, procedures and outcomes of the developed procedure using real LC-MS-MS results for the quantification of cocaine and naltrexone. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
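The authors' automation is an R script; as a hedged sketch only, the snippet below reproduces two of the statistical steps described above (the variance F-test for deciding whether weighting is needed, and the partial F-test for choosing between linear and quadratic order) in generic Python. It follows the general statistics named in the abstract, not the authors' script or decision thresholds.

```python
# Two steps of calibration-model selection: F-test for weighting need and
# partial F-test for model order (generic implementation, illustrative only).
import numpy as np
from scipy import stats

def needs_weighting(lloq_reps, uloq_reps, alpha=0.05):
    """F-test comparing replicate variances at the two ends of the curve."""
    f = np.var(uloq_reps, ddof=1) / np.var(lloq_reps, ddof=1)
    p = stats.f.sf(f, len(uloq_reps) - 1, len(lloq_reps) - 1)
    return p < alpha            # significant -> variance grows with concentration

def quadratic_needed(x, y, alpha=0.05):
    """Partial F-test: does the quadratic term significantly reduce the SSE?"""
    lin, quad = np.polyfit(x, y, 1), np.polyfit(x, y, 2)
    sse_lin = np.sum((y - np.polyval(lin, x)) ** 2)
    sse_quad = np.sum((y - np.polyval(quad, x)) ** 2)
    df_quad = len(x) - 3
    f = (sse_lin - sse_quad) / (sse_quad / df_quad)
    return stats.f.sf(f, 1, df_quad) < alpha
```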
iPat: intelligent prediction and association tool for genomic research.
Chen, Chunpeng James; Zhang, Zhiwu
2018-06-01
The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which is the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.
A software tool for dataflow graph scheduling
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1994-01-01
A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on multiple processors. The dataflow paradigm is very useful in exposing the parallelism inherent in algorithms. It provides a graphical and mathematical model which describes a partial ordering of algorithm tasks based on data precedence.
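A minimal illustration of the dataflow idea (not the paper's tool): tasks are ordered only by data precedence, so any topological order is a legal schedule, and tasks with no mutual dependence may be assigned to different processors. Task names are hypothetical.

```python
# Topological ordering of a toy dataflow graph using the standard library.
from graphlib import TopologicalSorter

dataflow = {            # task -> set of tasks whose outputs it consumes
    "fft": {"read"},
    "filter": {"read"},
    "combine": {"fft", "filter"},
    "write": {"combine"},
}
print(list(TopologicalSorter(dataflow).static_order()))
# e.g. ['read', 'fft', 'filter', 'combine', 'write']; 'fft' and 'filter'
# can run in parallel because neither precedes the other.
```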
Semantic Importance Sampling for Statistical Model Checking
2015-01-16
Fragmentary excerpt from the report: SIS limits SMT calls while maintaining correctness; SIS is implemented in a tool called osmosis, which is used to verify a number of stochastic systems.
Model Selection in the Analysis of Photoproduction Data
NASA Astrophysics Data System (ADS)
Landay, Justin
2017-01-01
Scattering experiments provide one of the most powerful and useful tools for probing matter to better understand its fundamental properties governed by the strong interaction. As the spectroscopy of the excited states of nucleons enters a new era of precision ushered in by improved experiments at Jefferson Lab and other facilities around the world, traditional partial-wave analysis methods must be adjusted accordingly. In this poster, we present a rigorous set of statistical tools and techniques that we implemented; most notably the LASSO method, which serves to select the simplest model and allows us to avoid overfitting. In establishing the spectrum of excited baryons, it avoids overpopulation of the spectrum and thus the occurrence of false positives. This is a prerequisite to reliably compare theories like lattice QCD or quark models to experiments. Here, we demonstrate the principle by simultaneously fitting three observables in neutral pion photoproduction (the differential cross section, beam asymmetry and target polarization) across thousands of data points. Other authors include Michael Doring, Bin Hu, and Raquel Molina.
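As a hedged sketch of the general LASSO idea used here for model selection (not the authors' partial-wave fit), the L1 penalty drives the coefficients of superfluous terms to exactly zero, so the surviving terms define the simplest adequate model. The data below are synthetic.

```python
# LASSO as a model-selection device: superfluous terms are zeroed out.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 20))            # 20 candidate terms/amplitudes
true_coef = np.zeros(20)
true_coef[[2, 5]] = [1.5, -0.8]            # only two terms really contribute
y = X @ true_coef + rng.normal(scale=0.1, size=1000)   # synthetic observables

fit = LassoCV(cv=5).fit(X, y)              # penalty strength chosen by CV
selected = np.flatnonzero(fit.coef_)       # terms kept by the L1 penalty
print(selected)                            # typically -> [2 5]
```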
Duwe, Grant; Freske, Pamela J
2012-08-01
This study presents the results from efforts to revise the Minnesota Sex Offender Screening Tool-Revised (MnSOST-R), one of the most widely used sex offender risk-assessment tools. The updated instrument, the MnSOST-3, contains nine individual items, six of which are new. The population for this study consisted of the cross-validation sample for the MnSOST-R (N = 220) and a contemporary sample of 2,315 sex offenders released from Minnesota prisons between 2003 and 2006. To score and select items for the MnSOST-3, we used predicted probabilities generated from a multiple logistic regression model. We used bootstrap resampling to not only refine our selection of predictors but also internally validate the model. The results indicate the MnSOST-3 has a relatively high level of predictive discrimination, as evidenced by an apparent AUC of .821 and an optimism-corrected AUC of .796. The findings show the MnSOST-3 is well calibrated with actual recidivism rates for all but the highest risk offenders. Although estimating a penalized maximum likelihood model did not improve the overall calibration, the results suggest the MnSOST-3 may still be useful in helping identify high-risk offenders whose sexual recidivism risk exceeds 50%. Results from an interrater reliability assessment indicate the instrument, which is scored in a Microsoft Excel application, has an adequate degree of consistency across raters (ICC = .83 for both consistency and absolute agreement).
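The optimism-corrected AUC mentioned above comes from bootstrap internal validation; the sketch below is a hedged, generic version of that Harrell-style procedure on made-up data, not the MnSOST-3 items or the study's dataset.

```python
# Bootstrap optimism-corrected AUC for a logistic regression model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 9))
beta = rng.normal(size=9)
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta))))       # synthetic outcomes

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

optimism = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))                 # bootstrap resample
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    boot_auc = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    test_auc = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(boot_auc - test_auc)

print(apparent, apparent - np.mean(optimism))             # apparent vs corrected AUC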
Predicting space telerobotic operator training performance from human spatial ability assessment
NASA Astrophysics Data System (ADS)
Liu, Andrew M.; Oman, Charles M.; Galvan, Raquel; Natapoff, Alan
2013-11-01
Our goal was to determine whether existing tests of spatial ability can predict an astronaut's qualification test performance after robotic training. Because training astronauts to be qualified robotics operators is so long and expensive, NASA is interested in tools that can predict robotics performance before training begins. Currently, the Astronaut Office does not have a validated tool to predict robotics ability as part of its astronaut selection or training process. Commonly used tests of human spatial ability may provide such a tool to predict robotics ability. We tested the spatial ability of 50 active astronauts who had completed at least one robotics training course, then used logistic regression models to analyze the correlation between spatial ability test scores and the astronauts' performance in their evaluation test at the end of the training course. The fit of the logistic function to our data is statistically significant for several spatial tests. However, the prediction performance of the logistic model depends on the criterion threshold assumed. To clarify the critical selection issues, we show how the probability of correct classification vs. misclassification varies as a function of the mental rotation test criterion level. Since the costs of misclassification are low, the logistic models of spatial ability and robotic performance are reliable enough only to be used to customize regular and remedial training. We suggest several changes in tracking performance throughout robotics training that could improve the range and reliability of predictive models.
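The threshold trade-off described above can be illustrated with a small, hedged sketch: with a fitted logistic model, the fractions of correctly classified "qualified" and "unqualified" trainees shift as the criterion level varies. The data and score names are illustrative, not the astronauts' test scores.

```python
# Criterion-level trade-off for a logistic classifier (illustrative data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
mrt_score = rng.normal(20, 5, 200)                       # e.g., mental rotation test
passed = rng.binomial(1, 1 / (1 + np.exp(-(mrt_score - 20) / 3)))

clf = LogisticRegression().fit(mrt_score.reshape(-1, 1), passed)
p = clf.predict_proba(mrt_score.reshape(-1, 1))[:, 1]

for thr in (0.3, 0.5, 0.7):                              # candidate criterion levels
    pred = p >= thr
    sens = np.mean(pred[passed == 1])                    # correctly flagged passes
    spec = np.mean(~pred[passed == 0])                   # correctly flagged fails
    print(thr, round(sens, 2), round(spec, 2))
```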
Waste in health information systems: a systematic review.
Awang Kalong, Nadia; Yusof, Maryati
2017-05-08
Purpose The purpose of this paper is to discuss a systematic review on waste identification related to health information systems (HIS) in Lean transformation. Design/methodology/approach A systematic review was conducted on 19 studies to evaluate Lean transformation and tools used to remove waste related to HIS in clinical settings. Findings Ten waste categories were identified, along with their relationships and applications of Lean tool types related to HIS. Different Lean tools were used at the early and final stages of Lean transformation; the tool selection depended on the waste characteristic. Nine studies reported a positive impact from Lean transformation in improving daily work processes. The selection of Lean tools should be made based on the timing, purpose and characteristics of waste to be removed. Research limitations/implications Overview of waste and its category within HIS and its analysis from socio-technical perspectives enabled the identification of its root cause in a holistic and rigorous manner. Practical implications Understanding waste types, their root cause and review of Lean tools could subsequently lead to the identification of mitigation approach to prevent future error occurrence. Originality/value Specific waste models for HIS settings are yet to be developed. Hence, the identification of the waste categories could guide future implementation of Lean transformations in HIS settings.
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated with a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool that provides effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
NASA Astrophysics Data System (ADS)
Ki, Seo Jin; Ray, Chittaranjan
2015-03-01
A regional screening tool, which is useful in cases where few site-specific parameters are available for complex vadose zone models, assesses the leaching potential of pollutants to groundwater over large areas. In this study, the previous pesticide leaching tool used in Hawaii was revised to account for the release of new volatile organic compounds (VOCs) from the soil surface. The tool was modified to introduce expanded terms in the traditional pesticide ranking indices (i.e., retardation and attenuation factors), allowing the estimated leaching fraction of volatile chemicals to be updated based on recharge, soil, and chemical properties. Results showed that the previous tool significantly overestimated the mass fraction of VOCs leached through soils as the recharge rates increased above 0.001801 m/d. In contrast, the revised tool successfully delineated areas vulnerable to the selected VOCs based on two reference chemicals, a known leacher and non-leacher, which were determined under local conditions. The sensitivity analysis with the Latin-Hypercube-One-factor-At-a-Time method revealed that the new leaching tool was most sensitive to changes in the soil organic carbon sorption coefficient, fractional organic carbon content, and Henry's law constant; and least sensitive to parameters such as the bulk density, water content at field capacity, and particle density in soils. When the revised tool was compared to the analytical (STANMOD) and numerical (HYDRUS-1D) models as a susceptibility measure, it ranked particular VOCs (e.g., benzene, carbofuran, and toluene) in a way consistent with the other two models under the given conditions. Therefore, the new leaching tool can be widely used to address intrinsic groundwater vulnerability to contamination by pesticides and VOCs, along with the DRASTIC method or similar Tier 1 models such as SCI-GROW and WIN-PST.
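For context, a hedged sketch of the classic retardation-factor (RF) and attenuation-factor (AF) indices that screening tools of this kind build on is shown below. It uses the general literature form with illustrative parameter values; the revised tool's expanded volatilization terms are not reproduced here.

```python
# Classic RF/AF leaching indices (general literature form, illustrative values).
import math

def retardation_factor(bulk_density, foc, koc, theta_fc, theta_air, kh):
    """RF = 1 + rho_b*foc*Koc/theta_FC + theta_air*KH/theta_FC (dimensionless)."""
    return 1.0 + bulk_density * foc * koc / theta_fc + theta_air * kh / theta_fc

def attenuation_factor(depth_m, recharge_m_per_d, half_life_d, theta_fc, rf):
    """Approximate fraction of applied mass expected to leach past a depth."""
    travel_time = depth_m * rf * theta_fc / recharge_m_per_d   # days
    return math.exp(-0.693 * travel_time / half_life_d)

rf = retardation_factor(bulk_density=1.4, foc=0.01, koc=60.0,
                        theta_fc=0.3, theta_air=0.1, kh=0.2)    # benzene-like values
print(attenuation_factor(depth_m=3.0, recharge_m_per_d=0.0018,
                         half_life_d=180.0, theta_fc=0.3, rf=rf))
```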
SPOTting model parameters using a ready-made Python package
NASA Astrophysics Data System (ADS)
Houska, Tobias; Kraft, Philipp; Breuer, Lutz
2015-04-01
The selection and parameterization of reliable process descriptions in ecological modelling is driven by several uncertainties. The procedure is highly dependent on various criteria, like the algorithm used, the likelihood function selected and the definition of the prior parameter distributions. A wide variety of tools have been developed in the past decades to optimize parameters. Some of these tools are closed source. Due to this, the choice of a specific parameter estimation method is sometimes driven more by its availability than by its performance. A toolbox with a large set of methods can support users in deciding on the most suitable method, and it also makes it possible to test and compare different methods. We developed SPOT (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of modules to analyze and optimize parameters of (environmental) models. SPOT comes with a selected set of algorithms for parameter optimization and uncertainty analyses (Monte Carlo, MC; Latin Hypercube Sampling, LHS; Maximum Likelihood, MLE; Markov Chain Monte Carlo, MCMC; Shuffled Complex Evolution, SCE-UA; Differential Evolution Markov Chain, DE-MCZ), together with several likelihood functions (Bias, (log-) Nash-Sutcliffe model efficiency, Correlation Coefficient, Coefficient of Determination, Covariance, (Decomposed-, Relative-, Root-) Mean Squared Error, Mean Absolute Error, Agreement Index) and prior distributions (Binomial, Chi-Square, Dirichlet, Exponential, Laplace, (log-, multivariate-) Normal, Pareto, Poisson, Cauchy, Uniform, Weibull) to sample from. The model-independent structure makes it suitable for analyzing a wide range of applications. We apply all algorithms of the SPOT package in three different case studies. Firstly, we investigate the response of the Rosenbrock function, where the MLE algorithm shows its strengths. Secondly, we study the Griewank function, which has a challenging response surface for optimization methods. Here we see simple algorithms like the MCMC struggling to find the global optimum of the function, while algorithms like SCE-UA and DE-MCZ show their strengths. Thirdly, we apply an uncertainty analysis of a one-dimensional physically based hydrological model built with the Catchment Modelling Framework (CMF). The model is driven by meteorological and groundwater data from a Free Air Carbon Enrichment (FACE) experiment in Linden (Hesse, Germany). Simulation results are evaluated with measured soil moisture data. We search for optimal parameter sets of the van Genuchten-Mualem function and find different equally optimal solutions with some of the algorithms. The case studies reveal that the implemented SPOT methods work sufficiently well. They further show the benefit of having one tool at hand that includes a number of parameter search methods, likelihood functions and a priori parameter distributions within one platform-independent package.
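A minimal sketch of one of the listed sampling strategies, Latin Hypercube Sampling, is shown below using SciPy rather than the SPOT package itself; the parameter names, ranges and objective are illustrative assumptions, not the CMF model or the van Genuchten-Mualem calibration.

```python
# Latin Hypercube parameter sampling with an illustrative objective function.
import numpy as np
from scipy.stats import qmc

bounds = {"alpha_vg": (0.01, 0.5), "n_vg": (1.1, 3.0), "ksat": (0.1, 10.0)}
lower, upper = map(np.array, zip(*bounds.values()))

sampler = qmc.LatinHypercube(d=len(bounds), seed=42)
samples = qmc.scale(sampler.random(n=1000), lower, upper)   # 1000 parameter sets

def objective(theta):                      # stand-in for a model run + likelihood
    return -np.sum((theta - (lower + upper) / 2) ** 2)

best = samples[np.argmax([objective(t) for t in samples])]
print(dict(zip(bounds, best)))
```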
Towards a Framework for Modeling Space Systems Architectures
NASA Technical Reports Server (NTRS)
Shames, Peter; Skipper, Joseph
2006-01-01
Topics covered include: 1) Statement of the problem: a) Space system architecture is complex; b) Existing terrestrial approaches must be adapted for space; c) Need a common architecture methodology and information model; d) Need appropriate set of viewpoints. 2) Requirements on a space systems model. 3) Model Based Engineering and Design (MBED) project: a) Evaluated different methods; b) Adapted and utilized RASDS & RM-ODP; c) Identified useful set of viewpoints; d) Did actual model exchanges among selected subset of tools. 4) Lessons learned & future vision.
Learning, remembering, and predicting how to use tools: Distributed neurocognitive mechanisms
Buxbaum, Laurel J.
2016-01-01
The reasoning-based approach championed by Francois Osiurak and Arnaud Badets (Osiurak & Badets, 2016) denies the existence of sensory-motor memories of tool use except in limited circumstances, and suggests instead that most tool use is subserved solely by online technical reasoning about tool properties. In this commentary, I highlight the strengths and limitations of the reasoning-based approach and review a number of lines of evidence that manipulation knowledge is in fact used in tool action tasks. In addition, I present a “two route” neurocognitive model of tool use called the “Two Action Systems Plus (2AS+)” framework that posits a complementary role for online and stored information and specifies the neurocognitive substrates of task-relevant action selection. This framework, unlike the reasoning based approach, has the potential to integrate the existing psychological and functional neuroanatomic data in the tool use domain. PMID:28358565
Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan
2014-01-01
LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784
Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan
2014-12-15
LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
Hammarström, Anne; Wiklund, Maria; Stålnacke, Britt-Marie; Lehti, Arja; Haukenes, Inger; Fjellman-Wiklund, Anncristine
2016-01-01
Objective There is a need for tools addressing gender inequality in the everyday clinical work in health care. The aim of our paper was to develop a tool for increasing the awareness of gendered and intersectional processes in clinical assessment of patients, based on a study of pain rehabilitation. Methods In the overarching project named “Equal care in rehabilitation” we used multiple methods (both quantitative and qualitative) in five sub studies. With a novel approach we used Grounded Theory in order to synthesize the results from our sub studies, in order to develop the gender equality tool. The gender equality tool described and developed in this article is thus based on results from sub studies about the processes of assessment and selection of patients in pain rehabilitation. Inspired by some questions in earlier tools, we posed open ended questions and inductively searched for findings and concepts relating to gendered and social selection processes in pain rehabilitation, in each of our sub studies. Through this process, the actual gender equality tool was developed as 15 questions about the process of assessing and selecting patients to pain rehabilitation. As a more comprehensive way of understanding the tool, we performed a final step of the GT analyses. Here we synthesized the results of the tool into a comprehensive model with two dimensions in relation to several possible discrimination axes. Results The process of assessing and selecting patients was visualized as a funnel, a top down process governed by gendered attitudes, rules and structures. We found that the clinicians judged inner and outer characteristics and status of patients in a gendered and intersectional way in the process of clinical decision-making which thus can be regarded as (potentially) biased with regard to gender, socio-economic status, ethnicity and age. Implications The clinical implications of our tool are that the tool can be included in the systematic routine of clinical assessment of patients for both awareness raising and as a base for avoiding gender bias in clinical decision-making. The tool could also be used in team education for health professionals as an instrument for critical reflection on gender bias. Conclusions Thus, tools for clinical assessment can be developed from empirical studies in various clinical settings. However, such a micro-level approach must be understood from a broader societal perspective including gender relations on both the macro- and the meso-level. PMID:27055029
Morphogenic designer--an efficient tool to digitally design tooth forms.
Hajtó, J; Marinescu, C; Silva, N R F A
2014-01-01
Different digital software tools are available today for the purpose of designing anatomically correct anterior and posterior restorations. The current concepts present weaknesses, which can be potentially addressed by more advanced modeling tools, such as the ones already available in professional CAD (Computer Aided Design) graphical software. This study describes the morphogenic designer (MGD) as an efficient and easy method for digitally designing tooth forms for the anterior and posterior dentition. Anterior and posterior tooth forms were selected from a collection of digitalized natural teeth and subjectively assessed as "average". The models in the form of STL files were filtered, cleaned, idealized, and re-meshed to match the specifications of the software used. The shapes were then imported as wavefront ".obj" model into Modo 701, software built for modeling, texturing, visualization, and animation. In order to create a parametric design system, intentional interactive deformations were performed on the average tooth shapes and then further defined as morph targets. By combining various such parameters, several tooth shapes were formed virtually and their images presented. MGD proved to be a versatile and powerful tool for the purpose of esthetic and functional digital crown designs.
NASA Astrophysics Data System (ADS)
Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed
2017-01-01
For the first time, a new variable selection method based on swarm intelligence, namely the firefly algorithm, is coupled with three different multivariate calibration models (concentration residual augmented classical least squares, artificial neural networks and support vector regression) and applied to UV spectral data. A comparative study between the firefly algorithm and the well-known genetic algorithm was carried out. The results revealed the superiority of this new, powerful algorithm over the genetic algorithm. Moreover, different statistical tests were performed and no significant differences were found between the models regarding their predictive ability. This ensures that simpler and faster models were obtained without any deterioration in calibration quality.
Brouwers, Melissa C; Makarski, Julie; Kastner, Monika; Hayden, Leigh; Bhattacharyya, Onil
2015-03-15
Practice guideline (PG) implementability refers to PG features that promote their use. While there are tools and resources to promote PG implementability, none are based on an evidence-informed and multidisciplinary perspective. Our objectives were to (i) create a comprehensive and evidence-informed model of PG implementability, (ii) seek support for the model from the international PG community, (iii) map existing implementability tools on to the model, (iv) prioritize areas for further investigation, and (v) describe how the model can be used by PG developers, users, and researchers. A mixed methods approach was used. Using our completed realist review of the literature of seven different disciplines as the foundation, an iterative consensus process was used to create the beta version of the model. This was followed by (i) a survey of international stakeholders (guideline developers and users) to gather feedback and to refine the model, (ii) a content analysis comparing the model to existing PG tools, and (iii) a strategy to prioritize areas of the model for further research by members of the research team. The Guideline Implementability for Decision Excellence Model (GUIDE-M) is comprised of 3 core tactics, 7 domains, 9 subdomains, 44 attributes, and 40 subattributes and elements. Feedback on the beta version was received from 248 stakeholders from 34 countries. The model was rated as logical, relevant, and appropriate. Seven PG tools were selected and compared to the GUIDE-M: very few tools targeted the Contextualization and Deliberations domain. Also, fewer of the tools addressed PG appraisal than PG development and reporting functions. These findings informed the research priorities identified by the team. The GUIDE-M provides an evidence-informed international and multidisciplinary conceptualization of PG implementability. The model can be used by PG developers to help them create more implementable recommendations, by clinicians and other users to help them be better consumers of PGs, and by the research community to identify priorities for further investigation.
Free and Open Source GIS Tools: Role and Relevance in the Environmental Assessment Community
The presence of an explicit geographical context in most environmental decisions can complicate assessment and selection of management options. These decisions typically involve numerous data sources, complex environmental and ecological processes and their associated models, ris...
Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to tune this mathematical tool for high sensitivity and reliable performance. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and for a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters improves the sensitivity of the method by a factor greater than four in comparison with our first analyses.
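As a hedged sketch of the windowed, Fourier-based coherence computation that selected correlation analysis builds on, the snippet below slides a window over two toy signals and computes the mean coherence in a low-frequency band with SciPy; the actual SCP index, band limits and parameter values are not reproduced here.

```python
# Sliding-window coherence between two toy ABP/ICP-like signals.
import numpy as np
from scipy.signal import coherence

fs = 1.0                                   # e.g., 1 Hz mean ABP/ICP samples
rng = np.random.default_rng(0)
abp = rng.normal(size=3600)
icp = 0.5 * abp + rng.normal(scale=0.5, size=3600)    # toy coupled signals

window, step = 600, 60                     # window length and shift (samples)
scores = []
for start in range(0, len(abp) - window, step):
    f, cxy = coherence(abp[start:start + window],
                       icp[start:start + window], fs=fs, nperseg=256)
    low_band = (f > 0) & (f < 0.05)        # slow-wave band (illustrative cutoff)
    scores.append(cxy[low_band].mean())    # high coherence -> candidate correlation
print(sum(s > 0.5 for s in scores), "of", len(scores), "windows flagged")
```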
Pitassi, Claudio; Gonçalves, Antonio Augusto; Moreno Júnior, Valter de Assis
2014-01-01
The scope of this article is to identify and analyze the factors that influence the adoption of ICT tools in experiments with bioinformatics at the Brazilian Cancer Institute (INCA). It involves a descriptive and exploratory qualitative field study. Evidence was collected mainly based on in-depth interviews with the management team at the Research Center and the IT Division. The answers were analyzed using the categorical content method. The categories were selected from the scientific literature and consolidated in the Technology-Organization-Environment (TOE) framework created for this study. The model proposed made it possible to demonstrate how the factors selected impacted INCA's adoption of bioinformatics systems and tools, contributing to the investigation of two critical areas for the development of the health industry in Brazil, namely technological innovation and bioinformatics. Based on the evidence collected, a research question was posed: to what extent can the alignment of the factors related to the adoption of ICT tools in experiments with bioinformatics increase the innovation capacity of a Brazilian biopharmaceutical organization?
The Lunar Mapping and Modeling Project
NASA Technical Reports Server (NTRS)
Nall, M.; French, R.; Noble, S.; Muery, K.
2010-01-01
The Lunar Mapping and Modeling Project (LMMP) is managing a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. Although the project was initiated primarily to serve the needs of the Constellation program, it is equally suited for supporting landing site selection and planning for a variety of robotic missions, including NASA science and/or human precursor missions and commercial missions such as those planned by the Google Lunar X-Prize participants. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities.
Evaluation of chiller modeling approaches and their usability for fault detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sreedharan, Priya
Selecting the model is an important and essential step in model based fault detection and diagnosis (FDD). Several factors must be considered in model evaluation, including accuracy, training data requirements, calibration effort, generality, and computational requirements. All modeling approaches fall somewhere between pure first-principles models, and empirical models. The objective of this study was to evaluate different modeling approaches for their applicability to model based FDD of vapor compression air conditioning units, which are commonly known as chillers. Three different models were studied: two are based on first-principles and the third is empirical in nature. The first-principles models are the Gordon and Ng Universal Chiller model (2nd generation), and a modified version of the ASHRAE Primary Toolkit model, which are both based on first principles. The DOE-2 chiller model as implemented in CoolTools was selected for the empirical category. The models were compared in terms of their ability to reproduce the observed performance of an older chiller operating in a commercial building, and a newer chiller in a laboratory. The DOE-2 and Gordon-Ng models were calibrated by linear regression, while a direct-search method was used to calibrate the Toolkit model. The CoolTools package contains a library of calibrated DOE-2 curves for a variety of different chillers, and was used to calibrate the building chiller to the DOE-2 model. All three models displayed similar levels of accuracy. Of the first principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit Model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.
NASA's Solar System Treks: Online Portals for Planetary Mapping and Modeling
NASA Technical Reports Server (NTRS)
Day, Brian
2017-01-01
NASA's Solar System Treks are a suite of web-based lunar and planetary mapping and modeling portals providing interactive visualization and analysis tools enabling mission planners, planetary scientists, students, and the general public to access mapped data products from past and current missions for the Moon, Mars, Vesta, and more. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look ahead to future features and releases. Moon Trek is a new portal replacing its predecessor, the Lunar Mapping and Modeling Portal (LMMP), that significantly upgrades and builds upon the capabilities of LMMP. It features greatly improved navigation, 3D visualization, fly-overs, performance, and reliability. Additional data products and tools continue to be added. These include both generalized products as well as polar data products specifically targeting potential sites for NASA's Resource Prospector mission as well as for missions being planned by NASA's international partners. The latest release of Mars Trek includes new tools and data products requested by NASA's Planetary Science Division to support site selection and analysis for Mars Human Landing Exploration Zone Sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. Phobos Trek, the latest effort in the Solar System Treks suite, is being developed in coordination with the International Phobos/Deimos Landing Site Working Group, with landing site selection and analysis for JAXA's MMX (Martian Moons eXploration) mission as a primary driver.
NASA's Solar System Treks: Online Portals for Planetary Mapping and Modeling
NASA Astrophysics Data System (ADS)
Day, B. H.; Law, E.
2017-12-01
NASA's Solar System Treks are a suite of web-based lunar and planetary mapping and modeling portals providing interactive visualization and analysis tools enabling mission planners, planetary scientists, students, and the general public to access mapped data products from past and current missions for the Moon, Mars, Vesta, and more. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look ahead to future features and releases. Moon Trek is a new portal replacing its predecessor, the Lunar Mapping and Modeling Portal (LMMP), that significantly upgrades and builds upon the capabilities of LMMP. It features greatly improved navigation, 3D visualization, fly-overs, performance, and reliability. Additional data products and tools continue to be added. These include both generalized products as well as polar data products specifically targeting potential sites for NASA's Resource Prospector mission as well as for missions being planned by NASA's international partners. The latest release of Mars Trek includes new tools and data products requested by NASA's Planetary Science Division to support site selection and analysis for Mars Human Landing Exploration Zone Sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. Phobos Trek, the latest effort in the Solar System Treks suite, is being developed in coordination with the International Phobos/Deimos Landing Site Working Group, with landing site selection and analysis for JAXA's MMX mission as a primary driver.
Surgical tool detection and tracking in retinal microsurgery
NASA Astrophysics Data System (ADS)
Alsheakhali, Mohamed; Yigitsoy, Mehmet; Eslami, Abouzar; Navab, Nassir
2015-03-01
Visual tracking of surgical instruments is an essential part of eye surgery and plays an important role for the surgeons, as well as being a key component of robotic assistance during the operation. The difficulty of detecting and tracking medical instruments in in-vivo images comes from their deformable shape, changes in brightness, and the presence of the instrument shadow. This paper introduces a new approach to detect the tip of a surgical tool and its width regardless of its head shape and the presence of shadows or vessels. The approach relies on integrating structural information about the strong edges from the RGB color model with tool-location information from the L*a*b color model. The probabilistic Hough transform is applied to obtain the strongest straight lines in the RGB images, and based on information from the L* and a* channels, one of these candidate lines is selected as the edge of the tool shaft. From that line, the tool slope, the tool centerline and the tool tip can be detected. Tracking is performed by keeping track of the last detected tool tip and tool slope, and filtering the Hough lines within a box around the last detected tool tip based on slope differences. Experimental results demonstrate the high accuracy achieved in terms of detecting the tool tip position, the tool joint point position, and the tool centerline. The approach also meets real-time requirements.
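A hedged sketch of the probabilistic-Hough step described above is given below using OpenCV: candidate shaft lines are extracted from strong edges, while the L*a*b*-based selection of the correct line and the tip localization are only indicated by a placeholder criterion. The file name and scoring rule are illustrative assumptions, not the paper's implementation.

```python
# Candidate tool-shaft lines via Canny edges + probabilistic Hough transform.
import cv2
import numpy as np

frame = cv2.imread("retina_frame.png")                 # hypothetical input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)

lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=60, maxLineGap=10)

lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)

def shaft_likelihood(line):
    """Placeholder: score a candidate line by an L*a*b* statistic at its midpoint."""
    x1, y1, x2, y2 = line
    return -float(lab[(y1 + y2) // 2, (x1 + x2) // 2, 1])   # toy criterion only

if lines is not None:
    shaft = max((l[0] for l in lines), key=shaft_likelihood)
    print("selected shaft edge:", shaft)   # tip and slope would be derived from this line
```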
ERIC Educational Resources Information Center
DiYanni, Cara; Nini, Deniela; Rheel, Whitney; Livelli, Alicia
2012-01-01
This study explores connections between 3-, 4-, and 5-year-olds' performance in theory-of-mind tasks, their performance on an assessment of selective trust, and their decisions to (not) imitate the questionable tool choices of an adult model. The prediction was that all the tasks would be related, with improvements in theory of mind and selective…
Chemometric classification of casework arson samples based on gasoline content.
Sinkov, Nikolai A; Sandercock, P Mark L; Harynuk, James J
2014-02-01
Detection and identification of ignitable liquids (ILs) in arson debris is a critical part of arson investigations. The challenge of this task is due to the complex and unpredictable chemical nature of arson debris, which also contains pyrolysis products from the fire. ILs, most commonly gasoline, are complex chemical mixtures containing hundreds of compounds that will be consumed or otherwise weathered by the fire to varying extents depending on factors such as temperature, air flow, the surface on which the IL was placed, etc. While methods such as ASTM E-1618 are effective, data interpretation can be a costly bottleneck in the analytical process for some laboratories. In this study, we address this issue through the application of chemometric tools. Prior to the application of chemometric tools such as PLS-DA and SIMCA, issues of chromatographic alignment and variable selection need to be addressed. Here we use an alignment strategy based on a ladder consisting of perdeuterated n-alkanes. Variable selection and model optimization were automated using a hybrid backward elimination (BE) and forward selection (FS) approach guided by the cluster resolution (CR) metric. In this work, we demonstrate the automated construction, optimization, and application of chemometric tools to casework arson data. The resulting PLS-DA and SIMCA classification models, trained with 165 training set samples, provided classification of 55 validation set samples based on gasoline content with 100% specificity and sensitivity. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
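As a hedged sketch of the PLS-DA step only, the snippet below fits a generic scikit-learn PLS model to dummy-coded class labels on toy data; the alignment ladder, SIMCA model and CR-guided variable selection from the study are not reproduced, and all sizes and signals are illustrative.

```python
# Generic PLS-DA on synthetic chromatographic variables.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(165, 300))                # aligned chromatographic variables
y = rng.integers(0, 2, 165)                    # 1 = gasoline present, 0 = absent
X[y == 1, :20] += 1.0                          # inject a class-related signal

plsda = PLSRegression(n_components=3).fit(X, y.astype(float))
scores = plsda.predict(X).ravel()
pred = (scores >= 0.5).astype(int)             # simple decision rule on the PLS score
print("training sensitivity:", np.mean(pred[y == 1] == 1))
```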
System Modeling and Diagnostics for Liquefying-Fuel Hybrid Rockets
NASA Technical Reports Server (NTRS)
Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann
2003-01-01
A Hybrid Combustion Facility (HCF) was recently built at NASA Ames Research Center to study the combustion properties of a new fuel formulation that burns approximately three times faster than conventional hybrid fuels. Researchers at Ames working in the area of Integrated Vehicle Health Management recognized a good opportunity to apply IVHM techniques to a candidate technology for next generation launch systems. Five tools were selected to examine various IVHM techniques for the HCF. Three of the tools, TEAMS (Testability Engineering and Maintenance System), L2 (Livingstone2), and RODON, are model-based reasoning (or diagnostic) systems. Two other tools in this study, ICS (Interval Constraint Simulator) and IMS (Inductive Monitoring System) do not attempt to isolate the cause of the failure but may be used for fault detection. Models of varying scope and completeness were created, both qualitative and quantitative. In each of the models, the structure and behavior of the physical system are captured. In the qualitative models, the temporal aspects of the system behavior and the abstraction of sensor data are handled outside of the model and require the development of additional code. In the quantitative model, less extensive processing code is also necessary. Examples of fault diagnoses are given.
NASA Astrophysics Data System (ADS)
Holoien, Thomas W.-S.; Marshall, Philip J.; Wechsler, Risa H.
2017-06-01
We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools. It allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model that is conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.
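A minimal sketch of the conditioning operation described above is shown for a single Gaussian component: given observed values of some parameters, the distribution over the remaining parameters is Gaussian with the standard conditional mean and covariance. This uses plain NumPy on illustrative numbers, not the XDGMM API itself.

```python
# Conditioning a multivariate Gaussian on a subset of its parameters.
import numpy as np

mu = np.array([1.0, 2.0, 3.0])           # e.g., [SN parameter, host mass, host color]
cov = np.array([[1.0, 0.6, 0.3],
                [0.6, 1.0, 0.4],
                [0.3, 0.4, 1.0]])

known_idx, free_idx = [1, 2], [0]         # condition on observed host properties
x2 = np.array([2.5, 2.8])                 # observed values of the known parameters

s11 = cov[np.ix_(free_idx, free_idx)]
s12 = cov[np.ix_(free_idx, known_idx)]
s22 = cov[np.ix_(known_idx, known_idx)]

mu_cond = mu[free_idx] + s12 @ np.linalg.solve(s22, x2 - mu[known_idx])
cov_cond = s11 - s12 @ np.linalg.solve(s22, s12.T)
print(mu_cond, cov_cond)                  # predicted SN parameter given the host
```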
ERIC Educational Resources Information Center
Goldhaber, Dan; Grout, Cyrus; Huntington-Klein, Nick
2014-01-01
Evidence suggests teacher hiring in public schools is ad-hoc and often does not result in good selection amongst applicants. Some districts use structured selection instruments in the hiring process, but we know little about the efficacy of such tools. In this paper we evaluate the ability of applicant selection tools used by the Spokane Public…
A drill-soil system modelization for future Mars exploration
NASA Astrophysics Data System (ADS)
Finzi, A. E.; Lavagna, M.; Rocchitelli, G.
2004-01-01
This paper presents a first approach to the problem of modeling a drilling process to be carried out in the space environment by a dedicated payload. Systems designed to work in space have very strict requirements in many different areas, such as thermal response, electric power demand and reliability. Models devoted to simulating the operational behaviour therefore represent a fundamental help in the design phase and greatly improve the final product quality. As the required power is the crucial constraint for drilling devices, the tool-soil interaction modeling and simulation are aimed at computing the power demand as a function of both the drill and the soil parameters. An accurate study of the tool and the soil separately has been carried out first, and secondly their interaction has been analyzed. The Dee-Dri system, designed by Tecnospazio and intended to be part of the lander components in NASA's Mars Sample Return Mission, has been taken as the reference tool. The Deep-Drill system is a complex rotary tool devoted to soil perforation and sample collection; it has to operate in a Martian zone made of rocks similar to terrestrial basalt, so the modeling is restricted to the interaction analysis between the tool and materials belonging to that rock set. The geometric modeling of the tool has been addressed with a finite element approach and a Lagrangian formulation: for the static analysis a refined model is assumed, considering both the actual geometry of the head and the rod screws; a simplified model has been used for the dynamic analysis. The soil representation is based on the Mohr-Coulomb crack criterion, and an Eulerian approach was initially selected to model it; however, software limitations in defining the tool-soil interface required assuming a Lagrangian formulation for the soil too. The interaction between the soil and the tool has been modeled by extending Nishimatsu's two-dimensional theory of rock cutting to rotating perforation tools. A detailed analysis of the finite-element choice for each part of the tool is presented, together with the static analysis results. The dynamic analysis results are limited to the first impact phenomenon between the rock and the tool head. The validity of both the theoretical and numerical models is confirmed by the good agreement between simulation results and data from experiments performed in the Tecnospazio facilities.
Gamma-ray Burst Prompt Correlations: Selection and Instrumental Effects
NASA Astrophysics Data System (ADS)
Dainotti, M. G.; Amati, L.
2018-05-01
The prompt emission mechanism of gamma-ray bursts (GRBs) remains a mystery even after several decades. However, it is believed that correlations between observable GRB properties, given their huge luminosity/radiated energy and a redshift distribution extending up to at least z ≈ 9, are promising candidate cosmological tools. They may also help to discriminate among the most plausible theoretical models. Nowadays, the objective is to make GRBs standard candles, similar to supernovae (SNe) Ia, through well-established and robust correlations. However, unlike SNe Ia, GRBs span several orders of magnitude in their energetics, and hence cannot yet be considered standard candles. Additionally, because they are observed at very large distances, their physical properties are affected by selection biases, the so-called Malmquist bias or Eddington effect. We describe the state of the art on how GRB prompt correlations are corrected for these selection biases in order to employ them as redshift estimators and cosmological tools. We stress that only after an appropriate evaluation of and correction for these effects can GRB correlations be used to discriminate among the theoretical models of prompt emission, to estimate the cosmological parameters and to serve as distance indicators via redshift estimation.
Cheng, Tiejun; Li, Qingliang; Wang, Yanli; Bryant, Stephen H
2011-02-28
Aqueous solubility is recognized as a critical parameter in both early- and late-stage drug discovery. Therefore, in silico modeling of solubility has attracted extensive interest in recent years. Most previous studies have been limited by the use of relatively small data sets with limited diversity, which in turn limits the predictive power of the derived models. In this work, we present a support vector machine model for the binary classification of solubility by taking advantage of the largest known public data set, which contains over 46,000 compounds with experimental solubility data. Our model was optimized in combination with a reduction and recombination feature selection strategy. The best model demonstrated robust performance in both cross-validation and prediction of two independent test sets, indicating it could be a practical tool to select soluble compounds for screening, purchasing, and synthesizing. Moreover, our work may be used for comparative evaluation of solubility classification studies owing to the use of completely public resources.
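A hedged sketch of the modeling setup described above is given below: a generic scikit-learn SVM with cross-validation on made-up descriptors, not the authors' 46,000-compound data set or their reduction/recombination feature selection strategy.

```python
# Generic SVM binary solubility classifier with cross-validation (toy data).
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 50))                    # molecular descriptors
y = (X[:, :5].sum(axis=1) + rng.normal(size=2000) > 0).astype(int)  # soluble / insoluble

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```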
ISRU System Model Tool: From Excavation to Oxygen Production
NASA Technical Reports Server (NTRS)
Santiago-Maldonado, Edgardo; Linne, Diane L.
2007-01-01
In the late 80's, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible up-to-date models of the oxygen extraction production process has become even more clear. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates on energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity is achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.
Development of a Protocol and a Screening Tool for Selection of DNAPL Source Area Remediation
2012-02-01
the different remedial time frames used in the modeling case studies. • Matrix Diffusion: Modeling results demonstrated that in fractured rock ...being used for the ISCO, EISB and SEAR fractured rock numerical simulations at the field scale. Figure 2-4 presents the distribution of intrinsic...sedimentary limestone, sandstone, and shale, igneous basalts and granites, and metamorphic rock. For the modeling sites, three general geologies are
Category-selective attention modulates unconscious processes in the middle occipital gyrus.
Tu, Shen; Qiu, Jiang; Martens, Ulla; Zhang, Qinglin
2013-06-01
Many studies have revealed the top-down modulation (spatial attention, attentional load, etc.) of unconscious processing. However, there is little research on how category-selective attention modulates unconscious processing. In the present study, using functional magnetic resonance imaging (fMRI), the results showed that category-selective attention modulated unconscious face/tool processing in the middle occipital gyrus (MOG). Interestingly, the MOG effects were in opposite directions for face and tool processing. During unconscious face processing, activation in MOG decreased under face-selective attention compared with tool-selective attention. This result is in line with predictive coding theory. During unconscious tool processing, however, activation in MOG increased under tool-selective attention compared with face-selective attention. The different effects might be ascribed to an interaction between top-down category-selective processes and bottom-up processes at the partial awareness level, as proposed by Kouider, De Gardelle, Sackur, and Dupoux (2010). Specifically, we propose an "excessive activation" hypothesis. Copyright © 2013 Elsevier Inc. All rights reserved.
Data Provenance as a Tool for Debugging Hydrological Models based on Python
NASA Astrophysics Data System (ADS)
Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.
2012-12-01
There is an increase in the data volume used in hydrological modeling. The increasing data volume requires additional effort in debugging models, since a single output value is influenced by a multitude of input values. Thus, it is difficult to keep an overview of the data dependencies. Further, even when these dependencies are known, it is a tedious job to infer all the relevant data values. The aforementioned data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a Python script and visualizes the dependencies as a graph, without executing the script. To debug the model, the user specifies the value of interest in space and time. The tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating the global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and used more than 3000 individual files, each of them representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consume around 40 GB of memory. Using the proposed tool, a modeler is able to select individual values and infer which values have been used to calculate them. Especially in cases of outliers or missing values, it is a beneficial tool that provides the modeler with efficient information to investigate the unexpected behavior of the model. The proposed tool can be applied to many Python scripts and has been tested with other scripts in different contexts. In case a Python script contains an unknown function or class, the tool requests additional information about the used function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Dürr, R. Weingartner, and M. F. P. Bierkens, "Global monthly water stress: II. Water demand and severity of water stress," Water Resources Research, vol. 47, 2011.
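The provenance idea can be illustrated with a few lines of standard-library Python: statically parse a script with `ast` and record, for each assigned variable, which variables it was computed from. This is only a toy sketch of the concept, not the tool described in the abstract.

```python
# Minimal sketch of static provenance extraction with the standard `ast` module.
import ast
from collections import defaultdict

SCRIPT = """
demand = population * per_capita_use
supply = runoff + groundwater
stress = demand / supply
"""

def provenance_graph(source):
    graph = defaultdict(set)                   # variable -> set of variables it depends on
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Assign):
            inputs = {n.id for n in ast.walk(node.value) if isinstance(n, ast.Name)}
            for target in node.targets:
                if isinstance(target, ast.Name):
                    graph[target.id] |= inputs
    return graph

def upstream(graph, var, seen=None):
    """All variables that (transitively) influence `var`."""
    seen = set() if seen is None else seen
    for dep in graph.get(var, ()):
        if dep not in seen:
            seen.add(dep)
            upstream(graph, dep, seen)
    return seen

g = provenance_graph(SCRIPT)
print(sorted(upstream(g, "stress")))   # variables that feed into 'stress'
```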
NASA Astrophysics Data System (ADS)
Goudarzi, Nasser
2016-04-01
In this work, two new and powerful chemometrics methods are applied for the modeling and prediction of the 19F chemical shift values of some fluorinated organic compounds. Radial basis function-partial least squares (RBF-PLS) and random forest (RF) methods are employed to construct the models to predict the 19F chemical shifts. No separate variable selection method was used, since the RF method can serve as both a variable selection and a modeling technique. The effects of important parameters on the RF prediction power, such as the number of trees (nt) and the number of randomly selected variables used to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
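A hedged sketch of an RF model used in this dual role (placeholder data, hypothetical descriptor matrix), with `n_estimators` standing in for nt and `max_features` for m:

```python
# Illustrative random forest regression with built-in feature importances; not the authors' data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 30))                                   # placeholder descriptors
y = 3.0 * X[:, 0] - 2.0 * X[:, 5] + rng.normal(scale=0.5, size=200)  # placeholder 19F shifts

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

# n_estimators ~ number of trees (nt); max_features ~ variables tried at each split (m)
rf = RandomForestRegressor(n_estimators=500, max_features="sqrt", random_state=1)
rf.fit(X_tr, y_tr)

rmsep = mean_squared_error(y_te, rf.predict(X_te)) ** 0.5
top = np.argsort(rf.feature_importances_)[::-1][:5]
print(f"RMSEP: {rmsep:.2f}; most informative descriptors: {top}")
```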
A Biologically Inspired Computational Model of Basal Ganglia in Action Selection.
Baston, Chiara; Ursino, Mauro
2015-01-01
The basal ganglia (BG) are a subcortical structure implicated in action selection. The aim of this work is to present a new cognitive neuroscience model of the BG, which aspires to represent a parsimonious balance between simplicity and completeness. The model includes the 3 main pathways operating in the BG circuitry, that is, the direct (Go), indirect (NoGo), and hyperdirect pathways. The main original aspects, compared with previous models, are the use of a two-term Hebb rule to train synapses in the striatum, based exclusively on neuronal activity changes caused by dopamine peaks or dips, and the role of the cholinergic interneurons (affected by dopamine themselves) during learning. Some examples are displayed, concerning a few paradigmatic cases: action selection in basal conditions, action selection in the presence of a strong conflict (where the role of the hyperdirect pathway emerges), synapse changes induced by phasic dopamine, and learning new actions based on a previous history of rewards and punishments. Finally, some simulations show the model working under conditions of altered dopamine levels, to illustrate pathological cases (dopamine depletion in parkinsonian subjects or dopamine hypermedication). Due to its parsimonious approach, the model may represent a straightforward tool to analyze BG functionality in behavioral experiments.
Impact and Penetration Simulations for Composite Wing-like Structures
NASA Technical Reports Server (NTRS)
Knight, Norman F.
1998-01-01
The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Low-speed impact causing either no damage or only minimal damage, and high-speed impact causing severe laminate damage and possible penetration of the structure, were to be considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The analytical tools for impact and penetration simulations were assessed with regard to accuracy, modeling capability, and damage modeling, as well as robustness, efficiency, and usability in a wing design environment. Following the qualitative assessment, selected quantitative evaluations were to be performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.
Blodgett, David L.; Lucido, Jessica M.; Kreft, James M.
2016-01-01
Critical water-resources issues ranging from flood response to water scarcity make access to integrated water information, services, tools, and models essential. Since 1995 when the first water data web pages went online, the U.S. Geological Survey has been at the forefront of water data distribution and integration. Today, real-time and historical streamflow observations are available via web pages and a variety of web service interfaces. The Survey has built partnerships with Federal and State agencies to integrate hydrologic data providing continuous observations of surface and groundwater, temporally discrete water quality data, groundwater well logs, aquatic biology data, water availability and use information, and tools to help characterize the landscape for modeling. In this paper, we summarize the status and design patterns implemented for selected data systems. We describe how these systems contribute to a U.S. Federal Open Water Data Initiative and present some gaps and lessons learned that apply to global hydroinformatics data infrastructure.
Damude, S; Wevers, K P; Murali, R; Kruijff, S; Hoekstra, H J; Bastiaannet, E
2017-09-01
Completion lymph node dissection (CLND) in sentinel node (SN)-positive melanoma patients is accompanied by morbidity, while about 80% of patients yield no additional metastases in non-sentinel nodes (NSNs). A prediction tool for NSN involvement could be of assistance in patient selection for CLND. This study investigated which parameters predict NSN-positivity, and whether the biomarker S-100B improves the accuracy of a prediction model. Recorded clinicopathologic factors were tested for their association with NSN-positivity in 110 SN-positive patients who underwent CLND. A prediction model was developed with multivariable logistic regression, incorporating all predictive factors. Five models were compared for their predictive power by calculating the Area Under the Curve (AUC). A weighted risk score, the 'S-100B Non-Sentinel Node Risk Score' (SN-SNORS), was derived for the model with the highest AUC. In addition, a nomogram was developed as a visual representation. NSN-positivity was present in 24 (21.8%) patients. Sex, ulceration, number of harvested SNs, number of positive SNs, and S-100B value were independently associated with NSN-positivity. The AUC for the model including all these factors was 0.78 (95%CI 0.69-0.88). SN-SNORS was the sum of the scores for the five parameters. Scores of ≤9.5, 10-11.5, and ≥12 were associated with low (0%), intermediate (21.0%), and high (43.2%) risk of NSN involvement. A prediction tool based on five parameters, including the biomarker S-100B, showed accurate risk stratification for NSN involvement in SN-positive melanoma patients. If validated in future studies, this tool could help to identify patients with low risk of NSN involvement. Copyright © 2017 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
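For illustration, the core of such a prediction tool is a multivariable logistic regression whose predicted probability serves as the risk score; the sketch below uses synthetic stand-ins for the five predictors and reports an apparent (optimism-uncorrected) AUC.

```python
# Illustrative sketch on synthetic data, not the study cohort.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 110
X = np.column_stack([
    rng.integers(0, 2, n),          # sex
    rng.integers(0, 2, n),          # ulceration
    rng.integers(1, 5, n),          # number of harvested SNs
    rng.integers(1, 3, n),          # number of positive SNs
    rng.lognormal(-1.5, 0.8, n),    # S-100B value (placeholder distribution)
])
logit = -3.0 + 0.8 * X[:, 1] + 0.9 * X[:, 3] + 2.0 * X[:, 4]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # synthetic NSN-positive labels

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]
print(f"apparent AUC: {roc_auc_score(y, risk):.2f}")   # optimism-uncorrected
```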
AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...
The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, by that way supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered as such a meta-model. The first step of the introduced method identifies the standard health information models and tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported on Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
Error compensation for thermally induced errors on a machine tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krulewich, D.A.
1996-11-08
Heat flow from internal and external sources and from the environment creates machine deformations, resulting in positioning errors between the tool and the workpiece. There is no industrially accepted method for thermal error compensation. A simple model has been selected that linearly relates discrete temperature measurements to the deflection. The main difficulty is determining how many temperature sensors are required and where to locate them. This research develops a method to determine the number and location of temperature measurements.
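A minimal sketch of such a linear thermal-error model, fit by ordinary least squares on synthetic temperature/deflection data; sensor selection could then proceed by re-fitting with subsets of sensors and keeping the smallest set whose residual stays within tolerance.

```python
# Linear thermal-error model: deflection regressed on discrete temperature readings (synthetic data).
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_sensors = 60, 8
T = 20.0 + 5.0 * rng.random((n_samples, n_sensors))          # temperatures [deg C]
true_coeffs = np.array([2.0, -1.0, 0.5, 0, 0, 0.2, 0, 0])    # um per deg C (placeholder)
deflection = T @ true_coeffs + rng.normal(scale=0.5, size=n_samples)

# Fit deflection = c0 + sum_i c_i * T_i by ordinary least squares
A = np.column_stack([np.ones(n_samples), T])
coeffs, *_ = np.linalg.lstsq(A, deflection, rcond=None)

predicted = A @ coeffs
print(f"residual RMS: {np.std(deflection - predicted):.2f} um")
```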
2012-01-01
Background Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulations exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, the feasibility of health systems to implement response measures or interventions described in plans and trained in exercises depends on the available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. Results The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel© and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications on resource gaps or surpluses, and the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources. Conclusions The AsiaFluCap Simulator is freely available software (http://www.cdprg.org) which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness for providing evidence based and illustrative information on health care resource capacities during future pandemics. The tool can inform both preparedness plans and simulation exercises and can help increase the general understanding of dynamics in resource capacities during a pandemic. The combination of a mathematical model with multiple resources and the linkage to GIS for creating maps makes the tool unique compared to other available software. PMID:23061807
Application of selection and estimation regular vine copula on go public company share
NASA Astrophysics Data System (ADS)
Hasna Afifah, R.; Noviyanti, Lienda; Bachrudin, Achmad
2018-03-01
Accurate financial risk management involving a large number of assets is needed, but information about the dependencies among those assets cannot always be adequately analyzed. To analyze dependencies among a number of assets, several extensions have been added to the standard multivariate copula. However, these tools have not been adequately applied in higher-dimensional settings. Bivariate parametric copula families can be used to address this. A multivariate copula can be built from bivariate parametric copulas connected through a graphical representation, yielding Pair Copula Constructions (PCCs), or vine copulas. C-vine and D-vine copulas have been used in several studies, but they are more restrictive than the R-vine copula. Therefore, this study uses the R-vine copula, which provides the flexibility to model complex dependencies in high dimensions. Since a copula is a static model while stock values change over time, the copula is combined with an ARMA-GARCH model to describe the movement (volatility) of the shares. The objective of this paper is to select and estimate an R-vine copula used to analyze PT Jasa Marga (Persero) Tbk (JSMR), PT Waskita Karya (Persero) Tbk (WSKT), and PT Bank Mandiri (Persero) Tbk (BMRI) from August 31, 2014 to August 31, 2017. The selected copulas for the two edges in the first tree are survival Gumbel, and the copula for the edge in the second tree is Gaussian.
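One building block of this approach can be sketched with NumPy/SciPy alone: given (already filtered) ARMA-GARCH standardized residuals for a pair of shares, convert them to pseudo-observations by ranking and estimate a Gaussian pair-copula parameter from Kendall's tau; an R-vine would repeat such pair fits tree by tree. The simulated residuals below are placeholders.

```python
# Sketch of a single Gaussian pair-copula fit on rank-transformed residuals (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Stand-ins for ARMA-GARCH standardized residuals of two shares (e.g. JSMR and WSKT)
z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=750)

# Probability-integral transform via ranks: pseudo-observations in (0, 1)
u = stats.rankdata(z[:, 0]) / (len(z) + 1)
v = stats.rankdata(z[:, 1]) / (len(z) + 1)

# Gaussian pair-copula parameter from Kendall's tau: rho = sin(pi * tau / 2)
tau, _ = stats.kendalltau(u, v)
rho = np.sin(np.pi * tau / 2.0)
print(f"Kendall tau = {tau:.3f}, implied copula correlation = {rho:.3f}")
```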
FEMME- post-Fire Emergency ManageMEnt tool.
NASA Astrophysics Data System (ADS)
Vieira, Diana; Serpa, Dalila; Rocha, João; Nunes, João; Keizer, Jacob
2017-04-01
Wildfires can have important impacts on hydrological and soil erosion processes in forest catchments, due to the destruction of vegetation cover and changes to soil properties. The processes involved, however, are non-linear and not fully understood. This has severely limited the understanding of the impacts of wildfires, and, as a consequence, current runoff-erosion models are poorly adapted to recently burned forest conditions. Furthermore, while post-fire forestry operations and, to a lesser extent, post-fire soil conservation measures are commonly applied, their hydrological and erosion impacts remain poorly known, hampering decision-making by land owners and managers. Past post-wildfire research in Portugal has involved simple adaptations of plot-scale runoff-erosion models to post-fire conditions. This follow-up study focusses on model adaptation to selected post-fire soil conservation measures. To this end, full stock is taken of various datasets collected by several past and ongoing research projects. The selected model is the Morgan-Morgan-Finney model (MMF; Morgan, 2001), which has already proved its suitability for post-fire conditions in Portugal (Vieira et al., 2010, 2014) as well as in NW Spain (Fernández et al., 2010). The present results concern runoff and erosion under different burn severities and various post-fire mitigation treatments (mulch, hydromulch, needle cast, barriers), focussing on the plot and field scales. The results for both the first and the second year following the wildfire revealed good model efficiency, not only for burned and untreated conditions but also for burned and treated conditions. These results thus reinforced earlier findings that MMF is a suitable model for the envisaged post-fire soil erosion assessment tool, coined "FEMME". The data used for post-fire soil erosion calibration with the MMF already allow the delineation of the post-fire management FEMME tool. Nevertheless, further model assessment will address additional post-fire forestry operations (e.g. plowing) as well as upscaling to the catchment scale with the MMF model and comparison with the SWAT model.
Serino, Andrea; Canzoneri, Elisa; Marzolla, Marilena; di Pellegrino, Giuseppe; Magosso, Elisa
2015-01-01
Stimuli from different sensory modalities occurring on or close to the body are integrated in a multisensory representation of the space surrounding the body, i.e., peripersonal space (PPS). PPS dynamically modifies depending on experience, e.g., it extends after using a tool to reach far objects. However, the neural mechanism underlying PPS plasticity after tool use is largely unknown. Here we use a combined computational-behavioral approach to propose and test a possible mechanism accounting for PPS extension. We first present a neural network model simulating audio-tactile representation in the PPS around one hand. Simulation experiments showed that our model reproduced the main property of PPS neurons, i.e., selective multisensory response for stimuli occurring close to the hand. We used the neural network model to simulate the effects of a tool-use training. In terms of sensory inputs, tool use was conceptualized as a concurrent tactile stimulation from the hand, due to holding the tool, and an auditory stimulation from the far space, due to tool-mediated action. Results showed that after exposure to those inputs, PPS neurons responded also to multisensory stimuli far from the hand. The model thus suggests that synchronous pairing of tactile hand stimulation and auditory stimulation from the far space is sufficient to extend PPS, such as after tool-use. Such prediction was confirmed by a behavioral experiment, where we used an audio-tactile interaction paradigm to measure the boundaries of PPS representation. We found that PPS extended after synchronous tactile-hand stimulation and auditory-far stimulation in a group of healthy volunteers. Control experiments both in simulation and behavioral settings showed that the same amount of tactile and auditory inputs administered out of synchrony did not change PPS representation. We conclude by proposing a simple, biological-plausible model to explain plasticity in PPS representation after tool-use, which is supported by computational and behavioral data. PMID:25698947
ERIC Educational Resources Information Center
Goldhaber, Dan; Grout, Cyrus; Huntington-Klein, Nick
2014-01-01
Evidence suggests that teacher hiring in public schools is ad hoc and often fails to result in good selection among applicants. Some districts use structured selection instruments in the hiring process, but we know little about the efficacy of such tools. In this paper, we evaluate the ability of applicant selection tools used by the Spokane…
Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.
Stockton, David B; Santamaria, Fidel
2017-10-01
We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
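A hedged sketch of a similar download-and-extract step using the Allen Institute's AllenSDK package (an assumption; the paper's own tools wrap ABI code differently and add a local database plus NeuroManager integration):

```python
# Sketch assuming the AllenSDK package and its CellTypesCache interface are available.
from allensdk.core.cell_types_cache import CellTypesCache

ctc = CellTypesCache(manifest_file="cell_types/manifest.json")

cells = ctc.get_cells()                      # metadata for all cells (downloads on first call)
features = ctc.get_ephys_features()          # precomputed per-cell electrophysiology features

first_id = cells[0]["id"]
sweeps = ctc.get_ephys_sweeps(first_id)      # sweep-level metadata for one specimen
data_set = ctc.get_ephys_data(first_id)      # NWB file with the raw traces

print(f"{len(cells)} cells, {len(features)} feature records, "
      f"{len(sweeps)} sweeps for specimen {first_id}")
```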
Mutturi, Sarma
2017-06-27
Although a handful of tools are available for constraint-based flux analysis to generate knockout strains, most of these are based on bilevel MIP or its modifications. However, metaheuristic approaches, which are known for their flexibility and scalability, have been less studied. Moreover, in the existing tools, sectioning of the search space to find optimal knockouts has not been considered. Herein, a novel computational procedure, termed FOCuS (Flower-pOllination coupled Clonal Selection algorithm), was developed to find the optimal reaction knockouts from a metabolic network to maximize the production of specific metabolites. FOCuS combines the nature-inspired flower pollination algorithm with the artificial immune system-inspired clonal selection algorithm to converge to an optimal solution. To evaluate the performance of FOCuS, its results were compared with reported results from both MIP-based and other metaheuristic-based tools in selected case studies. The results demonstrated the robustness of FOCuS irrespective of the size of the metabolic network and the number of knockouts. Moreover, sectioning of the search space, coupled with pooling of priority reactions based on their contribution to the objective function to generate a smaller search space, significantly reduced the computational time.
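For context, the evaluation step underlying any such knockout search can be sketched with COBRApy: knock out candidate reactions inside a reverting context and score the resulting target flux. The exhaustive loop below is only an illustration, not the FOCuS metaheuristic; the reaction ids are taken from the small "textbook" model bundled with COBRApy and are illustrative choices.

```python
# Illustrative knockout evaluation with COBRApy (not the FOCuS algorithm).
import itertools
from cobra.io import load_model

model = load_model("textbook")                 # small E. coli test model shipped with COBRApy
candidates = ["PGI", "PFK", "FBA"]             # illustrative knockout candidates
target = "EX_ac_e"                             # illustrative target exchange reaction (acetate)

best = (None, -float("inf"))
for k in range(1, 3):
    for combo in itertools.combinations(candidates, k):
        with model:                            # the context manager reverts the knockouts
            for rid in combo:
                model.reactions.get_by_id(rid).knock_out()
            model.objective = target
            value = model.optimize().objective_value
            if value > best[1]:
                best = (combo, value)

print(f"best knockout set: {best[0]}, target flux: {best[1]:.3f}")
```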
Keiderling, Timothy A
2017-12-01
Isotope labeling has a long history in chemistry as a tool for probing structure, offering enhanced sensitivity, or enabling site selection with a wide range of spectroscopic tools. Chirality sensitive methods such as electronic circular dichroism are global structural tools and have intrinsically low resolution. Consequently, they are generally insensitive to modifications to enhance site selectivity. The use of isotope labeling to modify vibrational spectra with unique resolvable frequency shifts can provide useful site-specific sensitivity, and these methods have been recently more widely expanded in biopolymer studies. While the spectral shifts resulting from changes in isotopic mass can provide resolution of modes from specific parts of the molecule and can allow detection of local change in structure with perturbation, these shifts alone do not directly indicate structure or chirality. With vibrational circular dichroism (VCD), the shifted bands and their resultant sign patterns can be used to indicate local conformations in labeled biopolymers, particularly if multiple labels are used and if their coupling is theoretically modeled. This mini-review discusses selected examples of the use of labeling specific amides in peptides to develop local structural insight with VCD spectra. © 2017 Wiley Periodicals, Inc.
Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander
2017-01-01
Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.
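For comparison, PRx itself is commonly computed as a moving Pearson correlation between (averaged) arterial and intracranial pressure; a minimal sketch on synthetic signals, with an illustrative window length rather than the clinical protocol:

```python
# Moving-correlation PRx sketch on synthetic ABP/ICP samples.
import numpy as np

def prx(abp, icp, window=30):
    """Moving Pearson correlation over `window` consecutive (e.g. 10-s averaged) samples."""
    values = []
    for start in range(0, len(abp) - window + 1):
        a = abp[start:start + window]
        i = icp[start:start + window]
        values.append(np.corrcoef(a, i)[0, 1])
    return np.array(values)

rng = np.random.default_rng(5)
abp = 90 + 5 * rng.standard_normal(600)
icp = 12 + 0.4 * (abp - 90) + 2 * rng.standard_normal(600)   # pressure-passive (impaired) pattern

print(f"median PRx: {np.median(prx(abp, icp)):.2f}")   # positive values suggest impaired autoregulation
```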
NASA Astrophysics Data System (ADS)
Carrasco, D.; Trenti, M.; Mutch, S.; Oesch, P. A.
2018-06-01
The luminosity function is a fundamental observable for characterising how galaxies form and evolve throughout cosmic history. One key ingredient to derive this measurement from the number counts in a survey is the characterisation of the completeness and redshift selection functions for the observations. In this paper, we present GLACiAR, an open-source Python tool available on GitHub for estimating the completeness and selection functions in galaxy surveys. The code is tailored for multiband imaging surveys aimed at searching for high-redshift galaxies through the Lyman-break technique, but it can be applied broadly. The code generates artificial galaxies that follow Sérsic profiles with different indices and with customisable size, redshift, and spectral energy distribution properties, adds them to input images, and measures the recovery rate. To illustrate this new software tool, we apply it to quantify the completeness and redshift selection functions for J-dropout sources (redshift z ≈ 10 galaxies) in the Hubble Space Telescope Brightest of Reionizing Galaxies Survey. Our comparison with a previous completeness analysis on the same dataset shows overall agreement, but also highlights how different modelling assumptions for the artificial sources can impact completeness estimates.
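The injection-recovery idea behind such completeness estimates can be sketched far more simply than GLACiAR does it: insert a Sérsic source of known brightness into a noise image, apply a crude detection test, and repeat. Profile parameters and the detection threshold below are arbitrary placeholders.

```python
# Toy injection-recovery completeness estimate with an astropy Sersic2D profile.
import numpy as np
from astropy.modeling.models import Sersic2D

rng = np.random.default_rng(6)
size, sky_sigma, n_trials = 64, 1.0, 200
y, x = np.mgrid[0:size, 0:size]

def recovered(amplitude):
    source = Sersic2D(amplitude=amplitude, r_eff=3.0, n=1.5, x_0=size / 2, y_0=size / 2)(x, y)
    image = source + rng.normal(scale=sky_sigma, size=(size, size))
    # crude detection: aperture sum in a 5x5 box around the known position vs. 5-sigma noise
    box = image[size // 2 - 2:size // 2 + 3, size // 2 - 2:size // 2 + 3]
    return box.sum() > 5.0 * sky_sigma * np.sqrt(box.size)

for amp in (0.2, 0.5, 1.0, 2.0):
    frac = np.mean([recovered(amp) for _ in range(n_trials)])
    print(f"peak amplitude {amp:.1f}: completeness ~ {frac:.2f}")
```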
Using constraints and their value for optimization of large ODE systems
Domijan, Mirela; Rand, David A.
2015-01-01
We provide analytical tools to facilitate a rigorous assessment of the quality and value of the fit of a complex model to data. We use this to provide approaches to model fitting, parameter estimation, the design of optimization functions and experimental optimization. This is in the context where multiple constraints are used to select or optimize a large model defined by differential equations. We illustrate the approach using models of circadian clocks and the NF-κB signalling system. PMID:25673300
NASA Technical Reports Server (NTRS)
Farrell, C. E.; Krauze, L. D.
1983-01-01
NASA's IDEAS computer program is a tool for the interactive preliminary design and analysis of Large Space Systems (LSS). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.
Fuel model selection for BEHAVE in midwestern oak savannas
Grabner, K.W.; Dwyer, J.P.; Cutter, B.E.
2001-01-01
BEHAVE, a fire behavior prediction system, can be a useful tool for managing areas with prescribed fire. However, the proper choice of fuel models can be critical in developing management scenarios. BEHAVE predictions were evaluated using four standardized fuel models that partially described oak savanna fuel conditions: Fuel Model 1 (Short Grass), 2 (Timber and Grass), 3 (Tall Grass), and 9 (Hardwood Litter). Although all four models yielded regressions with R2 in excess of 0.8, Fuel Model 2 produced the most reliable fire behavior predictions.
Batool, Fozia; Iqbal, Shahid; Akbar, Jamshed
2018-04-03
The present study describes Quantitative Structure-Property Relationship (QSPR) modeling relating metal ion characteristics to the adsorption potential of Ficus carica leaves for 13 selected metal ions (Ca²⁺, Cr³⁺, Co²⁺, Cu²⁺, Cd²⁺, K⁺, Mg²⁺, Mn²⁺, Na⁺, Ni²⁺, Pb²⁺, Zn²⁺, and Fe²⁺). A set of 21 characteristic descriptors was selected, and the relationship of these metal characteristics with the adsorptive behavior of the metal ions was investigated. Stepwise Multiple Linear Regression (SMLR) analysis and Artificial Neural Network (ANN) analysis were applied for descriptor selection and model generation. Langmuir and Freundlich isotherms were also applied to the adsorption data to correlate the experimental findings. The generated model indicated the covalent index as the most significant descriptor, responsible for more than 90% of the predicted adsorption (α = 0.05). Internal validation of the model gave a validation coefficient of 0.98. The results indicate that the present model is a useful tool for predicting the adsorptive behavior of different metal ions based on their ionic characteristics.
Weiss, Michael
2017-06-01
Appropriate model selection is important in fitting oral concentration-time data due to the complex character of the absorption process. When IV reference data are available, the problem is the selection of an empirical input function (absorption model). In the present examples, a weighted sum of inverse Gaussian (IG) density functions was found most useful. It is shown that alternative models (gamma and Weibull density) are only valid if the input function is log-concave. Furthermore, it is demonstrated for the first time that the sum-of-IGs model can also be applied to fit oral data directly (without IV data). In the present examples, a weighted sum of two or three IGs was sufficient. From the parameters of this function, the model-independent measures AUC and mean residence time can be calculated. It turned out that a good fit of the data in the terminal phase is essential to avoid biased parameter estimates. The time course of the fractional elimination rate and the concept of log-concavity have proved to be useful tools in model selection.
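A minimal sketch of this input-function family: a weighted sum of two inverse Gaussian densities, with the AUC and mean input time obtained numerically (parameter values are illustrative only).

```python
# Weighted sum of inverse Gaussian densities as an empirical input function (illustrative parameters).
import numpy as np

def inverse_gaussian_pdf(t, mean, shape):
    t = np.asarray(t, dtype=float)
    pdf = np.zeros_like(t)
    pos = t > 0
    pdf[pos] = np.sqrt(shape / (2 * np.pi * t[pos] ** 3)) * np.exp(
        -shape * (t[pos] - mean) ** 2 / (2 * mean ** 2 * t[pos]))
    return pdf

def input_function(t, weights, means, shapes):
    return sum(w * inverse_gaussian_pdf(t, m, s)
               for w, m, s in zip(weights, means, shapes))

t = np.linspace(0, 48, 4801)                        # hours
f = input_function(t, weights=[0.7, 0.3], means=[1.0, 6.0], shapes=[2.0, 10.0])

auc = np.trapz(f, t)                                # ~1, since the weights sum to 1
mit = np.trapz(t * f, t) / auc                      # mean input time
print(f"AUC of input: {auc:.3f}, mean input time: {mit:.2f} h")
```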
Comparison of various tool wear prediction methods during end milling of metal matrix composite
NASA Astrophysics Data System (ADS)
Wiciak, Martyna; Twardowski, Paweł; Wojciechowski, Szymon
2018-02-01
In this paper, the problem of tool wear prediction during milling of the hard-to-cut metal matrix composite Duralcan™ is presented. The research involved measuring the acceleration of vibrations during milling under constant cutting conditions and evaluating the flank wear. Subsequently, the vibrations were analysed in the time and frequency domains, and the obtained measures were correlated with the tool wear values. The validation of tool wear diagnosis in relation to selected diagnostic measures was carried out with one-variable and two-variable regression models, as well as with artificial neural networks (ANN). The comparative analysis of the obtained results enables...
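A simple sketch of the regression part of such a diagnosis scheme, on synthetic data: compute RMS and peak measures per machining pass and fit one- and two-variable linear models of flank wear.

```python
# One- and two-variable regressions of flank wear on vibration measures (synthetic data).
import numpy as np

rng = np.random.default_rng(7)
n_passes = 40
wear = np.linspace(0.02, 0.30, n_passes) + rng.normal(scale=0.01, size=n_passes)  # VB [mm]

# Synthetic acceleration records whose energy grows with wear
signals = [rng.normal(scale=0.5 + 2.0 * vb, size=2048) for vb in wear]
rms = np.array([np.sqrt(np.mean(s ** 2)) for s in signals])
peak = np.array([np.max(np.abs(s)) for s in signals])

# One-variable model: VB ~ RMS
b1 = np.polyfit(rms, wear, 1)
# Two-variable model: VB ~ RMS + peak (least squares)
A = np.column_stack([np.ones(n_passes), rms, peak])
b2, *_ = np.linalg.lstsq(A, wear, rcond=None)

pred1 = np.polyval(b1, rms)
pred2 = A @ b2
print(f"one-variable RMSE: {np.std(wear - pred1):.4f} mm, "
      f"two-variable RMSE: {np.std(wear - pred2):.4f} mm")
```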
Verification of the Icarus Material Response Tool
NASA Technical Reports Server (NTRS)
Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre
2017-01-01
Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve the understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool intended to be used for design while maintaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing are critical, it is of the utmost importance that the design tools be extensively verified and validated before their use. Verification tests aim at ensuring that the numerical schemes and equations are implemented correctly, by comparison with analytical solutions and grid convergence tests.
Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study
NASA Technical Reports Server (NTRS)
Flores, Melissa; Malin, Jane T.
2013-01-01
An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study
NASA Astrophysics Data System (ADS)
Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.
2013-09-01
An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
Non-Linear Dynamical Classification of Short Time Series of the Rössler System in High Noise Regimes
Lainscsek, Claudia; Weyhenmeyer, Jonathan; Hernandez, Manuel E.; Poizner, Howard; Sejnowski, Terrence J.
2013-01-01
Time series analysis with delay differential equations (DDEs) reveals non-linear properties of the underlying dynamical system and can serve as a non-linear time-domain classification tool. Here global DDE models were used to analyze short segments of simulated time series from a known dynamical system, the Rössler system, in high noise regimes. In a companion paper, we apply the DDE model developed here to classify short segments of encephalographic (EEG) data recorded from patients with Parkinson’s disease and healthy subjects. Nine simulated subjects in each of two distinct classes were generated by varying the bifurcation parameter b and keeping the other two parameters (a and c) of the Rössler system fixed. All choices of b were in the chaotic parameter range. We diluted the simulated data using white noise ranging from 10 to −30 dB signal-to-noise ratios (SNR). Structure selection was supervised by selecting the number of terms, delays, and order of non-linearity of the DDE model that best linearly separated the two classes of data. The distances d from the linear dividing hyperplane were then used to assess the classification performance by computing the area A′ under the ROC curve. The selected model was tested on untrained data using repeated random sub-sampling validation. DDEs were able to accurately distinguish the two dynamical conditions, and moreover, to quantify the changes in the dynamics. There was a significant correlation between the dynamical bifurcation parameter b of the simulated data and the classification parameter d from our analysis. This correlation still held for new simulated subjects with new dynamical parameters selected from each of the two dynamical regimes. Furthermore, the correlation was robust to added noise, being significant even when the noise was greater than the signal. We conclude that DDE models may be used as a generalizable and reliable classification tool for even small segments of noisy data. PMID:24379798
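A hedged sketch of the overall idea on synthetic data: simulate two Rössler regimes, fit a small linear-in-parameters delay model to each noisy segment by least squares, and feed the fitted coefficients to a linear classifier. The delays, terms, and classifier below are illustrative choices, not the structure selected in the paper.

```python
# Toy DDE-feature classification of noisy Rössler segments (illustrative model structure).
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def rossler(t, s, a, b, c):
    x, y, z = s
    return [-y - z, x + a * y, b + z * (x - c)]

def segment(b, rng, n=400, dt=0.05):
    sol = solve_ivp(rossler, (0, 100 + n * dt), [1.0, 1.0, 0.0], args=(0.2, b, 5.7),
                    t_eval=np.arange(100, 100 + n * dt, dt), rtol=1e-8)
    x = sol.y[0]
    return x + rng.normal(scale=np.std(x), size=x.size)    # roughly 0 dB SNR

def dde_coeffs(x, dt=0.05, tau1=4, tau2=8):
    """Least-squares fit of x'(t) = a1*x(t-tau1) + a2*x(t-tau2) + a3*x(t-tau1)*x(t-tau2)."""
    dx = np.gradient(x, dt)
    lag = max(tau1, tau2)
    X1, X2 = x[lag - tau1:-tau1], x[lag - tau2:len(x) - tau2]
    A = np.column_stack([X1, X2, X1 * X2])
    coeffs, *_ = np.linalg.lstsq(A, dx[lag:], rcond=None)
    return coeffs

rng = np.random.default_rng(8)
features, labels = [], []
for label, b in ((0, 0.2), (1, 0.4)):                      # two dynamical regimes
    for _ in range(20):
        features.append(dde_coeffs(segment(b, rng)))
        labels.append(label)

clf = LinearDiscriminantAnalysis().fit(features, labels)
print(f"training accuracy: {clf.score(features, labels):.2f}")
```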
Multi-omics facilitated variable selection in Cox-regression model for cancer prognosis prediction.
Liu, Cong; Wang, Xujun; Genchev, Georgi Z; Lu, Hui
2017-07-15
New developments in high-throughput genomic technologies have enabled the measurement of diverse types of omics biomarkers in a cost-efficient and clinically-feasible manner. Developing computational methods and tools for analysis and translation of such genomic data into clinically-relevant information is an ongoing and active area of investigation. For example, several studies have utilized an unsupervised learning framework to cluster patients by integrating omics data. Despite such recent advances, predicting cancer prognosis using integrated omics biomarkers remains a challenge. There is also a shortage of computational tools for predicting cancer prognosis by using supervised learning methods. The current standard approach is to fit a Cox regression model by concatenating the different types of omics data in a linear manner, while penalty could be added for feature selection. A more powerful approach, however, would be to incorporate data by considering relationships among omics datatypes. Here we developed two methods: a SKI-Cox method and a wLASSO-Cox method to incorporate the association among different types of omics data. Both methods fit the Cox proportional hazards model and predict a risk score based on mRNA expression profiles. SKI-Cox borrows the information generated by these additional types of omics data to guide variable selection, while wLASSO-Cox incorporates this information as a penalty factor during model fitting. We show that SKI-Cox and wLASSO-Cox models select more true variables than a LASSO-Cox model in simulation studies. We assess the performance of SKI-Cox and wLASSO-Cox using TCGA glioblastoma multiforme and lung adenocarcinoma data. In each case, mRNA expression, methylation, and copy number variation data are integrated to predict the overall survival time of cancer patients. Our methods achieve better performance in predicting patients' survival in glioblastoma and lung adenocarcinoma. Copyright © 2017. Published by Elsevier Inc.
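As a point of reference, the "standard approach" mentioned above (a penalized Cox model on concatenated features) can be sketched with the lifelines package on synthetic data; the SKI-Cox and wLASSO-Cox weighting schemes themselves are not reproduced here.

```python
# Penalized Cox regression baseline with lifelines (synthetic features and survival times).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n, p = 200, 10
X = rng.normal(size=(n, p))                                   # e.g. mRNA expression features
risk = 0.8 * X[:, 0] - 0.6 * X[:, 1]
time = rng.exponential(scale=np.exp(-risk))                   # synthetic survival times
event = rng.random(n) < 0.7                                   # ~30% censoring

df = pd.DataFrame(X, columns=[f"gene_{i}" for i in range(p)])
df["time"], df["event"] = time, event

cph = CoxPHFitter(penalizer=0.1)                              # penalty shrinks coefficients
cph.fit(df, duration_col="time", event_col="event")
print(f"concordance index: {cph.concordance_index_:.2f}")
```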
Selection Process for New Windows | Efficient Windows Collaborative
Selection Process for Replacement Windows | Efficient Windows Collaborative
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
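The discrete-event core that such a tool rests on can be sketched in a few lines: events sit in a time-ordered queue and are popped until the queue empties, with each handler free to schedule follow-on events. The draining-tank example is purely illustrative.

```python
# Minimal discrete-event loop approximating a continuous process in fixed-size steps.
import heapq

def simulate(initial_events, handlers, until=float("inf")):
    queue = list(initial_events)                 # entries: (time, event_name, payload)
    heapq.heapify(queue)
    state = {"level": 10.0}
    while queue:                                 # run until the event queue is emptied
        time, name, payload = heapq.heappop(queue)
        if time > until:
            break
        for new_event in handlers[name](time, state, payload):
            heapq.heappush(queue, new_event)
    return state

def drain(time, state, step):
    """Continuous draining approximated as discrete steps of `step` units."""
    state["level"] = max(0.0, state["level"] - step)
    print(f"t={time:4.1f}  level={state['level']:.1f}")
    if state["level"] > 0:
        yield (time + 1.0, "drain", step)        # schedule the next step

final = simulate([(0.0, "drain", 1.5)], {"drain": drain})
print("final state:", final)
```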
Khajouei, Hamid; Khajouei, Reza
2017-12-01
Appropriate knowledge, correct information, and relevant data are vital in medical diagnosis and treatment systems. Knowledge Management (KM) through its tools/techniques provides a pertinent framework for decision-making in healthcare systems. The objective of this study was to identify and prioritize the KM tools/techniques that apply to hospital setting. This is a descriptive-survey study. Data were collected using a -researcher-made questionnaire that was developed based on experts' opinions to select the appropriate tools/techniques from 26 tools/techniques of the Asian Productivity Organization (APO) model. Questions were categorized into five steps of KM (identifying, creating, storing, sharing, and applying the knowledge) according to this model. The study population consisted of middle and senior managers of hospitals and managing directors of Vice-Chancellor for Curative Affairs in Kerman University of Medical Sciences in Kerman, Iran. The data were analyzed in SPSS v.19 using one-sample t-test. Twelve out of 26 tools/techniques of the APO model were identified as the tools applicable in hospitals. "Knowledge café" and "APO knowledge management assessment tool" with respective means of 4.23 and 3.7 were the most and the least applicable tools in the knowledge identification step. "Mentor-mentee scheme", as well as "voice and Voice over Internet Protocol (VOIP)" with respective means of 4.20 and 3.52 were the most and the least applicable tools/techniques in the knowledge creation step. "Knowledge café" and "voice and VOIP" with respective means of 3.85 and 3.42 were the most and the least applicable tools/techniques in the knowledge storage step. "Peer assist and 'voice and VOIP' with respective means of 4.14 and 3.38 were the most and the least applicable tools/techniques in the knowledge sharing step. Finally, "knowledge worker competency plan" and "knowledge portal" with respective means of 4.38 and 3.85 were the most and the least applicable tools/techniques in the knowledge application step. The results showed that 12 out of 26 tools in the APO model are appropriate for hospitals of which 11 are significantly applicable, and "storytelling" is marginally applicable. In this study, the preferred tools/techniques for implementation of each of the five KM steps in hospitals are introduced. Copyright © 2017 Elsevier B.V. All rights reserved.
In Silico Approaches for Predicting Adme Properties
NASA Astrophysics Data System (ADS)
Madden, Judith C.
A drug requires a suitable pharmacokinetic profile to be efficacious in vivo in humans. The relevant pharmacokinetic properties include the absorption, distribution, metabolism, and excretion (ADME) profile of the drug. This chapter provides an overview of the definition and meaning of key ADME properties, recent models developed to predict these properties, and a guide as to how to select the most appropriate model(s) for a given query. Many tools using the state-of-the-art in silico methodology are now available to users, and it is anticipated that the continual evolution of these tools will provide greater ability to predict ADME properties in the future. However, caution must be exercised in applying these tools as data are generally available only for "successful" drugs, i.e., those that reach the marketplace, and little supplementary information, such as that for drugs that have a poor pharmacokinetic profile, is available. The possibilities of using these methods and possible integration into toxicity prediction are explored.
Spatial Selection and Local Adaptation Jointly Shape Life-History Evolution during Range Expansion.
Van Petegem, Katrien H P; Boeye, Jeroen; Stoks, Robby; Bonte, Dries
2016-11-01
In the context of climate change and species invasions, range shifts increasingly gain attention because the rates at which they occur in the Anthropocene induce rapid changes in biological assemblages. During range shifts, species experience multiple selection pressures. For poleward expansions in particular, it is difficult to interpret observed evolutionary dynamics because of the joint action of evolutionary processes related to spatial selection and to adaptation toward local climatic conditions. To disentangle the effects of these two processes, we integrated stochastic modeling and data from a common garden experiment, using the spider mite Tetranychus urticae as a model species. By linking the empirical data with those derived from a highly parameterized individual-based model, we infer that both spatial selection and local adaptation contributed to the observed latitudinal life-history divergence. Spatial selection best described variation in dispersal behavior, while variation in development was best explained by adaptation to the local climate. Divergence in life-history traits in species shifting poleward could consequently be jointly determined by contemporary evolutionary dynamics resulting from adaptation to the environmental gradient and from spatial selection. The integration of modeling with common garden experiments provides a powerful tool to study the contribution of these evolutionary processes to life-history evolution during range expansion.
Modeling selective attention using a neuromorphic analog VLSI device.
Indiveri, G
2000-12-01
Attentional mechanisms are required to overcome the problem of flooding a limited processing capacity system with information. They are present in biological sensory systems and can be a useful engineering tool for artificial visual systems. In this article we present a hardware model of a selective attention mechanism implemented on a very large-scale integration (VLSI) chip, using analog neuromorphic circuits. The chip exploits a spike-based representation to receive, process, and transmit signals. It can be used as a transceiver module for building multichip neuromorphic vision systems. We describe the circuits that carry out the main processing stages of the selective attention mechanism and provide experimental data for each circuit. We demonstrate the expected behavior of the model at the system level by stimulating the chip with both artificially generated control signals and signals obtained from a saliency map, computed from an image containing several salient features.
Lion, Sébastien
2009-09-07
Taking into account the interplay between spatial ecological dynamics and selection is a major challenge in evolutionary ecology. Although inclusive fitness theory has proven to be a very useful tool to unravel the interactions between spatial genetic structuring and selection, applications of the theory usually rely on simplifying demographic assumptions. In this paper, I attempt to bridge the gap between spatial demographic models and kin selection models by providing a method to compute approximations for relatedness coefficients in a spatial model with empty sites. Using spatial moment equations, I provide an approximation of nearest-neighbour relatedness on random regular networks, and show that this approximation performs much better than the ordinary pair approximation. I discuss the connection between the relatedness coefficients I define and those used in population genetics, and sketch some potential extensions of the theory.
Functional Fault Modeling of a Cryogenic System for Real-Time Fault Detection and Isolation
NASA Technical Reports Server (NTRS)
Ferrell, Bob; Lewis, Mark; Oostdyk, Rebecca; Perotti, Jose
2009-01-01
When setting out to model and/or simulate a complex mechanical or electrical system, a modeler is faced with a vast array of tools, software, equations, algorithms and techniques that may individually or in concert aid in the development of the model. Mature requirements and a well understood purpose for the model may considerably shrink the field of possible tools and algorithms that will suit the modeling solution. Is the model intended to be used in an offline fashion or in real-time? On what platform does it need to execute? How long will the model be allowed to run before it outputs the desired parameters? What resolution is desired? Do the parameters need to be qualitative or quantitative? Is it more important to capture the physics or the function of the system in the model? Does the model need to produce simulated data? All these questions and more will drive the selection of the appropriate tools and algorithms, but the modeler must be diligent to bear in mind the final application throughout the modeling process to ensure the model meets its requirements without needless iterations of the design. The purpose of this paper is to describe the considerations and techniques used in the process of creating a functional fault model of a liquid hydrogen (LH2) system that will be used in a real-time environment to automatically detect and isolate failures.
On the Usefulness of Hydrologic Landscapes on Hydrologic Model calibration and Selection
Hydrologic Landscapes (HLs) are units that can be used in aggregate to describe the watershed-scale hydrologic response of an area through use of physical and climatic properties. The HL assessment unit is a useful classification tool to relate and transfer hydrologically meaning...
78 FR 73549 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-06
... consistent with human MMA. This model could serve as a valuable research tool for designing treatments for... federally-funded research and development. Foreign patent applications are filed on selected inventions to...: Peter Soukas, J.D.; 301-435-4646; [email protected] . Collaborative Research Opportunity: The National...
Clay Caterpillars: A Tool for Ecology & Evolution Laboratories
ERIC Educational Resources Information Center
Barber, Nicholas A.
2012-01-01
I present a framework for ecology and evolution laboratory exercises using artificial caterpillars made from modeling clay. Students generate and test hypotheses about predation rates on caterpillars that differ in appearance or "behavior" to understand how natural selection by predators shapes distribution and physical characteristics of…
Managing the "Performance" in Performance Management.
ERIC Educational Resources Information Center
Repinski, Marilyn; Bartsch, Maryjo
1996-01-01
Describes a five-step approach to performance management which includes (1) redefining tasks; (2) identifying skills; (3) determining what development tools are necessary; (4) prioritizing skills development; and (5) developing an action plan. Presents a hiring model that includes job analysis, job description, selection, goal setting, evaluation,…
Choice-Based Segmentation as an Enrollment Management Tool
ERIC Educational Resources Information Center
Young, Mark R.
2002-01-01
This article presents an approach to enrollment management based on target marketing strategies developed from a choice-based segmentation methodology. Students are classified into "switchable" or "non-switchable" segments based on their probability of selecting specific majors. A modified multinomial logit choice model is used to identify…
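The abstract names a modified multinomial logit choice model but gives no formulation. As a loose illustration only, the sketch below fits a plain multinomial (softmax) logit to fabricated applicant data with scikit-learn; all variable names and values are invented and the "modified" variant is not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Fabricated applicant features: test score, distance to campus (km), aid offered (k$)
X = np.column_stack([rng.normal(60, 10, 300),
                     rng.uniform(5, 300, 300),
                     rng.uniform(0, 10, 300)])
majors = rng.choice(["business", "engineering", "arts"], size=300)  # invented choices

# With the default lbfgs solver this fits a multinomial (softmax) model over the three majors
model = LogisticRegression(max_iter=1000).fit(X, majors)
probs = model.predict_proba(X[:1])
print(dict(zip(model.classes_, probs[0].round(3))))
```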
Preliminary Exploration of Encounter During Transit Across Southern Africa
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stroud, Phillip David; Cuellar-Hengartner, Leticia; Kubicek, Deborah Ann
Los Alamos National Laboratory (LANL) is utilizing the Probability Effectiveness Methodology (PEM) tools, particularly the Pathway Analysis, Threat Response and Interdiction Options Tool (PATRIOT) to support the DNDO Architecture and Planning Directorate’s (APD) development of a multi-region terrorist risk assessment tool. The effort is divided into three stages. The first stage is an exploration of what can be done with PATRIOT essentially as is, to characterize encounter rate during transit across a single selected region. The second stage is to develop, condition, and implement required modifications to the data and conduct analysis to generate a well-founded assessment of the transit reliability across that selected region, and to identify any issues in the process. The final stage is to extend the work to a full multi-region global model. This document provides the results of the first stage, namely preliminary explorations with PATRIOT to assess the transit reliability across the region of southern Africa.
TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.
Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David
2016-10-01
Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely, ART v.1.5, STOFFENMANAGER® v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should help reduce the uncertain entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required for running several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under the Registration, Evaluation, Authorisation and restriction of CHemicals (REACH) Regulation. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed
2017-01-05
For the first time, a new variable selection method based on swarm intelligence namely firefly algorithm is coupled with three different multivariate calibration models namely, concentration residual augmented classical least squares, artificial neural network and support vector regression in UV spectral data. A comparative study between the firefly algorithm and the well-known genetic algorithm was developed. The discussion revealed the superiority of using this new powerful algorithm over the well-known genetic algorithm. Moreover, different statistical tests were performed and no significant differences were found between all the models regarding their predictabilities. This ensures that simpler and faster models were obtained without any deterioration of the quality of the calibration. Copyright © 2016 Elsevier B.V. All rights reserved.
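The firefly algorithm itself is not spelled out in the abstract. The following is a minimal continuous firefly optimizer sketch (illustrative parameter names and a sphere test function); in a variable-selection setting it would be wrapped around a calibration-error objective with a binary encoding of the selected wavelengths.

```python
import numpy as np

def firefly_minimize(objective, dim, n_fireflies=20, n_iter=100,
                     alpha=0.2, beta0=1.0, gamma=1.0, bounds=(-5.0, 5.0), seed=0):
    """Minimal continuous firefly algorithm (illustrative sketch, not the paper's code)."""
    rng = np.random.default_rng(seed)
    low, high = bounds
    pos = rng.uniform(low, high, size=(n_fireflies, dim))
    intensity = np.array([objective(p) for p in pos])  # lower objective = brighter firefly

    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if intensity[j] < intensity[i]:          # move i toward the brighter j
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                    step = alpha * (rng.random(dim) - 0.5)
                    pos[i] = np.clip(pos[i] + beta * (pos[j] - pos[i]) + step, low, high)
                    intensity[i] = objective(pos[i])
        alpha *= 0.97                                    # gradually reduce randomness

    best = np.argmin(intensity)
    return pos[best], intensity[best]

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    x_best, f_best = firefly_minimize(sphere, dim=5)
    print("best value found:", f_best)
```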
Top-attack modeling and automatic target detection using synthetic FLIR scenery
NASA Astrophysics Data System (ADS)
Weber, Bruce A.; Penn, Joseph A.
2004-09-01
A series of experiments has been performed to verify the utility of algorithmic tools for the modeling and analysis of cold-target signatures in synthetic, top-attack, FLIR video sequences. The tools include: MuSES/CREATION for the creation of synthetic imagery with targets, an ARL target detection algorithm to detect embedded synthetic targets in scenes, and an ARL scoring algorithm, using Receiver-Operating-Characteristic (ROC) curve analysis, to evaluate detector performance. Cold-target detection variability was examined as a function of target emissivity, surrounding clutter type, and target placement in non-obscuring clutter locations. Detector metrics were also individually scored so as to characterize the effect of signature/clutter variations. Results show that, using these tools, a detailed, physically meaningful target detection analysis is possible and that scenario-specific target detectors may be developed by selective choice and/or weighting of detector metrics. However, developing these tools into a reliable predictive capability will require the extension of these results to the modeling and analysis of a large number of data sets configured for a wide range of target and clutter conditions. Finally, these tools should also be useful for the comparison of competitive detection algorithms by providing well-defined and controllable target detection scenarios, as well as for the training and testing of expert human observers.
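Receiver-Operating-Characteristic scoring of a detector can be reproduced in a few lines. The sketch below uses scikit-learn on made-up detector scores and ground-truth labels and is not the ARL scoring algorithm itself.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# Hypothetical detector scores: higher means "target present"
scores = np.array([0.95, 0.80, 0.75, 0.60, 0.55, 0.40, 0.30, 0.20, 0.15, 0.05])
truth  = np.array([1,    1,    0,    1,    0,    1,    0,    0,    0,    0])

fpr, tpr, thresholds = roc_curve(truth, scores)   # one (FPR, TPR) point per threshold
print("ROC points (FPR, TPR):", list(zip(fpr.round(2), tpr.round(2))))
print("area under the ROC curve:", round(auc(fpr, tpr), 3))
```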
Selection of Thermal Worst-Case Orbits via Modified Efficient Global Optimization
NASA Technical Reports Server (NTRS)
Moeller, Timothy M.; Wilhite, Alan W.; Liles, Kaitlin A.
2014-01-01
Efficient Global Optimization (EGO) was used to select orbits with worst-case hot and cold thermal environments for the Stratospheric Aerosol and Gas Experiment (SAGE) III. The SAGE III system thermal model changed substantially since the previous selection of worst-case orbits (which did not use the EGO method), so the selections were revised to ensure the worst cases are being captured. The EGO method consists of first conducting an initial set of parametric runs, generated with a space-filling Design of Experiments (DoE) method, then fitting a surrogate model to the data and searching for points of maximum Expected Improvement (EI) to conduct additional runs. The general EGO method was modified by using a multi-start optimizer to identify multiple new test points at each iteration. This modification facilitates parallel computing and decreases the burden of user interaction when the optimizer code is not integrated with the model. Thermal worst-case orbits for SAGE III were successfully identified and shown by direct comparison to be more severe than those identified in the previous selection. The EGO method is a useful tool for this application and can result in computational savings if the initial Design of Experiments (DoE) is selected appropriately.
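The Expected Improvement criterion at the heart of EGO has a closed form given a surrogate's predictive mean and standard deviation. A minimal sketch (minimization convention, hypothetical values) is shown below; the paper's modification of selecting several points per iteration would call this repeatedly from a multi-start optimizer.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best, xi=0.0):
    """Expected Improvement for minimization.

    mu, sigma : surrogate predictive mean and standard deviation at candidate points
    f_best    : best (lowest) objective value observed so far
    xi        : optional exploration margin
    """
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    improvement = f_best - mu - xi
    with np.errstate(divide="ignore", invalid="ignore"):
        z = improvement / sigma
        ei = improvement * norm.cdf(z) + sigma * norm.pdf(z)
    ei[sigma <= 0.0] = 0.0   # no predictive uncertainty -> no expected improvement
    return ei

# Usage: evaluate the candidate orbit/parameter set with the largest EI next
mu = np.array([10.2, 9.8, 11.0])      # illustrative surrogate means
sigma = np.array([0.5, 1.5, 0.0])     # illustrative surrogate standard deviations
print(expected_improvement(mu, sigma, f_best=10.0))
```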
Analysis of a mammography teaching program based on an affordance design model.
Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei
2006-12-01
The wide use of computer technology in education, particularly in mammogram reading, calls for e-learning evaluation. The existing media comparative studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria include content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods include interviews, protocol-based document analysis, and usability tests and inspection. Some statistics were also calculated. The examination of PBE showed that this educational software includes several purpose-built tools. The learner can use these tools in the process of optimizing displays, scanning images, comparing different projections, marking regions of interest, constructing a descriptive report, assessing one's learning outcomes, and comparing one's decisions with the experts' decisions. Further, PBE provides some resources for the learner to construct one's knowledge and skills, including a categorized image library, a term-searching function, and some teaching links. In addition, users found it easy to navigate and carry out tasks. The users also reacted positively toward PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with some cognitive tools, supporting their perceptual problem-solving processes and extending their capabilities. Learners can internalize the mental models in mammogram reading through multiple perceptual triangulations, sensitization of related features, semantic description of mammogram findings, and expert-guided semantic report construction. The design of these cognitive tools and the software interface matches the findings and principles in human learning and instructional design. Working with PBE's case-based simulations and categorized gallery, learners can enrich and transfer their experience to their jobs.
Development of a QFD-based expert system for CNC turning centre selection
NASA Astrophysics Data System (ADS)
Prasad, Kanika; Chakraborty, Shankar
2015-12-01
Computer numerical control (CNC) machine tools are automated devices capable of generating complicated and intricate product shapes in a short time. Selection of the best CNC machine tool is a critical, complex and time-consuming task due to the availability of a wide range of alternatives and the conflicting nature of several evaluation criteria. Although past researchers have attempted to select appropriate machining centres using different knowledge-based systems, mathematical models and multi-criteria decision-making methods, none of those approaches has given due importance to the voice of customers. The aforesaid limitation can be overcome using the quality function deployment (QFD) technique, which is a systematic approach for integrating customers' needs and designing the product to meet those needs first time and every time. In this paper, the adopted QFD-based methodology helps in selecting CNC turning centres for a manufacturing organization, providing due importance to the voice of customers to meet their requirements. An expert system based on the QFD technique is developed in Visual BASIC 6.0 to automate the CNC turning centre selection procedure for different production plans. Three illustrative examples are demonstrated to explain the real-time applicability of the developed expert system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jang, Dae-Heung; Anderson-Cook, Christine Michaela
2017-04-12
When there are constraints on resources, an unreplicated factorial or fractional factorial design can allow efficient exploration of numerous factor and interaction effects. A half-normal plot is a common graphical tool used to compare the relative magnitude of effects and to identify important effects from these experiments when no estimate of error from the experiment is available. An alternative is to use a least absolute shrinkage and selection operator plot to examine the pattern of model selection terms from an experiment. We examine how both the half-normal and least absolute shrinkage and selection operator plots are impacted by the absence of individual observations or an outlier, and the robustness of conclusions obtained from these 2 techniques for identifying important effects from factorial experiments. The methods are illustrated with 2 examples from the literature.
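As a rough companion to the description above, the sketch below builds a half-normal plot of effect estimates for a made-up unreplicated 2^3 factorial; the data, factor names, and plotting details are illustrative and are not taken from the paper's examples.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

# Hypothetical 2^3 unreplicated factorial in coded (-1, +1) units
A = np.array([-1, 1, -1, 1, -1, 1, -1, 1])
B = np.array([-1, -1, 1, 1, -1, -1, 1, 1])
C = np.array([-1, -1, -1, -1, 1, 1, 1, 1])
y = np.array([45.0, 71.0, 48.0, 65.0, 68.0, 60.0, 80.0, 65.0])  # made-up response

# Effect estimate for each contrast column: (contrast . y) / (N/2)
columns = {"A": A, "B": B, "C": C, "AB": A*B, "AC": A*C, "BC": B*C, "ABC": A*B*C}
effects = {name: float(np.dot(col, y)) / (len(y) / 2) for name, col in columns.items()}

# Half-normal plot: sorted |effect| against half-normal quantiles
names = sorted(effects, key=lambda k: abs(effects[k]))
abs_eff = np.array([abs(effects[k]) for k in names])
m = len(abs_eff)
quantiles = norm.ppf(0.5 + 0.5 * (np.arange(1, m + 1) - 0.5) / m)

plt.scatter(quantiles, abs_eff)
for q, e, name in zip(quantiles, abs_eff, names):
    plt.annotate(name, (q, e))
plt.xlabel("half-normal quantile")
plt.ylabel("|effect|")
plt.title("Half-normal plot of effects (illustrative data)")
plt.show()
```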
A tool for multi-scale modelling of the renal nephron
Nickerson, David P.; Terkildsen, Jonna R.; Hamilton, Kirk L.; Hunter, Peter J.
2011-01-01
We present the development of a tool, which provides users with the ability to visualize and interact with a comprehensive description of a multi-scale model of the renal nephron. A one-dimensional anatomical model of the nephron has been created and is used for visualization and modelling of tubule transport in various nephron anatomical segments. Mathematical models of nephron segments are embedded in the one-dimensional model. At the cellular level, these segment models use models encoded in CellML to describe cellular and subcellular transport kinetics. A web-based presentation environment has been developed that allows the user to visualize and navigate through the multi-scale nephron model, including simulation results, at the different spatial scales encompassed by the model description. The Zinc extension to Firefox is used to provide an interactive three-dimensional view of the tubule model and the native Firefox rendering of scalable vector graphics is used to present schematic diagrams for cellular and subcellular scale models. The model viewer is embedded in a web page that dynamically presents content based on user input. For example, when viewing the whole nephron model, the user might be presented with information on the various embedded segment models as they select them in the three-dimensional model view. Alternatively, the user chooses to focus the model viewer on a cellular model located in a particular nephron segment in order to view the various membrane transport proteins. Selecting a specific protein may then present the user with a description of the mathematical model governing the behaviour of that protein—including the mathematical model itself and various simulation experiments used to validate the model against the literature. PMID:22670210
Design Considerations | Efficient Windows Collaborative
Gas Fills | Efficient Windows Collaborative
Understanding Windows | Efficient Windows Collaborative
Books & Publications | Efficient Windows Collaborative
Efficient Windows Collaborative | Home
Resources | Efficient Windows Collaborative
Provide Views | Efficient Windows Collaborative
Links | Efficient Windows Collaborative
Reducing Condensation | Efficient Windows Collaborative
Reduced Fading | Efficient Windows Collaborative
EWC Membership | Efficient Windows Collaborative
Visible Transmittance | Efficient Windows Collaborative
EWC Members | Efficient Windows Collaborative
Financing & Incentives | Efficient Windows Collaborative
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is presented through a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and fuzzy COPRAS as a precise and effective multi-attribute decision-making tool for evaluating machine tools in an uncertain environment. PMID:26368541
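The paper uses fuzzy AHP weights feeding a fuzzy COPRAS ranking. The sketch below shows only the crisp COPRAS step on a fabricated machine-tool decision matrix, with the criterion weights assumed to come from AHP; the fuzzy variant is not reproduced.

```python
import numpy as np

def copras_rank(X, weights, benefit_mask):
    """Crisp COPRAS ranking (simplified; the paper uses a fuzzy variant).

    X            : (alternatives x criteria) decision matrix
    weights      : criterion weights (e.g., from AHP), summing to 1
    benefit_mask : True where larger values are better, False for cost criteria
    """
    X = np.asarray(X, dtype=float)
    w = np.asarray(weights, dtype=float)
    mask = np.asarray(benefit_mask, dtype=bool)
    D = w * X / X.sum(axis=0)                  # normalized, weighted decision matrix
    S_plus = D[:, mask].sum(axis=1)            # weighted benefit part per alternative
    S_minus = D[:, ~mask].sum(axis=1)          # weighted cost part per alternative
    # Relative significance Q_i (standard COPRAS formula)
    Q = S_plus + S_minus.min() * S_minus.sum() / (S_minus * (S_minus.min() / S_minus).sum())
    utility = 100.0 * Q / Q.max()              # utility degree in percent
    return Q, utility

# Hypothetical machine-tool alternatives scored on cost, power, accuracy, flexibility
X = [[120, 15, 8, 7],
     [100, 18, 7, 8],
     [140, 20, 9, 6]]
weights = [0.35, 0.20, 0.30, 0.15]
benefit = [False, True, True, True]            # first criterion is purchase cost
Q, utility = copras_rank(X, weights, benefit)
print("relative significance:", Q.round(4), "utility %:", utility.round(1))
```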
Developing a framework for energy technology portfolio selection
NASA Astrophysics Data System (ADS)
Davoudpour, Hamid; Ashrafi, Maryam
2012-11-01
Today, the increased consumption of energy in the world, in addition to the risk of rapid exhaustion of fossil resources, has forced industrial firms and organizations to utilize energy technology portfolio management tools, viewed both as a means of diversifying energy sources and of making optimal use of available energy sources. Furthermore, the rapid development of technologies, their increasing complexity and variety, and market dynamics have made the task of technology portfolio selection difficult. Given the high level of competitiveness, organizations need to strategically allocate their limited resources to the best subset of possible candidates. This paper presents the results of developing a mathematical model for energy technology portfolio selection at an R&D center that maximizes support of the organization's strategy and values. The model balances the cost and benefit of the entire portfolio.
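The abstract does not give the model's exact formulation. As a stand-in, the sketch below treats portfolio selection as a small 0/1 knapsack problem (maximize a strategic-benefit score under a budget) solved by exhaustive enumeration; candidate names, costs, and scores are invented.

```python
from itertools import combinations

# Hypothetical R&D energy-technology candidates: (name, cost, strategic benefit score)
candidates = [("solar-thermal", 4.0, 7.0),
              ("geothermal", 6.0, 9.0),
              ("wind-microgrid", 3.0, 5.0),
              ("hydrogen-storage", 5.0, 8.0),
              ("biomass", 2.0, 3.0)]
budget = 10.0

def best_portfolio(candidates, budget):
    """Enumerate all subsets and keep the feasible one with the largest total benefit."""
    best_subset, best_benefit = (), 0.0
    for r in range(1, len(candidates) + 1):
        for subset in combinations(candidates, r):
            cost = sum(c for _, c, _ in subset)
            benefit = sum(b for _, _, b in subset)
            if cost <= budget and benefit > best_benefit:
                best_subset, best_benefit = subset, benefit
    return best_subset, best_benefit

portfolio, benefit = best_portfolio(candidates, budget)
print([name for name, _, _ in portfolio], "total benefit:", benefit)
```

For realistic portfolio sizes the enumeration would be replaced by an integer-programming or dynamic-programming solver, but the objective and budget constraint stay the same.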
Selective laser sintering: A qualitative and objective approach
NASA Astrophysics Data System (ADS)
Kumar, Sanjay
2003-10-01
This article presents an overview of selective laser sintering (SLS) work as reported in various journals and proceedings. Selective laser sintering was first done mainly on polymers and nylon to create prototypes for audio-visual help and fit-to-form tests. Gradually it was expanded to include metals and alloys to manufacture functional prototypes and develop rapid tooling. The growth gained momentum with the entry of commercial entities such as DTM Corporation and EOS GmbH Electro Optical Systems. Computational modeling has been used to understand the SLS process, optimize the process parameters, and enhance the efficiency of the sintering machine.
Optoelectronic simulation of GaAs solar cells with angularly selective filters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraus, Tobias, E-mail: tobias.kraus@ise.fraunhofer.de; Höhn, Oliver; Hauser, Hubert
We discuss the influence of angularly selective filters on thin-film gallium arsenide solar cells. For this reason, the detailed balance model was refined to fit our needs with respect to Auger recombination, reflection, transmission, and realistic absorption. For calculating real systems, an approach was made to include optical effects of angularly selective filters into electron-hole dynamic equations implemented in PC1D, a one-dimensional solar cell calculation tool. With this approach, we find a relative open-circuit voltage (V_oc) increase of 5% for an idealized 100 nm GaAs cell, including Auger recombination.
Klijn, Sven L; Weijenberg, Matty P; Lemmens, Paul; van den Brandt, Piet A; Lima Passos, Valéria
2017-10-01
Background and objective: Group-based trajectory modelling is a model-based clustering technique applied for the identification of latent patterns of temporal changes. Despite its manifold applications in clinical and health sciences, potential problems of the model selection procedure are often overlooked. The choice of the number of latent trajectories (class-enumeration), for instance, is to a large degree based on statistical criteria that are not fail-safe. Moreover, the process as a whole is not transparent. To facilitate class enumeration, we introduce a graphical summary display of several fit and model adequacy criteria, the fit-criteria assessment plot. Methods: An R-code that accepts universal data input is presented. The programme condenses relevant group-based trajectory modelling output information of model fit indices in automated graphical displays. Examples based on real and simulated data are provided to illustrate, assess and validate fit-criteria assessment plot's utility. Results: Fit-criteria assessment plot provides an overview of fit criteria on a single page, placing users in an informed position to make a decision. Fit-criteria assessment plot does not automatically select the most appropriate model but eases the model assessment procedure. Conclusions: Fit-criteria assessment plot is an exploratory, visualisation tool that can be employed to assist decisions in the initial and decisive phase of group-based trajectory modelling analysis. Considering group-based trajectory modelling's widespread resonance in medical and epidemiological sciences, a more comprehensive, easily interpretable and transparent display of the iterative process of class enumeration may foster group-based trajectory modelling's adequate use.
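The authors' tool is an R routine for group-based trajectory models. The Python sketch below is only a loose analogue of the underlying idea, plotting BIC, AIC, and log-likelihood against the number of latent classes, with a Gaussian mixture model standing in for the trajectory model and fully synthetic data.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic "trajectory-like" data: three latent groups measured at five time points
means = np.array([[0, 1, 2, 3, 4], [5, 5, 5, 5, 5], [8, 6, 4, 2, 0]], dtype=float)
X = np.vstack([m + rng.normal(0, 1.0, size=(100, 5)) for m in means])

ks = range(1, 7)
bic, aic, loglik = [], [], []
for k in ks:
    gm = GaussianMixture(n_components=k, covariance_type="diag", random_state=0).fit(X)
    bic.append(gm.bic(X))
    aic.append(gm.aic(X))
    loglik.append(gm.score(X) * len(X))   # total log-likelihood

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
for ax, values, label in zip(axes, [bic, aic, loglik], ["BIC", "AIC", "log-likelihood"]):
    ax.plot(list(ks), values, marker="o")
    ax.set_xlabel("number of classes")
    ax.set_ylabel(label)
fig.suptitle("Fit criteria vs. class enumeration (illustrative analogue)")
plt.tight_layout()
plt.show()
```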
Tool Steel Heat Treatment Optimization Using Neural Network Modeling
NASA Astrophysics Data System (ADS)
Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz
2016-11-01
Optimization of tool steel properties and corresponding heat treatment is mainly based on trial and error approach, which requires tremendous experimental work and resources. Therefore, there is a huge need for tools allowing prediction of mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case training of the feedforward neural network with error backpropagation training scheme and four layers of neurons (8-20-20-2) scheme was based on the experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found as the most beneficial alloying element increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. Optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.
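The paper trains a purpose-built 8-20-20-2 feedforward network with backpropagation. A comparable two-hidden-layer architecture can be sketched with scikit-learn as below; the inputs, targets, and the simple response surface used to generate them are entirely made up and stand in for the measured tempering-diagram data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Hypothetical inputs: C, Si, Mn, Cr, Mo, V content (wt pct), austenitizing T, tempering T (deg C)
X = rng.uniform([0.3, 0.2, 0.2, 4.0, 1.0, 0.2, 980, 500],
                [0.5, 1.2, 0.6, 5.5, 3.0, 1.0, 1060, 650], size=(200, 8))
# Made-up targets standing in for measured hardness (HRC) and fracture toughness (KIc)
y = np.column_stack([
    40 + 10 * X[:, 0] + 0.01 * (X[:, 6] - 980) - 0.02 * (X[:, 7] - 500) + rng.normal(0, 0.5, 200),
    60 + 15 * X[:, 5] - 5 * X[:, 1] + rng.normal(0, 1.0, 200),
])

# Two hidden layers of 20 neurons each, loosely mirroring the paper's 8-20-20-2 layout
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20, 20), activation="tanh",
                 max_iter=5000, random_state=0),
)
model.fit(X, y)
print("R^2 on training data:", round(model.score(X, y), 3))
```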
Quantum vision in three dimensions
NASA Astrophysics Data System (ADS)
Roth, Yehuda
We present four models for describing 3-D vision. Similar to the mirror scenario, our models allow 3-D vision without additional accessories such as stereoscopic glasses or a hologram film. These four models are based on brain interpretation rather than pure objective encryption. We consider the observer's "subjective" selection of a measuring device, and the corresponding quantum collapse into one of the selected states, as a tool for interpreting reality according to the observer's concepts. This is the basic concept of our study, and it is introduced in the first model. The other models suggest "softened" versions that may be much easier to implement. Our quantum interpretation approach contributes to the following fields. In technology, the proposed models can be implemented in real devices, allowing 3-D vision without additional accessories. In artificial intelligence, where the aim is to create a machine that exchanges information using human terminologies, our interpretation approach seems appropriate.
75 FR 54403 - U.S. National Climate Assessment Objectives, Proposed Topics, and Next Steps
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-07
..., methods and design, tools for assessing climate change and impacts, dealing with uncertainty, sources of..., coordination with other Federal climate-related programs, design of documents and tailored communications with... methodological perspectives related to selecting model and downscaling outputs and approaches for their use in...
A Software Tool for the Rapid Analysis of the Sintering Behavior of Particulate Bodies
2017-11-01
bounded by a region that the user selects via cross hairs. Future plot analysis features, such as more complicated curve fitting and modeling functions... German RM. Grain growth behavior of tungsten heavy alloys based on the master sintering curve concept. Metallurgical and Materials Transactions A.
USDA-ARS?s Scientific Manuscript database
Population genetics is a powerful tool for invasion biology and pest management, from tracing invasion pathways to informing management decisions with inference of population demographics. Genomics greatly increases the resolution of population-scale analyses, yet outside of model species with exten...
USDA-ARS?s Scientific Manuscript database
Demographic matrix modeling of invasive plant populations can be a powerful tool to identify key life stage transitions for targeted disruption in order to cause population decline. This approach can provide quantitative estimates of reductions in select vital rates needed to reduce population growt...
Case studies in key selected coral reefs and watersheds will be completed to provide scientific data, concepts and models that describe the responses of the functioning of these ecosystems to global change stressors. The studies will focus on relating global changes to local and...
Model for Presenting Resources in Scholar's Portal
ERIC Educational Resources Information Center
Feeney, Mary; Newby, Jill
2005-01-01
Presenting electronic resources to users through a federated search engine introduces unique opportunities and challenges to libraries. This article reports on the decision-making tools and processes used for selecting collections of electronic resources by a project team at the University of Arizona (UA) Libraries for the Association of Research…
Basic Skills Support in Business and Industry.
ERIC Educational Resources Information Center
Byatt, Janet; Davies, Karen
This guide is designed as a tool for English and Welsh businesses wanting to provide basic skills training for their employees. It provides practical solutions to the problems of identifying employees' basic skills needs and selecting the best model of training delivery to address identified training needs. The introductory section discusses basic…
Perseveration in Tool Use: A Window for Understanding the Dynamics of the Action-Selection Process
ERIC Educational Resources Information Center
Smitsman, Ad W.; Cox, Ralf F. A.
2008-01-01
Two experiments investigated how 3-year-old children select a tool to perform a manual task, with a focus on their perseverative parameter choices for the various relationships involved in handling a tool: the actor-to-tool relation and the tool-to-target relation (topology). The first study concerned the parameter value for the tool-to-target…
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckers, Koenraad J; McCabe, Kevin
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.
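GEOPHIRES's own source code is available from the authors; the snippet below is not that code, just a minimal levelized-cost-of-energy calculation of the kind such a tool performs, with an invented plant as the example.

```python
def levelized_cost_of_energy(capex, opex_per_year, annual_energy_kwh,
                             discount_rate, lifetime_years):
    """Simple LCOE: annualized capital cost plus O&M, divided by yearly energy output."""
    # Capital recovery factor converts the up-front cost into an equivalent annual payment
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years) / \
          ((1 + discount_rate) ** lifetime_years - 1)
    annualized_capex = capex * crf
    return (annualized_capex + opex_per_year) / annual_energy_kwh

# Hypothetical plant: $30M capital, $1M/yr O&M, 40 GWh/yr output, 7% discount rate, 30-year life
lcoe = levelized_cost_of_energy(30e6, 1e6, 40e6, 0.07, 30)
print(f"LCOE ~ ${lcoe:.3f}/kWh")
```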
Benefits of Efficient Windows | Efficient Windows Collaborative
Increased Light & View | Efficient Windows Collaborative
Windows for New Construction | Efficient Windows Collaborative
Performance Standards for Windows | Efficient Windows Collaborative
Air Leakage (AL) | Efficient Windows Collaborative
State Fact Sheets | Efficient Windows Collaborative
Fact Sheets & Publications | Efficient Windows Collaborative
Condensation Resistance (CR) | Efficient Windows Collaborative
Assessing Window Replacement Options | Efficient Windows Collaborative
National Fenestration Rating Council (NFRC) | Efficient Windows Collaborative
Low Conductance Spacers | Efficient Windows Collaborative
Energy & Cost Savings | Efficient Windows Collaborative
U-Factor (U-value) | Efficient Windows Collaborative
Tigers on trails: occupancy modeling for cluster sampling.
Hines, J E; Nichols, J D; Royle, J A; MacKenzie, D I; Gopalaswamy, A M; Kumar, N Samba; Karanth, K U
2010-07-01
Occupancy modeling focuses on inference about the distribution of organisms over space, using temporal or spatial replication to allow inference about the detection process. Inference based on spatial replication strictly requires that replicates be selected randomly and with replacement, but the importance of these design requirements is not well understood. This paper focuses on an increasingly popular sampling design based on spatial replicates that are not selected randomly and that are expected to exhibit Markovian dependence. We develop two new occupancy models for data collected under this sort of design, one based on an underlying Markov model for spatial dependence and the other based on a trap response model with Markovian detections. We then simulated data under the model for Markovian spatial dependence and fit the data to standard occupancy models and to the two new models. Bias of occupancy estimates was substantial for the standard models, smaller for the new trap response model, and negligible for the new spatial process model. We also fit these models to data from a large-scale tiger occupancy survey recently conducted in Karnataka State, southwestern India. In addition to providing evidence of a positive relationship between tiger occupancy and habitat, model selection statistics and estimates strongly supported the use of the model with Markovian spatial dependence. This new model provides another tool for the decomposition of the detection process, which is sometimes needed for proper estimation and which may also permit interesting biological inferences. In addition to designs employing spatial replication, we note the likely existence of temporal Markovian dependence in many designs using temporal replication. The models developed here will be useful either directly, or with minor extensions, for these designs as well. We believe that these new models represent important additions to the suite of modeling tools now available for occupancy estimation in conservation monitoring. More generally, this work represents a contribution to the topic of cluster sampling for situations in which there is a need for specific modeling (e.g., reflecting dependence) for the distribution of the variable(s) of interest among subunits.
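The Markovian spatial-replicate structure described above can be made concrete with a small simulator: site-level occupancy, segment-level local presence following a two-state Markov chain along the trail, and imperfect detection given local presence. Parameter names loosely follow the usual occupancy notation, and all values below are arbitrary.

```python
import numpy as np

def simulate_trail_survey(n_sites=200, n_segments=10, psi=0.6,
                          theta0=0.3, theta1=0.7, p=0.8, seed=42):
    """Simulate detection histories with Markovian dependence among spatial replicates.

    psi    : probability a site (trail) is occupied
    theta0 : Pr(local presence on a segment | absent on the previous segment)
    theta1 : Pr(local presence on a segment | present on the previous segment)
    p      : Pr(detection | local presence on the segment)
    """
    rng = np.random.default_rng(seed)
    detections = np.zeros((n_sites, n_segments), dtype=int)
    occupied = rng.random(n_sites) < psi
    for i in np.where(occupied)[0]:
        present = rng.random() < theta0              # first segment has no previous state
        for j in range(n_segments):
            if present:
                detections[i, j] = int(rng.random() < p)
            nxt = theta1 if present else theta0      # Markovian transition to the next segment
            present = rng.random() < nxt
    return detections

y = simulate_trail_survey()
naive_occupancy = (y.sum(axis=1) > 0).mean()         # ignores imperfect detection and dependence
print("true psi = 0.6, naive occupancy estimate =", round(naive_occupancy, 3))
```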
What Lies Beneath? An Evaluation of Rapid Assessment Tools for Management of Hull Fouling
NASA Astrophysics Data System (ADS)
Clarke Murray, Cathryn; Therriault, Thomas W.; Pakhomov, Evgeny
2013-08-01
Despite an increased understanding of marine invasions, non-indigenous species (NIS) continue to be redistributed at both global and regional scales. Since prevention is an important element of NIS programs, monitoring vectors responsible for NIS introductions and spread, such as hull fouling, has become a priority and methods should be selected carefully to balance accuracy, time, and cost. Two common fouling assessment tools for the marine recreational boating vector were evaluated for accuracy using a traditional underwater SCUBA survey in coastal British Columbia: a dockside level of fouling assessment and a behavioral questionnaire model. Results showed that although rapid, dockside assessments did not provide an accurate assessment of fouling present below the surface, at least not in this region. In contrast, a questionnaire-based model using four easily obtained variables (boat type, age of antifouling paint, storage type, and occurrence of long distance trips) reliably identified boats carrying macrofouling species, a proxy for risk of NIS transport. Once validated, this fouling model tool could be applied in border inspection or quarantine situations where decisions must be made quickly. Further development and refinement of rapid assessment tools would improve our ability to prevent new introductions and manage spread of existing invasive species.
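The questionnaire model's exact form is not given in the abstract; one plausible sketch is a logistic classifier on the four predictors named above, as below. The training records and labels are fabricated purely to make the example run.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

# Fabricated example records: boat type, paint age (months), storage, long-distance trips
boats = pd.DataFrame({
    "boat_type": ["sail", "power", "sail", "power", "sail", "power", "sail", "power"],
    "paint_age_months": [3, 26, 14, 2, 30, 8, 20, 5],
    "storage": ["water", "water", "trailer", "water", "water", "trailer", "water", "trailer"],
    "long_trips": [1, 1, 0, 0, 1, 0, 1, 0],
})
macrofouling = np.array([0, 1, 0, 0, 1, 0, 1, 0])   # invented labels for illustration only

pre = ColumnTransformer(
    [("cat", OneHotEncoder(), ["boat_type", "storage"])],
    remainder="passthrough",   # keep the two numeric predictors as-is
)
model = make_pipeline(pre, LogisticRegression(max_iter=1000))
model.fit(boats, macrofouling)

new_boat = pd.DataFrame([{"boat_type": "sail", "paint_age_months": 24,
                          "storage": "water", "long_trips": 1}])
print("predicted probability of macrofouling:", model.predict_proba(new_boat)[0, 1].round(2))
```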
Algorithms for the Fractional Calculus: A Selection of Numerical Methods
NASA Technical Reports Server (NTRS)
Diethelm, K.; Ford, N. J.; Freed, A. D.; Luchko, Yu.
2003-01-01
Many recently developed models in areas like viscoelasticity, electrochemistry, diffusion processes, etc. are formulated in terms of derivatives (and integrals) of fractional (non-integer) order. In this paper we present a collection of numerical algorithms for the solution of the various problems arising in this context. We believe that this will give the engineer the necessary tools required to work with fractional models in an efficient way.
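One of the simplest schemes in this family (not necessarily the authors' preferred one) is the Grünwald-Letnikov finite-difference approximation, sketched below and checked against the known half-derivative of f(t) = t.

```python
import numpy as np

def grunwald_letnikov_derivative(f_values, alpha, h):
    """Grünwald-Letnikov approximation of the fractional derivative of order alpha.

    f_values : samples f(t_0), ..., f(t_N) on a uniform grid with spacing h
    Returns the approximate alpha-order derivative at each grid point.
    """
    n = len(f_values)
    # Recursively build the GL weights w_k = (-1)^k * binom(alpha, k)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    deriv = np.empty(n)
    for i in range(n):
        # Weighted sum over the history f(t_i), f(t_{i-1}), ..., f(t_0)
        deriv[i] = np.dot(w[: i + 1], f_values[i::-1]) / h ** alpha
    return deriv

# Example: half-derivative of f(t) = t on [0, 1]; the exact value is 2*sqrt(t/pi)
t = np.linspace(0.0, 1.0, 201)
approx = grunwald_letnikov_derivative(t, alpha=0.5, h=t[1] - t[0])
exact = 2.0 * np.sqrt(t / np.pi)
print("max abs error:", float(np.max(np.abs(approx - exact))))
```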
Lamberink, Herm J; Boshuisen, Kim; Otte, Willem M; Geleijns, Karin; Braun, Kees P J
2018-03-01
The objective of this study was to create a clinically useful tool for individualized prediction of seizure outcomes following antiepileptic drug withdrawal after pediatric epilepsy surgery. We used data from the European retrospective TimeToStop study, which included 766 children from 15 centers, to perform a proportional hazard regression analysis. The 2 outcome measures were seizure recurrence and seizure freedom in the last year of follow-up. Prognostic factors were identified through systematic review of the literature. The strongest predictors for each outcome were selected through backward selection, after which nomograms were created. The final models included 3 to 5 factors per model. Discrimination in terms of adjusted concordance statistic was 0.68 (95% confidence interval [CI] 0.67-0.69) for predicting seizure recurrence and 0.73 (95% CI 0.72-0.75) for predicting eventual seizure freedom. An online prediction tool is provided on www.epilepsypredictiontools.info/ttswithdrawal. The presented models can improve counseling of patients and parents regarding postoperative antiepileptic drug policies, by estimating individualized risks of seizure recurrence and eventual outcome. Wiley Periodicals, Inc. © 2018 International League Against Epilepsy.
Systemic safety project selection tool.
DOT National Transportation Integrated Search
2013-07-01
"The Systemic Safety Project Selection Tool presents a process for incorporating systemic safety planning into traditional safety management processes. The Systemic Tool provides a step-by-step process for conducting systemic safety analysis; conside...
MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system
NASA Astrophysics Data System (ADS)
Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg
2005-01-01
We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and an access tool for using it. Based on an analysis of requirements, we have selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow it to be used in a broad range of applications, but increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently only described in textual form. Conformance in terms of semantics can thus not be evaluated automatically and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented an MPEG-7 library and a client/server document access infrastructure.
Bayesian model selection validates a biokinetic model for zirconium processing in humans
2012-01-01
Background: In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results: We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions: In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology. PMID:22863152
Wheeler, David C.; Hickson, DeMarc A.; Waller, Lance A.
2010-01-01
Many diagnostic tools and goodness-of-fit measures, such as the Akaike information criterion (AIC) and the Bayesian deviance information criterion (DIC), are available to evaluate the overall adequacy of linear regression models. In addition, visually assessing adequacy in models has become an essential part of any regression analysis. In this paper, we focus on a spatial consideration of the local DIC measure for model selection and goodness-of-fit evaluation. We use a partitioning of the DIC into the local DIC, leverage, and deviance residuals to assess local model fit and influence for both individual observations and groups of observations in a Bayesian framework. We use visualization of the local DIC and differences in local DIC between models to assist in model selection and to visualize the global and local impacts of adding covariates or model parameters. We demonstrate the utility of the local DIC in assessing model adequacy using HIV prevalence data from pregnant women in the Butare province of Rwanda during 1989-1993 using a range of linear model specifications, from global effects only to spatially varying coefficient models, and a set of covariates related to sexual behavior. Results of applying the diagnostic visualization approach include more refined model selection and greater understanding of the models as applied to the data. PMID:21243121
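For reference, the global DIC that the local diagnostics above partition is defined in the usual way (standard definitions, not reproduced from the paper); the local DIC, leverage, and deviance-residual terms sum back to this global quantity over observations.

```latex
\mathrm{DIC} = \bar{D} + p_D, \qquad
p_D = \bar{D} - D(\bar{\theta}), \qquad
\bar{D} = \mathbb{E}_{\theta \mid y}\!\left[ -2 \ln p(y \mid \theta) \right].
```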
PathCase-SB architecture and database design
2011-01-01
Background Integration of metabolic pathways resources and regulatory metabolic network models, and deploying new tools on the integrated platform can help perform more effective and more efficient systems biology research on understanding the regulation in metabolic networks. Therefore, the tasks of (a) integrating under a single database environment regulatory metabolic networks and existing models, and (b) building tools to help with modeling and analysis are desirable and intellectually challenging computational tasks. Description PathCase Systems Biology (PathCase-SB) is built and released. The PathCase-SB database provides data and API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools towards facilitating the development of kinetic models for biological systems. PathCase-SB aims to integrate data of selected biological data sources on the web (currently, BioModels database and KEGG), and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889
Sathiyamoorthy, V; Sekar, T; Elango, N
2015-01-01
Formation of spikes prevents achieving a better material removal rate (MRR) and surface finish while using plain NaNO3 aqueous electrolyte in electrochemical machining (ECM) of die tool steel. Hence, this research work attempts to minimize the formation of spikes in the selected workpiece of high carbon high chromium die tool steel using copper nanoparticles suspended in NaNO3 aqueous electrolyte, that is, nanofluid. The selected influencing parameters are applied voltage and electrolyte discharge rate with three levels and tool feed rate with four levels. Thirty-six experiments were designed using Design Expert 7.0 software and optimization was done using a multiobjective genetic algorithm (MOGA). This tool identified the best possible combination for achieving better MRR and surface roughness. The results reveal that a voltage of 18 V, a tool feed rate of 0.54 mm/min, and a nanofluid discharge rate of 12 L/min would be the optimum values in ECM of HCHCr die tool steel. For checking the optimality obtained from the MOGA in MATLAB software, the maximum MRR of 375.78277 mm³/min and respective surface roughness Ra of 2.339779 μm were predicted at an applied voltage of 17.688986 V, tool feed rate of 0.5399705 mm/min, and nanofluid discharge rate of 11.998816 L/min. Confirmatory tests showed that the actual performance at the optimum conditions was 361.214 mm³/min and 2.41 μm; the deviation from the predicted performance is less than 4%, which proves the composite desirability of the developed models.
Bayesian model selection: Evidence estimation based on DREAM simulation and bridge sampling
NASA Astrophysics Data System (ADS)
Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.
2017-04-01
Bayesian inference has found widespread application in Earth and Environmental Systems Modeling, providing an effective tool for prediction, data assimilation, parameter estimation, uncertainty analysis and hypothesis testing. Under multiple competing hypotheses, the Bayesian approach also provides an attractive alternative to traditional information criteria (e.g. AIC, BIC) for model selection. The key variable for Bayesian model selection is the evidence (or marginal likelihood), which is the normalizing constant in the denominator of Bayes' theorem; while it is fundamental for model selection, the evidence is not required for Bayesian inference. It is computed for each hypothesis (model) by averaging the likelihood function over the prior parameter distribution, rather than maximizing it as information criteria do; the larger a model's evidence, the more support it receives among a collection of hypotheses, as the simulated values assign relatively high probability density to the observed data. Hence, the evidence naturally acts as an Occam's razor, preferring simpler and more constrained models over the over-fitted ones selected by information criteria that incorporate only the likelihood maximum. Since it is not particularly easy to estimate the evidence in practice, Bayesian model selection via the marginal likelihood has not yet found mainstream use. We illustrate here the properties of a new estimator of the Bayesian model evidence, which provides robust and unbiased estimates of the marginal likelihood; the method is coined Gaussian Mixture Importance Sampling (GMIS). GMIS uses multidimensional numerical integration of the posterior parameter distribution via bridge sampling (a generalization of importance sampling) of a mixture distribution fitted to samples of the posterior distribution derived from the DREAM algorithm (Vrugt et al., 2008; 2009). Some illustrative examples are presented to show the robustness and superiority of the GMIS estimator with respect to other commonly used approaches in the literature.
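A minimal sketch of the core idea behind evidence estimation with a mixture proposal, assuming a generic model with callable log-likelihood and log-prior (placeholder names). It uses plain importance sampling with a Gaussian mixture fitted to posterior draws; the published GMIS estimator additionally applies bridge sampling, which is not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def estimate_log_evidence(posterior_samples, log_likelihood, log_prior,
                          n_components=3, n_importance=20000, seed=0):
    """Importance-sampling estimate of the marginal likelihood (evidence).

    posterior_samples : (n, d) array of MCMC draws (e.g. from DREAM)
    log_likelihood, log_prior : callables taking a (d,) parameter vector
    """
    # Fit a Gaussian mixture to the posterior draws to use as proposal q(theta).
    gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                          random_state=seed).fit(posterior_samples)
    theta, _ = gmm.sample(n_importance)
    log_q = gmm.score_samples(theta)                       # log q(theta)
    log_num = np.array([log_likelihood(t) + log_prior(t) for t in theta])
    log_w = log_num - log_q                                # log importance weights
    # Log-sum-exp trick for numerical stability: log Z ~= log mean(w)
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))
```

The log Bayes factor between two hypotheses is then the difference of their estimated log evidences.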
Design Guidance for New Windows | Efficient Windows Collaborative
Design Guidance for Replacement Windows | Efficient Windows Collaborative
Solar Heat Gain Coefficient (SHGC) | Efficient Windows Collaborative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Griffin John
Here, kinetic measurements are paired with in-situ spectroscopic characterization tools to investigate colloidally based, supported Pt catalytic model systems in order to elucidate the mechanisms by which metal and support work in tandem to dictate activity and selectivity. The results demonstrate that oxide support materials, while inactive in the absence of Pt nanoparticles, possess unique active sites for the selective conversion of gas-phase molecules when paired with an active metal catalyst.
A Simple Approach to Account for Climate Model Interdependence in Multi-Model Ensembles
NASA Astrophysics Data System (ADS)
Herger, N.; Abramowitz, G.; Angelil, O. M.; Knutti, R.; Sanderson, B.
2016-12-01
Multi-model ensembles are an indispensable tool for future climate projection and its uncertainty quantification. Ensembles containing multiple climate models generally have increased skill, consistency and reliability. Due to the lack of agreed-on alternatives, most scientists use the equally-weighted multi-model mean as they subscribe to model democracy ("one model, one vote"). Different research groups are known to share sections of code, parameterizations in their model, literature, or even whole model components. Therefore, individual model runs do not represent truly independent estimates. Ignoring this dependence structure might lead to a false model consensus, wrong estimation of uncertainty and effective number of independent models. Here, we present a way to partially address this problem by selecting a subset of CMIP5 model runs so that its climatological mean minimizes the RMSE compared to a given observation product. Due to the cancelling out of errors, regional biases in the ensemble mean are reduced significantly. Using a model-as-truth experiment we demonstrate that those regional biases persist into the future and we are not fitting noise, thus providing improved observationally-constrained projections of the 21st century. The optimally selected ensemble shows significantly higher global mean surface temperature projections than the original ensemble, where all the model runs are considered. Moreover, the spread is decreased well beyond that expected from the decreased ensemble size. Several previous studies have recommended an ensemble selection approach based on performance ranking of the model runs. Here, we show that this approach can perform even worse than randomly selecting ensemble members and can thus be harmful. We suggest that accounting for interdependence in the ensemble selection process is a necessary step for robust projections for use in impact assessments, adaptation and mitigation of climate change.
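A toy illustration of the selection step described above: exhaustively search subsets of ensemble members for the one whose multi-model mean minimizes RMSE against an observational field. The array shapes and synthetic data below are placeholders; the model-as-truth testing used in the study is omitted.

```python
import itertools
import numpy as np

def select_subset(ensemble, observation, subset_size):
    """ensemble: (n_models, n_gridcells) climatological means;
    observation: (n_gridcells,) reference product.
    Returns the member subset whose mean field has the lowest RMSE."""
    best_rmse, best_subset = np.inf, None
    for subset in itertools.combinations(range(ensemble.shape[0]), subset_size):
        mean_field = ensemble[list(subset)].mean(axis=0)
        rmse = np.sqrt(np.mean((mean_field - observation) ** 2))
        if rmse < best_rmse:
            best_rmse, best_subset = rmse, subset
    return best_subset, best_rmse

# Synthetic example: 20 "models", 500 grid cells, pick a 6-member subset.
rng = np.random.default_rng(1)
obs = rng.normal(size=500)
ens = obs + rng.normal(scale=0.8, size=(20, 500))   # biased ensemble members
print(select_subset(ens, obs, subset_size=6))
```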
Blended near-optimal tools for flexible water resources decision making
NASA Astrophysics Data System (ADS)
Rosenberg, David
2015-04-01
State-of-the-art systems analysis techniques focus on efficiently finding optimal solutions. Yet an optimal solution is optimal only for the static modelled issues, and managers often seek near-optimal alternatives that address un-modelled or changing objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized near-optimal as performance within a tolerable deviation from the optimal objective function value and identified a few maximally-different alternatives that addressed select un-modelled issues. This paper presents new stratified Markov chain Monte Carlo sampling and parallel coordinate plotting tools that generate and communicate the structure and full extent of the near-optimal region of an optimization problem. Plot controls allow users to interactively explore the region features of most interest. Controls also streamline the process of eliciting un-modelled issues and updating the model formulation in response to elicited issues. Application to a single-objective water quality management problem at Echo Reservoir, Utah, identifies numerous and flexible practices to reduce the phosphorus load to the reservoir while maintaining close-to-optimal performance. Compared to MGA, the new blended tools generate more numerous alternatives faster, show the near-optimal region more fully, help elicit a larger set of un-modelled issues, and offer managers greater flexibility to cope in a changing world.
Software Tools to Support the Assessment of System Health
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2013-01-01
This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements and constraints. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of sensors that optimally meet the performance goals and the constraints. It identifies optimal sensor suite solutions by utilizing a merit (i.e., cost) function with one of several available optimization approaches. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed and applied to liquid rocket engines. It was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various system engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the launch vehicle design process. Due to their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
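As a generic, hedged illustration of merit-function-based sensor suite selection in the spirit described for S4 (not the S4 implementation): each candidate suite is scored by a user-defined merit, here fault-detection coverage minus a cost penalty, and the best suite under a size constraint is kept. All names and numbers below are hypothetical.

```python
import itertools

def select_sensor_suite(sensors, faults_detected_by, cost, max_size, weight=0.1):
    """sensors: list of sensor names; faults_detected_by: dict sensor -> set of faults;
    cost: dict sensor -> cost. Returns (merit, suite) of the best suite."""
    best = (float("-inf"), None)
    for k in range(1, max_size + 1):
        for suite in itertools.combinations(sensors, k):
            coverage = len(set().union(*(faults_detected_by[s] for s in suite)))
            merit = coverage - weight * sum(cost[s] for s in suite)
            if merit > best[0]:
                best = (merit, suite)
    return best

# Hypothetical sensors, fault coverage, and costs.
sensors = ["P1", "T1", "N1", "V1"]
faults = {"P1": {"leak", "clog"}, "T1": {"overheat"},
          "N1": {"surge", "clog"}, "V1": {"leak"}}
cost = {"P1": 2.0, "T1": 1.0, "N1": 3.0, "V1": 1.5}
print(select_sensor_suite(sensors, faults, cost, max_size=2))
```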
NASA Technical Reports Server (NTRS)
Burns, K. Lee; Altino, Karen
2008-01-01
The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun for a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
ERIC Educational Resources Information Center
Public Impact, 2008
2008-01-01
This toolkit includes these separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides turnaround teacher competencies that are the…
ERIC Educational Resources Information Center
Public Impact, 2008
2008-01-01
This toolkit includes the following separate sections: (1) Selection Preparation Guide; (2) Day-of-Interview Tools; (3) Candidate Rating Tools; and (4) Candidate Comparison and Decision Tools. Each of the sections is designed to be used at different stages of the selection process. The first section provides a list of competencies that would…
Sheridan, Kimberly M; Konopasky, Abigail W; Kirkwood, Sophie; Defeyter, Margaret A
2016-03-19
Research indicates that in experimental settings, young children of 3-7 years old are unlikely to devise a simple tool to solve a problem. This series of exploratory studies done in museums in the US and UK explores how environment and ownership of materials may improve children's ability and inclination for (i) tool material selection and (ii) innovation. The first study takes place in a children's museum, an environment where children can use tools and materials freely. We replicated a tool innovation task in this environment and found that while 3-4 year olds showed the predicted low levels of innovation rates, 4-7 year olds showed higher rates of innovation than the younger children and than reported in prior studies. The second study explores the effect of whether the experimental materials are owned by the experimenter or the child on tool selection and innovation. Results showed that 5-6 year olds and 6-7 year olds were more likely to select tool material they owned compared to tool material owned by the experimenter, although ownership had no effect on tool innovation. We argue that learning environments supporting tool exploration and invention and conveying ownership over materials may encourage successful tool innovation at earlier ages. © 2016 The Author(s).
NASA Astrophysics Data System (ADS)
Irwan; Gustientiedina; Sunarti; Desnelita, Yenny
2017-12-01
The purpose of this study is to design a counseling model application for a decision-making and consultation system. The application serves as an alternative for guidance and individual career development for students, covering career knowledge, planning, and alternative options, and draws on an expert tool based on knowledge and rules to provide solutions for students' career decisions. This research produces a counseling model application that gathers the important information about student career development and facilitates individual student development through the service, connecting each student's plan with a career according to their talent, interests, ability, knowledge, personality and other supporting factors. The application model can be used as a tool to obtain information more quickly and flexibly for student guidance and counseling, and can thus help students make selections and decisions appropriate to their chosen line of work.
Lavado Contador, J F; Maneta, M; Schnabel, S
2006-10-01
The capability of Artificial Neural Network models to forecast near-surface soil moisture at fine spatial resolution has been tested for a 99.5 ha watershed located in SW Spain, using several readily obtainable digital models of topographic and land cover variables as inputs and a series of soil moisture measurements as the training data set. The study methods were designed to determine the potential of the neural network model as a tool for gaining insight into the factors governing soil moisture distribution, and also to optimize the data sampling scheme by finding the optimum size of the training data set. Results suggest the efficiency of the methods in forecasting soil moisture, their value as a tool for assessing the optimum number of field samples, and the importance of the selected variables in explaining the final map obtained.
Snoopy--a unifying Petri net framework to investigate biomolecular networks.
Rohr, Christian; Marwan, Wolfgang; Heiner, Monika
2010-04-01
To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).
[Multivariate Adaptive Regression Splines (MARS), an alternative for the analysis of time series].
Vanegas, Jairo; Vásquez, Fabián
Multivariate Adaptive Regression Splines (MARS) is a non-parametric modelling method that extends the linear model, incorporating nonlinearities and interactions between variables. It is a flexible tool that automates the construction of predictive models: selecting relevant variables, transforming the predictor variables, processing missing values and preventing overfitting through self-testing. It is also able to predict, taking into account structural factors that might influence the outcome variable, thereby generating hypothetical models. The end result could identify relevant cut-off points in data series. It is rarely used in the health field, so it is proposed here as a tool for the evaluation of relevant public health indicators. For demonstrative purposes, data series regarding the mortality of children under 5 years of age in Costa Rica were used, comprising the period 1978-2008. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
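As a minimal illustration of the MARS building block (hinge basis functions combined linearly), assuming synthetic data and a manually chosen knot; real MARS implementations additionally perform automatic forward selection and backward pruning of basis terms, which this sketch omits.

```python
import numpy as np

def hinge(x, knot, direction=+1):
    """MARS-style hinge basis: max(0, x - knot) or max(0, knot - x)."""
    return np.maximum(0.0, direction * (x - knot))

# Fit a piecewise-linear trend with a knot at x = 4 via ordinary least squares.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.where(x < 4, 2 * x, 8 + 0.5 * (x - 4)) + rng.normal(scale=0.5, size=x.size)
X = np.column_stack([np.ones_like(x), hinge(x, 4, +1), hinge(x, 4, -1)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)   # intercept and slopes of the two hinge terms
```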
Interactive computation of coverage regions for indoor wireless communication
NASA Astrophysics Data System (ADS)
Abbott, A. Lynn; Bhat, Nitin; Rappaport, Theodore S.
1995-12-01
This paper describes a system which assists in the strategic placement of rf base stations within buildings. Known as the site modeling tool (SMT), this system allows the user to display graphical floor plans and to select base station transceiver parameters, including location and orientation, interactively. The system then computes and highlights estimated coverage regions for each transceiver, enabling the user to assess the total coverage within the building. For single-floor operation, the user can choose between distance-dependent and partition-dependent path-loss models. Similar path-loss models are also available for the case of multiple floors. This paper describes the method used by the system to estimate coverage for both directional and omnidirectional antennas. The site modeling tool is intended to be simple to use by individuals who are not experts at wireless communication system design, and is expected to be very useful in the specification of indoor wireless systems.
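A hedged sketch of the kind of distance- and partition-dependent path-loss model mentioned above; the abstract does not give the coefficients used by the site modeling tool, so the path-loss exponent and per-partition losses below are placeholders.

```python
import math

def path_loss_db(d_m, n=3.0, pl_d0=40.0, d0=1.0, partition_losses_db=()):
    """Log-distance path loss with additional per-partition attenuation.

    PL(d) = PL(d0) + 10 * n * log10(d / d0) + sum(partition losses)
    """
    return pl_d0 + 10.0 * n * math.log10(d_m / d0) + sum(partition_losses_db)

# Example: a 25 m indoor link crossing two drywall partitions (~3 dB each, assumed).
print(path_loss_db(25.0, partition_losses_db=(3.0, 3.0)))
```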
2013-06-01
USMA does not allow personality testing as a selection tool. However, perhaps we may discover whether pre-admission information can predict ... characteristic, and personality factors as described by the Five-Factor Model (FFM, also known as the "Big Five") to determine their effect on one's academic performance at USMA (Clark ...
Voss, Frank D.; Mastin, Mark C.
2012-01-01
A database was developed to automate model execution and to provide users with Internet access to voluminous data products ranging from summary figures to model output timeseries. Database-enabled Internet tools were developed to allow users to create interactive graphs of output results based on their analysis needs. For example, users were able to create graphs by selecting time intervals, greenhouse gas emission scenarios, general circulation models, and specific hydrologic variables.
Structure and software tools of AIDA.
Duisterhout, J S; Franken, B; Witte, F
1987-01-01
AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal independent and is even to a great extent multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build a faster, but (of course) less flexible code from these table definitions. By separating the AIDA software in a source and a run-time version, one is able to write implementation-specific code which can be selected and loaded by a special source loader, being part of the AIDA software. This feature is also accessible for maintaining software on different sites and on different installations.
Design of automation tools for management of descent traffic
NASA Technical Reports Server (NTRS)
Erzberger, Heinz; Nedell, William
1988-01-01
The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. This paper focuses primarily on the Descent Advisor which provides automation tools for managing descent traffic. The algorithms, automation modes, and graphical interfaces incorporated in the design are described. Information generated by the Descent Advisor tools is integrated into a plan view traffic display consisting of a high-resolution color monitor. Estimated arrival times of aircraft are presented graphically on a time line, which is also used interactively in combination with a mouse input device to select and schedule arrival times. Other graphical markers indicate the location of the fuel-optimum top-of-descent point and the predicted separation distances of aircraft at a designated time-control point. Computer generated advisories provide speed and descent clearances which the controller can issue to aircraft to help them arrive at the feeder gate at the scheduled times or with specified separation distances. Two types of horizontal guidance modes, selectable by the controller, provide markers for managing the horizontal flightpaths of aircraft under various conditions. The entire system consisting of descent advisor algorithm, a library of aircraft performance models, national airspace system data bases, and interactive display software has been implemented on a workstation made by Sun Microsystems, Inc. It is planned to use this configuration in operational evaluations at an en route center.
A Biologically Inspired Computational Model of Basal Ganglia in Action Selection
Baston, Chiara
2015-01-01
The basal ganglia (BG) are a subcortical structure implicated in action selection. The aim of this work is to present a new cognitive neuroscience model of the BG, which aspires to represent a parsimonious balance between simplicity and completeness. The model includes the 3 main pathways operating in the BG circuitry, that is, the direct (Go), indirect (NoGo), and hyperdirect pathways. The main original aspects, compared with previous models, are the use of a two-term Hebb rule to train synapses in the striatum, based exclusively on neuronal activity changes caused by dopamine peaks or dips, and the role of the cholinergic interneurons (affected by dopamine themselves) during learning. Some examples are displayed, concerning a few paradigmatic cases: action selection in basal conditions, action selection in the presence of a strong conflict (where the role of the hyperdirect pathway emerges), synapse changes induced by phasic dopamine, and learning new actions based on a previous history of rewards and punishments. Finally, some simulations show the model operating under conditions of altered dopamine levels, to illustrate pathological cases (dopamine depletion in parkinsonian subjects or dopamine hypermedication). Due to its parsimonious approach, the model may represent a straightforward tool to analyze BG functionality in behavioral experiments. PMID:26640481
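A schematic sketch of a dopamine-gated Hebbian weight update in the spirit of the rule described above; the signs, gains, and the cholinergic gating are simplified placeholders and do not reproduce the paper's actual equations.

```python
import numpy as np

def hebb_update(w, pre, post, dopamine_phasic, lr=0.01):
    """Dopamine-gated Hebb-like rule (schematic): weights change only when a
    dopamine peak (+1) or dip (-1) occurs, scaling the pre/post correlation."""
    dw = lr * dopamine_phasic * np.outer(post, pre)   # correlation term gated by DA
    return np.clip(w + dw, 0.0, 1.0)

# Toy usage: 4 cortical inputs, 3 striatal units, a reward-driven dopamine peak.
rng = np.random.default_rng(0)
w = rng.uniform(0.2, 0.4, size=(3, 4))
pre, post = rng.random(4), rng.random(3)
w = hebb_update(w, pre, post, dopamine_phasic=+1.0)
print(w)
```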
GPCR-SSFE 2.0-a fragment-based molecular modeling web tool for Class A G-protein coupled receptors.
Worth, Catherine L; Kreuchwig, Franziska; Tiemann, Johanna K S; Kreuchwig, Annika; Ritschel, Michele; Kleinau, Gunnar; Hildebrand, Peter W; Krause, Gerd
2017-07-03
G-protein coupled receptors (GPCRs) are key players in signal transduction and therefore a large proportion of pharmaceutical drugs target these receptors. Structural data of GPCRs are sparse yet important for elucidating the molecular basis of GPCR-related diseases and for performing structure-based drug design. To ameliorate this problem, GPCR-SSFE 2.0 (http://www.ssfa-7tmr.de/ssfe2/), an intuitive web server dedicated to providing three-dimensional Class A GPCR homology models has been developed. The updated web server includes 27 inactive template structures and incorporates various new functionalities. Uniquely, it uses a fingerprint correlation scoring strategy for identifying the optimal templates, which we demonstrate captures structural features that sequence similarity alone is unable to do. Template selection is carried out separately for each helix, allowing both single-template models and fragment-based models to be built. Additionally, GPCR-SSFE 2.0 stores a comprehensive set of pre-calculated and downloadable homology models and also incorporates interactive loop modeling using the tool SL2, allowing knowledge-based input by the user to guide the selection process. For visual analysis, the NGL viewer is embedded into the result pages. Finally, blind-testing using two recently published structures shows that GPCR-SSFE 2.0 performs comparably or better than other state-of-the art GPCR modeling web servers. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Selection of reference standard during method development using the analytical hierarchy process.
Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun
2015-03-25
The reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for the reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of a reference standard during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by using ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance, comprising feasibility of acquisition, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, and rosmarinic acid, with about 79.8% of that priority, is the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to comprehensively consider the benefits and risks of the alternatives. It is an effective and practical tool for the optimization of reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
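For context, a minimal sketch of the standard AHP priority calculation (principal eigenvector of a reciprocal pairwise comparison matrix plus a consistency check); the comparison values below are illustrative only, not those elicited in the study.

```python
import numpy as np

# Saaty random-index values used for the consistency ratio, n = 1..6
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}

def ahp_priorities(pairwise):
    """Return priority weights and consistency ratio for a reciprocal matrix."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)      # consistency index
    return w, ci / RI[n]                      # priorities, consistency ratio

# Illustrative 3-criterion comparison (e.g., accuracy vs. precision vs. stability).
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, cr = ahp_priorities(A)
print(weights, cr)
```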
Stember, Joseph N; Deng, Fang-Ming; Taneja, Samir S; Rosenkrantz, Andrew B
2014-08-01
To present results of a pilot study to develop software that identifies regions suspicious for prostate transition zone (TZ) tumor, free of user input. Eight patients with TZ tumors were used to develop the model by training a Naïve Bayes classifier to detect tumors based on selection of the most accurate predictors among various signal and textural features on T2-weighted imaging (T2WI) and apparent diffusion coefficient (ADC) maps. Features tested as inputs were: average signal, signal standard deviation, energy, contrast, correlation, homogeneity and entropy (all defined on T2WI); and average ADC. In training cases, the software tiled the TZ with 4 × 4-voxel "supervoxels," 80% of which were used to train the classifier. A forward selection scheme was used on the remaining 20% of training set supervoxels to identify important inputs. Each of 100 iterations selected T2WI energy and average ADC, which therefore were deemed the optimal model inputs. The trained model was tested on a different set of ten patients, half with TZ tumors. The two-feature model was applied blindly to this separate set of test patients, again without operator input of suspicious foci. The software correctly predicted presence or absence of TZ tumor in all test patients. Furthermore, locations of predicted tumors corresponded spatially with locations of biopsies that had confirmed their presence. Preliminary findings suggest that this tool has potential to accurately predict TZ tumor presence and location, without operator input. © 2013 Wiley Periodicals, Inc.
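A generic sketch of the training strategy described above, i.e., greedy forward selection of features for a Naïve Bayes classifier using cross-validated accuracy; the feature matrix, label rule, and feature names below are synthetic stand-ins for the supervoxel-level T2WI and ADC features.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

def forward_select(X, y, feature_names, n_keep=2, cv=5):
    """Greedy forward selection: add the feature that most improves CV accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < n_keep:
        scores = {j: cross_val_score(GaussianNB(), X[:, selected + [j]], y, cv=cv).mean()
                  for j in remaining}
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return [feature_names[j] for j in selected]

# Synthetic stand-in: 200 supervoxels, 8 candidate features, binary tumor label.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + 0.8 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)
names = ["T2_mean", "T2_std", "energy", "ADC_mean", "contrast",
         "correlation", "homogeneity", "entropy"]
print(forward_select(X, y, names))
```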
On the interplay between mathematics and biology. Hallmarks toward a new systems biology
NASA Astrophysics Data System (ADS)
Bellomo, Nicola; Elaiw, Ahmed; Althiabi, Abdullah M.; Alghamdi, Mohammed Ali
2015-03-01
This paper proposes a critical analysis of the existing literature on mathematical tools developed for systems biology approaches and, out of this overview, develops a new approach whose main features can be briefly summarized as follows: derivation of mathematical structures suitable for capturing the complexity of biological, hence living, systems, and modeling, by appropriate mathematical tools, of Darwinian-type dynamics, namely mutations followed by selection and evolution. Moreover, multiscale methods for moving from genes to cells, and from cells to tissue, are analyzed in view of a new systems biology approach.
Kim, Yusung; Tomé, Wolfgang A
2008-01-01
Voxel based iso-Tumor Control Probability (TCP) maps and iso-Complication maps are proposed as a plan-review tool especially for functional image-guided intensity-modulated radiotherapy (IMRT) strategies such as selective boosting (dose painting) and conformal avoidance IMRT. The maps employ voxel-based phenomenological biological dose-response models for target volumes and normal organs. Two IMRT strategies for prostate cancer, namely conventional uniform IMRT delivering an EUD = 84 Gy (equivalent uniform dose) to the entire PTV and selective boosting delivering an EUD = 82 Gy to the entire PTV, are investigated, to illustrate the advantages of this approach over iso-dose maps. Conventional uniform IMRT did yield a more uniform isodose map to the entire PTV while selective boosting did result in a nonuniform isodose map. However, when employing voxel based iso-TCP maps selective boosting exhibited a more uniform tumor control probability map compared to what could be achieved using conventional uniform IMRT, which showed TCP cold spots in high-risk tumor subvolumes despite delivering a higher EUD to the entire PTV. Voxel based iso-Complication maps are presented for rectum and bladder, and their utilization for selective avoidance IMRT strategies are discussed. We believe as the need for functional image guided treatment planning grows, voxel based iso-TCP and iso-Complication maps will become an important tool to assess the integrity of such treatment plans.
Sauterey, Boris; Ward, Ben A.; Follows, Michael J.; Bowler, Chris; Claessen, David
2015-01-01
The functional and taxonomic biogeography of marine microbial systems reflects the current state of an evolving system. Current models of marine microbial systems and biogeochemical cycles do not reflect this fundamental organizing principle. Here, we investigate the evolutionary adaptive potential of marine microbial systems under environmental change and introduce explicit Darwinian adaptation into an ocean modelling framework, simulating evolving phytoplankton communities in space and time. To this end, we adopt tools from adaptive dynamics theory, evaluating the fitness of invading mutants over annual timescales, replacing the resident if a fitter mutant arises. Using the evolutionary framework, we examine how community assembly, specifically the emergence of phytoplankton cell size diversity, reflects the combined effects of bottom-up and top-down controls. When compared with a species-selection approach, based on the paradigm that “Everything is everywhere, but the environment selects”, we show that (i) the selected optimal trait values are similar; (ii) the patterns emerging from the adaptive model are more robust, but (iii) the two methods lead to different predictions in terms of emergent diversity. We demonstrate that explicitly evolutionary approaches to modelling marine microbial populations and functionality are feasible and practical in time-varying, space-resolving settings and provide a new tool for exploring evolutionary interactions on a range of timescales in the ocean. PMID:25852217
Prediction of Agglomeration, Fouling, and Corrosion Tendency of Fuels in CFB Co-Combustion
NASA Astrophysics Data System (ADS)
Barišć, Vesna; Zabetta, Edgardo Coda; Sarkki, Juha
Prediction of the agglomeration, fouling, and corrosion tendency of fuels is essential to the design of any CFB boiler. Over the years, tools have been successfully developed at Foster Wheeler to help with such predictions for most commercial fuels. However, changes in the fuel market and the ever-growing demand for co-combustion capabilities pose a continuous need for development. This paper presents results from recently upgraded models used at Foster Wheeler to predict the agglomeration, fouling, and corrosion tendency of a variety of fuels and mixtures. The models, the subject of this paper, are semi-empirical computer tools that combine the theoretical basics of agglomeration/fouling/corrosion phenomena with empirical correlations. Correlations are derived from Foster Wheeler's experience in fluidized beds, including nearly 10,000 fuel samples and over 1,000 tests in about 150 CFB units. In these models, fuels are evaluated based on their classification and their chemical and physical properties from standard analyses (proximate, ultimate, fuel ash composition, etc.), alongside Foster Wheeler's own characterization methods. Mixtures are then evaluated taking into account the component fuels. This paper presents the predictive capabilities of the agglomeration/fouling/corrosion probability models for selected fuels and mixtures fired at full scale. The selected fuels include coals and different types of biomass. The models are capable of predicting the behavior of most fuels and mixtures, while also offering possibilities for further improvement.
RRegrs: an R package for computer-aided model selection with multiple regression models.
Tsiliki, Georgia; Munteanu, Cristian R; Seoane, Jose A; Fernandez-Lozano, Carlos; Sarimveis, Haralambos; Willighagen, Egon L
2015-01-01
Predictive regression models can be created with many different modelling approaches. Choices need to be made for data set splitting, cross-validation methods, specific regression parameters and best model criteria, as they all affect the accuracy and efficiency of the produced predictive models, therefore raising model reproducibility and comparison issues. Cheminformatics and bioinformatics are extensively using predictive modelling and exhibit a need for standardization of these methodologies in order to assist model selection and speed up the process of predictive model development. A tool accessible to all users, irrespective of their statistical knowledge, would be valuable if it tests several simple and complex regression models and validation schemes, produces unified reports, and offers the option to be integrated into more extensive studies. Additionally, such a methodology should be implemented as a free programming package, in order to be continuously adapted and redistributed by others. We propose an integrated framework for creating multiple regression models, called RRegrs. The tool offers the option of ten simple and complex regression methods combined with repeated 10-fold and leave-one-out cross-validation. Methods include Multiple Linear regression, Generalized Linear Model with Stepwise Feature Selection, Partial Least Squares regression, Lasso regression, and Support Vector Machines Recursive Feature Elimination. The new framework is an automated, fully validated procedure which produces standardized reports to quickly oversee the impact of choices in modelling algorithms and assess the model and cross-validation results. The methodology was implemented as an open source R package, available at https://www.github.com/enanomapper/RRegrs, by reusing and extending the caret package. The universality of the new methodology is demonstrated using five standard data sets from different scientific fields. Its efficiency in cheminformatics and QSAR modelling is shown with three use cases: proteomics data for surface-modified gold nanoparticles, nano-metal oxides descriptor data, and molecular descriptors for acute aquatic toxicity data. The results show that for all data sets RRegrs reports models with equal or better performance for both training and test sets than those reported in the original publications. Its good performance as well as its adaptability in terms of parameter optimization could make RRegrs a popular framework to assist the initial exploration of predictive models, and with that, the design of more comprehensive in silico screening applications. Graphical abstract: RRegrs is a computer-aided model selection framework for multiple regression models in R; it is a fully validated procedure with application to QSAR modelling.
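RRegrs itself is an R package built on caret; the following is a hedged Python sketch of the kind of workflow it automates (several regression methods compared under repeated 10-fold cross-validation), not a port of the package, and the data set is synthetic.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.model_selection import RepeatedKFold, cross_val_score

# Synthetic data standing in for a descriptor matrix and a response variable.
X, y = make_regression(n_samples=150, n_features=20, noise=10.0, random_state=0)
cv = RepeatedKFold(n_splits=10, n_repeats=5, random_state=0)

models = {
    "Multiple linear": LinearRegression(),
    "Lasso": Lasso(alpha=0.1),
    "PLS": PLSRegression(n_components=5),
    "SVM (RBF)": SVR(C=10.0),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=cv, scoring="r2")
    print(f"{name:16s} mean R2 = {r2.mean():.3f} +/- {r2.std():.3f}")
```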
NASA Astrophysics Data System (ADS)
Steinberg, P. D.; Brener, G.; Duffy, D.; Nearing, G. S.; Pelissier, C.
2017-12-01
Hyperparameter optimization of statistical models, i.e., automated model scoring and selection via methods such as evolutionary algorithms, grid searches, and randomized searches, can improve forecast model skill by reducing errors associated with model parameterization, model structure, and statistical properties of training data. Ensemble Learning Models (Elm), and the related Earthio package, provide a flexible interface for automating the selection of parameters and model structure for machine learning models common in climate science and land cover classification, offering convenient tools for loading NetCDF, HDF, Grib, or GeoTiff files, decomposition methods like PCA and manifold learning, and parallel training and prediction with unsupervised and supervised classification, clustering, and regression estimators. Continuum Analytics is using Elm to experiment with statistical soil moisture forecasting based on meteorological forcing data from NASA's North American Land Data Assimilation System (NLDAS). There, Elm uses the NSGA-2 multiobjective optimization algorithm to optimize the statistical preprocessing of forcing data and improve goodness-of-fit for statistical models (i.e., feature engineering). This presentation will discuss Elm and its components, including dask (distributed task scheduling), xarray (data structures for n-dimensional arrays), and scikit-learn (statistical preprocessing, clustering, classification, regression), and it will show how NSGA-2 is being used to automate selection of soil moisture forecast statistical models for North America.
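Elm and Earthio are the authors' own packages; as a generic, hedged illustration of the automated model scoring and selection described (a single-objective randomized hyperparameter search over a scikit-learn regressor rather than NSGA-2), with a synthetic stand-in for the NLDAS forcing data:

```python
from scipy.stats import randint, uniform
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import RandomizedSearchCV

# Synthetic stand-in for meteorological forcing -> soil moisture pairs.
X, y = make_regression(n_samples=500, n_features=12, noise=5.0, random_state=0)

search = RandomizedSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_distributions={
        "n_estimators": randint(50, 400),
        "max_depth": randint(2, 6),
        "learning_rate": uniform(0.01, 0.2),
    },
    n_iter=25, cv=5, scoring="neg_root_mean_squared_error", random_state=0,
)
search.fit(X, y)
print(search.best_params_, -search.best_score_)   # best settings and CV RMSE
```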
Data Mining of Macromolecular Structures.
van Beusekom, Bart; Perrakis, Anastassis; Joosten, Robbie P
2016-01-01
The use of macromolecular structures is widespread for a variety of applications, from teaching protein structure principles all the way to ligand optimization in drug development. Applying data mining techniques to these experimentally determined structures requires a highly uniform, standardized structural data source. The Protein Data Bank (PDB) has evolved over the years toward becoming the standard resource for macromolecular structures. However, the process of selecting the data most suitable for specific applications is still very much based on personal preferences and understanding of the experimental techniques used to obtain these models. In this chapter, we will first explain the challenges with data standardization, annotation, and uniformity in the PDB entries determined by X-ray crystallography. We then discuss the specific effect that crystallographic data quality and model optimization methods have on structural models and how validation tools can be used to make informed choices. We also discuss specific advantages of using the PDB_REDO databank as a resource for structural data. Finally, we will provide guidelines on how to select the most suitable protein structure models for detailed analysis and how to select a set of structure models suitable for data mining.
Max Tech Efficiency Electric HPWH with low-GWP Halogenated Refrigerant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nawaz, Kashif; Shen, Bo; Elatar, Ahmed F.
A scoping-level analysis was conducted to determine the maximum performance of an electric heat pump water heater (HPWH) with low-GWP refrigerants (hydrofluoroolefins (HFO), hydrofluorocarbons (HFC), and blends). A baseline heat pump water heater (GE GeoSpring) deploying R-134a was analyzed first using the DOE/ORNL Heat Pump Design Model (HPDM) modeling tool. The model was calibrated using experimental data to match the water temperature stratification in the tank, first hour rating, energy factor and coefficient of performance. A CFD modeling tool was used to further refine the HPDM tank model. After calibration, the model was used to simulate the performance of alternative refrigerants. The parametric analysis concluded that, with appropriate selection of equipment size and condenser tube wrap configuration, the overall performance of emerging low-GWP refrigerants for HPWH applications not only exceeds the Energy Star Energy Factor criterion (2.20) but is also comparable to some of the most efficient products in the market.
Beseres Pollack, Jennifer; Cleveland, Andrew; Palmer, Terence A.; Reisinger, Anthony S.; Montagna, Paul A.
2012-01-01
Oyster reefs are one of the most threatened marine habitats on earth, with habitat loss resulting from water quality degradation, coastal development, destructive fishing practices, overfishing, and storm impacts. For successful and sustainable oyster reef restoration efforts, it is necessary to choose sites that support long-term growth and survival of oysters. Selection of suitable sites is critically important as it can greatly influence mortality factors and may largely determine the ultimate success of the restoration project. The application of Geographic Information Systems (GIS) provides an effective methodology for identifying suitable sites for oyster reef restoration and removes much of the uncertainty involved in the sometimes trial and error selection process. This approach also provides an objective and quantitative tool for planning future oyster reef restoration efforts. The aim of this study was to develop a restoration suitability index model and reef quality index model to characterize locations based on their potential for successful reef restoration within the Mission-Aransas Estuary, Texas, USA. The restoration suitability index model focuses on salinity, temperature, turbidity, dissolved oxygen, and depth, while the reef quality index model focuses on abundance of live oysters, dead shell, and spat. Size-specific Perkinsus marinus infection levels were mapped to illustrate general disease trends. This application was effective in identifying suitable sites for oyster reef restoration, is flexible in its use, and provides a mechanism for considering alternative approaches. The end product is a practical decision-support tool that can be used by coastal resource managers to improve oyster restoration efforts. As oyster reef restoration activities continue at small and large-scales, site selection criteria are critical for assisting stakeholders and managers and for maximizing long-term sustainability of oyster resources. PMID:22792410
Computational tool for the early screening of monoclonal antibodies for their viscosities
Agrawal, Neeraj J; Helk, Bernhard; Kumar, Sandeep; Mody, Neil; Sathish, Hasige A.; Samra, Hardeep S.; Buck, Patrick M; Li, Li; Trout, Bernhardt L
2016-01-01
Highly concentrated antibody solutions often exhibit high viscosities, which present a number of challenges for antibody-drug development, manufacturing and administration. The antibody sequence is a key determinant for high viscosity of highly concentrated solutions; therefore, a sequence- or structure-based tool that can identify highly viscous antibodies from their sequence would be effective in ensuring that only antibodies with low viscosity progress to the development phase. Here, we present a spatial charge map (SCM) tool that can accurately identify highly viscous antibodies from their sequence alone (using homology modeling to determine the 3-dimensional structures). The SCM tool has been extensively validated at 3 different organizations, and has proved successful in correctly identifying highly viscous antibodies. As a quantitative tool, SCM is amenable to high-throughput automated analysis, and can be effectively implemented during the antibody screening or engineering phase for the selection of low-viscosity antibodies. PMID:26399600
Technology Combination Analysis Tool (TCAT) for Active Debris Removal
NASA Astrophysics Data System (ADS)
Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.
2013-08-01
This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. To find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns to remove large debris. Two types of architectures are considered to be efficient: the Chaser (a single-debris spacecraft) and the Mothership/Kits (a multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.
Bao, Le; Gu, Hong; Dunn, Katherine A; Bielawski, Joseph P
2007-02-08
Models of codon evolution have proven useful for investigating the strength and direction of natural selection. In some cases, a priori biological knowledge has been used successfully to model heterogeneous evolutionary dynamics among codon sites. These are called fixed-effect models, and they require that all codon sites are assigned to one of several partitions which are permitted to have independent parameters for selection pressure, evolutionary rate, transition to transversion ratio or codon frequencies. For single gene analysis, partitions might be defined according to protein tertiary structure, and for multiple gene analysis partitions might be defined according to a gene's functional category. Given a set of related fixed-effect models, the task of selecting the model that best fits the data is not trivial. In this study, we implement a set of fixed-effect codon models which allow for different levels of heterogeneity among partitions in the substitution process. We describe strategies for selecting among these models by a backward elimination procedure, Akaike information criterion (AIC) or a corrected Akaike information criterion (AICc). We evaluate the performance of these model selection methods via a simulation study, and make several recommendations for real data analysis. Our simulation study indicates that the backward elimination procedure can provide a reliable method for model selection in this setting. We also demonstrate the utility of these models by application to a single-gene dataset partitioned according to tertiary structure (abalone sperm lysin), and a multi-gene dataset partitioned according to the functional category of the gene (flagellar-related proteins of Listeria). Fixed-effect models have advantages and disadvantages. Fixed-effect models are desirable when data partitions are known to exhibit significant heterogeneity or when a statistical test of such heterogeneity is desired. They have the disadvantage of requiring a priori knowledge for partitioning sites. We recommend: (i) selection of models by using backward elimination rather than AIC or AICc, (ii) use a stringent cut-off, e.g., p = 0.0001, and (iii) conduct sensitivity analysis of results. With thoughtful application, fixed-effect codon models should provide a useful tool for large scale multi-gene analyses.
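For reference, the information criteria against which the backward-elimination procedure is compared are the standard ones, with k estimated parameters, sample size n, and maximized likelihood \hat{L} (standard definitions, not reproduced from the paper):

```latex
\mathrm{AIC} = -2\ln\hat{L} + 2k,
\qquad
\mathrm{AICc} = \mathrm{AIC} + \frac{2k(k+1)}{n - k - 1}.
```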
Easy Ergonomics: A Guide to Selecting Non-Powered Hand Tools
... identifying the presence or absence of basic ergonomic design features (Dababneh et al.*). The right tool will ... Cal/OSHA). Both agencies recognize the importance of design and selection of hand tools in strategies to ...
[A case with apraxia of tool use: selective inability to form a hand posture for a tool].
Hayakawa, Yuko; Fujii, Toshikatsu; Yamadori, Atsushi; Meguro, Kenichi; Suzuki, Kyoko
2015-03-01
Impaired tool use is recognized as a symptom of ideational apraxia. While many studies have focused on difficulties in producing gestures as a whole, using tools involves several steps; these include forming hand postures appropriate for the use of a certain tool, selecting objects or body parts to act on, and producing gestures. In previously reported cases, both producing and recognizing hand postures were impaired. Here we report the first case showing a selective impairment of forming hand postures appropriate for tools with preserved recognition of the required hand postures. A 24-year-old, right-handed man was admitted to hospital because of sensory impairment of the right side of the body, mild aphasia, and impaired tool use due to a left parietal subcortical hemorrhage. His ability to make symbolic gestures, copy finger postures, and orient his hand to pass a slit was well preserved. Semantic knowledge for tools and hand postures was also intact. He could flawlessly select the correct hand postures in recognition tasks. He only demonstrated difficulties in forming a hand posture appropriate for a tool. Once he properly grasped a tool by trial and error, he could use it without hesitation. These observations suggest that each step of tool use should be thoroughly examined in patients with ideational apraxia.
NASA Astrophysics Data System (ADS)
Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao
2011-09-01
Many regions in China are still threatened by frequent floods and water resource shortages. Consequently, the task of reproducing and predicting the hydrological processes in watersheds is difficult and unavoidable for reducing the risks of damage and loss. It is therefore necessary to develop an efficient and cost-effective hydrological tool in China, as many areas need to be modeled. Currently, established hydrological tools such as Mike SHE and ArcSWAT (soil and water assessment tool based on ArcGIS) show significant power in improving the precision of hydrological modeling in China by considering spatial variability both in land cover and in soil type. However, adopting such commercial tools in a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that may make it difficult for the user to carry out a simulation, thus lowering the efficiency of the modeling process. Moreover, commercial hydrological models usually cannot be modified or improved to suit some of the special hydrological conditions in China. Some other hydrological models are open source but are integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed on the basis of the open-source MapWindow GIS, the purpose of which is to establish the first open-source, GIS-based distributed hydrological model tool in China by integrating modules for preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool. The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of the fully open-source GIS software MapWindow, which contains basic GIS functions. The preprocessing module is made up of three submodules: a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module supports parallel computation, real-time computation, and visualization. The postprocessing module includes model calibration and spatial visualization of model results in tabular form and on spatial grids. MWEasyDHM makes efficient modeling and calibration of EasyDHM possible, and promises further development of cost-effective applications in various watersheds.
Random and non-random mating populations: Evolutionary dynamics in meiotic drive.
Sarkar, Bijan
2016-01-01
Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering the nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selection of different types in examining the stable nature of the polymorphic equilibrium. A bridge between population genetics and evolutionary game theory is built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results, e.g., that Hardy-Weinberg frequencies hold in replicator dynamics, that faster evolution occurs at maximized variance fitness, that a mixed Evolutionarily Stable Strategy (ESS) exists in asymmetric games, and that evolution tends to follow not only a 1:1 sex ratio but also a 1:1 ratio of different alleles at a particular gene locus. Through the construction of replicator dynamics in the group selection framework, our selection model introduces a redefined basis for game theory that incorporates non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also shows that the number of polymorphic equilibria depends on the algebraic expression of the population structure.
SU-F-T-405: Development of a Rapid Cardiac Contouring Tool Using Landmark-Driven Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelletier, C; Jung, J; Mosher, E
2016-06-15
Purpose: This study aims to develop a tool to rapidly delineate cardiac substructures for use in dosimetry for large-scale clinical trials or epidemiological investigations. The goal is to produce a system that can semi-automatically delineate nine cardiac structures to a reasonable accuracy within a couple of minutes. Methods: The cardiac contouring tool employs a Most Similar Atlas method, where a selection criterion is used to pre-select the most similar model to the patient from a library of pre-defined atlases. Sixty contrast-enhanced cardiac computed tomography angiography (CTA) scans (30 male and 30 female) were manually contoured to serve as the atlas library. For each CTA, 12 structures were delineated. The Kabsch algorithm was used to compute the optimum rotation and translation matrices between the patient and atlas. The minimum root mean squared distance between the patient and atlas after transformation was used to select the most similar atlas. An initial study using 10 CTA sets was performed to assess system feasibility. A leave-one-patient-out method was performed, and fit criteria were calculated to evaluate the fit accuracy compared to manual contours. Results: For the pilot study, mean Dice indices of 0.895 were achieved for the whole heart, 0.867 for the ventricles, and 0.802 for the atria. In addition, mean distance was measured via the chord length distribution (CLD) between ground truth and the atlas structures for the four coronary arteries. The mean CLD for all coronary arteries was below 14 mm, with the left circumflex artery showing the best agreement (7.08 mm). Conclusion: The cardiac contouring tool is able to delineate cardiac structures with reasonable accuracy in less than 90 seconds. Pilot data indicate that the system is able to delineate the whole heart and ventricles with reasonable accuracy using even a limited library. We are extending the atlas sets to 60 adult males and females in total.
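A minimal sketch of the atlas-selection step described above, assuming the patient and each atlas are represented by matched sets of landmark coordinates (the arrays below are illustrative random data, not CTA contours): the Kabsch algorithm gives the optimal rotation, and the post-alignment RMSD is used to pick the most similar atlas.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Optimally superimpose landmark set P onto Q (both N x 3) and return the RMSD."""
    Pc = P - P.mean(axis=0)                    # remove translation
    Qc = Q - Q.mean(axis=0)
    H = Pc.T @ Qc                              # cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against improper rotation (reflection)
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # optimal rotation
    diff = (R @ Pc.T).T - Qc
    return np.sqrt((diff ** 2).sum() / len(P))

# Hypothetical landmark sets: one "patient" and a small atlas library
rng = np.random.default_rng(0)
patient = rng.normal(size=(12, 3))
atlases = {f"atlas_{i}": rng.normal(size=(12, 3)) for i in range(5)}

best = min(atlases, key=lambda name: kabsch_rmsd(patient, atlases[name]))
print("most similar atlas:", best)
```

In the tool itself the selected atlas's contours would then be propagated to the patient; here only the similarity ranking is shown.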
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E. (Editor); Sullivan, Shannon (Editor); Sanchez, Alicia (Editor)
2008-01-01
This NASA Conference Publication features select papers and PowerPoint presentations from the Education and Training Track of the MODSIM World 2007 Conference and Expo. Invited speakers and panelists of national and international renown, representing academia, industry, and government, discussed how modeling and simulation (M&S) technology can be used to accelerate learning in the K-16 classroom, especially when using M&S technology as a tool for integrating science, technology, engineering and mathematics (STEM) classes. The presenters also addressed the application of M&S technology to learning and training outside of the classroom. Specific sub-topics of the presentations included learning theory; curriculum development; professional development; tools/user applications; implementation/infrastructure/issues; and workforce development. There was also a session devoted to student M&S competitions in Virginia, as well as a poster session.
Pacheco-Torres, Jesus; Mukherjee, Nobina; Walko, Martin; López-Larrubia, Pilar; Ballesteros, Paloma; Cerdan, Sebastian; Kocer, Armagan
2015-08-01
Liposomal drug delivery vehicles are promising nanomedicine tools for bringing cytotoxic drugs to cancerous tissues selectively. However, the triggered cargo release from liposomes in response to a target-specific stimulus has remained elusive. We report on functionalizing stealth-liposomes with an engineered ion channel and using these liposomes in vivo for releasing an imaging agent into a cerebral glioma rodent model. If the ambient pH drops below a threshold value, the channel generates temporary pores on the liposomes, thus allowing leakage of the intraluminal medicines. By using magnetic resonance spectroscopy and imaging, we show that engineered liposomes can detect the mildly acidic pH of the tumor microenvironment with 0.2 pH unit precision and they release their content into C6 glioma tumors selectively, in vivo. A drug delivery system with this level of sensitivity and selectivity to environmental stimuli may well serve as an optimal tool for environmentally-triggered and image-guided drug release. Cancer remains a leading cause of mortality worldwide. With advances in science, delivery systems of anti-cancer drugs have also become sophisticated. In this article, the authors designed and characterized functionalized liposomal vehicles, which would release the drug payload in a highly sensitive manner in response to a change in pH environment in an animal glioma model. The novel data would enable better future designs of drug delivery systems.
ERIC Educational Resources Information Center
Burke, Victoria; Greenberg, Daphne
2010-01-01
There are many readability tools that instructors can use to help adult learners select reading materials. We describe and compare different types of readability tools: formulas calculated by hand, tools found on the Web, tools embedded in a word processing program, and readability tools found in a commercial software program. Practitioners do not…
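As a concrete example of the hand-calculated kind of readability formula mentioned above, the sketch below computes the Flesch-Kincaid grade level from word, sentence, and syllable counts. The syllable counter is a rough heuristic for illustration only, not the exact procedure used by any particular tool described in the article.

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level: 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

sample = "The cat sat on the mat. Reading level can be estimated with simple formulas."
print(f"Estimated grade level: {flesch_kincaid_grade(sample):.1f}")
```

Different tools implement different formulas (Flesch Reading Ease, SMOG, Fry graph, etc.), so scores for the same passage can vary across tools.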
Measurement of W + bb and a search for MSSM Higgs bosons with the CMS detector at the LHC
NASA Astrophysics Data System (ADS)
O'Connor, Alexander Pinpin
Tooling used to cure composite laminates in the aerospace and automotive industries must provide a dimensionally stable geometry throughout the thermal cycle applied during the part curing process. This requires that the Coefficient of Thermal Expansion (CTE) of the tooling materials match that of the composite being cured. The traditional tooling material for production applications is a nickel alloy. Poor machinability and high material costs increase the expense of metallic tooling made from nickel alloys such as 'Invar 36' or 'Invar 42'. Currently, metallic tooling is unable to meet the needs of applications requiring rapid, affordable tooling solutions. In applications where the tooling is not required to have the durability provided by metals, such as for small-area repair, an opportunity exists for non-metallic tooling materials like graphite, carbon foams, composites, or ceramics and machinable glasses. Nevertheless, efficient machining of brittle, non-metallic materials is challenging due to low ductility, porosity, and high hardness. The machining of a layup tool comprises a large portion of its final cost. Achieving maximum process economy requires optimization of the machining process in the given tooling material. Therefore, machinability of the tooling material is a critical aspect of the overall cost of the tool. In this work, three commercially available, brittle/porous, non-metallic candidate tooling materials were selected, namely Autoclaved Aerated Concrete (AAC), CB1100 ceramic block, and Cfoam carbon foam. Machining tests were conducted in order to evaluate the machinability of these materials using end milling. Chip formation, cutting forces, cutting tool wear, machining-induced damage, surface quality, and surface integrity were investigated using High Speed Steel (HSS), carbide, diamond abrasive, and Polycrystalline Diamond (PCD) cutting tools. Cutting forces were found to be random in magnitude, which was a result of material porosity. The abrasive nature of Cfoam produced rapid tool wear when using HSS and PCD cutting tools. However, tool wear was not significant in AAC or CB1100 regardless of the type of cutting edge. Machining-induced damage was observed in the form of macro-scale chipping and fracture in combination with micro-scale cracking. Transverse rupture test results revealed significant reductions in residual strength and damage tolerance in CB1100. In contrast, AAC and Cfoam showed no correlation between machining-induced damage and a reduction in surface integrity. Cutting forces in machining were modeled for all materials. Cutting force regression models were developed based on Design of Experiments and Analysis of Variance. A mechanistic cutting force model was proposed based upon conventional end milling force models and statistical distributions of material porosity. In order to validate the model, predicted cutting forces were compared to experimental results. Predicted cutting forces agreed well with experimental measurements. Furthermore, over the range of cutting conditions tested, the proposed model was shown to have predictive accuracy comparable to empirically produced regression models, greatly reducing the number of cutting tests required to simulate cutting forces. Finally, this work demonstrates a key adaptation of metallic cutting force models to brittle porous materials, a vital step in research into the machining of these materials using end milling.
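The regression modeling of cutting forces described above can be illustrated with a small least-squares fit. The factor names (spindle speed, feed per tooth, depth of cut) follow common end-milling practice, but the data values and the power-law model form below are purely illustrative assumptions, not the study's regression models.

```python
import numpy as np

# Hypothetical DOE runs: spindle speed n (rpm), feed per tooth fz (mm), depth of cut ap (mm)
X = np.array([
    [3000, 0.05,  0.5],
    [3000, 0.10,  1.0],
    [6000, 0.05,  1.0],
    [6000, 0.10,  0.5],
    [4500, 0.075, 0.75],
])
force = np.array([42.0, 85.0, 60.0, 55.0, 58.0])   # hypothetical mean cutting force (N)

# Power-law regression model F = C * n^a * fz^b * ap^c, fitted linearly in log space
A = np.column_stack([np.ones(len(force)), np.log(X)])
coef, *_ = np.linalg.lstsq(A, np.log(force), rcond=None)
C = np.exp(coef[0])
a, b, c = coef[1], coef[2], coef[3]
print(f"F ~= {C:.3g} * n^{a:.2f} * fz^{b:.2f} * ap^{c:.2f}")
```

A mechanistic model of the kind the abstract proposes would additionally treat porosity as a statistical distribution rather than fitting a purely empirical surface as done here.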
3D-Printed specimens as a valuable tool in anatomy education: A pilot study.
Garas, Monique; Vaccarezza, Mauro; Newland, George; McVay-Doornbusch, Kylie; Hasani, Jamila
2018-06-06
Three-dimensional (3D) printing is a modern technique for creating 3D-printed models that allows reproduction of human structures from MRI and CT scans via fusion of multiple layers of resin materials. To assess the feasibility of this innovative resource as an anatomy educational tool, we conducted a preliminary study on Curtin University undergraduate students, with the main goal of investigating the use of 3D models for anatomy learning, and the secondary aims of assessing the effectiveness of different specimen types during the sessions and students' personally preferred anatomy learning tools. The study consisted of a pre-test, exposure to a test (anatomical test), and a post-test survey. During the pre-test, all participants (both the group without prior experience and the experienced group) were given a brief introduction to laboratory safety and the study procedure; participants were then exposed to 3D, wet, and plastinated specimens of the heart, shoulder, and thigh to identify the pinned structures (anatomical test). Participants were then given a post-test survey containing five questions. In total, 23 participants completed the anatomical test and post-test survey. A larger proportion of participants (85%) gave correct answers for the 3D models compared to the wet and plastinated materials, 74% selected 3D models as the most usable tool for identification of pinned structures, and 45% chose 3D models as their preferred method of anatomy learning. This preliminary small-size study affirms the feasibility of 3D-printed models as a valuable asset in anatomy learning and shows their capability to be used alongside cadaveric materials and other widely used tools in anatomy education.
Design Space Exploration and Optimization Using Modern Ship Design Tools
2014-06-01
Model Editor will open again. Aviation Facilities Indicator should be set to MINOR AVN since we are required to have two helicopters in our design. The...other options are NONE and MAJOR AVN. The former should be selected if no aviation facilities are required and the latter should be chosen only for
The coaching process: an effective tool for professional development.
Kowalski, Karren; Casper, Colleen
2007-01-01
A model for coaching in nursing is described. Criteria for selecting a coach are discussed. Competencies for a coach are recommended. In addition, guidelines for coaching sessions are provided, as well as an example of an action plan outline to help the coachee identify areas of desired growth and options for developing these areas.
The Chemical Aquatic Fate and Effects (CAFE) database is a tool that facilitates assessments of accidental chemical releases into aquatic environments. CAFE contains aquatic toxicity data used in the development of species sensitivity distributions (SSDs) and the estimation of ha...
Capillary gas chromatography with mass spectrometric detection is the most commonly used technique for analyzing samples from Superfund sites. While the U.S. EPA has developed target lists of compounds for which library mass spectra are available on most mass spectrometer data s...
In the Laurentian Great Lakes Basin (GLB), corn acreage has been expanding since 2005 in response to high demand for corn as an ethanol feedstock. This study integrated remote sensing-derived products and the Soil and Water Assessment Tool (SWAT) within a GIS modeling environme...
ERIC Educational Resources Information Center
Oakes, Wendy Peia; Lane, Kathleen Lynne; Ennis, Robin Parks
2016-01-01
This descriptive study reports data from one elementary school whose leadership team explored and installed systematic behavior screening as part of their tiered model of prevention. The authors compared student performance on two school-selected screening tools: the Student Risk Screening Scale for Internalizing and Externalizing (SRSS-IE) and…
Multimedia Projects in Education: Designing, Producing, and Assessing, Third Edition
ERIC Educational Resources Information Center
Ivers, Karen S.; Barron, Ann E.
2005-01-01
Building on the materials in the two previous successful editions, this book features approximately 40% all-new material and updates the previous information. The authors use the DDD-E model (Decide, Design, Develop--Evaluate) to show how to select and plan multimedia projects, use presentation and development tools, manage graphics, audio, and…
Modeling Academic Achievement by Self-Reported versus Traced Goal Orientation
ERIC Educational Resources Information Center
Zhou, Mingming; Winne, Philip H.
2012-01-01
We examined achievement goals measured by self-reports and by traces (behavioral indicators) gathered as undergraduates used software tools to study a multimedia-formatted article. Traces were operationalized by tags participants applied to selections of text and hyperlinks they clicked in the article. Tags and hyperlinks were titled to represent…
Predicting Learners Styles Based on Fuzzy Model
ERIC Educational Resources Information Center
Alian, Marwah; Shaout, Adnan
2017-01-01
Learner styles are grouped into four main types: visual, auditory, kinesthetic, and read/write. Each type of learner learns primarily through one of the main receiving senses: seeing, listening, or doing. Learner style has an effect on the learning process and on the learner's achievement. It is better to select a suitable learning tool for the learner…
Advances and Challenges in Genomic Selection for Disease Resistance.
Poland, Jesse; Rutkoski, Jessica
2016-08-04
Breeding for disease resistance is a central focus of plant breeding programs, as any successful variety must have the complete package of high yield, disease resistance, agronomic performance, and end-use quality. With the need to accelerate the development of improved varieties, genomics-assisted breeding is becoming an important tool in breeding programs. With marker-assisted selection, there has been success in breeding for disease resistance; however, much of this work and research has focused on identifying, mapping, and selecting for major resistance genes that tend to be highly effective but vulnerable to breakdown with rapid changes in pathogen races. In contrast, breeding for minor-gene quantitative resistance tends to produce more durable varieties but is a more challenging breeding objective. As the genetic architecture of resistance shifts from single major R genes to a diffused architecture of many minor genes, the best approach for molecular breeding will shift from marker-assisted selection to genomic selection. Genomics-assisted breeding for quantitative resistance will therefore necessitate whole-genome prediction models and selection methodology as implemented for classical complex traits such as yield. Here, we examine multiple case studies testing whole-genome prediction models and genomic selection for disease resistance. In general, whole-genome models for disease resistance can produce prediction accuracy suitable for application in breeding. These models also largely outperform multiple linear regression as would be applied in marker-assisted selection. With the implementation of genomic selection for yield and other agronomic traits, whole-genome marker profiles will be available for the entire set of breeding lines, enabling genomic selection for disease at no additional direct cost. In this context, implementing genomic selection for disease resistance, and specifically for quantitative resistance and quarantined pathogens, becomes a tractable and powerful approach in breeding programs.
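A minimal sketch of the whole-genome prediction idea discussed above, using ridge regression (a GBLUP-like shrinkage model) on simulated marker data. The marker counts, effect sizes, shrinkage parameter, and train/test split are arbitrary illustrative choices, not values from the case studies.

```python
import numpy as np

rng = np.random.default_rng(42)
n_lines, n_markers = 200, 1000
X = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)  # 0/1/2 marker genotypes
true_effects = rng.normal(0, 0.05, n_markers)                    # many small-effect loci
y = X @ true_effects + rng.normal(0, 1.0, n_lines)               # resistance phenotype

train, test = np.arange(150), np.arange(150, 200)
Xc = X - X[train].mean(axis=0)                                   # center markers on training set

# Ridge regression: beta = (X'X + lambda*I)^-1 X'y, shrinking all marker effects toward zero
lam = 100.0
XtX = Xc[train].T @ Xc[train]
beta = np.linalg.solve(XtX + lam * np.eye(n_markers), Xc[train].T @ y[train])

pred = Xc[test] @ beta
print("prediction accuracy (r):", round(float(np.corrcoef(pred, y[test])[0, 1]), 2))
```

The contrast with marker-assisted selection is that all markers contribute simultaneously with shrunken effects, rather than a handful of significant markers entering a multiple linear regression.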
LIMO EEG: a toolbox for hierarchical LInear MOdeling of ElectroEncephaloGraphic data.
Pernet, Cyril R; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A
2011-01-01
Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses.
Sankar, Punnaivanam; Alain, Krief; Aghila, Gnanasekaran
2010-05-24
We have developed a model structure-editing tool, ChemEd, programmed in Java, which allows drawing chemical structures on a graphical user interface (GUI) by selecting appropriate structural fragments defined in a fragment library. The terms representing the structural fragments are organized in a fragment ontology to provide conceptual support. ChemEd describes the chemical structure in an XML document (ChemFul) with rich semantics, explicitly encoding the details of the chemical bonding, the hybridization status, and the electron environment around each atom. The document can be further processed through suitable algorithms and with the support of external chemical ontologies to generate understandable reports about the functional groups present in the structure and their specific environment.
LOS selective fading and AN/FRC-170(V) radio hybrid computer simulation phase A report
NASA Astrophysics Data System (ADS)
Klukis, M. K.; Lyon, T. I.; Walker, R.
1981-09-01
This report documents results of the first phase of modeling, simulation, and study of the dual-diversity AN/FRC-170(V) radio and the frequency-selective fading line-of-sight channel. Both hybrid computer and circuit technologies were used to develop a fast, accurate, and flexible simulation tool to investigate changes and proposed improvements to the design of the AN/FRC-170(V) radio. In addition to the simulation study, a remote hybrid computer terminal was provided to DCEC for interactive study of the modeled radio and channel. Simulated performance of the radio for Rayleigh and two-ray line-of-sight channels, and for additive noise, is included in the report.
McKinney, Brett A.; White, Bill C.; Grill, Diane E.; Li, Peter W.; Kennedy, Richard B.; Poland, Gregory A.; Oberg, Ann L.
2013-01-01
Relief-F is a nonparametric, nearest-neighbor machine learning method that has been successfully used to identify relevant variables that may interact in complex multivariate models to explain phenotypic variation. While several tools have been developed for assessing differential expression in sequence-based transcriptomics, the detection of statistical interactions between transcripts has received less attention in the area of RNA-seq analysis. We describe a new extension and assessment of Relief-F for feature selection in RNA-seq data. The ReliefSeq implementation adapts the number of nearest neighbors (k) for each gene to optimize the Relief-F test statistics (importance scores) for finding both main effects and interactions. We compare this gene-wise adaptive-k (gwak) Relief-F method with standard RNA-seq feature selection tools, such as DESeq and edgeR, and with the popular machine learning method Random Forests. We demonstrate performance on a panel of simulated data that have a range of distributional properties reflected in real mRNA-seq data including multiple transcripts with varying sizes of main effects and interaction effects. For simulated main effects, gwak-Relief-F feature selection performs comparably to standard tools DESeq and edgeR for ranking relevant transcripts. For gene-gene interactions, gwak-Relief-F outperforms all comparison methods at ranking relevant genes in all but the highest fold change/highest signal situations where it performs similarly. The gwak-Relief-F algorithm outperforms Random Forests for detecting relevant genes in all simulation experiments. In addition, Relief-F is comparable to the other methods based on computational time. We also apply ReliefSeq to an RNA-Seq study of smallpox vaccine to identify gene expression changes between vaccinia virus-stimulated and unstimulated samples. ReliefSeq is an attractive tool for inclusion in the suite of tools used for analysis of mRNA-Seq data; it has power to detect both main effects and interaction effects. Software Availability: http://insilico.utulsa.edu/ReliefSeq.php. PMID:24339943
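The nearest-neighbor importance idea behind Relief-F can be sketched as below for a simple two-class case. This is a bare-bones illustration on simulated expression values, not the gene-wise adaptive-k implementation in ReliefSeq; the neighbor count k is fixed here rather than optimized per gene.

```python
import numpy as np

def relief_scores(X, y, k=5):
    """Basic Relief-F style scores for a two-class problem (higher = more relevant)."""
    n, p = X.shape
    scores = np.zeros(p)
    for i in range(n):
        d = np.abs(X - X[i]).sum(axis=1)          # Manhattan distance to every sample
        d[i] = np.inf                             # exclude the sample itself
        same = np.where(y == y[i])[0]
        other = np.where(y != y[i])[0]
        hits = same[np.argsort(d[same])[:k]]      # k nearest same-class neighbors
        misses = other[np.argsort(d[other])[:k]]  # k nearest other-class neighbors
        scores += np.abs(X[misses] - X[i]).mean(axis=0) - np.abs(X[hits] - X[i]).mean(axis=0)
    return scores / n

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 50)                          # two phenotype classes
X = rng.normal(size=(100, 20))                     # 20 simulated "transcripts"
X[:, 0] += y * 1.5                                 # transcript 0 carries a main effect
print("top-ranked feature:", int(np.argmax(relief_scores(X, y))))
```

Because the score of a feature depends on distances computed over all features jointly, this family of methods can pick up interaction effects that per-gene differential expression tests miss.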
2014-01-01
Background. Evidence rankings do not give equal consideration to internal validity (IV), external validity (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative medicine (CAM/IM) research. This paper describes such a model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to January 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV in research that is more sensitive to CAM/IM research. Conclusion. Improved reporting on EV can provide information that will help guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on an equal footing, will better guide clinical decision making. PMID:24734111
Shi, Xiaohu; Zhang, Jingfen; He, Zhiquan; Shang, Yi; Xu, Dong
2011-09-01
One of the major challenges in protein tertiary structure prediction is structure quality assessment. In many cases, protein structure prediction tools generate good structural models but fail to select the best models from a huge number of candidates as the final output. In this study, we developed a sampling-based machine-learning method to rank protein structural models by integrating multiple scores and features. First, features such as predicted secondary structure, solvent accessibility, and residue-residue contact information are integrated by two Radial Basis Function (RBF) models trained on different datasets. Then, the two RBF scores and five selected scoring functions developed by others, i.e., Opus-CA, Opus-PSP, DFIRE, RAPDF, and Cheng Score, are synthesized by a sampling method. Finally, another integrated RBF model ranks the structural models according to the features of the sampling distribution. We tested the proposed method using two different datasets, including the CASP server prediction models of all CASP8 targets and a set of models generated by our in-house software MUFOLD. The test results show that our method outperforms any individual scoring function on both best-model selection and overall correlation between the predicted ranking and the actual ranking of structural quality.
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Rankin, Charles C.
2006-01-01
This document summarizes the STructural Analysis of General Shells (STAGS) development effort, STAGS performance for selected demonstration problems, and STAGS application problems illustrating selected advanced features available in STAGS Version 5.0. Each problem is discussed, including selected background information and reference solutions when available. The modeling and solution approach for each problem is described and illustrated. Numerical results are presented and compared with reference solutions, test data, and/or results obtained from mesh refinement studies. These solutions provide an indication of the overall capabilities of the STAGS nonlinear finite element analysis tool and provide users with representative cases, including input files, to explore capabilities that may then be tailored to other applications.
Toward Genomics-Based Breeding in C3 Cool-Season Perennial Grasses.
Talukder, Shyamal K; Saha, Malay C
2017-01-01
Most of the world's important food and feed crops belong to the C3 grass family. The future of food security is highly reliant on achieving genetic gains in those grasses. Conventional breeding methods have already reached a plateau for improving major crops. Genomics tools and resources have opened an avenue to explore genome-wide variability and make use of the variation for enhancing genetic gains in breeding programs. Major C3 annual cereal breeding programs are well equipped with genomic tools; however, genomic research on C3 cool-season perennial grasses is lagging behind. In this review, we discuss the currently available genomics tools and approaches useful for C3 cool-season perennial grass breeding. Along with a general review, we focus the discussion on forage grasses that were considered orphan crops and have little or no genetic information available. Transcriptome sequencing and genotyping-by-sequencing technology for genome-wide marker detection using next-generation sequencing (NGS) are very promising genomics tools. Most C3 cool-season perennial grass members have no prior genetic information; thus NGS technology will enhance collinearity studies with other C3 model grasses like Brachypodium and rice. Transcriptomics data can be used for the identification of functional genes and molecular markers, i.e., polymorphism markers and simple sequence repeats (SSRs). Genome-wide association studies with NGS-based markers will facilitate marker identification for marker-assisted selection. With limited genetic information, genomic selection holds great promise to breeders for attaining maximum genetic gain in cool-season C3 perennial grasses. Application of all these tools can ensure better genetic gains, reduce the length of selection cycles, and facilitate cultivar development to meet the future demand for food and fodder.
Konstantinou, Kika; Ogollah, Reuben; Hay, Elaine M.; Dunn, Kate M.
2018-01-01
Background Identification of sciatica may assist timely management but can be challenging in clinical practice. Diagnostic models to identify sciatica have mainly been developed in secondary care settings, with conflicting reference standard selection. This study explores the challenges of reference standard selection and aims to ascertain which combination of clinical assessment items best identifies sciatica in people seeking primary healthcare. Methods Data on 394 low back-related leg pain consulters were analysed. Potential sciatica indicators were seven clinical assessment items. Two reference standards were used: (i) high-confidence sciatica clinical diagnosis; (ii) high-confidence sciatica clinical diagnosis with confirmatory magnetic resonance imaging findings. Multivariable logistic regression models were produced for both reference standards. A tool predicting sciatica diagnosis in low back-related leg pain was derived. Latent class modelling explored the validity of the reference standard. Results Model (i) retained five items; model (ii) retained six items. Four items remained in both models: below-knee pain, leg pain worse than back pain, positive neural tension tests, and neurological deficit. Model (i) was well calibrated (p = 0.18); discrimination was area under the receiver operating characteristic curve (AUC) 0.95 (95% CI 0.93, 0.98). Model (ii) showed good discrimination (AUC 0.82; 0.78, 0.86) but poor calibration (p = 0.004). Bootstrapping revealed minimal overfitting in both models. Agreement between the two latent classes and the clinical diagnosis groups defined by model (i) was substantial, and fair for model (ii). Conclusion Four clinical assessment items were common to both reference standard definitions of sciatica. A simple scoring tool for identifying sciatica was developed. These criteria could be used clinically and in research to improve the accuracy of identification of this subgroup of back pain patients. PMID:29621243
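A minimal sketch of the kind of multivariable logistic regression model described above, using scikit-learn on simulated binary assessment items. The item names follow the abstract, but the data, effect sizes, and resulting AUC are purely illustrative and unrelated to the study's estimates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
items = ["below_knee_pain", "leg_worse_than_back", "positive_neural_tension", "neuro_deficit"]
n = 394
X = rng.integers(0, 2, size=(n, len(items))).astype(float)    # presence/absence of each item
logit = -2.0 + X @ np.array([1.2, 0.9, 1.5, 1.1])              # hypothetical true effects
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))                  # simulated reference-standard diagnosis

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print("apparent AUC:", round(auc, 2))
for name, coef in zip(items, model.coef_[0]):
    print(f"{name:25s} odds ratio ~ {np.exp(coef):.2f}")
```

In a real derivation study the apparent AUC would be corrected for optimism (e.g., by bootstrapping, as the abstract describes) and calibration would be checked alongside discrimination.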
Applications of step-selection functions in ecology and conservation.
Thurfjell, Henrik; Ciuti, Simone; Boyce, Mark S
2014-01-01
Recent progress in positioning technology facilitates the collection of massive amounts of sequential spatial data on animals. This has led to new opportunities and challenges when investigating animal movement behaviour and habitat selection. Tools like Step Selection Functions (SSFs) are relatively new powerful models for studying resource selection by animals moving through the landscape. SSFs compare environmental attributes of observed steps (the linear segment between two consecutive observations of position) with alternative random steps taken from the same starting point. SSFs have been used to study habitat selection, human-wildlife interactions, movement corridors, and dispersal behaviours in animals. SSFs also have the potential to depict resource selection at multiple spatial and temporal scales. There are several aspects of SSFs where consensus has not yet been reached such as how to analyse the data, when to consider habitat covariates along linear paths between observations rather than at their endpoints, how many random steps should be considered to measure availability, and how to account for individual variation. In this review we aim to address all these issues, as well as to highlight weak features of this modelling approach that should be developed by further research. Finally, we suggest that SSFs could be integrated with state-space models to classify behavioural states when estimating SSFs.
A decision tool for selecting trench cap designs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paige, G.B.; Stone, J.J.; Lane, L.J.
1995-12-31
A computer-based prototype decision support system (PDSS) is being developed to assist the risk manager in selecting an appropriate trench cap design for waste disposal sites. The selection of the "best" design among feasible alternatives requires consideration of multiple and often conflicting objectives. The methodology used in the selection process consists of: selecting and parameterizing decision variables using data, simulation models, or expert opinion; selecting feasible trench cap design alternatives; ordering the decision variables and ranking the design alternatives. The decision model is based on multi-objective decision theory and uses a unique approach to order the decision variables and rank the design alternatives. Trench cap designs are evaluated based on federal regulations, hydrologic performance, cover stability, and cost. Four trench cap designs, which were monitored for a four-year period at Hill Air Force Base in Utah, are used to demonstrate the application of the PDSS and evaluate the results of the decision model. The results of the PDSS, using both data and simulations, illustrate the relative advantages of each of the cap designs and which cap is the "best" alternative for a given set of criteria and a particular importance order of those decision criteria.
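A simplified sketch of ranking design alternatives against ordered decision criteria, in the spirit of the multi-objective selection described above. The criteria follow the abstract, but the weights, alternative names, scores, and the simple weighted-sum scheme are entirely hypothetical and are not the PDSS's actual ordering method.

```python
# Hypothetical trench cap alternatives scored (0-1, higher is better) on four criteria.
criteria_weights = {                      # importance order expressed as weights
    "regulatory_compliance": 0.4,
    "hydrologic_performance": 0.3,
    "cover_stability": 0.2,
    "cost": 0.1,
}
alternatives = {
    "soil cap":          {"regulatory_compliance": 0.90, "hydrologic_performance": 0.6, "cover_stability": 0.7, "cost": 0.9},
    "capillary barrier": {"regulatory_compliance": 0.95, "hydrologic_performance": 0.8, "cover_stability": 0.8, "cost": 0.6},
    "geomembrane cap":   {"regulatory_compliance": 0.95, "hydrologic_performance": 0.9, "cover_stability": 0.6, "cost": 0.4},
}

def weighted_score(scores):
    """Composite score: weighted sum over the ordered decision criteria."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

for name, scores in sorted(alternatives.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name:18s} composite score = {weighted_score(scores):.2f}")
```

Changing the importance order (the weights) can change which alternative ranks first, which is exactly the sensitivity the PDSS is designed to expose.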
Modelling of teeth of a gear transmission for modern manufacturing technologies
NASA Astrophysics Data System (ADS)
Monica, Z.; Banaś, W.; Ćwikla, G.; Topolska, S.
2017-08-01
The technological process of manufacturing gear wheels is influenced by many factors. It is specified depending on the type of material from which the gear is to be produced, its heat treatment parameters, the required accuracy, the geometrical form, and the modifications of the tooth. The parameter selection process is therefore neither easy nor unambiguous. Another important stage of the technological process is the selection of appropriate tools to properly machine the teeth in both roughing and finishing operations. The presented work focuses primarily on modern methods of gear production using technologically advanced tools, in comparison with conventional tools. Conventional machining tools such as gear hobbing cutters or Fellows gear-shaper cutters have been used for the production of gear wheels since the first gear-cutting machines. With the development of technology and the introduction of CNC machines designed for machining gear wheels, manufacturing technology has developed along with the design knowledge concerning the cutting tools. Leading manufacturers of cutting tools have extended the range of tools intended for gear machining with hobbing cutters fitted with inserted cemented carbide tips, and the same inserts have been introduced for Fellows gear-shaper cutters. Test results show that it is advantageous to use hobbing cutters with inserted cemented carbide tips for milling gear wheels with a high number of teeth, where the time savings relative to conventional milling cutters are very large.
Fedy, Bradley C.; Doherty, Kevin E.; Aldridge, Cameron L.; O'Donnell, Michael S.; Beck, Jeffrey L.; Bedrosian, Bryan; Gummer, David; Holloran, Matthew J.; Johnson, Gregory D.; Kaczor, Nicholas W.; Kirol, Christopher P.; Mandich, Cheryl A.; Marshall, David; McKee, Gwyn; Olson, Chad; Pratt, Aaron C.; Swanson, Christopher C.; Walker, Brett L.
2014-01-01
Animal habitat selection is an important and expansive area of research in ecology. In particular, the study of habitat selection is critical in habitat prioritization efforts for species of conservation concern. Landscape planning for species is happening at ever-increasing extents because of the appreciation for the role of landscape-scale patterns in species persistence coupled to improved datasets for species and habitats, and the expanding and intensifying footprint of human land uses on the landscape. We present a large-scale collaborative effort to develop habitat selection models across large landscapes and multiple seasons for prioritizing habitat for a species of conservation concern. Greater sage-grouse (Centrocercus urophasianus, hereafter sage-grouse) occur in western semi-arid landscapes in North America. Range-wide population declines of this species have been documented, and it is currently considered as “warranted but precluded” from listing under the United States Endangered Species Act. Wyoming is predicted to remain a stronghold for sage-grouse populations and contains approximately 37% of remaining birds. We compiled location data from 14 unique radiotelemetry studies (data collected 1994–2010) and habitat data from high-quality, biologically relevant, geographic information system (GIS) layers across Wyoming. We developed habitat selection models for greater sage-grouse across Wyoming for 3 distinct life stages: 1) nesting, 2) summer, and 3) winter. We developed patch and landscape models across 4 extents, producing statewide and regional (southwest, central, northeast) models for Wyoming. Habitat selection varied among regions and seasons, yet preferred habitat attributes generally matched the extensive literature on sage-grouse seasonal habitat requirements. Across seasons and regions, birds preferred areas with greater percentage sagebrush cover and avoided paved roads, agriculture, and forested areas. Birds consistently preferred areas with higher precipitation in the summer and avoided rugged terrain in the winter. Selection for sagebrush cover varied regionally with stronger selection in the Northeast region, likely because of limited availability, whereas avoidance of paved roads was fairly consistent across regions. We chose resource selection function (RSF) thresholds for each model set (seasonal × regional combination) that delineated important seasonal habitats for sage-grouse. Each model set showed good validation and discriminatory capabilities within study-site boundaries. We applied the nesting-season models to a novel area not included in model development. The percentage of independent nest locations that fell directly within identified important habitat was not overly impressive in the novel area (49%); however, including a 500-m buffer around important habitat captured 98% of independent nest locations within the novel area. We also used leks and associated peak male counts as a proxy for nesting habitat outside of the study sites used to develop the models. A 1.5-km buffer around the important nesting habitat boundaries included 77% of males counted at leks in Wyoming outside of the study sites. Data were not available to quantitatively test the performance of the summer and winter models outside our study sites. The collection of models presented here represents large-scale resource-management planning tools that are a significant advancement to previous tools in terms of spatial and temporal resolution.
NASA Astrophysics Data System (ADS)
Pound, M. W.; Wolfire, M. G.; Amarnath, N. S.
2004-07-01
The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS {http://dustem.astro.umd.edu}) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 5 years. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. We are adding new functionality to DIRT to support new missions like SIRTF and SOFIA. A new Instrument module allows for plotting of the model points convolved with the spatial and spectral responses of the selected instrument. This lets users better fit data from specific instruments. Currently, we have implemented modules for the Infrared Array Camera (IRAC) and Multiband Imaging Photometer (MIPS) on SIRTF. The models are based on the dust radiation transfer code of Wolfire & Cassinelli (1986) which accounts for multiple grain sizes and compositions. The model outputs are averaged over the instrument bands using the same weighting (νFν = constant) as the SIRTF data pipeline which allows the SIRTF data products to be compared directly with the model database. This work was supported in part by a NASA AISRP grant NAG 5-10751 and the SIRTF Legacy Science Program provided by NASA through an award issued by JPL under NASA contract 1407.
DeepQA: improving the estimation of single protein model quality with deep belief networks.
Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin
2016-12-05
Protein quality assessment (QA) useful for ranking and selecting protein models has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method DeepQA based on deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physio-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves the state-of-the-art performance on CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single model quality assessment and protein structure prediction. The source code, executable, document and training/test datasets of DeepQA for Linux is freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/ .
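As an illustration of learning a single-model quality score from multiple structural features, the sketch below trains a small neural network regressor on simulated feature/quality pairs and uses it to pick the best candidate. It uses scikit-learn's MLPRegressor purely as a stand-in; DeepQA itself uses a deep belief network, and the features and quality values here are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n_models, n_features = 500, 8          # e.g., energy terms, secondary-structure agreement, contacts
X = rng.normal(size=(n_models, n_features))
# Hypothetical "true" quality in [0, 1] (in practice, e.g., GDT-TS against the native structure)
quality = 1 / (1 + np.exp(-(X @ rng.normal(size=n_features))))

train, test = slice(0, 400), slice(400, None)
qa = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
qa.fit(X[train], quality[train])        # learn feature -> quality mapping on "training" models

pred = qa.predict(X[test])              # score unseen candidate models
best = int(np.argmax(pred))
print("selected model index:", best, "predicted quality:", round(float(pred[best]), 2))
```

The single-model setting is what makes this hard: no consensus information across the model pool is used, only per-model features.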
Forester, James D; Im, Hae Kyung; Rathouz, Paul J
2009-12-01
Patterns of resource selection by animal populations emerge as a result of the behavior of many individuals. Statistical models that describe these population-level patterns of habitat use can miss important interactions between individual animals and characteristics of their local environment; however, identifying these interactions is difficult. One approach to this problem is to incorporate models of individual movement into resource selection models. To do this, we propose a model for step selection functions (SSF) that is composed of a resource-independent movement kernel and a resource selection function (RSF). We show that standard case-control logistic regression may be used to fit the SSF; however, the sampling scheme used to generate control points (i.e., the definition of availability) must be accommodated. We used three sampling schemes to analyze simulated movement data and found that ignoring sampling and the resource-independent movement kernel yielded biased estimates of selection. The level of bias depended on the method used to generate control locations, the strength of selection, and the spatial scale of the resource map. Using empirical or parametric methods to sample control locations produced biased estimates under stronger selection; however, we show that the addition of a distance function to the analysis substantially reduced that bias. Assuming a uniform availability within a fixed buffer yielded strongly biased selection estimates that could be corrected by including the distance function but remained inefficient relative to the empirical and parametric sampling methods. As a case study, we used location data collected from elk in Yellowstone National Park, USA, to show that selection and bias may be temporally variable. Because under constant selection the amount of bias depends on the scale at which a resource is distributed in the landscape, we suggest that distance always be included as a covariate in SSF analyses. This approach to modeling resource selection is easily implemented using common statistical tools and promises to provide deeper insight into the movement ecology of animals.
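The step-selection setup described above can be sketched as follows: for each observed step, candidate control steps are drawn from empirical step-length and turning-angle distributions and paired with the observed step for later case-control (conditional logistic) analysis. Everything below is simulated and purely illustrative; it shows only the availability-sampling stage, not the fitting of the SSF or the distance correction the authors recommend.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical observed track: step lengths (m) and turning angles (rad)
obs_lengths = rng.gamma(shape=2.0, scale=150.0, size=200)
obs_angles = rng.vonmises(mu=0.0, kappa=1.0, size=200)

def sample_control_steps(start_xy, heading, n_controls=10):
    """Draw control (available) steps from the empirical length and angle distributions."""
    lengths = rng.choice(obs_lengths, size=n_controls)
    turns = rng.choice(obs_angles, size=n_controls)
    new_headings = heading + turns
    dx = lengths * np.cos(new_headings)
    dy = lengths * np.sin(new_headings)
    return np.column_stack([start_xy[0] + dx, start_xy[1] + dy])

controls = sample_control_steps(start_xy=(0.0, 0.0), heading=0.5)
print(controls.round(1))   # candidate endpoints to be scored against habitat covariates
```

Each observed endpoint and its matched controls then form one stratum in a case-control logistic regression; the paper's point is that including step distance as a covariate helps absorb bias introduced by how these controls are sampled.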
Design and in vivo evaluation of more efficient and selective deep brain stimulation electrodes
NASA Astrophysics Data System (ADS)
Howell, Bryan; Huynh, Brian; Grill, Warren M.
2015-08-01
Objective. Deep brain stimulation (DBS) is an effective treatment for movement disorders and a promising therapy for treating epilepsy and psychiatric disorders. Despite its clinical success, the efficiency and selectivity of DBS can be improved. Our objective was to design electrode geometries that increased the efficiency and selectivity of DBS. Approach. We coupled computational models of electrodes in brain tissue with cable models of axons of passage (AOPs), terminating axons (TAs), and local neurons (LNs); we used engineering optimization to design electrodes for stimulating these neural elements; and the model predictions were tested in vivo. Main results. Compared with the standard electrode used in the Medtronic Model 3387 and 3389 arrays, model-optimized electrodes consumed 45-84% less power. Similar gains in selectivity were evident with the optimized electrodes: 50% of parallel AOPs could be activated while reducing activation of perpendicular AOPs from 44 to 48% with the standard electrode to 0-14% with bipolar designs; 50% of perpendicular AOPs could be activated while reducing activation of parallel AOPs from 53 to 55% with the standard electrode to 1-5% with an array of cathodes; and, 50% of TAs could be activated while reducing activation of AOPs from 43 to 100% with the standard electrode to 2-15% with a distal anode. In vivo, both the geometry and polarity of the electrode had a profound impact on the efficiency and selectivity of stimulation. Significance. Model-based design is a powerful tool that can be used to improve the efficiency and selectivity of DBS electrodes.
NASA Lunar and Planetary Mapping and Modeling
NASA Astrophysics Data System (ADS)
Day, B. H.; Law, E.
2016-12-01
NASA's Lunar and Planetary Mapping and Modeling Portals provide web-based suites of interactive visualization and analysis tools to enable mission planners, planetary scientists, students, and the general public to access mapped lunar data products from past and current missions for the Moon, Mars, and Vesta. New portals for additional planetary bodies are being planned. This presentation will recap significant enhancements to these toolsets during the past year and look forward to the results of the exciting work currently being undertaken. Additional data products and tools continue to be added to the Lunar Mapping and Modeling Portal (LMMP). These include both generalized products as well as polar data products specifically targeting potential sites for the Resource Prospector mission. Current development work on LMMP also includes facilitating mission planning and data management for lunar CubeSat missions, and working with the NASA Astromaterials Acquisition and Curation Office's Lunar Apollo Sample database in order to help better visualize the geographic contexts from which samples were retrieved. A new user interface provides, among other improvements, significantly enhanced 3D visualizations and navigation. Mars Trek, the project's Mars portal, has now been assigned by NASA's Planetary Science Division to support site selection and analysis for the Mars 2020 Rover mission as well as for the Mars Human Landing Exploration Zone Sites. This effort is concentrating on enhancing Mars Trek with data products and analysis tools specifically requested by the proposing teams for the various sites. Also being given very high priority by NASA Headquarters is Mars Trek's use as a means to directly involve the public in these upcoming missions, letting them explore the areas the agency is focusing upon, understand what makes these sites so fascinating, follow the selection process, and get caught up in the excitement of exploring Mars. The portals also serve as outstanding resources for education and outreach. As such, they have been designated by NASA's Science Mission Directorate as key supporting infrastructure for the new education programs selected through the division's recent CAN.
Development and validation of a general purpose linearization program for rigid aircraft models
NASA Technical Reports Server (NTRS)
Duke, E. L.; Antoniewicz, R. F.
1985-01-01
A FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft models is discussed. The program, LINEAR, numerically determines a linear systems model using nonlinear equations of motion and a user-supplied, nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model. Also included in the report is a comparison of linear and nonlinear models for a high-performance aircraft.
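The numerical linearization performed by a program like LINEAR can be illustrated with a small central-difference Jacobian. The toy dynamics below (a damped pendulum with a torque input) stand in for a user-supplied nonlinear model and are not taken from the report; the sketch is in Python rather than FORTRAN for brevity.

```python
import numpy as np

def f(x, u):
    """Toy nonlinear dynamics: damped pendulum with torque input u."""
    theta, omega = x
    return np.array([omega, -9.81 * np.sin(theta) - 0.1 * omega + u[0]])

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference A = df/dx and B = df/du about the operating point (x0, u0)."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        A[:, i] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m)
        du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

A, B = linearize(f, x0=np.array([0.0, 0.0]), u0=np.array([0.0]))
print("A =\n", A.round(3), "\nB =\n", B.round(3))
```

The resulting state-space matrices (A, B) are the linear systems model about the chosen trim point; an observation equation would be linearized the same way.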
A lithospheric magnetic field model derived from the Swarm satellite magnetic field measurements
NASA Astrophysics Data System (ADS)
Hulot, G.; Thebault, E.; Vigneron, P.
2015-12-01
The Swarm constellation of satellites was launched in November 2013 and has since delivered high-quality scalar and vector magnetic field measurements. A consortium of several research institutions was selected by the European Space Agency (ESA) to provide a number of scientific products which will be made available to the scientific community. Within this framework, specific tools were tailor-made to better extract the magnetic signal emanating from the Earth's lithosphere. These tools rely on the scalar gradient measured by the lower pair of Swarm satellites and on a regional modeling scheme that is more sensitive to small spatial scales and weak signals than standard spherical harmonic modeling. In this presentation, we report on various activities related to data analysis and processing. We assess the efficiency of this dedicated chain for modeling the lithospheric magnetic field using more than one year of measurements, and finally discuss refinements that are continuously being implemented in order to further improve the robustness and the spatial resolution of the lithospheric field model.
Dashboard systems: implementing pharmacometrics from bench to bedside.
Mould, Diane R; Upton, Richard N; Wojciechowski, Jessica
2014-09-01
In recent years, there has been increasing interest in the development of medical decision-support tools, including dashboard systems. Dashboard systems are software packages that integrate information and calculations about therapeutics from multiple components into a single interface for use in the clinical environment. Given the high cost of medical care, and the increasing need to demonstrate positive clinical outcomes for reimbursement, dashboard systems may become an important tool for improving patient outcomes, improving clinical efficiency and containing healthcare costs. Similarly, the costs associated with drug development are also rising. The use of model-based drug development (MBDD) has been proposed as a tool to streamline this process, facilitating the selection of appropriate doses and making informed go/no-go decisions. However, complete implementation of MBDD has not always been successful owing to a variety of factors, including the resources required to provide timely modeling and simulation updates. The application of dashboard systems in drug development reduces the resource requirement and may expedite updating models as new data are collected, allowing modeling results to be available in a timely fashion. In this paper, we present some background information on dashboard systems and propose the use of these systems both in the clinic and during drug development.
Friction Stir Welding at MSFC: Kinematics
NASA Technical Reports Server (NTRS)
Nunes, A. C., Jr.
2001-01-01
In 1991 The Welding Institute of the United Kingdom patented the Friction Stir Welding (FSW) process. In FSW a rotating pin-tool is inserted into a weld seam and literally stirs the faying surfaces together as it moves up the seam. By April 2000 the American Welding Society International Welding and Fabricating Exposition featured several exhibits of commercial FSW processes and the 81st Annual Convention devoted a technical session to the process. The FSW process is of interest to Marshall Space Flight Center (MSFC) as a means of avoiding hot-cracking problems presented by the 2195 aluminum-lithium alloy, which is the primary constituent of the Lightweight Space Shuttle External Tank. The process has been under development at MSFC for External Tank applications since the early 1990's. Early development of the FSW process proceeded by cut-and-try empirical methods. A substantial and complex body of data resulted. A theoretical model was needed to deal with the complexity and reduce the data to concepts serviceable for process diagnostics, optimization, parameter selection, etc. A first step in understanding the FSW process is to determine the kinematics, i.e., the flow field in the metal in the vicinity of the pin-tool. Given the kinematics, the dynamics, i.e., the forces, can then be addressed. Given a completed model of the FSW process, attempts at rational design of tools and selection of process parameters can be made.
Canut Blasco, Andrés; Aguilar Alfaro, Lorenzo; Cobo Reinoso, Javier; Giménez Mestre, M José; Rodríguez-Gascón, Alicia
2015-01-01
The selection of multiresistant microorganisms, as a side-effect of the use of antimicrobials, together with the lack of new therapeutic drugs expected in the near future, compels a rational use of antibiotics. The optimisation of antibacterial treatments based on pharmacokinetic/pharmacodynamic (PK/PD) analysis may help prolong the useful life of antibiotics and contain bacterial resistance to them. A review is made of the importance of the appropriateness of the selected dose regimen, the application of PK/PD analysis of antimicrobials, Monte Carlo simulation, PK/PD indices for efficacy, and PK/PD cut-off points. PK/PD analysis is also applicable to the prevention of bacterial resistance. Different methods have been used to study the factors that lead to its emergence and spread, such as in vitro and animal models, and resistance prevention studies (mutant selection window). Although PK/PD analysis is a very useful tool for the selection of the most appropriate dose regimen of antibiotics, several problems limit its use in clinical practice. Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
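A hedged sketch of the kind of Monte Carlo probability-of-target-attainment calculation the review refers to, for an AUC/MIC-driven antibiotic: sample patient clearances from an assumed population distribution, compute AUC24/MIC for a given dose and MIC, and report the fraction of patients reaching a target index. All pharmacokinetic numbers and the target value are illustrative, not taken from the article.

import numpy as np

def target_attainment(daily_dose_mg, mic_mg_L, target=125, n=10000, rng=None):
    rng = rng or np.random.default_rng(0)
    # log-normal clearance distribution (L/h); illustrative population values
    clearance = rng.lognormal(mean=np.log(8.0), sigma=0.3, size=n)
    auc24 = daily_dose_mg / clearance       # mg*h/L over 24 h for the daily dose
    return np.mean(auc24 / mic_mg_L >= target)

# e.g. probability of attaining AUC24/MIC >= 125 for 1000 mg/day at MIC = 1 mg/L
pta = target_attainment(1000, 1.0)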
Reduction in training time of a deep learning model in detection of lesions in CT
NASA Astrophysics Data System (ADS)
Makkinejad, Nazanin; Tajbakhsh, Nima; Zarshenas, Amin; Khokhar, Ashfaq; Suzuki, Kenji
2018-02-01
Deep learning (DL) emerged as a powerful tool for object detection and classification in medical images. Building a well-performing DL model, however, requires a huge number of images for training, and it takes days to train a DL model even on a cutting-edge high-performance computing platform. This study is aimed at developing a method for selecting a "small" number of representative samples from a large collection of training samples to train a DL model that could be used to detect polyps in CT colonography (CTC), without compromising the classification performance. Our proposed method for representative sample selection (RSS) consists of a K-means clustering algorithm. For the performance evaluation, we applied the proposed method to select samples for the training of a massive-training artificial neural network based DL model, to be used for the classification of polyps and non-polyps in CTC. Our results show that the proposed method reduces the training time by a factor of 15, while maintaining classification performance equivalent to the model trained using the full training set. We compare the performance using the area under the receiver-operating-characteristic curve (AUC).
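A rough sketch of how K-means-based representative sample selection might look: cluster the training feature vectors and keep only the samples nearest to each centroid, then train the detector on that subset. The variable names (features, n_representatives) are illustrative; the paper's exact selection criterion is not reproduced here.

import numpy as np
from sklearn.cluster import KMeans

def select_representatives(features, n_representatives, random_state=0):
    km = KMeans(n_clusters=n_representatives, random_state=random_state, n_init=10)
    km.fit(features)
    # index of the sample closest to each cluster centre
    chosen = [int(np.argmin(np.linalg.norm(features - c, axis=1)))
              for c in km.cluster_centers_]
    return sorted(set(chosen))

# Usage: train on features[idx] instead of the full set
# idx = select_representatives(features, n_representatives=5000)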
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina
2016-01-01
Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.
Hafen, G M; Hurst, C; Yearwood, J; Smith, J; Dzalilov, Z; Robinson, P J
2008-10-05
Cystic fibrosis is the most common fatal genetic disorder in the Caucasian population. Scoring systems for assessment of Cystic fibrosis disease severity have been used for almost 50 years, without being adapted to the milder phenotype of the disease in the 21st century. The aim of the current project is to develop a new scoring system using a database and employing various statistical tools. This study protocol reports the development of the statistical tools in order to create such a scoring system. The evaluation is based on the Cystic Fibrosis database from the cohort at the Royal Children's Hospital in Melbourne. Initially, unsupervised clustering of all data records was performed using a range of clustering algorithms. In particular, incremental clustering algorithms were used. The clusters obtained were characterised using rules from decision trees and the results examined by clinicians. In order to obtain a clearer definition of classes, expert opinion of each individual's clinical severity was sought. After data preparation, including expert opinion of an individual's clinical severity on a 3-point scale (mild, moderate and severe disease), two multivariate techniques were used throughout the analysis to establish a method that would have better success in feature selection and model derivation: 'Canonical Analysis of Principal Coordinates' and 'Linear Discriminant Analysis'. A 3-step procedure was performed with (1) selection of features, (2) extracting 5 severity classes out of the 3 severity classes defined by expert opinion and (3) establishment of calibration datasets. (1) Feature selection: CAP has a more effective "modelling" focus than DA. (2) Extraction of 5 severity classes: after variables were identified as important in discriminating contiguous CF severity groups on the 3-point scale as mild/moderate and moderate/severe, Discriminant Function (DF) was used to determine the new groups: mild, intermediate moderate, moderate, intermediate severe and severe disease. (3) Generated confusion tables showed a misclassification rate of 19.1% for males and 16.5% for females, with a majority of misallocations into adjacent severity classes, particularly for males. Our preliminary data show that using CAP for feature selection and Linear DA to derive the actual model in a CF database might be helpful in developing a scoring system. However, there are several limitations; in particular, more data entry points are needed to finalize a score, and the statistical tools have to be further refined and validated by re-running the statistical methods on the larger dataset.
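An illustrative sketch only of the discriminant-analysis half of such a workflow (Canonical Analysis of Principal Coordinates is not available in scikit-learn, so only LDA is shown): fit on expert-labelled severity classes and inspect the misclassification rate from a cross-validated confusion matrix. Variable names (X, severity_labels) are assumptions.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

def severity_confusion(X, severity_labels):
    lda = LinearDiscriminantAnalysis()
    pred = cross_val_predict(lda, X, severity_labels, cv=5)
    cm = confusion_matrix(severity_labels, pred)
    misclassification_rate = 1.0 - np.trace(cm) / cm.sum()
    return cm, misclassification_rate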
Osteoporosis risk prediction using machine learning and conventional methods.
Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won
2013-01-01
A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared them with a conventional clinical decision tool, the osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) using various predictors associated with low bone density. The learning models were compared with OST. SVM had a significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We were the first to compare osteoporosis prediction performance between machine learning and conventional methods using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
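A hedged sketch of the comparison the study describes: fit SVM, random forest, neural network and logistic regression classifiers on the same predictors and compare held-out AUCs. The feature matrix X and binary labels y are placeholders, not the KNHANES data, and the hyperparameters are illustrative defaults.

from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

def compare_models(X, y):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              stratify=y, random_state=0)
    models = {
        "SVM": SVC(probability=True),
        "RF": RandomForestClassifier(n_estimators=300),
        "ANN": MLPClassifier(max_iter=1000),
        "LR": LogisticRegression(max_iter=1000),
    }
    # AUC of each model on the held-out test set
    return {name: roc_auc_score(y_te, m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
            for name, m in models.items()}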
Microsatellites as targets of natural selection.
Haasl, Ryan J; Payseur, Bret A
2013-02-01
The ability to survey polymorphism on a genomic scale has enabled genome-wide scans for the targets of natural selection. Theory that connects patterns of genetic variation to evidence of natural selection most often assumes a diallelic locus and no recurrent mutation. Although these assumptions are suitable to selection that targets single nucleotide variants, fundamentally different types of mutation generate abundant polymorphism in genomes. Moreover, recent empirical results suggest that mutationally complex, multiallelic loci including microsatellites and copy number variants are sometimes targeted by natural selection. Given their abundance, the lack of inference methods tailored to the mutational peculiarities of these types of loci represents a notable gap in our ability to interrogate genomes for signatures of natural selection. Previous theoretical investigations of mutation-selection balance at multiallelic loci include assumptions that limit their application to inference from empirical data. Focusing on microsatellites, we assess the dynamics and population-level consequences of selection targeting mutationally complex variants. We develop general models of a multiallelic fitness surface, a realistic model of microsatellite mutation, and an efficient simulation algorithm. Using these tools, we explore mutation-selection-drift equilibrium at microsatellites and investigate the mutational history and selective regime of the microsatellite that causes Friedreich's ataxia. We characterize microsatellite selective events by their duration and cost, note similarities to sweeps from standing point variation, and conclude that it is premature to label microsatellites as ubiquitous agents of efficient adaptive change. Together, our models and simulation algorithm provide a powerful framework for statistical inference, which can be used to test the neutrality of microsatellites and other multiallelic variants.
Primate archaeology reveals cultural transmission in wild chimpanzees (Pan troglodytes verus).
Luncz, Lydia V; Wittig, Roman M; Boesch, Christophe
2015-11-19
Recovering evidence of past human activities enables us to recreate behaviour where direct observations are missing. Here, we apply archaeological methods to further investigate cultural transmission processes in percussive tool use among neighbouring chimpanzee communities in the Taï National Park, Côte d'Ivoire, West Africa. Differences in the selection of nut-cracking tools between neighbouring groups are maintained over time, despite frequent female transfer, which leads to persistent cultural diversity between chimpanzee groups. Through the recovery of used tools in the suggested natal territory of immigrants, we have been able to reconstruct the tool material selection of females prior to migration. In combination with direct observations of tool selection of local residents and immigrants after migration, we uncovered temporal changes in tool selection for immigrating females. After controlling for ecological differences between territories of immigrants and residents our data suggest that immigrants abandoned their previous tool preference and adopted the pattern of their new community, despite previous personal proficiency of the same foraging task. Our study adds to the growing body of knowledge on the importance of conformist tendencies in animals. © 2015 The Author(s).
NASA Technical Reports Server (NTRS)
Seldner, K.
1976-01-01
The development of control systems for jet engines requires a real-time computer simulation. The simulation provides an effective tool for evaluating control concepts and problem areas prior to actual engine testing. The development and use of a real-time simulation of the Pratt and Whitney F100-PW100 turbofan engine are described. The simulation was used in a multi-variable optimal controls research program using linear quadratic regulator theory. The simulation is used to generate linear engine models at selected operating points and evaluate the control algorithm. To reduce the complexity of the design, it is desirable to reduce the order of the linear model. A technique to reduce the order of the model is discussed. Selected results from high- and low-order models are compared. The LQR control algorithms can be programmed on a digital computer. This computer will control the engine simulation over the desired flight envelope.
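A minimal sketch, assuming a linear model (A, B) has already been extracted from the simulation at an operating point: solve the continuous-time algebraic Riccati equation and form the LQR state-feedback gain K, so the control law is u = -K x. The weighting matrices Q and R are design choices, not values from the report.

import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    P = solve_continuous_are(A, B, Q, R)   # Riccati solution
    K = np.linalg.solve(R, B.T @ P)        # K = R^{-1} B^T P
    return K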
Crucial nesting habitat for gunnison sage-grouse: A spatially explicit hierarchical approach
Aldridge, Cameron L.; Saher, D.J.; Childers, T.M.; Stahlnecker, K.E.; Bowen, Z.H.
2012-01-01
Gunnison sage-grouse (Centrocercus minimus) is a species of special concern and is currently considered a candidate species under the Endangered Species Act. Careful management is therefore required to ensure that suitable habitat is maintained, particularly because much of the species' current distribution is faced with exurban development pressures. We assessed hierarchical nest site selection patterns of Gunnison sage-grouse inhabiting the western portion of the Gunnison Basin, Colorado, USA, at multiple spatial scales, using logistic regression-based resource selection functions. Models were selected using Akaike Information Criterion corrected for small sample sizes (AICc) and predictive surfaces were generated using model-averaged relative probabilities. Landscape-scale factors that had the most influence on nest site selection included the proportion of sagebrush cover >5%, mean productivity, and density of 2-wheel-drive roads. The landscape-scale predictive surface captured 97% of known Gunnison sage-grouse nests within the top 5 of 10 prediction bins, implicating 57% of the basin as crucial nesting habitat. Crucial habitat identified by the landscape model was used to define the extent for patch-scale modeling efforts. Patch-scale variables that had the greatest influence on nest site selection were the proportion of big sagebrush cover >10%, distance to residential development, distance to high volume paved roads, and mean productivity. This model accurately predicted independent nest locations. The unique hierarchical structure of our models more accurately captures the nested nature of habitat selection, and allowed for increased discrimination within larger landscapes of suitable habitat. We extrapolated the landscape-scale model to the entire Gunnison Basin because of conservation concerns for this species. We believe this predictive surface is a valuable tool which can be incorporated into land use and conservation planning as well as the assessment of future land-use scenarios. © 2011 The Wildlife Society.
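A hedged sketch of the model-selection and averaging step described above: compute AICc for a set of fitted candidate models, convert to Akaike weights, and model-average the predicted relative probabilities. The inputs (log-likelihoods, parameter counts, per-model prediction surfaces) are placeholders, not the study's data.

import numpy as np

def aicc(log_lik, k, n):
    """AIC corrected for small sample size n with k estimated parameters."""
    return -2 * log_lik + 2 * k + (2 * k * (k + 1)) / (n - k - 1)

def akaike_weights(aicc_values):
    d = np.asarray(aicc_values) - np.min(aicc_values)   # delta-AICc
    w = np.exp(-0.5 * d)
    return w / w.sum()

def model_average(predictions, weights):
    """Weighted average of per-model predicted surfaces (same shape per model)."""
    return np.tensordot(weights, np.asarray(predictions), axes=1)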
User’s Guide for the SAS (Stand-Off Attack Simulation) Computer Model.
1982-01-15
SAS is an effective survivability and security system design tool which allows an analyst to compare the relative effectiveness of selected ... mounted against other systems during uploading for dispersal or for non-emergency relocation. GLCM and LANCE must be mobilized and formed into convoys.
Drum ring removal/installation tool
Andrade, William Andrew [Livermore, CA
2006-11-14
A handheld tool, or a pair of such tools, such as for use in removing/installing a bolt-type clamping ring on a container barrel/drum, where the clamping ring has a pair of clamping ends each with a throughbore. Each tool has an elongated handle and an elongated lever arm transversely connected to one end of the handle. The lever arm is capable of being inserted into the throughbore of a selected clamping end and leveraged with the handle to exert a first moment on the selected clamping end. Each tool also has a second lever arm, such as a socket with an open-ended slot, which is suspended alongside the first lever arm. The second lever arm is capable of engaging the selected clamping end and being leveraged with the handle to exert a second moment which is orthogonal to the first moment. In this manner, the first and second moments operate to hold the selected clamping end fixed relative to the tool so that the selected clamping end may be controlled with the handle. The pair of clamping ends may also be simultaneously and independently controlled with the use of two handles/tools so as to contort the geometry of the drum clamping ring and enable its removal/installation.
Federated Search Tools in Fusion Centers: Bridging Databases in the Information Sharing Environment
2012-09-01
considerable variation in how fusion centers plan for, gather requirements, select and acquire federated search tools to bridge disparate databases...centers, when considering integrating federated search tools; by evaluating the importance of the planning, requirements gathering, selection and...acquisition processes for integrating federated search tools; by acknowledging the challenges faced by some fusion centers during these integration processes
Kim, Mincheol; Jang, Yong-Chul; Lee, Seunguk
2013-10-15
The management of waste electrical and electronic equipment (WEEE) or electronic waste (e-waste) has become a major issue of concern for solid waste communities due to the large volumes of waste being generated from the consumption of modern electrical and electronic products. In 2003, Korea introduced the extended producer responsibility (EPR) system to reduce the amount of electronic products to be disposed of and to promote resource recovery from WEEE. The EPR currently regulates a total of 10 electrical and electronic products. This paper presents the results of the application of the Delphi method and analytical hierarchy process (AHP) modeling to the WEEE management tool in the policy-making process. Specifically, this paper focuses on the application of the Delphi-AHP technique to determine the WEEE priority to be included in the EPR system. Appropriate evaluation criteria were derived using the Delphi method to assess the potential selection and priority among electrical and electronic products that will be regulated by the EPR system. Quantitative weightings from the AHP model were calculated to identify the priorities of electrical and electronic products to be potentially regulated. After applying all the criteria using the AHP model, the results indicate that the top 10 target recycling products for the expansion of the WEEE list are vacuum cleaners, electric fans, rice cookers, large freezers, microwave ovens, water purifiers, air purifiers, humidifiers, dryers, and telephones, in order from first to last. The proposed Delphi-AHP method can offer a more efficient means of selecting WEEE than subjective assessment methods that are often based on professional judgment or limited available data. By providing WEEE items to be regulated, the proposed Delphi-AHP method can eliminate uncertainty and subjective assessment and enable WEEE management policy-makers to identify the priority of potential WEEE. More generally, the work performed in this study is an example of how Delphi-AHP modeling can be used as a decision-making process tool in WEEE management. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
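A hedged sketch of the AHP weighting step: derive priority weights from a pairwise comparison matrix via its principal eigenvector and check the consistency ratio. The matrix supplied by the caller is illustrative; the study's actual comparison data are not reproduced here.

import numpy as np

RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise):
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                  # normalised priority weights
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    cr = ci / RANDOM_INDEX.get(n, 1.49)           # consistency ratio (<0.1 acceptable)
    return w, cr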
González-Madroño, A; Mancha, A; Rodríguez, F J; Culebras, J; de Ulibarri, J I
2012-01-01
To ratify previous validations of the CONUT nutritional screening tool through the development of two probabilistic models using the parameters included in the CONUT, and to see if the CONUT's effectiveness could be improved. This is a two-step prospective study. In Step 1, 101 patients were randomly selected, and the SGA and CONUT were applied. With the data obtained, an unconditional logistic regression model was developed, and two variants of the CONUT were constructed: Model 1 was built by a method of logistic regression. Model 2 was built by dividing the probabilities of undernutrition obtained in Model 1 into seven regular intervals. In Step 2, 60 patients were selected and underwent the SGA, the original CONUT and the newly developed models. The diagnostic efficacy of the original CONUT and the new models was tested by means of ROC curves. Samples 1 and 2 were then pooled to measure the degree of agreement between the original CONUT and the SGA, and diagnostic efficacy parameters were calculated. No statistically significant differences were found between samples 1 and 2 regarding age, sex and medical/surgical distribution, and undernutrition rates were similar (over 40%). The AUCs for the ROC curves were 0.862 for the original CONUT, and 0.839 and 0.874 for Models 1 and 2 respectively. The kappa index for the CONUT and SGA was 0.680. The CONUT, with the original scores assigned by the authors, performs as well as the mathematical models and is thus a valuable, highly useful and efficient tool for clinical undernutrition screening.
Sequential Sampling Models in Cognitive Neuroscience: Advantages, Applications, and Extensions.
Forstmann, B U; Ratcliff, R; Wagenmakers, E-J
2016-01-01
Sequential sampling models assume that people make speeded decisions by gradually accumulating noisy information until a threshold of evidence is reached. In cognitive science, one such model--the diffusion decision model--is now regularly used to decompose task performance into underlying processes such as the quality of information processing, response caution, and a priori bias. In the cognitive neurosciences, the diffusion decision model has recently been adopted as a quantitative tool to study the neural basis of decision making under time pressure. We present a selective overview of several recent applications and extensions of the diffusion decision model in the cognitive neurosciences.
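A hedged sketch of the core idea: simulate single speeded decisions with a diffusion decision model by accumulating noisy evidence (drift v, noise s) until one of two bounds (0 or a) is crossed, returning a choice and a response time. Parameter names follow common DDM notation, but the values are purely illustrative.

import numpy as np

def simulate_ddm(v=0.3, a=1.0, z=0.5, s=1.0, dt=1e-3, t0=0.3, rng=None):
    rng = rng or np.random.default_rng()
    x, t = z * a, 0.0                      # start at relative bias z
    while 0.0 < x < a:
        x += v * dt + s * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("upper" if x >= a else "lower"), t + t0   # t0 = non-decision time

# Usage: choices, rts = zip(*(simulate_ddm() for _ in range(1000)))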
Interactive planning of miniplates
NASA Astrophysics Data System (ADS)
Gall, Markus; Reinbacher, Knut; Wallner, Jürgen; Stanzel, Jan; Chen, Xiaojun; Schwenzer-Zimmerer, Katja; Schmalstieg, Dieter; Egger, Jan
2017-03-01
In this contribution, a novel method for computer aided surgery planning of facial defects by using models of purchasable MedArtis Modus 2.0 miniplates is proposed. Implants of this kind, which belong to the class of osteosynthetic materials, are commonly used for treating defects in the facial area. By placing them perpendicular on the defect, the miniplates are fixed on the healthy bone, bent with respect to the surface, to stabilize the defective area. Our software is able to fit a selection of the most common implant models to the surgeon's desired position in a 3D computer model. The fitting respects the local surface curvature and adjusts direction and position in any desired way. Conventional methods use Computed Tomography (CT) scans to generate STereoLithography (STL) models serving as bending templates for the implants, or use a bending tool during the surgery for readjusting the implant several times. Both approaches cost an undesirable amount of time. With our visual planning tool, surgeons are able to pre-plan the final implant within just a few minutes. The resulting model can be stored in STL format, which is the commonly used format for 3D printing. With this technology, surgeons are able to print the implant just in time or use it for generating a bending tool, both leading to an exactly bent miniplate.
Linear versus quadratic portfolio optimization model with transaction cost
NASA Astrophysics Data System (ADS)
Razak, Norhidayah Bt Ab; Kamil, Karmila Hanim; Elias, Siti Masitah
2014-06-01
Optimization models have become one of the decision-making tools in investment. Hence, it is always a big challenge for investors to select the model that best fulfils their goal in investment with respect to risk and return. In this paper we aim to discuss and compare the portfolio allocation and performance generated by quadratic and linear portfolio optimization models, namely the Markowitz and Maximin models respectively. The application of these models has proven to be significant and popular. However, transaction cost is one of the important aspects that should be considered in portfolio reallocation, as portfolio return can be significantly reduced when transaction cost is taken into consideration. Therefore, recognizing the importance of considering transaction costs when calculating portfolio returns, we use data from Shariah-compliant securities listed in Bursa Malaysia. It is expected that the results of this paper will justify the advantage of one model over the other and shed some light on the quest to find the best decision-making tools in investment for individual investors.
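A hedged sketch of the quadratic (Markowitz-style) side of such a comparison, with a proportional transaction cost deducted from expected return; it is not the paper's formulation, and all inputs (mu, cov, w_prev, cost_rate, target) are illustrative placeholders. The linear Maximin counterpart would instead maximise the worst-case scenario return, typically via linear programming (e.g. scipy.optimize.linprog).

import numpy as np
from scipy.optimize import minimize

def rebalance(mu, cov, w_prev, cost_rate=0.005, target=0.001):
    n = len(mu)
    def variance(w):
        return w @ cov @ w
    cons = (
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},       # fully invested
        # net expected return after transaction costs must reach the target
        {"type": "ineq",
         "fun": lambda w: w @ mu - cost_rate * np.abs(w - w_prev).sum() - target},
    )
    res = minimize(variance, w_prev, bounds=[(0, 1)] * n, constraints=cons)
    return res.x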
Selection, calibration, and validation of models of tumor growth.
Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C
2016-11-01
This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous, macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory animals while demonstrating successful implementations of OPAL.
Generating DEM from LIDAR data - comparison of available software tools
NASA Astrophysics Data System (ADS)
Korzeniowska, K.; Lacka, M.
2011-12-01
In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
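A hedged sketch of the raster comparison described above: difference a generated DEM against the reference DEM and report the minimum, maximum and mean differences and the RMSE, ignoring no-data cells. The array names are placeholders for two grids of identical shape.

import numpy as np

def dem_stats(dem, reference):
    diff = np.asarray(dem, float) - np.asarray(reference, float)
    diff = diff[np.isfinite(diff)]                # drop NaN / no-data cells
    rmse = np.sqrt(np.mean(diff ** 2))
    return {"min": diff.min(), "max": diff.max(),
            "mean": diff.mean(), "rmse": rmse}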
Perini, Francesca; Caramazza, Alfonso; Peelen, Marius V.
2014-01-01
Functional neuroimaging studies have implicated the left lateral occipitotemporal cortex (LOTC) in both tool and hand perception but the functional role of this region is not fully known. Here, by using a task manipulation, we tested whether tool-/hand-selective LOTC contributes to the discrimination of tool-associated hand actions. Participants viewed briefly presented pictures of kitchen and garage tools while they performed one of two tasks: in the action task, they judged whether the tool is associated with a hand rotation action (e.g., screwdriver) or a hand squeeze action (e.g., garlic press), while in the location task they judged whether the tool is typically found in the kitchen (e.g., garlic press) or in the garage (e.g., screwdriver). Both tasks were performed on the same stimulus set and were matched for difficulty. Contrasting fMRI responses between these tasks showed stronger activity during the action task than the location task in both tool- and hand-selective LOTC regions, which closely overlapped. No differences were found in nearby object- and motion-selective control regions. Importantly, these findings were confirmed by a TMS study, which showed that effective TMS over the tool-/hand-selective LOTC region significantly slowed responses for tool action discriminations relative to tool location discriminations, with no such difference during sham TMS. We conclude that left LOTC contributes to the discrimination of tool-associated hand actions.
Bosslet, Gabriel T; Carlos, W Graham; Tybor, David J; McCallister, Jennifer; Huebert, Candace; Henderson, Ashley; Miles, Matthew C; Twigg, Homer; Sears, Catherine R; Brown, Cynthia; Farber, Mark O; Lahm, Tim; Buckley, John D
2017-04-01
Few data have been published regarding scoring tools for selection of postgraduate medical trainee candidates that have wide applicability. The authors present a novel scoring tool developed to assist postgraduate programs in generating an institution-specific rank list derived from selected elements of the U.S. Electronic Residency Application System (ERAS) application. The authors developed and validated an ERAS and interview day scoring tool at five pulmonary and critical care fellowship programs: the ERAS Application Scoring Tool-Interview Scoring Tool. This scoring tool was then tested for intrarater correlation versus subjective rankings of ERAS applications. The process for development of the tool was performed at four other institutions, and it was performed alongside and compared with the "traditional" ranking methods at the five programs and compared with the submitted National Residency Match Program rank list. The ERAS Application Scoring Tool correlated highly with subjective faculty rankings at the primary institution (average Spearman's r = 0.77). The ERAS Application Scoring Tool-Interview Scoring Tool method correlated well with traditional ranking methodology at all five institutions (Spearman's r = 0.54, 0.65, 0.72, 0.77, and 0.84). This study validates a process for selecting and weighting components of the ERAS application and interview day to create a customizable, institution-specific tool for ranking candidates to postgraduate medical education programs. This scoring system can be used in future studies to compare the outcomes of fellowship training.
NASA Astrophysics Data System (ADS)
Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Moitrier, Nicolas; Balesdent, Jérome; bruckler, Laurent; Moitrier, Nathalie; Nouguier, Cédric; Richard, Guy
2014-05-01
Models describing the soil functioning are valuable tools for addressing challenging issues related to agricultural production, soil protection or biogeochemical cycles. Coupling models that address different scientific fields is actually required in order to develop numerical tools able to simulate the complex interactions and feed-backs occurring within a soil profile in interaction with climate and human activities. We present here a component-based modelling platform named "VSoil", which aims at designing, developing, implementing and coupling numerical representations of biogeochemical and physical processes in soil, from the aggregate to the profile scales. The platform consists of four applications: i) Vsoil_Processes, dedicated to the conceptual description of processes and of their inputs and outputs, ii) Vsoil_Modules, devoted to the development of numerical representations of elementary processes as modules, iii) Vsoil_Models, which permits the coupling of modules to create models, and iv) Vsoil_Player, for running the model and performing a primary analysis of results. The platform is designed to be a collaborative tool, helping scientists to share not only their models, but also the scientific knowledge on which the models are built. The platform is based on the idea that processes of any kind can be described and characterized by their inputs (state variables required) and their outputs. The links between the processes are automatically detected by the platform software. For any process, several numerical representations (modules) can be developed and made available to platform users. When developing modules, the platform takes care of many aspects of the development task so that the user can focus on numerical calculations. Fortran2008 and C++ are the supported languages and existing codes can be easily incorporated into platform modules. Building a model from available modules simply requires selecting the processes to be accounted for and, for each process, a module. During this task, the platform displays available modules and checks the compatibility between the modules. The model (main program) is automatically created when compatible modules have been selected for all the processes. A GUI is automatically generated to help the user provide parameters and initial conditions. Numerical results can be immediately visualized, archived and exported. The platform also provides facilities to carry out sensitivity analysis. Parameter estimation and links with databases are being developed. The platform can be freely downloaded from the web site (http://www.inra.fr/sol_virtuel/) with a set of processes, variables, modules and models. However, it is designed so that any user can add their own components. These add-ons can be shared with co-workers by means of an e-mail-based export/import mechanism. The add-ons can also be made available to the whole community of platform users when the developers ask for it. A filtering tool is available to explore the content of the platform (processes, variables, modules, models).
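A hedged, much-simplified sketch of the coupling idea (this is not the VSoil API, which is Fortran/C++ based): each module declares the state variables it consumes and produces, and a model is accepted only if every input of every selected module is produced by another module or supplied as forcing data. All module names and variables below are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Module:
    name: str
    inputs: set = field(default_factory=set)
    outputs: set = field(default_factory=set)

def check_model(modules, forcings=frozenset()):
    produced = set(forcings)
    for m in modules:
        produced |= m.outputs
    missing = {(m.name, v) for m in modules for v in m.inputs if v not in produced}
    return not missing, missing

# Example: a water-flow module needs a conductivity that a retention-curve module provides
water = Module("water_flow", inputs={"hydraulic_conductivity", "rainfall"},
               outputs={"water_content"})
retention = Module("retention_curve", inputs={"water_content"},
                   outputs={"hydraulic_conductivity"})
ok, missing = check_model([water, retention], forcings={"rainfall"})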
Feng, Yao-Ze; Elmasry, Gamal; Sun, Da-Wen; Scannell, Amalia G M; Walsh, Des; Morcy, Noha
2013-06-01
Bacterial pathogens are the main culprits for outbreaks of food-borne illnesses. This study aimed to use the hyperspectral imaging technique as a non-destructive tool for quantitative and direct determination of Enterobacteriaceae loads on chicken fillets. Partial least squares regression (PLSR) models were established and the best model using full wavelengths was obtained in the spectral range 930-1450 nm, with coefficients of determination R2 ≥ 0.82 and root mean squared errors (RMSEs) ≤ 0.47 log10 CFU g(-1). In the further development of simplified models, second derivative spectra and weighted PLS regression coefficients (BW) were utilised to select important wavelengths. The three wavelengths (930, 1121 and 1345 nm) selected from BW performed well and were preferred for predicting Enterobacteriaceae loads, with R2 of 0.89, 0.86 and 0.87 and RMSEs of 0.33, 0.40 and 0.45 log10 CFU g(-1) for calibration, cross-validation and prediction, respectively. In addition, the constructed prediction map provided the distribution of Enterobacteriaceae bacteria on chicken fillets, which cannot be achieved by conventional methods. It was demonstrated that hyperspectral imaging is a potential tool for determining food sanitation and detecting bacterial pathogens on a food matrix without using complicated laboratory regimes. Copyright © 2012 Elsevier Ltd. All rights reserved.
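A hedged sketch of this modelling step: fit a PLS regression of log10 bacterial counts on the spectra, rank wavelengths by the absolute regression coefficients (a simple stand-in for the paper's BW weighting), and refit on the selected bands. The inputs (spectra, counts, wavelengths) are placeholder arrays, not the study's data.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

def pls_with_band_selection(spectra, counts, wavelengths, n_components=8, n_bands=3):
    full = PLSRegression(n_components=n_components).fit(spectra, counts)
    coef = np.abs(full.coef_).ravel()
    top = np.argsort(coef)[-n_bands:]                  # most influential bands
    simple = PLSRegression(n_components=min(n_components, n_bands)).fit(
        spectra[:, top], counts)
    rmse = np.sqrt(mean_squared_error(counts, simple.predict(spectra[:, top])))
    return np.asarray(wavelengths)[top], rmse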
Oilseed rape: learning about ancient and recent polyploid evolution from a recent crop species.
Mason, A S; Snowdon, R J
2016-11-01
Oilseed rape (Brassica napus) is one of our youngest crop species, arising several times under cultivation in the last few thousand years and completely unknown in the wild. Oilseed rape originated from hybridisation events between progenitor diploid species B. rapa and B. oleracea, both important vegetable species. The diploid progenitors are also ancient polyploids, with remnants of two previous polyploidisation events evident in the triplicated genome structure. This history of polyploid evolution and human agricultural selection makes B. napus an excellent model with which to investigate processes of genomic evolution and selection in polyploid crops. The ease of de novo interspecific hybridisation, responsiveness to tissue culture, and the close relationship of oilseed rape to the model plant Arabidopsis thaliana, coupled with the recent availability of reference genome sequences and suites of molecular cytogenetic and high-throughput genotyping tools, allow detailed dissection of genetic, genomic and phenotypic interactions in this crop. In this review we discuss the past and present uses of B. napus as a model for polyploid speciation and evolution in crop species, along with current and developing analysis tools and resources. We further outline unanswered questions that may now be tractable to investigation. © 2016 German Botanical Society and The Royal Botanical Society of the Netherlands.
A virtual reality browser for Space Station models
NASA Technical Reports Server (NTRS)
Goldsby, Michael; Pandya, Abhilash; Aldridge, Ann; Maida, James
1993-01-01
The Graphics Analysis Facility at NASA/JSC has created a visualization and learning tool by merging its database of detailed geometric models with a virtual reality system. The system allows an interactive walk-through of models of the Space Station and other structures, providing detailed realistic stereo images. The user can activate audio messages describing the function and connectivity of selected components within his field of view. This paper presents the issues and trade-offs involved in the implementation of the VR system and discusses its suitability for its intended purposes.
A multi-fidelity analysis selection method using a constrained discrete optimization formulation
NASA Astrophysics Data System (ADS)
Stults, Ian C.
The purpose of this research is to develop a method for selecting the fidelity of contributing analyses in computer simulations. Model uncertainty is a significant component of result validity, yet it is neglected in most conceptual design studies. When it is considered, it is done so in only a limited fashion, and therefore brings the validity of selections made based on these results into question. Neglecting model uncertainty can potentially cause costly redesigns of concepts later in the design process or can even cause program cancellation. Rather than neglecting it, if one were to instead not only realize the model uncertainty in tools being used but also use this information to select the tools for a contributing analysis, studies could be conducted more efficiently and trust in results could be quantified. Methods for performing this are generally not rigorous or traceable, and in many cases the improvement and additional time spent performing enhanced calculations are washed out by less accurate calculations performed downstream. The intent of this research is to resolve this issue by providing a method which will minimize the amount of time spent conducting computer simulations while meeting accuracy and concept resolution requirements for results. In many conceptual design programs, only limited data is available for quantifying model uncertainty. Because of this data sparsity, traditional probabilistic means for quantifying uncertainty should be reconsidered. This research proposes to instead quantify model uncertainty using an evidence theory formulation (also referred to as Dempster-Shafer theory) in lieu of the traditional probabilistic approach. Specific weaknesses in using evidence theory for quantifying model uncertainty are identified and addressed for the purposes of the Fidelity Selection Problem. A series of experiments was conducted to address these weaknesses using n-dimensional optimization test functions. These experiments found that model uncertainty present in analyses with 4 or fewer input variables could be effectively quantified using a strategic distribution creation method; if more than 4 input variables exist, a Frontier Finding Particle Swarm Optimization should instead be used. Once model uncertainty in contributing analysis code choices has been quantified, a selection method is required to determine which of these choices should be used in simulations. Because much of the selection done for engineering problems is driven by the physics of the problem, these are poor candidate problems for testing the true fitness of a candidate selection method. Specifically, the variability of moderate- and high-dimensional problems can often be reduced to only a few dimensions, and scalability often cannot be easily addressed. For these reasons a simple academic function was created for the uncertainty quantification, and a canonical form of the Fidelity Selection Problem (FSP) was created. Fifteen best- and worst-case scenarios were identified in an effort to challenge the candidate selection methods both with respect to the characteristics of the tradeoff between time cost and model uncertainty and with respect to the stringency of the constraints and problem dimensionality. The results from this experiment show that a Genetic Algorithm (GA) was able to consistently find the correct answer, but under certain circumstances, a discrete form of Particle Swarm Optimization (PSO) was able to find the correct answer more quickly.
To better illustrate how the uncertainty quantification and discrete optimization might be conducted for a "real world" problem, an illustrative example was conducted using gas turbine engines.
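A hedged toy sketch of a Fidelity Selection Problem of the kind described above: each contributing analysis offers several fidelity levels, each with a run-time cost and a model-uncertainty score; one level is picked per analysis to minimise total time subject to a cap on total uncertainty. Exhaustive search stands in for the GA/PSO studied in the thesis, and all numbers are invented for illustration.

from itertools import product

# (time_cost, uncertainty) per fidelity level, per contributing analysis
ANALYSES = {
    "aero":       [(1, 0.9), (5, 0.4), (20, 0.1)],
    "structures": [(2, 0.8), (8, 0.3)],
    "propulsion": [(1, 0.7), (4, 0.2)],
}

def select_fidelities(analyses, max_uncertainty=1.0):
    best, best_time = None, float("inf")
    names = list(analyses)
    for combo in product(*(analyses[n] for n in names)):
        time = sum(c[0] for c in combo)
        uncert = sum(c[1] for c in combo)
        if uncert <= max_uncertainty and time < best_time:
            best, best_time = dict(zip(names, combo)), time
    return best, best_time

choice, total_time = select_fidelities(ANALYSES, max_uncertainty=1.0)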
Shape selection in Landsat time series: A tool for monitoring forest dynamics
Gretchen G. Moisen; Mary C. Meyer; Todd A. Schroeder; Xiyue Liao; Karen G. Schleeweis; Elizabeth A. Freeman; Chris Toney
2016-01-01
We present a new methodology for fitting nonparametric shape-restricted regression splines to time series of Landsat imagery for the purpose of modeling, mapping, and monitoring annual forest disturbance dynamics over nearly three decades. For each pixel and spectral band or index of choice in temporal Landsat data, our method delivers a smoothed rendition of...
Training University Faculty To Integrate Hypermedia into the Teacher Training Curriculum.
ERIC Educational Resources Information Center
Tucker, S. A.; And Others
Funded under the Apple Model Program for the Integration of Computers in the Preparation of Educators, the University of South Alabama began a 3-year project in 1989 to train faculty in its College of Education to incorporate hypermedia into their curriculum. HyperCard was selected as a course presentation and development tool because of its…
ERIC Educational Resources Information Center
Padgett, Ryan D.; Salisbury, Mark H.; An, Brian P.; Pascarella, Ernest T.
2010-01-01
The sophisticated analytical techniques available to institutional researchers give them an array of procedures to estimate a causal effect using observational data. But as many quantitative researchers have discovered, access to a wider selection of statistical tools does not necessarily ensure construction of a better analytical model. Moreover,…
Environmentally responsible development of oil and gas assets in the United States is facilitated by advancement of sector-specific air pollution emission measurement and modeling tools. Emissions from upstream oil and gas production are complex in nature due to the variety of e...
I PASS: an interactive policy analysis simulation system.
Doug Olson; Con Schallau; Wilbur Maki
1984-01-01
This paper describes an interactive policy analysis simulation system (IPASS) that can be used to analyze the long-term economic and demographic effects of alternative forest resource management policies. The IPASS model is a dynamic analytical tool that forecasts growth and development of an economy. It allows the user to introduce changes in selected parameters based...
ERIC Educational Resources Information Center
Rowan, Noell L.; Gillette, Patricia D.; Faul, Anna C.; Yankeelov, Pamela A.; Borders, Kevin W.; Deck, Stacy; Nicholas, Lori D.; Wiegand, Mark
2009-01-01
With focus on interdisciplinary education models, social work and physical therapy faculty from two proximate universities partnered to create an evidence-based geriatric assessment and brief intervention research, training, and service project for community-dwelling older adults. Assessment tools and interventions were selected from the…
Quantifying falsifiability of scientific theories
NASA Astrophysics Data System (ADS)
Nemenman, Ilya
I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
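A hedged toy illustration of the point, not the author's calculation: compare a sharp, highly falsifiable hypothesis (a coin is fair) against a vague one (any bias allowed, uniform prior) via their marginal likelihoods for observed coin-flip data; the flexible model is automatically penalised by the Bayesian Occam's razor. The data values are invented.

from math import comb, lgamma, exp, log

def log_marginal_fair(k, n):            # P(data | p = 0.5)
    return log(comb(n, k)) + n * log(0.5)

def log_marginal_uniform(k, n):         # binomial likelihood integrated over p ~ Uniform(0,1)
    # beta-binomial with alpha = beta = 1: C(n,k) * B(k+1, n-k+1) = 1/(n+1)
    return log(comb(n, k)) + lgamma(k + 1) + lgamma(n - k + 1) - lgamma(n + 2)

k, n = 52, 100                          # 52 heads in 100 flips (illustrative)
bayes_factor = exp(log_marginal_fair(k, n) - log_marginal_uniform(k, n))
print(f"Bayes factor (fair vs. unconstrained): {bayes_factor:.2f}")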
Simulation of controllable permeation in PNIPAAm coated membranes
NASA Astrophysics Data System (ADS)
Ehrenhofer, Adrian; Wallmersperger, Thomas; Richter, Andreas
2016-04-01
Membranes separate fluid compartments and can comprise transport structures for selective permeation. In biology, channel proteins are specialized in their atomic structure to allow transport of specific compounds (selectivity). Conformational changes in protein structure allow the control of the permeation abilities by outer stimuli (gating). In polymeric membranes, the selectivity is due to electrostatic or size-exclusion effects. It can thus be controlled by size variation or electric charges. Controllable permeation can be useful to determine particle-size distributions in continuous flow, e.g. in microfluidics and biomedicine to obtain cell diameter profiles in blood. The present approach uses patterned polyethylene terephthalate (PET) membranes with a hydrogel surface coating for permeation control by size-exclusion. The thermosensitive hydrogel poly(N-isopropylacrylamide) (PNIPAAm) is structured with a cross-shaped pore geometry. A change in the temperature of the water flow through the membrane leads to a pore shape variation. The temperature-dependent behavior of PNIPAAm can be numerically modeled with a temperature expansion model, where the swelling and deswelling are represented by temperature-dependent expansion coefficients. In the present study, the free swelling behavior was implemented in the Finite Element tool ABAQUS for the complex composite structure of the permeation control membrane. Experimental values of the geometry characteristics were derived from microscopy images with the tool ImageJ and compared to simulation results. Numerical simulations using the derived thermo-mechanical model for different pore geometries (circular, rectangle, cross and triangle) were performed. With this study, we show that the temperature expansion model with values from the free swelling behavior can be used to adequately predict the deformation behavior of the complex membrane system. The predictions can be used to optimize the behavior of the membrane pores and the overall performance of the smart membrane.
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This involves developing a model of an existing real system from which to assess that system's state or performance. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. The second law of thermodynamics is used to determine the irreversibility, i.e., the energy lost, in each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
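The per-component second-law bookkeeping can be illustrated with a short sketch; the stream values below are hypothetical and are not ASPEN output. Entropy generation follows from an entropy balance, and the Gouy-Stodola relation converts it into lost work.

```python
# Illustrative second-law bookkeeping for a single ECLSS component
# (hypothetical stream values, not ASPEN results).
T0 = 298.15  # dead-state temperature [K]

def entropy_generation(streams_in, streams_out, q_dot=0.0, t_boundary=T0):
    """S_gen = sum(m*s)_out - sum(m*s)_in - Q/T_b  [kW/K].
    Each stream is a (mass_flow [kg/s], specific_entropy [kJ/kg.K]) pair."""
    s_out = sum(m * s for m, s in streams_out)
    s_in = sum(m * s for m, s in streams_in)
    return s_out - s_in - q_dot / t_boundary

# hypothetical CO2-removal bed: one inlet, one outlet, heat rejected to the cabin
s_gen = entropy_generation(streams_in=[(0.05, 6.80)],
                           streams_out=[(0.05, 6.95)],
                           q_dot=-1.2)            # kW rejected (negative = leaving)
lost_work = T0 * s_gen                            # Gouy-Stodola: kW of exergy destroyed
print(f"S_gen = {s_gen:.4f} kW/K, lost work = {lost_work:.2f} kW")
```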
E-nursing documentation as a tool for quality assurance.
Rajkovic, Vladislav; Sustersic, Olga; Rajkovic, Uros
2006-01-01
The article presents the results of a project in which we describe the reengineering of nursing documentation. Documentation in nursing is an efficient tool for ensuring quality health care and, consequently, quality patient treatment along the whole clinical path. We have taken into account the nursing process and patient treatment based on Henderson's theoretical model of nursing, which consists of 14 basic living activities. The model of the new documentation enables tracing, transparency, selectivity, monitoring and analysis. All these factors lead to improvements of the health system as well as to improved safety of patients and members of nursing teams. The documentation was thus developed for three health care segments: the secondary and tertiary level, dispensaries and community health care. The new quality introduced to the documentation process by information and communication technology is presented by a database model and a software prototype for managing documentation.
PowderSim: Lagrangian Discrete and Mesh-Free Continuum Simulation Code for Cohesive Soils
NASA Technical Reports Server (NTRS)
Johnson, Scott; Walton, Otis; Settgast, Randolph
2013-01-01
PowderSim is a calculation tool that combines a discrete-element method (DEM) module, including calibrated interparticle-interaction relationships, with a mesh-free, continuum, SPH (smoothed-particle hydrodynamics) based module that utilizes enhanced, calibrated, constitutive models capable of mimicking both large deformations and the flow behavior of regolith simulants and lunar regolith under conditions anticipated during in situ resource utilization (ISRU) operations. The major innovation introduced in PowderSim is to use a mesh-free method (SPH-based) with a calibrated and slightly modified critical-state soil mechanics constitutive model to extend the ability of the simulation tool to also address full-scale engineering systems in the continuum sense. The PowderSim software maintains the ability to address particle-scale problems, like size segregation, in selected regions with a traditional DEM module, which has improved contact physics and electrostatic interaction models.
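The following is a hedged sketch of the kind of interparticle-interaction relationship a DEM module evaluates, not PowderSim code: a linear spring-dashpot normal contact force between two spheres, with illustrative stiffness and damping values.

```python
# Minimal DEM-style sketch (not PowderSim itself): normal contact force between
# two spherical particles from a linear spring-dashpot model.
import numpy as np

def normal_contact_force(x1, x2, v1, v2, r1, r2, k_n=1.0e4, c_n=5.0):
    """Return the force acting on particle 1 [N]; zero if the particles do not overlap."""
    d = x2 - x1
    dist = np.linalg.norm(d)
    overlap = (r1 + r2) - dist
    if overlap <= 0.0:
        return np.zeros(3)
    n = d / dist                              # unit normal from particle 1 toward 2
    v_rel_n = np.dot(v1 - v2, n)              # approach speed along the normal
    f_mag = k_n * overlap + c_n * v_rel_n     # spring + dashpot
    return -f_mag * n                         # pushes particle 1 away from 2

f = normal_contact_force(np.zeros(3), np.array([0.0018, 0.0, 0.0]),
                         np.array([0.01, 0.0, 0.0]), np.zeros(3),
                         r1=0.001, r2=0.001)
print(f)
```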
Validation of Hydrodynamic Load Models Using CFD for the OC4-DeepCwind Semisubmersible: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benitz, M. A.; Schmidt, D. P.; Lackner, M. A.
Computational fluid dynamics (CFD) simulations were carried out on the OC4-DeepCwind semi-submersible to obtain a better understanding of how to set hydrodynamic coefficients for the structure when using an engineering tool such as FAST to model the system. The focus here was on the drag behavior and the effects of the free surface, free ends and multi-member arrangement of the semi-submersible structure. These effects are investigated through code-to-code comparisons and flow visualizations. The implications for mean load predictions from engineering tools are addressed. The work presented here suggests that the selection of drag coefficients should take into consideration a variety of geometric factors. Furthermore, CFD simulations demonstrate large time-varying loads due to vortex shedding, which FAST's hydrodynamic module, HydroDyn, does not model. The implications of these oscillatory loads for the fatigue life need to be addressed.
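As a hedged illustration of where a drag coefficient enters such an engineering tool (member size and flow speed are illustrative, not OC4-DeepCwind properties), the quadratic Morison-type drag term shows how the predicted mean load scales directly with the chosen coefficient.

```python
# Hedged sketch of where a drag coefficient enters an engineering-tool load model:
# the quadratic (Morison-type) drag force per unit length on a cylindrical member.
RHO_WATER = 1025.0   # kg/m^3

def morison_drag_per_length(c_d, diameter, u_rel):
    """Drag force per unit length [N/m] for relative flow speed u_rel [m/s]."""
    return 0.5 * RHO_WATER * c_d * diameter * u_rel * abs(u_rel)

# effect of the chosen drag coefficient on the load, for a 12 m column in 1 m/s flow
for c_d in (0.6, 0.8, 1.0):
    print(c_d, morison_drag_per_length(c_d, diameter=12.0, u_rel=1.0))
```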
Kim, Yusung; Tomé, Wolfgang A.
2010-01-01
Summary Voxel-based iso-Tumor Control Probability (TCP) maps and iso-Complication maps are proposed as a plan-review tool, especially for functional image-guided intensity-modulated radiotherapy (IMRT) strategies such as selective boosting (dose painting) and conformal avoidance IMRT. The maps employ voxel-based phenomenological biological dose-response models for target volumes and normal organs. Two IMRT strategies for prostate cancer, namely conventional uniform IMRT delivering an equivalent uniform dose (EUD) of 84 Gy to the entire PTV and selective boosting delivering an EUD of 82 Gy to the entire PTV, are investigated to illustrate the advantages of this approach over isodose maps. Conventional uniform IMRT yielded a more uniform isodose map across the entire PTV, while selective boosting resulted in a nonuniform isodose map. However, when voxel-based iso-TCP maps were employed, selective boosting exhibited a more uniform tumor control probability map than could be achieved with conventional uniform IMRT, which showed TCP cold spots in high-risk tumor subvolumes despite delivering a higher EUD to the entire PTV. Voxel-based iso-Complication maps are presented for the rectum and bladder, and their utilization for selective avoidance IMRT strategies is discussed. We believe that as the need for functional image-guided treatment planning grows, voxel-based iso-TCP and iso-Complication maps will become an important tool to assess the integrity of such treatment plans. PMID:21151734
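A minimal sketch of the voxel-based idea, using a generic logistic dose-response curve and the generalized EUD with hypothetical parameters rather than the authors' calibrated models:

```python
# Illustrative voxel-based TCP sketch (hypothetical parameters, not the authors'
# calibrated dose-response models): per-voxel TCP from a logistic dose-response
# curve, plus the generalized EUD of the dose distribution.
import numpy as np

def voxel_tcp(dose, tcd50=70.0, gamma50=2.0):
    """Logistic dose-response: TCP_i = 1 / (1 + (TCD50 / D_i)^(4*gamma50))."""
    return 1.0 / (1.0 + (tcd50 / dose) ** (4.0 * gamma50))

def generalized_eud(dose, a=-10.0):
    """gEUD = (mean(D_i^a))^(1/a); a negative 'a' is typical for tumors."""
    return np.mean(dose ** a) ** (1.0 / a)

dose = np.array([84.0, 84.0, 78.0, 88.0, 90.0])   # hypothetical voxel doses [Gy]
print("per-voxel TCP:", np.round(voxel_tcp(dose), 3))
print("gEUD:", round(generalized_eud(dose), 1), "Gy")
```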
On the interplay between mathematics and biology: hallmarks toward a new systems biology.
Bellomo, Nicola; Elaiw, Ahmed; Althiabi, Abdullah M; Alghamdi, Mohammed Ali
2015-03-01
This paper proposes a critical analysis of the existing literature on mathematical tools developed toward systems biology approaches and, out of this overview, develops a new approach whose main features can be briefly summarized as follows: the derivation of mathematical structures suitable to capture the complexity of biological, hence living, systems, and the modeling, by appropriate mathematical tools, of Darwinian-type dynamics, namely mutations followed by selection and evolution. Moreover, multiscale methods to move from genes to cells, and from cells to tissues, are analyzed in view of a new systems biology approach. Copyright © 2014 Elsevier B.V. All rights reserved.
Mathers, Jonathan; Sitch, Alice; Parry, Jayne
2016-10-01
Medical schools are increasingly using novel tools to select applicants. The UK Clinical Aptitude Test (UKCAT) is one such tool; it measures mental abilities, attitudes and professional behaviour conducive to being a doctor using constructs likely to be less affected by socio-demographic factors than traditional measures of potential. Universities are free to use the UKCAT as they see fit, but three broad modalities have been observed: 'borderline', 'factor' and 'threshold'. This paper aims to provide the first longitudinal analyses assessing the impact of the different uses of the UKCAT on making offers to applicants with different socio-demographic characteristics. Multilevel regression was used to model the outcome of applications to UK medical schools during the period 2004-2011 (data obtained from UCAS), adjusted for sex, ethnicity, schooling, parental occupation, educational attainment, year of application and UKCAT use (borderline, factor and threshold). The three ways of using the UKCAT did not differ in their impact on making the selection process more equitable, other than a marked reversal of the female advantage when the test was applied in a 'threshold' manner. Our attempt to model the longitudinal impact of the use of the UKCAT in its threshold format again found the reversal of the female advantage, but did not demonstrate similar statistically significant reductions in the advantages associated with White ethnicity, higher social class and selective schooling. Our findings demonstrate attenuation of the advantage of being female but no changes in admission rates based on White ethnicity, higher social class and selective schooling. In view of this, the utility of the UKCAT as a means to widen access to medical schools among non-White and less advantaged applicants remains unproven. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
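A simplified, single-level stand-in for such a regression model (hypothetical column names, not the UCAS data dictionary) might look like the following; the actual analysis used multilevel models, which this sketch does not reproduce.

```python
# Simplified stand-in for the multilevel model (single-level logistic regression,
# hypothetical column names and file): probability of receiving an offer as a
# function of applicant characteristics and the school's mode of UKCAT use.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("applications.csv")   # hypothetical UCAS-style applicant file

model = smf.logit(
    "offer ~ C(sex) + C(ethnicity) + C(school_type) + C(parental_occupation)"
    " + attainment + C(year) + C(ukcat_use)",   # borderline / factor / threshold
    data=df,
).fit()
print(model.summary())
```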
Modeling and Analysis of Asynchronous Systems Using SAL and Hybrid SAL
NASA Technical Reports Server (NTRS)
Tiwari, Ashish; Dutertre, Bruno
2013-01-01
We present formal models and results of formal analysis of two different asynchronous systems. We first examine a mid-value select module that merges the signals coming from three different sensors that are each asynchronously sampling the same input signal. We then consider the phase locking protocol proposed by Daly, Hopkins, and McKenna. This protocol is designed to keep a set of non-faulty (asynchronous) clocks phase locked even in the presence of Byzantine-faulty clocks on the network. All models and verifications have been developed using the SAL model checking tools and the Hybrid SAL abstractor.
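The mid-value select operation itself is simple to state; a minimal sketch (not the SAL model) is:

```python
# Minimal sketch of the mid-value select operation the verified module performs:
# take the median of three asynchronously sampled sensor readings, so that a
# single faulty or outlying sensor cannot drive the output.
def mid_value_select(a: float, b: float, c: float) -> float:
    return sorted((a, b, c))[1]

print(mid_value_select(10.1, 10.2, 99.9))   # -> 10.2 despite one bad reading
```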
DECIDE: a software for computer-assisted evaluation of diagnostic test performance.
Chiecchio, A; Bo, A; Manzone, P; Giglioli, F
1993-05-01
The evaluation of the performance of clinical tests is a complex problem involving different steps and many statistical tools, not always structured in an organic and rational system. This paper presents a software package that provides an organic system of statistical tools to help evaluate clinical test performance. The program allows (a) the building and organization of a working database, (b) the selection of the minimal set of tests with the maximum information content, (c) the search for the model that best fits the distribution of test values, (d) the selection of the optimal diagnostic cut-off value of the test for every positive/negative situation, and (e) the evaluation of the performance of combinations of correlated and uncorrelated tests. The uncertainty associated with all the variables involved is evaluated. The program works in an MS-DOS environment with an EGA or higher-performance graphics card.
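As an illustration of step (d), one common way to choose a diagnostic cut-off is to maximize the Youden index; the sketch below (not DECIDE's own algorithm) does this for hypothetical test values.

```python
# Illustrative cut-off selection (not the program's own code): choose the
# threshold that maximizes the Youden index J = sensitivity + specificity - 1.
import numpy as np

def best_cutoff(values, labels):
    """values: test results; labels: 1 = diseased, 0 = healthy (higher = more abnormal)."""
    best, best_j = None, -1.0
    for c in np.unique(values):
        pred = values >= c
        se = np.mean(pred[labels == 1])          # sensitivity
        sp = np.mean(~pred[labels == 0])         # specificity
        j = se + sp - 1.0
        if j > best_j:
            best, best_j = c, j
    return best, best_j

values = np.array([1.0, 2.1, 2.8, 3.5, 4.2, 5.0, 5.5, 6.1])
labels = np.array([0,   0,   0,   1,   0,   1,   1,   1])
print(best_cutoff(values, labels))
```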
Orthogonal Luciferase-Luciferin Pairs for Bioluminescence Imaging.
Jones, Krysten A; Porterfield, William B; Rathbun, Colin M; McCutcheon, David C; Paley, Miranda A; Prescher, Jennifer A
2017-02-15
Bioluminescence imaging with luciferase-luciferin pairs is widely used in biomedical research. Several luciferases have been identified in nature, and many have been adapted for tracking cells in whole animals. Unfortunately, the optimal luciferases for imaging in vivo utilize the same substrate and therefore cannot easily differentiate multiple cell types in a single subject. To develop a broader set of distinguishable probes, we crafted custom luciferins that can be selectively processed by engineered luciferases. Libraries of mutant enzymes were iteratively screened with sterically modified luciferins, and orthogonal enzyme-substrate "hits" were identified. These tools produced light when complementary enzyme-substrate partners interacted both in vitro and in cultured cell models. Based on their selectivity, these designer pairs will bolster multicomponent imaging and enable the direct interrogation of cell networks not currently possible with existing tools. Our screening platform is also general and will expedite the identification of more unique luciferases and luciferins, further expanding the bioluminescence toolkit.
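The selection logic of such a screen can be sketched as follows, with a hypothetical enzyme-by-substrate signal matrix rather than the published data: choose the pairing whose on-target signal is largest relative to its cross-reactivity.

```python
# Hedged sketch of the selection logic behind an orthogonal enzyme-substrate
# screen (hypothetical signal matrix): pick the enzyme/luciferin assignment whose
# on-target signals are large relative to the cross-reactive background.
import numpy as np
from itertools import permutations

# rows = engineered luciferases, columns = modified luciferins (relative photon flux)
signal = np.array([[100.0,  8.0],
                   [  6.0, 90.0]])

def orthogonality(assignment):
    """Minimum on-target/off-target ratio over all enzymes for a given pairing."""
    ratios = []
    for enzyme, substrate in enumerate(assignment):
        on = signal[enzyme, substrate]
        off = max(signal[enzyme, s] for s in range(signal.shape[1]) if s != substrate)
        ratios.append(on / off)
    return min(ratios)

best = max(permutations(range(signal.shape[1])), key=orthogonality)
print("best pairing:", best, "orthogonality:", round(orthogonality(best), 1))
```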
Sun, Zhifu; Cunningham, Julie; Slager, Susan; Kocher, Jean-Pierre
2015-01-01
Bisulfite treatment-based methylation microarrays (mainly the Illumina 450K Infinium array) and next-generation sequencing (reduced representation bisulfite sequencing, Agilent SureSelect Human Methyl-Seq, NimbleGen SeqCap Epi CpGiant or whole-genome bisulfite sequencing) are commonly used for base-resolution DNA methylome research. Although multiple tools and methods have been developed and used for data preprocessing and analysis, confusion remains for these platforms, including how and whether the 450K array should be normalized; which platform should be used to better fit researchers' needs; and which statistical models would be more appropriate for differential methylation analysis. This review presents the commonly used platforms and compares the pros and cons of each in methylome profiling. We then discuss approaches to study design, data normalization, bias correction and model selection for differentially methylated individual CpGs and regions. PMID:26366945
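Two standard quantities underlying this kind of analysis, the beta value and the M-value of a CpG site, can be written down directly (generic definitions, not tied to any one platform or tool):

```python
# Standard per-CpG methylation quantities (generic definitions, hypothetical intensities).
import numpy as np

def beta_value(meth, unmeth, offset=100.0):
    """Beta = M / (M + U + offset), bounded in [0, 1]."""
    return meth / (meth + unmeth + offset)

def m_value(meth, unmeth, alpha=1.0):
    """M-value = log2((M + alpha) / (U + alpha)); better variance properties for modeling."""
    return np.log2((meth + alpha) / (unmeth + alpha))

meth, unmeth = 3200.0, 800.0     # hypothetical intensities for one CpG
print(round(beta_value(meth, unmeth), 3), round(m_value(meth, unmeth), 2))
```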
Experimental evolution of a sexually selected display in yeast
Rogers, David W.; Greig, Duncan
2008-01-01
The fundamental principle underlying sexual selection theory is that an allele conferring an advantage in the competition for mates will spread through a population. Remarkably, this has never been demonstrated empirically. We have developed an experimental system using yeast for testing genetic models of sexual selection. Yeast signal to potential partners by producing an attractive pheromone; stronger signallers are preferred as mates. We tested the effect of high and low levels of sexual selection on the evolution of a gene determining the strength of this signal. Under high sexual selection, an allele encoding a stronger signal was able to invade a population of weak signallers, and we observed a corresponding increase in the amount of pheromone produced. By contrast, the strong signalling allele failed to invade under low sexual selection. Our results demonstrate, for the first time, the spread of a sexually selected allele through a population, confirming the central assumption of sexual selection theory. Our yeast system is a powerful tool for investigating the genetics of sexual selection. PMID:18842545
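A toy model of the invasion logic (not the yeast experiment itself) can be written in a few lines: an allele whose signalling benefit outweighs its cost under strong sexual selection sweeps through the population, while under weak selection it stays rare.

```python
# Toy haploid selection sketch (illustrative parameters, not the experimental system):
# a signalling allele with mating advantage b and cost c spreads only when the
# advantage conferred by sexual selection outweighs the cost.
def invade(p0=0.01, b=0.5, c=0.05, generations=200):
    """Relative fitness of the strong-signalling allele is (1 + b) * (1 - c)."""
    p = p0
    w_strong = (1.0 + b) * (1.0 - c)
    for _ in range(generations):
        mean_w = p * w_strong + (1.0 - p) * 1.0
        p = p * w_strong / mean_w
    return p

print("high sexual selection:", round(invade(b=0.5), 3))   # allele sweeps
print("low sexual selection: ", round(invade(b=0.02), 3))  # allele stays rare
```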
NASTRAN as an analytical research tool for composite mechanics and composite structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.
1976-01-01
Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.
2015-12-01
FINAL REPORT: Development and Validation of a Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation ... project ER-201129 was to develop and validate a framework used to make bioremediation decisions based on site-specific physical and biogeochemical ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prindle, N.H.; Mendenhall, F.T.; Trauth, K.
1996-05-01
The Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories (SNL). SPM provides an analytical basis for supporting programmatic decisions for the Waste Isolation Pilot Plant (WIPP) to meet selected portions of the applicable US EPA long-term performance regulations. The first iteration of SPM (SPM-1), the prototype for SPM, was completed in 1994. It served as a benchmark and a test bed for developing the tools needed for the second iteration of SPM (SPM-2). SPM-2, completed in 1995, is intended for programmatic decision making. This is Volume II of the three-volume final report of the second iteration of the SPM. It describes the technical input and model implementation for SPM-2, and presents the SPM-2 technical baseline and the activities, activity outcomes, outcome probabilities, and the input parameters for SPM-2 analysis.
Integration of tools for binding archetypes to SNOMED CT.
Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan
2008-10-27
The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.
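The lexical part of such mapping suggestions can be sketched with simple string similarity against a tiny, hypothetical term list (not a SNOMED CT release):

```python
# Minimal sketch of lexical binding suggestions (hypothetical term list, not the
# SNOMED CT release): rank candidate concepts for an archetype node label by
# string similarity, as a starting point for manual review.
from difflib import SequenceMatcher

snomed_terms = {                      # hypothetical (concept id, preferred term) pairs
    "271649006": "Systolic blood pressure",
    "271650006": "Diastolic blood pressure",
    "386725007": "Body temperature",
}

def suggest_bindings(node_label, top_n=2):
    scored = [(SequenceMatcher(None, node_label.lower(), term.lower()).ratio(), cid, term)
              for cid, term in snomed_terms.items()]
    return sorted(scored, reverse=True)[:top_n]

print(suggest_bindings("Systolic"))
```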
Survey of Technologies Relevant to Defense From Near-Earth Objects
NASA Technical Reports Server (NTRS)
Adams, R. B.; Alexander, R.; Bonometti, J.; Chapman, J.; Fincher, S.; Hopkins, R.; Kalkstein, M.; Polsgrove, T.; Statham, G.; White, S.
2004-01-01
Several recent near-miss encounters with asteroids and comets have focused attention on the threat of a catastrophic impact with the Earth. This Technical Publication reviews the historical impact record and current understanding of the number and location of near-Earth objects (NEOs) to address their impact probability. Various ongoing projects intended to survey and catalog the NEO population are also reviewed. Details are given of a Marshall Space Flight Center-led study intended to develop and assess various candidate systems for protection of the Earth against NEOs. The analytical tools, trajectory tools, and a tool created to model both the undeflected inbound path of an NEO and the modified, post-deflection path are described. A representative selection of these possible options was modeled and evaluated. It is hoped that this study will raise the level of attention given to this very real threat and also demonstrate that successful defense is both possible and practicable, provided appropriate steps are taken.
Buchmann, Ilka; Randerath, Jennifer
2017-09-01
Left brain damage (LBD) frequently leads to limb apraxia, a disorder that can affect tool use. Despite its impact on daily life, classical tests examining the pantomime of tool use and imitation of gestures are seldom applied in clinical practice. The study's aim was to present a diagnostic approach which appears more strongly related to actions in daily life, in order to sensitize applicants and patients about the relevance of the disorder before patients are discharged. Two tests were introduced that evaluate actual tool selection and tool-object application: the Novel Tools Test (NTT) and the Familiar Tools Test (FTT), parts of the DILA-S (Diagnostic Instrument for Limb Apraxia - Short Version). Normative data in healthy subjects (N = 82) were collected. The tests were then applied in stroke patients with unilateral left brain damage (LBD: N = 33), a control right brain damage group (RBD: N = 20), as well as healthy age- and gender-matched controls (CL: N = 28; CR: N = 18). The tests showed appropriate interrater reliability and internal consistency as well as concurrent and divergent validity. To examine criterion validity based on the well-known left lateralization of limb apraxia, group comparisons were run. As expected, the LBD group demonstrated a high prevalence of tool-use apraxia (NTT: 36.4%, FTT: 48.5%), ranging from mild to severe impairment, and scored worse than their control group (CL). A few RBD patients did demonstrate impairments in tool use (NTT: 15%, FTT: 15%); on a group level they did not differ from their healthy controls (CR). Further, it was demonstrated that the selection and application of familiar and novel tools can be impaired selectively. Our study results suggest that real tool-use tests evaluating tool selection and tool application should be considered for standard diagnosis of limb apraxia in left as well as right brain damaged patients. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Mahdi, Chanif; Nurdiana, Nurdiana; Kikuchi, Takheshi; Fatchiyah, Fatchiyah
2014-01-01
To understand the structural features that dictate the selectivity of the two isoforms of prostaglandin H2 synthase (PGHS/COX), the three-dimensional (3D) structures of COX-1 and COX-2 were assessed by means of binding energy calculations from virtual molecular dynamics using alpha-Patchouli alcohol isomers as ligands. Molecular interaction studies with COX-1 and COX-2 were performed using the molecular docking tool Hex 8.0. Interactions were further visualized using the Discovery Studio Client 3.5 software tool. The binding energy of the molecular interactions was calculated with AMBER12 and Virtual Molecular Dynamic 1.9.1 software. The analysis suggested all alpha-Patchouli alcohol isomers as inhibitors of COX-1 and COX-2. Collectively, the binding energy scoring (with the PBSA solvent model) suggested the alpha-Patchouli alcohol isomer compounds CID442384, CID6432585, CID3080622, CID10955174 and CID56928117 as candidates for selective COX-1 inhibitors and CID521903 as a nonselective COX-1/COX-2 inhibitor. PMID:25484897
Tool use and mechanical problem solving in apraxia.
Goldenberg, G; Hagmann, S
1998-07-01
Moorlaas (1928) proposed that apraxic patients can identify objects and can remember the purpose they have been made for but do not know the way in which they must be used to achieve that purpose. Knowledge about the use of objects and tools can have two sources: it can be based on retrieval of instructions of use from semantic memory or on a direct inference of function from structure. The ability to infer function from structure enables subjects to use unfamiliar tools and to detect alternative uses of familiar tools. It is the basis of mechanical problem solving. The purpose of the present study was to analyze retrieval of instructions of use, mechanical problem solving, and actual tool use in patients with apraxia due to circumscribed lesions of the left hemisphere. For assessing mechanical problem solving we developed a test of selection and application of novel tools. Access to instructions of use was tested by pantomime of tool use. Actual tool use was examined for the same familiar tools. Forty-two patients with left brain damage (LBD) and aphasia, 22 patients with right brain damage (RBD) and 22 controls were examined. Only LBD patients differed from controls on all tests. RBD patients had difficulties with the use but not with the selection of novel tools. In LBD patients there was a significant correlation between pantomime of tool use and novel tool selection, but there were individual cases who scored in the defective range on one of these tests and normally on the other. Analysis of LBD patients' lesions suggested that frontal lobe damage does not disturb novel tool selection. Only LBD patients who failed on pantomime of object use and on novel tool selection committed errors in actual use of familiar tools. The finding that mechanical problem solving is invariably defective in apraxic patients who commit errors with familiar tools is in good accord with clinical observations, as the gravity of their errors goes beyond what one would expect as a mere sequela of loss of access to instructions of use.