The development and testing of a skin tear risk assessment tool.
Newall, Nelly; Lewin, Gill F; Bulsara, Max K; Carville, Keryln J; Leslie, Gavin D; Roberts, Pam A
2017-02-01
The aim of the present study was to develop a reliable and valid skin tear risk assessment tool. The six characteristics identified in a previous case-control study as constituting the best risk model for skin tear development were used to construct a risk assessment tool. The ability of the tool to predict skin tear development was then tested in a prospective study. Between August 2012 and September 2013, 1466 tertiary hospital patients were assessed at admission and followed up for 10 days to see if they developed a skin tear. The predictive validity of the tool was assessed using receiver operating characteristic (ROC) analysis. When the tool was found not to have performed as well as hoped, secondary analyses were performed to determine whether a better performing risk model could be identified. The tool was found to have high sensitivity but low specificity, and therefore inadequate predictive validity. Secondary analysis of the combined data from this and the previous case-control study identified an alternative, better performing risk model. The tool developed and tested in this study was found to have inadequate predictive validity. The predictive validity of the alternative, more parsimonious model now needs to be tested. © 2015 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
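For readers who want to reproduce this kind of validity check, the sketch below computes the ROC AUC and the sensitivity/specificity trade-off across cutoffs for a risk score. All data are simulated, and the score range and outcome rates are assumptions for illustration, not the study's data.

```python
# Minimal sketch of ROC-based predictive-validity assessment for a risk score,
# assuming a 0-6 score (six risk characteristics) and a binary 10-day outcome.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
risk_score = rng.integers(0, 7, size=1466)            # hypothetical tool scores
outcome = rng.binomial(1, 0.02 + 0.03 * risk_score)   # 1 = skin tear by day 10

print(f"AUC = {roc_auc_score(outcome, risk_score):.2f}")
fpr, tpr, cutoffs = roc_curve(outcome, risk_score)
for c, se, fp in zip(cutoffs, tpr, fpr):
    print(f"score >= {c}: sensitivity {se:.2f}, specificity {1 - fp:.2f}")
```

A high-sensitivity, low-specificity pattern like the one reported shows up here as cutoffs whose true-positive rate is high only when the false-positive rate is also high.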
Comparison of Performance Predictions for New Low-Thrust Trajectory Tools
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Kos, Larry; Hopkins, Randall; Crane, Tracie
2006-01-01
Several low thrust trajectory optimization tools have been developed over the last 3½ years by the Low Thrust Trajectory Tools development team. This toolset includes both low- to medium-fidelity and high-fidelity tools, which allow the analyst to quickly explore a wide mission trade space and perform advanced mission design. These tools were tested using a set of reference trajectories that exercised each tool's unique capabilities. This paper compares the performance predictions of the various tools against several of the reference trajectories. The intent is to verify agreement between the high fidelity tools and to quantify the performance prediction differences between tools of different fidelity levels.
Virtual Beach: Decision Support Tools for Beach Pathogen Prediction
The Virtual Beach Managers Tool (VB) is decision-making software developed to help local beach managers decide when beaches should be closed due to predicted high levels of waterborne pathogens. The tool is being developed under the umbrella of EPA's Advanced Monit...
A thermal sensation prediction tool for use by the profession
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fountain, M.E.; Huizenga, C.
1997-12-31
As part of a recent ASHRAE research project (781-RP), a thermal sensation prediction tool has been developed. This paper introduces the tool, describes the component thermal sensation models, and presents examples of how the tool can be used in practice. Since the main end product of the HVAC industry is the comfort of occupants indoors, tools for predicting occupant thermal response can be an important asset to designers of indoor climate control systems. The software tool presented in this paper incorporates several existing models for predicting occupant comfort.
Mulhearn, Tyler J; Watts, Logan L; Todd, E Michelle; Medeiros, Kelsey E; Connelly, Shane; Mumford, Michael D
2017-01-01
Although recent evidence suggests ethics education can be effective, the nature of specific training programs, and their effectiveness, varies considerably. Building on a recent path modeling effort, the present study developed and validated a predictive modeling tool for responsible conduct of research education. The predictive modeling tool allows users to enter ratings in relation to a given ethics training program and receive instantaneous evaluative information for course refinement. Validation work suggests the tool's predicted outcomes correlate strongly (r = 0.46) with objective course outcomes. Implications for training program development and refinement are discussed.
Towards a generalized energy prediction model for machine tools
Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H.; Dornfeld, David A.; Helu, Moneer; Rachuri, Sudarsan
2017-01-01
Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process. PMID:28652687
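As a hedged illustration of the GP-regression approach described above, the sketch below fits a Gaussian Process to simulated machining data and returns a prediction with an uncertainty interval. The three process parameters, their ranges, and the energy relationship are invented stand-ins for the real machine-tool features.

```python
# Sketch: GP regression for energy prediction with uncertainty intervals.
# Features (feed, spindle speed, depth of cut) and the response are simulated.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform([50, 1000, 0.5], [500, 5000, 3.0], size=(200, 3))
y = 0.02 * X[:, 0] + 0.001 * X[:, 1] + 5.0 * X[:, 2] + rng.normal(0, 1.0, 200)

kernel = RBF(length_scale=[100.0, 1000.0, 1.0]) + WhiteKernel()
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

mean, std = gp.predict(np.array([[200.0, 3000.0, 1.5]]), return_std=True)
print(f"predicted energy {mean[0]:.1f} (arbitrary units), 95% CI +/- {1.96 * std[0]:.1f}")
```

The non-parametric kernel is what lets the same fitted model generalize across process parameters and operations, as the abstract emphasizes.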
A web accessible software tool is being developed to predict the toxicity of unknown chemicals for a wide variety of endpoints. The tool will enable a user to easily predict the toxicity of a query compound by simply entering its structure in a 2-dimensional (2-D) chemical sketc...
A community resource benchmarking predictions of peptide binding to MHC-I molecules.
Peters, Bjoern; Bui, Huynh-Hoa; Frankild, Sune; Nielson, Morten; Lundegaard, Claus; Kostem, Emrah; Basch, Derek; Lamberth, Kasper; Harndahl, Mikkel; Fleri, Ward; Wilson, Stephen S; Sidney, John; Lund, Ole; Buus, Soren; Sette, Alessandro
2006-06-09
Recognition of peptides bound to major histocompatibility complex (MHC) class I molecules by T lymphocytes is an essential part of immune surveillance. Each MHC allele has a characteristic peptide binding preference, which can be captured in prediction algorithms, allowing for the rapid scan of entire pathogen proteomes for peptides likely to bind MHC. Here we make public a large set of 48,828 quantitative peptide-binding affinity measurements relating to 48 different mouse, human, macaque, and chimpanzee MHC class I alleles. We use these data to establish a set of benchmark predictions with one neural network method and two matrix-based prediction methods extensively utilized in our groups. In general, the neural network outperforms the matrix-based predictions, mainly due to its ability to generalize even from a small amount of data. We also retrieved predictions from tools publicly available on the internet. While differences in the data used to generate these predictions hamper direct comparisons, we do conclude that tools based on combinatorial peptide libraries perform remarkably well. The transparent prediction evaluation on this dataset provides tool developers with a benchmark for comparison of newly developed prediction methods. In addition, to generate and evaluate our own prediction methods, we have established an easily extensible web-based prediction framework that allows automated side-by-side comparisons of prediction methods implemented by experts. This is an advance over the current practice of tool developers having to generate reference predictions themselves, which can lead to underestimating the performance of prediction methods they are not as familiar with as their own. The overall goal of this effort is to provide a transparent prediction evaluation allowing bioinformaticians to identify promising features of prediction methods and providing guidance to immunologists regarding the reliability of prediction tools.
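To make the contrast between method families concrete, here is a minimal sketch of a matrix-based predictor of the kind benchmarked above: each peptide position contributes an independent log-odds score drawn from a position-specific scoring matrix. The matrix values and example protein are random toys, not a real allele's matrix.

```python
# Toy matrix-based peptide-MHC scoring: sum per-position log-odds for 9-mers.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(2)
pssm = rng.normal(0.0, 1.0, size=(9, 20))   # rows: positions, cols: amino acids

def score_peptide(peptide: str) -> float:
    """Score a 9-mer as the sum of its per-position matrix entries."""
    return sum(pssm[i, AA.index(aa)] for i, aa in enumerate(peptide))

protein = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # made-up sequence to scan
peptides = [protein[i:i + 9] for i in range(len(protein) - 8)]
best = max(peptides, key=score_peptide)
print(best, round(score_peptide(best), 2))
```

A neural network replaces the independent-position assumption with learned interactions, which is the generalization advantage the abstract reports.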
Mueller, Martina; Wagner, Carol L; Annibale, David J; Knapp, Rebecca G; Hulsey, Thomas C; Almeida, Jonas S
2006-03-01
Approximately 30% of intubated preterm infants with respiratory distress syndrome (RDS) will fail attempted extubation, requiring reintubation and mechanical ventilation. Although ventilator technology and monitoring of premature infants have improved over time, optimal extubation remains challenging. Furthermore, extubation decisions for premature infants require complex information processing, with techniques implicitly learned through clinical practice. Computer-aided decision-support tools would benefit inexperienced clinicians, especially during peak neonatal intensive care unit (NICU) census. A five-step procedure was developed to identify predictive variables. Clinical expert (CE) thought processes comprised one model. Variables from that model were used to develop two mathematical models for the decision-support tool: an artificial neural network (ANN) and a multivariate logistic regression model (MLR). The ranking of the variables in the three models was compared using the Wilcoxon signed rank test. The best performing model was used in a web-based decision-support tool, with a user interface implemented in Hypertext Markup Language (HTML) and the ANN as the underlying mathematical model. CEs identified 51 potentially predictive variables for extubation decisions for an infant on mechanical ventilation. Comparisons of the three models showed a significant difference between the ANN and the CE (p = 0.0006). Of the original 51 potentially predictive variables, the 13 most predictive variables were used to develop an ANN as a web-based decision tool. The ANN processes user-provided data and returns a 0-1 prediction score and a novelty index. The user then selects the most appropriate threshold for categorizing the prediction as a success or failure. Furthermore, the novelty index, indicating the similarity of the test case to the training cases, allows the user to assess the confidence level of the prediction with regard to how much the new data differ from the data originally used to develop the prediction tool. State-of-the-art machine-learning methods can be employed to develop sophisticated tools that aid clinicians' decisions. We identified numerous variables considered relevant for extubation decisions for mechanically ventilated premature infants with RDS. We then developed a web-based decision-support tool for clinicians which can be made widely available and potentially improve patient care worldwide.
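The two outputs the abstract describes, a 0-1 prediction score and a novelty index, can be sketched as follows. The 13 variables, network size, and novelty measure (distance to the nearest training case) are illustrative assumptions, not the study's implementation.

```python
# Sketch: neural-network extubation score plus a nearest-neighbour novelty index.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X_train = rng.normal(size=(300, 13))                       # 13 predictive variables
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)  # 1 = extubation success

scaler = StandardScaler().fit(X_train)
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(scaler.transform(X_train), y_train)
nn = NearestNeighbors(n_neighbors=1).fit(scaler.transform(X_train))

x_new = scaler.transform(rng.normal(size=(1, 13)))
score = net.predict_proba(x_new)[0, 1]     # 0-1 prediction score
dist, _ = nn.kneighbors(x_new)             # distance to closest training case
print(f"prediction score {score:.2f}, novelty index {dist[0, 0]:.2f}")
```

A large novelty index warns the user, exactly as described, that the new case lies far from the data the tool was trained on.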
NASA Technical Reports Server (NTRS)
Sebok, Angelia; Wickens, Christopher; Sargent, Robert
2015-01-01
One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and that they need to be tested and validated empirically. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience before developing models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.
The Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool was developed to address needs for rapid, cost effective methods of species extrapolation of chemical susceptibility. Specifically, the SeqAPASS tool compares the primary sequence (Level 1), functiona...
Sugimoto, Masahiro; Takada, Masahiro; Toi, Masakazu
2014-12-09
Nomograms are a standard computational tool for predicting the likelihood of an outcome from multiple available patient features. We have developed a more powerful data-mining methodology to predict axillary lymph node (AxLN) metastasis and response to neoadjuvant chemotherapy (NAC) in primary breast cancer patients, together with websites through which to use these tools. The tools calculate the probability of AxLN metastasis (AxLN model) and of pathological complete response to NAC (NAC model). As the calculation algorithm, we employed a decision tree-based prediction model known as the alternating decision tree (ADTree), a generalization of if-then-type decision trees. An ensemble technique was used to combine multiple ADTree predictions, resulting in higher generalization ability and robustness against missing values. The AxLN model was developed with training datasets (n=148) and test datasets (n=143), and validated using an independent cohort (n=174), yielding an area under the receiver operating characteristic curve (AUC) of 0.768. The NAC model was developed and validated with n=150 and n=173 datasets from a randomized controlled trial, yielding an AUC of 0.787. The AxLN and NAC models require users to input up to 17 and 16 variables, respectively. These include pathological features, such as human epidermal growth factor receptor 2 (HER2) status, and imaging findings. Each input variable has an "unknown" option, to facilitate prediction for cases with missing values. The websites developed facilitate the use of these tools and serve as a database for accumulating new datasets.
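ADTree implementations are uncommon in mainstream Python libraries, so the sketch below substitutes a boosted-tree ensemble that shares the two properties the abstract highlights: ensemble averaging for robustness, and native tolerance of missing ("unknown") inputs. Feature counts and data are simulated.

```python
# Sketch: tree-ensemble prediction with missing values left as NaN ("unknown").
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 17))                  # up to 17 clinical/imaging inputs
X[rng.random(X.shape) < 0.15] = np.nan          # unknown entries stay NaN
y = (np.nan_to_num(X[:, 0]) + np.nan_to_num(X[:, 1]) > 0).astype(int)

model = HistGradientBoostingClassifier(max_iter=200).fit(X, y)

case = np.full((1, 17), np.nan)                 # new patient, mostly unknown
case[0, :3] = [1.2, -0.5, 0.8]
print(f"predicted probability of AxLN metastasis: {model.predict_proba(case)[0, 1]:.2f}")
```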
Cockpit System Situational Awareness Modeling Tool
NASA Technical Reports Server (NTRS)
Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara
2004-01-01
This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.
An Update on Design Tools for Optimization of CMC 3D Fiber Architectures
NASA Technical Reports Server (NTRS)
Lang, J.; DiCarlo, J.
2012-01-01
Objective: describe and update progress on NASA's efforts to develop 3D architectural design tools for CMCs in general and for SiC/SiC composites in particular. Describe past and current sequential work efforts aimed at: understanding key fiber and tow physical characteristics in conventional 2D and 3D woven architectures as revealed by microstructures in the literature; developing an Excel program for down-selecting and predicting key geometric properties and resulting key fiber-controlled properties for various conventional 3D architectures; developing a software tool for accurately visualizing all the key geometric details of conventional 3D architectures; validating the tools by visualizing and predicting the internal geometry and key mechanical properties of a NASA SiC/SiC panel with a 3D orthogonal architecture; and applying the predictive and visualization tools to advanced 3D orthogonal SiC/SiC composites, combining them into a user-friendly software program.
A statistical software tool, Stream Fish Community Predictor (SFCP), based on EMAP stream sampling in the mid-Atlantic Highlands, was developed to predict stream fish communities using stream and watershed characteristics. Step one in the tool development was a cluster analysis t...
Automated benchmarking of peptide-MHC class I binding predictions.
Trolle, Thomas; Metushi, Imir G; Greenbaum, Jason A; Kim, Yohan; Sidney, John; Lund, Ole; Sette, Alessandro; Peters, Bjoern; Nielsen, Morten
2015-07-01
Numerous in silico methods predicting peptide binding to major histocompatibility complex (MHC) class I molecules have been developed over the last decades. However, the multitude of available prediction tools makes it non-trivial for the end-user to select which tool to use for a given task. To provide a solid basis on which to compare different prediction tools, we here describe a framework for the automated benchmarking of peptide-MHC class I binding prediction tools. The framework runs weekly benchmarks on data that are newly entered into the Immune Epitope Database (IEDB), giving the public access to frequent, up-to-date performance evaluations of all participating tools. To overcome potential selection bias in the data included in the IEDB, a strategy was implemented that suggests a set of peptides for which different prediction methods give divergent predictions as to their binding capability. Upon experimental binding validation, these peptides entered the benchmark study. The benchmark has run for 15 weeks and includes evaluation of 44 datasets covering 17 MHC alleles and more than 4000 peptide-MHC binding measurements. Inspection of the results allows the end-user to make educated selections between participating tools. Of the four participating servers, NetMHCpan performed the best, followed by ANN, SMM and finally ARB. Up-to-date performance evaluations of each server can be found online at http://tools.iedb.org/auto_bench/mhci/weekly. All prediction tool developers are invited to participate in the benchmark. Sign-up instructions are available at http://tools.iedb.org/auto_bench/mhci/join. Contact: mniel@cbs.dtu.dk or bpeters@liai.org. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Updating Risk Prediction Tools: A Case Study in Prostate Cancer
Ankerst, Donna P.; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J.; Feng, Ziding; Sanda, Martin G.; Partin, Alan W.; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M.
2013-01-01
Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, the risk algorithms need to be updated to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in a study external to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [−2]proPSA measured in an external case-control study performed in Texas, U.S. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated over original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. PMID:22095849
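The core of the Bayes-rule update can be written in a few lines: convert the original tool's predicted risk to odds, multiply by a likelihood ratio for the new marker, and convert back. The likelihood ratio below is a made-up value, not one estimated from the external study.

```python
# Sketch of updating a risk prediction with a new marker via Bayes rule.
def update_risk(prior_risk: float, likelihood_ratio: float) -> float:
    """Posterior risk after folding in a marker with the given likelihood ratio."""
    prior_odds = prior_risk / (1.0 - prior_risk)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

base_risk = 0.20    # risk from the original prediction tool
lr_marker = 2.5     # hypothetical LR for an abnormal %freePSA result
print(f"updated risk: {update_risk(base_risk, lr_marker):.2f}")   # -> 0.38
```

Because only the likelihood ratio comes from the external study, the original cohort never needs the new markers measured, which is the practical problem the article addresses.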
Tools for studying dry-cured ham processing by using computed tomography.
Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena
2012-01-11
An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a_w during processing is of special interest. In this paper, predictive models for salt content (R² = 0.960, RMSECV = 0.393), water content (R² = 0.912, RMSECV = 1.751), and a_w (R² = 0.906, RMSECV = 0.008), which cover the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a_w, in terms of both content and distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a_w predictions.
Rauh, Simone P; Rutters, Femke; van der Heijden, Amber A W A; Luimes, Thomas; Alssema, Marjan; Heymans, Martijn W; Magliano, Dianna J; Shaw, Jonathan E; Beulens, Joline W; Dekker, Jacqueline M
2018-02-01
Chronic cardiometabolic diseases, including cardiovascular disease (CVD), type 2 diabetes (T2D) and chronic kidney disease (CKD), share many modifiable risk factors and can be prevented using combined prevention programs. Valid risk prediction tools are needed to accurately identify individuals at risk. We aimed to validate a previously developed non-invasive risk prediction tool for predicting the combined 7-year-risk for chronic cardiometabolic diseases. The previously developed tool is stratified for sex and contains the predictors age, BMI, waist circumference, use of antihypertensives, smoking, family history of myocardial infarction/stroke, and family history of diabetes. This tool was externally validated, evaluating model performance using area under the receiver operating characteristic curve (AUC)-assessing discrimination-and Hosmer-Lemeshow goodness-of-fit (HL) statistics-assessing calibration. The intercept was recalibrated to improve calibration performance. The risk prediction tool was validated in 3544 participants from the Australian Diabetes, Obesity and Lifestyle Study (AusDiab). Discrimination was acceptable, with an AUC of 0.78 (95% CI 0.75-0.81) in men and 0.78 (95% CI 0.74-0.81) in women. Calibration was poor (HL statistic: p < 0.001), but improved considerably after intercept recalibration. Examination of individual outcomes showed that in men, AUC was highest for CKD (0.85 [95% CI 0.78-0.91]) and lowest for T2D (0.69 [95% CI 0.65-0.74]). In women, AUC was highest for CVD (0.88 [95% CI 0.83-0.94)]) and lowest for T2D (0.71 [95% CI 0.66-0.75]). Validation of our previously developed tool showed robust discriminative performance across populations. Model recalibration is recommended to account for different disease rates. Our risk prediction tool can be useful in large-scale prevention programs for identifying those in need of further risk profiling because of their increased risk for chronic cardiometabolic diseases.
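Intercept recalibration, the fix the authors recommend, keeps the published coefficients and shifts only the model intercept until the mean predicted risk matches the new cohort's observed event rate. The sketch below uses simulated linear predictors; the coefficients and rates are illustrative, not those of the published tool.

```python
# Sketch: recalibrate a logistic prediction tool's intercept on a new cohort.
import numpy as np
from scipy.optimize import brentq
from scipy.special import expit

rng = np.random.default_rng(5)
lin_pred = rng.normal(-2.0, 1.0, size=3544)        # beta.x from the original model
events = rng.binomial(1, expit(lin_pred + 0.7))    # cohort with a higher event rate

def calibration_gap(delta: float) -> float:
    """Mean predicted risk minus observed event rate, given intercept shift delta."""
    return expit(lin_pred + delta).mean() - events.mean()

delta = brentq(calibration_gap, -5.0, 5.0)         # solve for the shift
print(f"intercept shift: {delta:+.3f}")            # roughly +0.7 here
```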
In vitro models for the prediction of in vivo performance of oral dosage forms.
Kostewicz, Edmund S; Abrahamsson, Bertil; Brewster, Marcus; Brouwers, Joachim; Butler, James; Carlert, Sara; Dickinson, Paul A; Dressman, Jennifer; Holm, René; Klein, Sandra; Mann, James; McAllister, Mark; Minekus, Mans; Muenster, Uwe; Müllertz, Anette; Verwei, Miriam; Vertzoni, Maria; Weitschies, Werner; Augustijns, Patrick
2014-06-16
Accurate prediction of the in vivo biopharmaceutical performance of oral drug formulations is critical to efficient drug development. Traditionally, in vitro evaluation of oral drug formulations has focused on disintegration and dissolution testing for quality control (QC) purposes. The connection with in vivo biopharmaceutical performance has often been ignored. More recently, the switch to assessing drug products in a more biorelevant and mechanistic manner has advanced the understanding of drug formulation behavior. Notwithstanding this evolution, predicting the in vivo biopharmaceutical performance of formulations that rely on complex intraluminal processes (e.g. solubilization, supersaturation, precipitation…) remains extremely challenging. Concomitantly, the increasing demand for complex formulations to overcome low drug solubility or to control drug release rates urges the development of new in vitro tools. Developing and optimizing innovative, predictive Oral Biopharmaceutical Tools is the main target of the OrBiTo project within the Innovative Medicines Initiative (IMI) framework. A combination of physico-chemical measurements, in vitro tests, in vivo methods, and physiology-based pharmacokinetic modeling is expected to create a unique knowledge platform, enabling the bottlenecks in drug development to be removed and the whole process of drug development to become more efficient. As part of the basis for the OrBiTo project, this review summarizes the current status of predictive in vitro assessment tools for formulation behavior. Both pharmacopoeia-listed apparatus and more advanced tools are discussed. Special attention is paid to major issues limiting the predictive power of traditional tools, including the simulation of dynamic changes in gastrointestinal conditions, the adequate reproduction of gastrointestinal motility, the simulation of supersaturation and precipitation, and the implementation of the solubility-permeability interplay. It is anticipated that the innovative in vitro biopharmaceutical tools arising from the OrBiTo project will lead to improved predictions for in vivo behavior of drug formulations in the GI tract. Copyright © 2013 Elsevier B.V. All rights reserved.
Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.
Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun
2015-11-07
In intensity-modulated radiotherapy (IMRT), the optimal plan is specific to each patient's unique anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting that anatomy should be defined and adopted in the treatment planning procedure for plan quality control. This study develops such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated the proposed plan quality control tool. Using the developed tool, six of twenty evaluated plans were identified as suboptimal. After re-optimization, these plans achieved better OAR dose sparing without sacrificing PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, which validates the predictive ability of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.
Atomic Oxygen Erosion Yield Predictive Tool for Spacecraft Polymers in Low Earth Orbit
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; de Groh, Kim K.; Backus, Jane A.
2008-01-01
A predictive tool was developed to estimate the low Earth orbit (LEO) atomic oxygen erosion yield of polymers based on the results of the Polymer Erosion and Contamination Experiment (PEACE) Polymers experiment flown as part of the Materials International Space Station Experiment 2 (MISSE 2). The MISSE 2 PEACE experiment accurately measured the erosion yield of a wide variety of polymers and pyrolytic graphite. The 40 different materials tested were selected specifically to represent a variety of polymers used in space as well as a wide variety of polymer chemical structures. The resulting erosion yield data were used to develop a predictive tool that utilizes chemical structure and physical properties of polymers that can be measured in ground laboratory testing to predict the in-space atomic oxygen erosion yield of a polymer. The properties include chemical structure, bonding information, density, and ash content. The resulting predictive tool has a correlation coefficient of 0.914 when compared with actual MISSE 2 space data for 38 polymers and pyrolytic graphite. The intent of the predictive tool is to enable estimates of atomic oxygen erosion yields for new polymers without requiring expensive and time-consuming in-space testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starke, Michael R; Abdelaziz, Omar A; Jackson, Rogerick K
The Residential Simulation Tool was developed to understand the impact of residential load consumption on utilities, including the role of demand response. This is complicated because many different residential loads exist and are used for different purposes. The tool models human behavior and its contribution to load utilization, which in turn drives the tool's electrical consumption predictions. The tool integrates a number of databases from Department of Energy and other government websites to support the load consumption prediction.
Computer assisted blast design and assessment tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.
1995-12-31
In general the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.
Rowlinson, Steve; Jia, Yunyan Andrea
2014-04-01
Existing heat stress risk management guidelines recommended by international standards are not practical for the construction industry, which needs site supervision staff to make instant managerial decisions to mitigate heat risks. The ability of the predicted heat strain (PHS) model [ISO 7933 (2004). Ergonomics of the thermal environment analytical determination and interpretation of heat stress using calculation of the predicted heat strain. Geneva: International Standard Organisation] to predict maximum allowable exposure time (D_lim) has now enabled development of localized, action-triggering and threshold-based guidelines for implementation by lay frontline staff on construction sites. This article presents a protocol for development of two heat stress management tools by applying the PHS model to its full potential. One of the tools is developed to facilitate managerial decisions on an optimized work-rest regimen for paced work. The other tool is developed to enable workers' self-regulation during self-paced work.
Software tool for portal dosimetry research.
Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C
2008-09-01
This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) read the MLC file and the PDIP from the TPS; (ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolate correction factors from look-up tables; (iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file; (v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
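A language-agnostic sketch of steps (ii)-(iv) is given below in Python (the original tool was C#/.NET): estimate the per-pixel fraction of beam-on time shielded by MLC leaves, interpolate a correction factor from a look-up table, and multiply it into the predicted portal dose image. The arrays and LUT values are toy data, not measured EPID response factors.

```python
# Sketch of the PDIP correction pipeline: shielded fraction -> LUT -> corrected image.
import numpy as np

rng = np.random.default_rng(6)
pdip = np.ones((128, 128))                    # predicted EPID image from the TPS
shielded_fraction = rng.random((128, 128))    # step (ii): fraction of beam-on time

lut_fraction = np.linspace(0.0, 1.0, 11)      # step (iii): hypothetical LUT
lut_cf = np.linspace(1.0, 0.85, 11)           # CF for open vs MLC-transmitted beam
correction = np.interp(shielded_fraction, lut_fraction, lut_cf)

corrected_pdip = pdip * correction            # step (iv): corrected prediction
print(f"mean correction applied: {correction.mean():.3f}")
```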
Zabor, Emily C; Coit, Daniel; Gershenwald, Jeffrey E; McMasters, Kelly M; Michaelson, James S; Stromberg, Arnold J; Panageas, Katherine S
2018-02-22
Prognostic models are increasingly being made available online, where they can be publicly accessed by both patients and clinicians. These online tools are an important resource for patients to better understand their prognosis and for clinicians to make informed decisions about treatment and follow-up. The goal of this analysis was to highlight the possible variability in multiple online prognostic tools in a single disease. To demonstrate the variability in survival predictions across online prognostic tools, we applied a single validation dataset to three online melanoma prognostic tools. Data on melanoma patients treated at Memorial Sloan Kettering Cancer Center between 2000 and 2014 were retrospectively collected. Calibration was assessed using calibration plots and discrimination was assessed using the C-index. In this demonstration project, we found important differences across the three models that led to variability in individual patients' predicted survival across the tools, especially in the lower range of predictions. In a validation test using a single-institution data set, calibration and discrimination varied across the three models. This study underscores the potential variability both within and across online tools, and highlights the importance of using methodological rigor when developing a prognostic model that will be made publicly available online. The results also reinforce that careful development and thoughtful interpretation, including understanding a given tool's limitations, are required in order for online prognostic tools that provide survival predictions to be a useful resource for both patients and clinicians.
A Study on Re-entry Predictions of Uncontrolled Space Objects for Space Situational Awareness
NASA Astrophysics Data System (ADS)
Choi, Eun-Jung; Cho, Sungki; Lee, Deok-Jin; Kim, Siwoo; Jo, Jung Hyun
2017-12-01
The key risk analysis technologies for the re-entry of space objects into Earth’s atmosphere are divided into four categories: cataloguing and databases of the re-entry of space objects, lifetime and re-entry trajectory predictions, break-up models after re-entry and multiple debris distribution predictions, and ground impact probability models. In this study, we focused on re-entry prediction, including orbital lifetime assessments, for space situational awareness systems. Re-entry predictions are very difficult and are affected by various sources of uncertainty. In particular, during uncontrolled re-entry, large spacecraft may break into several pieces of debris, and the surviving fragments can be a significant hazard for persons and properties on the ground. In recent years, specific methods and procedures have been developed to provide clear information for predicting and analyzing the re-entry of space objects and for ground-risk assessments. Representative tools include object reentry survival analysis tool (ORSAT) and debris assessment software (DAS) developed by National Aeronautics and Space Administration (NASA), spacecraft atmospheric re-entry and aerothermal break-up (SCARAB) and debris risk assessment and mitigation analysis (DRAMA) developed by European Space Agency (ESA), and semi-analytic tool for end of life analysis (STELA) developed by Centre National d’Etudes Spatiales (CNES). In this study, various surveys of existing re-entry space objects are reviewed, and an efficient re-entry prediction technique is suggested based on STELA, the life-cycle analysis tool for satellites, and DRAMA, a re-entry analysis tool. To verify the proposed method, the re-entry of the Tiangong-1 Space Lab, which is expected to re-enter Earth’s atmosphere shortly, was simulated. Eventually, these results will provide a basis for space situational awareness risk analyses of the re-entry of space objects.
Ingham, Steven C; Fanslau, Melody A; Burnham, Greg M; Ingham, Barbara H; Norback, John P; Schaffner, Donald W
2007-06-01
A computer-based tool (available at: www.wisc.edu/foodsafety/meatresearch) was developed for predicting pathogen growth in raw pork, beef, and poultry meat. The tool, THERM (temperature history evaluation for raw meats), predicts the growth of pathogens in pork and beef (Escherichia coli O157:H7, Salmonella serovars, and Staphylococcus aureus) and on poultry (Salmonella serovars and S. aureus) during short-term temperature abuse. The model was developed as follows: 25-g samples of raw ground pork, beef, and turkey were inoculated with a five-strain cocktail of the target pathogen(s) and held at isothermal temperatures from 10 to 43.3 degrees C. Log CFU per sample data were obtained for each pathogen and used to determine lag-phase duration (LPD) and growth rate (GR) by DMFit software. The LPD and GR were used to develop the THERM predictive tool, into which chronological time and temperature data for raw meat processing and storage are entered. The THERM tool then predicts a delta log CFU value for the desired pathogen-product combination. The accuracy of THERM was tested in 20 different inoculation experiments that involved multiple products (coarse-ground beef, skinless chicken breast meat, turkey scapula meat, and ground turkey) and temperature-abuse scenarios. With the time-temperature data from each experiment, THERM accurately predicted the pathogen growth and no growth (with growth defined as delta log CFU > 0.3) in 67, 85, and 95% of the experiments with E. coli O157:H7, Salmonella serovars, and S. aureus, respectively, and yielded fail-safe predictions in the remaining experiments. We conclude that THERM is a useful tool for qualitatively predicting pathogen behavior (growth and no growth) in raw meats. Potential applications include evaluating process deviations and critical limits under the HACCP (hazard analysis critical control point) system.
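A THERM-style calculation can be sketched as follows: walk through the time-temperature record, consume the lag phase as a cumulative fraction, then accumulate growth at the temperature-dependent rate. The LPD and GR functions below are invented placeholders, not the fitted pork/beef/poultry models.

```python
# Sketch: delta log CFU from a time-temperature history via LPD and GR.
def lpd_hours(temp_c: float) -> float:
    """Lag-phase duration in hours (placeholder form, not the published fit)."""
    return 20.0 / max(temp_c - 8.0, 0.5)

def gr_log_per_hour(temp_c: float) -> float:
    """Growth rate in log CFU per hour (placeholder form)."""
    return 0.02 * max(temp_c - 8.0, 0.0)

def delta_log_cfu(history):
    """history: list of (hours, temp_C) steps from the temperature record."""
    lag_used, growth = 0.0, 0.0
    for hours, temp in history:
        if lag_used < 1.0:
            frac = hours / lpd_hours(temp)           # lag consumed this step
            if lag_used + frac <= 1.0:
                lag_used += frac
                continue
            hours *= (lag_used + frac - 1.0) / frac  # time left after lag ends
            lag_used = 1.0
        growth += gr_log_per_hour(temp) * hours
    return growth

abuse = [(2.0, 15.0), (3.0, 25.0), (1.0, 30.0)]      # hours at each temperature
print(f"predicted delta log CFU: {delta_log_cfu(abuse):.2f}")  # growth if > 0.3
```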
On-Line, Self-Learning, Predictive Tool for Determining Payload Thermal Response
NASA Technical Reports Server (NTRS)
Jen, Chian-Li; Tilwick, Leon
2000-01-01
This paper will present the results of a joint ManTech/Goddard R&D effort, currently under way, to develop and test a computer-based, on-line, predictive simulation model for use by facility operators to predict the thermal response of a payload during thermal vacuum testing. Thermal response was identified as an area that could benefit from the algorithms developed by Dr. Jen for complex computer simulations. Most thermal vacuum test setups are unique, since no two payloads have the same thermal properties. This requires that operators depend on their past experience to conduct the test, which takes time as they learn how the payload responds while limiting any risk of exceeding hot or cold temperature limits. The predictive tool being developed is intended to be used with the new Thermal Vacuum Data System (TVDS) developed at Goddard for the Thermal Vacuum Test Operations group. The model can learn the thermal response of the payload by reading a few data points from the TVDS, accepting the payload's current temperature as the initial condition for prediction. The model can then be used as a predictive tool to estimate future payload temperatures according to a predetermined shroud temperature profile. If the prediction error is too large, the model can be asked to re-learn the new situation on-line in real time and give a new prediction. Based on some preliminary tests, we feel this predictive model can forecast the payload temperature of the entire test cycle within 5 degrees Celsius after it has learned three times during the beginning of the test. The tool will allow the operator to play "what-if" experiments to decide on the best shroud temperature set-point control strategy. This tool will save money by minimizing guesswork and optimizing transitions, as well as making the testing process safer and easier to conduct.
Force Modelling in Orthogonal Cutting Considering Flank Wear Effect
NASA Astrophysics Data System (ADS)
Rathod, Kanti Bhikhubhai; Lalwani, Devdas I.
2017-05-01
In the present work, an attempt has been made to provide a predictive cutting force model for orthogonal cutting by combining two force models: a model for a perfectly sharp tool extended to include the effect of edge radius, and a model for a worn tool. The first model is based on Oxley's predictive machining theory for orthogonal cutting; because Oxley's model assumes a perfectly sharp tool, the effect of cutting edge radius (hone radius) is added and an improved model is presented. The second model accounts for flank wear and is based on the model proposed by Waldorf. The developed combined force model is also used to predict flank wear width using an inverse approach. The performance of the combined total force model is compared with previously published results for AISI 1045 and AISI 4142 materials, with reasonably good agreement.
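The additive structure described can be reduced to a one-line model plus its inverse, which is how the flank-wear width is recovered from a measured force. All coefficients below are placeholders, not values calibrated for AISI 1045 or AISI 4142.

```python
# Sketch: combined cutting-force model and its inverse for flank-wear estimation.
def total_force_n(f_sharp: float, f_edge: float, k_wear: float, vb_mm: float) -> float:
    """F_total = F_sharp (sharp-tool term) + F_edge (edge-radius term) + k_wear * VB."""
    return f_sharp + f_edge + k_wear * vb_mm

def estimate_vb_mm(f_measured: float, f_sharp: float, f_edge: float,
                   k_wear: float) -> float:
    """Inverse approach: attribute the excess measured force to flank wear."""
    return max(f_measured - f_sharp - f_edge, 0.0) / k_wear

print(total_force_n(800.0, 60.0, 1500.0, 0.2))      # -> 1160.0 N
print(estimate_vb_mm(1160.0, 800.0, 60.0, 1500.0))  # -> 0.2 mm
```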
A simple prediction tool for inhaled corticosteroid response in asthmatic children.
Wu, Yi-Fan; Su, Ming-Wei; Chiang, Bor-Luen; Yang, Yao-Hsu; Tsai, Ching-Hui; Lee, Yungling L
2017-12-07
Inhaled corticosteroids are recommended as the first-line controller medication for childhood asthma owing to their multiple clinical benefits. However, heterogeneity in the response towards these drugs remains a significant clinical problem. Children aged 5 to 18 years with mild to moderate persistent asthma were recruited into the Taiwanese Consortium of Childhood Asthma Study. Their responses to inhaled corticosteroids were assessed based on their improvements in the asthma control test and peak expiratory flow. The predictors of responsiveness were demographic and clinical features that were available in primary care settings. We have developed a prediction model using logistic regression and have simplified it to formulate a practical tool. We assessed its predictive performance using the area under the receiver operating characteristic curve. Of the 73 asthmatic children with baseline and follow-up outcome measurements for inhaled corticosteroid treatment, 24 (33%) were defined as non-responders. The tool we have developed consists of three predictors yielding a total score between 0 and 5: the age at physician diagnosis of asthma, sex, and exhaled nitric oxide. The sensitivity and specificity of the tool for predicting inhaled corticosteroid non-responsiveness, at a score of 3, were 0.75 and 0.69, respectively. The area under the receiver operating characteristic curve for the prediction tool was 0.763. Our prediction tool represents a simple and low-cost method for predicting the response to inhaled corticosteroid treatment in asthmatic children.
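A score-based tool like this one amounts to a handful of if-statements plus an evaluation at the chosen cutoff. The point assignments below are hypothetical (the abstract does not publish the weighting), and the data are simulated.

```python
# Toy 0-5 point score from three predictors, evaluated at the cutoff of 3.
import numpy as np

def ics_score(age_dx_years: float, male: bool, feno_ppb: float) -> int:
    pts = 2 if age_dx_years < 5 else 0       # early asthma diagnosis (assumed pts)
    pts += 1 if male else 0
    pts += 2 if feno_ppb > 30 else 0         # elevated exhaled nitric oxide
    return pts

rng = np.random.default_rng(7)
scores = np.array([ics_score(rng.uniform(2, 16), rng.random() < 0.5,
                             rng.uniform(5, 60)) for _ in range(73)])
nonresponder = rng.binomial(1, np.clip(0.1 + 0.1 * scores, 0.0, 1.0))

flagged = scores >= 3
sens = (flagged & (nonresponder == 1)).sum() / max((nonresponder == 1).sum(), 1)
spec = (~flagged & (nonresponder == 0)).sum() / max((nonresponder == 0).sum(), 1)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")
```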
Gillingham, Philip
2016-01-01
Recent developments in digital technology have facilitated the recording and retrieval of administrative data from multiple sources about children and their families. Combined with new ways to mine such data using algorithms which can ‘learn’, it has been claimed that it is possible to develop tools that can predict which individual children within a population are most likely to be maltreated. The proposed benefit is that interventions can then be targeted to the most vulnerable children and their families to prevent maltreatment from occurring. As expertise in predictive modelling increases, the approach may also be applied in other areas of social work to predict and prevent adverse outcomes for vulnerable service users. In this article, a glimpse inside the ‘black box’ of predictive tools is provided to demonstrate how their development for use in social work may not be straightforward, given the nature of the data recorded about service users and service activity. The development of predictive risk modelling (PRM) in New Zealand is focused on as an example as it may be the first such tool to be applied as part of ongoing reforms to child protection services. PMID:27559213
Conservation of a molecular target across species can be used as a line-of-evidence to predict the likelihood of chemical susceptibility. The web-based Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool was developed to simplify, streamline, and quantitat...
Patrick Nombo, Anna; Wendelin Mwanri, Akwilina; Brouwer-Brolsma, Elske M; Ramaiya, Kaushik L; Feskens, Edith
2018-05-28
Universal screening for hyperglycemia during pregnancy may be impractical in resource-constrained countries. Therefore, the aim of this study was to develop a simple, non-invasive practical tool to predict undiagnosed gestational diabetes mellitus (GDM) in Tanzania. We used cross-sectional data from 609 pregnant women, without known diabetes, collected in six health facilities in Dar es Salaam city (urban). Women underwent screening for GDM during antenatal clinic visits. Smoking habit, alcohol consumption, pre-existing hypertension, birth weight of the previous child, high parity, gravida, previous caesarean section, age, MUAC ≥28 cm, previous stillbirth, haemoglobin level, gestational age (weeks), family history of type 2 diabetes, intake of sweetened drinks (soda), physical activity, and vegetable and fruit consumption were considered as candidate predictors for GDM. Multivariate logistic regression modelling was used to create the prediction model, using a cut-off value of 2.5 to minimise the number of undiagnosed GDM cases (false negatives). Mid-upper arm circumference (MUAC) ≥28 cm, previous stillbirth, and family history of type 2 diabetes were identified as significant risk factors for GDM, with a sensitivity, specificity, positive predictive value, and negative predictive value of 69%, 53%, 12%, and 95%, respectively. Moreover, the inclusion of these three predictors resulted in an area under the curve (AUC) of 0.64 (0.56-0.72), indicating that the current tool correctly classifies 64% of high-risk individuals. The findings of this study indicate that MUAC, previous stillbirth, and family history of type 2 diabetes significantly predict GDM development in this Tanzanian population. However, the developed non-invasive practical tool to predict undiagnosed GDM only identified 6 out of 10 individuals at risk of developing GDM. Thus, further development of the tool is warranted, for instance by testing the impact of other known risk factors such as maternal age, pre-pregnancy BMI, hypertension during or before pregnancy, and pregnancy weight gain. Copyright © 2018. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Sarni, W.
2017-12-01
Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs.
ROADMAP:
* A portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks.
* Initial activities that provide education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan.
* Leveraging the Western States Water Council Water Data Exchange database.
* Development of visualization, predictive analytics, and AI tools to engage with stakeholders and provide actionable data and information.
TOOLS:
* Education: provide information on water issues and risks at the local, state, national, and global scale.
* Visualizations: development of data analytics and visualization tools, based on the 2030 Water Resources Group methodology, to support the implementation of the Colorado State Water Plan.
* Predictive analytics: accessing publicly available water databases and using machine learning to develop water-availability forecasting tools and time-lapse images to support city/urban planning.
NASA Astrophysics Data System (ADS)
Nurhuda; Lukito, A.; Masriyah
2018-01-01
This study aims to develop instructional tools, implement them, and assess their effectiveness. The method used in this research followed the Designing Effective Instruction approach. Experimental research with a two-group pretest-posttest design was conducted. The instructional tools developed implement a cooperative learning model with a predict-observe-explain (POE) strategy on the topic of cuboid and cube volume, and consist of lesson plans, POE tasks, and tests. The instructional tools were of good quality by the criteria of validity, practicality, and effectiveness, and were very effective for teaching the volume of cuboids and cubes. The cooperative instructional tools with the POE strategy were of good quality because teachers found the learning steps easy to implement, students understood the material easily, and students' learning outcomes reached classical completeness. Learning with these instructional tools was effective because the learning activities were appropriate and students were very active. Students' learning outcomes reached classical completeness and were better than under conventional learning. This study produced good instructional tools that are effective in learning. These instructional tools can therefore be used as an alternative for teaching the volume of cuboids and cubes.
NASA Astrophysics Data System (ADS)
Okokpujie, Imhade Princess; Ikumapayi, Omolayo M.; Okonkwo, Ugochukwu C.; Salawu, Enesi Y.; Afolalu, Sunday A.; Dirisu, Joseph O.; Nwoke, Obinna N.; Ajayi, Oluseyi O.
2017-12-01
In modern machining operations, tool life is one of the most demanding concerns in the production process, especially in the automotive industry. The aim of this paper is to study tool wear of HSS tools in end milling of aluminium 6061 alloy. Experiments were carried out to investigate tool wear as a function of the machining parameters and to develop a mathematical model using response surface methodology. The machining parameters selected for the experiment were spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using a central composite design (CCD) in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was examined using a scanning electron microscope (SEM). The optimum machining parameter combination of a spindle speed of 2500 rpm, feed rate of 200 mm/min, axial depth of cut of 20 mm, and radial depth of cut of 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable range for tool wear prediction.
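The response-surface step can be sketched as fitting a full quadratic model in the four parameters and evaluating it at a candidate optimum. The data below are synthetic placeholders, not the paper's 31 CCD measurements.

```python
# A minimal sketch of the response-surface idea: fit a full quadratic model
# of tool wear in the four machining parameters and evaluate it at the
# reported optimum. Data below are placeholders, not the paper's measurements.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
# Columns: spindle speed N (rpm), feed rate f (mm/min),
# axial depth a (mm), radial depth r (mm) -- 31 runs, as in a CCD.
X = rng.uniform([1500, 100, 10, 0.5], [3500, 300, 30, 1.5], size=(31, 4))
wear = 0.5 - 1e-4 * X[:, 0] + 1e-3 * X[:, 1] + rng.normal(0, 0.01, 31)  # synthetic

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, wear)
optimum = np.array([[2500, 200, 20, 1.0]])
print("predicted wear at reported optimum (mm):", model.predict(optimum)[0])
```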
Kharroubi, Akram; Saba, Elias; Ghannam, Ibrahim; Darwish, Hisham
2017-12-01
The need for simple self-assessment tools is necessary to predict which women are at high risk of developing osteoporosis. In this study, tools like the IOF One-Minute Test, the Fracture Risk Assessment Tool (FRAX), and the Simple Calculated Osteoporosis Risk Estimation (SCORE) were found to be valid for Palestinian women, and the threshold for predicting women at risk was estimated for each tool. The purpose of this study is to evaluate the validity of the updated IOF (International Osteoporosis Foundation) One-Minute Osteoporosis Risk Assessment Test, FRAX, and SCORE, as well as age alone, to detect the risk of developing osteoporosis in postmenopausal Palestinian women. Three hundred eighty-two women aged 45 years and older were recruited, including 131 women with osteoporosis and 251 controls following bone mineral density (BMD) measurement; 287 completed the questionnaires of the different risk assessment tools. Receiver operating characteristic (ROC) curves were evaluated for each tool using BMD as the gold standard for osteoporosis. The area under the ROC curve (AUC) was highest for FRAX calculated with BMD for predicting hip fractures (0.897), followed by FRAX for major fractures (0.826), with cut-off values >1.5 and >7.8%, respectively. The IOF One-Minute Test AUC (0.629) was the lowest of the tested tools but had sufficient accuracy for predicting the risk of developing osteoporosis, with a cut-off value of >4 total yes answers out of 18 questions. The SCORE test and age alone were also good predictors of the risk of developing osteoporosis. According to the ROC curve for age, women ≥64 years had a higher risk of developing osteoporosis. A higher percentage of women with low BMD (T-score ≤-1.5) or osteoporosis (T-score ≤-2.5) was found among women who were not exposed to the sun, who had menopause before the age of 45 years, or who had a lower body mass index (BMI) compared to controls. Women who often fall had lower BMI, and approximately 27% of the recruited postmenopausal Palestinian women had had accidents that caused fractures. Simple self-assessment tools like FRAX without BMD, SCORE, and the IOF One-Minute Test were valid for predicting which Palestinian postmenopausal women are at high risk of developing osteoporosis.
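Cut-off estimation of the kind described here is conventionally done by maximising Youden's J along the ROC curve. A minimal sketch with synthetic scores and BMD-based labels (not the study's data):

```python
# Sketch: derive a cut-off like ">4 yes answers" by computing the ROC curve
# of a risk score against the BMD-based diagnosis and picking the threshold
# that maximises Youden's J. Scores and labels here are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
osteoporosis = rng.integers(0, 2, 287)                  # 1 = T-score <= -2.5
score = np.where(osteoporosis == 1,
                 rng.normal(6, 2, 287),                 # cases score higher on average
                 rng.normal(4, 2, 287))

fpr, tpr, thresholds = roc_curve(osteoporosis, score)
j = tpr - fpr                                           # Youden's J statistic
best = thresholds[np.argmax(j)]
print(f"AUC={roc_auc_score(osteoporosis, score):.3f}, best cut-off={best:.1f}")
```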
Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization
NASA Technical Reports Server (NTRS)
Gern, Frank H.
2012-01-01
This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel were used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.
Common features of microRNA target prediction tools
Peterson, Sarah M.; Thompson, Jeffrey A.; Ufkin, Melanie L.; Sathyanarayana, Pradeep; Liaw, Lucy; Congdon, Clare Bates
2014-01-01
The human genome encodes for over 1800 microRNAs (miRNAs), which are short non-coding RNA molecules that function to regulate gene expression post-transcriptionally. Due to the potential for one miRNA to target multiple gene transcripts, miRNAs are recognized as a major mechanism to regulate gene expression and mRNA translation. Computational prediction of miRNA targets is a critical initial step in identifying miRNA:mRNA target interactions for experimental validation. The available tools for miRNA target prediction encompass a range of different computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to miRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities, and these are DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. In comparison across all miRNA target prediction tools, four main aspects of the miRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MiRNA target prediction is a dynamic field with increasing attention on development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output. PMID:24600468
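Of the four features, seed match is the simplest to make concrete: a 7mer site in the 3'UTR that is the reverse complement of miRNA nucleotides 2-8. A toy illustration (the UTR fragment is invented):

```python
# Toy illustration of the "seed match" feature: find occurrences of the
# reverse complement of a miRNA's 7mer seed (positions 2-8) in a 3'UTR.
def seed_sites(mirna, utr):
    complement = {"A": "U", "U": "A", "G": "C", "C": "G"}
    seed = mirna[1:8]                                   # nucleotides 2-8, 5'->3'
    target = "".join(complement[b] for b in reversed(seed))
    return [i for i in range(len(utr) - 6) if utr[i:i + 7] == target]

# miR-1-like toy sequence against a made-up UTR fragment; prints [0, 12]
print(seed_sites("UGGAAUGUAAAGAAGUAUGUAU", "ACAUUCCAGGCAACAUUCCA"))
```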
van Bokhorst-de van der Schueren, Marian A E; Guaitoli, Patrícia Realino; Jansma, Elise P; de Vet, Henrica C W
2014-02-01
Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study the construct or criterion validity and the predictive validity of nutrition screening tools for the general hospital setting. A systematic review of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to 2 February 2012). Additional studies were identified by checking the reference lists of identified manuscripts. Search terms included key words for malnutrition, screening or assessment instruments, and terms for the hospital setting and adults. Data were extracted independently by 2 authors. Only studies expressing the (construct, criterion or predictive) validity of a tool were included. 83 studies (32 screening tools) were identified: 42 studies on construct or criterion validity versus a reference method and 51 studies on predictive validity for outcome (i.e. length of stay, mortality or complications). None of the tools performed consistently well in establishing the patients' nutritional status. For the elderly, MNA performed fair to good; for adults, MUST performed fair to good. SGA, NRS-2002 and MUST performed well in predicting outcome in approximately half of the studies reviewed in adults, but not in older patients. Not one single screening or assessment tool is capable of adequate nutrition screening as well as predicting poor nutrition-related outcomes. Development of new tools seems redundant and will most probably not lead to new insights. New studies comparing different tools within one patient population are required. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
NASA Technical Reports Server (NTRS)
Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris
2011-01-01
A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variety of experimental data sets, such as UH-60A data, DNW test data and HART II test data.
PredictSNP: Robust and Accurate Consensus Classifier for Prediction of Disease-Related Mutations
Bendl, Jaroslav; Stourac, Jan; Salanda, Ondrej; Pavelka, Antonin; Wieben, Eric D.; Zendulka, Jaroslav; Brezovsky, Jan; Damborsky, Jiri
2014-01-01
Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for the prediction of the effects of mutations on protein function are very important for analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement is hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicities, inconsistencies and mutations previously used in the training of evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into a consensus classifier PredictSNP, resulting into significantly improved prediction performance, and at the same time returned results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp. PMID:24453961
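The consensus idea can be sketched as a confidence-weighted majority vote over the individual tools' calls; the weights and calls below are illustrative, not PredictSNP's trained parameters:

```python
# A minimal consensus scheme in the spirit of PredictSNP: combine the binary
# calls of several tools by confidence-weighted majority vote.
def consensus(calls):
    """calls: list of (prediction, confidence) with prediction +1/-1
    (deleterious/neutral) and confidence in (0, 1]."""
    score = sum(pred * conf for pred, conf in calls)
    label = "deleterious" if score > 0 else "neutral"
    return label, abs(score) / sum(conf for _, conf in calls)

# Six hypothetical tool outputs for one variant.
tools = [(+1, 0.8), (+1, 0.6), (-1, 0.5), (+1, 0.7), (-1, 0.4), (+1, 0.9)]
label, confidence = consensus(tools)
print(label, round(confidence, 2))   # deleterious 0.54
```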
Flight Experiment Verification of Shuttle Boundary Layer Transition Prediction Tool
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Berger, Karen T.; Horvath, Thomas J.; Wood, William A.
2016-01-01
Boundary layer transition at hypersonic conditions is critical to the design of future high-speed aircraft and spacecraft. Accurate methods to predict transition would directly impact the aerothermodynamic environments used to size a hypersonic vehicle's thermal protection system. A transition prediction tool, based on wind tunnel derived discrete roughness correlations, was developed and implemented for the Space Shuttle return-to-flight program. This tool was also used to design a boundary layer transition flight experiment in order to assess correlation uncertainties, particularly with regard to high Mach-number transition and tunnel-to-flight scaling. A review is provided of the results obtained from the flight experiment in order to evaluate the transition prediction tool implemented for the Shuttle program.
2018-01-01
Background: Around the world, depression is both under- and overtreated. The diamond clinical prediction tool was developed to assist with appropriate treatment allocation by estimating the 3-month prognosis among people with current depressive symptoms. Delivering clinical prediction tools in a way that will enhance their uptake in routine clinical practice remains challenging; however, mobile apps show promise in this respect. To increase the likelihood that an app-delivered clinical prediction tool can be successfully incorporated into clinical practice, it is important to involve end users in the app design process. Objective: The aim of the study was to maximize patient engagement in an app designed to improve treatment allocation for depression. Methods: An iterative, user-centered design process was employed. Qualitative data were collected via 2 focus groups with a community sample (n=17) and 7 semistructured interviews with people with depressive symptoms. The results of the focus groups and interviews were used by the computer engineering team to modify subsequent prototypes of the app. Results: Iterative development resulted in 3 prototypes and a final app. The areas requiring the most substantial changes following end-user input were related to the iconography used and the way that feedback was provided. In particular, communicating the risk of future depressive symptoms proved difficult; these messages were consistently misinterpreted and negatively viewed and were ultimately removed. All participants felt positively about seeing their results summarized after completion of the clinical prediction tool, but there was a need for a personalized treatment recommendation made in conjunction with a consultation with a health professional. Conclusions: User-centered design led to valuable improvements in the content and design of an app designed to improve allocation of and engagement in depression treatment. Iterative design allowed us to develop a tool that lets users feel hope, engage in self-reflection, and feel motivated to seek treatment. The tool is currently being evaluated in a randomized controlled trial. PMID:29685864
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Callaghan, Michael E., E-mail: elspeth.raymond@health.sa.gov.au; Freemasons Foundation Centre for Men's Health, University of Adelaide; Urology Unit, Repatriation General Hospital, SA Health, Flinders Centre for Innovation in Cancer
Purpose: To identify, through a systematic review, all validated tools used for the prediction of patient-reported outcome measures (PROMs) in patients being treated with radiation therapy for prostate cancer, and provide a comparative summary of accuracy and generalizability. Methods and Materials: PubMed and EMBASE were searched from July 2007. Title/abstract screening, full text review, and critical appraisal were undertaken by 2 reviewers, whereas data extraction was performed by a single reviewer. Eligible articles had to provide a summary measure of accuracy and undertake internal or external validation. Tools were recommended for clinical implementation if they had been externally validated and found to have accuracy ≥70%. Results: The search strategy identified 3839 potential studies, of which 236 progressed to full text review and 22 were included. From these studies, 50 tools predicted gastrointestinal/rectal symptoms, 29 tools predicted genitourinary symptoms, 4 tools predicted erectile dysfunction, and no tools predicted quality of life. For patients treated with external beam radiation therapy, 3 tools could be recommended for the prediction of rectal toxicity, gastrointestinal toxicity, and erectile dysfunction. For patients treated with brachytherapy, 2 tools could be recommended for the prediction of urinary retention and erectile dysfunction. Conclusions: A large number of tools for the prediction of PROMs in prostate cancer patients treated with radiation therapy have been developed. Only a small minority are accurate and have been shown to be generalizable through external validation. This review provides an accessible catalogue of tools that are ready for clinical implementation as well as which should be prioritized for validation.
NASA Astrophysics Data System (ADS)
Johnston, Michael A.; Farrell, Damien; Nielsen, Jens Erik
2012-04-01
The exchange of information between experimentalists and theoreticians is crucial to improving the predictive ability of theoretical methods and hence our understanding of the related biology. However, many barriers exist which prevent the flow of information between the two disciplines. Enabling effective collaboration requires that experimentalists can easily apply computational tools to their data and share their data with theoreticians, and that both the experimental data and computational results are accessible to the wider community. We present a prototype collaborative environment for developing and validating predictive tools for protein biophysical characteristics. The environment is built on two central components: a new Python-based integration module which allows theoreticians to provide and manage remote access to their programs, and PEATDB, a program for storing and sharing experimental data from protein biophysical characterisation studies. We demonstrate our approach by integrating PEATSA, a web-based service for predicting changes in protein biophysical characteristics, into PEATDB. Furthermore, we illustrate how the resulting environment aids method development using the Potapov dataset of experimentally measured ΔΔGfold values, previously employed to validate and train protein stability prediction algorithms.
Validation of BEHAVE fire behavior predictions in oak savannas using five fuel models
Keith Grabner; John Dwyer; Bruce Cutter
1997-01-01
Prescribed fire is a valuable tool in the restoration and management of oak savannas. BEHAVE, a fire behavior prediction system developed by the United States Forest Service, can be a useful tool when managing oak savannas with prescribed fire. BEHAVE predictions of fire rate-of-spread and flame length were validated using four standardized fuel models: Fuel Model 1 (...
High-fidelity modeling and impact footprint prediction for vehicle breakup analysis
NASA Astrophysics Data System (ADS)
Ling, Lisa
For decades, vehicle breakup analysis has been performed for space missions that used nuclear heater or power units in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative for assessing possible environmental impacts, obtaining launch approval, and planning launch contingencies. To perform a vehicle breakup analysis accurately, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify additional dynamics modeling and capabilities for the analysis tool, with the objectives of (1) predicting the impact point and footprint, (2) increasing the fidelity of the vehicle breakup prediction, and (3) reducing the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degrees-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of the impact footprint. The functions to increase the fidelity of the vehicle breakup prediction included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrary-shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.
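A 3-degrees-of-freedom propagation of the kind described can be sketched as planar point-mass equations of motion with drag in an exponential atmosphere. This is a generic textbook formulation, not the dissertation's algorithm; the ballistic coefficient and entry state are placeholders.

```python
# Illustrative 3-DOF point-mass reentry propagation: altitude, velocity and
# flight-path angle under drag and gravity in an exponential atmosphere.
import math

def propagate(h, v, gamma, dt=0.1, beta=300.0):       # beta = m/(Cd*A), kg/m^2
    g, rho0, H, Re = 9.81, 1.225, 7200.0, 6.371e6     # generic Earth constants
    track = []
    while h > 0:
        rho = rho0 * math.exp(-h / H)                 # exponential atmosphere
        drag = rho * v * v / (2.0 * beta)             # drag deceleration, m/s^2
        v += (-drag - g * math.sin(gamma)) * dt
        gamma += (v * math.cos(gamma) / (Re + h) - (g / v) * math.cos(gamma)) * dt
        h += v * math.sin(gamma) * dt
        track.append((h, v, gamma))
    return track

path = propagate(h=120e3, v=7500.0, gamma=math.radians(-2.0))
print(f"time steps to impact: {len(path)}")
```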
Wildland Fire Research: Tools and Technology Development
Scientific tools are needed to better quantify and predict the impact of smoke from wildfires on public health. EPA research is supporting the development of new air quality monitors, advancing modeling, and improving emissions inventories.
Tampa Bay Water Clarity Model (TBWCM): As a Predictive Tool
The Tampa Bay Water Clarity Model was developed as a predictive tool for estimating the impact of changing nutrient loads on water clarity as measured by Secchi depth. The model combines a physical mixing model with an irradiance model and a nutrient cycling model. A 10 segment bi...
Siedlecki, Sandra L; Albert, Nancy M
This article describes how to assess the interrater reliability and validity of risk assessment tools, using easy-to-follow formulas, and provides calculations that demonstrate the principles discussed. Clinical nurse specialists should be able to identify risk assessment tools that provide high-quality interrater reliability and the highest validity for predicting true events of importance to clinical settings. Making best-practice recommendations for assessment tool use is critical to high-quality patient care and to safe practices that impact patient outcomes and nursing resources. Optimal risk assessment tool selection requires knowledge about interrater reliability and tool validity. The clinical nurse specialist will understand the reliability and validity issues associated with risk assessment tools and be able to evaluate tools using basic calculations. Risk assessment tools are developed to objectively predict quality and safety events and ultimately reduce the risk of event occurrence through preventive interventions. To ensure high-quality tool use, clinical nurse specialists must critically assess tool properties. The better the tool's ability to predict adverse events, the more likely it is that event risk is mitigated. Interrater reliability and validity assessment is a relatively easy skill to master and will result in better decisions when selecting or making recommendations for risk assessment tool use.
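For interrater reliability of a binary risk tool, the easy-to-follow formula in question is typically Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with made-up ratings:

```python
# Cohen's kappa for two raters scoring the same patients with a binary risk
# tool (1 = at risk, 0 = not at risk); ratings are made up for illustration.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_a = sum(rater_a) / n                         # proportion "at risk" per rater
    p_b = sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)   # agreement expected by chance
    return (observed - expected) / (1 - expected)

a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(a, b), 2))   # 0.58: moderate agreement beyond chance
```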
Steger-Hartmann, Thomas; Länge, Reinhard; Heuck, Klaus
2011-05-01
The concentration of a pharmaceutical found in the environment is determined by the amount used by patients, the excretion and metabolism pattern, and eventually by its persistence. Biological degradation or persistence of a pharmaceutical is experimentally tested rather late in the development of a pharmaceutical, often shortly before submission of the dossier to regulatory authorities. To investigate whether the persistence of a compound could be assessed early during drug development, we examined whether the biodegradation of pharmaceuticals could be predicted with the help of in silico tools. To assess the value of in silico prediction, we collected results for the OECD 301 degradation test ("ready biodegradability") of 42 drugs or drug synthesis intermediates and compared them to the predictions of the in silico tool BIOWIN. Of these compounds, 38 were predictable with BIOWIN, which is a module of the Estimation Programs Interface (EPI) Suite™ provided by the US EPA. The program failed to predict the two drugs which proved to be readily biodegradable in the degradation tests. On the other hand, BIOWIN predicted two compounds to be readily biodegradable which proved to be persistent in the test setting. The comparison of experimental data with the predictions resulted in a specificity of 94% and a sensitivity of 0%. The results of this study do not indicate that application of the biodegradation prediction tool BIOWIN is a feasible approach to assessing ready biodegradability during early drug development.
NASA Technical Reports Server (NTRS)
Wang, John T.; Bomarito, Geoffrey F.
2016-01-01
This study implements a plasticity tool to predict the nonlinear shear behavior of unidirectional composite laminates under multiaxial loadings, with an intent to further develop the tool for use in composite progressive damage analysis. The steps for developing the plasticity tool include establishing a general quadratic yield function, deriving the incremental elasto-plastic stress-strain relations using the yield function with associated flow rule, and integrating the elasto-plastic stress-strain relations with a modified Euler method and a substepping scheme. Micromechanics analyses are performed to obtain normal and shear stress-strain curves that are used in determining the plasticity parameters of the yield function. By analyzing a micromechanics model, a virtual testing approach is used to replace costly experimental tests for obtaining stress-strain responses of composites under various loadings. The predicted elastic moduli and Poisson's ratios are in good agreement with experimental data. The substepping scheme for integrating the elasto-plastic stress-strain relations is suitable for working with displacement-based finite element codes. An illustration problem is solved to show that the plasticity tool can predict the nonlinear shear behavior for a unidirectional laminate subjected to multiaxial loadings.
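The substepping integration strategy can be sketched in one dimension: take a two-stage (modified Euler) estimate of the stress increment, compare the two stages to estimate the local error, and halve the substep until the error is acceptable. The sketch below uses a simple elastic/linear-hardening model rather than the paper's general quadratic yield function; all material constants are placeholders.

```python
# Modified-Euler substepping for a 1-D elastoplastic update (Sloan-style
# error control), with linear hardening standing in for the paper's model.
def integrate(strain_inc, stress=0.0, eps_p=0.0, tol=1e-4,
              E=70e3, H=5e3, sy0=100.0):              # MPa units, placeholders
    def tangent(s, ep):
        # Elastic modulus inside the yield surface, elastoplastic on it.
        return E if abs(s) < sy0 + H * ep - 1e-9 else E * H / (E + H)

    remaining, frac = 1.0, 1.0        # pseudo-time left, current substep size
    while remaining > 1e-12:
        step = min(frac, remaining)
        d = step * strain_inc
        k1 = tangent(stress, eps_p) * d               # first-stage increment
        k2 = tangent(stress + k1, eps_p) * d          # second-stage estimate
        err = abs(k2 - k1) / max(abs(stress + 0.5 * (k1 + k2)), 1.0)
        if err > tol and frac > 1e-6:
            frac *= 0.5                               # reject, halve the substep
            continue
        d_sigma = 0.5 * (k1 + k2)                     # accepted modified-Euler update
        eps_p += max(0.0, d - d_sigma / E)            # plastic part of the substep
        stress += d_sigma
        remaining -= step
        frac = min(frac * 2.0, 1.0)                   # grow the substep after success
    return stress, eps_p

print(integrate(0.005))   # pull a virgin material to 0.5% strain
```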
NASA Technical Reports Server (NTRS)
Lyle, Karen H.
2008-01-01
The Space Shuttle Columbia Accident Investigation Board recommended that NASA develop, validate, and maintain a modeling tool capable of predicting the damage threshold for debris impacts on the Space Shuttle Reinforced Carbon-Carbon (RCC) wing leading edge and nosecap assembly. The results presented in this paper are one part of a multi-level approach that supported the development of the predictive tool used to recertify the shuttle for flight following the Columbia Accident. The assessment of predictive capability was largely based on test analysis comparisons for simpler component structures. This paper provides comparisons of finite element simulations with test data for external tank foam debris impacts onto 6-in. square RCC flat panels. Both quantitative displacement and qualitative damage assessment correlations are provided. The comparisons show good agreement and provided the Space Shuttle Program with confidence in the predictive tool.
Conser, Christiana; Seebacher, Lizbeth; Fujino, David W; Reichard, Sarah; DiTomaso, Joseph M
2015-01-01
Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing them as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when "needs further evaluation" classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When "needs further evaluation" classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program.
Bolt, H. L.; Williams, C. E. J.; Brooks, R. V.; ...
2017-01-13
Hydrophobicity has proven to be an extremely useful parameter in small molecule drug discovery programmes given that it can be used as a predictive tool to enable rational design. For larger molecules, including peptoids, where folding is possible, the situation is more complicated and the average hydrophobicity (as determined by RP-HPLC retention time) may not always provide an effective predictive tool for rational design. Herein, we report the first ever application of partitioning experiments to determine the log D values for a series of peptoids. By comparing log D and average hydrophobicities we highlight the potential advantage of employing the former as a predictive tool in the rational design of biologically active peptoids.
Furmanchuk, Al'ona; Saal, James E; Doak, Jeff W; Olson, Gregory B; Choudhary, Alok; Agrawal, Ankit
2018-02-05
A regression model-based tool was developed for predicting the Seebeck coefficient of crystalline materials in the temperature range from 300 K to 1000 K. The tool accounts for the single-crystal versus polycrystalline nature of the compound, the production method, and the properties of the constituent elements in the chemical formula. We introduce new descriptive features of crystalline materials relevant to predicting the Seebeck coefficient. To address off-stoichiometry in materials, the predictive tool is trained on a mix of stoichiometric and nonstoichiometric materials. The tool is implemented in a web application (http://info.eecs.northwestern.edu/SeebeckCoefficientPredictor) to assist field scientists in the discovery of novel thermoelectric materials. © 2017 Wiley Periodicals, Inc.
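In spirit, such a tool is a regression from material descriptors and conditions to the Seebeck coefficient. A minimal sketch with invented features and synthetic data; the published descriptor set is far richer:

```python
# Sketch of the modelling approach: regression from simple composition- and
# condition-level descriptors to the Seebeck coefficient. All data invented.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
# Features: mean atomic mass, mean electronegativity, temperature (K),
# single-crystal flag (1/0) -- hypothetical descriptors, for illustration.
X = np.column_stack([
    rng.uniform(20, 200, 500),
    rng.uniform(1.0, 3.5, 500),
    rng.uniform(300, 1000, 500),
    rng.integers(0, 2, 500),
])
seebeck = 100 + 0.2 * X[:, 0] - 30 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 5, 500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, seebeck)
print("predicted S (uV/K):", model.predict([[120.0, 2.1, 600.0, 1]])[0])
```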
Predictive Model of Linear Antimicrobial Peptides Active against Gram-Negative Bacteria.
Vishnepolsky, Boris; Gabrielian, Andrei; Rosenthal, Alex; Hurt, Darrell E; Tartakovsky, Michael; Managadze, Grigol; Grigolava, Maya; Makhatadze, George I; Pirtskhalava, Malak
2018-05-29
Antimicrobial peptides (AMPs) have been identified as a potential new class of anti-infectives for drug development. Many computational methods attempt to predict AMPs. Most of them can only predict whether a peptide will show any antimicrobial potency, but to the best of our knowledge, there are no tools which can predict antimicrobial potency against particular strains. Here we present a predictive model of linear AMPs active against particular Gram-negative strains relying on a semi-supervised machine-learning approach with a density-based clustering algorithm. The algorithm can distinguish peptides active against a particular strain from others which may also be active but not against the considered strain. The available AMP prediction tools cannot carry out this task. The prediction tool based on the algorithm suggested herein is available at https://dbaasp.org.
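A rough sketch of the density-based step, assuming toy peptide descriptors (net charge, mean hydrophobicity, length) that are not the authors' feature set: cluster the peptides known to be active against a strain with DBSCAN and call a query active when it falls within eps of a core point.

```python
# Density-based sketch: DBSCAN on descriptors of peptides active against one
# strain; a query peptide is "active" if it lies near a core point.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Toy descriptors: net charge, mean hydrophobicity, length (all invented).
active = rng.normal([4.0, 0.4, 20.0], [1.0, 0.1, 4.0], size=(60, 3))

scaler = StandardScaler().fit(active)
db = DBSCAN(eps=0.9, min_samples=5).fit(scaler.transform(active))
core = scaler.transform(active)[db.core_sample_indices_]

def predicted_active(features):
    """Call the query active if it is within eps of any core point."""
    z = scaler.transform([features])[0]
    return bool(np.any(np.linalg.norm(core - z, axis=1) <= 0.9))

print(predicted_active([4.2, 0.38, 21.0]))    # near the active cluster -> True
print(predicted_active([-2.0, 0.05, 8.0]))    # far from it -> False
```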
Predicting Operator Execution Times Using CogTool
NASA Technical Reports Server (NTRS)
Santiago-Espada, Yamira; Latorella, Kara A.
2013-01-01
Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtain skilled user performance times analytically, before system testing with users. This paper describes CogTool models for a two-pilot crew executing two different types of datalink clearance acceptance tasks on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed in videotapes and recorded in simulation files. Results indicate no statistically significant difference between the empirical data and the CogTool predictions. A population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.
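A comparison of this kind can be run as a one-sample test of the observed execution times against the CogTool point estimate; the times below are fabricated for illustration, not the study's measurements:

```python
# One-sample t-test of observed execution times against a CogTool estimate.
import numpy as np
from scipy import stats

cogtool_estimate = 12.4                      # seconds, hypothetical model output
observed = np.array([11.8, 13.1, 12.9, 12.0, 12.6, 13.4, 11.5, 12.2])

t, p = stats.ttest_1samp(observed, cogtool_estimate)
print(f"t={t:.2f}, p={p:.3f}")               # large p -> no significant difference
```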
Decision-making tools in prostate cancer: from risk grouping to nomograms.
Fontanella, Paolo; Benecchi, Luigi; Grasso, Angelica; Patel, Vipul; Albala, David; Abbou, Claude; Porpiglia, Francesco; Sandri, Marco; Rocco, Bernardo; Bianchi, Giampaolo
2017-12-01
Prostate cancer (PCa) is the most common solid neoplasm and the second leading cause of cancer death in men. After the Partin tables were developed, a number of predictive and prognostic tools became available for risk stratification. These tools have allowed the urologist to better characterize this disease and have led to more confident treatment decisions for patients. The purpose of this study is to critically review the decision-making tools currently available to the urologist, from the moment when PCa is first diagnosed until patients experience metastatic progression and death. A systematic and critical analysis through the Medline, EMBASE, Scopus and Web of Science databases was carried out in February 2016 as per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. The search was conducted using the following key words: "prostate cancer," "prediction tools," "nomograms." Seventy-two studies were identified in the literature search. We summarized the results into six sections: Tools for prediction of life expectancy (before treatment), Tools for prediction of pathological stage (before treatment), Tools for prediction of survival and cancer-specific mortality (before/after treatment), Tools for prediction of biochemical recurrence (before/after treatment), Tools for prediction of metastatic progression (after treatment) and, in the last section, biomarkers and genomics. The management of PCa patients requires a tailored approach to deliver a truly personalized treatment. The currently available tools are of great help to the urologist in the decision-making process. These tests perform very well in high-grade and low-grade disease, while for intermediate-grade disease further research is needed. Newly discovered markers, genomic tests, and advances in imaging acquisition through mpMRI will help in instilling confidence that the appropriate treatments are being offered to patients with prostate cancer.
Cherkaoui, Imad; Sabouni, Radia; Ghali, Iraqi; Kizub, Darya; Billioux, Alexander C; Bennani, Kenza; Bourkadi, Jamal Eddine; Benmamoun, Abderrahmane; Lahlou, Ouafae; Aouad, Rajae El; Dooley, Kelly E
2014-01-01
Setting: Public tuberculosis (TB) clinics in urban Morocco. Objectives: To explore risk factors for TB treatment default and develop a prediction tool, and to assess the consequences of default, specifically the risk of transmission or development of drug resistance. Design: A case-control study comparing patients who defaulted from TB treatment and patients who completed it, using quantitative methods and open-ended questions. Results were interpreted in light of health professionals' perspectives from a parallel study. A predictive model and simple tool to identify patients at high risk of default were developed. Sputum from cases with pulmonary TB was collected for smear and drug susceptibility testing. Results: 91 cases and 186 controls were enrolled. Independent risk factors for default included current smoking, retreatment, work interference with adherence, daily directly observed therapy, side effects, quick symptom resolution, and not knowing one's treatment duration. Age >50 years, never smoking, and having friends who knew one's diagnosis were protective. A simple scoring tool incorporating these factors was 82.4% sensitive and 87.6% specific for predicting default in this population. Clinicians and patients described additional contributors to default and suggested locally relevant intervention targets. Among 89 cases with pulmonary TB, 71% had sputum that was smear-positive for TB. Drug resistance was rare. Conclusions: The causes of default from TB treatment were explored through synthesis of qualitative and quantitative data from patients and health professionals. A scoring tool with high sensitivity and specificity to predict default was developed. Prospective evaluation of this tool coupled with targeted interventions based on our findings is warranted. Of note, the risk of TB transmission from patients who default treatment is likely to be high; the commonly feared risk of drug resistance, though, may be low. A larger study is required to confirm these findings.
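Scoring tools like this one are commonly derived by dividing each logistic regression coefficient by the smallest coefficient and rounding to integer points. The coefficients below are invented stand-ins, not the study's fitted model:

```python
# Common recipe for turning a logistic model into a bedside score: divide each
# coefficient by the smallest one and round to integers. Values are invented.
coefficients = {
    "current smoking": 0.9, "retreatment": 1.3, "work interference": 0.7,
    "daily DOT": 0.8, "side effects": 0.6, "quick symptom resolution": 0.7,
    "unknown treatment duration": 1.1,
}
base = min(coefficients.values())
points = {factor: round(beta / base) for factor, beta in coefficients.items()}

def risk_score(patient_factors):        # patient_factors: set of present factors
    return sum(points[f] for f in patient_factors)

print(points)
print(risk_score({"current smoking", "retreatment", "side effects"}))
```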
NASA Astrophysics Data System (ADS)
Kardhana, Hadi; Arya, Doni Khaira; Hadihardaja, Iwan K.; Widyaningtyas; Riawan, Edi; Lubis, Atika
2017-11-01
Small-Scale Hydropower (SHP) has been an important source of electric power in Indonesia. Indonesia is a vast country consisting of more than 17,000 islands. It has large freshwater resources, with about 3 m of rainfall and 2 m of runoff. Much of its topography is mountainous and remote but abundant in potential energy. Millions of people do not have sufficient access to electricity, and some live in remote places. Recently, SHP development has been encouraged to supply energy to these places. The development of global hydrology data provides an opportunity to predict the distribution of hydropower potential. In this paper, we demonstrate a run-of-river SHP site prediction tool using SWAT and a river diversion algorithm. The Soil and Water Assessment Tool (SWAT), driven by a 10-year period of CFSR (Climate Forecast System Reanalysis) data, was used to predict spatially distributed flow cumulative distribution functions (CDFs). A simple algorithm that maximizes the potential head at a location through a river diversion, representing the head race and penstock, was applied. The firm flow and power of the SHP were estimated from the CDF and the algorithm. The tool was applied to the Upper Citarum River Basin, and three out of four existing hydropower locations were well predicted. The result implies that this tool can support the acceleration of SHP development at an early phase.
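The firm-flow estimate can be sketched directly from the flow CDF: take the flow exceeded, say, 95% of the time and convert flow and head to power via P = rho * g * Q * H * eta. Flows, head, and efficiency below are invented:

```python
# Sketch: firm flow from a flow duration curve (the empirical CDF of daily
# flows) and conversion of flow and head to hydropower. All inputs invented.
import numpy as np

daily_flow = np.random.default_rng(4).lognormal(mean=2.0, sigma=0.6, size=3650)  # m^3/s
firm_flow = np.quantile(daily_flow, 0.05)      # flow exceeded 95% of the time

def hydro_power_kw(q, head, efficiency=0.8):
    rho, g = 1000.0, 9.81                      # water density, gravity
    return rho * g * q * head * efficiency / 1000.0

print(f"firm flow: {firm_flow:.2f} m^3/s, "
      f"firm power: {hydro_power_kw(firm_flow, head=25.0):.0f} kW")
```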
Review of Orbiter Flight Boundary Layer Transition Data
NASA Technical Reports Server (NTRS)
McGinley, Catherine B.; Berry, Scott A.; Kinder, Gerald R.; Barnell, Maria; Wang, Kuo C.; Kirk, Benjamin S.
2006-01-01
In support of the Shuttle Return to Flight program, a tool was developed to predict when boundary layer transition would occur on the lower surface of the orbiter during reentry due to the presence of protuberances and cavities in the thermal protection system. This predictive tool was developed based on extensive wind tunnel tests conducted after the loss of the Space Shuttle Columbia. Recognizing that wind tunnels cannot simulate the exact conditions an orbiter encounters as it re-enters the atmosphere, a preliminary attempt was made to use the documented flight related damage and the orbiter transition times, as deduced from flight instrumentation, to calibrate the predictive tool. After flight STS-114, the Boundary Layer Transition Team decided that a more in-depth analysis of the historical flight data was needed to better determine the root causes of the occasional early transition times of some of the past shuttle flights. In this paper we discuss our methodology for the analysis, the various sources of shuttle damage information, the analysis of the flight thermocouple data, and how the results compare to the Boundary Layer Transition prediction tool designed for Return to Flight.
Development of the Rice Convection Model as a Space Weather Tool
2015-05-31
coupled to the ionosphere that is suitable for both scientific studies as well as a prediction tool. We are able to run the model faster than "real... of work by finding ways to fund a more systematic effort in making the RCM a space weather prediction tool for magnetospheric and ionospheric studies... Keywords: convection electric field, total electron content, TEC, ionospheric convection, plasmasphere
Family-Based Benchmarking of Copy Number Variation Detection Software.
Nutsua, Marcel Elie; Fischer, Annegret; Nebel, Almut; Hofmann, Sylvia; Schreiber, Stefan; Krawczak, Michael; Nothnagel, Michael
2015-01-01
The analysis of structural variants, in particular of copy-number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pair-wise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using Hidden Markov models (HMM) showed a trend to predict fewer CNVs than segmentation-based algorithms albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.
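Pairwise prediction concordance of the kind assessed here is often computed with a reciprocal-overlap criterion between two tools' call sets. A minimal sketch with made-up calls:

```python
# Pairwise CNV concordance: count calls with >= 50% reciprocal overlap on the
# same chromosome. Calls are (chrom, start, end); coordinates are made up.
def reciprocal_overlap(a, b, min_frac=0.5):
    if a[0] != b[0]:
        return False
    ov = min(a[2], b[2]) - max(a[1], b[1])
    return ov > 0 and ov >= min_frac * (a[2] - a[1]) and ov >= min_frac * (b[2] - b[1])

def concordance(calls_a, calls_b):
    matched = sum(any(reciprocal_overlap(a, b) for b in calls_b) for a in calls_a)
    return matched / len(calls_a)

tool_a = [("1", 1000, 5000), ("2", 7000, 9000), ("7", 100, 400)]
tool_b = [("1", 1200, 5200), ("2", 8500, 12000), ("9", 50, 900)]
print(f"{concordance(tool_a, tool_b):.2f} of tool A's calls matched by tool B")
```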
Risk Prediction Models for Acute Kidney Injury in Critically Ill Patients: Opus in Progressu.
Neyra, Javier A; Leaf, David E
2018-05-31
Acute kidney injury (AKI) is a complex systemic syndrome associated with high morbidity and mortality. Among critically ill patients admitted to intensive care units (ICUs), the incidence of AKI is as high as 50% and is associated with dismal outcomes. Thus, the development and validation of clinical risk prediction tools that accurately identify patients at high risk for AKI in the ICU is of paramount importance. We provide a comprehensive review of 3 clinical risk prediction tools that have been developed for incident AKI occurring in the first few hours or days following admission to the ICU. We found substantial heterogeneity among the clinical variables that were examined and included as significant predictors of AKI in the final models. The area under the receiver operating characteristic curves was ∼0.8 for all 3 models, indicating satisfactory model performance, though positive predictive values ranged from only 23 to 38%. Hence, further research is needed to develop more accurate and reproducible clinical risk prediction tools. Strategies for improved assessment of AKI susceptibility in the ICU include the incorporation of dynamic (time-varying) clinical parameters, as well as biomarker, functional, imaging, and genomic data. © 2018 S. Karger AG, Basel.
Online Analysis of Wind and Solar Part I: Ramping Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.
2012-01-31
To facilitate wider penetration of renewable resources without compromising system reliability, given the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.
Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology
NASA Technical Reports Server (NTRS)
Knight, Norman F.
1998-01-01
The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and development of an accurate, efficient analysis, design and optimization tool for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.
Engineering Property Prediction Tools for Tailored Polymer Composite Structures (49465)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Kunc, Vlastimil
2009-12-29
Process and constitutive models as well as characterization tools and testing methods were developed to determine stress-strain responses, damage development, strengths and creep of long-fiber thermoplastics (LFTs). The developed models were implemented in Moldflow and ABAQUS and have been validated against LFT data obtained experimentally.
Erbel, Raimund; Lehmann, Nils; Churzidse, Sofia; Rauwolf, Michael; Mahabadi, Amir A; Möhlenkamp, Stefan; Moebus, Susanne; Bauer, Marcus; Kälsch, Hagen; Budde, Thomas; Montag, Michael; Schmermund, Axel; Stang, Andreas; Führer-Sakel, Dagmar; Weimar, Christian; Roggenbuck, Ulla; Dragano, Nico; Jöckel, Karl-Heinz
2014-11-07
Coronary artery calcification (CAC), as a sign of atherosclerosis, can be detected and progression quantified using computed tomography (CT). We develop a tool for predicting CAC progression. In 3481 participants (45-74 years, 53.1% women) CAC percentiles at baseline (CACb) and after five years (CAC₅y) were evaluated, demonstrating progression along gender-specific percentiles, which showed exponentially shaped age-dependence. Using quantile regression on the log-scale (log(CACb+1)) we developed a tool to individually predict CAC₅y, and compared to observed CAC₅y. The difference between observed and predicted CAC₅y (log-scale, mean±SD) was 0.08±1.11 and 0.06±1.29 in men and women. Agreement reached a kappa-value of 0.746 (95% confidence interval: 0.732-0.760) and concordance correlation (log-scale) of 0.886 (0.879-0.893). Explained variance of observed by predicted log(CAC₅y+1) was 80.1% and 72.0% in men and women, and 81.0 and 73.6% including baseline risk factors. Evaluating the tool in 1940 individuals with CACb>0 and CACb<400 at baseline, of whom 242 (12.5%) developed CAC₅y>400, yielded a sensitivity of 59.5%, specificity 96.1%, (+) and (-) predictive values of 68.3% and 94.3%. A pre-defined acceptance range around predicted CAC₅y contained 68.1% of observed CAC₅y; only 20% were expected by chance. Age, blood pressure, lipid-lowering medication, diabetes, and smoking contributed to progression above the acceptance range in men and, excepting age, in women. CAC nearly inevitably progresses with limited influence of cardiovascular risk factors. This allowed the development of a mathematical tool for prediction of individual CAC progression, enabling anticipation of the age when CAC thresholds of high risk are reached. © The Author 2014. Published by Oxford University Press on behalf of the European Society of Cardiology.
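The core modelling step, quantile regression on the log scale, can be sketched with statsmodels; the synthetic data below merely mimic exponential age-dependence and are not the cohort's measurements:

```python
# Median regression of log(CAC_5y + 1) on baseline log(CAC + 1) and age,
# as a sketch of the quantile-regression step. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1000
age = rng.uniform(45, 75, n)
log_cac_b = np.clip(rng.normal(0.08 * (age - 45), 1.2), 0, None)
df = pd.DataFrame({
    "age": age,
    "log_cac_b": log_cac_b,
    "log_cac_5y": log_cac_b + 0.35 + 0.01 * (age - 45) + rng.normal(0, 0.3, n),
})

median_fit = smf.quantreg("log_cac_5y ~ log_cac_b + age", df).fit(q=0.5)
print(median_fit.params)   # predicted 5-year progression on the log scale
```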
Müller, Martin; Seidenberg, Ruth; Schuh, Sabine K; Exadaktylos, Aristomenis K; Schechter, Clyde B; Leichtle, Alexander B; Hautz, Wolf E
2018-01-01
Patients presenting with suspected urinary tract infection are common in every day emergency practice. Urine flow cytometry has replaced microscopic urine evaluation in many emergency departments, but interpretation of the results remains challenging. The aim of this study was to develop and validate tools that predict urine culture growth out of urine flow cytometry parameter. This retrospective study included all adult patients that presented in a large emergency department between January and July 2017 with a suspected urinary tract infection and had a urine flow cytometry as well as a urine culture obtained. The objective was to identify urine flow cytometry parameters that reliably predict urine culture growth and mixed flora growth. The data set was split into a training (70%) and a validation set (30%) and different decision-making approaches were developed and validated. Relevant urine culture growth (respectively mixed flora growth) was found in 40.2% (7.2% respectively) of the 613 patients included. The number of leukocytes and bacteria in flow cytometry were highly associated with urine culture growth, but mixed flora growth could not be sufficiently predicted from the urine flow cytometry parameters. A decision tree, predictive value figures, a nomogram, and a cut-off table to predict urine culture growth from bacteria and leukocyte count were developed, validated and compared. Urine flow cytometry parameters are insufficient to predict mixed flora growth. However, the prediction of urine culture growth based on bacteria and leukocyte count is highly accurate and the developed tools should be used as part of the decision-making process of ordering a urine culture or starting an antibiotic therapy if a urogenital infection is suspected.
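One of the decision aids described, the decision tree, can be sketched on the two dominant flow-cytometry parameters; the training data below are simulated, not the study cohort:

```python
# Shallow decision tree on flow-cytometry bacteria and leukocyte counts,
# trained on simulated data with ~40% culture growth as reported.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(6)
n = 600
growth = rng.random(n) < 0.4
bacteria = np.where(growth, rng.lognormal(6, 1, n), rng.lognormal(3, 1, n))
leukocytes = np.where(growth, rng.lognormal(5, 1, n), rng.lognormal(2.5, 1, n))
X = np.column_stack([bacteria, leukocytes])

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, growth)
print(export_text(tree, feature_names=["bacteria/uL", "leukocytes/uL"]))
```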
A ligand prediction tool based on modeling and reasoning with imprecise probabilistic knowledge.
Liu, Weiru; Yue, Anbu; Timson, David J
2010-04-01
Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands and by the commercial imperative to develop new drugs. Most of the currently available software systems are very complex and time-consuming to use. Therefore, developing simple and efficient tools to perform initial screening of interesting compounds is an appealing idea. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base, and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool. 2009 Elsevier Ireland Ltd. All rights reserved.
Subramanyam, Rajeev; Yeramaneni, Samrat; Hossain, Mohamed Monir; Anneken, Amy M; Varughese, Anna M
2016-05-01
Perioperative respiratory adverse events (PRAEs) are the most common cause of serious adverse events in children receiving anesthesia. The primary aim of this study was to develop and validate a risk prediction tool for the occurrence of PRAE from the onset of anesthesia induction until discharge from the postanesthesia care unit in children younger than 18 years undergoing elective ambulatory anesthesia for surgery and radiology. The incidence of PRAE was studied. We analyzed data from 19,059 patients from our department's quality improvement database. The predictor variables were age, sex, ASA physical status, morbid obesity, preexisting pulmonary disorder, preexisting neurologic disorder, and location of ambulatory anesthesia (surgery or radiology). Composite PRAE was defined as the presence of any 1 of the following events: intraoperative bronchospasm, intraoperative laryngospasm, postoperative apnea, postoperative laryngospasm, postoperative bronchospasm, or postoperative prolonged oxygen requirement. The risk prediction tool for PRAE was developed and validated using logistic regression, with a split-sampling technique dividing the database into 2 independent cohorts based on the year in which the patient received ambulatory anesthesia for surgery or radiology. A risk score was developed based on the regression coefficients from the validation tool. The performance of the risk prediction tool was assessed using tests of discrimination and calibration. The overall incidence of composite PRAE was 2.8%. The derivation cohort included 8904 patients, and the validation cohort included 10,155 patients. The risk of PRAE was 3.9% in the derivation cohort and 1.8% in the validation cohort. Age ≤ 3 years (versus >3 years), ASA physical status II or III (versus ASA physical status I), morbid obesity, preexisting pulmonary disorder, and surgery (versus radiology) significantly predicted the occurrence of PRAE in a multivariable logistic regression model. A risk score in the range of 0 to 3 was assigned to each significant variable in the logistic regression model, and the final score for all risk factors ranged from 0 to 11. A cutoff score of 4 was derived from a receiver operating characteristic curve to determine the high-risk category. The model C-statistic and the corresponding SE for the derivation and validation cohorts were 0.64 ± 0.01 and 0.63 ± 0.02, respectively. The sensitivity and SE of the risk prediction tool to identify children at risk for PRAE were 77.6 ± 0.02 in the derivation cohort and 76.2 ± 0.03 in the validation cohort. The risk tool developed and validated from our study cohort identified 5 risk factors: age ≤ 3 years (versus >3 years), ASA physical status II and III (versus ASA physical status I), morbid obesity, preexisting pulmonary disorder, and surgery (versus radiology) for PRAE. This tool can be used to provide an individual risk score for each patient to predict the risk of PRAE in the preoperative period.
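A hedged sketch of the generic step from fitted logistic-regression coefficients to an integer points score, the construction the abstract describes. The coefficient values, scaling convention, and variable names below are illustrative assumptions, not the authors' published numbers; only the 0-11 score range and cutoff of 4 come from the abstract.

```python
# Turn fitted log-odds coefficients into integer points by dividing each by
# the smallest coefficient and rounding; sum points to get a risk score.
import numpy as np

coefs = {                      # hypothetical fitted log-odds ratios
    "age_le_3": 0.85,
    "asa_2_or_3": 0.60,
    "morbid_obesity": 1.10,
    "pulmonary_disorder": 0.95,
    "surgery_vs_radiology": 0.70,
}
smallest = min(coefs.values())
points = {k: int(round(v / smallest)) for k, v in coefs.items()}

def risk_score(patient):
    """Sum the points for the risk factors present in this patient."""
    return sum(points[k] for k, present in patient.items() if present)

patient = {"age_le_3": True, "asa_2_or_3": True, "morbid_obesity": False,
           "pulmonary_disorder": True, "surgery_vs_radiology": True}
score = risk_score(patient)
print(score, "high risk" if score >= 4 else "lower risk")  # cutoff 4 per abstract
```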
Wachtler, Caroline; Coe, Amy; Davidson, Sandra; Fletcher, Susan; Mendoza, Antonette; Sterling, Leon; Gunn, Jane
2018-04-23
Around the world, depression is both under- and overtreated. The diamond clinical prediction tool was developed to assist with appropriate treatment allocation by estimating the 3-month prognosis among people with current depressive symptoms. Delivering clinical prediction tools in a way that will enhance their uptake in routine clinical practice remains challenging; however, mobile apps show promise in this respect. To increase the likelihood that an app-delivered clinical prediction tool can be successfully incorporated into clinical practice, it is important to involve end users in the app design process. The aim of the study was to maximize patient engagement in an app designed to improve treatment allocation for depression. An iterative, user-centered design process was employed. Qualitative data were collected via 2 focus groups with a community sample (n=17) and 7 semistructured interviews with people with depressive symptoms. The results of the focus groups and interviews were used by the computer engineering team to modify subsequent prototypes of the app. Iterative development resulted in 3 prototypes and a final app. The areas requiring the most substantial changes following end-user input were related to the iconography used and the way that feedback was provided. In particular, communicating risk of future depressive symptoms proved difficult; these messages were consistently misinterpreted and negatively viewed and were ultimately removed. All participants felt positively about seeing their results summarized after completion of the clinical prediction tool, but there was a need for a personalized treatment recommendation made in conjunction with a consultation with a health professional. User-centered design led to valuable improvements in the content and design of an app designed to improve allocation of and engagement in depression treatment. Iterative design allowed us to develop a tool that helps users feel hope, engage in self-reflection, and feel motivated toward treatment. The tool is currently being evaluated in a randomized controlled trial. ©Caroline Wachtler, Amy Coe, Sandra Davidson, Susan Fletcher, Antonette Mendoza, Leon Sterling, Jane Gunn. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 23.04.2018.
Improving Environmental Model Calibration and Prediction
2011-01-18
First, we have continued to ... develop tools for efficient global optimization of environmental models. Our algorithms are hybrid algorithms that combine evolutionary strategies ... toward practical hybrid optimization tools for environmental models.
Sebok, Angelia; Wickens, Christopher D
2017-03-01
The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) in model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented in three model-based tools that predict operator performance in different systems: a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined times to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems and to anticipate the effects of different automation designs on that performance.
Liau, Siow Yen; Mohamed Izham, M I; Hassali, M A; Shafie, A A
2010-01-01
Cardiovascular diseases, the main causes of hospitalisation and death globally, have put an enormous economic burden on the healthcare system. Several risk factors are associated with the occurrence of cardiovascular events. At the heart of efficient prevention of cardiovascular disease is the concept of risk assessment. This paper aims to review the available cardiovascular risk-assessment tools and their applicability in predicting cardiovascular risk among Asian populations. A systematic search was performed using keywords as MeSH and Boolean search terms. A total of 25 risk-assessment tools were identified. Of these, only two (8%) were derived from an Asian population. These risk-assessment tools differ in various ways, including the characteristics of the derivation sample, type of study, time frame of follow-up, end points, statistical analysis and risk factors included. Very few cardiovascular risk-assessment tools have been developed in Asian populations. In order to accurately predict the cardiovascular risk of our population, there is a need to develop a risk-assessment tool based on local epidemiological data.
Flanagan, Talia; Van Peer, Achiel; Lindahl, Anders
2016-08-25
Regulatory interactions are an important part of the drug development and licensing process. A survey on the use of biopharmaceutics tools for regulatory purposes has been carried out within the industry community of OrBiTo, an EU project under the Innovative Medicines Initiative (IMI). The aim was to capture current practice and experience in using in vitro and in silico biopharmaceutics tools at various stages of development, what barriers exist or are perceived, and to understand the current gaps in regulatory biopharmaceutics. The survey indicated that biorelevant dissolution testing and physiologically based modelling and simulation are widely applied throughout development to address a number of biopharmaceutics issues. However, data from these in vitro and in silico predictive biopharmaceutics tools are submitted to regulatory authorities far less often than they are used for internal risk assessment and decision making. This may prevent regulators from becoming familiar with these tools and how they are applied in industry, and limits the opportunities for biopharmaceutics scientists working in industry to understand the acceptability of these tools in the regulatory environment. It is anticipated that the advanced biopharmaceutics tools and understanding delivered in the coming years by OrBiTo and other initiatives in the area of predictive tools will also be of value in the regulatory setting, and will provide a basis for more informed and confident biopharmaceutics risk assessment and regulatory decision making. To enable the regulatory potential of predictive biopharmaceutics tools to be realized, further scientific dialogue is needed between industry, regulators and scientists in academia, and more examples need to be published to demonstrate the applicability of these tools. Copyright © 2016 Elsevier B.V. All rights reserved.
Siaw-Sakyi, Vincent
2017-12-01
Wound infection is proving to be a challenge for health care professionals. The associated complications and costs of wound infection are immense and can lead to death in extreme cases. Current management of wound infection is largely subjective and relies on the knowledge of the health care professional to identify and initiate treatment. In response, we have developed an infection prediction and assessment tool, the Wound Infection Risk-Assessment and Evaluation tool (WIRE), together with a management strategy, with the aim of bringing objectivity to infection prediction, assessment and management. A local audit indicated a high infection prediction rate. More work is being done to improve its effectiveness.
A review of statistical updating methods for clinical prediction models.
Su, Ting-Li; Jaki, Thomas; Hickey, Graeme L; Buchan, Iain; Sperrin, Matthew
2018-01-01
A clinical prediction model is a tool for predicting healthcare outcomes, usually within a specific population and context. A common approach is to develop a new clinical prediction model for each population and context; however, this wastes potentially useful historical information. A better approach is to update or incorporate existing clinical prediction models already developed for use in similar contexts or populations. In addition, clinical prediction models commonly become miscalibrated over time and need replacing or updating. In this article, we review a range of approaches for re-using and updating clinical prediction models; these fall into three main categories: simple coefficient updating, combining multiple previous clinical prediction models in a meta-model, and dynamic updating of models. We evaluated the performance (discrimination and calibration) of the different strategies using data on mortality following cardiac surgery in the United Kingdom. We found that no single strategy performed sufficiently well to be used to the exclusion of the others. In conclusion, useful tools exist for updating existing clinical prediction models to a new population or context; these should be implemented, using a breadth of complementary statistical methods, rather than developing a new clinical prediction model from scratch.
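The simplest of the three categories, coefficient updating, can be illustrated with logistic recalibration: keep the old model's linear predictor and re-estimate only a calibration intercept and slope on the new data. Everything below (the old coefficients and the new dataset) is synthetic and illustrative.

```python
# Recalibrate an existing clinical prediction model on new data instead of
# refitting it from scratch.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X_new = rng.normal(size=(400, 3))
true_lp = 0.3 + X_new @ [0.8, -0.5, 0.4]
y_new = (rng.random(400) < 1 / (1 + np.exp(-true_lp))).astype(int)

old_coef, old_intercept = np.array([1.0, -0.3, 0.6]), -0.2  # historical model
lp_old = old_intercept + X_new @ old_coef                    # old linear predictor

# Fit new outcome ~ a + b * old_lp (calibration intercept a, slope b).
recal = LogisticRegression().fit(lp_old.reshape(-1, 1), y_new)
a, b = recal.intercept_[0], recal.coef_[0, 0]
print(f"calibration intercept {a:.2f}, slope {b:.2f}")
# Updated predictions are sigmoid(a + b * lp_old); a slope b < 1 suggests
# the old model's predictions were too extreme for the new population.
```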
Tomaselli Muensterman, Elena; Tisdale, James E
2018-06-08
Prolongation of the heart rate-corrected QT (QTc) interval increases the risk for torsades de pointes (TdP), a potentially fatal arrhythmia. The likelihood of TdP is higher in patients with risk factors, which include female sex, older age, heart failure with reduced ejection fraction, hypokalemia, hypomagnesemia, and concomitant administration of ≥ 2 QTc interval-prolonging medications, among others. Assessment and quantification of risk factors may facilitate prediction of the patients at highest risk for developing QTc interval prolongation and TdP. Investigators have utilized the field of predictive analytics, which generates predictions using techniques including data mining, modeling, machine learning, and others, to develop methods of risk quantification and prediction of QTc interval prolongation. Predictive analytics have also been incorporated into clinical decision support (CDS) tools to alert clinicians regarding patients at increased risk of developing QTc interval prolongation. The objectives of this paper are to assess the effectiveness of predictive analytics for identification of patients at risk of drug-induced QTc interval prolongation, and to discuss the efficacy of incorporating predictive analytics into CDS tools in clinical practice. A systematic review of English language articles (human subjects only) was performed, yielding 57 articles, with an additional 4 articles identified from other sources; a total of 10 articles were included in this review. Risk scores for QTc interval prolongation have been developed in various patient populations, including those in cardiac intensive care units (ICUs) and broader populations of hospitalized or health-system patients. One group developed a risk score that includes information regarding genetic polymorphisms; this score significantly predicted TdP. Development of QTc interval prolongation risk prediction models and incorporation of these models into CDS tools reduces the risk of QTc interval prolongation in cardiac ICUs and identifies health-system patients at increased risk for mortality. The impact of these QTc interval prolongation predictive analytics on overall patient safety outcomes, such as TdP and sudden cardiac death, relative to the cost of development and implementation, requires further study. This article is protected by copyright. All rights reserved.
NASA Astrophysics Data System (ADS)
Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich
2016-04-01
MiKlip is a project for medium-term climate prediction funded by the Federal Ministry of Education and Research in Germany (BMBF); it aims to create a model system that is able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools, so-called plugins, for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope. They target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET), which was developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)
2001-01-01
The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, in particular, is a regime that is currently handled empirically for the Space Shuttle External Tank (ET), through simulated service testing of pre-cracked panels.
Klein, Julie; Eales, James; Zürbig, Petra; Vlahou, Antonia; Mischak, Harald; Stevens, Robert
2013-04-01
In this study, we have developed Proteasix, an open-source peptide-centric tool that can be used to predict in silico the proteases involved in naturally occurring peptide generation. We developed a curated cleavage site (CS) database containing 3500 entries on human protease/CS combinations. On top of this database, we built a tool, Proteasix, which allows CS retrieval and protease association from a list of peptides. To establish the proof of concept of the approach, we used a list of 1388 peptides identified from human urine samples and compared the prediction to the analysis of 1003 randomly generated amino acid sequences. Metalloprotease activity was predominantly involved in urinary peptide generation, particularly for peptides associated with extracellular matrix remodelling, compared with peptides from proteins of other origins. In comparison, random sequences returned almost no results, highlighting the specificity of the prediction. This study provides a tool that can facilitate linking of identified protein fragments to predicted protease activity, and therefore to presumed mechanisms of disease. Experiments are needed to confirm the in silico hypotheses; nevertheless, this approach may be of great help to better understand molecular mechanisms of disease and to define new biomarkers and therapeutic targets. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
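The peptide-centric lookup idea can be sketched as follows: take the residues flanking each peptide terminus in the parent protein and match them against a protease specificity table. The toy table and sequences below are illustrative stand-ins, not Proteasix's curated database or its actual matching rules.

```python
# Map the cleavage sites implied by a peptide's termini to candidate
# proteases via a (P1, P1') lookup table. Table entries are toy examples.
CLEAVAGE_TABLE = {
    ("K", "S"): ["plasmin"],
    ("G", "L"): ["MMP-9"],
    ("R", "A"): ["thrombin"],
}

def candidate_proteases(protein_seq, peptide):
    start = protein_seq.find(peptide)
    if start == -1:
        return []
    hits = []
    if start > 0:  # cleavage that generated the peptide's N-terminus
        hits += CLEAVAGE_TABLE.get((protein_seq[start - 1], peptide[0]), [])
    end = start + len(peptide)
    if end < len(protein_seq):  # cleavage that generated the C-terminus
        hits += CLEAVAGE_TABLE.get((peptide[-1], protein_seq[end]), [])
    return hits

print(candidate_proteases("MKSSGLAPRA", "SSGL"))  # -> ['plasmin']
```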
Analysis Tools for CFD Multigrid Solvers
NASA Technical Reports Server (NTRS)
Mineck, Raymond E.; Thomas, James L.; Diskin, Boris
2004-01-01
Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.
Zambetti, Benjamin R; Thomas, Fridtjof; Hwang, Inyong; Brown, Allen C; Chumpia, Mason; Ellis, Robert T; Naik, Darshan; Khouzam, Rami N; Ibebuogu, Uzoma N; Reed, Guy L
2017-01-01
In ST-elevation myocardial infarction (STEMI), acute kidney injury (AKI) may increase subsequent morbidity and mortality. Still, it remains difficult to predict AKI risk in these patients. We sought to 1) determine the frequency and clinical outcomes of AKI and 2) develop, validate and compare a web-based tool for predicting AKI. In a racially diverse series of 1144 consecutive STEMI patients, Stage 1 or greater AKI occurred in 12.9% and was severe (Stage 2-3) in 2.9%. AKI was associated with increased mortality (5.7-fold, unadjusted) and longer hospital stay (2.5-fold). AKI was associated with systolic dysfunction, increased left ventricular end-diastolic pressures, hypotension and intra-aortic balloon counterpulsation. A computational algorithm (UT-AKI) was derived and internally validated. It showed higher sensitivity and improved overall prediction for AKI (area under the curve 0.76) vs. other published indices. Higher UT-AKI scores were associated with more severe AKI, longer hospital stay and greater hospital mortality. In a large, racially diverse cohort of STEMI patients, Stage 1 or greater AKI was relatively common and was associated with significant morbidity and mortality. A web-accessible, internally validated tool was developed with improved overall value for predicting AKI. By identifying patients at increased risk, this tool may help physicians tailor post-procedural diagnostic and therapeutic strategies after STEMI to reduce AKI and its associated morbidity and mortality.
A Clinical Tool for the Prediction of Venous Thromboembolism in Pediatric Trauma Patients.
Connelly, Christopher R; Laird, Amy; Barton, Jeffrey S; Fischer, Peter E; Krishnaswami, Sanjay; Schreiber, Martin A; Zonies, David H; Watters, Jennifer M
2016-01-01
Although rare, the incidence of venous thromboembolism (VTE) in pediatric trauma patients is increasing, and the consequences of VTE in children are significant. Studies have demonstrated increasing VTE risk in older pediatric trauma patients and improved VTE rates with institutional interventions. While national evidence-based guidelines for VTE screening and prevention are in place for adults, none exist for pediatric patients, to our knowledge. To develop a risk prediction calculator for VTE in children admitted to the hospital after traumatic injury to assist efforts in developing screening and prophylaxis guidelines for this population. Retrospective review of 536,423 pediatric patients 0 to 17 years old using the National Trauma Data Bank from January 1, 2007, to December 31, 2012. Five mixed-effects logistic regression models of varying complexity were fit on a training data set. Model validity was determined by comparison of the area under the receiver operating characteristic curve (AUROC) for the training and validation data sets from the original model fit. A clinical tool to predict the risk of VTE based on individual patient clinical characteristics was developed from the optimal model. Diagnosis of VTE during hospital admission. Venous thromboembolism was diagnosed in 1141 of 536,423 children (overall rate, 0.2%). The AUROCs in the training data set were high (range, 0.873-0.946) for each model, with minimal AUROC attenuation in the validation data set. A prediction tool was developed from a model that achieved a balance of high performance (AUROCs, 0.945 and 0.932 in the training and validation data sets, respectively; P = .048) and parsimony. Points are assigned to each variable considered (Glasgow Coma Scale score, age, sex, intensive care unit admission, intubation, transfusion of blood products, central venous catheter placement, presence of pelvic or lower extremity fractures, and major surgery), and the points total is converted to a VTE risk score. The predicted risk of VTE ranged from 0.0% to 14.4%. We developed a simple clinical tool to predict the risk of developing VTE in pediatric trauma patients. It is based on a model created using a large national database and was internally validated. The clinical tool requires external validation but provides an initial step toward the development of the specific VTE protocols for pediatric trauma patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel; Makarov, Yuri; Subbarao, Kris
RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty including wind, solar and load forecast errors. The tool evaluates required generation for a worst case scenario, with a user-specified confidence level.
WORMHOLE: Novel Least Diverged Ortholog Prediction through Machine Learning
Sutphin, George L.; Mahoney, J. Matthew; Sheppard, Keith; Walton, David O.; Korstanje, Ron
2016-01-01
The rapid advancement of technology in genomics and targeted genetic manipulation has made comparative biology an increasingly prominent strategy to model human disease processes. Predicting orthology relationships between species is a vital component of comparative biology. Dozens of strategies for predicting orthologs have been developed using combinations of gene and protein sequence, phylogenetic history, and functional interaction with progressively increasing accuracy. A relatively new class of orthology prediction strategies combines aspects of multiple methods into meta-tools, resulting in improved prediction performance. Here we present WORMHOLE, a novel ortholog prediction meta-tool that applies machine learning to integrate 17 distinct ortholog prediction algorithms to identify novel least diverged orthologs (LDOs) between 6 eukaryotic species—humans, mice, zebrafish, fruit flies, nematodes, and budding yeast. Machine learning allows WORMHOLE to intelligently incorporate predictions from a wide spectrum of strategies in order to form aggregate predictions of LDOs with high confidence. In this study we demonstrate the performance of WORMHOLE across each combination of query and target species. We show that WORMHOLE is particularly adept at improving LDO prediction performance between distantly related species, expanding the pool of LDOs while maintaining low evolutionary distance and a high level of functional relatedness between genes in LDO pairs. We present extensive validation, including cross-validated prediction of PANTHER LDOs and evaluation of evolutionary divergence and functional similarity, and discuss future applications of machine learning in ortholog prediction. A WORMHOLE web tool has been developed and is available at http://wormhole.jax.org/. PMID:27812085
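The meta-tool idea can be sketched by treating each underlying predictor's call as a feature and learning how to weight them against a trusted training set. WORMHOLE itself uses SVMs over 17 algorithms; the sketch below substitutes a logistic-regression stacker and random toy votes purely for illustration.

```python
# Stack 17 binary tool predictions into one aggregate confidence score.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_pairs, n_tools = 1000, 17
votes = rng.integers(0, 2, size=(n_pairs, n_tools))   # each tool's 0/1 call
# Toy "truth" labels; in practice these would be curated ortholog pairs.
truth = (votes.mean(axis=1) + rng.normal(0, 0.15, n_pairs) > 0.5).astype(int)

meta = LogisticRegression(max_iter=1000).fit(votes, truth)
confidence = meta.predict_proba(votes)[:, 1]           # aggregate score
print(confidence[:5].round(2))
```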
Power hand tool kinetics associated with upper limb injuries in an automobile assembly plant.
Ku, Chia-Hua; Radwin, Robert G; Karsh, Ben-Tzion
2007-06-01
This study investigated the relationship between pneumatic nutrunner handle reactions, workstation characteristics, and prevalence of upper limb injuries in an automobile assembly plant. Tool properties (geometry, inertial properties, and motor characteristics), fastener properties, orientation relative to the fastener, and the position of the tool operator (horizontal and vertical distances) were measured for 69 workstations using 15 different pneumatic nutrunners. Handle reaction response was predicted using a deterministic mechanical model of the human operator and tool that was previously developed in our laboratory, specific to the measured tool, workstation, and job factors. Handle force was a function of target torque, tool geometry and inertial properties, motor speed, work orientation, and joint hardness. The study found that tool target torque was not well correlated with predicted handle reaction force (r=0.495) or displacement (r=0.285). The individual tool, tool shape, and threaded fastener joint hardness all affected predicted forces and displacements (p<0.05). The average peak handle force and displacement for right-angle tools were twice as great as pistol grip tools. Soft-threaded fastener joints had the greatest average handle forces and displacements. Upper limb injury cases were identified using plant OSHA 200 log and personnel records. Predicted handle forces for jobs where injuries were reported were significantly greater than those jobs free of injuries (p<0.05), whereas target torque and predicted handle displacement did not show statistically significant differences. The study concluded that quantification of handle reaction force, rather than target torque alone, is necessary for identifying stressful power hand tool operations and for controlling exposure to forces in manufacturing jobs involving power nutrunners. Therefore, a combination of tool, work station, and task requirements should be considered.
Analytical Tools for Space Suit Design
NASA Technical Reports Server (NTRS)
Aitchison, Lindsay
2011-01-01
As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analyses of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.
Lee, Ciaran M; Davis, Timothy H; Bao, Gang
2018-04-01
What is the topic of this review? In this review, we analyse the performance of recently described tools for CRISPR/Cas9 guide RNA design, in particular, design tools that predict CRISPR/Cas9 activity. What advances does it highlight? Recently, many tools designed to predict CRISPR/Cas9 activity have been reported. However, the majority of these tools lack experimental validation. Our analyses indicate that these tools have poor predictive power. Our preliminary results suggest that target site accessibility should be considered in order to develop better guide RNA design tools with improved predictive power. The recent adaptation of the clustered regulatory interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9) system for targeted genome engineering has led to its widespread application in many fields worldwide. In order to gain a better understanding of the design rules of CRISPR/Cas9 systems, several groups have carried out large library-based screens leading to some insight into sequence preferences among highly active target sites. To facilitate CRISPR/Cas9 design, these studies have spawned a plethora of guide RNA (gRNA) design tools with algorithms based solely on direct or indirect sequence features. Here, we demonstrate that the predictive power of these tools is poor, suggesting that sequence features alone cannot accurately inform the cutting efficiency of a particular CRISPR/Cas9 gRNA design. Furthermore, we demonstrate that DNA target site accessibility influences the activity of CRISPR/Cas9. With further optimization, we hypothesize that it will be possible to increase the predictive power of gRNA design tools by including both sequence and target site accessibility metrics. © 2017 The Authors. Experimental Physiology © 2017 The Physiological Society.
NASA Astrophysics Data System (ADS)
Adesta, Erry Yulian T.; Riza, Muhammad; Avicena
2018-03-01
Tool wear prediction plays a significant role in the machining industry for proper planning and control of machining parameters and optimization of cutting conditions. This paper aims to investigate the effect of tool path strategies, contour-in and zigzag, on tool wear during the pocket milling process. The experiments were carried out on a CNC vertical machining centre using PVD-coated carbide inserts. Cutting speed, feed rate and depth of cut were varied. In an experiment with three factors at three levels, a Response Surface Method (RSM) design of experiments with a standard Central Composite Design (CCD) was employed. The results indicate that tool wear increases significantly at the higher range of feed per tooth compared to cutting speed and depth of cut. This experimental result was then verified statistically by developing an empirical model. The prediction model for tool wear under the contour-in strategy developed in this research shows good agreement with the experimental results.
Overview: What's Worked and What Hasn't as a Guide towards Predictive Admissions Tool Development
ERIC Educational Resources Information Center
Siu, Eric; Reiter, Harold I.
2009-01-01
Admissions committees and researchers around the globe have used diligence and imagination to develop and implement various screening measures with the ultimate goal of predicting future clinical and professional performance. What works for predicting future job performance in the human resources world and in most of the academic world may not,…
AN APPROACH TO PREDICT RISKS TO WILDLIFE POPULATIONS FROM MERCURY AND OTHER STRESSORS
The U.S. Environmental Protection Agency's National Health and Environmental Effects Research Laboratory (NHEERL) is developing tools for predicting risks of multiple stressors to wildlife populations, which support the development of risk-based protective criteria. NHEERL's res...
McFarlane, Judith; Pennings, Jacquelyn; Liu, Fuqin; Gilroy, Heidi; Nava, Angeles; Maddoux, John A; Montalvo-Liendo, Nora; Paulson, René
2016-02-01
To develop a tool to predict risk of return to a shelter, 150 women with children, exiting a domestic violence shelter, were evaluated every 4 months for 24 months to determine risk factors for returning to a shelter. The study identified four risk factors: danger of murder, woman's age (i.e., older women), tangible support (i.e., access to money, transportation), and child witness to verbal abuse of the mother. An easy-to-use, quick triage tool with a weighted score was derived that can identify, with 90% accuracy, abused women with children who are most likely to return to shelters. © The Author(s) 2015.
Development and Validation of the Texas Best Management Practice Evaluation Tool (TBET)
USDA-ARS?s Scientific Manuscript database
Conservation planners need simple yet accurate tools to predict sediment and nutrient losses from agricultural fields to guide conservation practice implementation and increase cost-effectiveness. The Texas Best management practice Evaluation Tool (TBET), which serves as an input/output interpreter...
An information model for use in software management estimation and prediction
NASA Technical Reports Server (NTRS)
Li, Ningda R.; Zelkowitz, Marvin V.
1993-01-01
This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.
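A rough sketch of the clustering step: group historical projects by early-phase measurements so that a new project can be matched to the most similar cluster and inherit its typical outcome profile. The features and numbers below are hypothetical, not the Software Engineering Laboratory's actual data model.

```python
# Cluster past projects on standardized early-phase features with k-means,
# then assign a new project to its nearest historical cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# columns: early effort (hours), lines changed, defects reported so far
past_projects = rng.normal(loc=[500, 2e4, 30], scale=[150, 5e3, 10], size=(60, 3))

scaler = StandardScaler().fit(past_projects)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(
    scaler.transform(past_projects))

# A new project early in its life cycle is matched to the closest cluster.
new_project = [[420, 1.8e4, 22]]
print("closest historical cluster:", km.predict(scaler.transform(new_project))[0])
```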
Ghali, Iraqi; Kizub, Darya; Billioux, Alexander C.; Bennani, Kenza; Bourkadi, Jamal Eddine; Benmamoun, Abderrahmane; Lahlou, Ouafae; Aouad, Rajae El; Dooley, Kelly E.
2014-01-01
Setting Public tuberculosis (TB) clinics in urban Morocco. Objective Explore risk factors for TB treatment default and develop a prediction tool. Assess consequences of default, specifically risk for transmission or development of drug resistance. Design Case-control study comparing patients who defaulted from TB treatment and patients who completed it using quantitative methods and open-ended questions. Results were interpreted in light of health professionals’ perspectives from a parallel study. A predictive model and simple tool to identify patients at high risk of default were developed. Sputum from cases with pulmonary TB was collected for smear and drug susceptibility testing. Results 91 cases and 186 controls enrolled. Independent risk factors for default included current smoking, retreatment, work interference with adherence, daily directly observed therapy, side effects, quick symptom resolution, and not knowing one’s treatment duration. Age >50 years, never smoking, and having friends who knew one’s diagnosis were protective. A simple scoring tool incorporating these factors was 82.4% sensitive and 87.6% specific for predicting default in this population. Clinicians and patients described additional contributors to default and suggested locally-relevant intervention targets. Among 89 cases with pulmonary TB, 71% had sputum that was smear positive for TB. Drug resistance was rare. Conclusion The causes of default from TB treatment were explored through synthesis of qualitative and quantitative data from patients and health professionals. A scoring tool with high sensitivity and specificity to predict default was developed. Prospective evaluation of this tool coupled with targeted interventions based on our findings is warranted. Of note, the risk of TB transmission from patients who default treatment to others is likely to be high. The commonly-feared risk of drug resistance, though, may be low; a larger study is required to confirm these findings. PMID:24699682
Anthology of the Development of Radiation Transport Tools as Applied to Single Event Effects
NASA Astrophysics Data System (ADS)
Reed, R. A.; Weller, R. A.; Akkerman, A.; Barak, J.; Culpepper, W.; Duzellier, S.; Foster, C.; Gaillardin, M.; Hubert, G.; Jordan, T.; Jun, I.; Koontz, S.; Lei, F.; McNulty, P.; Mendenhall, M. H.; Murat, M.; Nieminen, P.; O'Neill, P.; Raine, M.; Reddell, B.; Saigné, F.; Santin, G.; Sihver, L.; Tang, H. H. K.; Truscott, P. R.; Wrobel, F.
2013-06-01
This anthology contains contributions from eleven different groups, each developing and/or applying Monte Carlo-based radiation transport tools to simulate a variety of effects that result from energy transferred to a semiconductor material by a single particle event. The topics span from basic mechanisms for single-particle induced failures to applied tasks like developing websites to predict on-orbit single event failure rates using Monte Carlo radiation transport tools.
Fukui, Sakiko; Morita, Tatsuya; Yoshiuchi, Kazuhiro
2017-08-01
The aim of this study was to investigate the predictive value of a clinical tool for predicting whether discharged cancer patients die at home, comparing a case group who died at home with a control group who died in hospitals or other facilities. We conducted a nationwide case-control study to identify the determinants of home death for a discharged cancer patient. We randomly selected nurses in charge of 2000 home-visit nursing agencies from all 5813 agencies in Japan by referring to the nationwide databases in January 2013. The nurses were asked to report their patients' place of death, patients' and caregivers' clinical statuses, and their preferences for home death. We used logistic regression analysis to develop a clinical tool that accurately predicts home death and investigated its predictive value. We identified 466 case and 478 control patients. Five predictive variables of home death were obtained: patients' and caregivers' preferences for home death [OR (95% CI) 2.66 (1.99-3.55)], availability of visiting physicians [2.13 (1.67-2.70)], 24-h contact between physicians and nurses [1.68 (1.30-2.18)], caregivers' experiences of deathwatch at home [1.41 (1.13-1.75)], and patients' insights as to their own prognosis [1.23 (1.02-1.50)]. We calculated scores predicting home death for each variable (range 6-28). When using a cutoff point of 16, home death was predicted with a sensitivity of 0.72 and a specificity of 0.81, with a Harrell's c-statistic of 0.84. This simple clinical tool for healthcare professionals can help predict whether a discharged patient is likely to die at home.
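Given per-patient totals on the reported 6-28 score range, the cutoff evaluation works as below; the handful of patients is invented, while the cutoff of 16 comes from the abstract.

```python
# Compute sensitivity and specificity of a score cutoff on toy data.
import numpy as np

scores = np.array([22, 14, 18, 9, 25, 16, 11, 20])
home_death = np.array([1, 0, 1, 0, 1, 0, 0, 1])   # 1 = died at home

pred = scores >= 16                                # cutoff from the abstract
tp = np.sum(pred & (home_death == 1))
fn = np.sum(~pred & (home_death == 1))
tn = np.sum(~pred & (home_death == 0))
fp = np.sum(pred & (home_death == 0))
print(f"sensitivity {tp/(tp+fn):.2f}, specificity {tn/(tn+fp):.2f}")
```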
Facilitating Adoption of NEWS Tool to Develop Clinical Decision Making
ERIC Educational Resources Information Center
Brown, Robin T.
2017-01-01
This scholarly project was a non-experimental, pre/post-test design to (a) facilitate the voluntary adoption of the National Early Warning Score (NEWS), and (b) develop clinical decision making (CDM) in one cohort of junior level nursing students participating in a simulation lab. NEWS is an evidence-based predictive scoring tool developed by the…
NASA Astrophysics Data System (ADS)
Mia, Mozammel; Al Bashir, Mahmood; Dhar, Nikhil Ranjan
2016-10-01
Hard turning is increasingly employed in machining, lately, to replace time-consuming conventional turning followed by a grinding process. An excessive amount of tool wear in hard turning is one of the main hurdles to be overcome. Many researchers have developed tool wear models, but most of them were developed for a particular work-tool-environment combination. No aggregate model has been developed that can be used to predict the amount of principal flank wear for a specific machining time. An empirical model of principal flank wear (VB) has been developed for different workpiece hardnesses (HRC40, HRC48 and HRC56) while turning with coated carbide inserts of different configurations (SNMM and SNMG) under both dry and high-pressure coolant conditions. Unlike other developed models, this model includes dummy variables along with the base empirical equation to capture the effect of any changes in the input conditions on the response. The base empirical equation for principal flank wear is formulated by adopting the Exponential Associate Function using the experimental results. The coefficient of a dummy variable reflects the shifting of the response from one set of machining conditions to another, and is determined by simple linear regression. The independent cutting parameters (speed, feed rate, depth of cut) are kept constant while formulating and analyzing this model. The developed model is validated with different sets of machining responses in turning hardened medium carbon steel with coated carbide inserts. For any particular set, the model can be used to predict the amount of principal flank wear for a specific machining time. Since the predicted results exhibit good resemblance to the experimental data and the average percentage error is <10 %, this model can be used to predict the principal flank wear for the stated conditions.
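A sketch of how such a model could be fitted, assuming a one-term Exponential Associate form VB(t) = y0 + A1(1 - exp(-t/t1)) plus an additive dummy-variable shift between two machining conditions. The functional form shown, the parameter values, and the data are illustrative assumptions; the paper's actual equation may use more terms and different coefficients.

```python
# Fit VB(t) = y0 + A1*(1 - exp(-t/t1)) + c*d, where d is a 0/1 dummy for a
# second machining condition (e.g. coolant vs. dry). All values are invented.
import numpy as np
from scipy.optimize import curve_fit

def vb_model(X, y0, A1, t1, c_dummy):
    t, d = X  # machining time and condition dummy
    return y0 + A1 * (1 - np.exp(-t / t1)) + c_dummy * d

t = np.tile(np.linspace(1, 40, 20), 2)
d = np.repeat([0, 1], 20)
rng = np.random.default_rng(4)
vb = vb_model((t, d), 0.02, 0.30, 15.0, -0.05) + rng.normal(0, 0.01, t.size)

params, _ = curve_fit(vb_model, (t, d), vb, p0=[0.0, 0.2, 10.0, 0.0])
print(dict(zip(["y0", "A1", "t1", "c_dummy"], params.round(3))))
```

The fitted dummy coefficient plays the role the abstract describes: it shifts the wear curve from one machining condition to the other without refitting the base equation.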
ERIC Educational Resources Information Center
Culpepper, Steven Andrew
2010-01-01
Statistical prediction remains an important tool for decisions in a variety of disciplines. An equally important issue is identifying factors that contribute to more or less accurate predictions. The time series literature includes well developed methods for studying predictability and volatility over time. This article develops…
Phenology prediction component of GypsES
Jesse A. Logan; Lukas P. Schaub; F. William Ravlin
1991-01-01
Prediction of phenology is an important component of most pest management programs, and considerable research effort has been expended toward development of predictive tools for gypsy moth phenology. Although phenological prediction is potentially valuable for timing of spray applications (e.g. Bt, or Gypcheck) and other management activities (e.g. placement and...
The Complexity of Developmental Predictions from Dual Process Models
ERIC Educational Resources Information Center
Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.
2011-01-01
Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…
Mathematical modelling and numerical simulation of forces in milling process
NASA Astrophysics Data System (ADS)
Turai, Bhanu Murthy; Satish, Cherukuvada; Prakash Marimuthu, K.
2018-04-01
Machining of material by milling induces forces that act on the workpiece and, in turn, on the machining tool. The forces involved in the milling process can be quantified, and mathematical models help to predict them. A lot of research has been carried out in this area in the past few decades. The current research aims at developing a mathematical model to predict the forces at different levels that arise during machining of Aluminium 6061 alloy. Finite element analysis was used to develop an FE model to predict the cutting forces. Simulation was done for varying cutting conditions. Experiments were designed using the Taguchi method: an L9 orthogonal array was constructed and the output was measured for each experiment. The same was used to develop the mathematical model.
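For concreteness, a standard L9(3^3) orthogonal array and a least-squares fit of a linear force model might look like the following; the factor levels and force values are invented, and the paper's actual responses came from FE simulation.

```python
# L9 orthogonal array for three factors at three levels, plus a linear fit
# of cutting force against the factor settings. All numbers are invented.
import numpy as np

L9 = np.array([            # rows: runs, columns: factor levels 0/1/2
    [0, 0, 0], [0, 1, 1], [0, 2, 2],
    [1, 0, 1], [1, 1, 2], [1, 2, 0],
    [2, 0, 2], [2, 1, 0], [2, 2, 1],
])
levels = {"speed": [100, 150, 200],    # m/min (hypothetical)
          "feed": [0.05, 0.10, 0.15],  # mm/tooth
          "doc": [0.5, 1.0, 1.5]}      # mm, depth of cut
X = np.column_stack([np.array(levels[k])[L9[:, i]]
                     for i, k in enumerate(levels)])
force = np.array([120, 180, 260, 150, 240, 170, 210, 160, 230])  # N, invented

# Least-squares fit: force ~ intercept + speed + feed + doc
A = np.column_stack([np.ones(9), X])
coef, *_ = np.linalg.lstsq(A, force, rcond=None)
print(coef.round(2))
```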
DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION
The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict ... simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...
van der Put, Claudia E; Stams, Geert Jan J M
2013-12-01
In the juvenile justice system, much attention is paid to estimating the risk for recidivism among juvenile offenders. However, it is also important to estimate the risk for problematic child-rearing situations (care needs) in juvenile offenders, because these problems are not always related to recidivism. In the present study, an actuarial care needs assessment tool for juvenile offenders, the Youth Offender Care Needs Assessment Tool (YO-CNAT), was developed to predict the probability of (a) a future supervision order imposed by the child welfare agency, (b) a future entitlement to care indicated by the youth care agency, and (c) future incidents involving child abuse, domestic violence, and/or sexual norm trespassing behavior at the juvenile's address. The YO-CNAT has been developed for use by the police and is based solely on information available in police registration systems. It is designed to assist a police officer without clinical expertise in making a quick assessment of the risk for problematic child-rearing situations. The YO-CNAT was developed on a sample of 1,955 juvenile offenders and was validated on another sample of 2,045 juvenile offenders. The predictive validity (area under the receiver-operating-characteristic curve) scores ranged between .70 (for predicting future entitlement to care) and .75 (for predicting future worrisome incidents at the juvenile's address); therefore, the predictive accuracy of the test scores of the YO-CNAT was sufficient to justify its use as a screening instrument for the police in deciding to refer a juvenile offender to the youth care agency for further assessment into care needs.
Cao, Renzhi; Wang, Zheng; Wang, Yiheng; Cheng, Jianlin
2014-04-28
It is important to predict the quality of a protein structural model before its native structure is known. Methods that can predict the absolute local quality of individual residues in a single protein model are rare, yet particularly needed for using, ranking and refining protein models. We developed a machine learning tool (SMOQ) that can predict the distance deviation of each residue in a single protein model. SMOQ uses support vector machines (SVM) with protein sequence and structural features (i.e. a basic feature set), including amino acid sequence, secondary structures, solvent accessibilities, and residue-residue contacts, to make predictions. We also trained a SVM model with two new additional features (profiles and SOV scores) on 20 CASP8 targets and found that including them can only improve the performance when real deviations between native and model are higher than 5Å. The SMOQ tool finally released uses the basic feature set trained on 85 CASP8 targets. Moreover, SMOQ implements a way to convert predicted local quality scores into a global quality score. SMOQ was tested on the 84 CASP9 single-domain targets. The average difference between the residue-specific distance deviation predicted by our method and the actual distance deviation on the test data is 2.637Å. The global quality prediction accuracy of the tool is comparable to other good tools on the same benchmark. SMOQ is a useful tool for protein single model quality assessment. Its source code and executable are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/.
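The core regression task can be sketched with an SVR on per-residue features; the features and the local-to-global conversion below are toy stand-ins (a common choice in model quality assessment is an S-score-like transform), not SMOQ's exact feature set or formula.

```python
# Predict per-residue distance deviation with an SVR; features are random
# stand-ins for sequence/structure inputs (secondary structure agreement,
# solvent accessibility, contacts, ...).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(5)
features = rng.normal(size=(2000, 6))
deviation = np.abs(features @ [1.2, -0.8, 0.5, 0.3, -0.2, 0.7]
                   + rng.normal(0, 0.5, 2000))  # Angstrom-scale targets

svr = SVR(kernel="rbf", C=1.0, epsilon=0.2).fit(features[:1500], deviation[:1500])
pred = svr.predict(features[1500:])
print("mean abs error (toy):", np.mean(np.abs(pred - deviation[1500:])).round(2))

# One plausible local-to-global conversion: average an S-score-like transform
# of the predicted deviations (d0 = 3.8 A is an assumed scaling constant).
global_score = np.mean(1 / (1 + (pred / 3.8) ** 2))
print("global quality (toy):", global_score.round(3))
```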
Development of a Boundary Layer Property Interpolation Tool in Support of Orbiter Return To Flight
NASA Technical Reports Server (NTRS)
Greene, Francis A.; Hamilton, H. Harris
2006-01-01
A new tool was developed to predict the boundary layer quantities required by several physics-based predictive/analytic methods that assess damaged Orbiter tile. This new tool, the Boundary Layer Property Prediction (BLPROP) tool, supplies boundary layer values used in correlations that determine boundary layer transition onset and surface heating-rate augmentation/attenuation factors inside tile gouges (i.e. cavities). BLPROP interpolates through a database of computed solutions and provides boundary layer and wall data (δ, θ, Re_θ/M_e, Re_θ, P_w, and q_w) based on user-input surface location and free stream conditions. Surface locations are limited to the Orbiter's windward surface. Constructed using predictions from an inviscid/boundary-layer method and benchmark viscous CFD, the computed database covers the hypersonic continuum flight regime based on two reference flight trajectories. First-order one-dimensional Lagrange interpolation accounts for Mach number and angle-of-attack variations, whereas non-dimensional normalization accounts for differences between the reference and input Reynolds number. Employing the same computational methods used to construct the database, solutions at other trajectory points taken from previous STS flights were computed; these results validate the BLPROP algorithm. Percentage differences between interpolated and computed values are presented and are used to establish the level of uncertainty of the new tool.
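As an illustration of the interpolation scheme described above, here is a minimal Python sketch of first-order (linear) 1-D Lagrange interpolation applied successively in Mach number and angle of attack over a precomputed table; the grid points and table values are hypothetical, not BLPROP's database.

```python
import numpy as np

def lagrange1(x, x0, x1, f0, f1):
    """First-order Lagrange (linear) interpolation between (x0, f0) and (x1, f1)."""
    return f0 * (x - x1) / (x0 - x1) + f1 * (x - x0) / (x1 - x0)

mach_grid = np.array([10.0, 15.0])   # database Mach points (assumed)
aoa_grid = np.array([30.0, 40.0])    # database angle-of-attack points (assumed)
theta = np.array([[2.1, 2.4],        # momentum thickness table, mm (hypothetical)
                  [1.8, 2.0]])       # rows: Mach, cols: angle of attack

def interp_theta(mach, aoa):
    # interpolate in angle of attack at each Mach point, then in Mach
    f_at_m0 = lagrange1(aoa, aoa_grid[0], aoa_grid[1], theta[0, 0], theta[0, 1])
    f_at_m1 = lagrange1(aoa, aoa_grid[0], aoa_grid[1], theta[1, 0], theta[1, 1])
    return lagrange1(mach, mach_grid[0], mach_grid[1], f_at_m0, f_at_m1)

print(interp_theta(12.5, 35.0))
```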
In silico prediction of splice-altering single nucleotide variants in the human genome.
Jian, Xueqiu; Boerwinkle, Eric; Liu, Xiaoming
2014-12-16
In silico tools have been developed to predict variants that may have an impact on pre-mRNA splicing. The major limitation of the application of these tools to basic research and clinical practice is the difficulty in interpreting the output. Most tools only predict potential splice sites given a DNA sequence without measuring splicing signal changes caused by a variant. Another limitation is the lack of large-scale evaluation studies of these tools. We compared eight in silico tools on 2959 single nucleotide variants within splicing consensus regions (scSNVs) using receiver operating characteristic analysis. The Position Weight Matrix model and MaxEntScan outperformed other methods. Two ensemble learning methods, adaptive boosting and random forests, were used to construct models that take advantage of individual methods. Both models further improved prediction, with outputs of directly interpretable prediction scores. We applied our ensemble scores to scSNVs from the Catalogue of Somatic Mutations in Cancer database. Analysis showed that predicted splice-altering scSNVs are enriched in recurrent scSNVs and known cancer genes. We pre-computed our ensemble scores for all potential scSNVs across the human genome, providing a whole genome level resource for identifying splice-altering scSNVs discovered from large-scale sequencing studies.
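The ensemble idea described above can be sketched in a few lines of Python: the individual tool scores become features for adaptive boosting and random forest classifiers, and ROC analysis measures the gain. The data here are random stand-ins, not the scSNV training set.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2959, 8))   # 8 in silico tool scores per variant (toy)
y = (X[:, :2].sum(axis=1) + rng.normal(size=2959)) > 0  # toy splice-altering labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (RandomForestClassifier(random_state=0), AdaBoostClassifier(random_state=0)):
    score = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(type(model).__name__, "AUC:", round(roc_auc_score(y_te, score), 3))
```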
Mahmood, Khalid; Jung, Chol-Hee; Philip, Gayle; Georgeson, Peter; Chung, Jessica; Pope, Bernard J; Park, Daniel J
2017-05-16
Genetic variant effect prediction algorithms are used extensively in clinical genomics and research to determine the likely consequences of amino acid substitutions on protein function. It is vital that we better understand their accuracies and limitations because published performance metrics are confounded by serious problems of circularity and error propagation. Here, we derive three independent, functionally determined human mutation datasets, UniFun, BRCA1-DMS and TP53-TA, and employ them, alongside previously described datasets, to assess the pre-eminent variant effect prediction tools. Apparent accuracies of variant effect prediction tools were influenced significantly by the benchmarking dataset. Benchmarking with the assay-determined datasets UniFun and BRCA1-DMS yielded areas under the receiver operating characteristic curves in the modest ranges of 0.52 to 0.63 and 0.54 to 0.75, respectively, considerably lower than observed for other, potentially more conflicted datasets. These results raise concerns about how such algorithms should be employed, particularly in a clinical setting. Contemporary variant effect prediction tools are unlikely to be as accurate at the general prediction of functional impacts on proteins as previously reported. Use of functional assay-based datasets that avoid prior dependencies promises to be valuable for the ongoing development and accurate benchmarking of such tools.
Validation of a modified Medical Resource Model for mass gatherings.
Smith, Wayne P; Tuffin, Heather; Stratton, Samuel J; Wallis, Lee A
2013-02-01
A modified Medical Resource Model to predict the medical resources required at mass gatherings based on the risk profile of events has been developed. This study was undertaken to validate this tool using data from events held in both a developed and a developing country. A retrospective study was conducted utilizing prospectively gathered data from individual events at Old Trafford Stadium in Manchester, United Kingdom, and Ellis Park Stadium, Johannesburg, South Africa. Both stadia are similar in design and spectator capacity. Data for Professional Football as well as Rugby League and Rugby Union (respectively) matches were used for the study. The medical resources predicted for the events were determined by entering the risk profile of each of the events into the Medical Resource Model. A recently developed South African tool was used to predetermine medical staffing for mass gatherings. For the study, the medical resources actually required to deal with the patient load for events within the control sample from the two stadia were compared with the number of needed resources predicted by the Medical Resource Model when that tool was applied retrospectively to the study events. The comparison was used to determine if the newly developed tool was either over- or under-predicting the resource requirements. In the case of Ellis Park, the model under-predicted the basic life support (BLS) requirement for 1.5% of the events in the data set. Mean over-prediction was 209.1 minutes for BLS availability. Old Trafford displayed no events for which the Medical Resource Model would have under-predicted. The mean over-prediction of BLS availability for Old Trafford was 671.6 minutes. The intermediate life support (ILS) requirement for Ellis Park was under-predicted for seven of the total 66 events (10.6% of the events), all of which had one factor in common, that being relatively low spectator attendance numbers. Modelling for ILS at Old Trafford did not under-predict for any events. The ILS requirements showed a mean over-prediction of 161.4 minutes ILS availability for Ellis Park compared with 425.2 minutes for Old Trafford. Of the events held at Ellis Park, the Medical Resource Model under-predicted the ambulance requirement in 4.5% of the events. For Old Trafford events, the under-prediction was higher: 7.5% of cases. The medical resources that are deployed at a mass gathering should best match the requirement for patient care at a particular event. An important consideration for any model is that it does not continually under-predict the resources required in relation to the actual requirement. With the exception of a specific subset of events at Ellis Park, the rate of under-prediction for this model was acceptable.
Variable context Markov chains for HIV protease cleavage site prediction.
Oğul, Hasan
2009-06-01
Deciphering HIV protease specificity and developing computational tools for detecting its cleavage sites in protein polypeptide chains are highly desirable for designing efficient and specific chemical inhibitors to prevent acquired immunodeficiency syndrome. In this study, we developed a generative model based on a generalization of variable order Markov chains (VOMC) for peptide sequences and adapted the model for prediction of their cleavability by certain proteases. The new method, called variable context Markov chains (VCMC), attempts to identify context equivalence based on the evolutionary similarities between individual amino acids. It was applied to the HIV-1 protease cleavage site prediction problem and shown to outperform existing methods in terms of prediction accuracy on a common dataset. In general, the method is a promising tool for predicting the cleavage sites of all proteases, and its use is encouraged for other kinds of peptide classification problems as well.
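For orientation, here is a minimal Python sketch of the fixed-order Markov chain baseline that VOMC/VCMC generalize by allowing variable-length contexts: a chain trained on cleaved peptide windows scores new windows by log-likelihood. The toy 8-mer windows and smoothing constant are illustrative assumptions.

```python
from collections import defaultdict
import math

class MarkovScorer:
    """Fixed-order Markov chain over peptides with add-alpha smoothing."""
    def __init__(self, order=1, alpha=1.0, alphabet="ACDEFGHIKLMNPQRSTVWY"):
        self.order, self.alpha, self.alphabet = order, alpha, alphabet
        self.counts = defaultdict(lambda: defaultdict(float))

    def fit(self, peptides):
        for p in peptides:
            for i in range(self.order, len(p)):
                self.counts[p[i - self.order:i]][p[i]] += 1.0  # context -> next residue

    def log_likelihood(self, p):
        ll = 0.0
        for i in range(self.order, len(p)):
            ctx = p[i - self.order:i]
            num = self.counts[ctx][p[i]] + self.alpha
            den = sum(self.counts[ctx].values()) + self.alpha * len(self.alphabet)
            ll += math.log(num / den)
        return ll

cleaved = ["SQNYPIVQ", "ARVLAEAM", "ATIMMQRG"]  # toy cleavage-site windows
scorer = MarkovScorer(order=1)
scorer.fit(cleaved)
print(scorer.log_likelihood("SQNYPIVQ"))
```

A VOMC replaces the fixed-length context with the longest context observed in training, and VCMC additionally merges contexts whose amino acids are evolutionarily similar.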
Rabin, Borsika A.; Gaglio, Bridget; Sanders, Tristan; Nekhlyudov, Larissa; Dearing, James W.; Bull, Sheana; Glasgow, Russell E.; Marcus, Alfred
2013-01-01
Cancer prognosis is of keen interest for cancer patients, their caregivers and providers. Prognostic tools have been developed to guide patient-physician communication and decision-making. Given the proliferation of prognostic tools, it is timely to review existing online cancer prognostic tools and discuss implications for their use in clinical settings. Using a systematic approach, we searched the Internet, Medline, and consulted with experts to identify existing online prognostic tools. Each was reviewed for content and format. Twenty-two prognostic tools addressing 89 different cancers were identified. Tools primarily focused on prostate (n=11), colorectal (n=10), breast (n=8), and melanoma (n=6), though at least one tool was identified for most malignancies. The input variables for the tools included cancer characteristics (n=22), patient characteristics (n=18), and comorbidities (n=9). Effect of therapy on prognosis was included in 15 tools. The most common predicted outcome was cancer specific survival/mortality (n=17). Only a few tools (n=4) suggested patients as potential target users. A comprehensive repository of online prognostic tools was created to understand the state-of-the-art in prognostic tool availability and characteristics. Use of these tools may support communication and understanding about cancer prognosis. Dissemination, testing, refinement of existing, and development of new tools under different conditions are needed. PMID:23956026
Inflammation-driven malnutrition: a new screening tool predicts outcome in Crohn's disease.
Jansen, Irene; Prager, Matthias; Valentini, Luzia; Büning, Carsten
2016-09-01
Malnutrition is a frequent feature in Crohn's disease (CD), affects patient outcome and must be recognised. For chronic inflammatory diseases, recent guidelines recommend the development of combined malnutrition and inflammation risk scores. We aimed to design and evaluate a new screening tool that combines both malnutrition and inflammation parameters that might help predict clinical outcome. In a prospective cohort study, we examined fifty-five patients with CD in remission (Crohn's disease activity index (CDAI) <200) at 0 and 6 months. We assessed disease activity (CDAI, Harvey-Bradshaw index), inflammation (C-reactive protein (CRP), faecal calprotectin (FC)), malnutrition (BMI, subjective global assessment (SGA), serum albumin, handgrip strength), body composition (bioelectrical impedance analysis) and administered the newly developed 'Malnutrition Inflammation Risk Tool' (MIRT; containing BMI, unintentional weight loss over 3 months and CRP). All parameters were evaluated regarding their ability to predict disease outcome prospectively at 6 months. At baseline, more than one-third of patients showed elevated inflammatory markers despite clinical remission (36·4 % CRP ≥5 mg/l, 41·5 % FC ≥100 µg/g). Prevalence of malnutrition at baseline according to BMI, SGA and serum albumin was 2-16 %. At 6 months, MIRT significantly predicted outcome in numerous nutritional and clinical parameters (SGA, CD-related flares, hospitalisations and surgeries). In contrast, SGA, handgrip strength, BMI, albumin and body composition had no influence on the clinical course. The newly developed MIRT was found to reliably predict clinical outcome in CD patients. This screening tool might be used to facilitate clinical decision making, including treatment of both inflammation and malnutrition in order to prevent complications.
SITE CHARACTERIZATION TO SUPPORT MODEL DEVELOPMENT FOR CONTAMINANTS IN GROUND WATER
The development of conceptual and predictive models is an important tool to guide site characterization in support of monitoring contaminants in ground water. The accuracy of predictive models is limited by the adequacy of the input data and the assumptions made to constrain mod...
NASA Technical Reports Server (NTRS)
Likhanskii, Alexandre
2012-01-01
This report is the final report of an SBIR Phase I project. It is identical to the final report submitted, after some proprietary information of an administrative nature has been removed. The development of a numerical simulation tool for dielectric barrier discharge (DBD) plasma actuators is reported. The objectives of the project were to analyze and predict DBD operation over a wide range of ambient gas pressures. The tool overcomes the limitations of traditional DBD codes, which are restricted to low-speed applications and have weak predictive capabilities. The software tool allows DBD actuator analysis and prediction from the subsonic to the hypersonic flow regime. The simulation tool is based on the VORPAL code developed by Tech-X Corporation. VORPAL's capability of modeling the DBD plasma actuator at low pressures (0.1 to 10 torr) using a kinetic plasma modeling approach, and at moderate to atmospheric pressures (1 to 10 atm) using a hydrodynamic plasma modeling approach, was demonstrated. In addition, results of experiments with a pulsed+bias DBD configuration that were performed for validation purposes are reported.
Jet Measurements for Development of Jet Noise Prediction Tools
NASA Technical Reports Server (NTRS)
Bridges, James E.
2006-01-01
The primary focus of my presentation is the development of the jet noise prediction code JeNo, with most examples coming from the experimental work that drove the theoretical development and validation. JeNo is a statistical jet noise prediction code based upon the Lilley acoustic analogy. Our approach uses time-averaged 2-D or 3-D mean and turbulent statistics of the flow as input. The output is source distributions and spectral directivity.
Liu, Xinxue; Wong, Angela; Kadri, Sudarshan R; Corovic, Andrej; O'Donovan, Maria; Lao-Sirieix, Pierre; Lovat, Laurence B; Burnham, Rodney W; Fitzgerald, Rebecca C
2014-01-01
Barrett's esophagus (BE) occurs as consequence of reflux and is a risk factor for esophageal adenocarcinoma. The current "gold-standard" for diagnosing BE is endoscopy which remains prohibitively expensive and impractical as a population screening tool. We aimed to develop a pre-screening tool to aid decision making for diagnostic referrals. A prospective (training) cohort of 1603 patients attending for endoscopy was used for identification of risk factors to develop a risk prediction model. Factors associated with BE in the univariate analysis were selected to develop prediction models that were validated in an independent, external cohort of 477 non-BE patients referred for endoscopy with symptoms of reflux or dyspepsia. Two prediction models were developed separately for columnar lined epithelium (CLE) of any length and using a stricter definition of intestinal metaplasia (IM) with segments ≥ 2 cm with areas under the ROC curves (AUC) of 0.72 (95%CI: 0.67-0.77) and 0.81 (95%CI: 0.76-0.86), respectively. The two prediction models included demographics (age, sex), symptoms (heartburn, acid reflux, chest pain, abdominal pain) and medication for "stomach" symptoms. These two models were validated in the independent cohort with AUCs of 0.61 (95%CI: 0.54-0.68) and 0.64 (95%CI: 0.52-0.77) for CLE and IM ≥ 2 cm, respectively. We have identified and validated two prediction models for CLE and IM ≥ 2 cm. Both models have fair prediction accuracies and can select out around 20% of individuals unlikely to benefit from investigation for Barrett's esophagus. Such prediction models have the potential to generate useful cost-savings for BE screening among the symptomatic population.
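The kind of risk model described above can be sketched compactly in Python: logistic regression on demographics and symptoms, with AUC reported on an external cohort. All data below are simulated placeholders, not the study cohorts; only the cohort sizes are taken from the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
def make_cohort(n):
    X = np.column_stack([
        rng.integers(40, 80, n),   # age
        rng.integers(0, 2, n),     # sex (1 = male)
        rng.integers(0, 2, n),     # heartburn
        rng.integers(0, 2, n),     # acid reflux
    ])
    logit = -6 + 0.05 * X[:, 0] + 0.8 * X[:, 1] + 0.6 * X[:, 3]  # toy truth
    y = rng.random(n) < 1 / (1 + np.exp(-logit))
    return X, y

X_train, y_train = make_cohort(1603)   # training cohort size from the abstract
X_valid, y_valid = make_cohort(477)    # external validation cohort size
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
print("external validation AUC:", round(auc, 2))
```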
NASA Astrophysics Data System (ADS)
Heatwole, K. K.; McCray, J.; Lowe, K.
2005-12-01
Individual sewage disposal systems (ISDS) have demonstrated the capability to be an effective method of treatment for domestic wastewater. They also are advantageous from a water resources standpoint because there is little water leaving the local hydrologic system. However, if unfavorable settings exist, ISDS can have a detrimental effect on local water quality. This presentation will focus on assessing the potential impacts of a large housing development to area water quality. The residential development plans to utilize ISDS to accommodate all domestic wastewater generated within the development. The area of interest is located just west of Brighton, Colorado, on the northwestern margin of the Denver Basin. Efforts of this research will focus on impacts of ISDS to local groundwater and surface water systems. The Arapahoe Aquifer, which exists at relatively shallow depths in the area of proposed development, is suspected to be vulnerable to contamination from ISDS. Additionally, the local water quality of the Arapahoe Aquifer was not well known at the start of the study. As a result, nitrate was selected as a focus water quality parameter because it is easily produced through nitrification of septic tank effluent and because of the previous agricultural practices that could be another potential source of nitrate. Several different predictive tools were used to attempt to predict the potential impacts of ISDS to water quality in the Arapahoe Aquifer. The objectives of these tools were to 1) assess the vulnerability of the Arapahoe Aquifer to nitrate contamination, 2) predict the nitrate load to the aquifer, and 3) determine the sensitivity of different parameter inputs and the overall prediction uncertainty. These predictive tools began with very simple mass-loading calculations and progressed to more complex, vadose-zone numerical contaminant transport modeling.
Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng
2015-01-01
Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of lacking sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put a lot of effort to construct it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses. Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs.
Fan Noise Prediction with Applications to Aircraft System Noise Assessment
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Envia, Edmane; Burley, Casey L.
2009-01-01
This paper describes an assessment of current fan noise prediction tools by comparing measured and predicted sideline acoustic levels from a benchmark fan noise wind tunnel test. Specifically, an empirical method and newly developed coupled computational approach are utilized to predict aft fan noise for a benchmark test configuration. Comparisons with sideline noise measurements are performed to assess the relative merits of the two approaches. The study identifies issues entailed in coupling the source and propagation codes, as well as provides insight into the capabilities of the tools in predicting the fan noise source and subsequent propagation and radiation. In contrast to the empirical method, the new coupled computational approach provides the ability to investigate acoustic near-field effects. The potential benefits/costs of these new methods are also compared with the existing capabilities in a current aircraft noise system prediction tool. The knowledge gained in this work provides a basis for improved fan source specification in overall aircraft system noise studies.
Mysara, Mohamed; Elhefnawi, Mahmoud; Garibaldi, Jonathan M
2012-06-01
The investigation of small interfering RNA (siRNA) and its posttranscriptional gene regulation has become an extremely important research topic, both for fundamental reasons and for potential longer-term therapeutic benefits. Several factors affect the functionality of siRNA, including positional preferences, target accessibility and other thermodynamic features. State-of-the-art tools aim to optimize the selection of target siRNAs by identifying those that may have high experimental inhibition. Such tools implement artificial neural network models, such as Biopredsi and ThermoComposition21, and linear regression models, such as DSIR, i-Score and Scales, among others. However, all these models have limitations in performance. In this work, a new neural-network-trained siRNA scoring/efficacy prediction model was developed by combining two existing scoring algorithms (ThermoComposition21 and i-Score), together with the whole stacking energy (ΔG), in a multi-layer artificial neural network. These three parameters were chosen after a comparative combinatorial study of five well-known tools. Our model, 'MysiRNA', was trained on 2431 siRNA records and tested using three further datasets. MysiRNA was compared with 11 existing scoring tools in an evaluation study of predicted against experimental siRNA efficiency, where it achieved the highest performance both in terms of correlation coefficient (R(2)=0.600) and receiver operating characteristic analysis (AUC=0.808), improving prediction accuracy by up to 18% with respect to the sensitivity and specificity of the best available tools. MysiRNA is a novel, freely accessible model capable of predicting siRNA inhibition efficiency with improved specificity and sensitivity. This multiclassifier approach could help improve the performance of prediction in several bioinformatics areas. The MysiRNA model, part of the MysiRNA-Designer package [1], is expected to play a key role in siRNA selection and evaluation. Copyright © 2012 Elsevier Inc. All rights reserved.
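A minimal sketch of the combination scheme described above: two tool scores plus the whole stacking energy feed a small multi-layer network that predicts inhibition efficacy. The inputs and target are simulated placeholders, not the 2431-record training set.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
thermo21 = rng.random(300)            # ThermoComposition21 score (toy)
iscore = rng.random(300)              # i-Score (toy)
dG = rng.normal(-30, 5, 300)          # whole stacking energy, kcal/mol (toy)
X = np.column_stack([thermo21, iscore, dG])
y = 0.5 * thermo21 + 0.4 * iscore + 0.01 * (-dG) + rng.normal(0, 0.05, 300)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X, y)
print("training R^2:", round(net.score(X, y), 3))
```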
GAPIT: genome association and prediction integrated tool.
Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu
2012-09-15
Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
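For orientation, the compressed mixed linear model builds on the standard mixed linear model; a generic statement (conventional notation, not GAPIT's own symbols) is:

```latex
% y: phenotypes, X\beta: fixed effects (tested SNP, covariates),
% Zu: random polygenic effects with kinship matrix K, e: residual error.
y = X\beta + Zu + e, \qquad
u \sim \mathcal{N}\!\left(0,\ \sigma_a^{2} K\right), \qquad
e \sim \mathcal{N}\!\left(0,\ \sigma_e^{2} I\right)
```

Compression clusters individuals into groups so that K is replaced by a much smaller group-kinship matrix, which is what makes datasets of the quoted size computationally tractable.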
Hinge Moment Coefficient Prediction Tool and Control Force Analysis of Extra-300 Aerobatic Aircraft
NASA Astrophysics Data System (ADS)
Nurohman, Chandra; Arifianto, Ony; Barecasco, Agra
2018-04-01
This paper presents the development of a tool for predicting the hinge moment coefficients of subsonic aircraft based on Roskam's method, including its validation and its application to an Extra-300. The hinge moment coefficients are used to predict the stick forces of the aircraft during several aerobatic maneuvers, i.e. inside loop, half cuban 8, split-s, and aileron roll. The maximum longitudinal stick force is 566.97 N, occurring in the inside loop, while the maximum lateral stick force is 340.82 N, occurring in the aileron roll. Furthermore, validation of the hinge moment prediction method is performed using Cessna 172 data.
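As context for how hinge moment coefficients translate into the quoted stick forces, a conventional subsonic relation (standard textbook symbols, not necessarily the paper's exact formulation) is:

```latex
% H_e: control surface hinge moment, \bar{q}: dynamic pressure,
% S_e, \bar{c}_e: control surface area and mean chord,
% G: stick-to-surface gearing ratio, F_s: stick force.
H_e = \bar{q}\, S_e\, \bar{c}_e\, C_h, \qquad
C_h \approx C_{h_0} + C_{h_\alpha}\,\alpha + C_{h_\delta}\,\delta, \qquad
F_s = G\, H_e
```

Roskam-style methods supply the hinge moment derivatives C_{h_\alpha} and C_{h_\delta} from geometry, which the tool then evaluates along each maneuver's trajectory.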
Development and Overview of CPAS Sasquatch Airdrop Landing Location Predictor Software
NASA Technical Reports Server (NTRS)
Bledsoe, Kristin J.; Bernatovich, Michael A.
2015-01-01
The Capsule Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. CPAS is currently in the Engineering Development Unit (EDU) phase of testing. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish a release point from the aircraft that will ensure that the article and all items released from it during flight will land in a designated safe area. The Sasquatch footprint tool was developed to determine this safe release point and to predict the probable landing locations (footprints) of the payload and all released objects. In 2012, a new version of Sasquatch, called Sasquatch Polygons, was developed that significantly upgraded the capabilities of the footprint tool. Key improvements were an increase in the accuracy of the predictions, and the addition of an interface with the Debris Tool (DT), an in-flight debris avoidance tool for use on the test observation helicopter. Additional enhancements include improved data presentation for communication with test personnel and a streamlined code structure. This paper discusses the development, validation, and performance of Sasquatch Polygons, as well as its differences from the original Sasquatch footprint tool.
Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base
NASA Technical Reports Server (NTRS)
Bryant, Richard B., Jr.; Carrelli, David J.
2006-01-01
The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base which was used primarily for safety loads prediction, design of the closed loop compensator and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end to end operation of the motion base system providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and function generator setup and evaluation.
DOT National Transportation Integrated Search
2013-09-01
In this project, researchers from the University of Florida developed a sketch planning tool that can be used to conduct statewide and regional assessments of transportation facilities potentially vulnerable to sea level change trends. Possible futur...
EPAs ToxCast Research Program: Developing Predictive Bioactivity Signatures for Chemicals
The international community needs better predictive tools for assessing the hazards and risks of chemicals. It is technically feasible to collect bioactivity data on virtually all chemicals of potential concern. ToxCast is providing a proof of concept for obtaining predictive, b...
Assessing Bleeding Risk in Patients Taking Anticoagulants
Shoeb, Marwa; Fang, Margaret C.
2013-01-01
Anticoagulant medications are commonly used for the prevention and treatment of thromboembolism. Although highly effective, they are also associated with significant bleeding risks. Numerous individual clinical factors have been linked to an increased risk of hemorrhage, including older age, anemia, and renal disease. To help quantify hemorrhage risk for individual patients, a number of clinical risk prediction tools have been developed. These risk prediction tools differ in how they were derived and how they identify and weight individual risk factors. At present, their ability to effectively predict anticoagulant-associated hemorrhage remains modest. Use of risk prediction tools to estimate bleeding in clinical practice is most influential when applied to patients at the lower spectrum of thromboembolic risk, when the risk of hemorrhage will more strongly affect clinical decisions about anticoagulation. Using risk tools may also help counsel and inform patients about their potential risk for hemorrhage while on anticoagulants, and can identify patients who might benefit from more careful management of anticoagulation. PMID:23479259
Navy Fuel Composition and Screening Tool (FCAST) v2.8
2016-05-10
… allowed us to develop partial least squares (PLS) models based on gas chromatography–mass spectrometry (GC-MS) data that predict fuel properties. … Keywords: chemometric property modeling; partial least squares (PLS); compositional profiler; Naval Air Systems Command Air-4.4.5; Patuxent River Naval Air Station …
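A minimal sketch of the chemometric modeling described above: PLS regression mapping GC-MS compositional features to a fuel property. The arrays are random placeholders, not FCAST data, and the component count is an assumption.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 300))           # 120 fuels x 300 GC-MS features (toy)
w = rng.normal(size=300)
y = X @ w * 0.01 + rng.normal(size=120)   # toy property, e.g. a flash point surrogate

pls = PLSRegression(n_components=10).fit(X, y)
print("R^2 on training data:", round(pls.score(X, y), 3))
```

PLS is the natural choice here because the GC-MS features are many, collinear, and far outnumber the fuel samples, conditions under which ordinary least squares fails.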
A generic rabies risk assessment tool to support surveillance.
Ward, Michael P; Hernández-Jover, Marta
2015-06-01
The continued spread of rabies in Indonesia poses a risk to human and animal populations in the remaining free islands, as well as the neighbouring rabies-free countries of Timor Leste, Papua New Guinea and Australia. Here we describe the development of a generic risk assessment tool which can be used to rapidly determine the vulnerability of rabies-free islands, so that scarce resources can be targeted to surveillance activities and the sensitivity of surveillance systems increased. The tool was developed by integrating information on the historical spread of rabies, anthropological studies, and the opinions of local animal health experts. The resulting tool is based on eight critical parameters that can be estimated from the literature, expert opinion, observational studies and information generated from routine surveillance. In the case study presented, the results generated by this tool were most sensitive to the probability that dogs are present on private and fishing boats; it was predicted that rabies infection (one infected case) might occur on a rabies-free island (upper 95% prediction interval) with a volume of 1000 boat movements. With 25,000 boat movements, the median of the probability distribution would be equal to one infected case, with an upper 95% prediction interval of six infected cases. This tool could also be used at the national level to guide control and eradication plans. An initial recommendation from this study is to develop a surveillance programme to determine the likelihood that boats transport dogs, for example by port surveillance or regularly conducted surveys of fishermen and passenger ferries. However, the illegal nature of dog transportation from rabies-infected to rabies-free islands is a challenge for developing such surveillance. Copyright © 2014 Elsevier B.V. All rights reserved.
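A hedged Monte Carlo sketch of the kind of import-risk calculation that yields medians and 95% prediction intervals like those quoted above. All probability distributions are illustrative assumptions, not the study's calibrated inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sims, n_boats = 10_000, 25_000
p_dog_on_boat = rng.beta(2, 500, size=n_sims)   # uncertain P(boat carries a dog)
p_dog_infected = rng.beta(1, 200, size=n_sims)  # uncertain P(carried dog is rabid)

# each simulation draws the number of infected introductions for n_boats movements
infected = rng.binomial(n_boats, p_dog_on_boat * p_dog_infected)
print("median infected introductions:", np.median(infected))
print("upper 95% prediction interval:", np.percentile(infected, 95))
```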
Toxicity Estimation Software Tool (TEST)
The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
Update on Supersonic Jet Noise Research at NASA
NASA Technical Reports Server (NTRS)
Henderson, Brenda
2010-01-01
An update on jet noise research conducted in the Fundamental Aeronautics and Integrated Systems Research Programs was presented. Highlighted research projects included those focused on the development of prediction tools, diagnostic tools, and noise reduction concepts.
Development and evaluation of a predictive algorithm for telerobotic task complexity
NASA Technical Reports Server (NTRS)
Gernhardt, M. L.; Hunter, R. C.; Hedgecock, J. C.; Stephenson, A. G.
1993-01-01
There is a wide range of complexity in the various telerobotic servicing tasks performed in subsea, space, and hazardous material handling environments. Experience with telerobotic servicing has evolved into a knowledge base used to design tasks to be 'telerobot friendly.' This knowledge base generally resides in a small group of people. Written documentation and requirements are limited in conveying this knowledge base to serviceable equipment designers and are subject to misinterpretation. A mathematical model of task complexity based on measurable task parameters and telerobot performance characteristics would be a valuable tool to designers and operational planners. Oceaneering Space Systems and TRW have performed an independent research and development project to develop such a tool for telerobotic orbital replacement unit (ORU) exchange. This algorithm was developed to predict an ORU exchange degree of difficulty rating (based on the Cooper-Harper rating used to assess piloted operations). It is based on measurable parameters of the ORU, attachment receptacle and quantifiable telerobotic performance characteristics (e.g., link length, joint ranges, positional accuracy, tool lengths, number of cameras, and locations). The resulting algorithm can be used to predict task complexity as the ORU parameters, receptacle parameters, and telerobotic characteristics are varied.
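To make the idea of a parametric difficulty model concrete, here is a hedged Python sketch: a weighted combination of ORU, receptacle, and telerobot parameters clamped onto a 1-10 Cooper-Harper-style scale. The parameters and weights are illustrative assumptions, not the OSS/TRW algorithm.

```python
def degree_of_difficulty(positional_accuracy_mm, joint_range_margin,
                         tool_length_m, n_cameras):
    """Toy difficulty rating: higher means harder (1-10 scale)."""
    score = (0.8 * positional_accuracy_mm        # looser accuracy -> harder
             + 2.0 * (1.0 - joint_range_margin)  # less joint margin -> harder
             + 1.5 * tool_length_m               # longer tools -> harder
             - 0.5 * n_cameras)                  # more camera views -> easier
    return min(10.0, max(1.0, 1.0 + score))      # clamp to the rating scale

print(degree_of_difficulty(2.0, 0.6, 0.4, 2))
```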
Efficient prediction of human protein-protein interactions at a global scale.
Schoenrock, Andrew; Samanfar, Bahram; Pitre, Sylvain; Hooshyar, Mohsen; Jin, Ke; Phillips, Charles A; Wang, Hui; Phanse, Sadhna; Omidi, Katayoun; Gui, Yuan; Alamgir, Md; Wong, Alex; Barrenäs, Fredrik; Babu, Mohan; Benson, Mikael; Langston, Michael A; Green, James R; Dehne, Frank; Golshani, Ashkan
2014-12-10
Our knowledge of global protein-protein interaction (PPI) networks in complex organisms such as humans is hindered by technical limitations of current methods. On the basis of short co-occurring polypeptide regions, we developed a tool called MP-PIPE capable of predicting a global human PPI network within 3 months. With a recall of 23% at a precision of 82.1%, we predicted 172,132 putative PPIs. We demonstrate the usefulness of these predictions through a range of experiments. The speed and accuracy associated with MP-PIPE can make this a potential tool to study individual human PPI networks (from genomic sequences alone) for personalized medicine.
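A back-of-envelope reading of the quoted operating point, assuming the stated precision and recall refer to the same underlying interaction distribution:

```python
# Implied network size from the reported precision/recall operating point.
predicted = 172_132
precision, recall = 0.821, 0.23
true_positives = predicted * precision    # ~141,000 correct predictions
implied_total = true_positives / recall   # ~614,000 true human PPIs implied
print(round(true_positives), round(implied_total))
```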
Predicting tool life in turning operations using neural networks and image processing
NASA Astrophysics Data System (ADS)
Mikołajczyk, T.; Nowicki, K.; Bustillo, A.; Yu Pimenov, D.
2018-05-01
A two-step method is presented for the automatic prediction of tool life in turning operations. First, experimental data are collected for three cutting edges under the same constant processing conditions. In these experiments, the tool wear parameter, VB, is measured with conventional methods, and the same parameter is estimated using Neural Wear, a customized software package that combines flank wear image recognition and Artificial Neural Networks (ANNs). Second, an ANN model of tool life is trained with the data collected from the first two cutting edges, and the resulting model is evaluated on two different subsets for the third cutting edge: the first subset is obtained from direct measurement of tool wear and the second from the Neural Wear software, which estimates tool wear from edge images. Although the fully automated solution, the Neural Wear software for tool wear recognition plus the ANN model of tool life prediction, presented a slightly higher error than the direct measurements, it was within the same range and can meet all industrial requirements. These results confirm that the combination of image recognition software and ANN modelling could potentially be developed into a useful industrial tool for low-cost estimation of tool life in turning operations.
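A minimal sketch of the second step described above: an ANN mapping cutting time to flank wear VB, from which tool life is read off at a wear criterion. The data and the VB = 0.3 mm criterion are illustrative assumptions, not the experimental values.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

t = np.linspace(0.5, 20, 40).reshape(-1, 1)               # cutting time, min (toy)
vb = 0.05 + 0.012 * t.ravel() + 0.002 * t.ravel() ** 1.5  # toy flank wear curve, mm

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                     random_state=0).fit(t, vb)

t_fine = np.linspace(0.5, 20, 2000).reshape(-1, 1)
crossing = t_fine[model.predict(t_fine) >= 0.3]           # first time VB hits 0.3 mm
if crossing.size:
    print("predicted tool life (min):", round(float(crossing.min()), 1))
else:
    print("wear criterion not reached in the modeled interval")
```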
Aerodynamics and thermal physics of helicopter ice accretion
NASA Astrophysics Data System (ADS)
Han, Yiqiang
Ice accretion on aircraft introduces significant loss in airfoil performance. Reduced lift-to-drag ratio reduces the vehicle capability to maintain altitude and also limits its maneuverability. Current ice accretion performance degradation modeling approaches are calibrated only to a limited envelope of liquid water content, impact velocity, temperature, and water droplet size; consequently inaccurate aerodynamic performance degradations are estimated. The reduced ice accretion prediction capabilities in the glaze ice regime are primarily due to a lack of knowledge of surface roughness induced by ice accretion. A comprehensive understanding of the ice roughness effects on airfoil heat transfer, ice accretion shapes, and ultimately aerodynamics performance is critical for the design of ice protection systems. Surface roughness effects on both heat transfer and aerodynamic performance degradation on airfoils have been experimentally evaluated. Novel techniques, such as ice molding and casting methods and transient heat transfer measurement using non-intrusive thermal imaging methods, were developed at the Adverse Environment Rotor Test Stand (AERTS) facility at Penn State. A novel heat transfer scaling method specifically for the turbulent flow regime was also conceived. A heat transfer scaling parameter, labeled the Coefficient of Stanton and Reynolds Number (CSR = St_x/Re_x^-0.2), has been validated against reference data found in the literature for rough flat plates with Reynolds number (Re) up to 1×10^7, for rough cylinders with Re ranging from 3×10^4 to 4×10^6, and for turbine blades with Re from 7.5×10^5 to 7×10^6. This is the first time that the effect of Reynolds number is shown to be successfully eliminated on heat transfer magnitudes measured on rough surfaces. Analytical models for ice roughness distribution, heat transfer prediction, and aerodynamics performance degradation due to ice accretion have also been developed. The ice roughness prediction model was developed based on a set of 82 experimental measurements and also compared to existing prediction tools. Two reference predictions found in the literature yielded 76% and 54% discrepancy with respect to experimental testing, whereas the proposed ice roughness prediction model resulted in a 31% minimum accuracy in prediction. It must be noted that the accuracy of the proposed model is within the ice shape reproduction uncertainty of icing facilities. Based on the new ice roughness prediction model and the CSR heat transfer scaling method, an icing heat transfer model was developed. The approach achieved high accuracy in heat transfer prediction compared to experiments conducted at the AERTS facility. The discrepancy between predictions and experimental results was within +/-15%, which was within the measurement uncertainty range of the facility. By combining both the ice roughness and heat transfer predictions, and incorporating the modules into an existing ice prediction tool (LEWICE), improved prediction capability was obtained, especially for the glaze regime. With the available ice shapes accreted at the AERTS facility and additional experiments found in the literature, 490 sets of experimental ice shapes and corresponding aerodynamics testing data were available. A physics-based performance degradation empirical tool was developed and achieved a mean absolute deviation of 33% when compared to the entire experimental dataset, whereas 60% to 243% discrepancies were observed using legacy drag penalty prediction tools.
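Written out, the scaling parameter defined in the text is:

```latex
% St_x and Re_x are the local Stanton and Reynolds numbers; dividing by
% Re_x^{-0.2} removes the Reynolds dependence observed in turbulent
% rough-surface heat transfer data.
\mathrm{CSR} = \frac{St_x}{Re_x^{-0.2}} = St_x \, Re_x^{0.2}
```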
Rotor torque predictions coupling Blade Element Momentum Theory and the proposed drag performance degradation tool were conducted on a total of 17 validation cases. The coupled prediction tool achieved a 10% prediction error for clean rotor conditions, and a 16% error for iced rotor conditions. It was shown that additional roughness elements could affect the measured drag by up to 25% during experimental testing, emphasizing the need for realistic ice structures during aerodynamic modeling and testing for ice accretion.
SETAC Short Course: Introduction to interspecies toxicity extrapolation using EPA’s Web-ICE tool
The Web-ICE tool is a user friendly interface that contains modules to predict acute toxicity to over 500 species of aquatic (algae, invertebrates, fish) and terrestrial (birds and mammals) taxa. The tool contains a suite of over 3000 ICE models developed from a database of over ...
The Population Life-course Exposure to Health Effects Modeling (PLETHEM) platform being developed provides a tool that links results from emerging toxicity testing tools to exposure estimates for humans as defined by the USEPA. A reverse dosimetry case study using phthalates was ...
NASA Technical Reports Server (NTRS)
Ling, Lisa
2014-01-01
For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanamori, Masashi, E-mail: kanamori.masashi@jaxa.jp; Takahashi, Takashi, E-mail: takahashi.takashi@jaxa.jp; Aoyama, Takashi, E-mail: aoyama.takashi@jaxa.jp
2015-10-28
This paper introduces a prediction tool for the propagation of loud noise, with application to aeronautics in mind. The tool, named SPnoise, is based on the HOWARD approach, which can express almost exact multidimensionality of the diffraction effect at the cost of back scattering. In particular, the paper addresses the prediction of the effect of atmospheric turbulence on sonic boom, one of the important issues in aeronautics. Thanks to its simple and efficient modeling of atmospheric turbulence, SPnoise successfully re-creates the characteristic features of this effect, which often emerge in the region just behind the front and rear shock waves in the sonic boom signature.
NASA Astrophysics Data System (ADS)
Di Lorenzo, R.; Ingarao, G.; Fonti, V.
2007-05-01
The crucial task in the prevention of ductile fracture is the availability of a tool for predicting the occurrence of such defects. The technical literature reports wide investigation of this topic, with contributions from many authors following different approaches. The main class of approaches concerns the development of fracture criteria: generally, such criteria are expressed by determining a critical value of a damage function that depends on the stress and strain paths, and ductile fracture is assumed to occur when this critical value is reached during the analysed process. A relevant drawback is related to the utilization of ductile fracture criteria: each criterion usually performs well in predicting fracture for particular stress-strain paths, i.e. it works very well for certain processes but may provide poor results for others. On the other hand, approaches based on damage mechanics formulations are very effective from a theoretical point of view, but they are complex and their proper calibration is quite difficult. In this paper, two different approaches are investigated for predicting fracture occurrence in cold forming operations. The final aim of the proposed method is a tool with general reliability, i.e. one able to predict fracture for different forming processes; the work represents a step forward within a research project focused on innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools, both aimed at predicting fracture occurrence or absence from a set of stress and strain path data. The approach utilizes available experimental data on fracture occurrence in different processes for a given material: experimental tests in which fracture occurs are analysed and numerically simulated in order to track the stress-strain paths in the workpiece region where fracture is expected. These data are used to build a data set with which an artificial neural network was trained and a statistical analysis aimed at predicting fracture occurrence was performed. The statistical tool was properly designed and optimized and is able to recognize fracture occurrence. Its reliability and predictive capability were compared with those of the artificial neural network developed for the same task. Moreover, the approach is also validated on forming processes characterized by complex fracture mechanics.
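A minimal Python sketch of the comparison described above: an ANN and a statistical classifier both predict fracture occurrence from stress-strain path features. The features, labels, and cross-validation setup are synthetic placeholders, not the paper's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 4))   # e.g. peak triaxiality, max strain, ... (toy)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200)) > 0  # toy fracture labels

for model in (MLPClassifier(hidden_layer_sizes=(8,), max_iter=3000, random_state=0),
              LogisticRegression()):
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(type(model).__name__, "accuracy:", round(acc, 2))
```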
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop the Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
Cohen-Stavi, Chandra; Leventer-Roberts, Maya; Balicer, Ran D
2017-01-01
Objective: To directly compare the performance and externally validate the three most studied prediction tools for osteoporotic fractures—QFracture, FRAX, and Garvan—using data from electronic health records. Design: Retrospective cohort study. Setting: Payer provider healthcare organisation in Israel. Participants: 1 054 815 members aged 50 to 90 years for comparison between tools and cohorts of different age ranges, corresponding to those in each tool's development study, for tool-specific external validation. Main outcome measure: First diagnosis of a major osteoporotic fracture (for QFracture and FRAX tools) and hip fractures (for all three tools) recorded in electronic health records from 2010 to 2014. Observed fracture rates were compared to probabilities predicted retrospectively as of 2010. Results: The observed five year hip fracture rate was 2.7% and the rate for major osteoporotic fractures was 7.7%. The areas under the receiver operating curve (AUC) for hip fracture prediction were 82.7% for QFracture, 81.5% for FRAX, and 77.8% for Garvan. For major osteoporotic fractures, AUCs were 71.2% for QFracture and 71.4% for FRAX. All the tools underestimated the fracture risk, but the average observed to predicted ratios and the calibration slopes of FRAX were closest to 1. Tool-specific validation analyses yielded hip fracture prediction AUCs of 88.0% for QFracture (among those aged 30-100 years), 81.5% for FRAX (50-90 years), and 71.2% for Garvan (60-95 years). Conclusions: Both QFracture and FRAX had high discriminatory power for hip fracture prediction, with QFracture performing slightly better. This performance gap was more pronounced in previous studies, likely because of broader age inclusion criteria for QFracture validations. The simpler FRAX performed almost as well as QFracture for hip fracture prediction, and may have advantages if some of the input data required for QFracture are not available. However, both tools require calibration before implementation. PMID:28104610
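A minimal sketch of the validation metrics used above: discrimination (AUC), the observed-to-predicted ratio, and the calibration slope, computed as a logistic regression of outcomes on the log-odds of the predicted risks. The predicted risks and outcomes below are simulated placeholders in which the tool under-predicts by about 30%.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
p_pred = np.clip(rng.beta(1, 30, size=50_000), 1e-4, 1 - 1e-4)  # predicted 5-yr risks
y = rng.random(50_000) < p_pred * 1.3   # simulated outcomes: tool under-predicts

print("AUC:", round(roc_auc_score(y, p_pred), 3))
print("observed/predicted:", round(y.mean() / p_pred.mean(), 2))
logit = np.log(p_pred / (1 - p_pred)).reshape(-1, 1)
slope = LogisticRegression().fit(logit, y).coef_[0, 0]
print("calibration slope:", round(slope, 2))   # 1.0 indicates perfect calibration
```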
Boundary Layer Transition Results From STS-114
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Horvath, Thomas J.; Cassady, Amy M.; Kirk, Benjamin S.; Wang, K. C.; Hyatt, Andrew J.
2006-01-01
The tool for predicting the onset of boundary layer transition from damage to and/or repair of the thermal protection system developed in support of Shuttle Return to Flight is compared to the STS-114 flight results. The Boundary Layer Transition (BLT) Tool is part of a suite of tools that analyze the aerothermodynamic environment of the local thermal protection system to allow informed disposition of damage for making recommendations to fly as is or to repair. Using mission specific trajectory information and details of each damage site or repair, the expected time of transition onset is predicted to help determine the proper aerothermodynamic environment to use in the subsequent thermal and stress analysis of the local structure. The boundary layer transition criteria utilized for the tool was developed from ground-based measurements to account for the effect of both protuberances and cavities and has been calibrated against flight data. Computed local boundary layer edge conditions provided the means to correlate the experimental results and then to extrapolate to flight. During STS-114, the BLT Tool was utilized and was part of the decision making process to perform an extravehicular activity to remove the large gap fillers. The role of the BLT Tool during this mission, along with the supporting information that was acquired for the on-orbit analysis, is reviewed. Once the large gap fillers were removed, all remaining damage sites were cleared for reentry as is. Post-flight analysis of the transition onset time revealed excellent agreement with BLT Tool predictions.
External validation of a simple clinical tool used to predict falls in people with Parkinson disease
Duncan, Ryan P.; Cavanaugh, James T.; Earhart, Gammon M.; Ellis, Terry D.; Ford, Matthew P.; Foreman, K. Bo; Leddy, Abigail L.; Paul, Serene S.; Canning, Colleen G.; Thackeray, Anne; Dibble, Leland E.
2015-01-01
Background: Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. Methods: We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. Results: The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76–0.89), comparable to the developmental study. Conclusion: The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. PMID:26003412
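The three-predictor tool lends itself to a very small sketch: each predictor contributes to a simple count. The count-based form is an illustrative assumption; the published tool maps the predictors to fall probabilities.

```python
def fall_risk_score(fell_last_year: bool, freezing_last_month: bool,
                    gait_velocity_ms: float) -> int:
    """Count how many of the three published predictors are present."""
    return (int(fell_last_year) + int(freezing_last_month)
            + int(gait_velocity_ms < 1.1))

print(fall_risk_score(True, False, 1.0))  # -> 2 of 3 predictors present
```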
Duncan, Ryan P; Cavanaugh, James T; Earhart, Gammon M; Ellis, Terry D; Ford, Matthew P; Foreman, K Bo; Leddy, Abigail L; Paul, Serene S; Canning, Colleen G; Thackeray, Anne; Dibble, Leland E
2015-08-01
Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time consuming component of patient care. Recently a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76-0.89), comparable to the developmental study. The results validated the utility of the tool for allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. Copyright © 2015 Elsevier Ltd. All rights reserved.
RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.
Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab
2012-01-01
RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages), and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available. http://www.cemb.edu.pk/sw.html RDNAnalyzer - Random DNA Analyser, GUI - Graphical user interface, XAML - Extensible Application Markup Language.
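For orientation, the core recursion that RDNAnalyzer builds on can be sketched as below: the textbook Nussinov base-pair-maximisation algorithm in Python. The abstract does not describe RDNAnalyzer's extensions, so none are attempted here, and the example sequence is arbitrary.

```python
def pairs(a: str, b: str) -> bool:
    """Watson-Crick complementarity for DNA."""
    return {a, b} in ({"A", "T"}, {"G", "C"})

def nussinov_max_pairs(seq: str, min_loop: int = 3) -> int:
    """Maximum number of nested complementary base pairs (textbook Nussinov)."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]  # dp[i][j] = best pair count for seq[i..j]
    for span in range(min_loop + 1, n):       # span = j - i
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]               # case: position j unpaired
            for k in range(i, j - min_loop):  # case: j pairs with k
                if pairs(seq[k], seq[j]):
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1] if n else 0

print(nussinov_max_pairs("GGGAAATTTCCC"))  # maximum pair count for a toy sequence
```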
William Elliot; David Hall
2005-01-01
The Water Erosion Prediction Project (WEPP) Fuel Management (FuMe) tool was developed to estimate sediment generated by fuel management activities. WEPP FuMe estimates sediment generated for 12 fuel-related conditions from a single input. This fact sheet identifies the intended users and uses, required inputs, what the model does, and tells the user how to obtain the...
NASA Astrophysics Data System (ADS)
gochis, David; hooper, Rick; parodi, Antonio; Jha, Shantenu; Yu, Wei; Zaslavsky, Ilya; Ganapati, Dinesh
2014-05-01
The community WRF-Hydro system is currently being used in a variety of flood prediction and regional hydroclimate impacts assessment applications around the world. Despite its increasingly wide use, certain cyberinfrastructure bottlenecks exist in the setup, execution and post-processing of WRF-Hydro model runs. These bottlenecks result in wasted time, labor, data transfer bandwidth and computational resource use. Appropriate development and use of cyberinfrastructure to set up and manage WRF-Hydro modeling applications will streamline the entire workflow of hydrologic model predictions. This talk will present recent advances in the development and use of new open-source cyberinfrastructure tools for the WRF-Hydro architecture. These tools include new web-accessible pre-processing applications, supercomputer job management applications and automated verification and visualization applications. The tools will be described successively and then demonstrated in a set of flash flood use cases for recent destructive flood events in the U.S. and in Europe. Throughout, an emphasis on the implementation and use of community data standards for data exchange is made.
Novel inter and intra prediction tools under consideration for the emerging AV1 video codec
NASA Astrophysics Data System (ADS)
Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil
2017-09-01
Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next edition codec AV1, in a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter and combined inter-intra prediction. Results are presented on standard test sets.
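To make the notion of block prediction concrete, the sketch below implements a DC-style intra predictor, one of the simplest intra modes in the VP9/AV1 codec family: the block is filled with the mean of its reconstructed neighbours. This is an illustrative simplification, not AV1's bit-exact rule, which adds rounding and edge-availability logic.

```python
import numpy as np

def dc_intra_predict(above: np.ndarray, left: np.ndarray, size: int) -> np.ndarray:
    """Fill a size x size block with the mean of the reconstructed
    neighbour pixels above and to the left (the DC intra idea)."""
    dc = int(round((above.sum() + left.sum()) / (above.size + left.size)))
    return np.full((size, size), dc, dtype=np.uint8)

above = np.array([120, 122, 125, 127], dtype=np.uint8)  # row above the block
left = np.array([118, 119, 121, 124], dtype=np.uint8)   # column left of the block
print(dc_intra_predict(above, left, 4))
```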
Bankruptcy Prevention: New Effort to Reflect on Legal and Social Changes.
Kliestik, Tomas; Misankova, Maria; Valaskova, Katarina; Svabova, Lucia
2018-04-01
Every corporation has an economic and moral responsibility to its stockholders to perform well financially. However, the number of bankruptcies in Slovakia has been growing for several years without an apparent macroeconomic cause. To prevent a rapid deterioration and to prevent the outflow of foreign capital, various efforts are being zealously implemented. Robust analysis using conventional bankruptcy prediction tools revealed that the existing models are adaptable to local conditions, particularly local legislation. Furthermore, it was confirmed that most of these outdated tools have sufficient capability to warn of impending financial problems several years in advance. A novel bankruptcy prediction tool that outperforms the conventional models was developed. However, it is increasingly challenging to predict bankruptcy risk as corporations have become more global and more complex and as they have developed sophisticated schemes to hide their actual situations under the guise of "optimization" for tax authorities. Nevertheless, scepticism remains because economic engineers have established bankruptcy as a strategy to limit the liability resulting from court-imposed penalties.
The development of conceptual and predictive models is an important tool to guide site characterization in support of monitoring contaminants in ground water. The accuracy of predictive models is limited by the adequacy of the input data and the assumptions made to constrain mod...
NASA Technical Reports Server (NTRS)
West, Jeff; Strutzenberg, Louise L.; Putnam, Gabriel C.; Liever, Peter A.; Williams, Brandon R.
2012-01-01
This paper presents development efforts to establish modeling capabilities for launch vehicle liftoff acoustics and ignition transient environment predictions. Peak acoustic loads experienced by the launch vehicle occur during liftoff with strong interaction between the vehicle and the launch facility. Acoustic prediction engineering tools based on empirical models are of limited value in efforts to proactively design and optimize launch vehicles and launch facility configurations for liftoff acoustics. Modeling approaches are needed that capture the important details of the plume flow environment including the ignition transient, identify the noise generation sources, and allow assessment of the effects of launch pad geometric details and acoustic mitigation measures such as water injection. This paper presents the status of the CFD tools developed by the MSFC Fluid Dynamics Branch featuring advanced multi-physics modeling capabilities developed towards this goal. Validation and application examples are presented, along with an overview of the tools' application in the prediction of liftoff environments and in the design of targeted mitigation measures such as launch pad configuration and sound suppression water placement.
D-MATRIX: A web tool for constructing weight matrix of conserved DNA motifs
Sen, Naresh; Mishra, Manoj; Khan, Feroz; Meena, Abha; Sharma, Ashok
2009-01-01
Despite considerable efforts to date, DNA motif prediction in whole genomes remains a challenge for researchers. Currently, genome-wide motif prediction tools require either a direct pattern sequence (for a single motif) or a weight matrix (for multiple motifs). Although there are known motif pattern databases and tools for genome-level prediction, there is no tool for weight matrix construction. Considering this, we developed a D-MATRIX tool which predicts the different types of weight matrix based on a user-defined aligned motif sequence set and motif width. For retrieval of known motif sequences the user can access the commonly used databases such as TFD, RegulonDB, DBTBS, Transfac. The D-MATRIX program uses a simple statistical approach for weight matrix construction, which can be converted into different file formats according to user requirement. It provides the possibility to identify the conserved motifs in co-regulated genes or a whole genome. As an example, we successfully constructed the weight matrix of the LexA transcription factor binding site with the help of known sos-box cis-regulatory elements in the Deinococcus radiodurans genome. The algorithm is implemented in C-Sharp and wrapped in ASP.Net to maintain a user-friendly web interface. The D-MATRIX tool is accessible through the CIMAP domain network. Availability http://203.190.147.116/dmatrix/ PMID:19759861
D-MATRIX: a web tool for constructing weight matrix of conserved DNA motifs.
Sen, Naresh; Mishra, Manoj; Khan, Feroz; Meena, Abha; Sharma, Ashok
2009-07-27
Despite considerable efforts to date, DNA motif prediction in whole genomes remains a challenge for researchers. Currently, genome-wide motif prediction tools require either a direct pattern sequence (for a single motif) or a weight matrix (for multiple motifs). Although there are known motif pattern databases and tools for genome-level prediction, there is no tool for weight matrix construction. Considering this, we developed a D-MATRIX tool which predicts the different types of weight matrix based on a user-defined aligned motif sequence set and motif width. For retrieval of known motif sequences the user can access the commonly used databases such as TFD, RegulonDB, DBTBS, Transfac. The D-MATRIX program uses a simple statistical approach for weight matrix construction, which can be converted into different file formats according to user requirement. It provides the possibility to identify the conserved motifs in co-regulated genes or a whole genome. As an example, we successfully constructed the weight matrix of the LexA transcription factor binding site with the help of known sos-box cis-regulatory elements in the Deinococcus radiodurans genome. The algorithm is implemented in C-Sharp and wrapped in ASP.Net to maintain a user-friendly web interface. The D-MATRIX tool is accessible through the CIMAP domain network. http://203.190.147.116/dmatrix/
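The abstract describes only "a simple statistical approach" for weight matrix construction; one common choice is a pseudocount-smoothed log-odds position weight matrix, sketched below. The aligned sites are hypothetical stand-ins for real sos-box elements, and the exact statistic D-MATRIX uses may differ.

```python
import math

def weight_matrix(aligned, background=0.25, pseudocount=0.5):
    """Position weight matrix (log2-odds vs. uniform background)
    from equal-length aligned motif sequences."""
    width = len(aligned[0])
    bases = "ACGT"
    pwm = []
    for pos in range(width):
        col = [s[pos] for s in aligned]
        row = {}
        for b in bases:
            freq = (col.count(b) + pseudocount) / (len(col) + 4 * pseudocount)
            row[b] = round(math.log2(freq / background), 3)
        pwm.append(row)
    return pwm

sites = ["TACTGTATATATATACAGTA",  # hypothetical sos-box-like sites
         "TACTGTATGTATATACAGTA"]
for row in weight_matrix(sites)[:3]:  # first three positions
    print(row)
```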
Identifying gnostic predictors of the vaccine response.
Haining, W Nicholas; Pulendran, Bali
2012-06-01
Molecular predictors of the response to vaccination could transform vaccine development. They would allow larger numbers of vaccine candidates to be rapidly screened, shortening the development time for new vaccines. Gene-expression based predictors of vaccine response have shown early promise. However, a limitation of gene-expression based predictors is that they often fail to reveal the mechanistic basis of their ability to classify response. Linking predictive signatures to the function of their component genes would advance basic understanding of vaccine immunity and also improve the robustness of vaccine prediction. New analytic tools now allow more biological meaning to be extracted from predictive signatures. Functional genomic approaches to perturb gene expression in mammalian cells permit the function of predictive genes to be surveyed in highly parallel experiments. The challenge for vaccinologists is therefore to use these tools to embed mechanistic insights into predictors of vaccine response. Copyright © 2012 Elsevier Ltd. All rights reserved.
Towards a National Space Weather Predictive Capability
NASA Astrophysics Data System (ADS)
Fox, N. J.; Lindstrom, K. L.; Ryschkewitsch, M. G.; Anderson, B. J.; Gjerloev, J. W.; Merkin, V. G.; Kelly, M. A.; Miller, E. S.; Sitnov, M. I.; Ukhorskiy, A. Y.; Erlandson, R. E.; Barnes, R. J.; Paxton, L. J.; Sotirelis, T.; Stephens, G.; Comberiate, J.
2014-12-01
National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review datasets, tools and models that have resulted from research by scientists at JHU/APL, and examine how they could be applied to support space weather applications in coordination with other community assets and capabilities.
Fuzzy regression modeling for tool performance prediction and degradation detection.
Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L
2010-10-01
In this paper, the viability of using the Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy in prediction and rate of convergence. The efficacy of the proposed FRM is tested through a case study, namely to predict the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRc. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior as compared with conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.
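The full FRM stack (SOM clustering feeding a fuzzy inference engine over multiple regression models) is not reproduced here, but its core idea of converting a nonlinear problem into locally linear models can be sketched with k-means standing in for the SOM and one linear model per cluster:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(300, 1))            # e.g. cutting time
y = np.sin(X[:, 0]) + 0.1 * X[:, 0] ** 2         # synthetic nonlinear "wear" response

# Partition the input space, then fit one linear model per partition.
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
models = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
          for c in range(5)}

def predict(x: np.ndarray) -> float:
    c = km.predict(x.reshape(1, -1))[0]          # pick the local model
    return models[c].predict(x.reshape(1, -1))[0]

print(round(predict(np.array([3.0])), 3))
```

A fuzzy inference engine would blend neighbouring local models instead of switching hard at cluster boundaries; the hard switch here is a deliberate simplification.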
Application of predictive modelling techniques in industry: from food design up to risk assessment.
Membré, Jeanne-Marie; Lambert, Ronald J W
2008-11-30
In this communication, examples of applications of predictive microbiology in industrial contexts (i.e. Nestlé and Unilever) are presented which cover a range of applications in food safety from formulation and process design to consumer safety risk assessment. A tailor-made, private expert system, developed to support safe product/process design assessment is introduced as an example of how predictive models can be deployed for use by non-experts. Its use in conjunction with other tools and software available in the public domain is discussed. Specific applications of predictive microbiology techniques are presented relating to investigations of either growth or limits to growth with respect to product formulation or process conditions. An example of a probabilistic exposure assessment model for chilled food application is provided and its potential added value as a food safety management tool in an industrial context is weighed against its disadvantages. The role of predictive microbiology in the suite of tools available to food industry and some of its advantages and constraints are discussed.
Mallika, V; Sivakumar, K C; Jaichand, S; Soniya, E V
2010-07-13
Type III polyketide synthases (PKS) are a family of proteins considered to have significant roles in the biosynthesis of various polyketides in plants, fungi and bacteria. As these proteins show positive effects on human health, research on this particular protein family continues to grow. Developing a tool to identify the probability of a sequence being a type III polyketide synthase will minimize time consumption and manpower efforts. In this approach, we have designed and implemented PKSIIIpred, a high performance prediction server for type III PKS where the classifier is Support Vector Machines (SVMs). Based on the limited training dataset, the tool efficiently predicts the type III PKS superfamily of proteins with high sensitivity and specificity. PKSIIIpred is available at http://type3pks.in/prediction/. We expect that this tool may serve as a useful resource for type III PKS researchers. Work is currently in progress to further improve prediction accuracy by including more sequence features in the training dataset.
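The abstract does not list PKSIIIpred's input features; a common baseline for SVM-based protein family classifiers is amino acid composition, as in this sketch. The training sequences are hypothetical placeholders, far too few for a real model.

```python
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(seq: str) -> np.ndarray:
    """20-dimensional amino acid composition feature vector."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AA])

# Hypothetical toy data: replace with curated type III PKS / non-PKS sets.
pos = ["MKAAVLGIGTAVPEH", "MASVEEIRKAQRAEGPA"]
neg = ["MKKLLLLLAGSTSTST", "MNDEEDDEEDDEE"]
X = np.array([aa_composition(s) for s in pos + neg])
y = np.array([1, 1, 0, 0])  # 1 = type III PKS, 0 = other

clf = SVC(kernel="rbf").fit(X, y)
query = aa_composition("MKAVLGIGGTAVPDH").reshape(1, -1)
print(clf.predict(query), clf.decision_function(query))
```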
Primer on consumer marketing research : procedures, methods, and tools
DOT National Transportation Integrated Search
1994-03-01
The Volpe Center developed a marketing research primer which provides a guide to the approach, procedures, and research tools used by private industry in predicting consumer response. The final two chapters of the primer focus on the challenges of do...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Evan
There exist hundreds of building energy software tools, both web- and disk-based. These tools exhibit considerable range in approach and creativity, with some being highly specialized and others able to consider the building as a whole. However, users are faced with a dizzying array of choices and, often, conflicting results. The fragmentation of development and deployment efforts has hampered tool quality and market penetration. The purpose of this review is to provide information for defining the desired characteristics of residential energy tools, and to encourage future tool development that improves on current practice. This project entails (1) creating a framework for describing possible technical and functional characteristics of such tools, (2) mapping existing tools onto this framework, (3) exploring issues of tool accuracy, and (4) identifying "best practice" and strategic opportunities for tool design. We evaluated 50 web-based residential calculators, 21 of which we regard as "whole-house" tools (i.e., covering a range of end uses). Of the whole-house tools, 13 provide open-ended energy calculations, 5 normalize the results to actual costs (a.k.a. "bill-disaggregation tools"), and 3 provide both options. Across the whole-house tools, we found a range of 5 to 58 house-descriptive features (out of 68 identified in our framework) and 2 to 41 analytical and decision-support features (55 possible). We also evaluated 15 disk-based residential calculators, six of which are whole-house tools. Of these tools, 11 provide open-ended calculations, 1 normalizes the results to actual costs, and 3 provide both options. These tools offered ranges of 18 to 58 technical features (70 possible) and 10 to 40 user- and decision-support features (56 possible). The comparison shows that such tools can employ many approaches and levels of detail. Some tools require a relatively small number of well-considered inputs while others ask a myriad of questions and still miss key issues. The value of detail has a lot to do with the type of question(s) being asked by the user (e.g., the availability of dozens of miscellaneous appliances is immaterial for a user attempting to evaluate the potential for space-heating savings by installing a new furnace). More detail does not, according to our evaluation, automatically translate into a "better" or "more accurate" tool. Efforts to quantify and compare the "accuracy" of these tools are difficult at best, and prior tool-comparison studies have not undertaken this in a meaningful way. The ability to evaluate accuracy is inherently limited by the availability of measured data. Furthermore, certain tool outputs can only be measured against "actual" values that are themselves calculated (e.g., HVAC sizing), while others are rarely if ever available (e.g., measured energy use or savings for specific measures). Similarly challenging is to understand the sources of inaccuracies. There are many ways in which quantitative errors can occur in tools, ranging from programming errors to problems inherent in a tool's design. Due to hidden assumptions and non-variable "defaults", most tools cannot be fully tested across the desirable range of building configurations, operating conditions, weather locations, etc. Many factors conspire to confound performance comparisons among tools. Differences in inputs can range from weather city, to types of HVAC systems, to appliance characteristics, to occupant-driven effects such as thermostat management.
Differences in results would thus no doubt emerge from an extensive comparative exercise, but the sources or implications of these differences for the purposes of accuracy evaluation or tool development would remain largely unidentifiable (especially given the paucity of technical documentation available for most tools). For the tools that we tested, the predicted energy bills for a single test building ranged widely (by nearly a factor of three), and far more so at the end-use level. Most tools over-predicted energy bills and all over-predicted consumption. Variability was lower among disk-based tools, but they more significantly over-predicted actual use. The deviations (over-predictions) we observed from actual bills corresponded to up to $1400 per year (approx. 250 percent of the actual bills). For bill-disaggregation tools, wherein the results are forced to equal actual bills, the accuracy issue shifts to whether or not the total is properly attributed to the various end uses and to whether savings calculations are done accurately (a challenge that demands relatively rare end-use data). Here, too, we observed a number of dubious results. Energy savings estimates automatically generated by the web-based tools varied from $46/year (5 percent of predicted use) to $625/year (52 percent of predicted use).
Status of Technology Development to enable Large Stable UVOIR Space Telescopes
NASA Astrophysics Data System (ADS)
Stahl, H. Philip; MSFC AMTD Team
2017-01-01
NASA MSFC has two funded Strategic Astrophysics Technology projects to develop technology for potential future large missions: AMTD and PTC. The Advanced Mirror Technology Development (AMTD) project is developing technology to make mechanically stable mirrors for a 4-meter or larger UVOIR space telescope. AMTD is demonstrating this technology by making a 1.5 meter diameter x 200 mm thick ULE® mirror that is 1/3rd scale of a full size 4-m mirror. AMTD is characterizing the mechanical and thermal performance of this mirror and of a 1.2-meter Zerodur® mirror to validate integrated modeling tools. Additionally, AMTD has developed integrated modeling tools which are being used to evaluate primary mirror systems for a potential Habitable Exoplanet Mission and to analyze the interaction between optical telescope wavefront stability and coronagraph contrast leakage. The Predictive Thermal Control (PTC) project is developing technology to enable high stability thermal wavefront performance by using integrated modeling tools to predict and actively control the thermal environment of a 4-m or larger UVOIR space telescope.
An integrative approach to ortholog prediction for disease-focused and other functional studies.
Hu, Yanhui; Flockhart, Ian; Vinayagam, Arunachalam; Bergwitz, Clemens; Berger, Bonnie; Perrimon, Norbert; Mohr, Stephanie E
2011-08-31
Mapping of orthologous genes among species serves an important role in functional genomics by allowing researchers to develop hypotheses about gene function in one species based on what is known about the functions of orthologs in other species. Several tools for predicting orthologous gene relationships are available. However, these tools can give different results and identification of predicted orthologs is not always straightforward. We report a simple but effective tool, the Drosophila RNAi Screening Center Integrative Ortholog Prediction Tool (DIOPT; http://www.flyrnai.org/diopt), for rapid identification of orthologs. DIOPT integrates existing approaches, facilitating rapid identification of orthologs among human, mouse, zebrafish, C. elegans, Drosophila, and S. cerevisiae. As compared to individual tools, DIOPT shows increased sensitivity with only a modest decrease in specificity. Moreover, the flexibility built into the DIOPT graphical user interface allows researchers with different goals to appropriately 'cast a wide net' or limit results to highest confidence predictions. DIOPT also displays protein and domain alignments, including percent amino acid identity, for predicted ortholog pairs. This helps users identify the most appropriate matches among multiple possible orthologs. To facilitate using model organisms for functional analysis of human disease-associated genes, we used DIOPT to predict high-confidence orthologs of disease genes in Online Mendelian Inheritance in Man (OMIM) and genes in genome-wide association study (GWAS) data sets. The results are accessible through the DIOPT diseases and traits query tool (DIOPT-DIST; http://www.flyrnai.org/diopt-dist). DIOPT and DIOPT-DIST are useful resources for researchers working with model organisms, especially those who are interested in exploiting model organisms such as Drosophila to study the functions of human disease genes.
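DIOPT's integration step can be pictured as a vote count: an ortholog pair's score is the number of underlying tools that predict it. The sketch below uses hypothetical tool names and gene pairs; DIOPT itself integrates many more sources and adds alignment displays on top.

```python
from collections import Counter

# Hypothetical per-tool outputs: tool name -> set of (human, fly) ortholog pairs.
predictions = {
    "toolA": {("TP53", "p53"), ("EGFR", "Egfr")},
    "toolB": {("TP53", "p53"), ("EGFR", "spi")},
    "toolC": {("TP53", "p53"), ("EGFR", "Egfr")},
}

# Score each pair by how many tools support it (the DIOPT-style vote).
votes = Counter(pair for pairs in predictions.values() for pair in pairs)
for pair, score in votes.most_common():
    print(pair, f"{score}/{len(predictions)} tools agree")
```

Thresholding on the vote lets a user either "cast a wide net" (score >= 1) or keep only high-confidence calls (score equal to the number of tools).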
Pharmacological mechanism-based drug safety assessment and prediction.
Abernethy, D R; Woodcock, J; Lesko, L J
2011-06-01
Advances in cheminformatics, bioinformatics, and pharmacology in the context of biological systems are now at a point that these tools can be applied to mechanism-based drug safety assessment and prediction. The development of such predictive tools at the US Food and Drug Administration (FDA) will complement ongoing efforts in drug safety that are focused on spontaneous adverse event reporting and active surveillance to monitor drug safety. This effort will require the active collaboration of scientists in the pharmaceutical industry, academe, and the National Institutes of Health, as well as those at the FDA, to reach its full potential. Here, we describe the approaches and goals for the mechanism-based drug safety assessment and prediction program.
Huang, Sheng-Feng; Chang, Jung-San; Sheu, Chau-Chyun; Liu, Yu-Ting; Lin, Ying-Chi
2016-09-01
Pneumonia is a leading cause of death in medical intensive care units (MICUs). Delayed or inappropriate antibiotic therapy largely increases morbidity and mortality. Multidrug-resistant (MDR) micro-organisms are major reasons for inappropriate antibiotic use. Currently there is no good antibiotic decision-making tool designed for critically ill patients. The objective of this study was to develop a convenient MDR prediction scoring system for patients admitted to MICUs with pneumonia. A retrospective cohort study was conducted using databases and chart reviews of pneumonia patients admitted to a 30-bed MICU from 2012 to 2013. Forward logistic regression was applied to identify independent MDR risk factors for prediction tool development. A total of 283 pneumonia episodes from 263 patients with positive cultures from blood or respiratory secretions were recruited, of which 154 (54.4%) were MDR episodes. Long-term ventilation (OR = 11.09; P = 0.026), residence in a long-term care facility (OR = 2.50; P = 0.005), MDR infection/colonisation during the preceding 90 days (OR = 2.08; P = 0.041), current hospitalisation ≥2 days (OR = 1.98; P = 0.019) and stroke (OR = 1.81; P = 0.035) were identified as independent predictors for MDR pneumonia. The area under the ROC curve of this prediction tool was much higher than that of ATS/IDSA classification (0.69 vs. 0.54; P <0.001). The prediction accuracy of this tool with risk score ≥1 for MDR infections was 63.7%. This simple five-item, one-step scoring tool for critically ill patients admitted to the MICU could help physicians provide timely appropriate empirical antibiotics. Copyright © 2016 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
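The abstract reports the five items and the decision cut-off (risk score >=1) but not the per-item weights, so the sketch below assumes unit weights; the published tool may weight items differently.

```python
def mdr_risk_score(long_term_ventilation: bool,
                   long_term_care_resident: bool,
                   mdr_history_90d: bool,
                   hospitalized_2d_or_more: bool,
                   stroke: bool) -> int:
    """Sum of the five binary predictors reported in the abstract.
    Unit weights are an assumption made for illustration."""
    items = (long_term_ventilation, long_term_care_resident,
             mdr_history_90d, hospitalized_2d_or_more, stroke)
    return sum(items)

score = mdr_risk_score(False, True, False, True, False)
print(score, "-> consider MDR coverage" if score >= 1 else "-> low MDR risk")
```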
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zi-Kui; Gleeson, Brian; Shang, Shunli
This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram), and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description included composition ranges typical for coating alloys and, hence, allows for prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fraction, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile and environmentally-protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.
DEMO: Sequence Alignment to Predict Across Species Susceptibility
The US Environmental Protection Agency Sequence Alignment to Predict Across Species Susceptibility tool (SeqAPASS; https://seqapass.epa.gov/seqapass/) was developed to comparatively evaluate protein sequence and structural similarity across species as a means to extrapolate toxic...
Nouretdinov, Ilia; Costafreda, Sergi G; Gammerman, Alexander; Chervonenkis, Alexey; Vovk, Vladimir; Vapnik, Vladimir; Fu, Cynthia H Y
2011-05-15
There is rapidly accumulating evidence that the application of machine learning classification to neuroimaging measurements may be valuable for the development of diagnostic and prognostic prediction tools in psychiatry. However, current methods do not produce a measure of the reliability of the predictions. Knowing the risk of the error associated with a given prediction is essential for the development of neuroimaging-based clinical tools. We propose a general probabilistic classification method to produce measures of confidence for magnetic resonance imaging (MRI) data. We describe the application of transductive conformal predictor (TCP) to MRI images. TCP generates the most likely prediction and a valid measure of confidence, as well as the set of all possible predictions for a given confidence level. We present the theoretical motivation for TCP, and we have applied TCP to structural and functional MRI data in patients and healthy controls to investigate diagnostic and prognostic prediction in depression. We verify that TCP predictions are as accurate as those obtained with more standard machine learning methods, such as support vector machine, while providing the additional benefit of a valid measure of confidence for each prediction. Copyright © 2010 Elsevier Inc. All rights reserved.
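TCP refits the model for every candidate label of every test case; the sketch below uses the simpler split (inductive) conformal variant to show the shared mechanics: calibration nonconformity scores define a threshold, and the prediction set contains every label no more nonconforming than that threshold.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def conformal_prediction_set(clf, X_cal, y_cal, x_new, alpha=0.1):
    """Split-conformal prediction set at confidence level 1 - alpha."""
    # Nonconformity: 1 - probability assigned to the true class.
    cal_scores = 1 - clf.predict_proba(X_cal)[np.arange(len(y_cal)), y_cal]
    level = np.ceil((len(y_cal) + 1) * (1 - alpha)) / len(y_cal)
    q = np.quantile(cal_scores, level, method="higher")
    probs = clf.predict_proba(x_new.reshape(1, -1))[0]
    return [c for c, p in zip(clf.classes_, probs) if 1 - p <= q]

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))                      # synthetic "imaging features"
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)
clf = LogisticRegression().fit(X[:100], y[:100])   # proper training set
print(conformal_prediction_set(clf, X[100:], y[100:], X[0]))
```

A singleton set is a confident prediction; a two-label set signals that, at the requested confidence, the case cannot be classified reliably, which is exactly the kind of risk information the abstract argues clinical tools need.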
Guidelines for reporting and using prediction tools for genetic variation analysis.
Vihinen, Mauno
2013-02-01
Computational prediction methods are widely used for the analysis of human genome sequence variants and their effects on gene/protein function, splice site aberration, pathogenicity, and disease risk. New methods are frequently developed. We believe that guidelines are essential for those writing articles about new prediction methods, as well as for those applying these tools in their research, so that the necessary details are reported. This will enable readers to gain the full picture of technical information, performance, and interpretation of results, and to facilitate comparisons of related methods. Here, we provide instructions on how to describe new methods, report datasets, and assess the performance of predictive tools. We also discuss what details of predictor implementation are essential for authors to understand. Similarly, these guidelines for the use of predictors provide instructions on what needs to be delineated in the text, as well as how researchers can avoid unwarranted conclusions. They are applicable to most prediction methods currently utilized. By applying these guidelines, authors will help reviewers, editors, and readers to more fully comprehend prediction methods and their use. © 2012 Wiley Periodicals, Inc.
GAP Final Technical Report 12-14-04
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrew J. Bordner, PhD, Senior Research Scientist
2004-12-14
The Genomics Annotation Platform (GAP) was designed to develop new tools for high throughput functional annotation and characterization of protein sequences and structures resulting from genomics and structural proteomics, benchmarking and application of those tools. Furthermore, this platform integrated the genomic scale sequence and structural analysis and prediction tools with the advanced structure prediction and bioinformatics environment of ICM. The development of GAP was primarily oriented towards the annotation of new biomolecular structures using both structural and sequence data. Even though the amount of protein X-ray crystal data is growing exponentially, the volume of sequence data is growing even more rapidly. This trend was exploited by leveraging the wealth of sequence data to provide functional annotation for protein structures. The additional information provided by GAP is expected to assist the majority of the commercial users of ICM, who are involved in drug discovery, in identifying promising drug targets as well as in devising strategies for the rational design of therapeutics directed at the protein of interest. The GAP also provided valuable tools for biochemistry education, and structural genomics centers. In addition, GAP incorporates many novel prediction and analysis methods not available in other molecular modeling packages. This development led to signing the first Molsoft agreement in the structural genomics annotation area with the University of Oxford Structural Genomics Center. This commercial agreement validated the Molsoft efforts under the GAP project and provided the basis for further development of the large scale functional annotation platform.
Development and Application of Predictive Tools for MHD Stability Limits in Tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brennan, Dylan; Miller, G. P.
This is a project to develop and apply analytic and computational tools to answer physics questions relevant to the onset of non-ideal magnetohydrodynamic (MHD) instabilities in toroidal magnetic confinement plasmas. The focused goal of the research is to develop predictive tools for these instabilities, including an inner layer solution algorithm, a resistive wall with control coils, and energetic particle effects. The production phase compares studies of instabilities in such systems using analytic techniques, PEST-III and NIMROD. Two important physics puzzles are targeted as guiding thrusts for the analyses. The first is to form an accurate description of the physics determining whether the resistive wall mode or a tearing mode will appear first as β is increased at low rotation and low error fields in DIII-D. The second is to understand the physical mechanism behind recent NIMROD results indicating strong damping and stabilization from energetic particle effects on linear resistive modes. The work seeks to develop a highly relevant predictive tool for ITER, advance the theoretical description of this physics in general, and analyze these instabilities in experiments such as ASDEX Upgrade, DIII-D, JET, JT-60U and NSTX. The awardee on this grant is the University of Tulsa. The research efforts are supervised principally by Dr. Brennan. Support is included for two graduate students, and a strong collaboration with Dr. John M. Finn of LANL. The work includes several ongoing collaborations with General Atomics, PPPL, and the NIMROD team, among others.
Müller, Daniel J.; Ng, Chee H.; Byron, Keith; Berk, Michael; Singh, Ajeet B.
2017-01-01
Background Pharmacogenetic-based dosing support tools have been developed to personalize antidepressant-prescribing practice. However, the clinical validity of these tools has not been adequately tested, particularly for specific antidepressants. Objective To examine the concordance between the actual dose and a polygene pharmacogenetic predicted dose of desvenlafaxine needed to achieve symptom remission. Materials and methods A 10-week, open-label, prospective trial of desvenlafaxine among Caucasian adults with major depressive disorder (n=119) was conducted. Dose was clinically adjusted and at the completion of the trial, the clinical dose needed to achieve remission was compared with the predicted dose needed to achieve remission. Results Among remitters (n=95), there was a strong concordance (Kendall's τ-b=0.84, P=0.0001; Cohen's κ=0.82, P=0.0001) between the actual and the predicted dose needed to achieve symptom remission, showing high sensitivity (≥85%), specificity (≥86%), and accuracy (≥89%) of the tool. Conclusion Findings provide initial evidence for the clinical validity of a polygene pharmacogenetic-based tool for desvenlafaxine dosing. PMID:27779571
Bousman, Chad A; Müller, Daniel J; Ng, Chee H; Byron, Keith; Berk, Michael; Singh, Ajeet B
2017-01-01
Pharmacogenetic-based dosing support tools have been developed to personalize antidepressant-prescribing practice. However, the clinical validity of these tools has not been adequately tested, particularly for specific antidepressants. To examine the concordance between the actual dose and a polygene pharmacogenetic predicted dose of desvenlafaxine needed to achieve symptom remission. A 10-week, open-label, prospective trial of desvenlafaxine among Caucasian adults with major depressive disorder (n=119) was conducted. Dose was clinically adjusted and at the completion of the trial, the clinical dose needed to achieve remission was compared with the predicted dose needed to achieve remission. Among remitters (n=95), there was a strong concordance (Kendall's τ-b=0.84, P=0.0001; Cohen's κ=0.82, P=0.0001) between the actual and the predicted dose needed to achieve symptom remission, showing high sensitivity (≥85%), specificity (≥86%), and accuracy (≥89%) of the tool. Findings provide initial evidence for the clinical validity of a polygene pharmacogenetic-based tool for desvenlafaxine dosing.
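Both concordance statistics reported above are available in standard Python libraries; the dose categories below are hypothetical stand-ins for the actual and pharmacogenetically predicted doses.

```python
from scipy.stats import kendalltau
from sklearn.metrics import cohen_kappa_score

# Hypothetical ordinal dose categories (mg) for eight remitters.
actual    = [50, 100, 100, 50, 200, 100, 50, 200]
predicted = [50, 100, 100, 50, 200, 50, 50, 200]

tau, p = kendalltau(actual, predicted)        # scipy computes tau-b by default
kappa = cohen_kappa_score(actual, predicted)  # chance-corrected agreement
print(f"Kendall tau-b = {tau:.2f} (p = {p:.4f}), Cohen kappa = {kappa:.2f}")
```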
GAPIT version 2: an enhanced integrated tool for genomic association and prediction
USDA-ARS?s Scientific Manuscript database
Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...
Feasibility of lane closures using probe data : technical brief.
DOT National Transportation Integrated Search
2017-04-01
This study developed an on-line system analysis tool called the Work Zone Interactive : Management Application - Planning (WIMAP-P), an easy-to-use and easy-to-learn tool for : predicting the traffic impact caused by work zone lane closures on freewa...
Nutritional Risk in Emergency-2017: A New Simplified Proposal for a Nutrition Screening Tool.
Marcadenti, Aline; Mendes, Larissa Loures; Rabito, Estela Iraci; Fink, Jaqueline da Silva; Silva, Flávia Moraes
2018-03-13
There are many nutrition screening tools currently being applied in hospitals to identify risk of malnutrition. However, multivariate statistical models are not usually employed to take into account the importance of each variable included in the instrument's development. To develop and evaluate the concurrent and predictive validities of a new screening tool of nutrition risk. A prospective cohort study was developed, in which 4 nutrition screening tools were applied to all patients. Length of stay in hospital and mortality were considered to test the predictive validity, and the concurrent validity was tested by comparing the Nutritional Risk in Emergency (NRE)-2017 to the other tools. A total of 748 patients were included. The final NRE-2017 score was composed of 6 questions (advanced age, metabolic stress of the disease, decreased appetite, changing of food consistency, unintentional weight loss, and muscle mass loss) with yes or no answers. The prevalence of nutrition risk was 50.7% and 38.8% considering the cutoff points 1.0 and 1.5, respectively. The NRE-2017 showed a satisfactory power to identify risk of malnutrition (area under the curve >0.790 for all analyses). According to the NRE-2017, patients at risk of malnutrition have a twice as high relative risk of a very long hospital stay. The hazard ratio for mortality was 2.78 (1.03-7.49) when the cutoff adopted by the NRE-2017 was 1.5 points. NRE-2017 is a new, easy-to-apply nutrition screening tool which uses 6 bi-categoric features to detect the risk of malnutrition, and it presented good concurrent and predictive validity. © 2018 American Society for Parenteral and Enteral Nutrition.
Hsu, Kuo-Hsiang; Su, Bo-Han; Tu, Yi-Shu; Lin, Olivia A.; Tseng, Yufeng J.
2016-01-01
With advances in the development and application of Ames mutagenicity in silico prediction tools, the International Conference on Harmonisation (ICH) has amended its M7 guideline to reflect the use of such prediction models for the detection of mutagenic activity in early drug safety evaluation processes. Since current Ames mutagenicity prediction tools only focus on functional group alerts or side chain modifications of an analog series, these tools are unable to identify mutagenicity derived from core structures or specific scaffolds of a compound. In this study, a large collection of 6512 compounds are used to perform scaffold tree analysis. By relating different scaffolds on constructed scaffold trees with Ames mutagenicity, four major and one minor novel mutagenic groups of scaffold are identified. The recognized mutagenic groups of scaffold can serve as a guide for medicinal chemists to prevent the development of potentially mutagenic therapeutic agents in early drug design or development phases, by modifying the core structures of mutagenic compounds to form non-mutagenic compounds. In addition, five series of substructures are provided as recommendations, for direct modification of potentially mutagenic scaffolds to decrease associated mutagenic activities. PMID:26863515
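Scaffold tree analysis starts from Murcko scaffold extraction (iterative ring pruning then builds the tree); that first step can be sketched with RDKit as below. The example SMILES are hypothetical and chosen only to show two molecules sharing a core.

```python
from rdkit import Chem
from rdkit.Chem.Scaffolds import MurckoScaffold

# Hypothetical molecules: same naphthalene core, different side chains.
smiles = ["c1ccc2ccccc2c1CCN",
          "c1ccc2ccccc2c1CC(=O)O"]
for smi in smiles:
    mol = Chem.MolFromSmiles(smi)
    core = MurckoScaffold.GetScaffoldForMol(mol)  # strip side chains, keep ring system
    print(smi, "->", Chem.MolToSmiles(core))
```

Grouping compounds by such cores, and relating each core to the measured Ames outcomes of its members, is the kind of scaffold-level association the study performs.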
Gene Unprediction with Spurio: A tool to identify spurious protein sequences.
Höps, Wolfram; Jeffryes, Matt; Bateman, Alex
2018-01-01
We now have access to the sequences of tens of millions of proteins. These protein sequences are essential for modern molecular biology and computational biology. The vast majority of protein sequences are derived from gene prediction tools and have no experimental supporting evidence for their translation. Despite the increasing accuracy of gene prediction tools, there likely exists a large number of spurious protein predictions in the sequence databases. We have developed the Spurio tool to help identify spurious protein predictions in prokaryotes. Spurio searches the query protein sequence against a prokaryotic nucleotide database using tblastn and identifies homologous sequences. The tblastn matches are used to score the query sequence's likelihood of being a spurious protein prediction using a Gaussian process model. The most informative feature is the appearance of stop codons within the presumed translation of homologous DNA sequences. Benchmarking shows that the Spurio tool is able to distinguish spurious from true proteins. However, transposon proteins are prone to be predicted as spurious because of the frequency of degraded homologs found in the DNA sequence databases. Our initial experiments suggest that less than 1% of the proteins in the UniProtKB sequence database are likely to be spurious and that Spurio is able to identify over 60 times more spurious proteins than the AntiFam resource. The Spurio software and source code is available under an MIT license at the following URL: https://bitbucket.org/bateman-group/spurio.
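The stop-codon feature can be sketched directly: translate each homologous DNA hit in frame and ask how often an internal stop appears. The Gaussian process scoring model is not reproduced here, and the sequences below are hypothetical stand-ins for tblastn hits.

```python
from Bio.Seq import Seq

def stop_fraction(dna_homologs, frame=0):
    """Fraction of homologous DNA sequences whose in-frame translation
    contains an internal stop codon (the feature the abstract highlights)."""
    hits = 0
    for dna in dna_homologs:
        # Trim to a multiple of three so the translation is in frame.
        trimmed = dna[frame:len(dna) - (len(dna) - frame) % 3]
        protein = str(Seq(trimmed).translate())
        if "*" in protein[:-1]:  # ignore a legitimate terminal stop
            hits += 1
    return hits / len(dna_homologs)

homologs = ["ATGGCCTAAGGCGCC", "ATGGCCAAAGGCGCC"]  # hypothetical hits
print(stop_fraction(homologs))  # 0.5: one of two homologs has an internal stop
```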
Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri
2014-01-01
Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the mainstream of clinical oncology. © 2014 Wiley Periodicals, Inc.
Prediction Of Abrasive And Diffusive Tool Wear Mechanisms In Machining
NASA Astrophysics Data System (ADS)
Rizzuti, S.; Umbrello, D.
2011-01-01
Tool wear prediction is regarded as a very important task in order to maximize tool performance, minimize cutting costs and improve workpiece quality in cutting. In this research work, an experimental campaign was carried out at varying cutting conditions with the aim of measuring both crater and flank tool wear during machining of AISI 1045 steel with an uncoated carbide tool P40. In parallel, a FEM-based analysis was developed in order to study the tool wear mechanisms, taking into account the influence of the cutting conditions and the temperature reached on the tool surfaces. The results show that, when the temperature of the tool rake surface is lower than the activation temperature of the diffusive phenomenon, the wear rate can be estimated by applying an abrasive model. In contrast, in the tool area where the temperature is higher than the diffusive activation temperature, the wear rate can be evaluated by applying a diffusive model. Finally, for temperatures between the above cited values, an adopted abrasive-diffusive wear model made it possible to evaluate the tool wear phenomena correctly.
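That finding amounts to a piecewise, temperature-gated wear-rate model. The sketch below combines an Archard-style abrasive term with an Arrhenius-style diffusive term; the functional forms and all constants are illustrative assumptions, not the fitted values from this study.

```python
import math

T_ACT = 800.0  # hypothetical diffusion activation temperature, K

def wear_rate(temperature_k: float, contact_pressure: float,
              sliding_velocity: float,
              k_abr=1e-9, c_diff=5e-3, q_over_r=2.0e4) -> float:
    """Piecewise wear-rate: Archard-style abrasion always acts;
    an Arrhenius-style diffusive term switches on above T_ACT.
    All constants are placeholders for fitted model parameters."""
    abrasive = k_abr * contact_pressure * sliding_velocity
    if temperature_k < T_ACT:
        return abrasive
    return abrasive + c_diff * math.exp(-q_over_r / temperature_k)

for T in (700.0, 900.0, 1100.0):
    print(T, f"{wear_rate(T, 1.2e9, 2.5):.3e}")
```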
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2005-01-01
Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes are encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering and biofluid analysis) CFD tools for engineering become increasingly important for predicting safety, performance and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.
Damude, S; Wevers, K P; Murali, R; Kruijff, S; Hoekstra, H J; Bastiaannet, E
2017-09-01
Completion lymph node dissection (CLND) in sentinel node (SN)-positive melanoma patients is accompanied by morbidity, while about 80% yield no additional metastases in non-sentinel nodes (NSNs). A prediction tool for NSN involvement could be of assistance in patient selection for CLND. This study investigated which parameters predict NSN-positivity, and whether the biomarker S-100B improves the accuracy of a prediction model. Recorded clinicopathologic factors were tested for their association with NSN-positivity in 110 SN-positive patients who underwent CLND. A prediction model was developed with multivariable logistic regression, incorporating all predictive factors. Five models were compared for their predictive power by calculating the Area Under the Curve (AUC). A weighted risk score, the 'S-100B Non-Sentinel Node Risk Score' (SN-SNORS), was derived for the model with the highest AUC. Besides, a nomogram was developed as a visual representation. NSN-positivity was present in 24 (21.8%) patients. Sex, ulceration, number of harvested SNs, number of positive SNs, and S-100B value were independently associated with NSN-positivity. The AUC for the model including all these factors was 0.78 (95%CI 0.69-0.88). SN-SNORS was the sum of scores for the five parameters. Scores of ≤9.5, 10-11.5, and ≥12 were associated with low (0%), intermediate (21.0%), and high (43.2%) risk of NSN involvement. A prediction tool based on five parameters, including the biomarker S-100B, showed accurate risk stratification for NSN involvement in SN-positive melanoma patients. If validated in future studies, this tool could help to identify patients with low risk for NSN involvement. Copyright © 2017 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
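Only the risk-band cut-offs are reported in the abstract, not the per-item weights, so this sketch takes the total SN-SNORS score as given and returns the corresponding band.

```python
def sn_snors_risk(total_score: float) -> str:
    """Risk band for non-sentinel node involvement using the cut-offs
    reported in the abstract. How the five items (sex, ulceration,
    harvested SNs, positive SNs, S-100B) are weighted into the total
    is not reproduced here."""
    if total_score <= 9.5:
        return "low (0% observed NSN involvement)"
    if total_score < 12:
        return "intermediate (21.0% observed)"
    return "high (43.2% observed)"

for s in (9.0, 10.5, 13.0):
    print(s, "->", sn_snors_risk(s))
```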
Predictive models of moth development
USDA-ARS?s Scientific Manuscript database
Degree-day models link ambient temperature to insect life-stages, making such models valuable tools in integrated pest management. These models increase management efficacy by predicting pest phenology. In Wisconsin, the top insect pest of cranberry production is the cranberry fruitworm, Acrobasis v...
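A minimal degree-day sketch, using the simple averaging method with a hypothetical base temperature and event threshold, shows how such models map daily temperature records onto a predicted life-stage date:

```python
def daily_degree_days(t_min: float, t_max: float, base: float = 10.0) -> float:
    """Average-method degree-days accumulated in one day above the base."""
    return max(0.0, (t_min + t_max) / 2.0 - base)

def predict_event_day(daily_temps, threshold_dd: float):
    """Day index on which accumulated degree-days first reach the threshold
    marking a life-stage transition (threshold here is hypothetical)."""
    total = 0.0
    for day, (t_min, t_max) in enumerate(daily_temps, start=1):
        total += daily_degree_days(t_min, t_max)
        if total >= threshold_dd:
            return day
    return None  # threshold not reached within the record

temps = [(8, 18), (10, 22), (12, 26), (14, 28), (15, 30)]  # (min, max) in C
print(predict_event_day(temps, threshold_dd=25.0))  # -> day 4
```

Real models calibrate the base temperature and threshold per species and often use sine-wave rather than averaging methods, but the accumulation logic is the same.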
Prediction of Hydrolysis Products of Organic Chemicals under Environmental pH Conditions
Cheminformatics-based software tools can predict the molecular structure of transformation products using a library of transformation reaction schemes. This paper presents the development of such a library for abiotic hydrolysis of organic chemicals under environmentally relevant...
From the lab - Predicting Autism in High-Risk Infants | NIH MedlinePlus the Magazine
An NIH-supported study ... high-risk, 6-month-old infants will develop autism spectrum disorder by age 2. Such a tool ...
Integrated Wind Power Planning Tool
NASA Astrophysics Data System (ADS)
Rosgaard, M. H.; Giebel, G.; Nielsen, T. S.; Hahmann, A.; Sørensen, P.; Madsen, H.
2012-04-01
This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the working title "Integrated Wind Power Planning Tool". The project commenced October 1, 2011, and the goal is to integrate a numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long term power system planning for future wind farms as well as short term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. With regard to the latter, one such simulation tool has been developed at the Wind Energy Division, Risø DTU, intended for long term power system planning. As part of the PSO project, the inferior NWP model used at present will be replaced by the state-of-the-art Weather Research & Forecasting (WRF) model. Furthermore, the integrated simulation tool will be improved so it can handle simultaneously 10-50 times more turbines than the present ~300, and additional atmospheric parameters will be included in the model. The WRF data will also be input for a statistical short term prediction model to be developed in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. This integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies, and depending on the prevailing weather conditions defined by the WRF output. The output from the integrated prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision-making problem to be solved. The high resolution of the WRF results loaded into the integrated prediction model will ensure that a high accuracy data basis is available for use in the decision-making process of the Danish transmission system operator, and the need for high accuracy predictions will only increase over the next decade as Denmark approaches the goal of 50% wind power based electricity in 2020, from the current 20%.
Sterken, David J; Mooney, JoAnn; Ropele, Diana; Kett, Alysha; Vander Laan, Karen J
2015-01-01
Hospital acquired pressure ulcers (HAPU) are serious, debilitating, and preventable complications in all inpatient populations. Despite evidence of the development of pressure ulcers in the pediatric population, minimal research has been done. Based on observations gathered during quarterly HAPU audits, bedside nursing staff recognized trends in pressure ulcer locations that were not captured using current pressure ulcer risk assessment tools. Together, bedside nurses and nursing leadership created and conducted multiple research studies to investigate the validity and reliability of the Pediatric Pressure Ulcer Prediction and Evaluation Tool (PPUPET). Copyright © 2015 Elsevier Inc. All rights reserved.
Antibody specific epitope prediction-emergence of a new paradigm.
Sela-Culang, Inbal; Ofran, Yanay; Peters, Bjoern
2015-04-01
The development of accurate tools for predicting B-cell epitopes is important but difficult. Traditional methods have examined which regions in an antigen are likely binding sites of an antibody. However, it is becoming increasingly clear that most antigen surface residues will be able to bind one or more of the myriad of possible antibodies. In recent years, new approaches have emerged for predicting an epitope for a specific antibody, utilizing information encoded in antibody sequence or structure. Applying such antibody-specific predictions to groups of antibodies in combination with easily obtainable experimental data improves the performance of epitope predictions. We expect that further advances of such tools will be possible with the integration of immunoglobulin repertoire sequencing data. Copyright © 2015 Elsevier B.V. All rights reserved.
Sharma, Ashok K; Srivastava, Gopal N; Roy, Ankita; Sharma, Vineet K
2017-01-01
The experimental methods for the prediction of molecular toxicity are tedious and time-consuming tasks. Thus, computational approaches could be used to develop alternative methods for toxicity prediction. We have developed a tool for the prediction of molecular toxicity along with the aqueous solubility and permeability of any molecule/metabolite. Using a comprehensive and curated set of toxin molecules as a training set, different chemical and structural features such as descriptors and fingerprints were exploited for feature selection, optimization and development of machine learning based classification and regression models. The compositional differences in the distribution of atoms were apparent between toxins and non-toxins, and hence, the molecular features were used for the classification and regression. On 10-fold cross-validation, the descriptor-based, fingerprint-based and hybrid classification models showed similar accuracy (93%) and Matthews correlation coefficient (0.84). The performances of all three models were comparable (Matthews correlation coefficient = 0.84-0.87) on the blind dataset. In addition, the regression-based models using descriptors as input features were also compared and evaluated on the blind dataset. The random forest based regression model for the prediction of solubility performed better (R2 = 0.84) than the multi-linear regression (MLR) and partial least squares regression (PLSR) models, whereas the partial least squares based regression model for the prediction of permeability (Caco-2) performed better (R2 = 0.68) in comparison to the random forest and MLR based regression models. The performance of the final classification and regression models was evaluated using two validation datasets including known toxins and commonly used constituents of health products, which attests to their accuracy. The ToxiM web server would be a highly useful and reliable tool for the prediction of toxicity, solubility, and permeability of small molecules.
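A minimal sketch of the kind of evaluation reported above: a random forest classifier scored by Matthews correlation coefficient under 10-fold cross-validation with scikit-learn. The feature matrix and labels are synthetic stand-ins, not the ToxiM descriptor/fingerprint data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import make_scorer, matthews_corrcoef
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for molecular descriptors and toxin labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))            # 50 descriptors per molecule
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toxin (1) vs non-toxin (0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=10,
                         scoring=make_scorer(matthews_corrcoef))
print(f"10-fold CV MCC: {scores.mean():.2f}")
```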
NASA Astrophysics Data System (ADS)
Choy, S.; Ahmed, H.; Wheatley, A.; McCormack, D. G.; Parraga, G.
2010-03-01
We developed image analysis tools to evaluate spatial and temporal 3He magnetic resonance imaging (MRI) ventilation in asthma and cystic fibrosis. We also developed temporal ventilation probability maps to provide a way to describe and quantify ventilation heterogeneity over time, to test predictions related to respiratory exacerbations or treatment response, and to provide a discrete probability measurement of 3He ventilation defect persistence.
Predicting diet and consumption rate differences between and within species using gut ecomorphology.
Griffen, Blaine D; Mosblack, Hallie
2011-07-01
1. Rapid environmental changes and pressing human needs to forecast the consequences of environmental change are increasingly driving ecology to become a predictive science. The need for effective prediction requires both the development of new tools and the refocusing of existing tools that may have previously been used primarily for purposes other than prediction. One such tool that historically has been more descriptive in nature is ecomorphology (the study of relationships between ecological roles and morphological adaptations of species and individuals). 2. Here, we examine relationships between diet and gut morphology for 15 species of brachyuran crabs, a group of pervasive and highly successful consumers for which trophic predictions would be highly valuable. 3. We show that patterns in crab stomach volume closely match some predictions of metabolic theory and demonstrate that individual diet differences and associated morphological variation reflect, at least in some instances, individual choice or diet specialization. 4. We then present examples of how stomach volume can be used to predict both the per cent herbivory of brachyuran crabs and the relative consumption rates of individual crabs. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.
DOT National Transportation Integrated Search
2010-07-01
The objective of this work was to develop a low-cost portable damage detection tool to assess and predict damage areas in highway bridges. The proposed tool was based on standard vibration-based damage identification (VBDI) techniques but...
Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikkel, D. J.; McCabe, J.
This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems and the industrial partner, The National Center of Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.
Marchant, Carol A; Briggs, Katharine A; Long, Anthony
2008-01-01
Lhasa Limited is a not-for-profit organization that exists to promote the sharing of data and knowledge in chemistry and the life sciences. It has developed the software tools Derek for Windows, Meteor, and Vitic to facilitate such sharing. Derek for Windows and Meteor are knowledge-based expert systems that predict the toxicity and metabolism of a chemical, respectively. Vitic is a chemically intelligent toxicity database. An overview of each software system is provided along with examples of the sharing of data and knowledge in the context of their development. These examples include illustrations of (1) the use of data entry and editing tools for the sharing of data and knowledge within organizations; (2) the use of proprietary data to develop nonconfidential knowledge that can be shared between organizations; (3) the use of shared expert knowledge to refine predictions; (4) the sharing of proprietary data between organizations through the formation of data-sharing groups; and (5) the use of proprietary data to validate predictions. Sharing of chemical toxicity and metabolism data and knowledge in this way offers a number of benefits including the possibilities of faster scientific progress and reductions in the use of animals in testing. Maximizing the accessibility of data also becomes increasingly crucial as in silico systems move toward the prediction of more complex phenomena for which limited data are available.
The Johns Hopkins Fall Risk Assessment Tool: A Study of Reliability and Validity.
Poe, Stephanie S; Dawson, Patricia B; Cvach, Maria; Burnett, Margaret; Kumble, Sowmya; Lewis, Maureen; Thompson, Carol B; Hill, Elizabeth E
Patient falls and fall-related injury remain a safety concern. The Johns Hopkins Fall Risk Assessment Tool (JHFRAT) was developed to facilitate early detection of risk for anticipated physiologic falls in adult inpatients. Psychometric properties in acute care settings have not yet been fully established; this study sought to fill that gap. Results indicate that the JHFRAT is reliable, with high sensitivity and negative predictive validity. Specificity and positive predictive validity were lower than expected.
Wang, Yibing; Heijmen, Ben J M; Petit, Steven F
2017-12-01
To prospectively investigate the use of an independent DVH prediction tool to detect outliers in the quality of fully automatically generated treatment plans for prostate cancer patients. A plan QA tool was developed to predict rectum, anus and bladder DVHs, based on overlap volume histograms and principal component analysis (PCA). The tool was trained with 22 automatically generated, clinical plans, and independently validated with 21 plans. Its use was prospectively investigated for 50 new plans by replanning in case of detected outliers. For rectum Dmean, V65Gy, V75Gy, anus Dmean, and bladder Dmean, the difference between predicted and achieved was within 0.4 Gy or 0.3% (SD within 1.8 Gy or 1.3%). Thirteen detected outliers were re-planned, leading to moderate but statistically significant improvements (mean, max): rectum Dmean (1.3 Gy, 3.4 Gy), V65Gy (2.7%, 4.2%), anus Dmean (1.6 Gy, 6.9 Gy), and bladder Dmean (1.5 Gy, 5.1 Gy). The rectum V75Gy of the new plans slightly increased (0.2%, p = 0.087). A high-accuracy DVH prediction tool was developed and used for independent QA of automatically generated plans. In 28% of plans, minor dosimetric deviations were observed that could be improved by plan adjustments. Larger gains are expected for manually generated plans. Copyright © 2017 Elsevier B.V. All rights reserved.
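The OVH-plus-PCA approach lends itself to a compact sketch: compress training DVHs into a few principal components, regress the component weights on overlap-volume-histogram features, then reconstruct a predicted DVH for a new patient. All arrays below are random placeholders standing in for the 22 training plans; this illustrates the modelling pattern, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Toy data: rows are training plans. X holds overlap-volume-histogram
# samples (OAR-to-target geometry); Y holds achieved rectum DVH points.
rng = np.random.default_rng(1)
X = rng.uniform(size=(22, 20))                    # 22 plans, 20 OVH bins
Y = np.sort(rng.uniform(size=(22, 30)))[:, ::-1]  # monotone DVH curves

pca = PCA(n_components=3).fit(Y)       # compress DVHs to 3 modes
coef = pca.transform(Y)                # per-plan mode weights
reg = LinearRegression().fit(X, coef)  # geometry -> mode weights

x_new = rng.uniform(size=(1, 20))      # geometry of a new patient
dvh_pred = pca.inverse_transform(reg.predict(x_new))  # predicted DVH
print(dvh_pred.round(2))
```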
Alyusuf, Raja H; Prasad, Kameshwar; Abdel Satir, Ali M; Abalkhail, Ali A; Arora, Roopa K
2013-01-01
The exponential use of the internet as a learning resource, coupled with the varied quality of many websites, led to a need to identify suitable websites for teaching purposes. The aim of this study was to develop and validate a tool that evaluates the quality of undergraduate medical educational websites, and to apply it to the field of pathology. A tool was devised through several steps of item generation, reduction, weightage, pilot testing, post-pilot modification of the tool and validating the tool. Tool validation included measurement of inter-observer reliability and generation of criterion-related, construct-related and content-related validity. The validated tool was subsequently tested by applying it to a population of pathology websites. Reliability testing showed high internal consistency reliability (Cronbach's alpha = 0.92), high inter-observer reliability (Pearson's correlation r = 0.88), intraclass correlation coefficient = 0.85 and κ = 0.75. It showed high criterion-related, construct-related and content-related validity. The tool showed moderately high concordance with the gold standard (κ = 0.61); 92.2% sensitivity, 67.8% specificity, 75.6% positive predictive value and 88.9% negative predictive value. The validated tool was applied to 278 websites; 29.9% were rated as recommended, 41.0% as recommended with caution and 29.1% as not recommended. A systematic tool was devised to evaluate the quality of websites for medical educational purposes. The tool was shown to yield reliable and valid inferences through its application to pathology websites.
Chun, Ting Sie; Malek, M A; Ismail, Amelia Ritahani
2015-01-01
The development of effluent removal prediction is crucial in providing a planning tool necessary for the future development and construction of a septic sludge treatment plant (SSTP), especially in developing countries. In order to investigate whether an SSTP performs to the required standard, the effluent quality, namely biological oxygen demand, chemical oxygen demand and total suspended solids, was modelled using an artificial intelligence approach. In this paper, we adopt the clonal selection algorithm (CSA) to set up a prediction model, with a well-established method, the least-squares support vector machine (LS-SVM), as a baseline model. The test results of the case study showed that the CSA-based SSTP model worked well and provided model performance as satisfactory as the LS-SVM model. The CSA approach requires fewer control and training parameters for model simulation compared with the LS-SVM approach. The ability of a CSA approach to cope with limited data samples, non-linear sample functions and multidimensional pattern recognition makes it a powerful tool for modelling the prediction of effluent removals in an SSTP.
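For a concrete feel of the baseline side of this comparison, the sketch below fits a kernel support vector regressor (scikit-learn's SVR, a close cousin of LS-SVM rather than the exact formulation) to invented influent features and a BOD effluent target; none of the data corresponds to the paper's SSTP monitoring records.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical influent features (flow, pH, temperature, solids) and a
# BOD effluent target; stand-ins for real SSTP monitoring data.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))
y = 30 + 5 * X[:, 0] - 3 * X[:, 2] + rng.normal(0, 1, 200)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:150], y[:150])
print(f"held-out R^2: {model.score(X[150:], y[150:]):.2f}")
```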
Structural behavior of composites with progressive fracture
NASA Technical Reports Server (NTRS)
Minnetyan, L.; Murthy, P. L. N.; Chamis, C. C.
1989-01-01
The objective of the study is to unify several computational tools developed for the prediction of progressive damage and fracture with efforts for the prediction of the overall response of damaged composite structures. In particular, a computational finite element model for the damaged structure is developed using a computer program as a byproduct of the analysis of progressive damage and fracture. Thus, a single computational investigation can predict progressive fracture and the resulting variation in structural properties of angleplied composites.
Comparing GIS-based habitat models for applications in EIA and SEA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gontier, Mikael, E-mail: gontier@kth.s; Moertberg, Ulla, E-mail: mortberg@kth.s; Balfors, Berit, E-mail: balfors@kth.s
Land use changes, urbanisation and infrastructure developments in particular, cause fragmentation of natural habitats and threaten biodiversity. Tools and measures must be adapted to assess and remedy the potential effects on biodiversity caused by human activities and developments. Within physical planning, environmental impact assessment (EIA) and strategic environmental assessment (SEA) play important roles in the prediction and assessment of biodiversity-related impacts from planned developments. However, adapted prediction tools to forecast and quantify potential impacts on biodiversity components are lacking. This study tested and compared four different GIS-based habitat models and assessed their relevance for applications in environmental assessment. The models were implemented in the Stockholm region in central Sweden and applied to data on the crested tit (Parus cristatus), a sedentary bird species of coniferous forest. All four models performed well and allowed the distribution of suitable habitats for the crested tit in the Stockholm region to be predicted. The models were also used to predict and quantify habitat loss for two regional development scenarios. The study highlighted the importance of model selection in impact prediction. Criteria that are relevant for the choice of model for predicting impacts on biodiversity were identified and discussed. Finally, the importance of environmental assessment for the preservation of biodiversity within the general frame of biodiversity conservation is emphasised.
Akseli, Ilgaz; Xie, Jingjin; Schultz, Leon; Ladyzhynsky, Nadia; Bramante, Tommasina; He, Xiaorong; Deanne, Rich; Horspool, Keith R; Schwabe, Robert
2017-01-01
Enabling the paradigm of quality by design requires the ability to quantitatively correlate material properties and process variables to measurable product performance attributes. Conventional, quality-by-test methods for determining tablet breaking force and disintegration time usually involve destructive tests, which consume a significant amount of time and labor and provide limited information. Recent advances in material characterization, statistical analysis, and machine learning have provided multiple tools that have the potential to develop nondestructive, fast, and accurate approaches in drug product development. In this work, a methodology to predict the breaking force and disintegration time of tablet formulations using nondestructive ultrasonics and machine learning tools was developed. The input variables to the model include intrinsic properties of the formulation and extrinsic process variables influencing the tablet during manufacturing. The model has been applied to predict breaking force and disintegration time using small quantities of active pharmaceutical ingredient and prototype formulation designs. The novel approach presented is a step forward toward rational design of a robust drug product based on insight into the performance of common materials during formulation and process development. It may also help expedite the drug product development timeline and reduce active pharmaceutical ingredient usage while improving efficiency of the overall process. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Dantigny, Philippe; Guilmart, Audrey; Bensoussan, Maurice
2005-04-15
For over 20 years, predictive microbiology has focused on food-pathogenic bacteria. Few studies have concerned the modelling of fungal development. On the one hand, most food mycologists are not familiar with modelling techniques; on the other hand, people involved in modelling are developing tools dedicated to bacteria. Therefore, there is a tendency to extend the use of models that were developed for bacteria to moulds. However, some mould specificities should be taken into account. The use of specific models for predicting germination and growth of fungi was advocated previously. This paper provides a short review of fungal modelling studies.
The development of a tool to predict team performance.
Sinclair, M A; Siemieniuch, C E; Haslam, R A; Henshaw, M J D C; Evans, L
2012-01-01
The paper describes the development of a tool to predict quantitatively the success of a team when executing a process. The tool was developed for the UK defence industry, though it may be useful in other domains. It is expected to be used by systems engineers in initial stages of systems design, when concepts are still fluid, including the structure of the team(s) which are expected to be operators within the system. It enables answers to be calculated for questions such as "What happens if I reduce team size?" and "Can I reduce the qualifications necessary to execute this process and still achieve the required level of success?". The tool has undergone verification and validation; it predicts fairly well and shows promise. An unexpected finding is that the tool creates a good a priori argument for significant attention to Human Factors Integration in systems projects. The simulations show that if a systems project takes full account of human factors integration (selection, training, process design, interaction design, culture, etc.) then the likelihood of team success will be in excess of 0.95. As the project derogates from this state, the likelihood of team success will drop as low as 0.05. If the team has good internal communications and good individuals in key roles, the likelihood of success rises towards 0.25. Even with a team comprising the best individuals, p(success) will not be greater than 0.35. It is hoped that these results will be useful for human factors professionals involved in systems design. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Bigger data, collaborative tools and the future of predictive drug discovery
NASA Astrophysics Data System (ADS)
Ekins, Sean; Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.
2014-10-01
Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software-as-a-service commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers, either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as they accumulate from high-throughput screening, and how they enable the user to draw insights, make predictions and move projects forward. We now discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use some examples from our own research on neglected diseases, collaborations, mobile apps and algorithm development to illustrate these ideas.
NASA Astrophysics Data System (ADS)
Darmon, David
2018-03-01
In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
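One way to make the embedding-dimension criterion concrete: estimate a negative log-predictive likelihood with a simple kernel predictor over delay vectors and pick the dimension that minimises it. The bandwidth, train/test split, and AR(2) test signal below are arbitrary choices for illustration, not the paper's nonparametric estimator.

```python
import numpy as np

def delay_matrix(x, p):
    """Rows are (x[t], ..., x[t+p-1]) with target x[t+p]."""
    Z = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return Z, x[p:]

def nlpl(x, p, h=0.2, split=0.7):
    """Negative log-predictive likelihood of a Gaussian-kernel predictor
    of embedding dimension p; a simplified stand-in estimator."""
    Z, y = delay_matrix(x, p)
    n = int(split * len(y))
    ll = 0.0
    for z, yt in zip(Z[n:], y[n:]):
        w = np.exp(-np.sum((Z[:n] - z) ** 2, axis=1) / (2 * h ** 2))
        w /= w.sum() + 1e-300
        dens = np.sum(w * np.exp(-(y[:n] - yt) ** 2 / (2 * h ** 2))) \
               / np.sqrt(2 * np.pi * h ** 2)
        ll += np.log(dens + 1e-300)
    return -ll / (len(y) - n)

rng = np.random.default_rng(3)
x = np.zeros(600)
for t in range(2, 600):  # noisy AR(2): the true memory is 2 steps
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal(0, 0.1)
print(min(range(1, 6), key=lambda p: nlpl(x, p)))  # selected dimension
```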
NASA Astrophysics Data System (ADS)
Kariniotakis, G.; Anemos Team
2003-04-01
Objectives: Accurate forecasting of wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that brings together research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models emphasises techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits of using satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable on a large scale: at a single wind farm, regional or national level, and for both interconnected and island systems. A major milestone is the online operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will support the increase of wind integration on two levels: at the operational level, through better management of wind farms, and it will also contribute to increasing the installed capacity of wind farms. This is because accurate prediction of the resource reduces the risk to wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.
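Among the statistical building blocks mentioned above, power curve representation is the simplest to illustrate. The sketch below fits a logistic approximation of a turbine power curve to noisy samples with scipy; the SCADA-like data, the logistic form, and the initial guesses are assumptions for the example (a real power curve also needs cut-in/cut-out handling).

```python
import numpy as np
from scipy.optimize import curve_fit

def power_curve(v, v_half, k):
    """Logistic approximation of a turbine power curve (0..1 of rated)."""
    return 1.0 / (1.0 + np.exp(-k * (v - v_half)))

# Hypothetical SCADA samples: wind speed (m/s) vs normalised power.
rng = np.random.default_rng(7)
v = rng.uniform(0, 25, 300)
p = np.clip(power_curve(v, 9.0, 0.8) + rng.normal(0, 0.05, 300), 0, 1)

(v_half, k), _ = curve_fit(power_curve, v, p, p0=[10.0, 1.0])
print(f"fitted v_half = {v_half:.1f} m/s, k = {k:.2f}")
```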
Cost Minimization Using an Artificial Neural Network Sleep Apnea Prediction Tool for Sleep Studies
Teferra, Rahel A.; Grant, Brydon J. B.; Mindel, Jesse W.; Siddiqi, Tauseef A.; Iftikhar, Imran H.; Ajaz, Fatima; Aliling, Jose P.; Khan, Meena S.; Hoffmann, Stephen P.
2014-01-01
Rationale: More than a million polysomnograms (PSGs) are performed annually in the United States to diagnose obstructive sleep apnea (OSA). Third-party payers now advocate a home sleep test (HST), rather than an in-laboratory PSG, as the diagnostic study for OSA regardless of clinical probability, but the economic benefit of this approach is not known. Objectives: We determined the diagnostic performance of OSA prediction tools including the newly developed OSUNet, based on an artificial neural network, and performed a cost-minimization analysis when the prediction tools are used to identify patients who should undergo HST. Methods: The OSUNet was trained to predict the presence of OSA in a derivation group of patients who underwent an in-laboratory PSG (n = 383). Validation group 1 consisted of in-laboratory PSG patients (n = 149). The network was trained further in 33 patients who underwent HST and then was validated in a separate group of 100 HST patients (validation group 2). Likelihood ratios (LRs) were compared with two previously published prediction tools. The total costs from the use of the three prediction tools and the third-party approach within a clinical algorithm were compared. Measurements and Main Results: The OSUNet had a higher +LR in all groups compared with the STOP-BANG and the modified neck circumference (MNC) prediction tools. The +LRs for STOP-BANG, MNC, and OSUNet in validation group 1 were 1.1 (1.0–1.2), 1.3 (1.1–1.5), and 2.1 (1.4–3.1); and in validation group 2 they were 1.4 (1.1–1.7), 1.7 (1.3–2.2), and 3.4 (1.8–6.1), respectively. With an OSA prevalence less than 52%, the use of all three clinical prediction tools resulted in cost savings compared with the third-party approach. Conclusions: The routine requirement of an HST to diagnose OSA regardless of clinical probability is more costly compared with the use of OSA clinical prediction tools that identify patients who should undergo this procedure when OSA is expected to be present in less than half of the population. With OSA prevalence less than 40%, the OSUNet offers the greatest savings, which are substantial when the number of sleep studies done annually is considered. PMID:25068704
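Two quantities drive the comparison above and are easy to compute: the positive likelihood ratio and the post-test probability it implies. The sensitivity/specificity pairs below are illustrative placeholders, not the study's measured values.

```python
def likelihood_ratios(sens, spec):
    """Positive and negative likelihood ratios from sensitivity/specificity."""
    return sens / (1 - spec), (1 - sens) / spec

def post_test_prob(prevalence, lr):
    """Update a pre-test probability with a likelihood ratio via odds."""
    odds = prevalence / (1 - prevalence) * lr
    return odds / (1 + odds)

# Illustrative values only; the study reports +LRs of roughly 1.1-3.4
# depending on the tool and validation group.
for tool, (se, sp) in {"STOP-BANG": (0.90, 0.20),
                       "OSUNet": (0.85, 0.75)}.items():
    plr, _ = likelihood_ratios(se, sp)
    print(f"{tool}: +LR = {plr:.1f}, "
          f"post-test p at 40% prevalence = {post_test_prob(0.40, plr):.2f}")
```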
Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W
2015-01-01
Background: Use of a validated risk-assessment tool to identify individuals at high risk of developing type 2 diabetes is currently recommended. It is under-reported, however, whether a different risk tool alters the predicted risk of an individual. Aim: This study explored any differences between commonly used validated risk-assessment tools for type 2 diabetes. Design and setting: Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, South Wales. Method: Retrospective analysis of 676 individuals (389 females and 287 males) who participated in a workplace-based diabetes risk-assessment initiative. Ten-year risk of type 2 diabetes was predicted using the validated QDiabetes®, Leicester Risk Assessment (LRA), FINDRISC, and Cambridge Risk Score (CRS) algorithms. Results: Differences between the risk-assessment tools were apparent following retrospective analysis of individuals. CRS categorised the highest proportion (13.6%) of individuals at 'high risk' followed by FINDRISC (6.6%), QDiabetes (6.1%), and, finally, the LRA was the most conservative risk tool (3.1%). Following further analysis by sex, over one-quarter of males were categorised at high risk using CRS (25.4%), whereas a greater percentage of females were categorised as high risk using FINDRISC (7.8%). Conclusion: The adoption of a different valid risk-assessment tool can alter the predicted risk of an individual and caution should be used to identify those individuals who really are at high risk of type 2 diabetes. PMID:26541180
Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W
2015-12-01
Use of a validated risk-assessment tool to identify individuals at high risk of developing type 2 diabetes is currently recommended. It is under-reported, however, whether a different risk tool alters the predicted risk of an individual. This study explored any differences between commonly used validated risk-assessment tools for type 2 diabetes. Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, South Wales. Retrospective analysis of 676 individuals (389 females and 287 males) who participated in a workplace-based diabetes risk-assessment initiative. Ten-year risk of type 2 diabetes was predicted using the validated QDiabetes(®), Leicester Risk Assessment (LRA), FINDRISC, and Cambridge Risk Score (CRS) algorithms. Differences between the risk-assessment tools were apparent following retrospective analysis of individuals. CRS categorised the highest proportion (13.6%) of individuals at 'high risk' followed by FINDRISC (6.6%), QDiabetes (6.1%), and, finally, the LRA was the most conservative risk tool (3.1%). Following further analysis by sex, over one-quarter of males were categorised at high risk using CRS (25.4%), whereas a greater percentage of females were categorised as high risk using FINDRISC (7.8%). The adoption of a different valid risk-assessment tool can alter the predicted risk of an individual and caution should be used to identify those individuals who really are at high risk of type 2 diabetes. © British Journal of General Practice 2015.
Self-learning computers for surgical planning and prediction of postoperative alignment.
Lafage, Renaud; Pesenti, Sébastien; Lafage, Virginie; Schwab, Frank J
2018-02-01
In past decades, the role of sagittal alignment has been widely demonstrated in the setting of spinal conditions. As several parameters can be affected, identifying the driver of the deformity is the cornerstone of a successful treatment approach. Despite the importance of restoring sagittal alignment for optimizing outcome, this task remains challenging. Self-learning computers and optimized algorithms are of great interest in spine surgery in that they facilitate better planning and prediction of postoperative alignment. Nowadays, computer-assisted tools are part of surgeons' daily practice; however, the use of such tools remains time-consuming. NARRATIVE REVIEW AND RESULTS: Computer-assisted methods for the prediction of postoperative alignment consist of a three-step analysis: identification of anatomical landmarks, definition of alignment objectives, and simulation of surgery. Recently, complex rules for the prediction of alignment have been proposed. Even though this kind of work leads to more personalized objectives, the number of parameters involved renders it difficult for clinical use, stressing the importance of developing computer-assisted tools. The evolution of our current technology, including machine learning and other types of advanced algorithms, will provide powerful tools that could be useful in improving surgical outcomes and alignment prediction. These tools can combine different types of advanced technologies, such as image recognition and shape modeling, and using this technique, computer-assisted methods are able to predict spinal shape. The development of powerful computer-assisted methods involves the integration of several sources of information such as radiographic parameters (X-rays, MRI, CT scan, etc.), demographic information, and unusual non-osseous parameters (muscle quality, proprioception, gait analysis data). In using a larger set of data, these methods will aim to mimic what is actually done by spine surgeons, leading to real tailor-made solutions. Integrating newer technology can change the current way of planning/simulating surgery. The use of powerful computer-assisted tools that are able to integrate several parameters and learn from experience can change the traditional way of selecting treatment pathways and counseling patients. However, there is still much work to be done to reach the desired level, as noted in other orthopedic fields, such as hip surgery. Many of these tools already exist in non-medical fields and their adaptation to spine surgery is of considerable interest.
ASTRYD: A new numerical tool for aircraft cabin and environmental noise prediction
NASA Astrophysics Data System (ADS)
Berhault, J.-P.; Venet, G.; Clerc, C.
ASTRYD is an analytical tool, developed originally for underwater applications, that computes acoustic pressure distribution around three-dimensional bodies in closed spaces like aircraft cabins. The program accepts data from measurements or other simulations, processes them in the time domain, and delivers temporal evolutions of the acoustic pressures and accelerations, as well as the radiated/diffracted pressure at arbitrary points located in the external/internal space. A typical aerospace application is prediction of acoustic load on satellites during the launching phase. An aeronautic application is engine noise distribution on a business jet body for prediction of environmental and cabin noise.
Applying online WEPP to assess forest watershed hydrology
S. Dun; J. Q. Wu; W. J. Elliot; J. R. Frankenberger; D. C. Flanagan; D. K. McCool
2011-01-01
The U.S. Army Corps of Engineers (USACE) and the Great Lakes Commission are developing technologies and predictive tools to aid in watershed management with an ultimate goal of improving and preserving the water quality in the Great Lakes Basin. A new version of the online Water Erosion Prediction Project (WEPP) GIS interface has been developed to assist in evaluating...
Biodiversity in environmental assessment-current practice and tools for prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gontier, Mikael; Balfors, Berit; Moertberg, Ulla
Habitat loss and fragmentation are major threats to biodiversity. Environmental impact assessment and strategic environmental assessment are essential instruments used in physical planning to address such problems. Yet there are no well-developed methods for quantifying and predicting impacts of fragmentation on biodiversity. In this study, a literature review was conducted on GIS-based ecological models that have potential as prediction tools for biodiversity assessment. Further, a review of environmental impact statements for road and railway projects from four European countries was performed, to study how impact prediction concerning biodiversity issues was addressed. The results of the study showed the existing gap between research in GIS-based ecological modelling and current practice in biodiversity assessment within environmental assessment.
RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis
Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab
2012-01-01
RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages) and sequence cleaning. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available. Availability: http://www.cemb.edu.pk/sw.html Abbreviations: RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language. PMID:23055611
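The Nussinov algorithm the tool extends is compact enough to show in full: a dynamic program that maximises the number of complementary base pairs, followed by a traceback to dot-bracket notation. This is the textbook version with DNA pairing rules and a minimal hairpin size, not RDNAnalyzer's extended implementation; the reverse complement one-liner mirrors another of the listed utilities.

```python
def nussinov(seq, min_loop=3):
    """Maximum base-pairing secondary structure via the Nussinov DP;
    returns dot-bracket notation. min_loop enforces a minimal hairpin."""
    pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}  # DNA rules
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                       # leave i unpaired
            for k in range(i + min_loop + 1, j + 1):  # or pair i with k
                if (seq[i], seq[k]) in pairs:
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + dp[i + 1][k - 1] + right)
            dp[i][j] = best
    struct, stack = ["."] * n, [(0, n - 1)]           # traceback
    while stack:
        i, j = stack.pop()
        if i >= j:
            continue
        if dp[i][j] == dp[i + 1][j]:
            stack.append((i + 1, j))
            continue
        for k in range(i + min_loop + 1, j + 1):
            if (seq[i], seq[k]) in pairs:
                right = dp[k + 1][j] if k + 1 <= j else 0
                if dp[i][j] == 1 + dp[i + 1][k - 1] + right:
                    struct[i], struct[k] = "(", ")"
                    stack.append((i + 1, k - 1))
                    stack.append((k + 1, j))
                    break
    return "".join(struct)

print(nussinov("GGGAAATCCC"))                  # -> (((....)))
comp = str.maketrans("ATGC", "TACG")
print("ATGCGT"[::-1].translate(comp))          # reverse complement: ACGCAT
```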
NASA Astrophysics Data System (ADS)
Guillet, S.; Gosmain, A.; Ducoux, W.; Ponçon, M.; Fontaine, G.; Desseix, P.; Perraud, P.
2012-05-01
The increasing use of composite materials in aircraft primary structures has raised new issues for safety of flight in lightning conditions. The consequences of this technological shift, which occurs in a parallel context of extension of electrified critical functions, are addressed by aircraft manufacturers through the enhancement of their available means for assessing lightning transients. On the one hand, simulation tools, provided an accurate description of the aircraft design is available, are today valuable assessment tools, in both predictive and operative terms. On the other hand, in-house test means allow confirmation and consolidation of design office hardening solutions. The combined use of predictive simulation tools and in-house test means offers efficient and reliable support for all aircraft developments in their various life-time stages. The present paper presents PREFACE research project results that illustrate the above strategy on the de-icing system of the NH90 composite main rotor blade.
Navigating through the minefield of read-across: from research to practical tools (WC10)
Read-across is used for regulatory purposes as a data gap filling technique. Research efforts have focused on the scientific justification and documentation challenges involved in read-across predictions. Software tools have also been developed to facilitate read-across predictio...
Software Tools for Weed Seed Germination Modeling
USDA-ARS?s Scientific Manuscript database
The next generation of weed seed germination models will need to account for variable soil microclimate conditions. In order to predict this microclimate environment we have developed a suite of individual tools (models) that can be used in conjunction with the next generation of weed seed germinati...
Augmenting the SCaN Link Budget Tool with Validated Atmospheric Propagation
NASA Technical Reports Server (NTRS)
Steinkerchner, Leo; Welch, Bryan
2017-01-01
In any Earth-Space or Space-Earth communications link, atmospheric effects cause significant signal attenuation. In order to develop a communications system that is cost-effective while meeting appropriate performance requirements, it is important to accurately predict these effects for the given link parameters. This project aimed to develop a Matlab™ (The MathWorks, Inc.) program that could augment the existing Space Communications and Navigation (SCaN) Link Budget Tool with accurate predictions of atmospheric attenuation of both optical and radio-frequency signals, according to the SCaN Optical Link Assessment Model Version 5 and the International Telecommunications Union, Radiocommunications Sector (ITU-R) atmospheric propagation loss model, respectively. When compared to data collected from the Advanced Communications Technology Satellite (ACTS), the radio-frequency model predicted attenuation to within 1.3 dB of loss for 95% of measurements. Ultimately, this tool will be integrated into the SCaN Center for Engineering, Networks, Integration, and Communications (SCENIC) user interface in order to support analysis of existing SCaN systems and planning capabilities for future NASA missions.
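The non-atmospheric part of such a link budget is a fixed formula worth stating: free-space path loss in dB for distance in km and frequency in GHz. The sketch below combines it with a placeholder atmospheric margin; the Ka-band GEO numbers are invented for illustration, and a real analysis would take the attenuation term from the ITU-R model named above.

```python
import math

def free_space_path_loss_db(distance_km, freq_ghz):
    """FSPL(dB) = 20 log10(d_km) + 20 log10(f_GHz) + 92.45."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Hypothetical Ka-band downlink from GEO; the atmospheric term would
# come from the ITU-R model (rain, gases, clouds, scintillation).
fspl = free_space_path_loss_db(38_000, 20.0)
atmos_db = 4.0  # placeholder attenuation exceeded a given % of the time
print(f"FSPL = {fspl:.1f} dB, total path loss = {fspl + atmos_db:.1f} dB")
```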
Development of Multi-Layered Floating Floor for Cabin Noise Reduction
NASA Astrophysics Data System (ADS)
Song, Jee-Hun; Hong, Suk-Yoon; Kwon, Hyun-Wung
2017-12-01
Recently, regulations pertaining to the noise and vibration environment of ship cabins have been strengthened. In this paper, a numerical model is developed for a multi-layered floating floor to predict the structure-borne noise in ship cabins. The theoretical model consists of multi-panel structures lined with high-density mineral wool. The predicted results for structure-borne noise when a multi-layered floating floor is used are compared to measurements made on a mock-up. A comparison of the predicted and experimental results shows that the developed model could be an effective tool for predicting structure-borne noise in ship cabins.
Orbiter Boundary Layer Transition Prediction Tool Enhancements
NASA Technical Reports Server (NTRS)
Berry, Scott A.; King, Rudolph A.; Kegerise, Michael A.; Wood, William A.; McGinley, Catherine B.; Berger, Karen T.; Anderson, Brian P.
2010-01-01
Updates to an analytic tool developed for Shuttle support to predict the onset of boundary layer transition resulting from thermal protection system damage or repair are presented. The boundary layer transition tool is part of a suite of tools that analyze the local aerothermodynamic environment to enable informed disposition of damage for making recommendations to fly as is or to repair. Using mission-specific trajectory information and details of each damage site or repair, the expected time (and thus Mach number) of transition onset is predicted to help define proper environments for use in subsequent thermal and stress analysis of the thermal protection system and structure. The boundary layer transition criteria utilized within the tool were updated based on new local boundary layer properties obtained from high-fidelity computational solutions. Also, new ground-based measurements were obtained to allow for a wider parametric variation with both protuberances and cavities, and the resulting correlations were then calibrated against updated flight data. The end result is to provide correlations that allow increased confidence in the resulting transition predictions. Recently, a new approach was adopted to remove conservatism in terms of sustained turbulence along the wing leading edge. Finally, some of the newer flight data are also discussed in terms of how these results reflect back on the updated correlations.
DEVELOPMENT OF STRUCTURE ACTIVITY RELATIONSHIPS FOR ASSESSING ECOLOGICAL RISKS
In the field of environmental toxicology, structure activity relationships (SARs) have developed as scientifically-credible tools for predicting the effects of chemicals when little or no empirical data are available.
Singh, Urminder; Rajkumar, Mohan Singh; Garg, Rohini
2017-01-01
Long non-coding RNAs (lncRNAs) make up a significant portion of non-coding RNAs and are involved in a variety of biological processes. Accurate identification/annotation of lncRNAs is the primary step for gaining deeper insights into their functions. In this study, we report a novel tool, PLncPRO, for prediction of lncRNAs in plants using transcriptome data. PLncPRO is based on machine learning and uses the random forest algorithm to classify coding and long non-coding transcripts. PLncPRO has better prediction accuracy as compared to other existing tools and is particularly well-suited for plants. We developed consensus models for dicots and monocots to facilitate prediction of lncRNAs in non-model/orphan plants. The performance of PLncPRO was also quite good with vertebrate transcriptome data. Using PLncPRO, we discovered 3714 and 3457 high-confidence lncRNAs in rice and chickpea, respectively, under drought or salinity stress conditions. We investigated different characteristics and differential expression under drought/salinity stress conditions, and validated lncRNAs via RT-qPCR. Overall, we developed a new tool for the prediction of lncRNAs in plants and showed its utility via identification of lncRNAs in rice and chickpea. PMID:29036354
Towards a National Space Weather Predictive Capability
NASA Astrophysics Data System (ADS)
Fox, N. J.; Ryschkewitsch, M. G.; Merkin, V. G.; Stephens, G. K.; Gjerloev, J. W.; Barnes, R. J.; Anderson, B. J.; Paxton, L. J.; Ukhorskiy, A. Y.; Kelly, M. A.; Berger, T. E.; Bonadonna, L. C. M. F.; Hesse, M.; Sharma, S.
2015-12-01
National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review the space weather system developed for the Van Allen Probes mission, together with other datasets, tools and models that have resulted from research by scientists at JHU/APL. We will look at how these, and results from future missions such as Solar Probe Plus, could be applied to support space weather applications in coordination with other community assets and capabilities.
Overview of Boundary Layer Transition Research in Support of Orbiter Return To Flight
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Horvath, Thomas J.; Greene, Francis A.; Kinder, Gerald R.; Wang, K. C.
2006-01-01
A predictive tool for estimating the onset of boundary layer transition resulting from damage to and/or repair of the thermal protection system was developed in support of Shuttle Return to Flight. The boundary layer transition tool is part of a suite of tools that analyze the aerothermodynamic environment to the local thermal protection system to allow informed disposition of damage for making recommendations to fly as is or to repair. Using mission specific trajectory information and details of each damage site or repair, the expected time (and thus Mach number) at transition onset is predicted to help define the aerothermodynamic environment to use in the subsequent thermal and stress analysis of the local thermal protection system and structure. The boundary layer transition criteria utilized for the tool was developed from ground-based measurements to account for the effect of both protuberances and cavities and has been calibrated against select flight data. Computed local boundary layer edge conditions were used to correlate the results, specifically the momentum thickness Reynolds number over the edge Mach number and the boundary layer thickness. For the initial Return to Flight mission, STS-114, empirical curve coefficients of 27, 100, and 900 were selected to predict transition onset for protuberances based on height, and cavities based on depth and length, respectively.
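Reading the quoted coefficients as thresholds, a correlation of this family can be checked in a couple of lines. The product form (Re_theta / M_e) * (k / delta) >= C below is an assumed simplification for illustration (the published correlations relate Re_theta/M_e to the protuberance height or cavity depth/length scaled by boundary layer thickness), and the flow quantities are made-up trajectory-point values, not flight data.

```python
def transition_onset(re_theta, m_e, k_over_delta, c):
    """Check an assumed criterion of the form (Re_theta/M_e)*(k/delta) >= C.

    Coefficients quoted in the abstract: 27 (protuberance height),
    100 (cavity depth), 900 (cavity length). Illustrative reading only,
    not the flight tool itself.
    """
    return (re_theta / m_e) * k_over_delta >= c

# A single damage-site check at one trajectory point (invented numbers).
print(transition_onset(re_theta=300.0, m_e=18.0, k_over_delta=0.5, c=27))
```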
Osteoporosis risk prediction using machine learning and conventional methods.
Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won
2013-01-01
A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared with the ability of a conventional clinical decision tool, osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) based on various predictors associated with low bone density. The learning models were compared with OST. SVM had significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We were the first to perform comparisons of the performance of osteoporosis prediction between the machine learning and conventional methods using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
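A skeleton of the model comparison, assuming synthetic stand-ins for the KNHANES predictors: train an SVM and a logistic regression, then compare test-set AUCs with scikit-learn.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins for clinical predictors (age, weight, BMI, ...).
rng = np.random.default_rng(4)
X = rng.normal(size=(1000, 6))
y = (0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 1, 1000) > 0).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("SVM", SVC(probability=True, random_state=0)),
                  ("LR", LogisticRegression(max_iter=1000))]:
    clf.fit(Xtr, ytr)
    auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```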
Li, Ginny X H; Vogel, Christine; Choi, Hyungwon
2018-06-07
While tandem mass spectrometry can detect post-translational modifications (PTM) at the proteome scale, reported PTM sites are often incomplete and include false positives. Computational approaches can complement these datasets by additional predictions, but most available tools use prediction models pre-trained for a single PTM type by the developers, and it remains a difficult task to perform large-scale batch prediction for multiple PTMs with flexible user control, including the choice of training data. We developed an R package called PTMscape which predicts PTM sites across the proteome based on a unified and comprehensive set of descriptors of the physico-chemical microenvironment of modified sites, with additional downstream analysis modules to test enrichment of individual or pairs of PTMs in protein domains. PTMscape is flexible in its ability to process any major modification, such as phosphorylation and ubiquitination, while achieving sensitivity and specificity comparable to single-PTM methods and outperforming other multi-PTM tools. Applying this framework, we expanded proteome-wide coverage of five major PTMs affecting different residues by prediction, especially for lysine and arginine modifications. Using a combination of experimentally acquired sites (PSP) and newly predicted sites, we discovered that crosstalk among multiple PTMs occurs more frequently than by random chance in key protein domains such as histone, protein kinase, and RNA recognition motifs, spanning various biological processes such as RNA processing, DNA damage response, signal transduction, and regulation of cell cycle. These results provide a proteome-scale analysis of crosstalk among major PTMs and can be easily extended to other types of PTM.
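PTMscape itself is an R package; the sketch below re-expresses the core idea in Python for consistency with the other examples: featurise a fixed window around each candidate site and train a classifier. One-hot encoding and random labels are deliberate simplifications of the physico-chemical descriptors and curated site labels.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

AA = "ACDEFGHIKLMNPQRSTVWY"

def window_features(seq, pos, flank=7):
    """One-hot encode a +/-flank window around a candidate site; 'X' pads
    sequence ends. A stripped-down stand-in for richer descriptors."""
    window = "".join(seq[i] if 0 <= i < len(seq) else "X"
                     for i in range(pos - flank, pos + flank + 1))
    vec = np.zeros((2 * flank + 1, len(AA)))
    for i, aa in enumerate(window):
        if aa in AA:
            vec[i, AA.index(aa)] = 1.0
    return vec.ravel()

# Hypothetical training data: candidate sites labelled modified (1) or not.
rng = np.random.default_rng(5)
seqs = ["".join(rng.choice(list(AA), 40)) for _ in range(300)]
X = np.array([window_features(s, 20) for s in seqs])
y = rng.integers(0, 2, 300)  # random labels; real ones come from curated data

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict_proba(X[:3])[:, 1])  # site-level modification scores
```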
In Silico Approaches for Predicting Adme Properties
NASA Astrophysics Data System (ADS)
Madden, Judith C.
A drug requires a suitable pharmacokinetic profile to be efficacious in vivo in humans. The relevant pharmacokinetic properties include the absorption, distribution, metabolism, and excretion (ADME) profile of the drug. This chapter provides an overview of the definition and meaning of key ADME properties, recent models developed to predict these properties, and a guide as to how to select the most appropriate model(s) for a given query. Many tools using the state-of-the-art in silico methodology are now available to users, and it is anticipated that the continual evolution of these tools will provide greater ability to predict ADME properties in the future. However, caution must be exercised in applying these tools as data are generally available only for "successful" drugs, i.e., those that reach the marketplace, and little supplementary information, such as that for drugs that have a poor pharmacokinetic profile, is available. The possibilities of using these methods and possible integration into toxicity prediction are explored.
Translating New Science Into the Drug Review Process
Rouse, Rodney; Kruhlak, Naomi; Weaver, James; Burkhart, Keith; Patel, Vikram; Strauss, David G.
2017-01-01
In 2011, the US Food and Drug Administration (FDA) developed a strategic plan for regulatory science that focuses on developing new tools, standards, and approaches to assess the safety, efficacy, quality, and performance of FDA-regulated products. In line with this, the Division of Applied Regulatory Science was created to move new science into the Center for Drug Evaluation and Research (CDER) review process and close the gap between scientific innovation and drug review. The Division, located in the Office of Clinical Pharmacology, is unique in that it performs mission-critical applied research and review across the translational research spectrum, including in vitro and in vivo laboratory research, in silico computational modeling and informatics, and integrated clinical research covering clinical pharmacology, experimental medicine, and postmarket analyses. The Division collaborates with Offices throughout CDER, across the FDA, other government agencies, academia, and industry. The Division is able to rapidly form interdisciplinary teams of pharmacologists, biologists, chemists, computational scientists, and clinicians to respond to challenging regulatory questions for specific review issues and for longer-range projects requiring the development of predictive models, tools, and biomarkers to speed the development and regulatory evaluation of safe and effective drugs. This article reviews the Division's recent work and future directions, highlighting development and validation of biomarkers; novel humanized animal models; translational predictive safety combining in vitro, in silico, and in vivo clinical biomarkers; chemical and biomedical informatics tools for safety predictions; novel approaches to speed the development of complex generic drugs, biosimilars, and antibiotics; and precision medicine. PMID:29568713
StructRNAfinder: an automated pipeline and web server for RNA families prediction.
Arias-Carrasco, Raúl; Vásquez-Morán, Yessenia; Nakaya, Helder I; Maracaja-Coutinho, Vinicius
2018-02-17
The function of many noncoding RNAs (ncRNAs) depends upon their secondary structures. Over the last decades, several methodologies have been developed to predict such structures or to use them to functionally annotate RNAs into RNA families. However, to fully perform this analysis, researchers must utilize multiple tools, which requires the constant parsing and processing of several intermediate files. This makes the large-scale prediction and annotation of RNAs a daunting task even for researchers with good computational or bioinformatics skills. We present an automated pipeline named StructRNAfinder that predicts and annotates RNA families in transcript or genome sequences. This single tool not only displays the sequence/structural consensus alignments for each RNA family according to the Rfam database but also provides a taxonomic overview for each assigned functional RNA. Moreover, we implemented a user-friendly web service that allows researchers to upload their own nucleotide sequences in order to perform the whole analysis. Finally, we provide a stand-alone version of StructRNAfinder for use in large-scale projects. The tool was developed under the GNU General Public License (GPLv3) and is freely available at http://structrnafinder.integrativebioinformatics.me. The main advantage of StructRNAfinder is that it integrates, at large scale, the data obtained by each tool and database employed along the workflow, compiling the many files generated into user-friendly reports useful for downstream analyses and data exploration.
NASA Astrophysics Data System (ADS)
Haack, Lukas; Peniche, Ricardo; Sommer, Lutz; Kather, Alfons
2017-06-01
At early project stages, the main CSP plant design parameters such as turbine capacity, solar field size, and thermal storage capacity are varied during the techno-economic optimization to determine the most suitable plant configurations. In general, a typical meteorological year with at least hourly time resolution is used to analyze each plant configuration. Different software tools are available to simulate the annual energy yield. Software tools offering a thermodynamic modeling approach to the power block and the CSP thermal cycle, such as EBSILONProfessional®, allow a flexible definition of plant topologies. In EBSILON, the thermodynamic equilibrium for each time step is calculated iteratively (quasi steady state), which requires approximately 45 minutes to process one year at hourly time resolution. To better resolve gradients, a 10-minute time resolution is recommended, which increases processing time by a factor of 5. For analyzing the large number of plant sensitivities required during the techno-economic optimization procedure, the detailed thermodynamic simulation approach therefore becomes impracticable. Suntrace has developed an in-house CSP simulation tool (CSPsim), based on EBSILON and applying predictive models, to approximate CSP plant performance for central receiver and parabolic trough technology. CSPsim increases the speed of energy yield calculations by a factor of 35 or more and automates the simulation runs of all predefined design configurations in sequential order during the optimization procedure. To develop the predictive models, multiple linear regression techniques and Design of Experiments methods are applied. The annual energy yield and derived LCOE calculated by the predictive model deviate by less than ±1.5% from the thermodynamic simulation in EBSILON and effectively identify the optimal range of the main design parameters for further, more specific analysis.
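As a minimal illustration of the surrogate idea described above (fitting fast regression models to a limited number of slow, detailed simulation runs, then sweeping the full design space cheaply), here is a sketch over an invented two-parameter design space. The actual CSPsim regressors, parameters, and data are not public, so every name and number below is illustrative.

```python
# Regression surrogate sketch: train a second-order response surface on a
# few "detailed simulation" runs, then sweep the full configuration grid.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Design parameters: solar multiple, storage hours (stand-ins for the real DoE)
X_train = rng.uniform([1.5, 4.0], [3.5, 16.0], size=(30, 2))

def detailed_simulation(x):
    """Placeholder for the slow thermodynamic simulation (e.g. EBSILON)."""
    sm, tes = x
    return 400 * sm - 40 * sm**2 + 12 * tes - 0.4 * tes**2 + rng.normal(0, 2)

y_train = np.array([detailed_simulation(x) for x in X_train])  # yield, arbitrary units

# Second-order response surface, in the spirit of Design-of-Experiments work
surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surrogate.fit(X_train, y_train)

# Cheap sweep of the full configuration grid
grid = np.array([[sm, tes] for sm in np.linspace(1.5, 3.5, 21)
                 for tes in np.linspace(4, 16, 25)])
best = grid[np.argmax(surrogate.predict(grid))]
print(f"Best configuration: solar multiple={best[0]:.2f}, storage={best[1]:.1f} h")
```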
Ceramic Matrix Composites (CMC) Life Prediction Development - 2003
NASA Technical Reports Server (NTRS)
Levine, Stanley R.; Calomino, Anthony M.; Verrilli, Michael J.; Thomas, David J.; Halbig, Michael C.; Opila, Elizabeth J.; Ellis, John R.
2003-01-01
Accurate life prediction is critical to successful use of ceramic matrix composites (CMCs). The tools to accomplish this are immature and not oriented toward the behavior of carbon fiber reinforced silicon carbide (C/SiC), the primary system of interest for many reusable and single mission launch vehicle propulsion and airframe applications. This paper describes an approach and progress made to satisfy the need to develop an integrated life prediction system that addresses mechanical durability and environmental degradation of C/SiC.
Predictive Models and Tools for Assessing Chemicals under the Toxic Substances Control Act (TSCA)
EPA has developed databases and predictive models to help evaluate the hazard, exposure, and risk of chemicals released to the environment and how workers, the general public, and the environment may be exposed to and affected by them.
AOP-informed assessment of endocrine disruption in freshwater crustaceans
To date, most research aimed at developing more efficient and cost-effective methods to predict toxicity has focused on human biology. However, there is also a need for effective high-throughput tools to predict toxicity to other species that perform critical ecosystem functio...
Predictive Modeling of a Fecal Indicator at a Subtropical Marine Beach
The Virtual Beach Model Builder (VBMB) is a software tool that can be used to develop predictive models at beaches based on microbial data and observations (explanatory variables) that describe hydrometeorological and biogeochemical conditions. During the summer of 2008, a study...
PREDICTION OF FUNDAMENTAL ASSEMBLAGES OF MID-ATLANTIC HIGHLAND STREAM FISHES
A statistical software tool, the Stream Fish Assemblage Predictor (SFAP), based on stream sampling data collected by the EPA in the mid-Atlantic Highlands, was developed to predict potential stream fish communities using characteristics of the stream and its watershed.
Step o...
Design of a final approach spacing tool for TRACON air traffic control
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh
1989-01-01
This paper describes an automation tool that assists air traffic controllers in the Terminal Radar Approach Control (TRACON) Facilities in providing safe and efficient sequencing and spacing of arrival traffic. The automation tool, referred to as the Final Approach Spacing Tool (FAST), allows the controller to interactively choose various levels of automation and advisory information ranging from predicted time errors to speed and heading advisories for controlling time error. FAST also uses a timeline to display current scheduling and sequencing information for all aircraft in the TRACON airspace. FAST combines accurate predictive algorithms and state-of-the-art mouse and graphical interface technology to present advisory information to the controller. Furthermore, FAST exchanges various types of traffic information and communicates with automation tools being developed for the Air Route Traffic Control Center. Thus it is part of an integrated traffic management system for arrival traffic at major terminal areas.
A discrete event simulation tool to support and predict hospital and clinic staffing.
DeRienzo, Christopher M; Shaw, Ryan J; Meanor, Phillip; Lada, Emily; Ferranti, Jeffrey; Tanaka, David
2017-06-01
We demonstrate how to develop a simulation tool to help healthcare managers and administrators predict and plan for staffing needs in a hospital neonatal intensive care unit using administrative data. We developed a discrete event simulation model of the nursing staff needed in a neonatal intensive care unit and then validated the model against historical data. The process flow was translated into a discrete event simulation model. Results demonstrated that the model can give a respectable estimate of annual admissions, transfers, and deaths under two different staffing levels. The discrete event simulation model can provide healthcare managers and administrators with (1) a valid method of modeling patient mix, patient acuity, staffing needs, and costs in the present state and (2) a forecast of how changes in a unit's staffing, referral patterns, or patient mix would affect the unit in a future state.
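A minimal sketch of the approach, using only the Python standard library: a heap-based discrete event loop that generates admissions and discharges and tracks the implied nursing need. The rates, nurse ratio, and single-acuity simplification are assumptions for illustration, not the published model.

```python
# Minimal discrete event simulation sketch of NICU census and nursing need.
import heapq, math, random

random.seed(1)
ADMIT_RATE = 2.1        # admissions per day (assumed)
MEAN_LOS = 18.0         # mean length of stay, days (assumed)
NURSE_RATIO = 3         # infants per nurse (assumed)
HORIZON = 365.0         # simulate one year

events = [(random.expovariate(ADMIT_RATE), "admit")]
census, admissions, peak_nurses = 0, 0, 0

while events:
    t, kind = heapq.heappop(events)
    if t > HORIZON:
        break
    if kind == "admit":
        admissions += 1
        census += 1
        # schedule this infant's discharge and the next admission
        heapq.heappush(events, (t + random.expovariate(1 / MEAN_LOS), "discharge"))
        heapq.heappush(events, (t + random.expovariate(ADMIT_RATE), "admit"))
    else:
        census -= 1
    peak_nurses = max(peak_nurses, math.ceil(census / NURSE_RATIO))

print(f"{admissions} admissions/yr, peak nurses needed: {peak_nurses}")
```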
Evaluating the Zebrafish Embryo Toxicity Test for Pesticide Hazard Screening
Given the numerous chemicals used in society, it is critical to develop tools for accurate and efficient evaluation of potential risks to human and ecological receptors. Fish embryo acute toxicity tests are one tool that has been shown to be highly predictive of standard, more reso...
Biological assessment is becoming an increasingly popular tool in the evaluation of stream ecosystem integrity. However, little progress has been made to date in developing tools to relate assessment results to specific stressors. This paper continues the investigation of the f...
Predictive Tools for Severe Dengue Conforming to World Health Organization 2009 Criteria
Carrasco, Luis R.; Leo, Yee Sin; Cook, Alex R.; Lee, Vernon J.; Thein, Tun L.; Go, Chi Jong; Lye, David C.
2014-01-01
Background: Dengue causes 50 million infections per year, posing a large disease and economic burden in tropical and subtropical regions. Only a proportion of dengue cases require hospitalization, and predictive tools to triage dengue patients at greater risk of complications may optimize usage of limited healthcare resources. For severe dengue (SD), as proposed by the World Health Organization (WHO) 2009 dengue guidelines, predictive tools are lacking. Methods: We undertook a retrospective study of adult dengue patients in Tan Tock Seng Hospital, Singapore, from 2006 to 2008. Demographic, clinical and laboratory variables at presentation from dengue polymerase chain reaction-positive and serology-positive patients were used to predict the development of SD after hospitalization using generalized linear models (GLMs). Principal findings: Predictive tools compatible with well-resourced and with resource-limited settings (the latter requiring no laboratory measurements) performed acceptably, with optimism-corrected specificities of 29% and 27% respectively at 90% sensitivity. Higher risk of SD was associated with female gender, lower than normal hematocrit level, abdominal distension, vomiting and fever on admission. Lower risk of SD was associated with older age (in a cohort with an interquartile range of 27-47 years of age), leucopenia and longer fever duration on admission. Among the warning signs proposed by WHO 2009, we found support for abdominal pain or tenderness and vomiting as predictors of combined forms of SD. Conclusions: The application of these predictive tools in the clinical setting may reduce unnecessary admissions by 19%, allowing the allocation of scarce public health resources to patients according to the severity of outcomes. PMID:25010515
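The reported operating point (90% sensitivity, 27-29% specificity) comes from fixing the decision threshold of a GLM at a high-sensitivity point on the ROC curve. A sketch of that step on synthetic data, assuming a logistic GLM:

```python
# Tune a logistic-regression triage tool's threshold to guarantee 90%
# sensitivity, then read off the implied specificity. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 5))            # presentation variables (synthetic)
p = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] - 2)))
y = rng.binomial(1, p)                 # 1 = progressed to severe dengue

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]

fpr, tpr, thresholds = roc_curve(y, scores)
i = np.argmax(tpr >= 0.90)             # first threshold reaching 90% sensitivity
print(f"threshold={thresholds[i]:.3f}, sensitivity={tpr[i]:.2f}, "
      f"specificity={1 - fpr[i]:.2f}")
```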
The Future of Air Traffic Management
NASA Technical Reports Server (NTRS)
Denery, Dallas G.; Erzberger, Heinz; Edwards, Thomas A. (Technical Monitor)
1998-01-01
A system for the control of terminal area traffic to improve productivity, referred to as the Center-TRACON Automation System (CTAS), is being developed at NASA's Ames Research Center under a joint program with the FAA. CTAS consists of a set of integrated tools that provide computer-generated advisories for en-route and terminal area controllers. The premise behind the design of CTAS has been that successful planning of traffic requires accurate trajectory prediction. Databases consisting of representative aircraft performance models, airline-preferred operational procedures and a three-dimensional wind model support the trajectory prediction. The research effort has been the design of a set of automation tools that use this trajectory prediction capability to assist controllers in the overall management of traffic. The first tool, the Traffic Management Advisor (TMA), provides the overall flow management between the en route and terminal areas. A second tool, the Final Approach Spacing Tool (FAST), provides terminal area controllers with sequence and runway advisories to allow optimal use of the runways. The TMA and FAST are now being used in daily operations at Dallas/Ft. Worth airport. Additional activities include the development of several other tools: 1) the En Route Descent Advisor, which assists the en route controller in issuing conflict-free descents and ascents; 2) the extension of FAST to include speed and heading advisories, and the Expedite Departure Path (EDP), which assists the terminal controller in the management of departures; and 3) the Collaborative Arrival Planner (CAP), which will assist the airlines in operational decision making. The purpose of this presentation is to review the CTAS concept and to present the results of recent field tests. The paper first discusses the overall concept and then the status of the individual tools.
Energy Economics of Farm Biogas in Cold Climates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pillay, Pragasen; Grimberg, Stefan; Powers, Susan E
Anaerobic digestion of farm and dairy waste has been shown to be capital intensive. One way to improve digester economics is to co-digest high-energy substrates together with the dairy manure. Cheese whey, for example, represents a high-energy substrate that is generated during cheese manufacture. There are currently no quantitative tools available that predict the performance of co-digestion farm systems. The goal of this project was to develop a mathematical tool that would (1) predict the impact of co-digestion and (2) determine the best use of the generated biogas for a cheese manufacturing plant. Two models were developed that separately could be used to meet both goals of the project. Given current pricing structures, the most economical use of the generated biogas at the cheese manufacturing plant was as a replacement for fuel oil to generate heat. The developed digester model accurately predicted the performance of 26 farm digesters operating in the northeastern U.S.
Corneal cell culture models: a tool to study corneal drug absorption.
Dey, Surajit
2011-05-01
In recent times, there has been an ever-increasing demand for ocular drugs to treat sight-threatening diseases such as glaucoma, age-related macular degeneration and diabetic retinopathy. As more drugs are developed, there is a great need to test the in vitro permeability of these drugs to predict their efficacy and bioavailability in vivo. Corneal cell culture models are the only tool that can predict drug absorption across ocular layers accurately and rapidly. Cell culture studies are also valuable in reducing the number of animals needed for in vivo studies, which can increase the cost of the drug development process. Currently, rabbit corneal cell culture models are used to predict human corneal absorption due to the difficulty of human corneal studies. More recently, a three-dimensional human corneal equivalent has been developed using three different cell types to mimic the human cornea. In the future, human corneal cell culture systems need to be developed for use as a standardized model for drug permeation.
Tools for beach health data management, data processing, and predictive model implementation
,
2013-01-01
This fact sheet describes utilities created for management of recreational waters to provide efficient data management, data aggregation, and predictive modeling, as well as a prototype geographic information system (GIS)-based tool for data visualization and summary. All of these utilities were developed to assist beach managers in making decisions to protect public health. The Environmental Data Discovery and Transformation (EnDDaT) Web service identifies, compiles, and sorts environmental data from a variety of sources that help to define climatic, hydrologic, and hydrodynamic characteristics, including multiple data sources within the U.S. Geological Survey and the National Oceanic and Atmospheric Administration. The Great Lakes Beach Health Database (GLBH-DB) and Web application was designed to provide a flexible input, export, and storage platform for beach water quality and sanitary survey monitoring data to complement beach monitoring programs within the Great Lakes. A real-time predictive modeling strategy was implemented by combining the capabilities of EnDDaT and the GLBH-DB for timely, automated prediction of beach water quality. The GIS-based tool was developed to map beaches based on their physical and biological characteristics, which was shared with multiple partners to provide concepts and information for future Web-accessible beach data outlets.
Souza, João Paulo; Oladapo, Olufemi T; Bohren, Meghan A; Mugerwa, Kidza; Fawole, Bukola; Moscovici, Leonardo; Alves, Domingos; Perdona, Gleici; Oliveira-Ciabati, Livia; Vogel, Joshua P; Tunçalp, Özge; Zhang, Jim; Hofmeyr, Justus; Bahl, Rajiv; Gülmezoglu, A Metin
2015-05-26
The partograph is currently the main tool available to support decision-making of health professionals during labour. However, the rate of appropriate use of the partograph is disappointingly low. Apart from limitations that are associated with partograph use, evidence of positive impact on labour-related health outcomes is lacking. The main goal of this study is to develop a Simplified, Effective, Labour Monitoring-to-Action (SELMA) tool. The primary objectives are: to identify the essential elements of intrapartum monitoring that trigger the decision to use interventions aimed at preventing poor labour outcomes; to develop a simplified, monitoring-to-action algorithm for labour management; and to compare the diagnostic performance of SELMA and partograph algorithms as tools to identify women who are likely to develop poor labour-related outcomes. A prospective cohort study will be conducted in eight health facilities in Nigeria and Uganda (four facilities from each country). All women admitted for vaginal birth will comprise the study population (estimated sample size: 7,812 women). Data will be collected on maternal characteristics on admission, labour events and pregnancy outcomes by trained research assistants at the participating health facilities. Prediction models will be developed to identify women at risk of intrapartum-related perinatal death or morbidity (primary outcomes) throughout the course of labour. These prediction models will be used to assemble a decision-support tool that will be able to suggest the best course of action to avert adverse outcomes during the course of labour. To develop this set of prediction models, we will use up-to-date techniques of prognostic research, including identification of important predictors, assignment of relative weights to each predictor, estimation of the predictive performance of the model through calibration and discrimination, and determination of its potential for application using internal validation techniques. This research offers an opportunity to revisit the theoretical basis of the partograph. It is envisioned that the final product would help providers overcome the challenging tasks of promptly interpreting complex labour information and deriving appropriate clinical actions, and thus increase efficiency of the care process, enhance providers' competence and ultimately improve labour outcomes. Please see related articles at http://dx.doi.org/10.1186/s12978-015-0027-6 and http://dx.doi.org/10.1186/s12978-015-0028-5.
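Of the prognostic-research steps listed, internal validation is the most mechanical; a sketch of Harrell-style bootstrap optimism correction of the c-statistic, on synthetic data standing in for the cohort:

```python
# Bootstrap optimism correction of the c-statistic (AUC) for a logistic
# prediction model. Data are synthetic stand-ins for labour predictors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 1500
X = rng.normal(size=(n, 4))                      # predictors (synthetic)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.5))))

apparent = roc_auc_score(y, LogisticRegression().fit(X, y).predict_proba(X)[:, 1])

optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)                  # bootstrap resample
    m = LogisticRegression().fit(X[idx], y[idx])
    boot_auc = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    orig_auc = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(boot_auc - orig_auc)         # how much the refit flatters itself

print(f"apparent c-statistic {apparent:.3f}, "
      f"optimism-corrected {apparent - np.mean(optimism):.3f}")
```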
[Development of a predictive program for microbial growth under various temperature conditions].
Fujikawa, Hiroshi; Yano, Kazuyoshi; Morozumi, Satoshi; Kimura, Bon; Fujii, Tateo
2006-12-01
A predictive program for microbial growth under various temperature conditions was developed with a mathematical model. The model was a new logistic model recently developed by us. The program predicts Escherichia coli growth in broth, Staphylococcus aureus growth and its enterotoxin production in milk, and Vibrio parahaemolyticus growth in broth under various temperature patterns. The program, which was built with Microsoft Excel (Visual Basic for Applications), is user-friendly; users can easily input the temperature history of a test food and obtain the prediction instantly on the computer screen. The predicted growth and toxin production can be important indices in determining whether a food is microbiologically safe or not. This program should be a useful tool to confirm the microbial safety of commercial foods.
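A minimal sketch of the program's input-output behaviour: integrate a logistic growth law over a piecewise temperature history. The published program uses the authors' extended logistic model with fitted parameters; here a plain logistic with an assumed Ratkowsky-type (square-root) secondary model stands in.

```python
# Predict log10 counts from a temperature history via logistic growth.
# All parameter values are illustrative assumptions.
import math

T_MIN, B = 4.0, 0.045          # assumed Ratkowsky parameters
N0, NMAX = 3.0, 9.0            # initial and maximum counts, log10 CFU/g

def mu_max(temp_c):
    """Specific growth rate (per h) as a function of temperature."""
    if temp_c <= T_MIN:
        return 0.0
    return (B * (temp_c - T_MIN)) ** 2

def predict(temperature_history, dt=0.1):
    """temperature_history: list of (hours, deg C) segments of the cold chain."""
    y = N0
    for hours, temp in temperature_history:
        for _ in range(int(hours / dt)):
            # logistic in log10 units: dy/dt = mu/ln10 * (1 - 10**(y - ymax))
            y += dt * mu_max(temp) / math.log(10) * (1 - 10 ** (y - NMAX))
    return y

# A food held 6 h at 10 degC, then 3 h at 25 degC during transport
print(f"predicted count: {predict([(6, 10.0), (3, 25.0)]):.2f} log10 CFU/g")
```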
Thyroid Cancer Risk Assessment Tool
The R package thyroid implements a risk prediction model developed by NCI researchers to calculate the absolute risk of developing a second primary thyroid cancer (SPTC) in individuals who were diagnosed with a cancer during their childhood.
Ensuring long-term utility of the AOP framework and knowledge for multiple stakeholders
1. Introduction There is a need to increase the development and implementation of predictive approaches to support chemical safety assessment. These predictive approaches feature generation of data from tools such as computational models, pathway-based in vitro assays, and short-t...
Optimal design of low-density SNP arrays for genomic prediction: algorithm and applications
USDA-ARS?s Scientific Manuscript database
Low-density (LD) single nucleotide polymorphism (SNP) arrays provide a cost-effective solution for genomic prediction and selection, but algorithms and computational tools are needed for their optimal design. A multiple-objective, local optimization (MOLO) algorithm was developed for design of optim...
Alyusuf, Raja H.; Prasad, Kameshwar; Abdel Satir, Ali M.; Abalkhail, Ali A.; Arora, Roopa K.
2013-01-01
Background: The exponential use of the internet as a learning resource, coupled with the varied quality of many websites, led to a need to identify suitable websites for teaching purposes. Aim: The aim of this study is to develop and validate a tool that evaluates the quality of undergraduate medical educational websites, and to apply it to the field of pathology. Methods: A tool was devised through several steps of item generation, reduction, weightage, pilot testing, post-pilot modification of the tool and validation. Tool validation included measurement of inter-observer reliability and generation of criterion-related, construct-related and content-related validity. The validated tool was subsequently tested by applying it to a population of pathology websites. Results and Discussion: Reliability testing showed high internal consistency (Cronbach's alpha = 0.92), high inter-observer reliability (Pearson's correlation r = 0.88), an intraclass correlation coefficient of 0.85 and κ = 0.75. The tool showed high criterion-related, construct-related and content-related validity. It showed moderately high concordance with the gold standard (κ = 0.61), with 92.2% sensitivity, 67.8% specificity, 75.6% positive predictive value and 88.9% negative predictive value. The validated tool was applied to 278 websites; 29.9% were rated as recommended, 41.0% as recommended with caution and 29.1% as not recommended. Conclusion: A systematic tool was devised to evaluate the quality of websites for medical educational purposes. The tool was shown to yield reliable and valid inferences through its application to pathology websites. PMID:24392243
Micro-Vibration Performance Prediction of SEPTA24 Using SMeSim (RUAG Space Mechanism Simulator Tool)
NASA Astrophysics Data System (ADS)
Omiciuolo, Manolo; Lang, Andreas; Wismer, Stefan; Barth, Stephan; Szekely, Gerhard
2013-09-01
Scientific space missions are placing ever more demanding performance requirements on their payloads. These performances can be dramatically restricted by micro-vibration loads generated by any moving parts of the satellite, including Solar Array Drive Assemblies. Micro-vibration prediction of SADAs is therefore very important to support their design and optimization in the early stages of a programme. The Space Mechanism Simulator (SMeSim) tool, developed by RUAG, enhances the capability of analysing the micro-vibration emissivity of a Solar Array Drive Assembly (SADA) under a specified set of boundary conditions. The tool is developed in the Matlab/Simulink® environment through a library of blocks simulating the different components a SADA is made of. The modular architecture of the blocks, assembled by the user, and the set-up of the boundary conditions allow time-domain and frequency-domain analyses of a rigid multi-body model with concentrated flexibilities and coupled electronic control of the mechanism. SMeSim is used to model the SEPTA24 Solar Array Drive Mechanism and predict its micro-vibration emissivity. SMeSim and the experience earned throughout its development and use can now support activities such as verification by analysis of micro-vibration emissivity requirements and design optimization to minimize the micro-vibration emissivity of a SADA.
Frasher, Sarah K; Woodruff, Tracy M; Bouldin, Jennifer L
2016-06-01
In efforts to reduce nonpoint source runoff and improve water quality, Best Management Practices (BMPs) were implemented in the Outlet Larkin Creek Watershed. Farmers need to make scientifically informed decisions concerning BMPs that address contaminants from agricultural fields. The BMP Tool was developed from previous studies to estimate BMP effectiveness at reducing nonpoint source contaminants. The purpose of this study was to validate the tool by comparing the measured percent reductions of dissolved phosphorus (DP) and total suspended solids with the percent reductions reported by the BMP Tool. Construction of a sedimentation pond resulted in a 74%-76% reduction in DP, compared with the 80% predicted by the BMP Tool. However, further research is needed to validate the tool for additional water quality parameters. The BMP Tool is recommended for future BMP implementation as a useful predictor for farmers.
BioFuelDB: a database and prediction server of enzymes involved in biofuels production.
Chaudhary, Nikhil; Gupta, Ankit; Gupta, Sudheer; Sharma, Vineet K
2017-01-01
In light of the rapid decrease in fossil fuel reserves and an increasing demand for energy, novel methods are required to explore alternative biofuel production processes to alleviate these pressures. A wide variety of molecules which can either be used as biofuels or as biofuel precursors are produced using microbial enzymes. However, the common challenges in the industrial implementation of enzyme catalysis for biofuel production are the unavailability of a comprehensive biofuel enzyme resource, the low efficiency of known enzymes, and the limited availability of enzymes which can function under the extreme conditions of industrial processes. We have developed a comprehensive database of known enzymes with proven or potential applications in biofuel production through text mining of PubMed abstracts and other publicly available information. A total of 131 enzymes with a role in biofuel production were identified and classified into six enzyme classes and four broad application categories, namely 'Alcohol production', 'Biodiesel production', 'Fuel Cell' and 'Alternate biofuels'. A prediction tool, 'Benz', was developed to identify and classify novel homologues of the known biofuel enzyme sequences from sequenced genomes and metagenomes. Benz employs a hybrid approach incorporating the HMMER 3.0 and RAPSearch2 programs to provide high accuracy and high speed of prediction. Using the Benz tool, 153,754 novel homologues of biofuel enzymes were identified from 23 diverse metagenomic sources. The comprehensive data of curated biofuel enzymes, their novel homologues identified from diverse metagenomes, and the hybrid prediction tool Benz are presented as a web server which can be used for the prediction of biofuel enzymes from genomic and metagenomic datasets. The database and the Benz tool are publicly available at http://metabiosys.iiserb.ac.in/biofueldb and http://metagenomics.iiserb.ac.in/biofueldb.
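A sketch of the HMMER half of such a hybrid scan, assuming profile HMMs for the biofuel enzyme families are already available; the file names are placeholders, and the RAPSearch2 similarity stage that Benz also applies is omitted here.

```python
# Run hmmsearch against profile HMMs of enzyme families and keep
# significant hits. Assumes HMMER 3.x is installed; paths are placeholders.
import subprocess

def scan(hmm_db="biofuel_enzymes.hmm", proteins="metagenome.faa",
         evalue=1e-5, out="hits.tbl"):
    subprocess.run(
        ["hmmsearch", "--tblout", out, "-E", str(evalue), hmm_db, proteins],
        check=True,
    )
    hits = []
    with open(out) as fh:
        for line in fh:
            if line.startswith("#"):
                continue
            fields = line.split()
            # target name, query (family) name, full-sequence E-value
            hits.append((fields[0], fields[2], float(fields[4])))
    return hits

for target, family, ev in scan():
    print(f"{target}\t{family}\t{ev:.2e}")
```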
User manual of the CATSS system (version 1.0) communication analysis tool for space station
NASA Technical Reports Server (NTRS)
Tsang, C. S.; Su, Y. T.; Lindsey, W. C.
1983-01-01
The Communication Analysis Tool for the Space Station (CATSS) is a FORTRAN language software package capable of predicting communication link performance for the Space Station (SS) communication and tracking (C & T) system. An interactive software package was developed to run on DEC/VAX computers. CATSS models and evaluates the various C & T links of the SS, including modulation schemes such as Binary Phase Shift Keying (BPSK), BPSK with Direct Sequence Spread Spectrum (PN/BPSK), and M-ary Frequency Shift Keying with Frequency Hopping (FH/MFSK). An optical space communication link is also included. CATSS is a C & T system engineering tool used to predict and analyze system performance for different link environments. Identification of system weaknesses is achieved through evaluation of performance with varying system parameters. System tradeoffs for different values of system parameters are made based on the performance prediction.
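A worked example of the kind of link calculation such a tool automates: the bit error rate of an ideal coherent BPSK link, Pb = Q(sqrt(2 Eb/N0)) = 0.5 erfc(sqrt(Eb/N0)). This is textbook BPSK theory, not CATSS code.

```python
# Ideal coherent BPSK bit error rate as a function of Eb/N0.
import math

def bpsk_ber(ebno_db):
    ebno = 10 ** (ebno_db / 10)          # dB -> linear
    return 0.5 * math.erfc(math.sqrt(ebno))

for ebno_db in (4, 6, 8, 10):
    print(f"Eb/N0 = {ebno_db:2d} dB -> BER = {bpsk_ber(ebno_db):.2e}")
```

Sweeping Eb/N0 this way, with losses and coding gains added to the link budget, is exactly the parameter-variation exercise the abstract describes for identifying system weaknesses.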
NASA Astrophysics Data System (ADS)
Afan, Haitham Abdulmohsin; El-shafie, Ahmed; Mohtar, Wan Hanna Melini Wan; Yaseen, Zaher Mundher
2016-10-01
An accurate model for sediment prediction is a priority for all hydrological researchers. Many conventional methods have shown an inability to achieve an accurate prediction of suspended sediment. These methods are unable to capture the behaviour of sediment transport in rivers due to the complexity, noise, non-stationarity, and dynamism of the sediment pattern. In the past two decades, Artificial Intelligence (AI) and computational approaches have become a remarkable tool for developing accurate models. These approaches are considered a powerful tool for solving any non-linear model, as they can deal easily with large amounts of data and sophisticated models. This paper is a review of all AI approaches that have been applied in sediment modelling. The current research focuses on the development of AI applications in sediment transport. In addition, the review identifies major challenges and opportunities for prospective research. Throughout the literature, complementary models have proved superior to classical modelling approaches.
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.
1995-01-01
NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.
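The "stress-intensity factors and look-up tables" end of that tool box reduces, in its simplest form, to integrating a crack-growth law cycle by cycle. A sketch with illustrative constants (not values from the NASA codes):

```python
# Integrate the Paris crack-growth law
#   da/dN = C * (dK)**m,  dK = Y * dsigma * sqrt(pi * a)
# to estimate fatigue life to a critical crack length.
import math

C, m = 1.0e-11, 3.0        # Paris constants (assumed; m/cycle, MPa*sqrt(m) units)
Y = 1.12                   # geometry factor, edge crack (assumed)
dsigma = 100.0             # stress range, MPa
a, a_crit = 1.0e-3, 2.0e-2 # initial and critical crack lengths, m

cycles, dN = 0, 1000
while a < a_crit:
    dK = Y * dsigma * math.sqrt(math.pi * a)   # stress-intensity factor range
    a += C * dK ** m * dN                      # explicit Euler step in N
    cycles += dN

print(f"estimated life: {cycles:,} cycles")
```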
Identifying gnostic predictors of the vaccine response
Haining, W. Nicholas; Pulendran, Bali
2012-01-01
Molecular predictors of the response to vaccination could transform vaccine development. They would allow larger numbers of vaccine candidates to be rapidly screened, shortening the development time for new vaccines. Gene-expression based predictors of vaccine response have shown early promise. However, a limitation of gene-expression based predictors is that they often fail to reveal the mechanistic basis for their ability to classify response. Linking predictive signatures to the function of their component genes would advance basic understanding of vaccine immunity and also improve the robustness of outcome classification. New analytic tools now allow more biological meaning to be extracted from predictive signatures. Functional genomic approaches to perturb gene expression in mammalian cells permit the function of predictive genes to be surveyed in highly parallel experiments. The challenge for vaccinologists is therefore to use these tools to embed mechanistic insights into predictors of vaccine response. PMID:22633886
Predicting Great Lakes fish yields: tools and constraints
Lewis, C.A.; Schupp, D.H.; Taylor, W.W.; Collins, J.J.; Hatch, Richard W.
1987-01-01
Prediction of yield is a critical component of fisheries management. The development of sound yield prediction methodology and the application of the results of yield prediction are central to the evolution of strategies to achieve stated goals for Great Lakes fisheries and to the measurement of progress toward those goals. Despite general availability of species yield models, yield prediction for many Great Lakes fisheries has been poor due to the instability of the fish communities and the inadequacy of available data. A host of biological, institutional, and societal factors constrain both the development of sound predictions and their application to management. Improved predictive capability requires increased stability of Great Lakes fisheries through rehabilitation of well-integrated communities, improvement of data collection, data standardization and information-sharing mechanisms, and further development of the methodology for yield prediction. Most important is the creation of a better-informed public that will in turn establish the political will to do what is required.
Web tools for predictive toxicology model building.
Jeliazkova, Nina
2012-07-01
The development and use of web tools in chemistry already has more than 15 years of history. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. These web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases presents new challenges for the management and analysis of large data sets. Web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8or. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
Development of a screening tool to predict malnutrition among children under two years old in Zambia
Hasegawa, Junko; Ito, Yoichi M; Yamauchi, Taro
2017-01-01
Background: Maternal and child undernutrition is an important issue, particularly in low- and middle-income countries. Children at high risk of malnutrition should be prioritized to receive the interventions necessary to minimize such risk. Several risk factors have been proposed; however, until now, there has been no appropriate evaluation method to identify these children. In sub-Saharan Africa, children commonly receive regular check-ups from community health workers, so a simple and easy nutrition assessment method is needed for use by semi-professional health workers. Objectives: The aim of this study was to develop and test a practical screening tool for community use in predicting growth stunting in children under two years in rural Zambia. Methods: Field research was conducted from July to August 2014 in Southern Province, Zambia. Two hundred and sixty-four mother-child pairs participated in the study. Anthropometric measurements were performed on all children and mothers, and all mothers were interviewed. Risk factors for the screening test were estimated using least absolute shrinkage and selection operator (LASSO) analysis. After re-evaluating all participants using the new screening tool, a receiver operating characteristic curve was drawn to set the cut-off value. Sensitivity and specificity were also calculated. Results: The screening tool included age, weight-for-age Z-score status, birth weight, feeding status, history of sibling death, multiple birth, and maternal education level. The total score ranged from 0 to 22, and the cut-off value was eight. Sensitivity and specificity were 0.963 and 0.697, respectively. Conclusions: A screening tool was developed to predict which children living in Zambia are at high risk of malnutrition. Further longitudinal studies are needed to confirm the test's validity in detecting future stunting and to investigate the effectiveness of malnutrition treatment. PMID:28730929
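A sketch of the final scoring-and-cutoff step on synthetic data: sum the integer risk points (0-22, as above) and choose the cut-off from the ROC curve. Youden's J is used below as one common criterion; the published item weights and the reasoning behind the cut-off of eight are in the paper.

```python
# Pick a risk-score cut-off from an ROC curve. Scores and outcomes synthetic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(7)
n = 264
score = rng.integers(0, 23, n)                    # total risk score per child
p_stunt = 1 / (1 + np.exp(-(score - 8) * 0.6))    # synthetic risk gradient
stunted = rng.binomial(1, p_stunt)

fpr, tpr, thresholds = roc_curve(stunted, score)
best = np.argmax(tpr - fpr)                       # Youden's J criterion
print(f"cutoff >= {thresholds[best]:.0f}: sensitivity {tpr[best]:.3f}, "
      f"specificity {1 - fpr[best]:.3f}")
```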
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
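As a minimal illustration of the core primitive this project accelerates, here is a random-walk Metropolis sampler inferring a single rate constant from noisy decay data. The project's actual targets (high-dimensional, PDE-constrained posteriors) are far beyond this sketch; all data and parameters below are invented.

```python
# Random-walk Metropolis for the posterior of a decay rate k,
# given observations y ~ exp(-k t) + noise.
import math, random

random.seed(0)
t_obs = [0.5, 1.0, 2.0, 4.0]
y_obs = [0.61, 0.37, 0.14, 0.02]     # noisy data, sigma assumed 0.02
sigma = 0.02

def log_post(k):
    if k <= 0:
        return -math.inf             # flat prior on k > 0
    return -sum((y - math.exp(-k * t)) ** 2
                for t, y in zip(t_obs, y_obs)) / (2 * sigma**2)

k, lp, samples = 1.0, log_post(1.0), []
for _ in range(20000):
    k_prop = k + random.gauss(0, 0.05)           # random-walk proposal
    lp_prop = log_post(k_prop)
    if math.log(random.random()) < lp_prop - lp: # Metropolis accept/reject
        k, lp = k_prop, lp_prop
    samples.append(k)

post = samples[5000:]                             # discard burn-in
mean = sum(post) / len(post)
sd = (sum((s - mean) ** 2 for s in post) / len(post)) ** 0.5
print(f"posterior k = {mean:.3f} +/- {sd:.3f}")
```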
Prostate Upgrading Team Project — EDRN Public Portal
Aim 1: We will develop a risk assessment tool using commonly collected clinical information from a series of contemporary radical prostatectomies to predict the risk of prostate cancer upgrading to high grade cancer at radical prostatectomy. These data will be combined, as part of our Early Detection Research Network (EDRN) GU Working Group, into a risk assessment tool named the EDRN Prostatectomy Upgrading Calculator (EPUC).
Flight Awareness Collaboration Tool Development
NASA Technical Reports Server (NTRS)
Mogford, Richard
2016-01-01
This is a PowerPoint presentation covering airline operations center (AOC) research. It reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. FACT should prove to be useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations.
Iida, Masahiro; Ikeda, Fumie; Hata, Jun; Hirakawa, Yoichiro; Ohara, Tomoyuki; Mukai, Naoko; Yoshida, Daigo; Yonemoto, Koji; Esaki, Motohiro; Kitazono, Takanari; Kiyohara, Yutaka; Ninomiya, Toshiharu
2018-05-01
There have been very few reports of risk score models for the development of gastric cancer. The aim of this study was to develop and validate a risk assessment tool for discerning future gastric cancer risk in Japanese. A total of 2444 subjects aged 40 years or over were followed up for 14 years from 1988 (derivation cohort), and 3204 subjects of the same age group were followed up for 5 years from 2002 (validation cohort). The weighting (risk score) of each risk factor for predicting future gastric cancer in the risk assessment tool was determined based on the coefficients of a Cox proportional hazards model in the derivation cohort. The goodness of fit of the established risk assessment tool was assessed using the c-statistic and the Hosmer-Lemeshow test in the validation cohort. During the follow-up, gastric cancer developed in 90 subjects in the derivation cohort and 35 subjects in the validation cohort. In the derivation cohort, the risk prediction model for gastric cancer was established using significant risk factors: age, sex, the combination of Helicobacter pylori antibody and pepsinogen status, hemoglobin A1c level, and smoking status. The incidence of gastric cancer increased significantly as the sum of risk scores increased (P trend < 0.001). The risk assessment tool was validated internally and showed good discrimination (c-statistic = 0.76) and calibration (Hosmer-Lemeshow test P = 0.43) in the validation cohort. We developed a risk assessment tool for gastric cancer that provides a useful guide for stratifying an individual's risk of future gastric cancer.
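A sketch of the standard construction for turning Cox model coefficients into the integer risk points such a tool sums. The coefficients and point values below are invented for illustration; the published score's actual weights are in the paper.

```python
# Convert Cox coefficients (log hazard ratios) into integer risk points by
# scaling against the smallest effect, then score a hypothetical patient.
betas = {                                # invented log hazard ratios
    "age_65_plus": 0.95,
    "male": 0.60,
    "hp_pos_pepsinogen_pos": 1.40,
    "hba1c_high": 0.45,
    "current_smoker": 0.55,
}

base = min(betas.values())               # smallest effect = 1 point
points = {k: round(b / base) for k, b in betas.items()}

patient = {"age_65_plus": 1, "male": 1, "hp_pos_pepsinogen_pos": 1,
           "hba1c_high": 0, "current_smoker": 1}
total = sum(points[k] * v for k, v in patient.items())
print(points, "total score:", total)
```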
Gruber, Andreas R; Bernhart, Stephan H; Lorenz, Ronny
2015-01-01
The ViennaRNA package is a widely used collection of programs for thermodynamic RNA secondary structure prediction. Over the years, many additional tools have been developed building on the core programs of the package to also address issues related to noncoding RNA detection, RNA folding kinetics, or efficient sequence design considering RNA-RNA hybridizations. The ViennaRNA web services provide easy and user-friendly web access to these tools. This chapter describes how to use this online platform to perform tasks such as prediction of minimum free energy structures, prediction of RNA-RNA hybrids, or noncoding RNA detection. The ViennaRNA web services can be used free of charge and can be accessed via http://rna.tbi.univie.ac.at.
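The same core programs are scriptable; for example, a minimum free energy prediction with the ViennaRNA Python bindings (assuming the package is installed locally) mirrors what the RNAfold web service returns.

```python
# Minimum free energy (MFE) secondary structure prediction with the
# ViennaRNA Python bindings; the sequence is a toy example.
import RNA

seq = "GCGCUUCGCCGCGCGGAAGCGC"
structure, mfe = RNA.fold(seq)           # MFE structure in dot-bracket form
print(f"{seq}\n{structure}  ({mfe:.2f} kcal/mol)")
```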
Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony
2018-01-01
This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. Each modeling tool itself had a low statistical significance, while weed species alone accounted for 69.1 and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones in the case of predicting the potential distribution of a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and testing both new and experienced users under blind test conditions that approximate operational conditions.
Fujikawa, Hiroshi; Kimura, Bon; Fujii, Tateo
2009-09-01
In this study, we developed a predictive program for Vibrio parahaemolyticus growth under various environmental conditions. Raw growth data were obtained with a V. parahaemolyticus O3:K6 strain cultured in broth at a variety of temperatures, pH values, and salt concentrations. Data were analyzed with our logistic model, and the parameter values of the model were described with polynomial equations of the environmental factors. A prediction program consisting of the growth model and the polynomial equations was then developed. After the range of the growth environments was modified, the program successfully predicted growth for all environments tested. The program could be a useful tool to ensure the bacteriological safety of seafood.
New Tool Released for Engine-Airframe Blade-Out Structural Simulations
NASA Technical Reports Server (NTRS)
Lawrence, Charles
2004-01-01
Researchers at the NASA Glenn Research Center have enhanced a general-purpose finite element code, NASTRAN, for engine-airframe structural simulations during steady-state and transient operating conditions. For steady-state simulations, the code can predict critical operating speeds, natural modes of vibration, and forced response (e.g., cabin noise and component fatigue). The code can be used to perform static analysis to predict engine-airframe response and component stresses due to maneuver loads. For transient response, the simulation code can be used to predict the response due to blade-out events and the subsequent engine shutdown and windmilling conditions. In addition, the code can be used as a pretest analysis tool to predict the results of the blade-out test required for FAA certification of new and derivative aircraft engines. Before the present analysis code was developed, all the major aircraft engine and airframe manufacturers in the United States and overseas were performing similar types of analyses to ensure the structural integrity of engine-airframe systems. Although there were many similarities among the analysis procedures, each manufacturer was developing and maintaining its own structural analysis capabilities independently. This situation led to high software development and maintenance costs, complications when manufacturers exchanged models and results, and limitations in predicting the structural response to the desired degree of accuracy. An industry-NASA team was formed to overcome these problems by developing a common analysis tool that would satisfy all the structural analysis needs of the industry and that would be available and supported by a commercial software vendor, relieving the team members of maintenance and development responsibilities. Input from all the team members was used to ensure that everyone's requirements were satisfied and that the best technology was incorporated into the code. Furthermore, because the code would be distributed by a commercial software vendor, it would be more readily available to engine and airframe manufacturers, as well as to non-aircraft companies that did not previously have access to this capability.
Marufu, Takawira C; Mannings, Alexa; Moppett, Iain K
2015-12-01
Accurate peri-operative risk prediction is an essential element of clinical practice. Various risk stratification tools for assessing patients' risk of mortality or morbidity have been developed and applied in clinical practice over the years. This review aims to outline the essential characteristics (predictive accuracy, objectivity, clinical utility) of currently available risk scoring tools for hip fracture patients. We searched eight databases (AMED, CINAHL, ClinicalTrials.gov, Cochrane, DARE, EMBASE, MEDLINE and Web of Science) for all relevant studies published until April 2015. We included published English-language observational studies that considered the predictive accuracy of risk stratification tools for patients with fragility hip fracture. After removal of duplicates, 15,620 studies were screened. Twenty-nine papers met the inclusion criteria, evaluating 25 risk stratification tools. The risk stratification tools considered in more than two studies were the ASA, CCI, E-PASS, NHFS and O-POSSUM. All of these tools were moderately accurate and validated in multiple studies; however, there are some limitations to consider. The E-PASS and O-POSSUM are comprehensive but complex, and require intraoperative data, making them a challenge to use at the patient bedside. The ASA, CCI and NHFS are simple, easy and inexpensive, using routinely available preoperative data. In contrast to the ASA and CCI, which include subjective variables in addition to other limitations, the NHFS variables are all objective. In the search for a simple, inexpensive, easy-to-calculate, objective and accurate tool, the NHFS may be the most appropriate of the currently available scores for hip fracture patients. However, more studies need to be undertaken before it becomes a national hip fracture risk stratification or audit tool of choice. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
XBeach-G: a tool for predicting gravel barrier response to extreme storm conditions
NASA Astrophysics Data System (ADS)
Masselink, Gerd; Poate, Tim; McCall, Robert; Roelvink, Dano; Russell, Paul; Davidson, Mark
2014-05-01
Gravel beaches protect low-lying back-barrier regions from flooding during storm events and their importance to society is widely acknowledged. Unfortunately, breaching and extensive storm damage have occurred at many gravel sites, and this is likely to increase as a result of sea-level rise and enhanced storminess due to climate change. Limited scientific guidance is currently available to provide beach managers with operational management tools to predict the response of gravel beaches to storms. The New Understanding and Prediction of Storm Impacts on Gravel beaches (NUPSIG) project aims to improve our understanding of storm impacts on gravel coastal environments and to develop a predictive capability by modelling these impacts. The NUPSIG project uses a five-pronged approach to address its aim: (1) analyse hydrodynamic data collected during a prototype laboratory experiment on a gravel beach; (2) collect hydrodynamic field data on a gravel beach under a range of conditions, including storm waves with wave heights up to 3 m; (3) measure swash dynamics and beach response on 10 gravel beaches during extreme wave conditions with wave heights in excess of 3 m; (4) use the data collected under (1)-(3) to develop and validate a numerical model of the hydrodynamics and morphological response of gravel beaches under storm conditions; and (5) develop a tool for end-users, based on the model formulated under (4), for predicting the storm response of gravel beaches and barriers. The aim of this presentation is to present the key results of the NUPSIG project and to introduce the end-user tool for predicting storm response on gravel beaches. The model is based on the numerical model XBeach, and different forcing scenarios (waves and tides), barrier configurations (dimensions) and sediment characteristics are easily uploaded for model simulations using a Graphical User Interface (GUI). The model can be used to determine the vulnerability of gravel barriers to storm events, but can also be used to help optimise design criteria for gravel barriers to reduce their vulnerability and enhance their coastal protection ability.
Fracture prediction and calibration of a Canadian FRAX® tool: a population-based report from CaMos
Fraser, L.-A.; Langsetmo, L.; Berger, C.; Ioannidis, G.; Goltzman, D.; Adachi, J. D.; Papaioannou, A.; Josse, R.; Kovacs, C. S.; Olszynski, W. P.; Towheed, T.; Hanley, D. A.; Kaiser, S. M.; Prior, J.; Jamal, S.; Kreiger, N.; Brown, J. P.; Johansson, H.; Oden, A.; McCloskey, E.; Kanis, J. A.
2016-01-01
Summary A new Canadian WHO fracture risk assessment (FRAX®) tool to predict 10-year fracture probability was compared with observed 10-year fracture outcomes in a large Canadian population-based study (CaMos). The Canadian FRAX tool showed good calibration and discrimination for both hip and major osteoporotic fractures. Introduction The purpose of this study was to validate a new Canadian WHO fracture risk assessment (FRAX®) tool in a prospective, population-based cohort, the Canadian Multi-centre Osteoporosis Study (CaMos). Methods A FRAX tool calibrated to the Canadian population was developed by the WHO Collaborating Centre for Metabolic Bone Diseases using national hip fracture and mortality data. Ten-year FRAX probabilities with and without bone mineral density (BMD) were derived for CaMos women (N=4,778) and men (N=1,919) and compared with observed fracture outcomes to 10 years (Kaplan–Meier method). Cox proportional hazard models were used to investigate the contribution of individual FRAX variables. Results Mean overall 10-year FRAX probability with BMD for major osteoporotic fractures was not significantly different from the observed value in men [predicted 5.4% vs. observed 6.4% (95%CI 5.2–7.5%)] and only slightly lower in women [predicted 10.8% vs. observed 12.0% (95%CI 11.0–12.9%)]. FRAX was well calibrated for hip fracture assessment in women [predicted 2.7% vs. observed 2.7% (95%CI 2.2–3.2%)] but underestimated risk in men [predicted 1.3% vs. observed 2.4% (95%CI 1.7–3.1%)]. FRAX with BMD showed better fracture discrimination than FRAX without BMD or BMD alone. Age, body mass index, prior fragility fracture and femoral neck BMD were significant independent predictors of major osteoporotic fractures; sex, age, prior fragility fracture and femoral neck BMD were significant independent predictors of hip fractures. Conclusion The Canadian FRAX tool provides predictions consistent with observed fracture rates in Canadian women and men, thereby providing a valuable tool for Canadian clinicians assessing patients at risk of fracture. PMID:21161508
Bigger Data, Collaborative Tools and the Future of Predictive Drug Discovery
Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.
2014-01-01
Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software as a service (SaaS) commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers, either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as they accumulate from high-throughput screening, while still enabling the user to draw insights, make predictions and move projects forward. We discuss how information from some drug discovery datasets can be made more accessible, and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use some examples from our own research on neglected diseases, collaborations, mobile apps and algorithm development to illustrate these ideas. PMID:24943138
In order to assess risk of contaminants to taxa with limited or no toxicity data available, Interspecies Correlation Estimation (ICE) models have been developed by the U.S. Environmental Protection Agency to extrapolate contaminant sensitivity predictions based on data from commo...
Applying online WEPP to assess forest watershed hydrology
USDA-ARS?s Scientific Manuscript database
The U.S. Army Corps of Engineers (USACE) and the Great Lakes Commission are developing technologies and predictive tools to aid in watershed management with an ultimate goal of improving and preserving the water quality in the Great Lakes Basin. A new version of the online Water Erosion Prediction P...
Solvation models, based on fundamental chemical structure theory, were developed in the SPARC mechanistic tool box to predict a large array of physical properties of organic compounds in water and in non-aqueous solvents strictly from molecular structure. The SPARC self-interact...
Study of high altitude plume impingement
NASA Technical Reports Server (NTRS)
Wojciechowski, C. J.; Penny, M. M.; Prozan, R. J.; Seymour, D.; Greenwood, T. F.
1972-01-01
Computer program has been developed as analytical tool to predict severity of effects of exhaust of rocket engines on adjacent spacecraft surfaces. Program computes forces, moments, pressures, and heating rates on surfaces immersed in or subjected to exhaust plume environments. Predictions will be useful in design of systems where such problems are anticipated.
USDA-ARS?s Scientific Manuscript database
Background/Question/Methods As the scientific and regulatory communities realize the significant environmental impacts and ubiquity of “contaminants of emerging concern” (CECs), it is increasingly imperative to develop quantitative assessment tools to evaluate and predict the fate and transport of...
Tools to aid post-wildfire assessment and erosion-mitigation treatment decisions
Peter R. Robichaud; Louise E. Ashmun
2013-01-01
A considerable investment in post-fire research over the past decade has improved our understanding of wildfire effects on soil, hydrology, erosion and erosion-mitigation treatment effectiveness. Using this new knowledge, we have developed several tools to assist land managers with post-wildfire assessment and treatment decisions, such as prediction models, research...
The Application of Function Points to Predict Source Lines of Code for Software Development
1992-09-01
there are some disadvantages. Software estimating tools are expensive; a single tool may cost more than $15,000 due to the high market value of the... term and Lang variables simultaneously only added marginal improvements over models with these terms included individually. Using all the available
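As a hedged illustration of the kind of model this report evaluates, the sketch below fits a log-linear regression of SLOC on function points; the data points are invented for demonstration, and the log-linear form is one common choice for size-estimation models rather than the report's specific formulation.

```python
# Sketch: a simple log-linear regression predicting source lines of code
# (SLOC) from function points; data values are illustrative only.
import numpy as np

function_points = np.array([120, 250, 400, 520, 780, 1100])
sloc = np.array([14000, 27000, 45000, 60000, 88000, 130000])

# Fit log(SLOC) = a + b*log(FP); polyfit returns [slope, intercept] for degree 1.
b, a = np.polyfit(np.log(function_points), np.log(sloc), 1)

def predict_sloc(fp: float) -> float:
    return float(np.exp(a) * fp ** b)

print(f"predicted SLOC for 600 FP: {predict_sloc(600):,.0f}")
```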
The National Academy of Science (NAS) recently recommended exploration of predictive tools, such as interspecies correlation estimation (ICE), to estimate acute toxicity values for listed species and support development of species sensitivity distributions (SSDs). We explored the...
ERIC Educational Resources Information Center
Jamil, Faiza M.; Sabol, Terri J.; Hamre, Bridget K.; Pianta, Robert C.
2015-01-01
Contemporary education reforms focus on assessing teachers' performance and developing selection mechanisms for hiring effective teachers. Tools that enable the prediction of teachers' classroom performance promote schools' ability to hire teachers more likely to be successful in the classroom. In addition, these assessment tools can be used for…
Pu, Xia; Ye, Yuanqing; Wu, Xifeng
2014-01-01
Despite the advances made in cancer management over the past few decades, improvements in cancer diagnosis and prognosis remain modest, highlighting the need for individualized strategies. Toward this goal, risk prediction models and molecular diagnostic tools have been developed, tailoring each step of risk assessment from diagnosis to treatment and clinical outcomes based on the individual's clinical, epidemiological, and molecular profiles. These approaches hold increasing promise for delivering a new paradigm to maximize the efficiency of cancer surveillance and the efficacy of treatment. However, they require stringent study design, methodology development, comprehensive assessment of biomarkers and risk factors, and extensive validation to ensure their overall usefulness for clinical translation. In the current study, the authors conducted a systematic review using breast cancer as an example and provide general guidelines for risk prediction models and molecular diagnostic tools, including development, assessment, and validation. © 2013 American Cancer Society.
El Hage Chehade, Hiba; Wazir, Umar; Mokbel, Kinan; Kasem, Abdul; Mokbel, Kefah
2018-01-01
Decision-making regarding adjuvant chemotherapy has been based on clinical and pathological features. However, such decisions are seldom consistent. Web-based predictive models have been developed using data from cancer registries to help determine the need for adjuvant therapy. More recently, with the recognition of the heterogenous nature of breast cancer, genomic assays have been developed to aid in the therapeutic decision-making. We have carried out a comprehensive literature review regarding online prognostication tools and genomic assays to assess whether online tools could be used as valid alternatives to genomic profiling in decision-making regarding adjuvant therapy in early breast cancer. Breast cancer has been recently recognized as a heterogenous disease based on variations in molecular characteristics. Online tools are valuable in guiding adjuvant treatment, especially in resource constrained countries. However, in the era of personalized therapy, molecular profiling appears to be superior in predicting clinical outcome and guiding therapy. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.P.; Burns, H.H.; Langton, C.
2013-07-01
The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste and Nuclear Materials Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to 100 years and longer for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox has produced tangible benefits to the DOE Performance Assessment (PA) community. A review of prior DOE PAs has provided a list of potential opportunities for improving cementitious barrier performance predictions through the use of the CBP software tools. These opportunities include: 1) the impact of atmospheric exposure of concrete and grout before closure, such as accelerated slag and Tc-99 oxidation; 2) prediction of changes in Kd/mobility as a function of time that result from changing pH and redox conditions; 3) concrete degradation from rebar corrosion due to carbonation; 4) early-age cracking from drying and/or thermal shrinkage; and 5) degradation due to sulfate attack. The CBP has already had the opportunity to provide near-term, tangible support to ongoing DOE-EM PAs such as the Savannah River Saltstone Disposal Facility (SDF) by providing a sulfate attack analysis that predicts the extent of damage that sulfate ingress will cause to the concrete vaults over extended time (i.e., > 1000 years). This analysis is one of the many technical opportunities in cementitious barrier performance that can be addressed by the DOE-EM sponsored CBP software tools. Modification of the existing tools can provide many opportunities to bring defense in depth to prediction of the performance of cementitious barriers over time.
A Machine Learning Approach to Predict Gene Regulatory Networks in Seed Development in Arabidopsis
Ni, Ying; Aghamirzaie, Delasa; Elmarakeby, Haitham; Collakova, Eva; Li, Song; Grene, Ruth; Heath, Lenwood S.
2016-01-01
Gene regulatory networks (GRNs) provide a representation of relationships between regulators and their target genes. Several methods for GRN inference, both unsupervised and supervised, have been developed to date. Because regulatory relationships consistently reprogram in diverse tissues or under different conditions, GRNs inferred without specific biological contexts are of limited applicability. In this report, a machine learning approach is presented to predict GRNs specific to developing Arabidopsis thaliana embryos. We developed the Beacon GRN inference tool to predict GRNs occurring during seed development in Arabidopsis based on a support vector machine (SVM) model. We developed both global and local inference models and compared their performance, demonstrating that local models are generally superior for our application. Using both the expression levels of the genes expressed in developing embryos and prior known regulatory relationships, GRNs were predicted for specific embryonic developmental stages. The targets that are strongly positively correlated with their regulators are mostly expressed at the beginning of seed development. Potential direct targets were identified based on a match between the promoter regions of these inferred targets and the cis elements recognized by specific regulators. Our analysis also provides evidence for previously unknown inhibitory effects of three positive regulators of gene expression. The Beacon GRN inference tool provides a valuable model system for context-specific GRN inference and is freely available at https://github.com/BeaconProjectAtVirginiaTech/beacon_network_inference.git. PMID:28066488
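A minimal sketch of the core idea, assuming synthetic data: an SVM classifier trained on concatenated regulator-target expression features to call regulatory edges. The feature construction here is greatly simplified relative to the published Beacon pipeline.

```python
# Sketch: SVM classification of regulator-target pairs from expression
# profiles; all data below are synthetic stand-ins for embryo expression.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_pairs, n_stages = 500, 8
reg_expr = rng.normal(size=(n_pairs, n_stages))   # regulator expression profiles
tgt_expr = rng.normal(size=(n_pairs, n_stages))   # candidate target profiles

# Label pairs by whether regulator and target profiles are positively
# correlated, a toy proxy for "known regulatory edge" training labels.
corr = np.array([np.corrcoef(r, t)[0, 1] for r, t in zip(reg_expr, tgt_expr)])
y = (corr > 0).astype(int)

X = np.hstack([reg_expr, tgt_expr])               # pair feature vector
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
scores = cross_val_score(clf, X, y, cv=5)
print(f"5-fold CV accuracy: {scores.mean():.2f}")
```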
The development of a probabilistic approach to forecast coastal change
Lentz, Erika E.; Hapke, Cheryl J.; Rosati, Julie D.; Wang, Ping; Roberts, Tiffany M.
2011-01-01
This study demonstrates the applicability of a Bayesian probabilistic model as an effective tool in predicting post-storm beach changes along sandy coastlines. Volume change and net shoreline movement are modeled for two study sites at Fire Island, New York in response to two extratropical storms in 2007 and 2009. Both study areas include modified areas adjacent to unmodified areas in morphologically different segments of coast. Predicted outcomes are evaluated against observed changes to test model accuracy and uncertainty along 163 cross-shore transects. Results show strong agreement in the cross validation of predictions vs. observations, with 70-82% accuracies reported. Although no consistent spatial pattern in inaccurate predictions could be determined, the highest prediction uncertainties appeared in locations that had been recently replenished. Further testing and model refinement are needed; however, these initial results show that Bayesian networks have the potential to serve as important decision-support tools in forecasting coastal change.
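The flavor of the approach can be shown with a toy discrete Bayesian update; the probability tables below are made up and merely stand in for the network trained on the Fire Island data.

```python
# Sketch: a toy discrete Bayesian update of post-storm beach response,
# P(erosion class | observed wave class), with illustrative probabilities.
priors = {"low_erosion": 0.5, "moderate": 0.3, "severe": 0.2}
likelihood = {  # P(high waves observed | erosion class), illustrative
    "low_erosion": 0.2,
    "moderate": 0.5,
    "severe": 0.9,
}

evidence = sum(priors[c] * likelihood[c] for c in priors)
posterior = {c: priors[c] * likelihood[c] / evidence for c in priors}
for c, p in posterior.items():
    print(f"P({c} | high waves) = {p:.2f}")
```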
An automated performance budget estimator: a process for use in instrumentation
NASA Astrophysics Data System (ADS)
Laporte, Philippe; Schnetler, Hermine; Rees, Phil
2016-08-01
Present-day astronomy projects continue to grow in size and complexity, regardless of the wavelength domain, while risks in terms of safety, cost and operability have to be reduced to ensure an affordable total cost of ownership. All of these drivers have to be considered carefully during the development of an astronomy project, at the same time as there is a strong drive to shorten the development life-cycle. From the systems engineering point of view, this evolution is a significant challenge. Big instruments imply managing interfaces within large consortia and dealing with tight design-phase schedules, which necessitates efficient and rapid interactions between all the stakeholders to ensure, firstly, that the system is defined correctly and, secondly, that the designs will meet all the requirements. It is essential that team members respond quickly so that the time available to the design team is maximised. In this context, performance prediction tools can be very helpful during the concept phase of a project to help select the best design solution. In the first section of this paper we present the development of such a prediction tool, which can be used by the systems engineer to determine the overall performance of the system and to evaluate the impact on the science of the proposed design. The tool can also be used in "what-if" design analyses to assess the impact on the overall performance of the system, based on the numbers calculated by the automated system performance prediction tool. Having such a tool available from the beginning of a project allows, firstly, a faster turn-around between the design engineers and the systems engineer and, secondly, between the systems engineer and the instrument scientist. We then describe the process for constructing a performance estimator tool and present three projects in which such a tool has been used, to illustrate its application in astronomy projects. The three use-cases are: EAGLE, one of the European Extremely Large Telescope (E-ELT) Multi-Object Spectrograph (MOS) instruments, studied from 2007 to 2009; the Multi-Object Optical and Near-Infrared Spectrograph (MOONS) for the European Southern Observatory's Very Large Telescope (VLT), currently under development; and SST-GATE.
Gowin, Joshua L; Ball, Tali M; Wittmann, Marc; Tapert, Susan F; Paulus, Martin P
2015-07-01
Nearly half of individuals with substance use disorders relapse in the year after treatment. A diagnostic tool to help clinicians make decisions regarding treatment does not exist for psychiatric conditions. Identifying individuals at high risk for relapse to substance use following abstinence has profound clinical consequences. This study aimed to develop neuroimaging as a robust tool to predict relapse. 68 methamphetamine-dependent adults (15 female) were recruited from 28-day inpatient treatment. During treatment, participants completed a functional MRI scan that examined brain activation during reward processing. Patients were followed 1 year later to assess abstinence. We examined brain activation during reward processing between relapsing and abstaining individuals and employed three random forest prediction models (clinical and personality measures, neuroimaging measures, and a combined model) to generate predictions for each participant regarding their relapse likelihood. 18 individuals relapsed. There were significant group by reward-size interactions for neural activation in the left insula and right striatum for rewards. Abstaining individuals showed increased activation for large, risky relative to small, safe rewards, whereas relapsing individuals failed to show differential activation between reward types. All three random forest models yielded good test characteristics, such that a positive test for relapse yielded a likelihood ratio of 2.63, whereas a negative test had a likelihood ratio of 0.48. These findings suggest that neuroimaging can be developed in combination with other measures as an instrument to predict relapse, advancing tools providers can use to make decisions about individualized treatment of substance use disorders. Published by Elsevier Ireland Ltd.
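A hedged sketch of the prediction setup: a random-forest classifier over combined features, followed by the test characteristics (positive and negative likelihood ratios) the study reports. All data here are synthetic, so the printed ratios will not match the paper's 2.63 and 0.48.

```python
# Sketch: random-forest relapse prediction and likelihood-ratio test
# characteristics on a synthetic stand-in for clinical + imaging features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(68, 12))                                    # combined features
y = (X[:, 0] + rng.normal(scale=1.0, size=68) > 0).astype(int)   # 1 = relapsed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

tp = int(np.sum((pred == 1) & (y_te == 1)))
fn = int(np.sum((pred == 0) & (y_te == 1)))
tn = int(np.sum((pred == 0) & (y_te == 0)))
fp = int(np.sum((pred == 1) & (y_te == 0)))
# Guards protect against empty cells in this tiny synthetic sample.
sens = tp / max(tp + fn, 1)
spec = tn / max(tn + fp, 1)
print(f"LR+ = {sens / max(1 - spec, 1e-9):.2f}")   # paper reports 2.63
print(f"LR- = {(1 - sens) / max(spec, 1e-9):.2f}") # paper reports 0.48
```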
Zorn, Kevin C; Gallina, Andrea; Hutterer, Georg C; Walz, Jochen; Shalhav, Arieh L; Zagaja, Gregory P; Valiquette, Luc; Gofrit, Ofer N; Orvieto, Marcelo A; Taxy, Jerome B; Karakiewicz, Pierre I
2007-11-01
Several staging tools have been developed for open radical prostatectomy (ORP) patients. However, the validity of these tools has never been formally tested in patients treated with robot-assisted laparoscopic radical prostatectomy (RALP). We tested the accuracy of an ORP-derived nomogram in predicting the rate of extracapsular extension (ECE) in a large RALP cohort. Serum prostate-specific antigen (PSA) and side-specific clinical stage and biopsy Gleason sum information were used in a previously validated nomogram predicting side-specific ECE. The nomogram-derived predictions were compared with the observed rate of ECE, and the accuracy of the predictions was quantified. Each prostate lobe was analyzed independently. As complete data were available for 576 patients, the analyses targeted 1152 prostate lobes. Median age and serum PSA concentration at radical prostatectomy were 60 years and 5.4 ng/mL, respectively. The majority of side-specific clinical stages were T1c (993; 86.2%). Most side-specific biopsy Gleason sums were 6 (572; 49.7%). The median side-specific percentages of positive cores and of cancer were, respectively, 20.0% and 5.0%. At final pathologic review, 107 patients (18.6%) had ECE, and side-specific ECE was present in 117 patients (20.3%). The nomogram was 89% accurate in the RALP cohort vs. 84% in the previously reported ORP validation. The ORP side-specific ECE nomogram is highly accurate in the RALP population, suggesting that predictive and possibly prognostic tools developed in ORP patients may be equally accurate in their RALP counterparts.
Prognostic and Prediction Tools in Bladder Cancer: A Comprehensive Review of the Literature.
Kluth, Luis A; Black, Peter C; Bochner, Bernard H; Catto, James; Lerner, Seth P; Stenzl, Arnulf; Sylvester, Richard; Vickers, Andrew J; Xylinas, Evanguelos; Shariat, Shahrokh F
2015-08-01
This review focuses on risk assessment and prediction tools for bladder cancer (BCa). To review the current knowledge on risk assessment and prediction tools to enhance clinical decision making and counseling of patients with BCa. A literature search in English was performed using PubMed in July 2013. Relevant risk assessment and prediction tools for BCa were selected. More than 1600 publications were retrieved. Special attention was given to studies that investigated the clinical benefit of a prediction tool. Most prediction tools for BCa focus on the prediction of disease recurrence and progression in non-muscle-invasive bladder cancer or disease recurrence and survival after radical cystectomy. Although these tools are helpful, recent prediction tools aim to address a specific clinical problem, such as the prediction of organ-confined disease and lymph node metastasis to help identify patients who might benefit from neoadjuvant chemotherapy. Although a large number of prediction tools have been reported in recent years, many of them lack external validation. Few studies have investigated the clinical utility of any given model as measured by its ability to improve clinical decision making. There is a need for novel biomarkers to improve the accuracy and utility of prediction tools for BCa. Decision tools hold the promise of facilitating the shared decision process, potentially improving clinical outcomes for BCa patients. Prediction models need external validation and assessment of clinical utility before they can be incorporated into routine clinical care. We looked at models that aim to predict outcomes for patients with bladder cancer (BCa). We found a large number of prediction models that hold the promise of facilitating treatment decisions for patients with BCa. However, many models are missing confirmation in a different patient cohort, and only a few studies have tested the clinical utility of any given model as measured by its ability to improve clinical decision making. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.
A unified approach for composite cost reporting and prediction in the ACT program
NASA Technical Reports Server (NTRS)
Freeman, W. Tom; Vosteen, Louis F.; Siddiqi, Shahid
1991-01-01
The Structures Technology Program Office (STPO) at NASA Langley Research Center has held two workshops with representatives from the commercial airframe companies to establish a plan for development of a standard cost reporting format and a cost prediction tool for conceptual and preliminary designers. This paper reviews the findings of the workshop representatives with a plan for implementation of their recommendations. The recommendations of the cost tracking and reporting committee will be implemented by reinstituting the collection of composite part fabrication data in a format similar to the DoD/NASA Structural Composites Fabrication Guide. The process of data collection will be automated by taking advantage of current technology with user friendly computer interfaces and electronic data transmission. Development of a conceptual and preliminary designers' cost prediction model will be initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state of the art preliminary design tools and computer aided design (CAD) programs is assessed.
Musite, a tool for global prediction of general and kinase-specific phosphorylation sites.
Gao, Jianjiong; Thelen, Jay J; Dunker, A Keith; Xu, Dong
2010-12-01
Reversible protein phosphorylation is one of the most pervasive post-translational modifications, regulating diverse cellular processes in various organisms. High throughput experimental studies using mass spectrometry have identified many phosphorylation sites, primarily from eukaryotes. However, the vast majority of phosphorylation sites remain undiscovered, even in well studied systems. Because mass spectrometry-based experimental approaches for identifying phosphorylation events are costly, time-consuming, and biased toward abundant proteins and proteotypic peptides, in silico prediction of phosphorylation sites is potentially a useful alternative strategy for whole proteome annotation. Because of various limitations, current phosphorylation site prediction tools were not well designed for comprehensive assessment of proteomes. Here, we present a novel software tool, Musite, specifically designed for large scale predictions of both general and kinase-specific phosphorylation sites. We collected phosphoproteomics data in multiple organisms from several reliable sources and used them to train prediction models by a comprehensive machine-learning approach that integrates local sequence similarities to known phosphorylation sites, protein disorder scores, and amino acid frequencies. Application of Musite on several proteomes yielded tens of thousands of phosphorylation site predictions at a high stringency level. Cross-validation tests show that Musite achieves some improvement over existing tools in predicting general phosphorylation sites, and it is at least comparable with those for predicting kinase-specific phosphorylation sites. In Musite V1.0, we have trained general prediction models for six organisms and kinase-specific prediction models for 13 kinases or kinase families. Although the current pretrained models were not correlated with any particular cellular conditions, Musite provides a unique functionality for training customized prediction models (including condition-specific models) from users' own data. In addition, with its easily extensible open source application programming interface, Musite is aimed at being an open platform for community-based development of machine learning-based phosphorylation site prediction applications. Musite is available at http://musite.sourceforge.net/.
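A minimal sketch of the feature-integration idea, assuming synthetic values for the three feature families the abstract names (similarity to known sites, disorder scores, amino-acid frequencies); the learner below is a generic stand-in, not Musite's actual training procedure.

```python
# Sketch: integrating three feature families into one phosphorylation-site
# classifier; all feature values and labels below are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(3)
n_sites = 400
sim_to_known = rng.uniform(0, 1, (n_sites, 1))   # similarity to known phospho-sites
disorder = rng.uniform(0, 1, (n_sites, 1))       # predicted disorder score
aa_freq = rng.dirichlet(np.ones(20), n_sites)    # windowed amino-acid frequencies

X = np.hstack([sim_to_known, disorder, aa_freq])
# Toy labels: sites with high similarity and disorder tend to be positive.
y = (sim_to_known[:, 0] + disorder[:, 0] + rng.normal(0, 0.3, n_sites) > 1).astype(int)

model = GradientBoostingClassifier().fit(X, y)
print(f"training accuracy: {model.score(X, y):.2f}")
```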
Predicting Networked Strategic Behavior via Machine Learning and Game Theory
2015-01-13
The funding for this project was used to develop basic models, methodology and algorithms for the application of machine learning and related tools to settings in which strategic behavior is central. Among the topics studied was the development of simple behavioral models explaining and predicting human subject behavior in networked strategic experiments from prior work. These included experiments in biased voting and networked trading, among others.
Surrogate Analysis and Index Developer (SAID) tool
Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.
2015-10-01
The regression models created in SAID can be used in utilities that have been developed to work with the USGS National Water Information System (NWIS) and for the USGS National Real-Time Water Quality (NRTWQ) Web site. The real-time dissemination of predicted SSC and prediction intervals for each time step has substantial potential to improve understanding of sediment-related water quality and associated engineering and ecological management decisions.
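The core of a SAID-style surrogate model can be illustrated with an ordinary least-squares fit and prediction intervals; the turbidity-SSC data below are synthetic, and the log-log form is an assumption for the sketch rather than SAID's prescribed model.

```python
# Sketch: a surrogate regression with prediction intervals, of the kind
# SAID builds for suspended-sediment concentration (SSC) from turbidity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
turbidity = rng.uniform(5, 500, 200)
log_ssc = 0.9 * np.log(turbidity) + rng.normal(0, 0.2, 200)   # synthetic log(SSC)

X = sm.add_constant(np.log(turbidity))
fit = sm.OLS(log_ssc, X).fit()

new_X = sm.add_constant(np.log(np.array([50.0, 200.0])))
pred = fit.get_prediction(new_X).summary_frame(alpha=0.05)
# Point prediction plus the 95% prediction interval for a new observation.
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])
```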
A critical assessment of topologically associating domain prediction tools
Dali, Rola
2017-01-01
Topologically associating domains (TADs) have been proposed to be the basic unit of chromosome folding and have been shown to play key roles in genome organization and gene regulation. Several different tools are available for TAD prediction, but their properties have never been thoroughly assessed. In this manuscript, we compare the output of seven different TAD prediction tools on two published Hi-C data sets. TAD predictions varied greatly between tools in number, size distribution and other biological properties. Assessed against a manual annotation of TADs, individual TAD boundary predictions were found to be quite reliable, but their assembly into complete TAD structures was much less so. In addition, many tools were sensitive to sequencing depth and resolution of the interaction frequency matrix. This manuscript provides users and designers of TAD prediction tools with information that will help guide the choice of tools and the interpretation of their predictions. PMID:28334773
Shah, Prakesh S; Ye, Xiang Y; Synnes, Anne; Rouvinez-Bouali, Nicole; Yee, Wendy; Lee, Shoo K
2012-03-01
To develop models and a graphical tool for predicting survival to discharge without major morbidity for infants with a gestational age (GA) at birth of 22-32 weeks using infant information at birth. Retrospective cohort study. Canadian Neonatal Network data for 2003-2008 were utilised. Neonates born between 22 and 32 weeks gestation admitted to neonatal intensive care units in Canada. Survival to discharge without major morbidity defined as survival without severe neurological injury (intraventricular haemorrhage grade 3 or 4 or periventricular leukomalacia), severe retinopathy (stage 3 or higher), necrotising enterocolitis (stage 2 or 3) or chronic lung disease. Of the 17 148 neonates who met the eligibility criteria, 65% survived without major morbidity. Sex and GA at birth were significant predictors. Birth weight (BW) had a significant but non-linear effect on survival without major morbidity. Although maternal information characteristics such as steroid use, improved the prediction of survival without major morbidity, sex, GA at birth and BW for GA predicted survival without major morbidity almost as accurately (area under the curve: 0.84). The graphical tool based on the models showed how the GA and BW for GA interact, to enable prediction of outcomes especially for small and large for GA infants. This graphical tool provides an improved and easily interpretable method to predict survival without major morbidity for very preterm infants at the time of birth. These curves are especially useful for small and large for GA infants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clement, T. Prabhakar; Barnett, Mark O.; Zheng, Chunmiao
DE-FG02-06ER64213: Development of Modeling Methods and Tools for Predicting Coupled Reactive Transport Processes in Porous Media at Multiple Scales. Investigators: T. Prabhakar Clement (PD/PI) and Mark O. Barnett (Auburn), Chunmiao Zheng (Univ. of Alabama), and Norman L. Jones (BYU). The objective of this project was to develop scalable modeling approaches for predicting the reactive transport of metal contaminants. We studied two contaminants, a radioactive cation [U(VI)] and a metal(loid) oxyanion system [As(III/V)], and investigated their interactions with two types of subsurface materials, iron and manganese oxyhydroxides. We also developed modeling methods for describing the experimental results. Overall, the project supported 25 researchers at three universities and produced 15 journal articles, 3 book chapters, 6 PhD dissertations and 6 MS theses. Three key journal articles are: 1) Jeppu et al., A scalable surface complexation modeling framework for predicting arsenate adsorption on goethite-coated sands, Environ. Eng. Sci., 27(2): 147-158, 2010. 2) Loganathan et al., Scaling of adsorption reactions: U(VI) experiments and modeling, Applied Geochemistry, 24(11): 2051-2060, 2009. 3) Phillippi et al., Theoretical solid/solution ratio effects on adsorption and transport: uranium (VI) and carbonate, Soil Sci. Soc. of America, 71: 329-335, 2007.
GPS-ARM: Computational Analysis of the APC/C Recognition Motif by Predicting D-Boxes and KEN-Boxes
Ren, Jian; Cao, Jun; Zhou, Yanhong; Yang, Qing; Xue, Yu
2012-01-01
Anaphase-promoting complex/cyclosome (APC/C), an E3 ubiquitin ligase incorporated with Cdh1 and/or Cdc20, recognizes and interacts with specific substrates and faithfully orchestrates the proper cell cycle events by targeting proteins for proteasomal degradation. Experimental identification of APC/C substrates is largely dependent on the discovery of APC/C recognition motifs, e.g., the D-box and KEN-box. Although a number of either stringent or loosely defined motifs have been proposed, these motif patterns are of only limited use due to their insufficient predictive power. We report the development of a novel GPS-ARM software package which is useful for the prediction of D-boxes and KEN-boxes in proteins. Using experimentally identified D-boxes and KEN-boxes as the training data sets, a previously developed GPS (Group-based Prediction System) algorithm was adopted. By extensive evaluation and comparison, the GPS-ARM performance was found to be much better than that obtained using simple motifs. With this powerful tool, we predicted 4,841 potential D-boxes in 3,832 proteins and 1,632 potential KEN-boxes in 1,403 proteins from H. sapiens, while further statistical analysis suggested that both the D-box and KEN-box proteins are involved in a broad spectrum of biological processes beyond the cell cycle. In addition, with the co-localization information, we predicted hundreds of mitosis-specific APC/C substrates with high confidence. As the first computational tool for the prediction of APC/C-mediated degradation, GPS-ARM provides candidate sites for further experimental investigation. The GPS-ARM is freely accessible for academic researchers at: http://arm.biocuckoo.org. PMID:22479614
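As a first-pass illustration only, the sketch below scans a sequence for loose consensus patterns (minimal D-box R-x-x-L and KEN-box); GPS-ARM itself scores candidates with the group-based prediction algorithm rather than plain pattern matching, precisely because simple motifs have limited predictive power.

```python
# Sketch: naive scanning for loose APC/C degron consensus motifs in a
# protein sequence; only a crude filter compared with GPS-ARM's scoring.
import re

def find_degrons(seq: str):
    hits = []
    for m in re.finditer(r"R..L", seq):   # minimal D-box consensus
        hits.append(("D-box", m.start(), m.group()))
    for m in re.finditer(r"KEN", seq):    # KEN-box
        hits.append(("KEN-box", m.start(), m.group()))
    return hits

demo = "MAGRTALSDPKENLVRQKLGAT"  # made-up sequence for demonstration
for kind, pos, motif in find_degrons(demo):
    print(f"{kind} at position {pos}: {motif}")
```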
Srinivasan, M; Shetty, N; Gadekari, S; Thunga, G; Rao, K; Kunhikatta, V
2017-07-01
Severity or mortality prediction of nosocomial pneumonia could aid in the effective triage of patients and assist physicians. To compare various severity assessment scoring systems for predicting intensive care unit (ICU) mortality in nosocomial pneumonia patients, a prospective cohort study was conducted in a tertiary care university-affiliated hospital in Manipal, India. One hundred patients admitted to the ICUs who developed nosocomial pneumonia more than 48 h after admission were included. The Nosocomial Pneumonia Mortality Prediction (NPMP) model, developed in our hospital, was compared with the Acute Physiology and Chronic Health Evaluation II (APACHE II), Mortality Probability Model II (MPM72 II), Simplified Acute Physiology Score II (SAPS II), Multiple Organ Dysfunction Score (MODS), Sequential Organ Failure Assessment (SOFA), Clinical Pulmonary Infection Score (CPIS) and Ventilator-Associated Pneumonia Predisposition, Insult, Response, Organ dysfunction (VAP-PIRO) scores. Data and clinical variables were collected on the day of pneumonia diagnosis. The outcome for the study was ICU mortality. The sensitivity and specificity of the various scoring systems were analysed by plotting receiver operating characteristic (ROC) curves and computing the area under the curve for each of the mortality prediction tools. NPMP, APACHE II, SAPS II, MPM72 II, SOFA, and VAP-PIRO were found to have similar and acceptable discrimination power as assessed by the area under the ROC curve. The AUC values for the above scores ranged from 0.735 to 0.762. CPIS and MODS showed the least discrimination. NPMP is a specific tool to predict mortality in nosocomial pneumonia and is comparable to other standard scores. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
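The discrimination comparison reduces to computing an AUC per scoring system against the mortality outcome; a minimal sketch with synthetic scores follows.

```python
# Sketch: comparing severity scores by ROC AUC against ICU mortality;
# the outcome and score values below are synthetic, not study data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
died = rng.integers(0, 2, 100)                    # ICU mortality indicator
scores = {
    "NPMP":      died * 2.0 + rng.normal(5, 2, 100),
    "APACHE II": died * 1.5 + rng.normal(15, 5, 100),
    "CPIS":      died * 0.2 + rng.normal(6, 2, 100),
}
for name, s in scores.items():
    print(f"{name}: AUC = {roc_auc_score(died, s):.3f}")
```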
An Empiric HIV Risk Scoring Tool to Predict HIV-1 Acquisition in African Women.
Balkus, Jennifer E; Brown, Elizabeth; Palanee, Thesla; Nair, Gonasagrie; Gafoor, Zakir; Zhang, Jingyang; Richardson, Barbra A; Chirenje, Zvavahera M; Marrazzo, Jeanne M; Baeten, Jared M
2016-07-01
To develop and validate an HIV risk assessment tool to predict HIV acquisition among African women. Data were analyzed from 3 randomized trials of biomedical HIV prevention interventions among African women (VOICE, HPTN 035, and FEM-PrEP). We implemented standard methods for the development of clinical prediction rules to generate a risk-scoring tool to predict HIV acquisition over the course of 1 year. Performance of the score was assessed through internal and external validations. The final risk score resulting from multivariable modeling included age, married/living with a partner, partner provides financial or material support, partner has other partners, alcohol use, detection of a curable sexually transmitted infection, and herpes simplex virus 2 serostatus. Point values for each factor ranged from 0 to 2, with a maximum possible total score of 11. Scores ≥5 were associated with HIV incidence >5 per 100 person-years and identified 91% of incident HIV infections from among only 64% of women. The area under the curve (AUC) for predictive ability of the score was 0.71 (95% confidence interval [CI]: 0.68 to 0.74), indicating good predictive ability. Risk score performance was generally similar with internal cross-validation (AUC = 0.69; 95% CI: 0.66 to 0.73) and external validation in HPTN 035 (AUC = 0.70; 95% CI: 0.65 to 0.75) and FEM-PrEP (AUC = 0.58; 95% CI: 0.51 to 0.65). A discrete set of characteristics that can be easily assessed in clinical and research settings was predictive of HIV acquisition over 1 year. The use of a validated risk score could improve efficiency of recruitment into HIV prevention research and inform scale-up of HIV prevention strategies in women at highest risk.
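A sketch of how such an additive point score is applied; the point assignments below are hypothetical placeholders (the paper reports factor-specific values of 0-2 summing to a maximum of 11, with scores >=5 flagged as high risk).

```python
# Sketch: applying an additive point-based HIV risk score. The point
# values are hypothetical; only the 0-2 range, the maximum of 11, and
# the >=5 threshold come from the paper.
POINTS = {
    "age_under_25": 2,
    "not_married_or_cohabiting": 1,
    "partner_provides_support": 1,
    "partner_has_other_partners": 2,
    "alcohol_use": 1,
    "curable_sti_detected": 2,
    "hsv2_seropositive": 2,
}  # hypothetical values summing to the stated maximum of 11

def risk_score(profile: dict) -> int:
    return sum(pts for factor, pts in POINTS.items() if profile.get(factor))

patient = {"age_under_25": True, "alcohol_use": True, "hsv2_seropositive": True}
score = risk_score(patient)
print(f"score = {score}; flagged high risk: {score >= 5}")
```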
A Modeling Tool for Household Biogas Burner Flame Port Design
NASA Astrophysics Data System (ADS)
Decker, Thomas J.
Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
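The port-sizing quantities the tool targets are straightforward to compute; a short sketch with illustrative dimensions (not the Lotus burner's actual geometry) follows.

```python
# Sketch: hydraulic diameter and port velocity for circular and
# rectangular flame ports; dimensions and flow rate are illustrative.
import math

def hydraulic_diameter(area: float, perimeter: float) -> float:
    return 4.0 * area / perimeter          # Dh = 4A/P

def port_velocity(flow_rate: float, n_ports: int, port_area: float) -> float:
    return flow_rate / (n_ports * port_area)

d = 0.003                                  # 3 mm circular port
a_circ = math.pi * d**2 / 4
print(f"circular Dh = {hydraulic_diameter(a_circ, math.pi * d) * 1e3:.2f} mm")

w, h = 0.008, 0.002                        # 8 mm x 2 mm rectangular port
a_rect = w * h
print(f"rectangular Dh = {hydraulic_diameter(a_rect, 2 * (w + h)) * 1e3:.2f} mm")
print(f"port velocity = {port_velocity(4e-4, 20, a_circ):.2f} m/s")
```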
NASA Astrophysics Data System (ADS)
Cheng, Jun; Gong, Yadong; Wang, Jinsheng
2013-11-01
The current research on micro-grinding mainly focuses on the optimal processing technology for different materials. However, the material removal mechanism in micro-grinding is the basis for achieving a high-quality processed surface. Therefore, a novel method for predicting surface roughness in micro-grinding of hard brittle materials, considering the protrusion topography of the micro-grinding tool grains, is proposed in this paper. The differences in material removal mechanism between the conventional grinding process and the micro-grinding process are analyzed. Topography characterization has been carried out on micro-grinding tools fabricated by electroplating. Models of grain density generation and grain interval are built, and a new model for predicting micro-grinding surface roughness is developed. In order to verify the precision and applicability of the proposed surface roughness prediction model, an orthogonal micro-grinding experiment on soda-lime glass is designed and conducted. A series of micro-machined surfaces of the brittle material with roughness from 78 nm to 0.98 μm is achieved. The experimental roughness results and the predicted roughness data are found to agree closely, and the component variable describing the size effects in the prediction model is calculated to be 1.5×10^7 by an inverse method based on the experimental results. The proposed model builds a distribution set to account for grain densities at different protrusion heights. Finally, the micro-grinding tools used in the experiment have been characterized based on this distribution set. It is concluded that there is close agreement between the surface predictions from the proposed model and the measurements from the experiments, demonstrating the effectiveness of the model. The novel method proposed here for predicting surface roughness in micro-grinding of hard brittle materials, considering the protrusion topography of micro-grinding tool grains, provides a theoretical basis and experimental reference for studying the material removal mechanism in micro-grinding of soda-lime glass.
An, Yi; Wang, Jiawei; Li, Chen; Leier, André; Marquez-Lago, Tatiana; Wilksch, Jonathan; Zhang, Yang; Webb, Geoffrey I; Song, Jiangning; Lithgow, Trevor
2018-01-01
Bacterial effector proteins secreted by various protein secretion systems play crucial roles in host-pathogen interactions. In this context, computational tools capable of accurately predicting effector proteins of the various types of bacterial secretion systems are highly desirable. Existing computational approaches use different machine learning (ML) techniques and heterogeneous features derived from protein sequences and/or structural information. These predictors differ not only in the ML methods used but also in the curated data sets, the feature selection and their prediction performance. Here, we provide a comprehensive survey and benchmarking of currently available tools for the prediction of effector proteins of bacterial type III, IV and VI secretion systems (T3SS, T4SS and T6SS, respectively). We review core algorithms, feature selection techniques, tool availability and applicability, and evaluate the prediction performance based on carefully curated independent test data sets. In an effort to improve predictive performance, we constructed three ensemble models based on ML algorithms by integrating the output of all individual predictors reviewed. Our benchmarks demonstrate that these ensemble models outperform all the reviewed tools for the prediction of effector proteins of T3SS and T4SS. The webserver of the proposed ensemble methods for T3SS and T4SS effector protein prediction is freely available at http://tbooster.erc.monash.edu/index.jsp. We anticipate that this survey will serve as a useful guide for interested users and that the new ensemble predictors will stimulate research into host-pathogen relationships and inspire the development of new bioinformatics tools for predicting effector proteins of T3SS, T4SS and T6SS. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
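One simple way to combine predictor outputs, shown as a sketch: a majority vote over the binary calls of individual tools. The paper's ensembles were built with ML algorithms over curated test sets, so this is only the simplest instance of the idea.

```python
# Sketch: majority-vote ensemble over the binary calls of individual
# effector-prediction tools; the call matrix below is illustrative.
import numpy as np

# Rows = proteins, columns = individual tools' calls (1 = predicted effector).
tool_calls = np.array([
    [1, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
])

ensemble_call = (tool_calls.mean(axis=1) >= 0.5).astype(int)
print(ensemble_call)   # [1 0 1]
```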
Gupta, Punkaj; Rettiganti, Mallikarjuna; Gossett, Jeffrey M; Daufeldt, Jennifer; Rice, Tom B; Wetzel, Randall C
2018-01-01
To create a novel tool to predict favorable neurologic outcomes during ICU stay among children with critical illness. Logistic regression models using adaptive lasso methodology were used to identify independent factors associated with favorable neurologic outcomes. A mixed-effects logistic regression model was used to create the final prediction model including all predictors selected from the lasso model. Model validation was performed using a 10-fold internal cross-validation approach. Virtual Pediatric Systems (VPS, LLC, Los Angeles, CA) database. Patients less than 18 years old admitted to one of the participating ICUs in the Virtual Pediatric Systems database were included (2009-2015). None. A total of 160,570 patients from 90 hospitals qualified for inclusion. Of these, 1,675 patients (1.04%) had a decline in Pediatric Cerebral Performance Category scale of at least 2 between ICU admission and ICU discharge (unfavorable neurologic outcome). The independent factors associated with unfavorable neurologic outcome included higher weight at ICU admission, higher Pediatric Index of Mortality-2 score at ICU admission, cardiac arrest, stroke, seizures, head/nonhead trauma, use of conventional mechanical ventilation and high-frequency oscillatory ventilation, prolonged ICU length of stay, and prolonged use of mechanical ventilation. The presence of a chromosomal anomaly, cardiac surgery, and utilization of nitric oxide were associated with favorable neurologic outcome. The final online prediction tool can be accessed at https://soipredictiontool.shinyapps.io/GNOScore/. In an internal validation sample, our model predicted favorable neurologic outcomes in 139,688 patients, compared with an observed 139,591 patients with favorable neurologic outcomes. The area under the receiver operating characteristic curve for the validation model was 0.90. This proposed prediction tool combines 20 risk factors into one probability to predict favorable neurologic outcome during ICU stay among children with critical illness. Future studies should seek external validation and improved discrimination of this prediction tool.
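A hedged sketch of adaptive-lasso-style selection for a binary outcome: an initial ridge fit supplies feature weights, and an L1-penalized logistic regression on the rescaled features does the selection. Data are synthetic; this approximates, rather than reproduces, the study's modeling.

```python
# Sketch: adaptive-lasso-style variable selection via feature reweighting;
# minimizing loss + lambda * sum(w_j * |beta_j|) is equivalent to a plain
# L1 fit on columns rescaled by 1/w_j. All data below are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
X = rng.normal(size=(2000, 10))
logit = 1.2 * X[:, 0] - 0.8 * X[:, 3]              # only two informative features
y = (rng.uniform(size=2000) < 1 / (1 + np.exp(-logit))).astype(int)

ridge = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, y)
w = 1.0 / (np.abs(ridge.coef_.ravel()) + 1e-6)     # adaptive weights
X_scaled = X / w                                   # rescale so L1 penalizes adaptively

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X_scaled, y)
selected = np.flatnonzero(lasso.coef_.ravel())
print(f"selected feature indices: {selected}")
```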
Finite element simulation of cutting grey iron HT250 by self-prepared Si3N4 ceramic insert
NASA Astrophysics Data System (ADS)
Wang, Bo; Wang, Li; Zhang, Enguang
2017-04-01
The finite element method is able to simulate and solve practical machining problems with the required accuracy and high reliability. In this paper, simulation models based on the material properties of the self-prepared Si3N4 insert and grey iron HT250 were created. Using these models, results for cutting force, cutting temperature and tool wear rate were obtained, and the tool wear mode was predicted from the cutting simulation. These approaches may develop into a new method for testing new cutting-tool materials, shortening the development cycle and reducing cost.
NASA Astrophysics Data System (ADS)
Kuttolamadom, Mathew Abraham
The objective of this research work is to create a comprehensive microstructural wear mechanism-based predictive model of tool wear in the tungsten carbide / Ti-6Al-4V machining tribosystem, and to develop a new topology characterization method for worn cutting tools in order to validate the model predictions. This is accomplished by blending first principle wear mechanism models using a weighting scheme derived from scanning electron microscopy (SEM) imaging and energy dispersive x-ray spectroscopy (EDS) analysis of tools worn under different operational conditions. In addition, the topology of worn tools is characterized through scanning by white light interferometry (WLI), and then application of an algorithm to stitch and solidify data sets to calculate the volume of the tool worn away. The methodology was to first combine and weight dominant microstructural wear mechanism models, to be able to effectively predict the tool volume worn away. Then, by developing a new metrology method for accurately quantifying the bulk-3D wear, the model-predicted wear was validated against worn tool volumes obtained from corresponding machining experiments. On analyzing worn crater faces using SEM/EDS, adhesion was found dominant at lower surface speeds, while dissolution wear dominated with increasing speeds -- this is in conformance with the lower relative surface speed requirement for micro welds to form and rupture, essentially defining the mechanical load limit of the tool material. It also conforms to the known dominance of high temperature-controlled wear mechanisms with increasing surface speed, which is known to exponentially increase temperatures especially when machining Ti-6Al-4V due to its low thermal conductivity. Thus, straight tungsten carbide wear when machining Ti-6Al-4V is mechanically-driven at low surface speeds and thermally-driven at high surface speeds. Further, at high surface speeds, craters were formed due to carbon diffusing to the tool surface and being carried away by the rubbing action of the chips -- this left behind a smooth crater surface predominantly of tungsten and cobalt as observed from EDS analysis. Also, at high surface speeds, carbon from the tool was found diffused into the adhered titanium layer to form a titanium carbide (TiC) boundary layer -- this was observed as instances of TiC build-up on the tool edge from EDS analysis. A complex wear mechanism interaction was thus observed, i.e., titanium adhered on top of an earlier worn out crater trough, additional carbon diffused into this adhered titanium layer to create a more stable boundary layer (which could limit diffusion-rates on saturation), and then all were further worn away by dissolution wear as temperatures increased. At low and medium feeds, notch discoloration was observed -- this was detected to be carbon from EDS analysis, suggesting that it was deposited from the edges of the passing chips. Mapping the dominant wear mechanisms showed the increasing dominance of dissolution wear relative to adhesion, with increasing grain size -- this is because a 13% larger sub-micron grain results in a larger surface area of cobalt exposed to chemical action. On the macro-scale, wear quantification through topology characterization elevated wear from a 1D to 3D concept. 
From this investigation, a second-order dependence of volumetric tool wear (VTW) and VTW rate on the material removal rate (MRR) emerged, suggesting that MRR is a more consistent wear-controlling factor than the traditionally used cutting speed. A predictive model for VTW was developed which showed its exponential dependence on the volume of workpiece stock removed. Both VTW and VTW rate were also found to depend on the cumulative wear already accumulated on the tool. Further, a ratio metric of stock material removed to tool volume lost is now possible as a tool-efficiency quantifier and energy-based productivity parameter; this ratio was found to depend inversely on MRR, which led to a more comprehensive definition of tool wear based on cutting-tool efficiency. (Abstract shortened by UMI.)
Smith, Brian J; Mezhir, James J
2014-10-01
Regional lymph node status has long been used as a dichotomous predictor of clinical outcomes in cancer patients. More recently, interest has turned to the prognostic utility of lymph node ratio (LNR), quantified as the proportion of positive nodes examined. However, statistical tools for the joint modeling of LNR and its effect on cancer survival are lacking. Data were obtained from the NCI SEER cancer registry on 6400 patients diagnosed with pancreatic ductal adenocarcinoma from 2004 to 2010 and who underwent radical oncologic resection. A novel Bayesian statistical approach was developed and applied to model simultaneously patients' true, but unobservable, LNR statuses and overall survival. New web development tools were then employed to create an interactive web application for individualized patient prediction. Histologic grade and T and M stages were important predictors of LNR status. Significant predictors of survival included age, gender, marital status, grade, histology, T and M stages, tumor size, and radiation therapy. LNR was found to have a highly significant, non-linear effect on survival. Furthermore, predictive performance of the survival model compared favorably to those from studies with more homogeneous patients and individualized predictors. We provide a new approach and tool set for the prediction of LNR and survival that are generally applicable to a host of cancer types, including breast, colon, melanoma, and stomach. Our methods are illustrated with the development of a validated model and web applications for the prediction of survival in a large set of pancreatic cancer patients. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Predictive Mining of Time Series Data
NASA Astrophysics Data System (ADS)
Java, A.; Perlman, E. S.
2002-05-01
All-sky monitors are a relatively new development in astronomy, and their data represent a largely untapped resource. Proper utilization of this resource could lead to important discoveries not only in the physics of variable objects, but in how one observes such objects. We discuss the development of a Java toolbox for astronomical time series data. Rather than using methods conventional in astronomy (e.g., power spectrum and cross-correlation analysis) we employ rule discovery techniques commonly used in analyzing stock-market data. By clustering patterns found within the data, rule discovery allows one to build predictive models, allowing one to forecast when a given event might occur or whether the occurrence of one event will trigger a second. We have tested the toolbox and accompanying display tool on datasets (representing several classes of objects) from the RXTE All Sky Monitor. We use these datasets to illustrate the methods and functionality of the toolbox. We have found predictive patterns in several ASM datasets. We also discuss problems faced in the development process, particularly the difficulties of dealing with discretized and irregularly sampled data. A possible application would be in scheduling target of opportunity observations where the astronomer wants to observe an object when a certain event or series of events occurs. By combining such a toolbox with an automatic, Java query tool which regularly gathers data on objects of interest, the astronomer or telescope operator could use the real-time datastream to efficiently predict the occurrence of (for example) a flare or other event. By combining the toolbox with dynamic time warping data-mining tools, one could predict events which may happen on variable time scales.
Analysis, prediction, and case studies of early-age cracking in bridge decks
NASA Astrophysics Data System (ADS)
ElSafty, Adel; Graeff, Matthew K.; El-Gharib, Georges; Abdel-Mohti, Ahmed; Mike Jackson, N.
2016-06-01
Early-age cracking can adversely affect the strength, serviceability, and durability of concrete bridge decks. Early age is defined as the period after final setting, during which concrete properties change rapidly. Many factors can cause early-age bridge deck cracking, including temperature change, hydration, plastic shrinkage, autogenous shrinkage, and drying shrinkage. The cracking may also increase the effect of freeze-thaw cycles and may lead to corrosion of reinforcement. This research paper presents an analysis of the causes and factors affecting early-age cracking. It also provides a tool developed to predict the likelihood and initiation of early-age cracking of concrete bridge decks. Understanding the concrete properties is essential so that the developed tool can accurately model the mechanisms contributing to the cracking of concrete bridge decks. The user interface of the implemented Excel program enables the user to input the properties of the concrete being monitored. The research study and the developed spreadsheet were used to comprehensively investigate the issue of concrete deck cracking. The spreadsheet is designed to be a user-friendly calculation tool for concrete mixture proportioning, temperature prediction, thermal analysis, and tensile cracking prediction. The study also reviews and makes recommendations on deck cracking, based mainly on the Florida Department of Transportation specifications and Structures Design Guidelines, and the bridge design manuals of other states. The results were also compared with those of other commercially available software programs that predict early-age cracking in concrete slabs, concrete pavement, and reinforced concrete bridge decks. The outcome of this study is a set of recommendations to limit the deck cracking problem and maintain a longer service life of bridges.
Model-based high-throughput design of ion exchange protein chromatography.
Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo
2016-08-12
This work describes the development of a model-based high-throughput design (MHD) tool for determining the operating space of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool comprises four sections: (1) prediction; high-throughput experimentation using (2) experiments in diluted conditions and (3) robotic automated liquid-handling workstations; and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine an operating parameter space that allows satisfactory purification of the protein of interest at the HPLC scale. Each section of the MHD tool is used to define the experimental procedures for the next section, thus avoiding unnecessary experimental work. We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, demonstrating its ability to strongly accelerate the early phases of process development. Copyright © 2016 Elsevier B.V. All rights reserved.
Modeling of fiber orientation in viscous fluid flow with application to self-compacting concrete
NASA Astrophysics Data System (ADS)
Kolařík, Filip; Patzák, Bořek
2013-10-01
In recent years, unconventional concrete reinforcement has grown in popularity. Fiber reinforcement in particular is widely used in high-performance concretes such as self-compacting concrete (SCC). The design of advanced tailor-made structures in SCC can take advantage of the anisotropic orientation of fibers. Tools for predicting fiber orientation can contribute to the design of tailor-made structures and allow the development of casting procedures that achieve the desired fiber distribution and orientation. This paper deals with the development and implementation of a suitable tool for predicting fiber orientation in a fluid based on knowledge of the velocity field. A statistical approach is employed: fiber orientation is described by a probability distribution of the fiber angle.
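For a prescribed velocity field, the statistical picture can be sketched by evolving an ensemble of fiber angles with Jeffery's equation in its slender-fiber limit and histogramming the result. The simple-shear flow and all numerical settings below are assumptions made for illustration; in the paper, the velocity field would come from the SCC flow solution rather than an analytic expression.

```python
# Sketch: fiber-angle probability distribution under simple shear, using the
# slender-fiber limit of Jeffery's equation, d(phi)/dt = -gamma_dot*sin(phi)^2.
import numpy as np

gamma_dot = 1.0                   # shear rate of the assumed velocity field
dt, n_steps = 0.01, 500
phi = rng_phi = np.random.uniform(-np.pi / 2, np.pi / 2, size=10_000)  # isotropic start

for _ in range(n_steps):
    phi += dt * (-gamma_dot * np.sin(phi) ** 2)
    # fibers are head-tail symmetric: wrap angles back into (-pi/2, pi/2]
    phi = (phi + np.pi / 2) % np.pi - np.pi / 2

hist, edges = np.histogram(phi, bins=36, range=(-np.pi / 2, np.pi / 2),
                           density=True)
peak = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
print(f"most probable fiber angle after shearing: {np.degrees(peak):.1f} deg")
```

As expected, the distribution concentrates near zero degrees, i.e. fibers align with the flow direction.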
Risk determination after an acute myocardial infarction: review of 3 clinical risk prediction tools.
Scruth, Elizabeth Ann; Page, Karen; Cheng, Eugene; Campbell, Michelle; Worrall-Carter, Linda
2012-01-01
The objective of the study was to provide comprehensive information for the clinical nurse specialist (CNS) on commonly used clinical prediction (risk assessment) tools used to estimate risk of a secondary cardiac or noncardiac event and mortality in patients undergoing primary percutaneous coronary intervention (PCI) for ST-elevation myocardial infarction (STEMI). The evolution and widespread adoption of primary PCI represent major advances in the treatment of acute myocardial infarction, specifically STEMI. The American College of Cardiology and the American Heart Association have recommended early risk stratification for patients presenting with acute coronary syndromes using several clinical risk scores to identify patients' mortality and secondary event risk after PCI. Clinical nurse specialists are integral to any performance improvement strategy. Their knowledge and understanding of clinical prediction tools will be essential in carrying out important assessment, identifying and managing risk in patients who have sustained a STEMI, and enhancing discharge education including counseling on medications and lifestyle changes. Over the past 2 decades, risk scores have been developed from clinical trials to facilitate risk assessment. There are several risk scores that can be used to determine in-hospital and short-term survival. This article critiques the most common tools: the Thrombolytic in Myocardial Infarction risk score, the Global Registry of Acute Coronary Events risk score, and the Controlled Abciximab and Device Investigation to Lower Late Angioplasty Complications risk score. The importance of incorporating risk screening assessment tools (that are important for clinical prediction models) to guide therapeutic management of patients should not be underestimated. The ability to forecast secondary risk after a STEMI will assist in determining which patients would require the most aggressive level of treatment and monitoring postintervention, including outpatient monitoring. With an increased awareness of specialist assessment tools, the CNS can play an important role in risk prevention and ongoing cardiovascular health promotion in patients diagnosed with STEMI. Knowledge of clinical prediction tools to estimate risk for mortality and risk of secondary events after PCI for acute coronary syndromes including STEMI is essential for the CNS in assisting with improving short- and long-term outcomes and for performance improvement strategies. The risk score assessment utilizing a collaborative approach with the multidisciplinary healthcare team provides for the development of a treatment plan including any invasive intervention strategy for the patient. Copyright © 2012 Wolters Kluwer Health | Lippincott Williams & Wilkins
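Tools of this family are, at heart, additive point scores. The sketch below shows the mechanics only; the criteria names and point weights are hypothetical placeholders, not the published coefficients of TIMI, GRACE, or CADILLAC.

```python
# Illustrative additive cardiac risk score (hypothetical criteria and weights;
# not the coefficients of any of the three tools reviewed above).
HYPOTHETICAL_WEIGHTS = {
    "age_over_75": 3,
    "diabetes_or_hypertension": 1,
    "systolic_bp_under_100": 3,
    "heart_rate_over_100": 2,
    "killip_class_2_to_4": 2,
    "anterior_stemi": 1,
}

def risk_score(findings):
    """Sum point weights for the findings present (a set of criterion names)."""
    return sum(HYPOTHETICAL_WEIGHTS[f] for f in findings)

def risk_band(score):
    if score >= 6:
        return "high"
    return "intermediate" if score >= 3 else "low"

patient = {"age_over_75", "killip_class_2_to_4"}
s = risk_score(patient)
print(f"score={s} -> {risk_band(s)} risk")   # score=5 -> intermediate risk
```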
Li, Fuyi; Li, Chen; Marquez-Lago, Tatiana T; Leier, André; Akutsu, Tatsuya; Purcell, Anthony W; Smith, A Ian; Lithgow, Trevor; Daly, Roger J; Song, Jiangning; Chou, Kuo-Chen
2018-06-27
Kinase-regulated phosphorylation is a ubiquitous type of post-translational modification (PTM) in both eukaryotic and prokaryotic cells. Phosphorylation plays fundamental roles in many signalling pathways and biological processes, such as protein degradation and protein-protein interactions. Experimental studies have revealed that signalling defects caused by aberrant phosphorylation are highly associated with a variety of human diseases, especially cancers. In light of this, a number of computational methods aiming to accurately predict protein kinase family-specific or kinase-specific phosphorylation sites have been established, thereby facilitating phosphoproteomic data analysis. In this work, we present Quokka, a novel bioinformatics tool that allows users to rapidly and accurately identify human kinase family-regulated phosphorylation sites. Quokka was developed by using a variety of sequence scoring functions combined with an optimized logistic regression algorithm. We evaluated Quokka based on well-prepared up-to-date benchmark and independent test datasets, curated from the Phospho.ELM and UniProt databases, respectively. The independent test demonstrates that Quokka improves the prediction performance compared with state-of-the-art computational tools for phosphorylation prediction. In summary, our tool provides users with high-quality predicted human phosphorylation sites for hypothesis generation and biological validation. The Quokka webserver and datasets are freely available at http://quokka.erc.monash.edu/. Supplementary data are available at Bioinformatics online.
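The general recipe — score a sequence window around a candidate site, then classify with logistic regression — can be sketched as follows. The one-hot features, the enriched motif, and the training data are synthetic toys invented here; they are not Quokka's curated scoring functions or benchmark sets.

```python
# Sketch of window-based phosphosite classification with logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

AA = "ACDEFGHIKLMNPQRSTVWY"

def one_hot_window(window):
    """Flatten a 7-residue window into a binary feature vector."""
    vec = np.zeros(len(window) * len(AA))
    for i, aa in enumerate(window):
        vec[i * len(AA) + AA.index(aa)] = 1.0
    return vec

rng = np.random.default_rng(0)

def sample(positive):
    """Toy data: positives are enriched for R three residues upstream."""
    w = rng.choice(list(AA), size=7)
    if positive and rng.random() < 0.8:
        w[0] = "R"
    w[3] = "S"            # the candidate phosphosite itself
    return "".join(w)

windows = [sample(True) for _ in range(200)] + [sample(False) for _ in range(200)]
labels = [1] * 200 + [0] * 200
X = np.array([one_hot_window(w) for w in windows])

clf = LogisticRegression(max_iter=1000).fit(X, labels)
print("P(phosphosite) for first training window:",
      clf.predict_proba(X[:1])[0, 1].round(2))
```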
Development of a fourth generation predictive capability maturity model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard Guy; Witkowski, Walter R.; Urbina, Angel
2013-09-01
The Predictive Capability Maturity Model (PCMM) is an expert elicitation tool designed to characterize and communicate the completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. The primary application of this tool at Sandia National Laboratories (SNL) has been for physics-based computational simulations in support of nuclear weapons applications. The two main goals of a PCMM evaluation are 1) the communication of computational simulation capability, accurately and transparently, and 2) the development of input for effective planning. As a result of the increasing importance of computational simulation to SNL's mission, the PCMM has evolved through multiple generations with the goal of providing more clarity, rigor, and completeness in its application. This report describes the approach used to develop the fourth generation of the PCMM.
PBPK Modeling - A Predictive, Eco-Friendly, Bio-Waiver Tool for Drug Research.
De, Baishakhi; Bhandari, Koushik; Mukherjee, Ranjan; Katakam, Prakash; Adiki, Shanta K; Gundamaraju, Rohit; Mitra, Analava
2017-01-01
The world has witnessed growing complexities in the disease scenario, influenced by drastic changes in the host-pathogen-environment triadic relation. Pharmaceutical R&Ds are in constant search of novel therapeutic entities to hasten the transition of drug molecules from lab bench to patient bedside. Extensive animal studies and human pharmacokinetics are still the "gold standard" in investigational new drug research and bio-equivalency studies. Apart from the cost, time, and ethical issues of animal experimentation, burning questions arise relating to ecological disturbances, environmental hazards, and biodiversity. Grave concerns arise when the adverse environmental outcomes of continued studies on one particular disease give rise to several other pathogenic agents, further complicating the overall scenario. Pharma R&Ds therefore face the challenge of developing bio-waiver protocols. Lead optimization, selection of drug candidates with favorable pharmacokinetics and pharmacodynamics, and toxicity assessment are vital steps in drug development. Simulation tools such as GastroPlus™, PK-Sim®, and Simcyp find application for this purpose. Advanced technologies like organ-on-a-chip or human-on-a-chip, where a 3D representation of human organs and systems can mimic the related processes and activities, thereby linking them to major features of human biology, can be successfully incorporated into the drug development toolbox. PBPK modeling provides a state-of-the-art alternative to animal experimentation. PBPK models can successfully bypass bio-equivalency studies, predict bioavailability and drug interactions and, when combined with in vitro-in vivo correlation, can be extrapolated to humans, thus serving as a bio-waiver. PBPK can serve as an eco-friendly, bio-waiver predictive tool in drug development. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
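As a deliberately minimal counterpart to the full PBPK packages named above, a one-compartment oral-absorption model illustrates the kind of mechanistic simulation involved. Every parameter value here is an assumed placeholder, and real PBPK models resolve many physiological compartments rather than one.

```python
# One-compartment oral PK model: first-order absorption and elimination.
import numpy as np

def simulate_oral_dose(dose_mg=100.0, ka=1.0, ke=0.2, F=0.6, Vd=50.0,
                       t_end=24.0, dt=0.01):
    """Euler integration of gut and central compartments.

    ka, ke -- absorption / elimination rate constants (1/h), assumed values
    F      -- fraction absorbed (bioavailability), assumed value
    Vd     -- volume of distribution (L), assumed value
    """
    t = np.arange(0.0, t_end, dt)
    gut, central = F * dose_mg, 0.0
    conc = []
    for _ in t:
        absorbed = ka * gut * dt
        eliminated = ke * central * dt
        gut -= absorbed
        central += absorbed - eliminated
        conc.append(central / Vd)
    conc = np.array(conc)
    auc = np.trapz(conc, t)          # exposure, mg*h/L
    return t[np.argmax(conc)], conc.max(), auc

tmax, cmax, auc = simulate_oral_dose()
print(f"Tmax={tmax:.1f} h, Cmax={cmax:.2f} mg/L, AUC={auc:.1f} mg*h/L")
```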
Integrated Wind Power Planning Tool
NASA Astrophysics Data System (ADS)
Rosgaard, Martin; Giebel, Gregor; Skov Nielsen, Torben; Hahmann, Andrea; Sørensen, Poul; Madsen, Henrik
2013-04-01
This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the title "Integrated Wind Power Planning Tool". The goal is to integrate a mesoscale numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long term power system planning for future wind farms as well as short term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. Using the state-of-the-art mesoscale NWP model Weather Research & Forecasting model (WRF), the forecast error is quantified as a function of the time scale involved. This task constitutes a preparatory study for later implementation of features accounting for NWP forecast errors in the Corwind code maintained by DTU Wind Energy - a long-term wind power planning tool. Within the framework of PSO 10464, research related to operational short term wind power prediction will be carried out, including a comparison of forecast quality at different mesoscale NWP model resolutions and development of a statistical wind power prediction tool taking input from WRF. The short term prediction part of the project is carried out in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. The integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies, and depending on the prevailing weather conditions defined by the WRF output. The output from the integrated short term prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision making problem to be solved. The high resolution of the WRF results loaded into the integrated prediction model will ensure that a high-accuracy data basis is available for use in the decision making process of the Danish transmission system operator. The need for high accuracy predictions will only increase over the next decade as Denmark approaches the goal of 50% wind power based electricity in 2025 from the current 20%.
Small Engine Technology (SET). Task 33: Airframe, Integration, and Community Noise Study
NASA Technical Reports Server (NTRS)
Lieber, Lys S.; Elkins, Daniel; Golub, Robert A. (Technical Monitor)
2002-01-01
Task Order 33 had four primary objectives as follows: (1) Identify and prioritize the airframe noise reduction technologies needed to accomplish the NASA Pillar goals for business and regional aircraft. (2) Develop a model to estimate the effect of jet shear layer refraction and attenuation of internally generated source noise of a turbofan engine on the aircraft system noise. (3) Determine the effect on community noise of source noise changes of a generic turbofan engine operating from sea level to 15,000 feet. (4) Support lateral attenuation experiments conducted by NASA Langley at Wallops Island, VA, by coordinating opportunities for Contractor Aircraft to participate as a noise source during the noise measurements. Noise data and noise prediction tools, including airframe noise codes, from the NASA Advanced Subsonic Technology (AST) program were applied to assess the current status of noise reduction technologies relative to the NASA pillar goals for regional and small business jet aircraft. In addition, the noise prediction tools were applied to evaluate the effectiveness of airframe-related noise reduction concepts developed in the AST program on reducing the aircraft system noise. The AST noise data and acoustic prediction tools used in this study were furnished by NASA.
Deep convolutional neural networks for pan-specific peptide-MHC class I binding prediction.
Han, Youngmahn; Kim, Dongsup
2017-12-28
Computational scanning of peptide candidates that bind to a specific major histocompatibility complex (MHC) can speed up the peptide-based vaccine development process and therefore various methods are being actively developed. Recently, machine-learning-based methods have generated successful results by training on large amounts of experimental data. However, many machine-learning-based methods are generally less sensitive in recognizing locally-clustered interactions, which can synergistically stabilize peptide binding. The deep convolutional neural network (DCNN) is a deep learning method inspired by the visual recognition process of the animal brain and is known to be able to capture meaningful local patterns from 2D images. Once the peptide-MHC interactions can be encoded into image-like array (ILA) data, a DCNN can be employed to build a predictive model for peptide-MHC binding prediction. In this study, we demonstrated that a DCNN is able to not only reliably predict peptide-MHC binding, but also sensitively detect locally-clustered interactions. Nonapeptide-HLA-A and -B binding data were encoded into ILA data. A DCNN, as a pan-specific prediction model, was trained on the ILA data. The DCNN showed higher performance than other prediction tools for the latest benchmark datasets, which consist of 43 datasets for 15 HLA-A alleles and 25 datasets for 10 HLA-B alleles. In particular, the DCNN outperformed other tools for alleles belonging to the HLA-A3 supertype. The F1 scores of the DCNN were 0.86, 0.94, and 0.67 for HLA-A*31:01, HLA-A*03:01, and HLA-A*68:01 alleles, respectively, which were significantly higher than those of other tools. We found that the DCNN was able to recognize locally-clustered interactions that could synergistically stabilize peptide binding. We developed ConvMHC, a web server to provide user-friendly web interfaces for peptide-MHC class I binding predictions using the DCNN. The ConvMHC web server is accessible at http://jumong.kaist.ac.kr:8080/convmhc . We developed a novel method for peptide-HLA-I binding predictions using a DCNN trained on ILA data that encode peptide binding data and demonstrated the reliable performance of the DCNN in nonapeptide binding predictions through independent evaluation on the latest IEDB benchmark datasets. Our approaches can be applied to characterize locally-clustered patterns in molecular interactions, such as protein/DNA, protein/RNA, and drug/protein interactions.
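The image-like-array idea can be sketched compactly: encode a nonapeptide as a 2D array of per-residue property channels so that a convolutional filter responds to locally-clustered patterns. The two property scales below (a Kyte-Doolittle hydropathy subset and a simple charge flag) and the hand-made kernel are illustrative assumptions, not the encoding chosen by the ConvMHC authors.

```python
# Sketch of an image-like array (ILA) encoding plus one convolutional filter.
import numpy as np
from scipy.signal import convolve2d

HYDROPATHY = {"A": 1.8, "R": -4.5, "N": -3.5, "L": 3.8, "K": -3.9,
              "F": 2.8, "Y": -1.3, "S": -0.8, "V": 4.2}   # subset, toy channel
CHARGE = {"A": 0, "R": 1, "N": 0, "L": 0, "K": 1, "F": 0,
          "Y": 0, "S": 0, "V": 0}

def encode(peptide):
    """Rows = property channels, columns = peptide positions 1..9."""
    return np.array([[HYDROPATHY[a] for a in peptide],
                     [CHARGE[a] for a in peptide]], dtype=float)

ila = encode("KVFRNSALY")
# a 2x3 filter crudely responding to a charged residue flanked by hydrophobics
kernel = np.array([[0.5, 1.0, 0.5],
                   [1.0, -1.0, 1.0]])
feature_map = convolve2d(ila, kernel, mode="valid")
print(feature_map.round(1))   # large values mark locally-clustered patterns
```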
Real-time micro-modelling of city evacuations
NASA Astrophysics Data System (ADS)
Löhner, Rainald; Haug, Eberhard; Zinggerling, Claudio; Oñate, Eugenio
2018-01-01
A methodology to integrate geographical information system (GIS) data with large-scale pedestrian simulations has been developed. Advances in automatic data acquisition and archiving from GIS databases, automatic input for pedestrian simulations, as well as scalable pedestrian simulation tools have made it possible to simulate pedestrians at the individual level for complete cities in real time. An example that simulates the evacuation of the city of Barcelona demonstrates that this is now possible. This is the first step towards a fully integrated crowd prediction and management tool that takes into account not only data gathered in real time from cameras, cell phones or other sensors, but also merges these with advanced simulation tools to predict the future state of the crowd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borges, Ronaldo C.; D'Auria, Francesco; Alvim, Antonio Carlos M.
2002-07-01
The Code with the capability of Internal Assessment of Uncertainty (CIAU) is a tool proposed by the 'Dipartimento di Ingegneria Meccanica, Nucleare e della Produzione (DIMNP)' of the University of Pisa. Other institutions, including the Brazilian nuclear regulatory body 'Comissao Nacional de Energia Nuclear', contributed to the development of the tool. The CIAU aims at providing the currently available Relap5/Mod3.2 system code with the integrated capability of performing not only relevant transient calculations but also the related estimates of uncertainty bands. The Uncertainty Methodology based on Accuracy Extrapolation (UMAE) is used to characterize the uncertainty in the prediction of system code calculations for light water reactors and is internally coupled with the above system code. Following an overview of the CIAU development, the present paper deals with the independent qualification of the tool. The qualification test is performed by estimating the uncertainty bands that should envelope the prediction of the Angra 1 NPP transient RES-11.99, originated by an inadvertent complete load rejection that caused a reactor scram when the unit was operating at 99% of nominal power. The current limitation of the 'error' database implemented into the CIAU prevented a final demonstration of the qualification. However, all the steps of the qualification process are demonstrated. (authors)
Challenges of NDE simulation tool validation, optimization, and utilization for composites
NASA Astrophysics Data System (ADS)
Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter
2016-02-01
Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.
A Probabilistic Approach to Model Update
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.
2001-01-01
Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. On rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.
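One common probabilistic formulation of such an update — not necessarily the exact PA machinery used in the paper — is a linearized Gauss-Markov (Kalman-style) step that balances prior parameter uncertainty against measurement noise. The sensitivity matrix, covariances, and residuals below are made-up stand-ins for the six deflection measurements and five parameters.

```python
# Linearized probabilistic parameter update for a model with 6 measurements
# and 5 parameters (all matrices are illustrative stand-ins).
import numpy as np

rng = np.random.default_rng(1)
S = rng.normal(size=(6, 5))          # sensitivity d(deflection)/d(parameter)
theta_prior = np.zeros(5)            # nominal parameter values (normalized)
P_prior = np.eye(5) * 0.5**2         # prior parameter covariance
R = np.eye(6) * 0.05**2              # measurement-noise covariance
residual = rng.normal(size=6) * 0.2  # measured minus nominal-model deflection

# Gauss-Markov / Kalman-style update of the parameter estimate
K = P_prior @ S.T @ np.linalg.inv(S @ P_prior @ S.T + R)
theta_post = theta_prior + K @ residual
P_post = (np.eye(5) - K @ S) @ P_prior

print("parameter shifts:", theta_post.round(3))
print("posterior std devs:", np.sqrt(np.diag(P_post)).round(3))
```

The posterior standard deviations shrink relative to the prior, mirroring the paper's observation that moderate parameter changes can significantly improve the system response.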
Wang, Yuan; Gao, Ying; Battsend, Munkhzul; Chen, Kexin; Lu, Wenli; Wang, Yaogang
2014-11-01
The optimal approach regarding breast cancer screening for Chinese women is unclear due to the relatively low incidence rate. A risk assessment tool may be useful for selecting high-risk subsets of the population for mammography screening in a low-incidence, resource-limited developing country. The odds ratios for six main risk factors of breast cancer were pooled using Review Manager after a systematic literature search. A health risk appraisal (HRA) model was developed to predict an individual's risk of developing breast cancer in the next 5 years from current age. The performance of this HRA model was assessed based on a first-round screening database. Estimated risk of breast cancer increased with age. Increases in the 5-year risk of developing breast cancer were found with the presence of any of the included risk factors. When individuals who had risk above the median risk (3.3‰) were selected from the validation database, the sensitivity was 60.0% and the specificity 47.8%. The unweighted area under the curve (AUC) was 0.64 (95% CI = 0.50-0.78). The risk-prediction model reported in this article is based on a combination of risk factors and shows good overall predictive power, but it is still weak at predicting which particular women will develop the disease. The current model would be greatly improved if more population-based prospective follow-up studies were used for validation.
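Because breast cancer is rare over a 5-year horizon, an HRA of this kind can treat pooled odds ratios approximately as relative risks and scale a baseline risk by the factors an individual reports. The sketch below shows that mechanism; the factor list and OR values are illustrative assumptions, not the pooled estimates from the paper's meta-analysis.

```python
# Sketch of a health-risk-appraisal calculation from pooled odds ratios.
BASELINE_5YR_RISK = 0.0033          # the median risk quoted in the abstract

ILLUSTRATIVE_ORS = {                # hypothetical pooled odds ratios
    "family_history": 2.1,
    "late_first_birth": 1.4,
    "benign_breast_disease": 1.6,
    "early_menarche": 1.2,
}

def five_year_risk(factors):
    """Scale the baseline by the OR of each factor present (rare-disease
    approximation: OR ~ relative risk)."""
    risk = BASELINE_5YR_RISK
    for f in factors:
        risk *= ILLUSTRATIVE_ORS[f]
    return risk

r = five_year_risk({"family_history", "benign_breast_disease"})
print(f"estimated 5-year risk: {r:.2%} (screen if above the 0.33% median)")
```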
Surrogate analysis and index developer (SAID) tool and real-time data dissemination utilities
Domanski, Marian M.; Straub, Timothy D.; Wood, Molly S.; Landers, Mark N.; Wall, Gary R.; Brady, Steven J.
2015-01-01
The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Critical to advancing the operational use of surrogates are tools to process and evaluate the data along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research, and on surrogate monitoring sites currently in operation. The Surrogate Analysis and Index Developer (SAID) standalone tool, under development by the U.S. Geological Survey (USGS), assists in the creation of regression models that relate response and explanatory variables by providing visual and quantitative diagnostics to the user. SAID also processes acoustic parameters to be used as explanatory variables for suspended-sediment concentrations. The sediment acoustic method utilizes acoustic parameters from fixed-mount stationary equipment. The background theory and method used by the tool have been described in recent publications, and the tool also serves to support sediment-acoustic-index methods being drafted by the multi-agency Sediment Acoustic Leadership Team (SALT), and other surrogate guidelines like USGS Techniques and Methods 3-C4 for turbidity and SSC. The regression models in SAID can be used in utilities that have been developed to work with the USGS National Water Information System (NWIS) and for the USGS National Real-Time Water Quality (NRTWQ) Web site. The real-time dissemination of predicted SSC and prediction intervals for each time step has substantial potential to improve understanding of sediment-related water-quality and associated engineering and ecological management decisions.
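At its core, the index-development step is the fitting of a rating curve between a surrogate and log-transformed SSC. The compact sketch below uses synthetic data and a single explanatory variable; SAID itself supports multiple explanatory variables and far richer diagnostics.

```python
# Sketch of a sediment-acoustic surrogate rating: regress log10(SSC) on
# backscatter, then invert the model for real-time concentration estimates.
import numpy as np

rng = np.random.default_rng(2)
backscatter_db = rng.uniform(60, 90, 40)                        # explanatory variable
log_ssc = 0.05 * backscatter_db - 1.5 + rng.normal(0, 0.1, 40)  # synthetic "observed"

slope, intercept = np.polyfit(backscatter_db, log_ssc, 1)

def predict_ssc(b_db):
    """Real-time SSC estimate (mg/L) from a new backscatter reading."""
    return 10 ** (slope * b_db + intercept)

print(f"SSC at 75 dB: {predict_ssc(75.0):.0f} mg/L")
```

A log-linear form is typical for such ratings because concentration spans orders of magnitude; retransformation-bias corrections, which SAID also addresses, are omitted here for brevity.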
Sousa, Bruno
2013-01-01
Objective To translate into Portuguese and evaluate the measuring properties of the Sunderland Scale and the Cubbin & Jackson Revised Scale, which are instruments for evaluating the risk of developing pressure ulcers during intensive care. Methods This study included the process of translation and adaptation of the scales to the Portuguese language, as well as the validation of these tools. To assess reliability, Cronbach alpha values of 0.702 and 0.708 were identified for the Sunderland Scale and the Cubbin & Jackson Revised Scale, respectively. Predictive validation was performed against the Braden Scale (gold standard), and the main measurements evaluated were sensitivity, specificity, positive predictive value, negative predictive value, and area under the curve, which were calculated based on cutoff points. Results The Sunderland Scale exhibited 60% sensitivity, 86.7% specificity, 47.4% positive predictive value, 91.5% negative predictive value, and 0.86 for the area under the curve. The Cubbin & Jackson Revised Scale exhibited 73.3% sensitivity, 86.7% specificity, 52.4% positive predictive value, 94.2% negative predictive value, and 0.91 for the area under the curve. The Braden scale exhibited 100% sensitivity, 5.3% specificity, 17.4% positive predictive value, 100% negative predictive value, and 0.72 for the area under the curve. Conclusions Both tools demonstrated reliability and validity for this sample. The Cubbin & Jackson Revised Scale yielded better predictive values for the development of pressure ulcers during intensive care. PMID:23917975
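All of the quoted validation statistics derive directly from a 2x2 confusion table. The helper below reproduces the formulas; the counts are hypothetical, chosen only to exercise the arithmetic, not reconstructed from the study.

```python
# Diagnostic screening metrics from a 2x2 confusion table.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# hypothetical counts: 30 true positives, 10 false positives,
# 10 false negatives, 50 true negatives
for name, value in diagnostic_metrics(tp=30, fp=10, fn=10, tn=50).items():
    print(f"{name}: {value:.1%}")
```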
Software Tools to Support Research on Airport Departure Planning
NASA Technical Reports Server (NTRS)
Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul
2003-01-01
A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
Richardson, Philip; Greenslade, Jaimi; Shanmugathasan, Sulochana; Doucet, Katherine; Widdicombe, Neil; Chu, Kevin; Brown, Anthony
2015-01-01
CARING is a screening tool developed to identify patients who have a high likelihood of death in 1 year. This study sought to validate a modified CARING tool (termed PREDICT) using a population of patients presenting to the Emergency Department. In total, 1000 patients aged over 55 years who were admitted to hospital via the Emergency Department between January and June 2009 were eligible for inclusion in this study. Data on the six prognostic indicators comprising PREDICT were obtained retrospectively from patient records. One-year mortality data were obtained from the State Death Registry. Weights were applied to each PREDICT criterion, and its final score ranged from 0 to 44. Receiver operating characteristic analyses and diagnostic accuracy statistics were used to assess the accuracy of PREDICT in identifying 1-year mortality. The sample comprised 976 patients with a median (interquartile range) age of 71 years (62-81 years) and a 1-year mortality of 23.4%. In total, 50% had ≥1 PREDICT criteria with a 1-year mortality of 40.4%. Receiver operating characteristic analysis gave an area under the curve of 0.86 (95% confidence interval: 0.83-0.89). Using a cut-off of 13 points, PREDICT had a 95.3% (95% confidence interval: 93.6-96.6) specificity and 53.9% (95% confidence interval: 47.5-60.3) sensitivity for predicting 1-year mortality. PREDICT was simpler than the CARING criteria and identified 158 patients per 1000 admitted who could benefit from advance care planning. PREDICT was successfully applied to the Australian healthcare system with findings similar to the original CARING study conducted in the United States. This tool could improve end-of-life care by identifying who should have advance care planning or an advance healthcare directive. © The Author(s) 2014.
Challenges of NDE Simulation Tool Validation, Optimization, and Utilization for Composites
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.
2015-01-01
Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper will also discuss examples of the use of simulation tools at NASA to develop new damage characterization methods, and associated challenges of validating those methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Fifield, Leonard S.; Gandhi, Umesh N.
This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, in collaboration with Toyota and Magna, PNNL developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team has then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost including the material cost for making the equivalent part in steel has been determined and compared to the costs for making the LCF/PA66 part to determine the cost per "saved" pound.
BASINs and WEPP Climate Assessment Tools (CAT): Case ...
EPA announced the release of the final report, BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & Non-point Sources (BASINS) and the Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT). The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments of the potential effects of climate change on streamflow and water quality. This report presents a series of short, illustrative case studies using the BASINS and WEPP climate assessment tools.
Validation of behave fire behavior predictions in oak savannas
Grabner, Keith W.; Dwyer, John; Cutter, Bruce E.
1997-01-01
Prescribed fire is a valuable tool in the restoration and management of oak savannas. BEHAVE, a fire behavior prediction system developed by the United States Forest Service, can be a useful tool when managing oak savannas with prescribed fire. BEHAVE predictions of fire rate-of-spread and flame length were validated using four standardized fuel models: Fuel Model 1 (short grass), Fuel Model 2 (timber and grass), Fuel Model 3 (tall grass), and Fuel Model 9 (hardwood litter). Also, a customized oak savanna fuel model (COSFM) was created and validated. Results indicate that standardized fuel model 2 and the COSFM reliably estimate mean rate-of-spread (MROS). The COSFM did not appreciably reduce MROS variation when compared to fuel model 2. Fuel models 1, 3, and 9 did not reliably predict MROS. Neither the standardized fuel models nor the COSFM adequately predicted flame lengths. We concluded that standardized fuel model 2 should be used with BEHAVE when predicting fire rates-of-spread in established oak savannas.
NASA Astrophysics Data System (ADS)
Tang, Arthur; Lee, Wing C.; St. Pierre, Shawn; He, Jeanne; Liu, Kesu; Chen, Chin C.
2005-08-01
Over the past two decades, Computer Aided Engineering (CAE) tools have emerged as some of the most important engineering tools in various industries, owing to their flexibility and accuracy in prediction. Nowadays, CAE tools are widely used in the sheet metal forming industry to predict the forming feasibility of a wide variety of complex components, ranging from aerospace and automotive components to household products. As the demand for CAE-based formability analysis accelerates, the need for a robust and streamlined die face engineering tool becomes more crucial, especially in the early stage when the tooling layout is not yet available but a product design decision must be made. The ability to generate blank, binder, and addendum surfaces with an appropriate layout of drawbeads, punch opening lines, and trim lines is the primary function of a CAE-based die face engineering tool. Once the die face layout is ready, a formability study should follow to verify that the layout is adequate to produce a formable part. If successful, the established die face surface should be exported back to the CAD/CAM environment to speed up the tooling and manufacturing design process, with confidence that this particular part is formable with this given die face. A CAE tool as described will greatly affect the tool & die industry, since the process enables the bypass of hardware try-out and shortens overall vehicle production timing. The trend has shown that OEMs and first tiers will source to low-cost producers worldwide, which will negatively affect traditional tool & die makers in developed countries. A CAE-based tool as described should be adopted, along with many other solutions, to maintain efficiency in producing high-quality products and to meet time-to-market requirements. This paper describes how a CAE-based die face engineering (DFE) tool could be further developed to enable traditional tool & die makers to meet the challenge ahead.
NASA Technical Reports Server (NTRS)
Kontos, Karen B.; Kraft, Robert E.; Gliebe, Philip R.
1996-01-01
The Aircraft Noise Prediction Program (ANOPP) is an industry-wide tool used to predict turbofan engine flyover noise in system noise optimization studies. Its goal is to provide the best currently available methods for source noise prediction. As part of a program to improve the Heidmann fan noise model, models for fan inlet and fan exhaust noise suppression estimation that are based on simple engine and acoustic geometry inputs have been developed. The models can be used to predict sound power level suppression and sound pressure level suppression at a position specified relative to the engine inlet.
Navy Enhanced Sierra Mechanics (NESM): Toolbox for predicting Navy shock and damage
Moyer, Thomas; Stergiou, Jonathan; Reese, Garth; ...
2016-05-25
Here, the US Navy is developing a new suite of computational mechanics tools (Navy Enhanced Sierra Mechanics) for the prediction of ship response, damage, and shock environments transmitted to vital systems during threat weapon encounters. NESM includes fully coupled Euler-Lagrange solvers tailored to ship shock/damage predictions. NESM is optimized to support high-performance computing architectures, providing the physics-based ship response/threat weapon damage predictions needed to support the design and assessment of highly survivable ships. NESM is being employed to support current Navy ship design and acquisition programs while being further developed for future Navy fleet needs.
The Software Management Environment (SME)
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Decker, William; Buell, John
1988-01-01
The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.
Tools4miRs – one place to gather all the tools for miRNA analysis
Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr
2016-01-01
Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626
Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool
NASA Astrophysics Data System (ADS)
Guo, Qianjian; Fan, Shuo; Xu, Rufeng; Cheng, Xiang; Zhao, Guoyong; Yang, Jianguo
2017-05-01
To address the low machining accuracy and uncontrolled thermal errors of NC machine tools, spindle thermal error measurement, modeling, and compensation are investigated for a two-turntable five-axis machine tool. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced into the selection of temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; a new ABC-NN (artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of the ABC-NN model, an experiment system is developed, and the prediction results of LSR (least squares regression), ANN, and ABC-NN are compared with the measurement results of spindle thermal errors. Experiment results show that the prediction accuracy of the ABC-NN model is higher than that of LSR and ANN, and the residual error is smaller than 3 μm; the new modeling method is feasible. The proposed research provides instruction to compensate thermal errors and improve the machining accuracy of NC machine tools.
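The ABC-NN idea — a small network whose weights are searched by a bee-colony metaheuristic rather than backpropagation — can be sketched as below. This is a much-simplified colony (employed-bee perturbation with greedy selection plus a scout re-seeding step, no onlooker phase), and the "thermal error" data are synthetic, so it illustrates the concept rather than the authors' algorithm.

```python
# Toy ABC-trained one-hidden-layer network on synthetic thermal-error data.
import numpy as np

rng = np.random.default_rng(3)
temps = rng.uniform(20, 60, (200, 3))               # 3 temperature sensors (toy)
errors = 0.05 * temps[:, 0] - 0.02 * temps[:, 1] + rng.normal(0, 0.05, 200)
X = (temps - temps.mean(axis=0)) / temps.std(axis=0)  # normalized inputs

H = 5                                               # hidden neurons
n_w = 3 * H + H                                     # input->hidden plus hidden->out weights

def predict(w, X):
    W1 = w[:3 * H].reshape(3, H)
    return np.tanh(X @ W1) @ w[3 * H:]

def cost(w):
    return np.mean((predict(w, X) - errors) ** 2)

food = rng.normal(size=(20, n_w))                   # food sources = candidate weight sets
for _ in range(300):
    for i in range(len(food)):                      # employed-bee phase:
        trial = food[i] + rng.normal(0.0, 0.1, n_w)     # perturb a source locally
        if cost(trial) < cost(food[i]):             # greedy selection
            food[i] = trial
    worst = max(range(len(food)), key=lambda i: cost(food[i]))
    food[worst] = rng.normal(size=n_w)              # scout bee re-seeds the worst source

best = min(food, key=cost)
print(f"training MSE of best candidate network: {cost(best):.4f}")
```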
The DSM-5 Self-Rated Level 1 Cross-Cutting Symptom Measure as a Screening Tool.
Bastiaens, Leo; Galus, James
2018-03-01
The DSM-5 Self-Rated Level 1 Cross-Cutting Symptom Measure was developed to aid clinicians with a dimensional assessment of psychopathology; however, this measure resembles a screening tool for several symptomatic domains. The objective of the current study was to examine the basic screening parameters of the measure: sensitivity, specificity, and positive and negative predictive power. One hundred and fifty patients in a correctional community center filled out the measure prior to a psychiatric evaluation, including the Mini International Neuropsychiatric Interview screen. The above parameters were calculated for the domains of depression, mania, anxiety, and psychosis. The results showed that the sensitivity and positive predictive power of the studied domains were poor because of a high rate of false positive answers on the measure. However, when the lowest threshold on the Cross-Cutting Symptom Measure was used, the sensitivity of the anxiety and psychosis domains and the negative predictive values for mania, anxiety, and psychosis were good. In conclusion, while it is foreseeable that some clinicians may use the DSM-5 Self-Rated Level 1 Cross-Cutting Symptom Measure as a screening tool, it should not be relied on to identify positive findings. It functioned well in the negative prediction of mania, anxiety, and psychosis symptoms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite moderated gas cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and the depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that the methods within the CRPE tool that predicted the critical position consistently well also agreed with one another in their predictions of power densities and of uranium and plutonium isotopics. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
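The polynomial-fit flavour of such a search is straightforward to sketch: evaluate k_eff at a few rod insertions, fit a quadratic, and solve for the insertion giving k_eff = 1. The `run_mcnp_keff` function below is a hypothetical placeholder standing in for an actual MCNP criticality calculation, and the coefficients inside it are invented.

```python
# Sketch: critical rod position from a second-order fit of k_eff vs. insertion.
import numpy as np

def run_mcnp_keff(position_cm):
    """Placeholder for an MCNP k_eff calculation at a given rod insertion."""
    return 1.05 - 7e-4 * position_cm - 1e-6 * position_cm**2

positions = np.array([0.0, 40.0, 80.0])              # sampled rod insertions (cm)
keffs = np.array([run_mcnp_keff(z) for z in positions])

a, b, c = np.polyfit(positions, keffs, 2)            # k_eff(z) = a z^2 + b z + c
roots = np.roots([a, b, c - 1.0])                    # solve k_eff(z) = 1
critical = min(r.real for r in roots
               if abs(r.imag) < 1e-9 and 0.0 <= r.real <= 100.0)
print(f"estimated critical insertion: {critical:.1f} cm")
```

In practice each k_eff carries Monte Carlo uncertainty, so more sample points and a weighted fit would be used; the sketch keeps the data exact for clarity.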
Distributed collaborative decision support environments for predictive awareness
NASA Astrophysics Data System (ADS)
McQuay, William K.; Stilman, Boris; Yakhnis, Vlad
2005-05-01
The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, rapidly assess the enemy's course of action (eCOA) or possible actions, and promulgate their own course of action (COA) - a need for predictive awareness. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulations, and analysis tools. Revolutionary new approaches to strategy generation and assessment such as Linguistic Geometry (LG) permit the rapid development of COA versus enemy COA. LG tools automatically generate winning strategies and tactics and permit operators to take advantage of them for mission planning and execution in near real time. LG is predictive, employs deep "look-ahead" from the current state, and provides a realistic, reactive model of adversary reasoning and behavior. Collaborative environments provide the framework and integrate models, simulations, and domain specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing research efforts in applying distributed collaborative environments to decision support for predictive mission awareness.
Accessible tools to quantify adverse outcomes pathways (AOPs) that can predict the ecological effects of chemicals and other stressors are a major goal of Chemical Safety and Sustainability research within US EPA’s Office of Research and Development. To address this goal, w...
Comparison of measured and simulated friction velocity and threshold friction velocity using SWEEP
USDA-ARS?s Scientific Manuscript database
The Wind Erosion Prediction System (WEPS) was developed by the USDA Agricultural Research Service as a tool to predict wind erosion and assess the influence of control practices on windblown soil loss. Occasional failure of the WEPS erosion submodel (SWEEP) to simulate erosion in the Columbia Platea...
Development of Research-Based Protocol Aligned to Predict High Levels of Teaching Quality
ERIC Educational Resources Information Center
Schumacher, Gary; Grigsby, Bettye; Vesey, Winona
2011-01-01
This study proposes a research-based teacher selection protocol. The protocol is intended to offer school district hiring authorities a tool to identify teacher candidates with the behaviors expected to predict effective teaching. It is hypothesized that a particular series of research-based interview questions focusing on teaching behaviors in…
A Java-based web service is being developed within the US EPA’s Chemistry Dashboard to provide real time estimates of toxicity values and physical properties. WebTEST can generate toxicity predictions directly from a simple URL which includes the endpoint, QSAR method, and ...
ERIC Educational Resources Information Center
Brill, Robert T.; Gilfoil, David M.; Doll, Kristen
2014-01-01
Academic researchers have often touted the growing importance of "soft skills" for modern day business leaders, especially leadership and communication skills. Despite this growing interest and attention, relatively little work has been done to develop and validate tools to assess soft skills. Forty graduate students from nine MBA…
ERIC Educational Resources Information Center
Moffitt, Kevin Christopher
2011-01-01
The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…
NASA Technical Reports Server (NTRS)
Moes, Timothy R.
2009-01-01
The principal objective of the Supersonics Project is to develop and validate multidisciplinary physics-based predictive design, analysis and optimization capabilities for supersonic vehicles. For aircraft, the focus will be on eliminating the efficiency, environmental and performance barriers to practical supersonic flight. Previous flight projects found that a shaped sonic boom could propagate all the way to the ground (F-5 SSBD experiment) and validated design tools for forebody shape modifications (F-5 SSBD and Quiet Spike experiments). The current project, Lift and Nozzle Change Effects on Tail Shock (LaNCETS) seeks to obtain flight data to develop and validate design tools for low-boom tail shock modifications. Attempts will be made to alter the shock structure of NASA's NF-15B TN/837 by changing the lift distribution by biasing the canard positions, changing the plume shape by under- and over-expanding the nozzles, and changing the plume shape using thrust vectoring. Additional efforts will measure resulting shocks with a probing aircraft (F-15B TN/836) and use the results to validate and update predictive tools. Preliminary flight results are presented and are available to provide truth data for developing and validating the CFD tools required to design low-boom supersonic aircraft.
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj; Gulhan, Ali; Aftosmis, Michael; Brock, Joseph; Mathias, Donovan; Need, Dominic; Rodriguez, David; Seltner, Patrick; Stern, Eric; Wiles, Sebastian
2017-01-01
An airburst from a large asteroid during entry can cause significant ground damage. The damage depends on the energy and the altitude of airburst. Breakup of asteroids into fragments and their lateral spread have been observed. Modeling the underlying physics of fragmented bodies interacting at hypersonic speeds and the spread of fragments is needed for a true predictive capability. Current models use heuristic arguments and assumptions such as pancaking or point source explosive energy release at pre-determined altitude or an assumed fragmentation spread rate to predict airburst damage. A multi-year collaboration between German Aerospace Center (DLR) and NASA has been established to develop validated computational tools to address the above challenge.
Jieyi Li; Arandjelovic, Ognjen
2017-07-01
Computer science and machine learning in particular are increasingly lauded for their potential to aid medical practice. However, the highly technical nature of the state of the art techniques can be a major obstacle in their usability by health care professionals and thus, their adoption and actual practical benefit. In this paper we describe a software tool which focuses on the visualization of predictions made by a recently developed method which leverages data in the form of large scale electronic records for making diagnostic predictions. Guided by risk predictions, our tool allows the user to explore interactively different diagnostic trajectories, or display cumulative long term prognostics, in an intuitive and easily interpretable manner.
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
Lesley Fusina; Sharon Zhong; Julide Koracin; Tim Brown; Annie Esperanza; Leland Tarney; Haiganoush Preisler
2007-01-01
The BlueSky Smoke Prediction System developed by the U.S. Department of Agriculture, Forest Service, AirFire Team under the National Fire Plan is a modeling framework that integrates tools, knowledge of fuels, moisture, combustion, emissions, plume dynamics, and weather to produce real-time predictions of the cumulative impacts of smoke from wildfires, prescribed fires...
Tools for outcome prediction in patients with community acquired pneumonia.
Khan, Faheem; Owens, Mark B; Restrepo, Marcos; Povoa, Pedro; Martin-Loeches, Ignacio
2017-02-01
Community-acquired pneumonia (CAP) is one of the most common causes of mortality world-wide. The mortality rate of patients with CAP is influenced by the severity of the disease, treatment failure and the requirement for hospitalization and/or intensive care unit (ICU) management, all of which may be predicted by biomarkers and clinical scoring systems. Areas covered: We review the recent literature examining the efficacy of established and newly-developed clinical scores, biological and inflammatory markers such as C-Reactive protein (CRP), procalcitonin (PCT) and Interleukin-6 (IL-6), whether used alone or in conjunction with clinical severity scores to assess the severity of CAP, predict treatment failure, guide acute in-hospital or ICU admission and predict mortality. Expert commentary: The early prediction of treatment failure using clinical scores and biomarkers plays a developing role in improving survival of patients with CAP by identifying high-risk patients requiring hospitalization or ICU admission; and may enable more efficient allocation of resources. However, it is likely that combinations of scoring systems and biomarkers will be of greater use than individual markers. Further larger studies are needed to corroborate the additive value of these markers to clinical prediction scores to provide a safer and more effective assessment tool for clinicians.
Computational modeling of human oral bioavailability: what will be next?
Cabrera-Pérez, Miguel Ángel; Pham-The, Hai
2018-06-01
The oral route is the most convenient way of administering drugs. Therefore, accurate determination of oral bioavailability is paramount during drug discovery and development. Quantitative structure-property relationship (QSPR), rule-of-thumb (RoT) and physiologically based pharmacokinetic (PBPK) approaches are promising alternatives for early oral bioavailability prediction. Areas covered: The authors give insight into the factors affecting bioavailability, the fundamental theoretical framework, and the practical aspects of computational methods for predicting this property. They also give their perspectives on future computational models for estimating oral bioavailability. Expert opinion: Oral bioavailability is a multi-factorial pharmacokinetic property whose accurate prediction is challenging. For RoT and QSPR modeling, the reliability of datasets, the significance of molecular descriptor families and the diversity of chemometric tools used are important factors that define model predictability and interpretability. Likewise, for PBPK modeling, the integrity of the pharmacokinetic data, the number of input parameters, the complexity of statistical analysis and the software packages used are relevant factors in bioavailability prediction. Although these approaches have been utilized independently, the tendency to use hybrid QSPR-PBPK approaches, together with the exploration of ensemble and deep-learning systems for QSPR modeling of oral bioavailability, has opened new avenues for developing promising tools for oral bioavailability prediction.
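As an example of the rule-of-thumb (RoT) family the review discusses, the sketch below counts Lipinski rule-of-five violations. Lipinski's thresholds are standard, but this is only one of many RoT variants, and the example property values are hypothetical.

```python
# Minimal rule-of-thumb (RoT) sketch using Lipinski's rule of five
# (standard thresholds; the input property values are hypothetical).
def lipinski_violations(mol_weight, logp, h_donors, h_acceptors):
    """Count rule-of-five violations; 2+ suggests poor oral absorption."""
    return sum([
        mol_weight > 500,    # molecular weight > 500 Da
        logp > 5,            # calculated logP > 5
        h_donors > 5,        # H-bond donors > 5
        h_acceptors > 10,    # H-bond acceptors > 10
    ])

print(lipinski_violations(mol_weight=480.6, logp=3.2,
                          h_donors=2, h_acceptors=7))  # -> 0
```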
A Comparison of Predictive Thermo and Water Solvation Property Prediction Tools and Experimental Data for Selected...
2017-04-01
Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A
2018-05-01
Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop the QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
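For reference, the cross-validated Q² reported above is computed from out-of-fold predictions. The sketch below shows the arithmetic on synthetic data; the random descriptors and toxicity values stand in for the paper's curated dataset and modeling pipeline, which are not reproduced here.

```python
# Minimal sketch of cross-validated Q^2 on synthetic stand-in data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                              # 300 chemicals x 20 descriptors
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=300)  # synthetic log10 toxicity values

y_hat = cross_val_predict(RandomForestRegressor(random_state=0), X, y, cv=5)
press = np.sum((y - y_hat) ** 2)      # predictive residual sum of squares
ss = np.sum((y - y.mean()) ** 2)      # total sum of squares
print(f"Q2 = {1 - press / ss:.2f}")   # cross-validated Q^2
```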
George, Steven Z; Beneciuk, Jason M; Lentz, Trevor A; Wu, Samuel S
2017-01-01
Purpose: There is an increased need for determining which patients with musculoskeletal pain benefit from additional diagnostic testing or psychologically informed intervention. The Optimal Screening for Prediction of Referral and Outcome (OSPRO) cohort studies were designed to develop and validate standard assessment tools for review of systems and yellow flags. This cohort profile paper provides a description of and future plans for the validation cohort. Participants: Patients (n=440) with a primary complaint of spine, shoulder or knee pain were recruited into the OSPRO validation cohort via a national Orthopaedic Physical Therapy-Investigative Network. Patients were followed up at 4 weeks, 6 months and 12 months for pain, functional status and quality of life outcomes. Healthcare utilisation outcomes were also collected at 6 and 12 months. Findings to date: There are no longitudinal findings reported to date from the ongoing OSPRO validation cohort. The previously completed cross-sectional OSPRO development cohort yielded two assessment tools that were investigated in the validation cohort. Future plans: Follow-up data collection was completed in January 2017. Primary analyses will investigate how accurately the OSPRO review of systems and yellow flag tools predict 12-month pain, functional status, quality of life and healthcare utilisation outcomes. Planned secondary analyses include prediction of pain interference and/or development of chronic pain, investigation of the effect of treatment expectation on patient outcomes, and analysis of patient satisfaction following an episode of physical therapy. Trial registration number: The OSPRO validation cohort was not registered. PMID:28600371
Fischer, John P; Nelson, Jonas A; Shang, Eric K; Wink, Jason D; Wingate, Nicholas A; Woo, Edward Y; Jackson, Benjamin M; Kovach, Stephen J; Kanchwala, Suhail
2014-12-01
Groin wound complications after open vascular surgery procedures are common, morbid, and costly. The purpose of this study was to generate a simple, validated, clinically usable risk assessment tool for predicting groin wound morbidity after infra-inguinal vascular surgery. A retrospective review of consecutive patients undergoing groin cutdowns for femoral access between 2005 and 2011 was performed. Patients necessitating salvage flaps were compared to those who did not, and a stepwise logistic regression was performed and validated using a bootstrap technique. Utilising this analysis, a simplified risk score was developed to predict the risk of developing a wound that would necessitate salvage. A total of 925 patients were included in the study. The salvage flap rate was 11.2% (n = 104). Predictors determined by logistic regression included prior groin surgery (OR = 4.0, p < 0.001), prosthetic graft (OR = 2.7, p < 0.001), coronary artery disease (OR = 1.8, p = 0.019), peripheral arterial disease (OR = 5.0, p < 0.001), and obesity (OR = 1.7, p = 0.039). Based upon the respective logistic coefficients, a simplified scoring system was developed to enable preoperative risk stratification regarding the likelihood of a significant complication that would require a salvage muscle flap. The c-statistic for the regression demonstrated excellent discrimination at 0.89. This study presents a simple, internally validated risk assessment tool that accurately predicts wound morbidity requiring flap salvage in open groin vascular surgery patients. The preoperatively high-risk patient can be identified and selectively targeted as a candidate for a prophylactic muscle flap.
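To illustrate the kind of simplification described above (logistic coefficients rounded into a point-based bedside score), here is a minimal sketch. The point values are derived from the published odds ratios as rounded log-odds and are illustrative only; they are not the authors' actual scoring system.

```python
# Minimal sketch: turning odds ratios into an illustrative integer score.
import math

ODDS_RATIOS = {
    "prior_groin_surgery": 4.0,
    "prosthetic_graft": 2.7,
    "coronary_artery_disease": 1.8,
    "peripheral_arterial_disease": 5.0,
    "obesity": 1.7,
}
# One point per ~0.5 units of log-odds, a common simplification.
POINTS = {k: round(math.log(v) / 0.5) for k, v in ODDS_RATIOS.items()}

def risk_score(patient_flags):
    """Sum points for each risk factor present in the patient."""
    return sum(POINTS[k] for k, present in patient_flags.items() if present)

print(POINTS)
print(risk_score({"prior_groin_surgery": True, "obesity": True,
                  "prosthetic_graft": False, "coronary_artery_disease": False,
                  "peripheral_arterial_disease": False}))  # -> 4
```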
NASA Technical Reports Server (NTRS)
Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.
1992-01-01
The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
NASA Astrophysics Data System (ADS)
Daniels, R. M.; Jacobs, J. M.; Paranjpye, R.; Lanerolle, L. W.
2016-02-01
The Pathogens group of the NOAA Ecological Forecasting Roadmap has begun a range of efforts to monitor and predict potential pathogen occurrences in shellfish and in U.S. coastal waters. NOAA/NCOSS, along with NMFS/NWFSC, has led the Pathogens group and the development of web-based tools and forecasts for both Vibrio vulnificus and Vibrio parahaemolyticus. A strong relationship with the FDA has allowed the team to develop forecasts that will serve U.S. shellfish harvesters and consumers. NOAA/NOS/CSDL has provided modeling expertise to help the group use the hydrodynamic models and their forecasts of physical variables that drive the ecological predictions. The NOAA/NWS/Ocean Prediction Center has enabled these ecological forecasting efforts by providing the infrastructure, computing knowledge and experience in an operational culture. Daily forecasts have been demonstrated and are available from the web for the Chesapeake Bay, Delaware Bay, Northern Gulf of Mexico, Tampa Bay, Puget Sound and Long Island Sound. The forecast systems run on a daily basis, fed by NOS model data from the NWS/NCEP supercomputers. New forecast tools, including V. parahaemolyticus post-harvest growth and doubling time at ambient air temperature, are also described.
Carrió, Pau; López, Oriol; Sanz, Ferran; Pastor, Manuel
2015-01-01
Computational models based on Quantitative Structure-Activity Relationship (QSAR) methodologies are widely used tools for predicting the biological properties of new compounds. In many instances, such models are used routinely in industry (e.g. the food, cosmetic or pharmaceutical industries) for the early assessment of the biological properties of new compounds. However, most of the tools currently available for developing QSAR models are not well suited for supporting the whole QSAR model life cycle in production environments. We have developed eTOXlab, an open source modeling framework designed to be used at the core of a self-contained virtual machine that can be easily deployed in production environments, providing predictions as web services. eTOXlab consists of a collection of object-oriented Python modules with methods mapping common tasks of standard modeling workflows. This framework allows building and validating QSAR models as well as predicting the properties of new compounds using either a command line interface or a graphical user interface (GUI). Simple models can be easily generated by setting a few parameters, while more complex models can be implemented by overriding pieces of the original source code. eTOXlab benefits from the object-oriented capabilities of Python to provide high flexibility: any model implemented using eTOXlab inherits the features implemented in the parent model, like common tools and services or the automatic exposure of the models as prediction web services. The particular eTOXlab architecture as a self-contained, portable prediction engine allows building models with confidential information within corporate facilities, which can be safely exported and used for prediction without disclosing the structures of the training series. The software presented here provides full support to the specific needs of users who want to develop, use and maintain predictive models in corporate environments. The technologies used by eTOXlab (web services, VM, object-oriented programming) provide an elegant solution to common practical issues; the system can be installed easily in heterogeneous environments and integrates well with other software. Moreover, the system provides a simple and safe solution for building models with confidential structures that can be shared without disclosing sensitive information.
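A minimal sketch of the inheritance pattern described, under stated assumptions: the class and method names below are hypothetical, not eTOXlab's actual API. The point is that a child model overrides only the estimator-specific pieces and inherits shared services from the parent.

```python
# Illustrative parent/child model classes (hypothetical names, not eTOXlab).
from sklearn.linear_model import Ridge

class BaseModel:
    """Parent model: children inherit common tools and services."""
    def build(self, X, y):
        raise NotImplementedError      # estimator-specific; see child classes

    def validate(self, X, y):
        ...                            # placeholder: shared validation logic lives here

    def predict(self, X):
        raise NotImplementedError

class LogPModel(BaseModel):
    """Child model: overrides only the pieces that differ from the parent."""
    def build(self, X, y):
        self._est = Ridge().fit(X, y)  # a simple model needs few parameters
        return self

    def predict(self, X):
        return self._est.predict(X)
```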
Development of the Surface Management System Integrated with CTAS Arrival Tools
NASA Technical Reports Server (NTRS)
Jung, Yoon C.; Jara, Dave
2005-01-01
The Surface Management System (SMS), developed by NASA Ames Research Center in coordination with the Federal Aviation Administration (FAA), is a decision support tool to help tower traffic coordinators and Ground/Local controllers manage and control airport surface traffic in order to increase capacity, efficiency, and flexibility. SMS provides common situation awareness to personnel at various air traffic control facilities such as airport traffic control towers (ATCTs), airline ramp towers, Terminal Radar Approach Control (TRACON), and Air Route Traffic Control Center (ARTCC). SMS also provides a traffic management tool to assist ATCT traffic management coordinators (TMCs) in making decisions such as airport configuration and runway load balancing. Build 1 of the SMS tool was installed and successfully tested at Memphis International Airport (MEM) and received high acceptance scores from ATCT controllers and coordinators, as well as airline ramp controllers. NASA Ames Research Center continues to develop SMS under NASA's Strategic Airspace Usage (SAU) project in order to improve its prediction accuracy and robustness under various modeling uncertainties. This paper reports the recent development effort performed by NASA Ames Research Center: 1) integration of Center TRACON Automation System (CTAS) capability with SMS and 2) an alternative approach to obtaining airline gate information through a publicly available website. Preliminary analysis of air/surface traffic data at the DFW airport has shown significant improvement in predicting airport arrival demand and IN time at the gate. This paper concludes with recommendations for future research and development.
modPDZpep: a web resource for structure based analysis of human PDZ-mediated interaction networks.
Sain, Neetu; Mohanty, Debasisa
2016-09-21
PDZ domains recognize short sequence stretches usually present at the C-terminus of their interaction partners. Because of the involvement of PDZ domains in many important biological processes, several attempts have been made to develop bioinformatics tools for genome-wide identification of PDZ interaction networks. Currently available tools for predicting the interaction partners of PDZ domains use a machine learning approach. Since they have been trained using experimental substrate specificity data for specific PDZ families, their applicability is limited to PDZ families closely related to the training set. These tools also do not allow analysis of PDZ-peptide interaction interfaces. We have used a structure based approach to develop modPDZpep, a program to predict the interaction partners of human PDZ domains and analyze structural details of PDZ interaction interfaces. modPDZpep predicts interaction partners by using structural models of PDZ-peptide complexes and evaluating binding energy scores using residue-based statistical pair potentials. Since it does not require training using experimental data on peptide binding affinity, it can predict substrates for diverse PDZ families. Because it uses a simple scoring function for binding energy, it is also fast enough for genome-scale structure based analysis of PDZ interaction networks. Benchmarking using artificial as well as real negative datasets indicates good predictive power, with ROC-AUC values in the range of 0.7 to 0.9 for a large number of human PDZ domains. Another novel feature of modPDZpep is its ability to map novel PDZ-mediated interactions in human protein-protein interaction networks, either by utilizing available experimental phage display data or by structure based predictions. In summary, we have developed modPDZpep, a web server for structure based analysis of human PDZ domains. It is freely available at http://www.nii.ac.in/modPDZpep.html or http://202.54.226.235/modPDZpep.html . This article was reviewed by Michael Gromiha and Zoltán Gáspári.
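A minimal sketch of residue-based pair-potential scoring of the kind described: the potential table below is a toy placeholder (real statistical potentials are derived from large sets of known structures), and the contact list is assumed to come from an upstream modeled PDZ-peptide complex.

```python
# Toy residue-pair statistical potential (illustrative values only).
PAIR_POTENTIAL = {            # keys stored as sorted residue-name pairs
    ("ASP", "LYS"): -1.2,     # favorable salt bridge
    ("ARG", "GLU"): -1.1,
    ("LEU", "VAL"): -0.8,     # favorable hydrophobic contact
    ("ASP", "GLU"): +0.9,     # unfavorable like-charge contact
}

def binding_energy(interface_contacts):
    """Sum pair potentials over PDZ-peptide interface residue contacts."""
    return sum(PAIR_POTENTIAL.get(tuple(sorted(pair)), 0.0)
               for pair in interface_contacts)

# Contacts extracted from a hypothetical modeled complex:
print(binding_energy([("LYS", "ASP"), ("LEU", "VAL"), ("SER", "THR")]))  # -> -2.0
```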
NASA Astrophysics Data System (ADS)
Kurihara, Osamu; Kim, Eunjoo; Kunishima, Naoaki; Tani, Kotaro; Ishikawa, Tetsuo; Furuyama, Kazuo; Hashimoto, Shozo; Akashi, Makoto
2017-09-01
A tool was developed to facilitate the calculation of early internal doses to residents involved in the Fukushima Nuclear Disaster, based on atmospheric transport and dispersion model (ATDM) simulations performed using the Worldwide version of the System for Prediction of Environmental Emergency Information, 2nd version (WSPEEDI-II), together with personal behavior data containing the history of individuals' whereabouts after the accident. The tool generates hourly-averaged air concentration data for the simulation grids nearest to an individual's whereabouts using WSPEEDI-II datasets, for the subsequent calculation of internal doses due to inhalation. This paper presents an overview of the developed tool and provides tentative comparisons between direct measurement-based and ATDM-based results for the internal doses received by 421 persons for whom personal behavior data were available.
Wind Prediction Accuracy for Air Traffic Management Decision Support Tools
NASA Technical Reports Server (NTRS)
Cole, Rod; Green, Steve; Jardin, Matt; Schwartz, Barry; Benjamin, Stan
2000-01-01
The performance of Air Traffic Management and flight deck decision support tools depends in large part on the accuracy of the supporting 4D trajectory predictions. This is particularly relevant to conflict prediction and active advisories for the resolution of conflicts and conformance with traffic-flow management flow-rate constraints (e.g., arrival metering / required time of arrival). Flight test results have indicated that wind prediction errors may represent the largest source of trajectory prediction error. The tests also discovered that relatively large errors (e.g., greater than 20 knots), existing in pockets of space and time critical to ATM DST performance (one or more sectors, greater than 20 minutes), are inadequately represented by the classic RMS aggregate prediction-accuracy studies of the past. To facilitate the identification and reduction of DST-critical wind-prediction errors, NASA has led a collaborative research and development activity with MIT Lincoln Laboratories and the Forecast Systems Lab of the National Oceanic and Atmospheric Administration (NOAA). This activity, begun in 1996, has focused on the development of key metrics for ATM DST performance, assessment of wind-prediction skill for state-of-the-art systems, and development/validation of system enhancements to improve skill. A 13-month study was conducted for the Denver Center airspace in 1997. Two complementary wind-prediction systems were analyzed and compared to the forecast performance of the then-standard 60 km Rapid Update Cycle - version 1 (RUC-1). One system, developed by NOAA, was the prototype 40-km RUC-2 that became operational at NCEP in 1999. RUC-2 introduced a faster cycle (1 hr vs. 3 hr) and improved mesoscale physics. The second system, Augmented Winds (AW), is a prototype en route wind application developed by MITLL based on the Integrated Terminal Wind System (ITWS). AW is run at a local facility (Center) level, and updates RUC predictions based on an optimal interpolation of the latest ACARS reports since the RUC run. This paper presents an overview of the study's results, including the identification and use of new large-error wind-prediction accuracy metrics that are key to ATM DST performance.
Schmidt-Hansen, Mia; Berendse, Sabine; Hamilton, Willie; Baldwin, David R
2017-01-01
Background: Lung cancer is the leading cause of cancer deaths. Around 70% of patients first presenting to specialist care have advanced disease, at which point current treatments have little effect on survival. The issue for primary care is how to recognise patients earlier and investigate appropriately. This requires an assessment of the risk of lung cancer. Aim: The aim of this study was to systematically review the existing risk prediction tools for patients presenting in primary care with symptoms that may indicate lung cancer. Design and setting: Systematic review of primary care data. Method: Medline, PreMedline, Embase, the Cochrane Library, Web of Science, and ISI Proceedings (1980 to March 2016) were searched. The final list of included studies was agreed between two of the authors, who also appraised and summarised them. Results: Seven studies with between 1482 and 2 406 127 patients were included. The tools were all based on UK primary care data, but differed in complexity of development, number/type of variables examined/included, and outcome time frame. There were four multivariable tools with internal validation areas under the curve between 0.88 and 0.92. The tools all had a number of limitations, and none have been externally validated or had their clinical and cost impact examined. Conclusion: There is insufficient evidence for the recommendation of any one of the available risk prediction tools. However, some multivariable tools showed promising discrimination. What is needed to guide clinical practice is both external validation of the existing tools and a comparative study, so that the best tools can be incorporated into clinical decision tools used in primary care. PMID:28483820
Huysentruyt, Koen; Devreker, Thierry; Dejonckheere, Joachim; De Schepper, Jean; Vandenplas, Yvan; Cools, Filip
2015-08-01
The aim of the present study was to evaluate the predictive accuracy of screening tools for assessing nutritional risk in hospitalized children in developed countries. The study involved a systematic review of the literature (MEDLINE, EMBASE, and Cochrane Central databases up to January 17, 2014) on the diagnostic performance of pediatric nutritional screening tools. Methodological quality was assessed using a modified QUADAS tool. Sensitivity and specificity were calculated for each screening tool per validation method. A meta-analysis was performed to estimate the risk ratio of being truly at nutritional risk between different screening result categories. A total of 11 studies were included on ≥1 of the following screening tools: the Pediatric Nutritional Risk Score, Screening Tool for the Assessment of Malnutrition in Paediatrics, Paediatric Yorkhill Malnutrition Score, and Screening Tool for Risk on Nutritional Status and Growth. Because of variation in reference standards, a direct comparison of the predictive accuracy of the screening tools was not possible. A meta-analysis was performed on 1629 children from 7 different studies. The risk ratio of being truly at nutritional risk was 0.349 (95% confidence interval [CI] 0.16-0.78) for children in the low versus moderate screening category and 0.292 (95% CI 0.19-0.44) in the moderate versus high screening category. There is insufficient evidence to choose 1 nutritional screening tool over another based on their predictive accuracy. The estimated risk of being at "true nutritional risk" increases with each category of screening test result. Each screening category should be linked to a specific course of action, although further research is needed.
NASA Astrophysics Data System (ADS)
Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.
We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximation (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using them to predict the sensor activities for test analytes not considered in the training set for model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict the response of an existing sensing film to new target analytes.
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions, the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed by all software development organizations.
ENFIN--A European network for integrative systems biology.
Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan
2009-11-01
Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, enabling both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis, FuncNet.
Ontology-based tools to expedite predictive model construction.
Haug, Peter; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Ferraro, Jeffrey
2014-01-01
Large amounts of medical data are collected electronically during the course of caring for patients using modern medical information systems. These data present an opportunity to develop clinically useful tools through data mining and observational research studies. However, the work necessary to make sense of the data and to integrate them into a research initiative can require substantial effort from medical experts as well as from experts in medical terminology, data extraction, and data analysis. This slows the process of medical research. To reduce the effort required to construct computable diagnostic predictive models, we have developed a system that hybridizes a medical ontology with a large clinical data warehouse. Here we describe components of this system designed to automate the development of preliminary diagnostic models and to provide visual clues that can assist the researcher in planning further analysis of the data behind these models.
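As a sketch of how an ontology can drive model construction, the fragment below expands a concept to the leaf codes that would select a cohort from a data warehouse. The ontology fragment and codes are hypothetical; the paper's actual ontology content and warehouse schema are not described in this abstract.

```python
# Hypothetical ontology fragment: concept -> child concepts or leaf codes.
ONTOLOGY = {
    "Pneumonia": ["Bacterial pneumonia", "Viral pneumonia"],
    "Bacterial pneumonia": ["ICD9:481", "ICD9:482"],
    "Viral pneumonia": ["ICD9:480"],
}

def expand(concept):
    """Recursively expand an ontology concept to its leaf codes."""
    children = ONTOLOGY.get(concept)
    if children is None:
        return [concept]                  # a leaf code
    return [code for c in children for code in expand(c)]

# The expanded code set then drives cohort extraction from the warehouse.
print(expand("Pneumonia"))  # -> ['ICD9:481', 'ICD9:482', 'ICD9:480']
```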
Refinement of detecting atrial fibrillation in stroke patients: results from the TRACK-AF Study.
Reinke, F; Bettin, M; Ross, L S; Kochhäuser, S; Kleffner, I; Ritter, M; Minnerup, J; Dechering, D; Eckardt, L; Dittrich, R
2018-04-01
Detection of occult atrial fibrillation (AF) is crucial for optimal secondary prevention in stroke patients. The AF detection rate was determined by implantable cardiac monitor (ICM) and compared to the predicted probability of incident AF from software-based analysis of a continuously monitored electrocardiogram at follow-up (stroke risk analysis, SRA); an optimized AF detection algorithm is proposed by combining both tools. In a monocentric prospective study, 105 out of 389 patients with cryptogenic stroke despite extensive diagnostic workup were investigated with two additional cardiac monitoring tools: (a) 20 months' monitoring by ICM and (b) SRA during hospitalization at the stroke unit. The detection rate of occult AF was 18% by ICM (n = 19) (range 6-575 days), and 62% (n = 65) had an increased risk for AF predicted by SRA. When comparing the predictive accuracy of SRA to ICM, the sensitivity was 95%, specificity 35%, positive predictive value 27% and negative predictive value 96%. In 18 patients with AF detected by ICM, SRA also showed a medium risk for AF. Only one patient with a very low risk predicted by SRA developed AF, revealed by ICM after 417 days. A combination of SRA and ICM is a promising strategy to detect occult AF. SRA is reliable in predicting incident AF with a high negative predictive value. Thus, SRA may serve as a cost-effective pre-selection tool identifying patients at risk for AF who may benefit from further cardiac monitoring by ICM. © 2017 EAN.
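For reference, the predictive values quoted above follow from a standard 2x2 table. The sketch below shows the arithmetic with hypothetical counts; the study's full table is not reported in this abstract, so these numbers only approximate the quoted figures.

```python
# Diagnostic accuracy from a 2x2 table (counts below are hypothetical).
def diagnostic_accuracy(tp, fp, fn, tn):
    """Index test (e.g., SRA) evaluated against a reference (e.g., ICM)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

print(diagnostic_accuracy(tp=18, fp=47, fn=1, tn=39))
```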
U.S. Geological Survey science for the Wyoming Landscape Conservation Initiative—2014 annual report
Bowen, Zachary H.; Aldridge, Cameron L.; Anderson, Patrick J.; Assal, Timothy J.; Bartos, Timothy T.; Biewick, Laura R; Boughton, Gregory K.; Chalfoun, Anna D.; Chong, Geneva W.; Dematatis, Marie K.; Eddy-Miller, Cheryl A.; Garman, Steven L.; Germaine, Stephen S.; Homer, Collin G.; Huber, Christopher; Kauffman, Matthew J.; Latysh, Natalie; Manier, Daniel; Melcher, Cynthia P.; Miller, Alexander; Miller, Kirk A.; Olexa, Edward M.; Schell, Spencer; Walters, Annika W.; Wilson, Anna B.; Wyckoff, Teal B.
2015-01-01
Finally, capabilities of the WLCI Web site and the USGS ScienceBase infrastructure were maintained and upgraded to help ensure access to and efficient use of all the WLCI data, products, assessment tools, and outreach materials that have been developed. Of particular note is the completion of three Web applications developed for mapping (1) the 1900−2008 progression of oil and gas development; (2) the predicted distributions of Wyoming’s Species of Greatest Conservation Need; and (3) the locations of coal and wind energy production, sage-grouse distribution and core management areas, and alternative routes for transmission lines within the WLCI region. Collectively, these applications provide WLCI planners and managers with powerful tools for better understanding the distributions of wildlife species and potential alternatives for energy development.
Initial Integration of Noise Prediction Tools for Acoustic Scattering Effects
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Burley, Casey L.; Tinetti, Ana; Rawls, John W.
2008-01-01
This effort provides an initial glimpse at NASA capabilities available for predicting the scattering of fan noise from a non-conventional aircraft configuration. The Aircraft NOise Prediction Program, Fast Scattering Code, and the Rotorcraft Noise Model were coupled to provide increased-fidelity models of scattering effects on engine fan noise sources. The integration of these codes led to the identification of several key issues entailed in applying such multi-fidelity approaches. In particular, for prediction at noise certification points, the inclusion of distributed sources leads to complications with the source semi-sphere approach. Computational resource requirements limit the use of the higher fidelity scattering code to predicting radiated sound pressure levels for full scale configurations at relevant frequencies. Finally, the ability to more accurately represent complex shielding surfaces in current lower fidelity models is necessary for general application to scattering predictions. This initial step in determining the potential benefits/costs of these new methods over the existing capabilities illustrates a number of the issues that must be addressed in the development of next generation aircraft system noise prediction tools.
Nelson, D B; Bellamy, S; Odibo, A; Nachamkin, I; Ness, R B; Allen-Taylor, L
2007-11-01
Vaginal complaints compel an evaluation for bacterial vaginosis (BV); however, many cases of BV are asymptomatic. We evaluated the sensitivity and specificity of vaginal symptoms in the diagnosis of BV and examined the utility of creating a BV screening tool using clinical, behavioural and demographic characteristics. A total of 1916 pregnant women were included in this analysis. Of these, 757 women screened positive for BV, and over one third of BV-positive women presented without any lower genital tract symptoms (39.4%). African American race, abnormal vaginal odour, and smoking were independently related to BV positivity. A BV screening tool including these three factors was fairly predictive of BV status, with an area under the ROC curve of 0.669. This three-item prediction rule may be useful in identifying high-risk pregnant women in need of BV screening and, given its high specificity, in accurately identifying the group of BV-negative pregnant women.
ENFIN a network to enhance integrative systems biology.
Kahlem, Pascal; Birney, Ewan
2007-12-01
Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, enabling both the generation of new bioinformatics tools and the experimental validation of computational predictions. We give an overview of the projects tackled within ENFIN and discuss the challenges associated with integration for systems biology.
Nawrocki, Eric P.; Burge, Sarah W.
2013-01-01
The development of RNA bioinformatic tools began more than 30 years ago with the description of the Nussinov and Zuker dynamic programming algorithms for single-sequence RNA secondary structure prediction. Since then, many tools have been developed for various RNA sequence analysis problems such as homology search, multiple sequence alignment, de novo RNA discovery, read-mapping, and many more. In this issue, we have collected a sampling of reviews and original research that demonstrate some of the many ways bioinformatics is integrated with current RNA biology research. PMID:23948768
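As a concrete reminder of where the field started, here is a compact sketch of the Nussinov base-pair-maximization recurrence mentioned above (traceback omitted, and the minimum loop length handled in the simplest way).

```python
# Compact Nussinov dynamic program: maximize nested base pairs.
def nussinov_pairs(seq, min_loop=3):
    """Return the maximum number of nested base pairs for an RNA sequence."""
    valid = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
             ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):               # widen the subsequence
        for i in range(n - span):
            j = i + span
            best = max(dp[i + 1][j], dp[i][j - 1])    # i or j left unpaired
            if (seq[i], seq[j]) in valid:
                best = max(best, dp[i + 1][j - 1] + 1)  # i pairs with j
            for k in range(i + 1, j):                 # bifurcation
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov_pairs("GGGAAAUCC"))  # -> 3
```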
DengueTools: innovative tools and strategies for the surveillance and control of dengue.
Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane
2012-01-01
Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.
NASA Technical Reports Server (NTRS)
Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.
1992-01-01
A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for the development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
GPS-MBA: Computational Analysis of MHC Class II Epitopes in Type 1 Diabetes
Ren, Jian; Ma, Chuang; Gao, Tianshun; Zhou, Yanhong; Yang, Qing; Xue, Yu
2012-01-01
As a severe chronic metabolic disease and autoimmune disorder, type 1 diabetes (T1D) affects millions of people worldwide. Recent advances in antigen-based immunotherapy have provided a great opportunity for further treating T1D with a high degree of selectivity. It is reported that MHC class II I-Ag7 in the non-obese diabetic (NOD) mouse and human HLA-DQ8 are strongly linked to susceptibility to T1D. Thus, the identification of new I-Ag7 and HLA-DQ8 epitopes would be of great help to further experimental and biomedical manipulation efforts. In this study, a novel GPS-MBA (MHC Binding Analyzer) software package was developed for the prediction of I-Ag7 and HLA-DQ8 epitopes. Using experimentally identified epitopes as the training data sets, a previously developed GPS (Group-based Prediction System) algorithm was adopted and improved. By extensive evaluation and comparison, GPS-MBA performance was found to be much better than that of other tools of this type. With this powerful tool, we predicted a number of potentially new I-Ag7 and HLA-DQ8 epitopes. Furthermore, we designed a T1D epitope database (TEDB) for all of the experimentally identified and predicted T1D-associated epitopes. Taken together, these computational prediction results and analyses provide a starting point for further experimental considerations, and GPS-MBA is demonstrated to be a useful tool for generating starting information for experimentalists. GPS-MBA is freely accessible for academic researchers at http://mba.biocuckoo.org. PMID:22479466
An Integrated Children Disease Prediction Tool within a Special Social Network.
Apostolova Trpkovska, Marika; Yildirim Yayilgan, Sule; Besimi, Adrian
2016-01-01
This paper proposes a social network with an integrated children's disease prediction system developed using the specially designed Children General Disease Ontology (CGDO). This ontology consists of childhood diseases and their relationships with symptoms, together with Semantic Web Rule Language (SWRL) rules specially designed for predicting diseases. The prediction process starts when the user enters data about the presenting signs and symptoms, which are then mapped to the CGDO ontology. Once the data are mapped, the prediction results are presented. The prediction phase executes the rules, which extract the predicted disease details based on the SWRL rules specified. The motivation behind the development of this system is to spread knowledge about children's diseases and their symptoms in a very simple way using the specialized social networking website www.emama.mk.
NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.
2015-01-01
Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low-noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program 2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of the perception of noise on a community. ANOPP2's capability to incorporate medium-fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and a Hybrid Wing Body (HWB) aircraft, using medium-fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel, is presented. The results are in the form of community noise metrics and auralizations.
NASA Astrophysics Data System (ADS)
Murrill, Steven R.; Jacobs, Eddie L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.
2015-10-01
The U.S. Army Research Laboratory (ARL) has continued to develop and enhance a millimeter-wave (MMW) and submillimeter-wave (SMMW)/terahertz (THz)-band imaging system performance prediction and analysis tool for both the detection and identification of concealed weaponry, and for pilotage obstacle avoidance. The details of the MATLAB-based model, which accounts for the effects of all critical sensor and display components, for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium (Brugge). An advanced version of the base model, which accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). Further development of this tool, which includes a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures, was reported on at the 2011 SPIE Europe Security and Defence Symposium (Prague). This paper provides a comprehensive review of a newly enhanced MMW and SMMW/THz imaging system analysis and design tool that now includes an improved noise sub-model for more accurate and reliable performance predictions, the capability to account for post-capture image contrast enhancement, and the capability to account for concealment material backscatter with active-illumination-based systems. Present plans for additional expansion of the model's predictive capabilities are also outlined.
Development of Multi-slice Analytical Tool to Support BIM-based Design Process
NASA Astrophysics Data System (ADS)
Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.
2017-03-01
This paper describes the ongoing development of a computational tool to analyse architectural and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architectural and interior space is experienced as a dynamic entity with spatial properties that may vary from one part of a space to another, so the representation of space through standard architectural drawings is sometimes not sufficient. Representing space as a series of slices, each with certain properties, therefore becomes important, so that the different characteristics in each part of a space can inform the design process. The analytical tool is developed for use as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful for assisting design development processes that apply BIM, particularly for the design of architectural and interior spaces that are experienced as continuous spaces. The tool allows the identification of how spatial properties change dynamically throughout a space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects in generating better designs and avoiding unnecessary costs that are often caused by failure to identify problems during design development stages.
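A minimal sketch of the multi-slice idea under stated assumptions: a single spatial property (clear width) sampled at regular slices along a corridor. The geometry is hypothetical, and real BIM data exchange is considerably more involved.

```python
# Sample a spatial property at regular slices along a space (toy geometry).
def slice_profile(space_width_fn, length, n_slices):
    """Sample a property (here: clear width in metres) at regular slices."""
    step = length / n_slices
    return [space_width_fn(i * step) for i in range(n_slices + 1)]

# A corridor whose clear width narrows from 3.0 m to 1.2 m over 20 m:
widths = slice_profile(lambda x: 3.0 - 0.09 * x, length=20, n_slices=10)
narrow = [i for i, w in enumerate(widths) if w < 1.5]  # flag problem slices
print(widths, narrow)  # slices 9 and 10 fall below the 1.5 m threshold
```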
Frailty screening and assessment tools: a review of characteristics and use in Public Health.
Gilardi, F; Capanna, A; Ferraro, M; Scarcella, P; Marazzi, M C; Palombi, L; Liotta, G
2018-01-01
Frailty screening and assessment are a fundamental issue in Public Health for planning prevention programs and services. Through a narrative review of the literature employing the International Narrative Systematic Assessment tool, the authors aim to develop an updated framework for the main procedures and measurement tools used to assess frailty in older adults, paying particular attention to use in the primary care setting. The study selected 10 reviews published between January 2010 and December 2016 that define characteristics of the main tools used to measure frailty. Within the selected reviews, only one of the described tools met all the criteria necessary for a screening tool (multidimensionality, quick and easy administration, accurate risk prediction of negative outcomes, and high sensitivity and specificity). Accurate risk prediction of negative outcomes could be the appropriate and sufficient criterion for assessing a tool aimed at detecting frailty in the community-dwelling elderly population. A two-step process (a first short questionnaire to detect frailty and a second, longer questionnaire to define the care demand at the individual level) could represent an appropriate pathway for planning care services at the community level.
Kimmel, Lara A; Holland, Anne E; Simpson, Pam M; Edwards, Elton R; Gabbe, Belinda J
2014-07-01
Early, accurate prediction of discharge destination from the acute hospital assists individual patients and the wider hospital system. The Trauma Rehabilitation and Prediction Tool (TRaPT), developed using registry data, determines the probability of inpatient rehabilitation discharge for patients with isolated lower limb fractures. The aims of this study were: (1) to prospectively validate the TRaPT, (2) to assess whether its performance could be improved by adding additional demographic data, and (3) to simplify it for use as a bedside tool. This was a cohort, measurement-focused study. Patients with isolated lower limb fractures (N=114) who were admitted to a major trauma center in Melbourne, Australia, were included. The participants' TRaPT scores were calculated from admission data. Performance of the TRaPT score alone, and in combination with frailty, weight-bearing status, and home supports, was assessed using measures of discrimination and calibration. A simplified TRaPT was developed by rounding the coefficients of variables in the original model and grouping age into 8 categories. Simplified TRaPT performance measures, including specificity, sensitivity, and positive and negative predictive values, were evaluated. Prospective validation of the TRaPT showed excellent discrimination (C-statistic=0.90 [95% confidence interval=0.82, 0.97]), a sensitivity of 80%, and a specificity of 94%. All participants able to weight bear were discharged directly home. Simplified TRaPT scores had a sensitivity of 80% and a specificity of 88%. Generalizability may be limited given the compensation system that exists in Australia, but the methods used will assist in designing a similar tool in any population. The TRaPT accurately predicted discharge destination for 80% of patients and may form a useful aid for discharge decision making, with the simplified version facilitating its use as a bedside tool. © 2014 American Physical Therapy Association.
Towards early software reliability prediction for computer forensic tools (case study).
Abu Talib, Manar
2016-01-01
Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends an architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
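A minimal sketch of the kind of Markov-chain reliability arithmetic described, for a hypothetical three-stage forensic tool. The transition probabilities are illustrative, and the paper's COSMIC-FFP-based sizing is not reproduced here.

```python
# Absorbing Markov chain reliability: P(reach "success" from the start state).
import numpy as np

# Transient states: 0=acquire, 1=parse, 2=report. Absorbing: success, failure.
Q = np.array([            # transient-to-transient transition probabilities
    [0.00, 0.98, 0.00],   # acquire -> parse with prob 0.98
    [0.00, 0.00, 0.97],   # parse -> report with prob 0.97
    [0.00, 0.00, 0.00],
])
R = np.array([            # transient-to-absorbing: columns [success, failure]
    [0.00, 0.02],
    [0.00, 0.03],
    [0.99, 0.01],
])
# Fundamental matrix N = (I - Q)^-1; absorption probabilities B = N @ R.
N = np.linalg.inv(np.eye(3) - Q)
B = N @ R
print(f"Tool reliability from initial state: {B[0, 0]:.4f}")  # ~0.9411
```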
ERIC Educational Resources Information Center
Alyuruk, Hakan; Cavas, Levent
2014-01-01
Genomics and proteomics projects have produced a huge amount of raw biological data including DNA and protein sequences. Although these data have been stored in data banks, their evaluation is strictly dependent on bioinformatics tools. These tools have been developed by multidisciplinary experts for fast and robust analysis of biological data.…
Technology Solutions Case Study: Predicting Envelope Leakage in Attached Dwellings
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-11-01
The most common method of measuring air leakage is to perform a single (or solo) blower door pressurization and/or depressurization test. In detached housing, the single blower door test measures leakage to the outside. In attached housing, however, this “solo” test method measures both air leakage to the outside and air leakage between adjacent units through common surfaces. In an attempt to create a simplified tool for predicting leakage to the outside, Building America team Consortium for Advanced Residential Buildings (CARB) performed a preliminary statistical analysis on blower door test results from 112 attached dwelling units in four apartment complexes. Although the subject data set is limited in size and variety, the preliminary analyses suggest significant predictors are present and support the development of a predictive model. Further data collection is underway to create a more robust prediction tool for use across different construction types, climate zones, and unit configurations.
Advanced Computational Modeling Approaches for Shock Response Prediction
NASA Technical Reports Server (NTRS)
Derkevorkian, Armen; Kolaini, Ali R.; Peterson, Lee
2015-01-01
Motivation: (1) The activation of pyroshock devices such as explosives, separation nuts, pin-pullers, etc. produces high-frequency transient structural response, typically from a few tens of Hz to several hundreds of kHz. (2) The lack of reliable analytical tools makes the prediction of appropriate design and qualification test levels a challenge. (3) In the past few decades, several attempts have been made to develop methodologies that predict the structural responses to shock environments. (4) Currently, there is no validated approach that is viable for predicting shock environments over the full frequency range (i.e., 100 Hz to 10 kHz). Scope: (1) Model, analyze, and interpret space structural systems with complex interfaces and discontinuities, subjected to shock loads. (2) Assess the viability of a suite of numerical tools to simulate transient, non-linear solid mechanics and structural dynamics problems, such as shock wave propagation.
MetaDP: a comprehensive web server for disease prediction of 16S rRNA metagenomic datasets.
Xu, Xilin; Wu, Aiping; Zhang, Xinlei; Su, Mingming; Jiang, Taijiao; Yuan, Zhe-Ming
2016-01-01
High-throughput sequencing-based metagenomics has garnered considerable interest in recent years. Numerous methods and tools have been developed for the analysis of metagenomic data. However, it is still a daunting task to install a large number of tools and complete a complicated analysis, especially for researchers with minimal bioinformatics backgrounds. To address this problem, we constructed an automated software tool named MetaDP for 16S rRNA sequencing data analysis, including data quality control, operational taxonomic unit clustering, diversity analysis, and disease risk prediction modeling. Furthermore, a support vector machine-based prediction model for irritable bowel syndrome (IBS) was built by applying MetaDP to microbial 16S sequencing data from 108 children. The success of the IBS prediction model suggests that the platform may also be applied to other diseases related to gut microbes, such as obesity, metabolic syndrome, or intestinal cancer, among others (http://metadp.cn:7001/).
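A minimal sketch of the SVM-based prediction step on synthetic stand-in data: a random OTU abundance table and labels take the place of the study's 108-child dataset, and MetaDP's actual features, kernel, and tuning are not described in this abstract.

```python
# SVM disease-risk model on a synthetic OTU table (stand-in data only).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.poisson(5, size=(108, 200)).astype(float)  # 108 subjects x 200 OTUs
y = rng.integers(0, 2, size=108)                   # 1 = case, 0 = control

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```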
Dose-response patterns for vibration-induced white finger
Griffin, M; Bovenzi, M; Nelson, C
2003-01-01
Aims: To investigate alternative relations between cumulative exposures to hand-transmitted vibration (taking account of vibration magnitude, lifetime exposure duration, and frequency of vibration) and the development of white finger (Raynaud's phenomenon). Methods: Three previous studies were combined to provide a group of 1557 users of powered vibratory tools in seven occupational subgroups: stone grinders, stone carvers, quarry drillers, dockyard caulkers, dockyard boilermakers, dockyard painters, and forest workers. The estimated total operating duration in hours was thus obtained for each subject, for each tool, and for all tools combined. From the vibration magnitudes and exposure durations, seven alternative measures of cumulative exposure were calculated for each subject, using expressions of the form: dose = Σ a_i^m t_i, where a_i is the acceleration magnitude on tool i, t_i is the lifetime exposure duration for tool i, and m = 0, 1, 2, or 4. Results: For all seven alternative dose measures, an increase in dose was associated with a significant increase in the occurrence of vibration-induced white finger, after adjustment for age and smoking. However, dose measures with high powers of acceleration (m > 1) fared less well than measures in which the weighted or unweighted acceleration, and lifetime exposure duration, were given equal weight (m = 1). Dose determined solely by the lifetime exposure duration (without consideration of the vibration magnitude) gave better predictions than measures with m greater than unity. All measures of dose calculated from the unweighted acceleration gave better predictions than the equivalent dose measures using acceleration frequency-weighted according to current standards. Conclusions: Since the total duration of exposure does not discriminate between exposures accumulated over the day and those accumulated over years, a linear relation between vibration magnitude and exposure duration seems appropriate for predicting the occurrence of vibration-induced white finger. Poorer predictions were obtained when the currently recommended frequency weighting was employed than when accelerations at all frequencies were given equal weight. The findings suggest that improvements are possible to both the frequency weighting and the time dependency used to predict the development of vibration-induced white finger in current standards. PMID:12499452
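A worked example of the dose expression above, with invented tool accelerations (m/s²) and lifetime durations (hours); note that m = 0 reduces the dose to total lifetime exposure duration, the measure the study found to outperform high powers of acceleration.

    # dose = sum_i a_i**m * t_i, evaluated for m = 0, 1, 2 and 4
    tools = [
        {"a": 3.5, "t": 1200.0},   # e.g. a grinder
        {"a": 8.0, "t": 300.0},    # e.g. a rock drill
    ]

    def dose(tools, m):
        return sum(tool["a"] ** m * tool["t"] for tool in tools)

    for m in (0, 1, 2, 4):
        print(f"m = {m}: dose = {dose(tools, m):.1f}")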
Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles
NASA Technical Reports Server (NTRS)
Ivanco, Thomas G.
2016-01-01
Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
C. H. Greenberg; S. Goodrick; J. D. Austin; B. R. Parresol
2015-01-01
Hydroregimes of ephemeral wetlands affect reproductive success of many amphibian species and are sensitive to altered weather patterns associated with climate change. We used 17 years of weekly temperature, precipitation, and water-depth measurements for eight small, ephemeral, groundwater-driven sinkhole wetlands in Florida sandhills to develop a hydroregime predictive...
Background/Question/Methods What species of fish might someone find in a local stream? How might that community change as a result of changes to characteristics of the stream and its watershed? PiSCES is a browser-based toolkit developed to predict a fish community for any NHD-Pl...
A quick reality check for microRNA target prediction.
Kast, Juergen
2011-04-01
The regulation of protein abundance by microRNA (miRNA)-mediated repression of mRNA translation is a rapidly growing area of interest in biochemical research. In animal cells, the miRNA seed sequence does not perfectly match that of the mRNA it targets, resulting in a large number of possible miRNA targets and varied extents of repression. Several software tools are available for the prediction of miRNA targets, yet the overlap between them is limited. Jovanovic et al. have developed and applied a targeted, quantitative approach to validate predicted miRNA target proteins. Using a proteome database, they have set up and tested selected reaction monitoring assays for approximately 20% of more than 800 predicted let-7 targets, as well as control genes in Caenorhabditis elegans. Their results demonstrate that such assays can be developed quickly and with relative ease, and applied in a high-throughput setup to verify known and identify novel miRNA targets. They also show, however, that the choice of the biological system and material has a noticeable influence on the frequency, extent and direction of the observed changes. Nonetheless, selected reaction monitoring assays, such as those developed by Jovanovic et al., represent an attractive new tool in the study of miRNA function at the organism level.
Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.
Fong, Stephen S
2014-08-01
Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
An experiment with interactive planning models
NASA Technical Reports Server (NTRS)
Beville, J.; Wagner, J. H.; Zannetos, Z. S.
1970-01-01
Experiments on decision making in planning problems are described. Executives were tested in dealing with capital investments and competitive pricing decisions under conditions of uncertainty. A software package, the interactive risk analysis model system, was developed, and two controlled experiments were conducted. It is concluded that planning models can aid management, and predicted uses of the models are as a central tool, as an educational tool, to improve consistency in decision making, to improve communications, and as a tool for consensus decision making.
Software Engineering Tools for Scientific Models
NASA Technical Reports Server (NTRS)
Abrams, Marc; Saboo, Pallabi; Sonsini, Mike
2013-01-01
Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.
Comparison of Predictive Modeling Methods of Aircraft Landing Speed
NASA Technical Reports Server (NTRS)
Diallo, Ousmane H.
2012-01-01
Expected increases in air traffic demand have stimulated the development of air traffic control tools intended to assist the air traffic controller in accurately and precisely spacing aircraft landing at congested airports. Such tools will require an accurate landing-speed prediction to increase throughput while decreasing the controller interventions necessary to avoid separation violations. There are many practical challenges to developing an accurate landing-speed model that has acceptable prediction errors. This paper discusses the development of a near-term implementation, using readily available information, to estimate/model final approach speed from the top of the descent phase of flight to the landing runway. As a first approach, all variables found to contribute directly to the landing-speed prediction model are used to build a multiple-regression response surface equation (RSE) technique. Data obtained from operations of a major airline for a passenger transport aircraft type at the Dallas/Fort Worth International Airport are used to predict the landing speed. The approach was promising because it decreased the standard deviation of the landing-speed prediction error by at least 18% from the standard deviation of the baseline error, depending on the gust condition at the airport. However, when the number of variables is reduced to those most likely obtainable at other major airports, the RSE model shows little improvement over the existing methods. Consequently, a neural network that relies on a nonlinear regression technique is utilized as an alternative modeling approach. For the reduced-variable cases, the standard deviation of the neural network model errors represents an over 5% reduction compared to the RSE model errors, and at least a 10% reduction relative to the baseline predicted landing-speed error standard deviation. Overall, the constructed models predict the landing speed more accurately and precisely than the current state of the art.
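The contrast between the two approaches can be sketched in a few lines of Python; the three normalized predictors and the synthetic landing speeds below are stand-ins, not the airline data used in the paper.

    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.random((200, 3))   # normalized predictors (e.g. weight, wind, flaps)
    y = 130 + 20 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 1, 200)  # knots

    # Response surface equation: second-order polynomial regression.
    rse = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    rse.fit(X, y)

    # Neural network as the nonlinear alternative.
    nn = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=1)
    nn.fit(X, y)

    x_new = [[0.5, 0.2, 0.7]]
    print("RSE prediction:", rse.predict(x_new))
    print("NN prediction:", nn.predict(x_new))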
The PREM score: a graphical tool for predicting survival in very preterm births.
Cole, T J; Hey, E; Richmond, S
2010-01-01
To develop a tool for predicting survival to term in babies born more than 8 weeks early using only information available at or before birth. 1456 non-malformed very preterm babies of 22-31 weeks' gestation born in 2000-3 in the north of England and 3382 births of 23-31 weeks born in 2000-4 in Trent. Survival to term, predicted from information available at birth, and at the onset of labour or delivery. Development of a logistic regression model (the prematurity risk evaluation measure or PREM score) based on gestation, birth weight for gestation and base deficit from umbilical cord blood. Gestation was by far the most powerful predictor of survival to term, and as few as 5 extra days can double the chance of survival. Weight for gestation also had a powerful but non-linear effect on survival, with weight between the median and 85th centile predicting the highest survival. Using this information survival can be predicted almost as accurately before birth as after, although base deficit further improves the prediction. A simple graph is described that shows how the two main variables gestation and weight for gestation interact to predict the chance of survival. The PREM score can be used to predict the chance of survival at or before birth almost as accurately as existing measures influenced by post-delivery condition, to balance risk at entry into a controlled trial and to adjust for differences in "case mix" when assessing the quality of perinatal care.
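The structure of such a score can be sketched as a logistic model in Python; the coefficients below are placeholders chosen only so that five extra days of gestation roughly double the odds of survival, and they are not the published PREM weights (in particular, the real score uses a non-linear weight-for-gestation term).

    import numpy as np

    def survival_probability(gestation_days, weight_z, base_deficit,
                             b0=-23.0, b_gest=0.14, b_wz=0.6, b_bd=-0.1):
        # Placeholder linear predictor on the logit scale.
        eta = (b0 + b_gest * gestation_days + b_wz * weight_z
               + b_bd * base_deficit)
        return 1.0 / (1.0 + np.exp(-eta))

    # Five extra days of gestation double the odds with these weights:
    print(survival_probability(24 * 7, 0.5, 4.0))
    print(survival_probability(24 * 7 + 5, 0.5, 4.0))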
Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C
2012-10-01
Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.
Huang, Ying; Chen, Shi-Yi; Deng, Feilong
2016-01-01
In silico analysis of DNA sequences is an important area of computational biology in the post-genomic era. Over the past two decades, computational approaches for ab initio prediction of gene structure from genome sequence alone have greatly facilitated our understanding of a variety of biological questions. Although the computational prediction of protein-coding genes is already well established, robustly finding non-coding RNA genes, such as miRNA and lncRNA, remains challenging. The two main aspects of ab initio gene prediction are the values computed to describe sequence features and the algorithm used to train the discriminant function; different combinations of the two are employed in various bioinformatic tools. Herein, we briefly review these well-characterized sequence features in eukaryote genomes and their applications to ab initio gene prediction. The main purpose of this article is to provide an overview for beginners who aim to develop related bioinformatic tools.
A Deep Space Orbit Determination Software: Overview and Event Prediction Capability
NASA Astrophysics Data System (ADS)
Kim, Youngkwang; Park, Sang-Young; Lee, Eunji; Kim, Minsik
2017-06-01
This paper presents an overview of deep space orbit determination software (DSODS), as well as validation and verification results on its event prediction capabilities. DSODS was developed in the MATLAB object-oriented programming environment to support the Korea Pathfinder Lunar Orbiter (KPLO) mission. DSODS has three major capabilities: celestial event prediction for spacecraft, orbit determination with deep space network (DSN) tracking data, and DSN tracking data simulation. To achieve its functionality requirements, DSODS consists of four modules: orbit propagation (OP), event prediction (EP), data simulation (DS), and orbit determination (OD) modules. This paper explains the highest-level data flows between modules in the event prediction, orbit determination, and tracking data simulation processes. Furthermore, to address the event prediction capability of DSODS, this paper introduces the OP and EP modules. The role of the OP module is to handle time and coordinate system conversions, to propagate spacecraft trajectories, and to handle the ephemerides of spacecraft and celestial bodies. Currently, the OP module utilizes the General Mission Analysis Tool (GMAT) as a third-party software component for high-fidelity deep space propagation, as well as time and coordinate system conversions. The role of the EP module is to predict celestial events, including eclipses and ground station visibilities, and this paper presents the functionality requirements of the EP module. The validation and verification results show that, for most cases, event prediction errors were less than 10 milliseconds when compared with flight proven mission analysis tools such as GMAT and Systems Tool Kit (STK). Thus, we conclude that DSODS is capable of predicting events for the KPLO in real mission applications.
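As a toy illustration of the kind of check an event prediction module performs, the sketch below tests ground station visibility as the spacecraft's elevation above a mask angle, using a spherical-Earth approximation; real DSODS performs full time and coordinate conversions through GMAT, so everything here is a simplification.

    import numpy as np

    def elevation_deg(station_ecef, spacecraft_ecef):
        # Local zenith approximated by the station position unit vector.
        zenith = station_ecef / np.linalg.norm(station_ecef)
        los = spacecraft_ecef - station_ecef
        los = los / np.linalg.norm(los)
        return np.degrees(np.arcsin(np.dot(zenith, los)))

    def is_visible(station_ecef, spacecraft_ecef, mask_deg=10.0):
        return elevation_deg(station_ecef, spacecraft_ecef) > mask_deg

    station = np.array([6371.0, 0.0, 0.0])        # km, on the equator
    sc = np.array([384400.0, 10000.0, 5000.0])    # km, near lunar distance
    print(is_visible(station, sc))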
NASA Astrophysics Data System (ADS)
Kumbur, E. C.; Sharp, K. V.; Mench, M. M.
Developing a robust, intelligent design tool for multivariate optimization of multi-phase transport in fuel cell diffusion media (DM) is of utmost importance to develop advanced DM materials. This study explores the development of a DM design algorithm based on an artificial neural network (ANN) that can be used as a powerful tool for predicting the capillary transport characteristics of fuel cell DM. Direct measurements of drainage capillary pressure-saturation curves of the differently engineered DMs (5, 10 and 20 wt.% PTFE) were performed at room temperature under three compressions (0, 0.6 and 1.4 MPa) [E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1295-B1304; E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1305-B1314; E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1315-B1324]. The generated benchmark data were utilized to systematically train a three-layered ANN framework that uses the feed-forward error back-propagation methodology. The designed ANN successfully predicts the measured capillary pressures within an average uncertainty of ±5.1% of the measured data, confirming that the present ANN model can be used as a design tool within the range of tested parameters. The ANN simulations reveal that tailoring the DM with high PTFE loading and applying high compression pressure lead to a higher capillary pressure, therefore promoting the liquid water transport within the pores of the DM. Any increase in hydrophobicity of the DM is found to amplify the compression effect, thus yielding a higher capillary pressure for the same saturation level and compression.
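A minimal sketch of such a three-layered feed-forward network in Python, mapping (PTFE loading, compression, saturation) to capillary pressure; the training data below follow an invented toy trend and are not the benchmark measurements of the study.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)
    n = 300
    ptfe = rng.choice([5.0, 10.0, 20.0], size=n)   # wt.% PTFE
    comp = rng.choice([0.0, 0.6, 1.4], size=n)     # compression, MPa
    sat = rng.random(n)                            # liquid saturation
    X = np.column_stack([ptfe, comp, sat])
    # Toy trend: pressure rises with saturation, PTFE loading, compression.
    y = 1000 * sat + 50 * ptfe + 200 * comp + rng.normal(0, 50, n)

    ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=10000, random_state=2)
    ann.fit(X, y)
    print(ann.predict([[20.0, 1.4, 0.5]]))   # toy pressure units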
Thakur, Shalabh; Guttman, David S
2016-06-30
Comparative analysis of whole genome sequence data from closely related prokaryotic species or strains is becoming an increasingly important and accessible approach for addressing both fundamental and applied biological questions. While there are a number of excellent tools developed for performing this task, most scale poorly when faced with hundreds of genome sequences, and many require extensive manual curation. We have developed a de novo genome analysis pipeline (DeNoGAP) for the automated, iterative, and high-throughput analysis of data from comparative genomics projects involving hundreds of whole genome sequences. The pipeline is designed to perform reference-assisted and de novo gene prediction, homolog protein family assignment, ortholog prediction, functional annotation, and pan-genome analysis using a range of proven tools and databases. While most existing methods scale quadratically with the number of genomes, since they rely on pairwise comparisons among predicted protein sequences, DeNoGAP scales linearly, since the homology assignment is based on iteratively refined hidden Markov models. This iterative clustering strategy enables DeNoGAP to handle a very large number of genomes using minimal computational resources. Moreover, the modular structure of the pipeline permits easy updates as new analysis programs become available. DeNoGAP integrates bioinformatics tools and databases for comparative analysis of a large number of genomes. The pipeline offers tools and algorithms for annotation and analysis of completed and draft genome sequences. The pipeline is developed using Perl, BioPerl, and SQLite on Ubuntu Linux version 12.04 LTS. Currently, the software package includes a script for automated installation of the necessary external programs on Ubuntu Linux; however, the pipeline should also be compatible with other Linux and Unix systems once the necessary external programs are installed. DeNoGAP is freely available at https://sourceforge.net/projects/denogap/ .
Peters, Adam; Schlekat, Christian E; Merrington, Graham
2016-10-01
A bioavailability-based environmental quality standard (EQS) was established for nickel in freshwaters under the European Union's Water Framework Directive. Bioavailability correction based on pH, water hardness, and dissolved organic carbon is a demonstrable improvement on existing hardness-based quality standards, which may be underprotective in high-hardness waters. The present study compares several simplified bioavailability tools developed to implement the Ni EQS (biomet, M-BAT, and PNECpro) against the full bioavailability normalization procedure on which the EQS was based. Generally, all tools correctly distinguished sensitive waters from insensitive waters, although with varying degrees of accuracy compared with full normalization. Biomet and M-BAT predictions were consistent with, but less accurate than, full bioavailability normalization results, whereas PNECpro results were generally more conservative. The comparisons revealed important differences in how the tools were developed, which result in differences in their predictions. Importantly, the models used for the development of PNECpro use a different ecotoxicity dataset and a different bioavailability normalization approach with fewer biotic ligand models (BLMs) than were used for the derivation of the Ni EQS. The failure to include all of the available toxicity data, and all of the appropriate Ni BLMs, has led to some significant differences between the predictions provided by PNECpro and those calculated using the process agreed to in Europe under the Water Framework Directive and other chemicals management programs (such as REACH). These considerable differences mean that PNECpro does not reflect the behavior, fate, and ecotoxicity of nickel, and raise concerns about its applicability for checking compliance against the Ni EQS. Environ Toxicol Chem 2016;35:2397-2404. © 2016 SETAC.
Assessment of Near-Field Sonic Boom Simulation Tools
NASA Technical Reports Server (NTRS)
Casper, J. H.; Cliff, S. E.; Thomas, S. D.; Park, M. A.; McMullen, M. S.; Melton, J. E.; Durston, D. A.
2008-01-01
A recent study for the Supersonics Project, within the National Aeronautics and Space Administration, has been conducted to assess current in-house capabilities for the prediction of near-field sonic boom. Such capabilities are required to simulate the highly nonlinear flow near an aircraft, wherein a sonic-boom signature is generated. There are many available computational fluid dynamics codes that could be used to provide the near-field flow for a sonic boom calculation. However, such codes have typically been developed for applications involving aerodynamic configuration, for which an efficiently generated computational mesh is usually not optimum for a sonic boom prediction. Preliminary guidelines are suggested to characterize a state-of-the-art sonic boom prediction methodology. The available simulation tools that are best suited to incorporate into that methodology are identified; preliminary test cases are presented in support of the selection. During this phase of process definition and tool selection, parallel research was conducted in an attempt to establish criteria that link the properties of a computational mesh to the accuracy of a sonic boom prediction. Such properties include sufficient grid density near shocks and within the zone of influence, which are achieved by adaptation and mesh refinement strategies. Prediction accuracy is validated by comparison with wind tunnel data.
Development of a model for predicting NASA/MSFC program success
NASA Technical Reports Server (NTRS)
Riggs, Jeffrey; Miller, Tracy; Finley, Rosemary
1990-01-01
Research conducted during the execution of a previous contract (NAS8-36955/0039) firmly established the feasibility of developing a tool to aid decision makers in predicting the potential success of proposed projects. The final report from that investigation contains an outline of the method to be applied in developing this Project Success Predictor Model. As a follow-on to the previous study, this report describes in detail the development of this model and includes full explanation of the data-gathering techniques used to poll expert opinion. The report includes the presentation of the model code itself.
Evaluation of Load Analysis Methods for NASA's GIII Adaptive Compliant Trailing Edge Project
NASA Technical Reports Server (NTRS)
Cruz, Josue; Miller, Eric J.
2016-01-01
The Air Force Research Laboratory (AFRL), NASA Armstrong Flight Research Center (AFRC), and FlexSys Inc. (Ann Arbor, Michigan) have collaborated to flight test the Adaptive Compliant Trailing Edge (ACTE) flaps. These flaps were installed on a Gulfstream Aerospace Corporation (GAC) GIII aircraft and tested at AFRC at various deflection angles over a range of flight conditions. External aerodynamic and inertial load analyses were conducted with the intention to ensure that the change in wing loads due to the deployed ACTE flap did not overload the existing baseline GIII wing box structure. The objective of this paper was to substantiate the analysis tools used for predicting wing loads at AFRC. Computational fluid dynamics (CFD) models and distributed mass inertial models were developed for predicting the loads on the wing. The analysis tools included TRANAIR (full potential) and CMARC (panel) models. Aerodynamic pressure data from the analysis codes were validated against static pressure port data collected in-flight. Combined results from the CFD predictions and the inertial load analysis were used to predict the normal force, bending moment, and torque loads on the wing. Wing loads obtained from calibrated strain gages installed on the wing were used for substantiation of the load prediction tools. The load predictions exhibited good agreement compared to the flight load results obtained from calibrated strain gage measurements.
Computational prediction of type III and IV secreted effectors in Gram-negative bacteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDermott, Jason E.; Corrigan, Abigail L.; Peterson, Elena S.
2011-01-01
In this review, we provide an overview of the methods employed in four recent papers that described novel methods for computational prediction of secreted effectors from type III and IV secretion systems in Gram-negative bacteria. We summarize the results of these studies in terms of performance at accurately predicting secreted effectors and the similarities found between secretion signals, which may reflect biologically relevant features for recognition. We discuss the web-based tools for secreted effector prediction described in these studies and announce the availability of our tool, the SIEVEserver (http://www.biopilot.org). Finally, we assess the accuracy of the three type III effector prediction methods on a small set of proteins not known prior to the development of these tools that we have recently discovered and validated using both experimental and computational approaches. Our comparison shows that all methods use similar approaches and, in general, arrive at similar conclusions. We discuss the possibility of an order-dependent motif in the secretion signal, which was a point of disagreement among the studies. Our results show that there may be classes of effectors in which the signal has a loosely defined motif, and others in which secretion depends only on compositional biases. Computational prediction of secreted effectors from protein sequences represents an important step toward better understanding the interaction between pathogens and hosts.
Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis
2015-01-01
Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
Development and validation of a cancer-specific swallowing assessment tool: MASA-C.
Carnaby, Giselle D; Crary, Michael A
2014-03-01
We present data from a sample of patients receiving radiotherapy for head/neck cancer to define and measure the validity of a new clinical assessment measure for swallowing. Fifty-eight patients undergoing radiotherapy (±chemotherapy) for head/neck cancer (HNC) supported the development of a physiology-based assessment tool of swallowing (Mann Assessment of Swallowing Ability--Cancer: MASA-C) administered at two time points (baseline and following radiotherapy treatment). The new exam was evaluated for internal consistency of items using Cronbach's alpha. Reliability of measurement was evaluated with intraclass correlation (ICC) and the Kappa statistic between two independent raters. Concurrent validity was established through comparison with the original MASA examination and against the referent standard videofluoroscopic swallowing examination (VFE). Sensitivity, specificity, and likelihood ratios along with 95% confidence intervals (CIs) were derived for comparison of the two evaluation forms (MASA vs. MASA-C). Accuracy of diagnostic precision was displayed using receiver operator characteristic curves. The new MASA-C tool demonstrated superior validity to the original MASA examination applied to a HNC population. In comparison to the VFE referent exam, the MASA-C revealed strong sensitivity and specificity (Se = 83%, Sp = 96%), predictive values (positive predictive value (PPV) 0.95, negative predictive value (NPV) 0.86), and likelihood ratios (21.6). In addition, it demonstrated good reliability (ICC = 0.96) between speech-language pathology raters. The MASA-C is a reliable and valid scale that is sensitive to differences in swallowing performance in HNC patients with and without dysphagia. Future longitudinal evaluation of this tool in larger samples is suggested. The development and refinement of this swallowing assessment tool for use in multidisciplinary HNC teams will facilitate earlier identification of patients with swallowing difficulties and enable more efficient allocation of resources to the management of dysphagia in this population. The MASA-C may also prove useful in future clinical HNC rehabilitation trials with this population.
Liu, Shi; Wu, Yu; Wooten, H Omar; Green, Olga; Archer, Brent; Li, Harold; Yang, Deshan
2016-03-08
A software tool was developed to predict, given a new treatment plan, the treatment delivery time for radiation therapy (RT) treatments of patients on the ViewRay magnetic resonance image-guided radiation therapy (MR-IGRT) delivery system. This tool is necessary for managing patient treatment scheduling in our clinic. The predicted treatment delivery time and the assessment of plan complexity could also be useful to aid treatment planning. A patient's total treatment delivery time, not including the time required for localization, is modeled as the sum of four components: 1) the treatment initialization time; 2) the total beam-on time; 3) the gantry rotation time; and 4) the multileaf collimator (MLC) motion time. Each of the four components is predicted separately. The total beam-on time can be calculated using both the planned beam-on time and the decay-corrected dose rate. To predict the remaining components, we retrospectively analyzed the patient treatment delivery record files. The initialization time is demonstrated to be random, since it depends on the final gantry angle of the previous treatment. Based on modeling the relationships between the gantry rotation angles and the corresponding rotation time, linear regression is applied to predict the gantry rotation time. The MLC motion time is calculated using the leaf delay modeling method and the leaf motion speed. A quantitative analysis was performed to understand the correlation between the total treatment time and plan complexity. The proposed algorithm is able to predict the ViewRay treatment delivery time with an average prediction error of 0.22 min (1.82%) and a maximal prediction error of 0.89 min (7.88%). The analysis showed a correlation between the plan modulation (PM) factor and both the total treatment delivery time and the treatment delivery duty cycle. A possibility has been identified to significantly reduce MLC motion time by optimizing the positions of closed MLC pairs. The accuracy of the proposed prediction algorithm is sufficient to support patient treatment appointment scheduling. The developed software tool is currently in daily use in our clinic and could also serve as an important indicator of treatment plan complexity.
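The four-component structure of the model lends itself to a short sketch; the regression coefficients, leaf speed, and initialization time below are invented placeholders, not the fitted values from the paper.

    def predict_delivery_time(planned_beam_on_min, decay_factor,
                              gantry_rotations_deg, mlc_travel_cm,
                              init_time_min=0.5, mlc_speed_cm_per_s=2.0,
                              rot_intercept_s=2.0, rot_slope_s_per_deg=0.1):
        # 2) beam-on time scaled by a simple source-decay correction
        beam_on = planned_beam_on_min / decay_factor
        # 3) gantry rotation time from a linear model of rotation angle
        rotation = sum(rot_intercept_s + rot_slope_s_per_deg * a
                       for a in gantry_rotations_deg) / 60.0
        # 4) MLC motion time from leaf travel and leaf speed
        mlc = mlc_travel_cm / mlc_speed_cm_per_s / 60.0
        # 1) initialization time is random in reality; use a mean here
        return init_time_min + beam_on + rotation + mlc

    print(predict_delivery_time(8.0, 0.9, [30, 45, 60], 240.0), "min")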
Finite Element Simulations of Micro Turning of Ti-6Al-4V using PCD and Coated Carbide tools
NASA Astrophysics Data System (ADS)
Jagadesh, Thangavel; Samuel, G. L.
2017-02-01
The demand for manufacturing axi-symmetric Ti-6Al-4V implants is increasing in biomedical applications, and their manufacture involves the micro turning process. To understand the micro turning process, in this work a 3D finite element model has been developed for predicting the tool-chip interface temperature and the cutting, thrust, and axial forces. The strain gradient effect has been included in the Johnson-Cook material model to represent the flow stress of the work material. To verify the simulation results, experiments were conducted at four different feed rates and three different cutting speeds. Since titanium alloy has a low Young's modulus, the spring-back effect is predominant for the higher-edge-radius coated carbide tool, which leads to an increase in the forces. In contrast, the polycrystalline diamond (PCD) tool has a smaller edge radius, which leads to lower forces, and its high thermal conductivity decreases the tool-chip interface temperature. The tool-chip interface temperature increases with cutting speed, although the increase is smaller for the PCD tool than for the coated carbide tool. When the uncut chip thickness decreases, the specific cutting energy increases due to material strengthening effects. Surface roughness is higher for the coated carbide tool than for the PCD tool due to the ploughing effect. The average prediction errors of the finite element model for cutting and thrust forces are 11.45% and 14.87%, respectively.
Ribay, Kathryn; Kim, Marlene T; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao
2016-03-01
Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders), which was used to develop QSAR models based on chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which became available to us after the models were developed. The cross-validation results on the training set [Correct Classification Rate (CCR) = 0.72] were much higher than the external predictivity on the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each pair of compounds. The nearest neighbors for each compound within the set were then identified, and its ERα binding potential was predicted from its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross-validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR models, particularly for the activity cliffs that induce prediction errors. The results of this study indicate that the response profiles of chemicals from public data provide useful information for modeling and evaluation purposes. Public big data resources should be considered along with chemical structure information when predicting new compounds, such as unknown ERα binding agents.
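The hybrid nearest-neighbor step can be sketched as follows; the binary assay profiles, similarity measure, and labels are all illustrative assumptions standing in for the prioritized PubChem bioassay data.

    import numpy as np

    rng = np.random.default_rng(3)
    n_train, n_assays = 518, 100
    profiles = rng.integers(0, 2, size=(n_train, n_assays))
    labels = rng.integers(0, 2, size=n_train)    # 1 = ERα binder

    def biosimilarity(p, q):
        # Tanimoto-style similarity between binary assay profiles.
        both = np.sum((p == 1) & (q == 1))
        either = np.sum((p == 1) | (q == 1))
        return both / either if either else 0.0

    def predict(query_profile, k=5):
        sims = np.array([biosimilarity(query_profile, p) for p in profiles])
        nearest = np.argsort(sims)[-k:]
        return labels[nearest].mean()   # fraction of binder neighbors

    print(predict(rng.integers(0, 2, size=n_assays)))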
Predictive models in cancer management: A guide for clinicians.
Kazem, Mohammed Ali
2017-04-01
Predictive tools in cancer management are used to predict different outcomes, including survival probability or risk of recurrence. The uptake of these tools by clinicians involved in cancer management has not been as common as that of other clinical tools, which may be due to the complexity of some of these tools or a lack of understanding of how they can aid decision-making in particular clinical situations. The aim of this article is to improve clinicians' knowledge and understanding of predictive tools used in cancer management, including how they are built, how they can be applied to medical practice, and what their limitations may be. A literature review was conducted to investigate the role of predictive tools in cancer management. All predictive models share similar characteristics, but depending on the type of the tool, its ability to predict an outcome will differ. Each type has its own pros and cons, and its generalisability will depend on the cohort used to build the tool. These factors will affect the clinician's decision whether or not to apply the model to their cohort. Before a model is used in clinical practice, it is important to appreciate how the model is constructed, what its use may add over and above traditional decision-making tools, and what problems or limitations may be associated with it. Understanding all the above is an important step for any clinician who wants to decide whether or not to use predictive tools in their practice. Copyright © 2016 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
On Nonlinear Combustion Instability in Liquid Propellant Rocket Motors
NASA Technical Reports Server (NTRS)
Sims, J. D. (Technical Monitor); Flandro, Gary A.; Majdalani, Joseph; Sims, Joseph D.
2004-01-01
All liquid propellant rocket instability calculations in current use have limited value in the predictive sense and serve mainly as a correlating framework for the available data sets. The well-known n-τ model first introduced by Crocco and Cheng in 1956 is still used as the primary analytical tool of this type. A multitude of attempts to establish practical analytical methods has achieved only limited success. These methods usually produce only stability boundary maps that are of little use in making critical design decisions in new motor development programs. Recent progress in understanding the mechanisms of combustion instability in solid propellant rockets provides a firm foundation for a new approach to prediction, diagnosis, and correction of the closely related problems in liquid motor instability. For predictive tools to be useful in the motor design process, they must have the capability to accurately determine: 1) the time evolution of the pressure oscillations and limit amplitude, 2) the critical triggering pulse amplitude, and 3) the unsteady heat transfer rates at injector surfaces and chamber walls. The method described in this paper relates these critical motor characteristics directly to system design parameters. Inclusion of mechanisms such as wave steepening, vorticity production and transport, and unsteady detonation wave phenomena greatly enhances the representation of key features of motor chamber oscillatory behavior. The basic theoretical model is described, and preliminary computations are compared to experimental data. A plan to develop the new predictive method into a comprehensive analysis tool is also described.
Giollo, Manuel; Martin, Alberto J M; Walsh, Ian; Ferrari, Carlo; Tosatto, Silvio C E
2014-01-01
The rapid growth of un-annotated missense variants poses challenges requiring novel strategies for their interpretation. From the thermodynamic point of view, amino acid changes can lead to a change in the internal energy of a protein and induce structural rearrangements. This is of great relevance for the study of diseases and for protein design, justifying the development of prediction methods for variant-induced stability changes. Here we propose NeEMO, a tool for the evaluation of stability changes using an effective representation of proteins based on residue interaction networks (RINs). RINs are used to extract useful features describing interactions of the mutant amino acid with its structural environment. Benchmarking shows NeEMO to be very effective, allowing reliable predictions in different parts of the protein such as β-strands and buried residues. Validation on a previously published independent dataset shows that NeEMO has a Pearson correlation coefficient of 0.77 and a standard error of 1 kcal/mol, outperforming nine recent methods. The NeEMO web server can be freely accessed at http://protein.bio.unipd.it/neemo/. NeEMO offers an innovative and reliable tool for the annotation of amino acid changes. A key contribution is the use of RINs, which can model proteins and their interactions effectively. Interestingly, the approach is very general and can motivate the development of a new family of RIN-based protein structure analyzers. NeEMO may suggest innovative strategies for bioinformatics tools beyond protein stability prediction.
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.; ...
2016-10-01
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite-moderated gas-cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and the depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - to estimate the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that the methods within the CRPE tool that predicted the critical position consistently well also agreed with each other in their predictions of power densities and of uranium and plutonium isotopics. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
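The polynomial-fit method can be sketched in a few lines; the k_eff values below are invented, whereas in the CRPE tool they would come from MCNP criticality calculations.

    import numpy as np

    rod_pos = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # % withdrawn
    k_eff = np.array([0.985, 0.993, 1.002, 1.012, 1.024])

    # Second-order polynomial fit of k_eff versus rod position,
    # then solve k_eff(x) = 1 for the critical position.
    a, b, c = np.polyfit(rod_pos, k_eff, 2)
    roots = np.roots([a, b, c - 1.0])
    critical = [r.real for r in roots
                if abs(r.imag) < 1e-9 and 0.0 <= r.real <= 100.0]
    print("estimated critical rod position:", critical)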
Ahmadi, Hamed; Rodehutscord, Markus
2017-01-01
In the nutrition literature, there are several reports on the use of artificial neural network (ANN) and multiple linear regression (MLR) approaches for predicting feed composition and nutritive value, while the use of the support vector machine (SVM) method as a new alternative to MLR and ANN models has not been fully investigated. MLR, ANN, and SVM models were developed to predict the metabolizable energy (ME) content of compound feeds for pigs, based on the German energy evaluation system, from analyzed contents of crude protein (CP), ether extract (EE), crude fiber (CF), and starch. A total of 290 datasets from standardized digestibility studies with compound feeds was provided by several institutions and published papers, and ME was calculated thereon. The accuracy and precision of the developed models were evaluated on their prediction values. The results revealed that the developed ANN (R² = 0.95; root mean square error (RMSE) = 0.19 MJ/kg of dry matter) and SVM (R² = 0.95; RMSE = 0.21 MJ/kg of dry matter) models produced better prediction values in estimating ME in compound feed than the conventional MLR model (R² = 0.89; RMSE = 0.27 MJ/kg of dry matter), although there were no obvious differences between the performance of the ANN and SVM models. Thus, the SVM model may also be considered a promising tool for modeling the relationship between chemical composition and ME of compound feeds for pigs. To provide readers and nutritionists with an easy and rapid tool, an Excel® calculator, namely SVM_ME_pig, was created to predict the metabolizable energy values in compound feeds for pigs using the developed support vector machine model.
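A minimal scikit-learn sketch of the SVM regression approach; the four analyzed contents and ME values below are synthetic stand-ins for the 290 digestibility datasets, and the linear trend is only a toy surrogate for the German energy evaluation system.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(4)
    n = 290
    # CP, EE, CF, starch in g/kg of dry matter (invented ranges).
    X = rng.uniform([100, 20, 20, 300], [250, 80, 80, 550], size=(n, 4))
    y = (0.021 * X[:, 0] + 0.035 * X[:, 1] - 0.015 * X[:, 2]
         + 0.017 * X[:, 3]) + rng.normal(0, 0.2, n)   # MJ ME/kg DM

    svm_me = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
    svm_me.fit(X, y)
    print(svm_me.predict([[180.0, 50.0, 40.0, 450.0]]), "MJ ME/kg DM")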
Model Performance Evaluation and Scenario Analysis (MPESA)
Model Performance Evaluation and Scenario Analysis (MPESA) assesses how well models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).
A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit
NASA Technical Reports Server (NTRS)
Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.
2016-01-01
Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is, however, prohibitively time-consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose in a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume and clearance between the suit and the body surface at reduced time and cost.
Development of assessment tools to measure organizational support for employee health.
Golaszewski, Thomas; Barr, Donald; Pronk, Nico
2003-01-01
To develop systems that measure and effect organizational support for employee health. Multiple studies and developmental projects were reviewed that show the process of instrument development, metric quality testing, utilization within intervention studies, and prediction modeling efforts. Demographic patterns indicate high support levels and relationships of subsections to various employee health risks. Successes with the initial version have given rise to 2 additional evaluation tools. The availability of these systems illustrates how ecological models can be practically applied. Such efforts contribute to the paradigm shift in worksite health promotion that focuses on the organization as the target of intervention.
Forensic analysis of explosions: Inverse calculation of the charge mass.
van der Voort, M M; van Wees, R M M; Brouwer, S D; van der Jagt-Deutekom, M J; Verreault, J
2015-07-01
Forensic analysis of explosions consists of determining the point of origin, the explosive substance involved, and the charge mass. Within the EU FP7 project Hyperion, TNO developed the Inverse Explosion Analysis (TNO-IEA) tool to estimate the charge mass and point of origin based on observed damage around an explosion. In this paper, inverse models are presented based on two frequently occurring and reliable sources of information: window breakage and building damage. The models were verified by applying them to the Enschede fireworks disaster and the Khobar Towers attack. Furthermore, a statistical method was developed to combine the various types of data in order to determine an overall charge mass distribution. In relatively open environments, as in the Enschede fireworks disaster, the models generate realistic charge masses that are consistent with values found in the forensic literature. The spread predicted by the IEA tool is, however, larger than presented in the literature for these specific cases. This is also realistic, given the large inherent uncertainties in a forensic analysis. The IEA models give a reasonable first-order estimate of the charge mass in a densely built urban environment, such as in the Khobar Towers attack. Due to blast shielding effects, which are not taken into account in the IEA tool, this is usually an underprediction. To obtain more accurate predictions, the application of Computational Fluid Dynamics (CFD) simulations is advised. The TNO IEA tool offers unique possibilities for inversely calculating the TNT-equivalent charge mass based on a large variety of explosion effects and observations. The IEA tool enables forensic analysts, including those who are not experts on explosion effects, to perform an analysis with greatly reduced effort. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Pai, Pei-Jing; Hu, Yingwei; Lam, Henry
2016-08-31
Intact glycopeptide MS analysis to reveal site-specific protein glycosylation is an important frontier of proteomics. However, computational tools for analyzing MS/MS spectra of intact glycopeptides are still limited and not well-integrated into existing workflows. In this work, a new computational tool has been developed for intact glycopeptide analysis, combining the spectral library building/searching tool SpectraST (Lam et al. Nat. Methods 2008, 5, 873-875) and the glycopeptide fragmentation prediction tool MassAnalyzer (Zhang et al. Anal. Chem. 2010, 82, 10194-10202). Specifically, this tool enables the determination of the glycan structure directly from low-energy collision-induced dissociation (CID) spectra of intact glycopeptides. Given a list of possible glycopeptide sequences as input, a sample-specific spectral library of MassAnalyzer-predicted spectra is built using SpectraST. Glycan identification from CID spectra is achieved by spectral library searching against this library, in which both the m/z and intensity information of the possible fragment ions are taken into consideration for improved accuracy. We validated our method using a standard glycoprotein, human transferrin, and evaluated its potential for use in site-specific glycosylation profiling of glycoprotein datasets from LC-MS/MS. In addition, we further applied our method to reveal, for the first time, the site-specific N-glycosylation profile of recombinant human acetylcholinesterase expressed in HEK293 cells. For maximum usability, SpectraST is developed as part of the Trans-Proteomic Pipeline (TPP), a freely available and open-source software suite for MS data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
Solubility prediction, solvate and cocrystal screening as tools for rational crystal engineering.
Loschen, Christoph; Klamt, Andreas
2015-06-01
The fact that novel drug candidates are becoming increasingly insoluble is a major problem of current drug development. Computational tools may address this issue by screening for suitable solvents or by identifying potential novel cocrystal formers that increase bioavailability. In contrast to other more specialized methods, the fluid phase thermodynamics approach COSMO-RS (conductor-like screening model for real solvents) allows for a comprehensive treatment of drug solubility, solvate and cocrystal formation, and many other thermodynamic properties in liquids. This article gives an overview of recent COSMO-RS developments that are of interest for drug development and contains several new application examples for solubility prediction and solvate/cocrystal screening. COSMO-RS was used for all property predictions. The basic concept of COSMO-RS consists of using the screening charge density, as computed from first principles calculations, in combination with fast statistical thermodynamics to compute the chemical potential of a compound in solution. The fast and accurate assessment of drug solubility and the identification of suitable solvents, solvate or cocrystal formers is nowadays possible and may be used to complement modern drug development. Efficiency is increased by avoiding costly quantum-chemical computations through a database of previously computed molecular fragments. COSMO-RS theory can be applied to a range of physico-chemical properties which are of interest in rational crystal engineering. Most notably, in combination with experimental reference data, accurate quantitative solubility predictions in any solvent or solvent mixture are possible. Additionally, COSMO-RS can be extended to the prediction of cocrystal formation, which results in considerable predictive accuracy concerning coformer screening. In a recent variant, costly quantum chemical calculations are avoided, resulting in a significant speed-up and ease of use. © 2015 Royal Pharmaceutical Society.
2013-01-01
Background: The present study aimed to develop an artificial neural network (ANN) based prediction model for cardiovascular autonomic (CA) dysfunction in the general population. Methods: We analyzed a previous dataset based on a population sample of 2,092 individuals aged 30–80 years. The prediction models were derived from an exploratory set using ANN analysis, and their performance was evaluated in the validation set. Results: Univariate analysis indicated that 14 risk factors showed a statistically significant association with CA dysfunction (P < 0.05). The mean area under the receiver-operating characteristic curve was 0.762 (95% CI 0.732–0.793) for the prediction model developed using ANN analysis. The mean sensitivity, specificity, and positive and negative predictive values of the prediction models were 0.751, 0.665, 0.330, and 0.924, respectively. All Hosmer-Lemeshow (HL) statistics were less than 15.0. Conclusion: ANN is an effective tool for developing prediction models with high value for predicting CA dysfunction in the general population. PMID:23902963
Emura, Takeshi; Nakatochi, Masahiro; Matsui, Shigeyuki; Michimae, Hirofumi; Rondeau, Virginie
2017-01-01
Developing a personalized risk prediction model of death is fundamental for improving patient care and touches on the realm of personalized medicine. The increasing availability of genomic information and large-scale meta-analytic data sets for clinicians has motivated extensions of traditional survival prediction based on the Cox proportional hazards model. The aim of our paper is to develop a personalized risk prediction formula for death according to genetic factors and dynamic tumour progression status based on meta-analytic data. To this end, we extend the existing joint frailty-copula model to a model allowing for high-dimensional genetic factors. In addition, we propose a dynamic prediction formula to predict death given tumour progression events possibly occurring after treatment or surgery. For clinical use, we implement the prediction formula in the joint.Cox R package. We also develop a tool to validate the performance of the prediction formula by assessing the prediction error. We illustrate the method with a meta-analysis of individual patient data on ovarian cancer patients.
A human-hearing-related prediction tool for soundscapes and community noise
NASA Astrophysics Data System (ADS)
Genuit, Klaus
2002-11-01
There are several calculation methods available for predicting the A-weighted sound-pressure level of environmental noise; these are, however, not suitable for a qualified prediction of residents' annoyance and physiological strain. The subjectively perceived noise quality depends not only on the A-weighted sound-pressure level but also on other psychoacoustical parameters, such as loudness, roughness, and sharpness. In addition to these physical and psychoacoustical aspects of noise, the so-called psychological or cognitive aspects have to be considered as well: the listeners' expectations, their mental attitude, and the information content of the noise all influence the noise quality perceived by individual persons. Within the scope of the EC-funded research project SVEN (Sound Quality of Vehicle Exterior Noise), a new tool has been developed that allows a binaural simulation and prediction of environmental noise to evaluate the influence of different sound-event contributions with respect to psychoacoustical parameters, spatial distribution, movement, and frequency. By means of this tool it is now possible to consider completely new aspects of the audible perception of noise when establishing a soundscape or when planning community noise.
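For reference, the A-weighting contrasted above with the psychoacoustical parameters is a fixed frequency weighting; a minimal sketch of computing it (the standard analytic approximation used in IEC 61672; the example frequencies are arbitrary):

```python
import math

def a_weighting_db(f):
    """A-weighting gain in dB at frequency f (Hz), per the standard
    analytic approximation used in IEC 61672."""
    f2 = f * f
    ra = (12194.0**2 * f2 * f2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.0  # normalized to 0 dB at 1 kHz

for f in (125, 1000, 4000):
    print(f"{f:5d} Hz: {a_weighting_db(f):6.1f} dB")
```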
Development of a CME-associated geomagnetic storm intensity prediction tool
NASA Astrophysics Data System (ADS)
Wu, C. C.; DeHart, J. M.
2015-12-01
From 1995 to 2012, the Wind spacecraft recorded 168 magnetic cloud (MC) events. Among those events, 79 were found to have upstream shock waves, and their source locations on the Sun were identified. Using a recipe based on the initial turning direction of the interplanetary magnetic field (IMF) Bz after the shock (Wu et al., 1996, GRL), the north-south polarity of 66 (83.5%) of the 79 events was accurately predicted. These events were tested and further analyzed, reaffirming that the Bz initial turning direction was an accurate predictor. The results also indicate that the 37 of the 79 MCs originating from the north of the Sun averaged a Dst_min of -119 nT, whereas the 42 MCs originating from the south averaged -89 nT. To make this research available to others, a website was built that incorporates various tools and pictures to predict the intensity of geomagnetic storms. The tool is capable of predicting geomagnetic storms across the full range of Dst_min (from no storm to gigantic storms). This work was supported by the Naval Research Lab HBCU/MI Internship program and the Chief of Naval Research.
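A minimal sketch of the kind of intensity binning such a tool might apply (the Dst_min thresholds follow commonly used literature conventions, e.g., Gonzalez et al., and are an assumption here, not the website's actual categories):

```python
def storm_category(dst_min_nT: float) -> str:
    """Map a predicted minimum Dst (nT) to a storm-intensity label.

    Thresholds are a common literature convention (assumed here),
    not the specific bins used by the tool described above.
    """
    if dst_min_nT > -30:
        return "no storm / quiet"
    if dst_min_nT > -50:
        return "weak"
    if dst_min_nT > -100:
        return "moderate"
    if dst_min_nT > -250:
        return "intense"
    return "gigantic / super-storm"

for dst in (-20, -75, -119, -300):
    print(f"Dst_min = {dst:4d} nT -> {storm_category(dst)}")
```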
NASA Astrophysics Data System (ADS)
Bose, Subrata K.; Browne, Antony; Kazemian, Hassan; White, Kenneth
Membrane proteins (MPs) are a large set of biological macromolecules that play a fundamental role in physiology and pathophysiology. From a pharmaco-economical perspective, MPs constitute roughly 75% of possible targets for novel drugs, yet they are one of the most understudied groups of proteins in biochemical research. This is mainly because of the technical difficulties of obtaining structural information about transmembrane regions (short sequences that cross the lipid bilayer membrane). It is therefore useful to predict the location of transmembrane segments along the sequence, since these are the elementary structural building blocks defining a protein's topology. There have been several attempts over the last 20 years to develop tools for predicting membrane-spanning regions, but current tools are still far from achieving reliable predictions. This study aims to exploit knowledge and current understanding in the field of artificial neural networks (ANNs), in particular data representation, through the development of a system to identify and predict membrane-spanning regions by analysing the primary amino acid sequence. In this paper we present a novel neural network architecture and algorithms for predicting membrane-spanning regions from primary amino acid sequences using their preference parameters.
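A minimal sketch of the data-representation step such a predictor needs (a sliding window over the sequence encoded with the published Kyte-Doolittle hydropathy values; the window length and the toy sequence are arbitrary choices, and the ANN itself is omitted):

```python
import numpy as np

# Kyte-Doolittle hydropathy scale (standard published values).
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
      'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
      'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
      'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

def windows(sequence: str, width: int = 19) -> np.ndarray:
    """Encode each residue-centred window as a vector of hydropathy
    values, giving one training example per window position."""
    vals = [KD[aa] for aa in sequence]
    return np.array([vals[i:i + width]
                     for i in range(len(vals) - width + 1)])

# Toy sequence: a hydrophobic stretch flanked by polar residues.
seq = "DRKQNE" + "LIVFALLIVFALLIVFALL" + "ENQKRD"
X = windows(seq)
print(X.shape)                  # (n_windows, 19) feature matrix for an ANN
print(X.mean(axis=1).round(2))  # window-average hydropathy, peaking in the TM core
```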
Individualized Behavioral Health Monitoring Tool
NASA Technical Reports Server (NTRS)
Mollicone, Daniel
2015-01-01
Behavioral health risks during long-duration space exploration missions are among the most difficult to predict, detect, and mitigate. Given the anticipated extended duration of future missions and their isolated, extreme, and confined environments, there is the possibility that behavioral conditions and mental disorders will develop among the astronaut crew. Pulsar Informatics, Inc., has developed a health monitoring tool that provides a means to detect and address behavioral disorders and mental conditions at an early stage. The tool integrates all available behavioral measures collected during a mission to identify possible health-indicator warning signs within the context of quantitatively tracked mission stressors. It is unobtrusive and requires minimal crew time and effort to train on and utilize. The monitoring tool can be deployed in space analog environments for validation testing and, ultimately, in long-duration space exploration missions.
Jalalian, Mehrdad; Latiff, Latiffah; Hassan, Syed Tajuddin Syed; Hanachi, Parichehr; Othman, Mohamed
2010-05-01
University students are a target group for blood donor programs. To develop a blood donation culture among university students, it is important to identify factors that predict their intent to donate blood. This study attempted to develop a valid and reliable measurement tool for assessing variables in a blood donation behavior model based on the Theory of Planned Behavior (TPB), a commonly used theoretical foundation for social psychology studies. We employed an elicitation study, in which we determined the commonly held behavioral and normative beliefs about blood donation. We used the results of the elicitation study, together with a standard format for creating questionnaire items for all constructs of the TPB model, to prepare the first draft of the measurement tool. After piloting the questionnaire, we prepared the final draft to be used in our main study. Examination of internal consistency using Cronbach's alpha coefficient and item-total statistics indicated that the constructs "Intention" and "Self efficacy" had the highest reliability. Removing one item from each of the constructs "Attitude," "Subjective norm," "Self efficacy," or "Behavioral beliefs" can considerably increase the reliability of the measurement tool; however, such action is controversial, especially for the variables "attitude" and "subjective norm." We therefore retained all the items of the first-draft questionnaire in our main study to make it a reliable measurement tool.
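A minimal sketch of the internal-consistency check described above (Cronbach's alpha from an item-response matrix; the data here are random placeholders, not the study's survey responses):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score).
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(120, 1))                      # shared construct
scores = latent + rng.normal(scale=0.8, size=(120, 5))  # 5 Likert-like items
print(f"alpha = {cronbach_alpha(scores):.2f}")
```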
Transient Three-Dimensional Analysis of Nozzle Side Load in Regeneratively Cooled Engines
NASA Technical Reports Server (NTRS)
Wang, Ten-See
2005-01-01
Nozzle side loads are potentially detrimental to the integrity and life of almost all launch vehicles. The lack of a detailed prediction capability results in reduced life and increased weight for reusable nozzle systems. A clear understanding of the mechanisms that contribute to side loads during engine startup, shutdown, and steady-state operation must be established, and a CFD-based predictive tool must be developed to aid the understanding of side-load physics and the development of future reusable engines.
Organization Development Strategies in Educational Policy Planning and Management.
ERIC Educational Resources Information Center
Jones, B. Kathryn; Biles, Stephen
1990-01-01
This synthesis reviews organizational development (OD) and its decision tools, describes OD applications in educational organizations, explores OD's limitations, and predicts how OD will influence future educational decision making. Findings identify eight specific management and planning areas where OD can be used to improve organizational…
NASA Astrophysics Data System (ADS)
Salas-García, Irene; Fanjul-Vélez, Félix; Arce-Diego, José Luis
2012-03-01
The development of Photodynamic Therapy (PDT) predictive models has become a valuable tool for optimal treatment planning, monitoring and dosimetry adjustment. A few attempts have achieved a fairly complete characterization of the complex photochemical and photophysical processes involved, even taking into account superficial fluorescence in the target tissue. The present work is devoted to the application of a predictive PDT model to obtain fluorescence tomography information during PDT applied to a skin disease. The model takes into account the optical radiation distribution, a non-homogeneous topical photosensitizer distribution, the time-dependent photochemical interaction and the photosensitizer fluorescence emission. The results show the spatial evolution of the photosensitizer fluorescence emission and the amount of singlet oxygen produced during PDT. The depth-dependent photosensitizer fluorescence emission obtained is essential for estimating the spatial photosensitizer concentration and its degradation due to photobleaching. As a consequence, the proposed approach could be used to predict photosensitizer fluorescence tomographic measurements during PDT. The singlet oxygen prediction could also be employed as a valuable tool to predict the short-term treatment outcome.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Boyun; Duguid, Andrew; Nygaard, Runar
The objective of this project is to develop a computerized statistical model with the Integrated Neural-Genetic Algorithm (INGA) for predicting the probability of long-term leakage of wells in CO2 sequestration operations. This objective was accomplished by conducting research in three phases: 1) data mining of CO2-exposed wells, 2) INGA computer model development, and 3) evaluation of the predictive performance of the computer model with data from field tests. Data mining was conducted for 510 wells in two CO2 sequestration projects in the Texas Gulf Coast region: the Hastings West field and the Oyster Bayou field in southern Texas. Missing wellbore integrity data were estimated using an analytical and Finite Element Method (FEM) model. The INGA was first tested for convergence and computing efficiency with the obtained high-dimensional data set. It was concluded that the INGA can handle the gathered data set with good accuracy and reasonable computing time after a reduction of dimension with a grouping mechanism. A computerized statistical model with the INGA was then developed based on data pre-processing and grouping. Comprehensive training and testing of the model were carried out to ensure that the model is accurate and efficient enough for predicting the probability of long-term well leakage in CO2 sequestration operations. The Cranfield site in southern Mississippi was selected as the test site. Observation wells CFU31F2 and CFU31F3 were used for pressure testing, formation logging, and cement sampling. Tools run in the wells include the Isolation Scanner, Slim Cement Mapping Tool (SCMT), Cased Hole Formation Dynamics Tester (CHDT), and Mechanical Sidewall Coring Tool (MSCT). Analyses of the obtained data indicate no leakage of CO2 across the cap zone, while it is evident that the well cement sheath was invaded by CO2 from the storage zone. This observation is consistent with the result predicted by the INGA model, which indicates the well has a CO2 leak-safe probability of 72%. This comparison implies that the developed INGA model is valid for future use in predicting well leak probability.
NASA Technical Reports Server (NTRS)
Smith, Mark S.; Bui, Trong T.; Garcia, Christian A.; Cumming, Stephen B.
2016-01-01
A pair of compliant trailing edge flaps was flown on a modified GIII airplane. Prior to flight test, multiple analysis tools of various levels of complexity were used to predict the aerodynamic effects of the flaps. Vortex lattice, full potential flow, and full Navier-Stokes aerodynamic analysis software programs were used for prediction, in addition to another program that used empirical data. After the flight-test series, lift and pitching moment coefficient increments due to the flaps were estimated from flight data and compared to the results of the predictive tools. The predicted lift increments matched flight data well for all predictive tools for small flap deflections. All tools over-predicted lift increments for large flap deflections. The potential flow and Navier-Stokes programs predicted pitching moment coefficient increments better than the other tools.
Using Predictive Analytics to Detect Major Problems in Department of Defense Acquisition Programs
2012-03-01
research is focused on three questions. First, can we predict the contractor provided estimate at complete (EAC)? Second, can we use those predictions to...develop an algorithm to determine if a problem will occur in an acquisition program or sub-program? Lastly, can we provide the probability of a problem...more than doubling the probability of a problem occurrence compared to current tools in the cost community. Though program managers can use this
Prediction of Agglomeration, Fouling, and Corrosion Tendency of Fuels in CFB Co-Combustion
NASA Astrophysics Data System (ADS)
Barišić, Vesna; Zabetta, Edgardo Coda; Sarkki, Juha
Prediction of the agglomeration, fouling, and corrosion tendency of fuels is essential to the design of any CFB boiler. Over the years, tools have been successfully developed at Foster Wheeler to help with such predictions for most commercial fuels. However, changes in the fuel market and the ever-growing demand for co-combustion capabilities pose a continuous need for development. This paper presents results from recently upgraded models used at Foster Wheeler to predict the agglomeration, fouling, and corrosion tendency of a variety of fuels and mixtures. The models, the subject of this paper, are semi-empirical computer tools that combine the theoretical basics of agglomeration/fouling/corrosion phenomena with empirical correlations. The correlations are derived from Foster Wheeler's experience in fluidized beds, including nearly 10,000 fuel samples and over 1,000 tests in about 150 CFB units. In these models, fuels are evaluated based on their classification and on their chemical and physical properties from standard analyses (proximate, ultimate, fuel ash composition, etc.) alongside Foster Wheeler's own characterization methods. Mixtures are then evaluated taking into account the component fuels. This paper presents the predictive capabilities of the agglomeration/fouling/corrosion probability models for selected fuels and mixtures fired at full scale. The selected fuels include coals and different types of biomass. The models are capable of predicting the behavior of most fuels and mixtures, but also offer possibilities for further improvement.
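A minimal sketch of the kind of empirical ash-chemistry screening that underlies such tools (the base-to-acid ratio, a widely published slagging/fouling indicator computed from the standard ash-composition analysis; the oxide percentages below are placeholders, and Foster Wheeler's actual proprietary correlations are not reproduced here):

```python
def base_to_acid_ratio(ash: dict) -> float:
    """Base-to-acid ratio B/A from ash composition in weight percent:
    basic oxides (Fe2O3, CaO, MgO, Na2O, K2O) over acidic oxides
    (SiO2, Al2O3, TiO2). Higher values generally indicate greater
    slagging/fouling propensity in empirical screening schemes.
    """
    basic = sum(ash.get(k, 0.0) for k in ("Fe2O3", "CaO", "MgO", "Na2O", "K2O"))
    acidic = sum(ash.get(k, 0.0) for k in ("SiO2", "Al2O3", "TiO2"))
    return basic / acidic

# Placeholder analyses for a coal ash and a straw (biomass) ash:
coal = {"SiO2": 50, "Al2O3": 25, "TiO2": 1, "Fe2O3": 8,
        "CaO": 6, "MgO": 2, "Na2O": 1, "K2O": 2}
straw = {"SiO2": 45, "Al2O3": 2, "TiO2": 0.1, "Fe2O3": 1,
         "CaO": 10, "MgO": 2, "Na2O": 1, "K2O": 25}
for name, ash in (("coal", coal), ("straw", straw)):
    print(f"{name}: B/A = {base_to_acid_ratio(ash):.2f}")
```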
NASA Technical Reports Server (NTRS)
DeHart, Russell
2017-01-01
This study determines the feasibility of creating a tool that can accurately predict Lunar Reconnaissance Orbiter (LRO) reaction wheel assembly (RWA) angular momentum, weeks or even months into the future. LRO is a three-axis stabilized spacecraft that was launched on June 18, 2009. While typically nadir-pointing, LRO conducts many types of slews to enable novel science collection. Momentum unloads have historically been performed approximately once every two weeks with the goal of maintaining system total angular momentum below 70 Nms; however, flight experience shows the models developed before launch are overly conservative, with many momentum unloads being performed before system angular momentum surpasses 50 Nms. A more accurate model of RWA angular momentum growth would improve momentum unload scheduling and decrease the frequency of these unloads. Since some LRO instruments must be deactivated during momentum unloads (and, in the case of one instrument, decontaminated for 24 hours thereafter), a decrease in the frequency of unloads increases science collection. This study develops a new model to predict LRO RWA angular momentum. Regression analysis of data from October 2014 to October 2015 was used to develop relationships between solar beta angle, slew specifications, and RWA angular momentum growth. The resulting model predicts RWA angular momentum from input solar beta angle and mission schedule data. This model was used to predict RWA angular momentum from October 2013 to October 2014. Predictions agree well with telemetry; of the 23 momentum unloads performed from October 2013 to October 2014, the mean and median magnitudes of the RWA total angular momentum prediction error at the times of the momentum unloads were 3.7 and 2.7 Nms, respectively. The magnitude of the largest RWA total angular momentum prediction error was 10.6 Nms. Development of a tool that uses the models presented herein is currently underway.
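A minimal sketch of the regression step described above (a least-squares fit of angular-momentum growth against solar beta angle and a slew-count feature; the feature set and data are placeholders, not the actual LRO telemetry or model terms):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
beta = rng.uniform(-90, 90, n)  # solar beta angle [deg] (placeholder)
slews = rng.integers(0, 6, n)   # slews per day (placeholder)

# Synthetic "truth" for illustration: growth rises with |beta| and slews.
growth = 0.04 * np.abs(beta) + 0.5 * slews + rng.normal(0, 0.3, n)

# Design matrix: intercept, |beta|, slew count.
A = np.column_stack([np.ones(n), np.abs(beta), slews])
coef, *_ = np.linalg.lstsq(A, growth, rcond=None)
print("fitted coefficients:", coef.round(3))

# Predict cumulative momentum growth over a 14-day horizon from a schedule.
daily = A @ coef
print(f"predicted 14-day accumulation: {daily[:14].sum():.1f} Nms (placeholder units)")
```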
Sang-Kyun Han; Han-Sup Han; William J. Elliot; Edward M. Bilek
2017-01-01
We developed a spreadsheet-based model, named ThinTool, to evaluate the cost of mechanical fuel reduction thinning including biomass removal, to predict net energy output, and to assess nutrient impacts from thinning treatments in northern California and southern Oregon. A combination of literature reviews, field-based studies, and contractor surveys was used to...
Rajadhyaksha, Manoj; Subramanyam, Meena; Rup, Bonnie
2013-10-01
The immunogenicity profile of a biotherapeutic is determined by multiple product-, process- or manufacturing-, patient- and treatment-related factors, as well as by the bioanalytical methodology used to monitor immunogenicity. This creates a complex situation that limits direct correlation of individual factors with observed immunogenicity rates. Therefore, a mechanistic understanding of how these factors, individually or in concert, could influence the overall incidence and clinical risk of immunogenicity is crucial to providing the best benefit/risk profile for a given biotherapeutic in a given indication and to informing risk mitigation strategies. Advances in the field of immunogenicity have included the development of best practices for monitoring anti-drug antibody development, categorization of risk factors contributing to immunogenicity, development of predictive tools, and development of effective strategies for risk management and mitigation. The opportunity to ask "where are we now and where would we like to go from here?" was thus the main driver for organizing an Open Forum on Improving Immunogenicity Risk Prediction and Management, conducted at the 2012 American Association of Pharmaceutical Scientists' (AAPS) National Biotechnology Conference in San Diego. The main objectives of the Forum were to understand the nature of immunogenicity risk factors; to identify the analytical tools used, and the animal models and management strategies needed, to improve their predictive value; and to identify collaboration opportunities to improve the reliability of risk prediction, mitigation, and management. This meeting report provides the Forum participants' and authors' perspectives on the barriers to advancing this field and recommendations for overcoming these barriers through collaborative efforts.
X-DRAIN and XDS: a simplified road erosion prediction method
William J. Elliot; David E. Hall; S. R. Graves
1998-01-01
To develop a simple road sediment delivery tool, the WEPP program modeled sedimentation from forest roads for more than 50,000 combinations of distance between cross drains, road gradient, soil texture, distance from stream, steepness of the buffer between the road and the stream, and climate. The sediment yield prediction from each of these runs was stored in a data...
Chen, A.; Yarmush, M.L.; Maguire, T.
2014-01-01
There is a large emphasis within the pharmaceutical industry on providing tools that allow early research and development groups to better predict dose ranges for, and metabolic responses of, candidate molecules in a high-throughput manner prior to entering clinical trials. These tools incorporate approaches ranging from PBPK, QSAR, and molecular dynamics simulations in the in silico realm to micro cell culture analogues (CCAs) in the in vitro realm. This paper reviews these areas of high-throughput predictive research and highlights hurdles and potential solutions. In particular, we focus on CCAs, as their incorporation with PBPK modeling has the potential to replace animal testing with a more predictive assay that can combine multiple organ analogues on one microfluidic platform in physiologically correct volume ratios. While several advantages arise from the current embodiments of CCAs in a microfluidic format that can be exploited for realistic simulations of drug absorption, metabolism, and action, we explore some of the concerns with these systems and provide a potential path forward to realizing animal-free solutions. Furthermore, we envision that, together with theoretical modeling, CCAs may produce reliable predictions of the efficacy of newly developed drugs. PMID:22571482
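A minimal sketch of the PBPK side of such a workflow (a two-compartment, flow-limited model integrated with SciPy; the volumes, flows, and partitioning are invented placeholder values, not a validated parameterization):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder physiological parameters (illustrative only).
V_blood, V_liver = 5.0, 1.8  # compartment volumes [L]
Q_liver = 1.5                # liver blood flow [L/min]
P_liver = 2.0                # liver:blood partition coefficient
CL_int = 0.4                 # intrinsic hepatic clearance [L/min]

def pbpk(t, y):
    c_blood, c_liver = y
    # Flow-limited exchange between blood and liver; elimination in liver.
    flux = Q_liver * (c_blood - c_liver / P_liver)   # [mg/min]
    dc_blood = -flux / V_blood
    dc_liver = (flux - CL_int * c_liver / P_liver) / V_liver
    return [dc_blood, dc_liver]

# IV bolus: 10 mg into the blood compartment.
y0 = [10.0 / V_blood, 0.0]
sol = solve_ivp(pbpk, (0, 240), y0, dense_output=True)
t = np.linspace(0, 240, 5)
print(np.round(sol.sol(t)[0], 3))  # blood concentration vs time [mg/L]
```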
Artificial neural networks in gynaecological diseases: current and potential future applications.
Siristatidis, Charalampos S; Chrelias, Charalampos; Pouliakis, Abraham; Katsimanis, Evangelos; Kassanos, Dimitrios
2010-10-01
Current, and probably future, practice of medicine is mostly associated with prediction and accurate diagnosis. Especially in clinical practice, there is an increasing interest in constructing and using valid models of diagnosis and prediction. Artificial neural networks (ANNs) are mathematical systems used as a prospective tool for reliable, flexible and quick assessment. They demonstrate high power in evaluating multifactorial data, assimilating information from multiple sources and detecting subtle and complex patterns. Their capability, and their difference from other statistical techniques, lies in performing nonlinear statistical modelling. They represent a new alternative to logistic regression, the most commonly used method for developing predictive models for outcomes in medicine. In combination with other non-algorithmic artificial intelligence techniques, they provide useful software engineering tools for the development of systems in quantitative medicine. Our paper first presents a brief introduction to ANNs; then, using what we consider the best available evidence through paradigms, we evaluate the ability of these networks to serve as first-line detection and prediction techniques in some of the most crucial fields in gynaecology. Finally, through the analysis of their current applications, we explore their potential for future use.
Will it rise or will it fall? Managing the complex effects of urbanization on base flow
Bhaskar, Aditi; Beesley, Leah; Burns, Matthew J.; Fletcher, T. D.; Hamel, Perrine; Oldham, Carolyn; Roy, Allison
2016-01-01
Sustaining natural levels of base flow is critical to maintaining ecological function as stream catchments are urbanized. Research shows a variable response of stream base flow to urbanization, with base flow or water tables rising in some locations, falling in others, or elsewhere remaining constant. The variable base flow response is due to the array of natural (e.g., physiographic setting and climate) and anthropogenic (e.g., urban development and infrastructure) factors that influence hydrology. Perhaps as a consequence of this complexity, few simple tools exist to assist managers to predict base flow change in their local urban area. This paper addresses this management need by presenting a decision support tool. The tool considers the natural vulnerability of the landscape, together with aspects of urban development, in predicting the likelihood and direction of base flow change. Where the tool identifies a likely increase or decrease it guides managers toward strategies that can reduce or increase groundwater recharge, respectively. Where the tool finds an equivocal result, it suggests a detailed water balance be performed. The decision support tool is embedded within an adaptive-management framework that encourages managers to define their ecological objectives, assess the vulnerability of their ecological objectives to changes in water table height, and monitor base flow responses to urbanization. We trial our framework using two very different case studies: Perth, Western Australia, and Baltimore, Maryland, USA. Together, these studies show how pre-development water table height, climate and geology together with aspects of urban infrastructure (e.g., stormwater practices, leaky pipes) interact such that urbanization has overall led to rising base flow (Perth) and falling base flow (Baltimore). Greater consideration of subsurface components of the water cycle will help to protect and restore the ecology of urban freshwaters.
NASA Technical Reports Server (NTRS)
Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720 AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploring noise-reduction concepts and understanding experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering each method or code's utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interacting with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources, including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
Local Debonding and Fiber Breakage in Composite Materials Modeled Accurately
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2001-01-01
A prerequisite for full utilization of composite materials in aerospace components is the availability of accurate design and life prediction tools that enable the assessment of component performance and reliability. Such tools assist both structural analysts, who design and optimize structures composed of composite materials, and materials scientists, who design and optimize the composite materials themselves. NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package (http://www.grc.nasa.gov/WWW/LPB/mac) addresses this need by providing a widely applicable and accurate approach to modeling composite materials. Furthermore, MAC/GMC serves as a platform for incorporating new local models and capabilities under development at NASA, enabling these new capabilities to progress rapidly to a stage at which they can be employed by the code's end users.
Development of Next Generation Synthetic Biology Tools for Use in Streptomyces venezuelae
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phelan, Ryan M.; Sachs, Daniel; Petkiewicz, Shayne J.
Streptomyces have a rich history as producers of important natural products, and this genus of bacteria has recently garnered attention for its potential applications in the broader context of synthetic biology. However, the dearth of genetic tools available to control and monitor protein production precludes the rapid and predictable metabolic engineering that is possible in hosts such as Escherichia coli or Saccharomyces cerevisiae. In an effort to improve genetic tools for Streptomyces venezuelae, we developed a suite of standardized, orthogonal integration vectors and an improved method to monitor protein production in this host. These tools were applied to characterize heterologous promoters and various attB chromosomal integration sites. A final study leveraged the characterized toolset to demonstrate its use in producing the biofuel precursor bisabolene using a chromosomally integrated expression system. In conclusion, these tools advance S. venezuelae as a practical host for future metabolic engineering efforts.
Nealon, John Oliver; Philomina, Limcy Seby
2017-01-01
The elucidation of protein–protein interactions is vital for determining the function and action of quaternary protein structures. Here, we discuss the difficulty and importance of establishing protein quaternary structure and review in vitro and in silico methods for doing so. Determining the interacting partner proteins of predicted protein structures is very time-consuming when using in vitro methods; this can be somewhat alleviated by the use of predictive methods. However, developing reliably accurate predictive tools has proved difficult. We review the current state of the art in predictive protein interaction software and discuss the problem of scoring, and thereby ranking, predictions. Current community-based predictive exercises are discussed in relation to the growth of protein interaction prediction as an area within these exercises. We suggest that a fusion of experimental and predictive methods, using sparse experimental data to determine higher-resolution predicted protein interactions, is necessary to drive forward development. PMID:29206185
Goenka, Anu; Jeena, Prakash M; Mlisana, Koleka; Solomon, Tom; Spicer, Kevin; Stephenson, Rebecca; Verma, Arpana; Dhada, Barnesh; Griffiths, Michael J
2018-03-01
Early diagnosis of tuberculous meningitis (TBM) is crucial to achieving optimal outcomes, yet there is no effective rapid diagnostic test for use in children. We aimed to develop a clinical decision tool to facilitate the early diagnosis of childhood TBM. A retrospective case-control study was performed across 7 hospitals in KwaZulu-Natal, South Africa (2010-2014). We identified the variables most predictive of microbiologically confirmed TBM in children (3 months to 15 years) by univariate analysis. These variables were modelled into a clinical decision tool and its performance tested on an independent sample group. Of 865 children with suspected TBM, 3% (25) were identified as having microbiologically confirmed TBM. Clinical information was retrieved for 22 microbiologically confirmed cases of TBM and compared with 66 controls matched for age, ethnicity, sex and geographical origin. The 9 most predictive variables among the confirmed cases were used to develop a clinical decision tool (CHILD TB LP): altered Consciousness; caregiver HIV infected; Illness length >7 days; Lethargy; focal neurologic Deficit; failure to Thrive; Blood/serum sodium <132 mmol/L; CSF >10 Lymphocytes ×10⁶/L; CSF Protein >0.65 g/L. This tool successfully classified an independent sample of 7 cases and 21 controls with a sensitivity of 100% and specificity of 90%. The CHILD TB LP decision tool accurately classified microbiologically confirmed TBM. We propose that CHILD TB LP be prospectively evaluated as a novel rapid diagnostic tool for use in the initial evaluation of children with suspected neurologic infection presenting to hospitals in similar settings.
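A minimal sketch of how such a decision tool could be applied programmatically (the nine CHILD TB LP variables are taken from the abstract, but the count-based rule and its cut-off are assumptions for illustration; the published tool's actual weighting scheme is not described here):

```python
# The nine CHILD TB LP variables as boolean findings.
FEATURES = [
    "altered_consciousness", "caregiver_hiv_infected", "illness_over_7_days",
    "lethargy", "focal_neurologic_deficit", "failure_to_thrive",
    "sodium_below_132_mmol_L", "csf_lymphocytes_above_10",
    "csf_protein_above_0_65_g_L",
]

def child_tb_lp_flag(findings: dict, cutoff: int = 4) -> bool:
    """Flag a child as high risk for TBM when at least `cutoff` of the
    nine features are present. The count-based rule and the cutoff are
    assumptions for illustration, not the published scoring scheme."""
    return sum(bool(findings.get(f)) for f in FEATURES) >= cutoff

patient = {"altered_consciousness": True, "lethargy": True,
           "illness_over_7_days": True, "csf_protein_above_0_65_g_L": True}
print(child_tb_lp_flag(patient))  # True under the assumed rule
```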
Aliabadi, Mohsen; Golmohammadi, Rostam; Khotanlou, Hassan; Mansoorizadeh, Muharram; Salarpour, Amir
2014-01-01
Noise prediction is considered the best method for evaluating cost-preventative noise controls in industrial workrooms. One of the most important issues is the development of accurate models for analyzing the complex relationships among the acoustic features affecting noise levels in workrooms. In this study, advanced fuzzy approaches were employed to develop relatively accurate models for predicting noise in noisy industrial workrooms. The data were collected from 60 industrial embroidery workrooms in the Khorasan Province, in eastern Iran. The main acoustic and embroidery-process features that influence noise were used to develop prediction models in MATLAB. The multiple regression technique was also employed, and its results were compared with those of the fuzzy approaches. The prediction errors of all models based on fuzzy approaches were within the acceptable level (lower than one dB). The neuro-fuzzy model (RMSE = 0.53 dB and R² = 0.88) nevertheless slightly improved the accuracy of noise prediction compared with the generated fuzzy model. Moreover, the fuzzy approaches provided more accurate predictions than the regression technique. The developed models based on fuzzy approaches are useful prediction tools that give professionals the opportunity to make sound decisions about the effectiveness of acoustic treatment scenarios in embroidery workrooms.
Engineering bacterial translation initiation - Do we have all the tools we need?
Vigar, Justin R J; Wieden, Hans-Joachim
2017-11-01
Reliable tools that allow precise and predictable control over gene expression are critical for the success of nearly all bioengineering applications. Translation initiation is the most regulated phase during protein biosynthesis, and is therefore a promising target for exerting control over gene expression. At the translational level, the copy number of a protein can be fine-tuned by altering the interaction between the translation initiation region of an mRNA and the ribosome. These interactions can be controlled by modulating the mRNA structure using numerous approaches, including small molecule ligands, RNAs, or RNA-binding proteins. A variety of naturally occurring regulatory elements have been repurposed, facilitating advances in synthetic gene regulation strategies. The pursuit of a comprehensive understanding of mechanisms governing translation initiation provides the framework for future engineering efforts. Here we outline state-of-the-art strategies used to predictably control translation initiation in bacteria. We also discuss current limitations in the field and future goals. Due to its function as the rate-determining step, initiation is the ideal point to exert effective translation regulation. Several engineering tools are currently available to rationally design the initiation characteristics of synthetic mRNAs. However, improvements are required to increase the predictability, effectiveness, and portability of these tools. Predictable and reliable control over translation initiation will allow greater predictability when designing, constructing, and testing genetic circuits. The ability to build more complex circuits predictably will advance synthetic biology and contribute to our fundamental understanding of the underlying principles of these processes. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments"; Guest Editors: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.
iPat: intelligent prediction and association tool for genomic research.
Chen, Chunpeng James; Zhang, Zhiwu
2018-06-01
The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which is the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.
Online Analysis of Wind and Solar Part II: Transmission Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian
2012-01-31
To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.
NASA Technical Reports Server (NTRS)
Wissler, Steven S.; Maldague, Pierre; Rocca, Jennifer; Seybold, Calina
2006-01-01
The Deep Impact mission was ambitious and challenging. JPL's well proven, easily adaptable multi-mission sequence planning tools combined with integrated spacecraft subsystem models enabled a small operations team to develop, validate, and execute extremely complex sequence-based activities within very short development times. This paper focuses on the core planning tool used in the mission, APGEN. It shows how the multi-mission design and adaptability of APGEN made it possible to model spacecraft subsystems as well as ground assets throughout the lifecycle of the Deep Impact project, starting with models of initial, high-level mission objectives, and culminating in detailed predictions of spacecraft behavior during mission-critical activities.
Modeling and Visualizing Flow of Chemical Agents Across Complex Terrain
NASA Technical Reports Server (NTRS)
Kao, David; Kramer, Marc; Chaderjian, Neal
2005-01-01
The release of chemical agents across complex terrain presents a real threat to homeland security. Modeling and visualization tools are being developed that capture fluid flow-terrain interaction as well as point-release dispersal along downstream flow paths. These analytic tools, when coupled with UAV atmospheric observations, provide predictive capabilities that allow for rapid emergency response as well as the development of a comprehensive preemptive counter-threat evacuation plan. The visualization tools involve high-end computing and massively parallel processing combined with texture mapping. We demonstrate our approach across a mountainous portion of northern California under two contrasting meteorological conditions. Animations depicting flow over this geographical location provide immediate assistance in decision support and crisis management.
Early postnatal weight gain as a predictor for the development of retinopathy of prematurity.
Biniwale, Manoj; Weiner, Angela; Sardesai, Smeeta; Cayabyab, Rowena; Barton, Lorayne; Ramanathan, Rangasamy
2017-10-01
The objective of this study is to validate the reliability of early postnatal weight gain as an accurate predictor of type 1 retinopathy of prematurity (ROP) requiring treatment in a large, predominantly Hispanic US cohort, using an online tool called WINROP (weight, insulin-like growth factor 1 (IGF-1), neonatal retinopathy of prematurity). The retrospective cohort study consisted of preterm infants <32 weeks gestation with birth weight <1500 g. Weekly weights up to 36 weeks post-menstrual age, or discharge if earlier, were entered into the WINROP tool, which generated an alarm and a risk indicator for developing ROP. The infants who developed type 1 ROP requiring treatment, as well as those with all stages of ROP, were compared with the alarms and risks generated by the WINROP tool. A total of 492 infants were entered into the WINROP tool. Of the infants who developed type 1 ROP requiring treatment, the WINROP tool detected 80/89 (90%) at less than 32 weeks gestation. Nine infants who developed type 1 ROP were classified as low risk and did not trigger an alarm. Postnatal weight gain alone, in a predominantly Hispanic US population, predicted type 1 ROP requiring treatment before 32 weeks of gestation with a sensitivity of 90%. The tool appeared to identify the majority of affected infants much earlier than the scheduled screening.
Investigation of wear land and rate of locally made HSS cutting tool
NASA Astrophysics Data System (ADS)
Afolalu, S. A.; Abioye, A. A.; Dirisu, J. O.; Okokpujie, I. P.; Ajayi, O. O.; Adetunji, O. R.
2018-04-01
Production technology and machining are inseparable, with cutting operations playing an important role. An investigation of the wear land and wear rate of a locally developed cutting tool (C = 0.56%) was carried out, with an HSS cutting tool (C = 0.65%) as a control. Wear rate tests were carried out using a Rotopol-V and an impact tester. Twelve samples of the locally made cutting tools and one sample of the control HSS cutting tool were weighed to obtain initial weights, and the grit was fixed at a point while each sample revolved for intervals of 10 minutes. A macro particle-transfer approach involving abrasion and adhesion mechanisms, termed mechanical wear, was used to develop an equation for flank wear growth. The wear tests gave best minimum wear rates of 1.09 × 10⁻⁸ and 2.053 × 10⁻⁸ for the locally developed tools and the control, respectively. MATLAB was used to simulate the wear land and wear rate under different conditions. Validation of the experimental and modelling results showed that cutting speed affects the wear rate, while cutting time is the predictive measure for the wear land. Both experimental and modelling results showed that the locally developed tools outperformed the control.
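A minimal sketch of an abrasion/adhesion-type wear estimate (the classical Archard relation, used here as a generic stand-in; the coefficients are placeholders, and this is not the paper's MATLAB flank-wear model):

```python
def archard_wear_volume(k: float, load_N: float, sliding_m: float,
                        hardness_Pa: float) -> float:
    """Archard's law: worn volume V = k * W * s / H [m^3], a standard
    abrasion/adhesion wear model (a placeholder stand-in for the
    paper's own flank-wear growth equation)."""
    return k * load_N * sliding_m / hardness_Pa

# Placeholder values: dimensionless wear coefficient, load, distance, hardness.
V = archard_wear_volume(k=1e-4, load_N=200.0, sliding_m=500.0, hardness_Pa=8e9)
print(f"worn volume: {V * 1e9:.2f} mm^3")
```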
Analysis of high vacuum systems using SINDA'85
NASA Technical Reports Server (NTRS)
Spivey, R. A.; Clanton, S. E.; Moore, J. D.
1993-01-01
The theory, algorithms, and test data correlation analysis of a math model developed to predict the performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the characteristics of viscous, transition, and molecular flow is presented in detail. The development of user subroutines that predict these flow characteristics in conjunction with the SINDA'85/FLUINT analysis software is discussed. The resistance-capacitance network approach, with application to vacuum system analysis, is demonstrated, and results from the model are correlated with test data. Although the model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System, the unique use of the user subroutines written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
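A minimal sketch of the resistance-capacitance analogy for vacuum lines (standard air-at-20°C tube-conductance approximations from vacuum practice; the dimensions and pressure are placeholders, and the actual SINDA'85/FLUINT user subroutines are not reproduced):

```python
def conductance_molecular(d_cm: float, L_cm: float) -> float:
    """Molecular-flow conductance of a long tube for air at 20 C,
    C = 12.1 * d^3 / L  [L/s], a standard vacuum-practice formula."""
    return 12.1 * d_cm**3 / L_cm

def conductance_viscous(d_cm: float, L_cm: float, p_avg_torr: float) -> float:
    """Viscous (laminar) flow conductance of a long tube for air at 20 C,
    C = 180 * d^4 * p_avg / L  [L/s], also a standard approximation."""
    return 180.0 * d_cm**4 * p_avg_torr / L_cm

def series(*conductances: float) -> float:
    """Series combination, the 'resistance' part of the R-C analogy:
    1/C_total = sum(1/C_i)."""
    return 1.0 / sum(1.0 / c for c in conductances)

# Two tube segments in series, molecular regime (placeholder geometry).
c1 = conductance_molecular(d_cm=2.0, L_cm=50.0)
c2 = conductance_molecular(d_cm=1.0, L_cm=20.0)
print(f"series conductance: {series(c1, c2):.2f} L/s")
```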
NASA Astrophysics Data System (ADS)
Murtha, T., Jr.; Orland, B.; Goldberg, L.; Hammond, R.
2014-12-01
Deep shale natural gas deposits made accessible by new technologies are quickly becoming a considerable share of North America's energy portfolio. Unlike traditional deposits and extraction footprints, shale gas offers dispersed and complex landscape and community challenges. These challenges are both cultural and environmental. This paper describes the development and application of creative geospatial tools as a means to engage communities along the northern tier counties of Pennsylvania, experiencing Marcellus shale drilling in design and planning. Uniquely combining physical landscape models with predictive models of exploration activities, including drilling, pipeline construction and road reconstruction, the tools quantify the potential impacts of drilling activities for communities and landscapes in the commonwealth of Pennsylvania. Dividing the state into 9836 watershed sub-basins, we first describe the current state of Marcellus related activities through 2014. We then describe and report the results of three scaled predictive models designed to investigate probable sub-basins where future activities will be focused. Finally, the core of the paper reports on the second level of tools we have now developed to engage communities in planning for unconventional gas extraction in Pennsylvania. Using a geodesign approach we are working with communities to transfer information for comprehensive landscape planning and informed decision making. These tools not only quantify physical landscape impacts, but also quantify potential visual, aesthetic and cultural resource implications.
Risk stratification, genomic data and the law.
Hall, Alison; Finnegan, Thomas; Chowdhury, Susmita; Dent, Tom; Kroese, Mark; Burton, Hilary
2018-02-22
Risk prediction models have a key role in stratified disease prevention, and the incorporation of genomic data into these models promises more effective personalisation. Although the clinical utility of incorporating genomic data into risk prediction tools is increasingly compelling, at least for some applications and disease types, the legal and regulatory implications have not been examined and have been overshadowed by discussions about clinical and scientific utility and feasibility. We held a workshop to explore relevant legal and regulatory perspectives from four EU Member States: France, Germany, the Netherlands and the UK. While we found no absolute prohibition on the use of such data in those tools, there are considerable challenges. Currently, these are modest and result from genomic data being classified as sensitive data under existing Data Protection regulation. However, these challenges will increase in the future following the implementation of EU Regulations on data protection which take effect in 2018, and reforms to the governance of the manufacture, development and use of in vitro diagnostic devices to be implemented in 2022. Collectively these will increase the regulatory burden placed on these products as risk stratification tools will be brought within the scope of these new Regulations. The failure to respond to the challenges posed by the use of genomic data in disease risk stratification tools could therefore prove costly to those developing and using such tools.
Pavurala, Naresh; Xu, Xiaoming; Krishnaiah, Yellela S R
2017-05-15
Hyperspectral imaging using near-infrared spectroscopy (NIRS) integrates spectroscopy and conventional imaging to obtain both spectral and spatial information about materials. The non-invasive and rapid nature of hyperspectral NIRS imaging makes it a valuable process analytical technology (PAT) tool for in-process monitoring and control of the manufacturing process for transdermal drug delivery systems (TDS). The focus of this investigation was to develop and validate the use of near-infrared (NIR) hyperspectral imaging to monitor coat thickness uniformity, a critical quality attribute (CQA) for TDS. Chemometric analysis was used to process the hyperspectral image, and a partial least squares (PLS) model was developed to predict the coat thickness of the TDS. The goodness of model fit and of prediction were both 0.9933, indicating an excellent fit to the training data and good predictability. The % prediction error (%PE) for internal and external validation samples was less than 5%, confirming the accuracy of the PLS model developed in the present study. The feasibility of hyperspectral imaging as a real-time process analytical tool for continuous processing was also investigated. When the PLS model was applied to detect deliberate variation in coating thickness, it was able to predict both small and large variations as well as identify coating defects such as non-uniform regions and the presence of air bubbles. Published by Elsevier B.V.
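A minimal sketch of the chemometric step (PLS regression mapping per-pixel NIR spectra to coat thickness with scikit-learn; the spectra and thicknesses are synthetic placeholders, not the study's data):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_pixels, n_wavelengths = 500, 120
thickness = rng.uniform(50, 150, n_pixels)  # coat thickness [um] (placeholder)

# Synthetic spectra: a thickness-dependent absorption band plus noise.
wl = np.linspace(0, 1, n_wavelengths)
band = np.exp(-((wl - 0.5) ** 2) / 0.01)
X = thickness[:, None] * band[None, :] + rng.normal(0, 2.0, (n_pixels, n_wavelengths))

X_tr, X_te, y_tr, y_te = train_test_split(X, thickness, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)

y_hat = pls.predict(X_te).ravel()
pe = 100 * np.abs(y_hat - y_te) / y_te  # % prediction error per pixel
print(f"R^2 = {pls.score(X_te, y_te):.4f}, mean %PE = {pe.mean():.2f}%")
```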
Prediction of Thermal Fatigue in Tooling for Die-casting Copper via Finite Element Analysis
NASA Astrophysics Data System (ADS)
Sakhuja, Amit; Brevick, Jerald R.
2004-06-01
Recent research by the Copper Development Association (CDA) has demonstrated the feasibility of die-casting electric motor rotors using copper. Electric motors using copper rotors are significantly more energy efficient relative to motors using aluminum rotors. However, one of the challenges in copper rotor die-casting is low tool life. Experiments have shown that the higher molten metal temperature of copper (1085 °C), as compared to aluminum (660 °C) accelerates the onset of thermal fatigue or heat checking in traditional H-13 tool steel. This happens primarily because the mechanical properties of H-13 tool steel decrease significantly above 650 °C. Potential approaches to mitigate the heat checking problem include: 1) identification of potential tool materials having better high temperature mechanical properties than H-13, and 2) reduction of the magnitude of cyclic thermal excursions experienced by the tooling by increasing the bulk die temperature. A preliminary assessment of alternative tool materials has led to the selection of nickel-based alloys Haynes 230 and Inconel 617 as potential candidates. These alloys were selected based on their elevated temperature physical and mechanical properties. Therefore, the overall objective of this research work was to predict the number of copper rotor die-casting cycles to the onset of heat checking (tool life) as a function of bulk die temperature (up to 650 °C) for Haynes 230 and Inconel 617 alloys. To achieve these goals, a 2D thermo-mechanical FEA was performed to evaluate strain ranges on selected die surfaces. The method of Universal Slopes (Strain Life Method) was then employed for thermal fatigue life predictions.
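A minimal sketch of the Universal Slopes estimate mentioned above (Manson's strain-life relation solved numerically for cycles to failure; the material properties are placeholders, not the H-13 or nickel-alloy data used in the study):

```python
from scipy.optimize import brentq

def universal_slopes_strain(Nf, sigma_u, E, eps_f):
    """Total strain range per Manson's method of universal slopes:
    d_eps = 3.5*(sigma_u/E)*Nf^-0.12 + eps_f^0.6 * Nf^-0.6."""
    return 3.5 * (sigma_u / E) * Nf**-0.12 + eps_f**0.6 * Nf**-0.6

def cycles_to_failure(d_eps, sigma_u, E, eps_f):
    """Invert the strain-life curve for the cycle count at a given
    FEA-predicted strain range."""
    f = lambda Nf: universal_slopes_strain(Nf, sigma_u, E, eps_f) - d_eps
    return brentq(f, 1.0, 1e9)

# Placeholder properties: ultimate strength, modulus, fracture ductility.
Nf = cycles_to_failure(d_eps=0.008, sigma_u=1200e6, E=210e9, eps_f=0.6)
print(f"predicted cycles to onset of heat checking: {Nf:.0f}")
```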
An expert system based software sizing tool, phase 2
NASA Technical Reports Server (NTRS)
Friedlander, David
1990-01-01
A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
Lee, Eun-Ju; Podoltsev, Nikolai; Gore, Steven D; Zeidan, Amer M
2016-01-01
The clinical course of patients with myelodysplastic syndromes (MDS) is characterized by wide variability reflecting the underlying genetic and biological heterogeneity of the disease. Accurate prediction of outcomes for individual patients is an integral part of the evidence-based risk/benefit calculations that are necessary for tailoring the aggressiveness of therapeutic interventions. While several prognostication tools have been developed and validated for risk stratification, each of these systems has limitations. The recent progress in genomic sequencing techniques has led to discoveries of recurrent molecular mutations in MDS patients with independent impact on relevant clinical outcomes. Reliable assays of these mutations have already entered the clinic and efforts are currently ongoing to formally incorporate mutational analysis into the existing clinicopathologic risk stratification tools. Additionally, mutational analysis holds promise for going beyond prognostication to therapeutic selection and individualized treatment-specific prediction of outcomes; abilities that would revolutionize MDS patient care. Despite these exciting developments, the best way of incorporating molecular testing for use in prognostication and prediction of outcomes in clinical practice remains undefined and further research is warranted. Copyright © 2015 Elsevier Ltd. All rights reserved.
miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.
Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M
2009-07-01
Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. With this in mind, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point, as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, with links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
Summary appraisals of the Nation's ground-water resources; Lower Mississippi region
Terry, J.E.; Hosman, R.L.; Bryant, C.T.
1979-01-01
Great advances have been made in hydrologic technology in recent years. Predictive models have been developed that make it possible for the hydrologist to simulate aquifer responses to proposed development or other stresses. These models would be invaluable tools in progressive water-resources planning and management.
Monitoring viability of seeds in gene banks: developing software tools to increase efficiency
USDA-ARS's Scientific Manuscript database
Monitoring the decline of seed viability is essential for effective long-term seed storage in ex situ collections. Recent FAO Genebank Standards recommend monitoring intervals of one-third the time predicted for viability to fall to 85% of initial viability. This poster outlines the development of ...
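As a sketch of the arithmetic behind that recommendation: under the widely used Ellis-Roberts probit viability model (which this poster may or may not employ; the seed-lot parameter sigma is an input assumption), the retest interval is one-third of the predicted time for viability to reach 85% of its initial value:

```python
from statistics import NormalDist

def time_to_fraction(v0, fraction, sigma):
    """Time for viability to fall from v0 to fraction*v0 under the probit
    model probit(v_t) = probit(v_0) - t/sigma, where sigma is the seed-lot
    standard deviation of deaths in storage (result is in sigma's time unit)."""
    nd = NormalDist()
    return (nd.inv_cdf(v0) - nd.inv_cdf(fraction * v0)) * sigma

def monitoring_interval(v0, sigma):
    """FAO rule: retest at one-third of the predicted time for viability
    to fall to 85% of its initial value."""
    return time_to_fraction(v0, 0.85, sigma) / 3.0

# Example: a lot stored at 95% initial viability with sigma = 20 years
# gives a first retest after roughly 5 years.
print(monitoring_interval(0.95, 20.0))
```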
Review of methods for developing probabilistic risk assessments
D. A. Weinstein; P.B. Woodbury
2010-01-01
We describe methodologies currently in use or under development that contain features for estimating fire occurrence risk. We describe two major categories of fire risk assessment tools: those that predict fire under current conditions, assuming that vegetation, climate, and the interactions between them and fire remain relatively similar to their...
The Simple Theory of Public Library Services.
ERIC Educational Resources Information Center
Newhouse, Joseph P.
A simple normative theory applicable to public library services was developed as a tool to aid libraries in answering the question: which books should be bought by the library? Although developed for normative purposes, the theory generates testable predictions. It is relevant to measuring benefits from services which are provided publicly because…
Ceramic Matrix Composites (CMC) Life Prediction Development
NASA Technical Reports Server (NTRS)
Levine, Stanley R.; Verrilli, Michael J.; Thomas, David J.; Halbig, Michael C.; Calomino, Anthony M.; Ellis, John R.; Opila, Elizabeth J.
1990-01-01
Advanced launch systems will very likely incorporate fiber-reinforced ceramic matrix composites (CMC) in critical propulsion and airframe components. The use of CMC will save weight; increase operating margin, safety, and performance; and improve reuse capability. For both reusable and single-mission use, accurate life prediction is critical to success. The tools to accomplish this are immature and not oriented toward the behavior of carbon-fiber-reinforced silicon carbide (C/SiC), the primary system of interest for many applications. This paper describes an approach, and the progress made, toward developing an integrated life prediction system that addresses mechanical durability and environmental degradation.
An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models
NASA Astrophysics Data System (ADS)
Zaitchik, B. F.; Berhane, F.; Tadesse, T.
2015-12-01
We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (a global map). The user can select Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as the predictand, or upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, and winds, air temperature, and geopotential height at various pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own, including a wide range of compatible satellite-derived datasets. The package generates correlations between the selected variables and the predictand; the user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). The user can then select some or all of the statistical prediction models provided, which include linear regression models (GLM, SGLM), tree-based models (bagging, random forest, boosting), an artificial neural network, and other nonlinear models such as the Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps used (the region selected, the time period specified, the predictand, predictors, and preprocessing options chosen) and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS
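The tool itself is built with Shiny/R; the following Python sketch only illustrates the core analytical steps (correlation mapping over a gridded field, then fitting two of the offered model families), with array shapes and model settings assumed for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

def correlation_map(field, y):
    """Pearson correlation of a gridded predictor field (time, lat, lon)
    with a predictand time series y: the map a user would inspect before
    drawing a predictor polygon."""
    y = np.asarray(y, dtype=float)
    f = field - field.mean(axis=0)
    yc = y - y.mean()
    cov = (f * yc[:, None, None]).mean(axis=0)
    return cov / (field.std(axis=0) * y.std() + 1e-12)

def fit_models(predictors, y):
    """Fit two of the offered model families; names, hyperparameters, and
    the absence of train/test splitting are simplifications."""
    models = {"GLM": LinearRegression(),
              "RF": RandomForestRegressor(n_estimators=200, random_state=0)}
    return {name: m.fit(predictors, y) for name, m in models.items()}
```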
Trabecular bone score (TBS): Method and applications.
Martineau, P; Leslie, W D
2017-11-01
Trabecular bone score (TBS) is a texture index derived from standard lumbar spine dual energy X-ray absorptiometry (DXA) images and provides information about the underlying bone independent of the bone mineral density (BMD). Several salient observations have emerged. Numerous studies have examined the relationship between TBS and fracture risk and have shown that lower TBS values are associated with increased risk for major osteoporotic fracture in postmenopausal women and older men, with this result being independent of BMD values and other clinical risk factors. Therefore, despite being derived from standard DXA images, the information contained in TBS is independent and complementary to the information provided by BMD and the FRAX® tool. A procedure to generate TBS-adjusted FRAX probabilities has become available with the resultant predicted fracture risks shown to be more accurate than the standard FRAX tool. With these developments, TBS has emerged as a clinical tool for improved fracture risk prediction and guiding decisions regarding treatment initiation, particularly for patients with FRAX probabilities around an intervention threshold. In this article, we review the development, validation, clinical application, and limitations of TBS. Copyright © 2017 Elsevier Inc. All rights reserved.
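TBS is generally described as being derived from the experimental variogram of gray-level variations in the DXA image; the proprietary scaling and calibration are not public, so the following is only a schematic illustration of a variogram-slope texture index, not the clinical algorithm:

```python
import numpy as np

def experimental_variogram(img, max_lag=10):
    """Mean squared gray-level difference between pixels separated by lag h,
    averaged over row and column directions of a 2D image."""
    img = np.asarray(img, dtype=float)
    lags = np.arange(1, max_lag + 1)
    v = [((img[:, h:] - img[:, :-h]) ** 2).mean() / 2.0 +
         ((img[h:, :] - img[:-h, :]) ** 2).mean() / 2.0
         for h in lags]
    return lags, np.array(v)

def tbs_like_index(img, max_lag=10):
    """Slope of the log-log variogram at small lags: the general idea behind
    TBS (steeper slope ~ better-connected microarchitecture), schematically."""
    h, v = experimental_variogram(img, max_lag)
    slope, _intercept = np.polyfit(np.log(h), np.log(v), 1)
    return slope
```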
A Design Tool for Liquid Rocket Engine Injectors
NASA Technical Reports Server (NTRS)
Farmer, R.; Cheng, G.; Trinh, H.; Tucker, K.
2000-01-01
A practical design tool emphasizing the analysis of flowfields near the injector face of liquid rocket engines has been developed and used to simulate preliminary configurations of NASA's Fastrac and vortex engines. This computational design tool is sufficiently detailed to predict the interactive effects of injector element impingement angles and points, the momenta of the individual orifice flows, and the resulting combusting flow. In order to simulate a significant number of individual orifices, a homogeneous computational fluid dynamics model was developed. To describe sub- and supercritical liquid and vapor flows, the model utilized thermal and caloric equations of state valid over a wide range of pressures and temperatures. The model was constructed such that the local quality of the flow was determined directly. Since both the Fastrac and vortex engines use RP-1/LOX propellants, a simplified hydrocarbon combustion model was devised to enable three-dimensional, multiphase flow simulations. Such a model does not identify drops or their distribution, but it does allow the recirculating flow along the injector face and into the acoustic cavity, and the film coolant flow, to be predicted accurately.
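The paper's wide-range equations of state are not reproduced in the abstract; for a homogeneous two-phase model, local quality follows the standard lever rule, sketched here with the saturation enthalpies supplied as inputs rather than computed from an EOS:

```python
def flow_quality(h, h_liq, h_vap):
    """Local vapor mass fraction (quality) in a homogeneous two-phase model,
    via the textbook lever rule x = (h - h_l) / (h_v - h_l), clamped to
    [0, 1] for subcooled or superheated states.  h_liq and h_vap would come
    from the tool's equations of state, which the abstract does not give."""
    if h_vap <= h_liq:
        raise ValueError("h_vap must exceed h_liq")
    return min(1.0, max(0.0, (h - h_liq) / (h_vap - h_liq)))

# Example: a mixture enthalpy halfway between saturation bounds gives x = 0.5.
print(flow_quality(h=350.0, h_liq=200.0, h_vap=500.0))
```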