Sample records for enable improved prediction

  1. Enhancing understanding and improving prediction of severe weather through spatiotemporal relational learning.

    PubMed

    McGovern, Amy; Gagne, David J; Williams, John K; Brown, Rodger A; Basara, Jeffrey B

    Severe weather, including tornadoes, thunderstorms, wind, and hail, annually causes significant loss of life and property. We are developing spatiotemporal machine learning techniques that will enable meteorologists to improve the prediction of these events by improving their understanding of the fundamental causes of the phenomena and by building skillful empirical predictive models. In this paper, we present significant enhancements of our Spatiotemporal Relational Probability Trees that enable autonomous discovery of spatiotemporal relationships as well as learning with arbitrary shapes. We focus our evaluation on two real-world case studies using our technique: predicting tornadoes in Oklahoma and predicting aircraft turbulence in the United States. We also discuss how to evaluate success for a machine learning algorithm in the severe weather domain, which will enable new methods such as ours to transfer from research to operations; provide a set of lessons learned for embedded machine learning applications; and discuss how to field our technique.

  2. Combustion of Nitramine Propellants

    DTIC Science & Technology

    1983-03-01

    through development of a comprehensive analytical model. The ultimate goals are to enable prediction of deflagration rate over a wide pressure range...superior in burn rate prediction, both simple models fail in correlating existing temperature-sensitivity data. (2) In the second part, a...auxiliary condition to enable independent burn rate prediction; improved melt phase model including decomposition-gas bubbles; model for far-field

  3. JPL's Role in Advancing Earth System Science to Meet the Challenges of Climate and Environmental Change

    NASA Technical Reports Server (NTRS)

    Evans, Diane

    2012-01-01

    Objective 2.1.1: Improve understanding of and improve the predictive capability for changes in the ozone layer, climate forcing, and air quality associated with changes in atmospheric composition. Objective 2.1.2: Enable improved predictive capability for weather and extreme weather events. Objective 2.1.3: Quantify, understand, and predict changes in Earth's ecosystems and biogeochemical cycles, including the global carbon cycle, land cover, and biodiversity. Objective 2.1.4: Quantify the key reservoirs and fluxes in the global water cycle and assess water cycle change and water quality. Objective 2.1.5: Improve understanding of the roles of the ocean, atmosphere, land and ice in the climate system and improve predictive capability for its future evolution. Objective 2.1.6: Characterize the dynamics of Earth's surface and interior and form the scientific basis for the assessment and mitigation of natural hazards and response to rare and extreme events. Objective 2.1.7: Enable the broad use of Earth system science observations and results in decision-making activities for societal benefits.

  4. Improved Prediction Models For PCC Pavement Performance-Related Specifications, Volume II: PAVESPEC 3.0 User's Guide

    DOT National Transportation Integrated Search

    2014-01-01

    Connected vehicle wireless data communications can enable safety applications that may reduce injuries and fatalities suffered on our roads and highways, as well as enable reductions in traffic congestion and impacts on the environment. As a critic...

  5. On Meeting Students Where They Are: Teacher Judgment and the Use of Data in Higher Education

    ERIC Educational Resources Information Center

    Schouten, Gina

    2017-01-01

    It is treated as a truism that teaching well requires "meeting students where they are". Data enable us to know better where that is. Data can improve instructional practice by informing predictions about which pedagogies will be most successful for which students, and can improve advising practice by informing predictions about which…

  6. Evaluation of single nucleotide polymorphisms in chromosomal regions impacting pregnancy status in cattle

    USDA-ARS's Scientific Manuscript database

    Reproductive success is an important component of commercial beef cattle production, and identification of DNA markers with predictive merit for reproductive success would facilitate accurate prediction of mean daughter pregnancy rate, enabling effective selection of bulls to improve female fertilit...

  7. Testing of the European Union exposure-response relationships and annoyance equivalents model for annoyance due to transportation noises: The need of revised exposure-response relationships and annoyance equivalents model.

    PubMed

    Gille, Laure-Anne; Marquis-Favre, Catherine; Morel, Julien

    2016-09-01

    An in situ survey was performed in 8 French cities in 2012 to study the annoyance due to combined transportation noises. As the European Commission recommends using the exposure-response relationships suggested by Miedema and Oudshoorn [Environmental Health Perspectives, 2001] to predict annoyance due to a single transportation noise, these exposure-response relationships were tested against the annoyance due to each transportation noise measured during the French survey. These relationships only enabled a good prediction of the percentages of people highly annoyed by road traffic noise. For the percentages of people annoyed and a little annoyed by road traffic noise, the quality of prediction was weak. For aircraft and railway noises, prediction of annoyance was not satisfactory either. As a consequence, the annoyance equivalents model of Miedema [The Journal of the Acoustical Society of America, 2004], which is based on these exposure-response relationships, did not enable a good prediction of annoyance due to combined transportation noises. Local exposure-response relationships were derived, following the full computation suggested by Miedema and Oudshoorn [Environmental Health Perspectives, 2001]. They led to a better calculation of annoyance due to each transportation noise in the French cities. A new version of the annoyance equivalents model was proposed using these new exposure-response relationships. This model enabled a better prediction of the total annoyance due to the combined transportation noises. These results therefore encourage improving the prediction of annoyance due to noise in isolation with local or revised exposure-response relationships, which will also contribute to improved annoyance modeling for combined noises. With this aim in mind, a methodology is proposed to incorporate noise sensitivity into the exposure-response relationships and the annoyance equivalents model. The results showed that taking this variable into account did not enhance either the exposure-response relationships or the annoyance equivalents model. Copyright © 2016 Elsevier Ltd. All rights reserved.
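
    For context, the Miedema and Oudshoorn exposure-response curves referenced above are cubic polynomials in the day-evening-night noise level Lden. Below is a minimal sketch of the widely reproduced road-traffic curve for the percentage of people highly annoyed (%HA); the coefficients are the commonly cited published values and should be checked against the original paper before any serious use.

    ```python
    def percent_highly_annoyed_road(lden: float) -> float:
        """Road-traffic %HA curve attributed to Miedema & Oudshoorn (2001).
        Coefficients are the commonly cited values; verify before use."""
        x = lden - 42.0  # the curve is expressed relative to Lden = 42 dB
        return max(0.0, 9.868e-4 * x**3 - 1.436e-2 * x**2 + 0.5118 * x)

    print(percent_highly_annoyed_road(65.0))  # about 16 %HA at Lden = 65 dB
    ```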

  8. Novel modes and adaptive block scanning order for intra prediction in AV1

    NASA Astrophysics Data System (ADS)

    Hadar, Ofer; Shleifer, Ariel; Mukherjee, Debargha; Joshi, Urvang; Mazar, Itai; Yuzvinsky, Michael; Tavor, Nitzan; Itzhak, Nati; Birman, Raz

    2017-09-01

    The demand for streaming video content is on the rise and growing exponentially. Network bandwidth is very costly, and therefore there is a constant effort to improve video compression rates and enable the sending of reduced data volumes while retaining quality of experience (QoE). One basic feature that utilizes the spatial correlation of pixels for video compression is intra prediction, which determines the codec's compression efficiency. Intra prediction enables significant reduction of the intra-frame (I-frame) size and, therefore, contributes to efficient exploitation of bandwidth. In this presentation, we propose new intra-prediction algorithms that improve the AV1 prediction model and provide better compression ratios. Two types of methods are considered: (1) a new scanning order method that maximizes spatial correlation in order to reduce prediction error; and (2) new intra-prediction modes implemented in AV1. Modern video coding standards, including the AV1 codec, utilize fixed scan orders in processing blocks during intra coding. The fixed scan orders typically result in residual blocks with high prediction error, mainly in blocks with edges. This means that the fixed scan orders cannot fully exploit the content-adaptive spatial correlations between adjacent blocks, so the bitrate after compression tends to be large. To reduce the bitrate induced by inaccurate intra prediction, the proposed approach adaptively chooses the scanning order of blocks according to the criterion of first predicting blocks with the maximum number of surrounding, already inter-predicted blocks. Using the modified scanning order method and the new modes has reduced the MSE by up to five times when compared to the conventional TM mode with raster scan, and up to two times when compared to the conventional CALIC mode with raster scan, depending on the image characteristics (which determine the percentage of blocks predicted with inter prediction, which in turn impacts the efficiency of the new scanning method). For the same cases, the PSNR was shown to improve by up to 7.4 dB and up to 4 dB, respectively. The new modes have yielded a 5% improvement in BD-rate over traditionally used modes when run on K-frames, which is expected to yield a 1% overall improvement.
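
    The adaptive scanning idea can be illustrated with a small greedy ordering routine: at each step, the block with the most already-available (inter-predicted or previously scanned) neighbors is predicted next. This is an illustrative sketch of that criterion only, not the authors' AV1 implementation.

    ```python
    def adaptive_scan_order(rows, cols, inter_coded):
        """Greedy scan order: repeatedly pick the block whose 4-neighborhood
        contains the most already-coded blocks. Illustrative sketch only."""
        coded = set(inter_coded)  # blocks already usable as references
        remaining = {(r, c) for r in range(rows) for c in range(cols)} - coded
        order = []

        def support(block):
            r, c = block
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            return sum(n in coded for n in neighbors)

        while remaining:
            best = max(remaining, key=support)  # most-surrounded block first
            remaining.remove(best)
            coded.add(best)  # once predicted, it supports later blocks
            order.append(best)
        return order

    # 4x4 grid of blocks in which two blocks were already inter-predicted:
    print(adaptive_scan_order(4, 4, inter_coded={(1, 1), (2, 2)}))
    ```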

  9. Genome-enabled selection doubles the accuracy of predicted breeding values for bacterial cold water disease resistance compared to traditional family-based selection in rainbow trout aquaculture

    USDA-ARS's Scientific Manuscript database

    We have shown previously that bacterial cold water disease (BCWD) resistance in rainbow trout can be improved using traditional family-based selection, but progress has been limited to exploiting only between-family genetic variation. Genomic selection (GS) is a new alternative enabling exploitation...

  10. Department of Defense Space Science and Technology Strategy 2015

    DTIC Science & Technology

    2015-01-01

    solar cells at 34% efficiency enabling higher power spacecraft capability. These solar cells developed by the Air Force Research Laboratory (AFRL...Reduce size, weight, power, cost, and improve thermal management for SATCOM terminals; Support intelligence surveillance and reconnaissance (ISR...Improve understanding and awareness of the Earth-to-Sun environment; Improve space environment forecast capabilities and tools to predict operational

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.; Britt, J.; Birkmire, R.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.

  12. Comparison of Models and Whole-Genome Profiling Approaches for Genomic-Enabled Prediction of Septoria Tritici Blotch, Stagonospora Nodorum Blotch, and Tan Spot Resistance in Wheat.

    PubMed

    Juliana, Philomin; Singh, Ravi P; Singh, Pawan K; Crossa, Jose; Rutkoski, Jessica E; Poland, Jesse A; Bergstrom, Gary C; Sorrells, Mark E

    2017-07-01

    The leaf spotting diseases in wheat that include Septoria tritici blotch (STB) caused by Zymoseptoria tritici, Stagonospora nodorum blotch (SNB) caused by Parastagonospora nodorum, and tan spot (TS) caused by Pyrenophora tritici-repentis pose challenges to breeding programs in selecting for resistance. A promising approach that could enable selection prior to phenotyping is genomic selection, which uses genome-wide markers to estimate breeding values (BVs) for quantitative traits. To evaluate this approach for seedling and/or adult plant resistance (APR) to STB, SNB, and TS, we compared the predictive ability of the least-squares (LS) approach with genomic-enabled prediction models including genomic best linear unbiased predictor (GBLUP), Bayesian ridge regression (BRR), Bayes A (BA), Bayes B (BB), Bayes Cπ (BC), Bayesian least absolute shrinkage and selection operator (BL), and reproducing kernel Hilbert spaces with markers (RKHS-M), as well as a pedigree-based model (RKHS-P) and RKHS with markers and pedigree (RKHS-MP). We observed that LS gave the lowest prediction accuracies and RKHS-MP the highest. The genomic-enabled prediction models and RKHS-P gave similar accuracies. The increase in accuracy using genomic prediction models over LS was 48%. The mean genomic prediction accuracies were 0.45 for STB (APR), 0.55 for SNB (seedling), 0.66 for TS (seedling), and 0.48 for TS (APR). We also compared markers from two whole-genome profiling approaches, genotyping by sequencing (GBS) and diversity arrays technology sequencing (DArTseq), for prediction. While GBS markers performed slightly better than DArTseq, combining markers from the two approaches did not improve accuracies. We conclude that implementing GS in breeding for these diseases would help to achieve higher accuracies and rapid gains from selection. Copyright © 2017 Crop Science Society of America.
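
    As a rough illustration of the marker-based prediction underlying models such as GBLUP and Bayesian ridge regression, the sketch below fits ridge-regression marker effects and computes genomic estimated breeding values; the data are simulated and all sizes are placeholders, so this is not the paper's models or data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 200, 1000                      # lines x markers (toy scale)
    X = rng.integers(0, 3, size=(n, p)).astype(float)  # 0/1/2 genotype calls
    true_effects = rng.normal(0, 0.05, size=p)
    y = X @ true_effects + rng.normal(0, 1.0, size=n)  # phenotype = signal + noise

    lam = 50.0                            # shrinkage; tune by cross-validation
    # Ridge (GBLUP-equivalent) solution: (X'X + lam*I)^-1 X'y
    effects = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    gebv = X @ effects                    # genomic estimated breeding values
    print(np.corrcoef(gebv, y)[0, 1])     # in-sample predictive correlation
    ```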

  13. Enabling CoO improvement thru green initiatives

    NASA Astrophysics Data System (ADS)

    Gross, Eric; Padmabandu, G. G.; Ujazdowski, Richard; Haran, Don; Lake, Matt; Mason, Eric; Gillespie, Walter

    2015-03-01

    Chipmakers' continued pressure to drive down costs while increasing utilization requires development in all areas. Cymer's commitment to meeting customers' needs includes developing solutions that enable higher productivity as well as lower the cost of lightsource operation. Improvements in system power efficiency and predictability were deployed to chipmakers in 2014 with the release of our latest Master Oscillator gas chamber. In addition, Cymer has committed to reduced gas usage, completing development of methods to reduce helium gas usage while maintaining superior bandwidth and wavelength stability. The latest developments in lowering cost of operations are paired with our advanced ETC controller in Cymer's XLR 700ix product.

  14. Evaluation of genome-enabled selection for bacterial cold water disease resistance using progeny performance data in Rainbow Trout: Insights on genotyping methods and genomic prediction models

    USDA-ARS's Scientific Manuscript database

    Bacterial cold water disease (BCWD) causes significant economic losses in salmonid aquaculture, and traditional family-based breeding programs aimed at improving BCWD resistance have been limited to exploiting only between-family variation. We used genomic selection (GS) models to predict genomic br...

  15. Neural preservation underlies speech improvement from auditory deprivation in young cochlear implant recipients.

    PubMed

    Feng, Gangyi; Ingvalson, Erin M; Grieco-Calub, Tina M; Roberts, Megan Y; Ryan, Maura E; Birmingham, Patrick; Burrowes, Delilah; Young, Nancy M; Wong, Patrick C M

    2018-01-30

    Although cochlear implantation enables some children to attain age-appropriate speech and language development, communicative delays persist in others, and outcomes are quite variable and difficult to predict, even for children implanted early in life. To understand the neurobiological basis of this variability, we used presurgical neural morphological data obtained from MRI of individual pediatric cochlear implant (CI) candidates implanted younger than 3.5 years to predict variability of their speech-perception improvement after surgery. We first compared neuroanatomical density and spatial pattern similarity of CI candidates to that of age-matched children with normal hearing, which allowed us to detail neuroanatomical networks that were either affected or unaffected by auditory deprivation. This information enables us to build machine-learning models to predict the individual children's speech development following CI. We found that regions of the brain that were unaffected by auditory deprivation, in particular the auditory association and cognitive brain regions, produced the highest accuracy, specificity, and sensitivity in patient classification and the most precise prediction results. These findings suggest that brain areas unaffected by auditory deprivation are critical to developing closer to typical speech outcomes. Moreover, the findings suggest that determination of the type of neural reorganization caused by auditory deprivation before implantation is valuable for predicting post-CI language outcomes for young children.

  16. Next-generation genome-scale models for metabolic engineering.

    PubMed

    King, Zachary A; Lloyd, Colton J; Feist, Adam M; Palsson, Bernhard O

    2015-12-01

    Constraint-based reconstruction and analysis (COBRA) methods have become widely used tools for metabolic engineering in both academic and industrial laboratories. By employing a genome-scale in silico representation of the metabolic network of a host organism, COBRA methods can be used to predict optimal genetic modifications that improve the rate and yield of chemical production. A new generation of COBRA models and methods is now being developed--encompassing many biological processes and simulation strategies-and next-generation models enable new types of predictions. Here, three key examples of applying COBRA methods to strain optimization are presented and discussed. Then, an outlook is provided on the next generation of COBRA models and the new types of predictions they will enable for systems metabolic engineering. Copyright © 2014 Elsevier Ltd. All rights reserved.
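
    The workhorse COBRA calculation, flux balance analysis, is a linear program: choose fluxes v that maximize an objective reaction subject to steady-state mass balance S v = 0 and capacity bounds. A toy three-reaction sketch with scipy follows; it is a stand-in illustration, not a genome-scale model or the API of any COBRA package.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: v1: uptake -> A;  v2: A -> B;  v3: B -> secreted product
    S = np.array([[1.0, -1.0,  0.0],   # steady-state balance of metabolite A
                  [0.0,  1.0, -1.0]])  # steady-state balance of metabolite B
    bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake capped at 10 units

    # linprog minimizes, so maximize v3 by minimizing -v3.
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2),
                  bounds=bounds, method="highs")
    print(res.x)  # optimal flux vector: [10, 10, 10]
    ```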

  17. Enhanced CARES Software Enables Improved Ceramic Life Prediction

    NASA Technical Reports Server (NTRS)

    Janosik, Lesley A.

    1997-01-01

    The NASA Lewis Research Center has developed award-winning software that enables American industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, graphite) structures in a wide variety of 21st century applications. The CARES (Ceramics Analysis and Reliability Evaluation of Structures) series of software is successfully used by numerous engineers in industrial, academic, and government organizations as an essential element of the structural design and material selection processes. The latest version of this software, CARES/Life, provides a general- purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. CARES/Life was recently enhanced by adding new modules designed to improve functionality and user-friendliness. In addition, a beta version of the newly-developed CARES/Creep program (for determining the creep life of monolithic ceramic components) has just been released to selected organizations.

  18. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    PubMed Central

    Vukićević, Milan

    2014-01-01

    Rapid growth and storage of biomedical data have enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, the analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selection of the best predictive algorithms for the data at hand with open-source big data technologies for the analysis of biomedical data. PMID:24892101

  19. Route Prediction on Tracking Data to Location-Based Services

    NASA Astrophysics Data System (ADS)

    Petróczi, Attila István; Gáspár-Papanek, Csaba

    Wireless networks have become so widespread that it is beneficial to determine the localization capability of cellular networks. This capability enables the development of location-based services that provide useful information. These services can be improved by route prediction, under the condition that simple algorithms are used because of the limited capabilities of mobile stations. This study gives alternative solutions to this route prediction problem based on a specific graph model. Our models provide the opportunity to reach our destinations with less effort.
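
    A simple concrete instance of graph-based route prediction is a first-order Markov model over network cells: count observed transitions, then predict the most likely next node. The sketch below makes that assumption explicit; it is not the specific graph model of the paper.

    ```python
    from collections import Counter, defaultdict

    def train_transitions(routes):
        """routes: iterable of node-ID sequences taken from tracking data."""
        counts = defaultdict(Counter)
        for route in routes:
            for a, b in zip(route, route[1:]):
                counts[a][b] += 1  # observed transition a -> b
        return counts

    def predict_next(counts, node):
        """Most frequently observed successor of `node`, or None if unseen."""
        return counts[node].most_common(1)[0][0] if counts[node] else None

    counts = train_transitions([["A", "B", "C"], ["A", "B", "D"], ["A", "B", "C"]])
    print(predict_next(counts, "B"))  # -> "C"
    ```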

  20. ENFIN--A European network for integrative systems biology.

    PubMed

    Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan

    2009-11-01

    Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods, and a novel platform for protein function analysis, FuncNet.

  1. Improved Displacement Transfer Functions for Structure Deformed Shape Predictions Using Discretely Distributed Surface Strains

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Fleischer, Van Tran

    2012-01-01

    In the formulations of earlier Displacement Transfer Functions for structure shape predictions, the surface strain distributions, along a strain-sensing line, were represented with piecewise linear functions. To improve the shape-prediction accuracies, Improved Displacement Transfer Functions were formulated using piecewise nonlinear strain representations. Through discretization of an embedded beam (depth-wise cross section of a structure along a strain-sensing line) into multiple small domains, piecewise nonlinear functions were used to describe the surface strain distributions along the discretized embedded beam. Such a piecewise approach enabled the piecewise integrations of the embedded beam curvature equations to yield slope and deflection equations in recursive forms. The resulting Improved Displacement Transfer Functions, written in summation forms, were expressed in terms of beam geometrical parameters and surface strains along the strain-sensing line. By feeding the surface strains into the Improved Displacement Transfer Functions, structural deflections could be calculated at multiple points for mapping out the overall structural deformed shapes for visual display. The shape-prediction accuracies of the Improved Displacement Transfer Functions were then examined in view of finite-element-calculated deflections using different tapered cantilever tubular beams. It was found that by using the piecewise nonlinear strain representations, the shape-prediction accuracies could be greatly improved, especially for highly-tapered cantilever tubular beams.
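
    The mechanics behind these transfer functions can be shown numerically: a surface strain ε(x) measured at distance c from the neutral axis gives curvature κ = ε/c, and two successive integrations yield slope and deflection; the paper's piecewise, domain-by-domain integration refines exactly this step. A minimal trapezoidal-rule sketch for a clamped (cantilever) beam with hypothetical data:

    ```python
    import numpy as np

    def deflection_from_strain(x, strain, c):
        """Cantilever deflection from surface strains via double integration.
        x: sensing stations measured from the fixed end; strain: surface
        strains; c: distance from neutral axis to the sensing surface."""
        kappa = np.asarray(strain) / c                # curvature = strain / c
        slope = np.concatenate(([0.0], np.cumsum(     # slope(0) = 0 (clamped)
            0.5 * (kappa[1:] + kappa[:-1]) * np.diff(x))))
        defl = np.concatenate(([0.0], np.cumsum(      # deflection(0) = 0
            0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))
        return defl

    x = np.linspace(0.0, 1.0, 11)        # 1 m beam, 11 strain stations
    strain = 1e-3 * (1.0 - x)            # linearly decaying strain (toy data)
    print(deflection_from_strain(x, strain, c=0.01)[-1])  # tip deflection, m
    ```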

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements in flexible CIGS PV module performance and efficiency.

  3. Applications of LANCE Data at SPoRT

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew

    2014-01-01

    Short-term Prediction Research and Transition (SPoRT) Center: Mission: Apply NASA and NOAA measurement systems and unique Earth science research to improve the accuracy of short-term weather prediction at the regional/local scale. Goals: Evaluate and assess the utility of NASA and NOAA Earth science data, products, and unique research capabilities to address operational weather forecast problems; provide an environment that enables the development and testing of new capabilities to improve short-term weather forecasts on a regional scale; and help ensure successful transition of new capabilities to operational weather entities for the benefit of society.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Partridge Jr, William P.; Choi, Jae-Soon

    By directly resolving spatial and temporal species distributions within operating honeycomb monolith catalysts, spatially resolved capillary inlet mass spectrometry (SpaciMS) provides a uniquely enabling perspective for advancing automotive catalysis. Specifically, the ability to follow the spatiotemporal evolution of reactions throughout the catalyst is a significant advantage over inlet-and-effluent-limited analysis. Intracatalyst resolution elucidates numerous catalyst details including the network and sequence of reactions, clarifying reaction pathways; the relative rates of different reactions and impacts of operating conditions and catalyst state; and reaction dynamics and intermediate species that exist only within the catalyst. These details provide a better understanding of how the catalyst functions and have basic and practical benefits; e.g., catalyst system design; strategies for on-road catalyst state assessment, control, and on-board diagnostics; and creating robust and accurate predictive catalyst models. Moreover, such spatiotemporally distributed data provide for critical model assessment, and identification of improvement opportunities that might not be apparent from effluent assessment; i.e., while an incorrectly formulated model may provide correct effluent predictions, one that can accurately predict the spatiotemporal evolution of reactions along the catalyst channels will be more robust, accurate, and reliable. In such ways, intracatalyst diagnostics comprehensively enable improved design and development tools, and faster and lower-cost development of more efficient and durable automotive catalyst systems. Beyond these direct contributions, SpaciMS has spawned and been applied to enable other analytical techniques for resolving transient distributed intracatalyst performance. This chapter focuses on SpaciMS applications and associated catalyst insights and improvements, with specific sections related to lean NOx traps, selective catalytic reduction catalysts, oxidation catalysts, and particulate filters. The objective is to promote broader use and development of intracatalyst analytical methods, and thereby expand the insights resulting from this detailed perspective for advancing automotive catalyst technologies.

  5. Algorithm for Training a Recurrent Multilayer Perceptron

    NASA Technical Reports Server (NTRS)

    Parlos, Alexander G.; Rais, Omar T.; Menon, Sunil K.; Atiya, Amir F.

    2004-01-01

    An improved algorithm has been devised for training a recurrent multilayer perceptron (RMLP) for optimal performance in predicting the behavior of a complex, dynamic, and noisy system multiple time steps into the future. [An RMLP is a computational neural network with self-feedback and cross-talk (both delayed by one time step) among neurons in hidden layers]. Like other neural-network-training algorithms, this algorithm adjusts network biases and synaptic-connection weights according to a gradient-descent rule. The distinguishing feature of this algorithm is a combination of global feedback (the use of predictions as well as the current output value in computing the gradient at each time step) and recursiveness. The recursive aspect of the algorithm lies in the inclusion of the gradient of predictions at each time step with respect to the predictions at the preceding time step; this recursion enables the RMLP to learn the dynamics. It has been conjectured that carrying the recursion to even earlier time steps would enable the RMLP to represent a noisier, more complex system.
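
    A minimal numpy sketch of the forward pass of such a network is given below: one hidden layer whose activations feed back into itself with a one-step delay. The training recursion described above is omitted, and all sizes and weights are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_in, n_hid = 3, 8
    W_in = rng.normal(0, 0.3, (n_hid, n_in))    # input -> hidden
    W_rec = rng.normal(0, 0.3, (n_hid, n_hid))  # hidden(t-1) -> hidden(t)
    w_out = rng.normal(0, 0.3, n_hid)           # hidden -> scalar prediction

    def rmlp_predict(inputs):
        """Run the recurrent perceptron over a sequence; the delayed hidden
        state is what lets the network carry the system's dynamics."""
        h = np.zeros(n_hid)
        preds = []
        for u in inputs:                        # one time step at a time
            h = np.tanh(W_in @ u + W_rec @ h)   # self-feedback, one-step delay
            preds.append(w_out @ h)
        return preds

    print(rmlp_predict([rng.normal(size=n_in) for _ in range(5)]))
    ```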

  6. Improved Fiber-Optic-Coupled Pressure And Vibration Sensors

    NASA Technical Reports Server (NTRS)

    Zuckerwar, Allan J.; Cuomo, Frank W.

    1994-01-01

    Improved fiber-optic coupler enables use of single optical fiber to carry light to and from sensor head. Eliminates problem of alignment of multiple fibers in sensor head and simplifies calibration by making performance both more predictable and more stable. Sensitivities increased, sizes reduced. Provides increased margin for design of compact sensor heads not required to contain amplifier circuits and withstand high operating temperatures.

  7. The Earth Science Vision

    NASA Technical Reports Server (NTRS)

    Schoeberl, Mark; Rychekewkitsch, Michael; Andrucyk, Dennis; McConaughy, Gail; Meeson, Blanche; Hildebrand, Peter; Einaudi, Franco (Technical Monitor)

    2000-01-01

    NASA's Earth Science Enterprise's long-range vision is to enable the development of a national proactive environmental predictive capability through targeted scientific research and technological innovation. Proactive environmental prediction means the prediction of environmental events and their secondary consequences. These consequences range from disasters and disease outbreaks to improved food production and reduced transportation, energy, and insurance costs. The economic advantage of this predictive capability will greatly outweigh the cost of development. Developing this predictive capability requires a greatly improved understanding of the Earth system and the interaction of the various components of that system. It also requires a change in our approach to gathering data about the Earth and a change in our current methodology for processing that data, including its delivery to the customers. And, most importantly, it requires a renewed partnership between NASA and its sister agencies. We identify six application themes that summarize the potential of proactive environmental prediction. We also identify four technology themes that articulate our approach to implementing proactive environmental prediction.

  8. Modeling the prediction of business intelligence system effectiveness.

    PubMed

    Weng, Sung-Shun; Yang, Ming-Hsien; Koo, Tian-Lih; Hsiao, Pei-I

    2016-01-01

    Although business intelligence (BI) technologies are continually evolving, the capability to apply BI technologies has become an indispensable resource for enterprises operating in today's complex, uncertain and dynamic business environment. This study performed pioneering work by constructing models and rules for the prediction of business intelligence system effectiveness (BISE) in relation to the implementation of BI solutions. For enterprises, effectively managing the critical attributes that determine BISE, and developing prediction models with a set of rules for self-evaluation of the effectiveness of BI solutions, is necessary to improve BI implementation and ensure its success. The main study findings identified the critical prediction indicators of BISE that are important for forecasting BI performance, and highlighted five classification and prediction rules of BISE derived from decision tree structures, as well as a refined regression prediction model with four critical prediction indicators constructed by logistic regression analysis. These results can enable enterprises to improve BISE while effectively managing BI solution implementation, and they offer value to academics to whom theory is important.
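
    The two modeling steps named above (rule extraction with a decision tree, then a compact logistic-regression predictor) can be sketched with scikit-learn as follows; the data and the four indicator names are simulated placeholders, not the study's indicators.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 4))            # 4 hypothetical BISE indicators
    y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, 300) > 0).astype(int)

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)   # readable rules
    print(export_text(tree, feature_names=[f"ind{i}" for i in range(4)]))

    logit = LogisticRegression().fit(X, y)   # refined prediction model
    print(logit.coef_)                       # weights on the 4 indicators
    ```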

  9. Modelling directional solidification

    NASA Technical Reports Server (NTRS)

    Wilcox, William R.

    1991-01-01

    The long range goal of this program is to develop an improved understanding of phenomena of importance to directional solidification and to enable explanation and prediction of differences in behavior between solidification on Earth and in space. Current emphasis is on determining the influence of perturbations on directional solidification.

  10. Predictive Analytics for Safer Food Supply

    USDA-ARS?s Scientific Manuscript database

    Science based risk analysis improves the USDA Food Safety Inspection Service’s ability to combat threats to public health from food-borne illness by allowing the Agency to focus resources on hazards that pose the greatest risk. Innovative algorithms enable detection and containment of threat by an...

  11. Organs-on-chips at the frontiers of drug discovery

    PubMed Central

    Esch, Eric W.; Bahinski, Anthony; Huh, Dongeun

    2016-01-01

    Improving the effectiveness of preclinical predictions of human drug responses is critical to reducing costly failures in clinical trials. Recent advances in cell biology, microfabrication and microfluidics have enabled the development of microengineered models of the functional units of human organs — known as organs-on-chips — that could provide the basis for preclinical assays with greater predictive power. Here, we examine the new opportunities for the application of organ-on-chip technologies in a range of areas in preclinical drug discovery, such as target identification and validation, target-based screening, and phenotypic screening. We also discuss emerging drug discovery opportunities enabled by organs-on-chips, as well as important challenges in realizing the full potential of this technology. PMID:25792263

  12. Status report on the Aeronautical Research Institute of Sweden version of the missile aerodynamics program LARV, for calculation of static aerodynamic properties and longitudinal aerodynamic damping derivatives. Part 1: Theory

    NASA Astrophysics Data System (ADS)

    Weibust, E.

    Improvements to a missile aerodynamics program which enable it to (a) calculate aerodynamic coefficients as input for a flight mechanics model, (b) check manufacturers' data or estimate performance from photographs, (c) reduce wind tunnel testing, and (d) aid optimization studies, are discussed. Slender body theory is used for longitudinal damping derivatives prediction. Program predictions were compared to known values. Greater accuracy is required in the estimation of drag due to excrescences on actual missile configurations, the influence of a burning motor, and nonlinear effects in the stall region. Prediction of pressure centers on wings and on bodies in presence of wings must be improved.

  13. The Scintillation Prediction Observations Research Task (SPORT) Mission

    NASA Technical Reports Server (NTRS)

    Spann, James; Swenson, Charles; Durao, Otavio; Loures, Luis; Heelis, Rod; Bishop, Rebecca; Le, Guan; Abdu, Mangalathayil; Krause, Linda; Denardin, Clezio

    2017-01-01

    SPORT is a science mission using a 6U CubeSat and integrated ground network that will (1) advance understanding and (2) enable improved predictions of scintillation occurrence that impact GPS signals and radio communications. This is the science of Space Weather. SPORT is an international partnership with NASA, U.S. institutions, the Brazilian National Institute for Space Research (INPE), and the Technical Aeronautics Institute under the Brazilian Air Force Command Department (DCTA/ITA).

  14. Simultaneous optimization of biomolecular energy function on features from small molecules and macromolecules

    PubMed Central

    Park, Hahnbeom; Bradley, Philip; Greisen, Per; Liu, Yuan; Mulligan, Vikram Khipple; Kim, David E.; Baker, David; DiMaio, Frank

    2017-01-01

    Most biomolecular modeling energy functions for structure prediction, sequence design, and molecular docking have been parameterized using existing macromolecular structural data; this contrasts with molecular mechanics force fields, which are largely optimized using small-molecule data. In this study, we describe an integrated method that enables optimization of a biomolecular modeling energy function simultaneously against small-molecule thermodynamic data and high-resolution macromolecular structural data. We use this approach to develop a next-generation Rosetta energy function that utilizes a new anisotropic implicit solvation model and an improved electrostatics and Lennard-Jones model, illustrating how energy functions can be considerably improved in their ability to describe large-scale energy landscapes by incorporating both small-molecule and macromolecule data. The energy function improves performance in a wide range of protein structure prediction challenges, including monomeric structure prediction, protein-protein and protein-ligand docking, protein sequence design, and prediction of free energy changes upon mutation, while reasonably recapitulating small-molecule thermodynamic properties. PMID:27766851

  15. Using gaze patterns to predict task intent in collaboration.

    PubMed

    Huang, Chien-Ming; Andrist, Sean; Sauppé, Allison; Mutlu, Bilge

    2015-01-01

    In everyday interactions, humans naturally exhibit behavioral cues, such as gaze and head movements, that signal their intentions while interpreting the behavioral cues of others to predict their intentions. Such intention prediction enables each partner to adapt their behaviors to the intent of others, serving a critical role in joint action where parties work together to achieve a common goal. Among behavioral cues, eye gaze is particularly important in understanding a person's attention and intention. In this work, we seek to quantify how gaze patterns may indicate a person's intention. Our investigation was contextualized in a dyadic sandwich-making scenario in which a "worker" prepared a sandwich by adding ingredients requested by a "customer." In this context, we investigated the extent to which the customers' gaze cues serve as predictors of which ingredients they intend to request. Predictive features were derived to represent characteristics of the customers' gaze patterns. We developed a support vector machine-based (SVM-based) model that achieved 76% accuracy in predicting the customers' intended requests based solely on gaze features. Moreover, the predictor made correct predictions approximately 1.8 s before the spoken request from the customer. We further analyzed several episodes of interactions from our data to develop a deeper understanding of the scenarios where our predictor succeeded and failed in making correct predictions. These analyses revealed additional gaze patterns that may be leveraged to improve intention prediction. This work highlights gaze cues as a significant resource for understanding human intentions and informs the design of real-time recognizers of user intention for intelligent systems, such as assistive robots and ubiquitous devices, that may enable more complex capabilities and improved user experience.
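
    A schematic of the SVM step with scikit-learn is shown below; the five gaze features and all data are simulated stand-ins for the dwell-time and fixation statistics the study derived, not the authors' feature set.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    X = rng.normal(size=(120, 5))       # e.g., dwell times, fixation counts
    y = rng.integers(0, 3, size=120)    # intended ingredient (3 classes)

    clf = SVC(kernel="rbf", C=1.0)      # RBF-kernel support vector machine
    # On random features accuracy should hover near chance (~0.33); informative
    # gaze features are what lifted accuracy to 76% in the study.
    print(cross_val_score(clf, X, y, cv=5).mean())
    ```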

  16. Modelling Directional Solidification

    NASA Technical Reports Server (NTRS)

    Wilcox, William R.; Regel, Liya L.; Zhou, Jian; Yuan, Weijun

    1992-01-01

    The long range goal of this program has been to develop an improved understanding of phenomena of importance to directional solidification, in order to enable explanation and prediction of differences in behavior between solidification on Earth and in space. Current emphasis is on determining the influence of perturbations on directional solidification.

  17. Improving Performance and Predictability of Storage Arrays

    ERIC Educational Resources Information Center

    Altiparmak, Nihat

    2013-01-01

    A massive amount of data is generated every day through sensors, Internet transactions, social networks, video, and all other digital sources available. Many organizations store this data to enable breakthrough discoveries and innovation in science, engineering, medicine, and commerce. Such a massive scale of data poses new research problems called big…

  18. Global precipitation measurement (GPM) preliminary design

    NASA Astrophysics Data System (ADS)

    Neeck, Steven P.; Kakar, Ramesh K.; Azarbarzin, Ardeshir A.; Hou, Arthur Y.

    2008-10-01

    The overarching Earth science mission objective of the Global Precipitation Measurement (GPM) mission is to develop a scientific understanding of the Earth system and its response to natural and human-induced changes. This will enable improved prediction of climate, weather, and natural hazards for present and future generations. The specific scientific objectives of GPM are advancing: Precipitation Measurement, through combined use of active and passive remote-sensing techniques; Water/Energy Cycle Variability, through improved knowledge of the global water/energy cycle and fresh water availability; Climate Prediction, through better understanding of surface water fluxes, soil moisture storage, cloud/precipitation microphysics, and latent heat release; Weather Prediction, through improved numerical weather prediction (NWP) skill from more accurate and frequent measurements of instantaneous rain rates with better error characterizations and improved assimilation methods; and Hydrometeorological Prediction, through better temporal sampling and spatial coverage of high-resolution precipitation measurements and innovative hydro-meteorological modeling. GPM is a joint initiative with the Japan Aerospace Exploration Agency (JAXA) and other international partners and is the backbone of the Committee on Earth Observation Satellites (CEOS) Precipitation Constellation. It will unify and improve global precipitation measurements from a constellation of dedicated and operational active/passive microwave sensors. GPM is completing the Preliminary Design Phase and is advancing towards launch in 2013 and 2014.

  19. Improving Disease Prediction by Incorporating Family Disease History in Risk Prediction Models with Large-Scale Genetic Data.

    PubMed

    Gim, Jungsoo; Kim, Wonji; Kwak, Soo Heon; Choi, Hosik; Park, Changyi; Park, Kyong Soo; Kwon, Sunghoon; Park, Taesung; Won, Sungho

    2017-11-01

    Despite the many successes of genome-wide association studies (GWAS), the known susceptibility variants identified by GWAS have modest effect sizes, leading to notable skepticism about the effectiveness of building a risk prediction model from large-scale genetic data. However, in contrast to genetic variants, the family history of diseases has been largely accepted as an important risk factor in clinical diagnosis and risk prediction. Nevertheless, the complicated structures of the family history of diseases have limited their application in clinical practice. Here, we developed a new method that enables incorporation of the general family history of diseases with a liability threshold model, and propose a new analysis strategy for risk prediction with penalized regression analysis that incorporates both large numbers of genetic variants and clinical risk factors. Application of our model to type 2 diabetes in the Korean population (1846 cases and 1846 controls) demonstrated that single-nucleotide polymorphisms accounted for 32.5% of the variation explained by the predicted risk scores in the test data set, and incorporation of family history led to an additional 6.3% improvement in prediction. Our results illustrate that family medical history provides valuable information on the variation of complex diseases and improves prediction performance. Copyright © 2017 by the Genetics Society of America.
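
    The proposed analysis strategy, penalized regression over many variants plus a family-history covariate, can be sketched as follows; the data are simulated and the L1-penalized logistic model is a generic stand-in for the paper's method.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n, p = 500, 200
    snps = rng.integers(0, 3, size=(n, p)).astype(float)  # genotype dosages
    fam = rng.integers(0, 2, size=(n, 1)).astype(float)   # family history 0/1
    logit = snps[:, 0] - snps[:, 1] + 2.0 * fam[:, 0] - 1.0
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = np.hstack([snps, fam])          # genetic variants + clinical factor
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(X, y)
    print(model.coef_[0, -1])           # fitted weight on family history
    ```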

  20. Development of an improved MATLAB GUI for the prediction of coefficients of restitution, and integration into LMS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baca, Renee Nicole; Congdon, Michael L.; Brake, Matthew Robert

    In 2012, a Matlab GUI for the prediction of the coefficient of restitution was developed in order to enable the formulation of more accurate Finite Element Analysis (FEA) models of components. This report details the development of a new Rebound Dynamics GUI and how it differs from the previously developed program. The new GUI includes several new features, such as source and citation documentation for the material database, as well as a multiple-materials impact modeler for use with LMS Virtual.Lab Motion (LMS VLM), a rigid body dynamics modeling software package. The Rebound Dynamics GUI has been designed to work with LMS VLM to enable straightforward incorporation of velocity-dependent coefficients of restitution in rigid body dynamics simulations.
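
    The role of a velocity-dependent coefficient of restitution can be illustrated with a tiny bounce simulation; the power-law form and all constants below are assumptions for illustration, not the model implemented in the GUI.

    ```python
    def cor(v_impact, e0=0.9, v_ref=1.0, n=0.25):
        """Illustrative velocity-dependent COR: e drops as impact speed rises."""
        return e0 * (v_ref / max(v_impact, v_ref)) ** n

    g, height = 9.81, 1.0                  # gravity (m/s^2), drop height (m)
    for bounce in range(1, 6):
        v_in = (2 * g * height) ** 0.5     # speed just before impact
        v_out = cor(v_in) * v_in           # rebound speed from the COR model
        height = v_out**2 / (2 * g)        # apex height of the next bounce
        print(f"bounce {bounce}: e = {cor(v_in):.3f}, apex = {height:.3f} m")
    ```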

  1. New Advanced Technology to Improve Prediction and Prevention of Type 1 Diabetes

    DTIC Science & Technology

    2006-11-01

    NEPH2 is located at the glomerular slit diaphragm, interacts with nephrin and is cleaved from podocytes by metalloproteinases. J Am Soc Nephrol 2005 16...localization of nephrin, podocin, and the actin cytoskeleton: evidence for a role in podocyte foot process formation. Am J Pathol 2002 161:1459...markers linked to these phenotypes will improve understanding of the molecular mechanisms underlying these diseases and enable the development of

  2. Modeling of temperature-induced near-infrared and low-field time-domain nuclear magnetic resonance spectral variation: chemometric prediction of limonene and water content in spray-dried delivery systems.

    PubMed

    Andrade, Letícia; Farhat, Imad A; Aeberhardt, Kasia; Bro, Rasmus; Engelsen, Søren Balling

    2009-02-01

    The influence of temperature on near-infrared (NIR) and nuclear magnetic resonance (NMR) spectroscopy complicates the industrial applications of both spectroscopic methods. The focus of this study is to analyze and model the effect of temperature variation on NIR spectra and NMR relaxation data. Different multivariate methods were tested for constructing robust prediction models based on NIR and NMR data acquired at various temperatures. Data were acquired on model spray-dried limonene systems at five temperatures in the range from 20 °C to 60 °C, and partial least squares (PLS) regression models were computed for limonene and water predictions. The predictive ability of the models computed on the NIR spectra (acquired at various temperatures) improved significantly when data were preprocessed using extended inverted signal correction (EISC). The average PLS regression prediction error was reduced to 0.2%, corresponding to 1.9% and 3.4% of the full range of limonene and water reference values, respectively. The removal of variation induced by temperature prior to calibration, by direct orthogonalization (DO), slightly enhanced the predictive ability of the models based on NMR data. Bilinear PLS models, with implicit inclusion of the temperature, enabled limonene and water predictions by NMR with an error of 0.3% (corresponding to 2.8% and 7.0% of the full range of limonene and water). For NMR, and in contrast to the NIR results, modeling the data using multi-way N-PLS improved the models' performance. N-PLS models, in which temperature was included as an extra variable, enabled more accurate prediction, especially for limonene (prediction error was reduced to 0.2%). Overall, this study proved that it is possible to develop models for limonene and water content prediction based on NIR and NMR data, independent of the measurement temperature.
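
    The simplest of the strategies above, including temperature as an extra predictor in a bilinear PLS model, can be sketched with scikit-learn; the spectra and reference values are simulated placeholders.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(4)
    n_samples, n_wavelengths = 100, 50
    spectra = rng.normal(size=(n_samples, n_wavelengths))  # simulated NIR
    temp = rng.uniform(20, 60, size=(n_samples, 1))        # acquisition T, C
    y = spectra[:, :5].sum(axis=1) + 0.01 * temp[:, 0]     # toy limonene level

    X = np.hstack([spectra, temp])        # temperature as an extra variable
    pls = PLSRegression(n_components=5).fit(X, y)
    print(pls.score(X, y))                # R^2 of the calibration fit
    ```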

  3. Watching novice action degrades expert motor performance: Causation between action production and outcome prediction of observed actions by humans

    PubMed Central

    Ikegami, Tsuyoshi; Ganesh, Gowrishankar

    2014-01-01

    Our social skills are critically determined by our ability to understand and appropriately respond to actions performed by others. However, despite its obvious importance, the mechanisms enabling action understanding in humans have remained largely unclear. A popular but controversial belief is that parts of the motor system contribute to our ability to understand observed actions. Here, using a novel behavioral paradigm, we investigated this belief by examining a causal relation between action production and a component of action understanding - outcome prediction, the ability of a person to predict the outcome of observed actions. We asked dart experts to watch novice dart throwers and predict the outcome of their throws. We modulated the feedback provided to them, caused a specific improvement in the experts' ability to predict watched actions while controlling the other experimental factors, and showed that a change (improvement) in their outcome prediction ability results in a progressive and proportional deterioration in the experts' own darts performance. This causal relationship supports involvement of the motor system in outcome prediction by humans of actions observed in others. PMID:25384755

  4. Improved design of constrained model predictive tracking control for batch processes against unknown uncertainties.

    PubMed

    Wu, Sheng; Jin, Qibing; Zhang, Ridong; Zhang, Junfeng; Gao, Furong

    2017-07-01

    In this paper, an improved constrained tracking control design is proposed for batch processes under uncertainties. A new process model that facilitates process state and tracking error augmentation with further additional tuning is first proposed. Then a subsequent controller design is formulated using robust stable constrained MPC optimization. Unlike conventional robust model predictive control (MPC), the proposed method gives the controller design more degrees of freedom for tuning, so that improved tracking control can be acquired; this is important since uncertainties inevitably exist in practice and cause model/plant mismatches. An injection molding process is introduced to illustrate the effectiveness of the proposed MPC approach in comparison with conventional robust MPC. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Prediction of response factors for gas chromatography with flame ionization detection: Algorithm improvement, extension to silylated compounds, and application to the quantification of metabolites

    PubMed Central

    de Saint Laumer, Jean‐Yves; Leocata, Sabine; Tissot, Emeline; Baroux, Lucie; Kampf, David M.; Merle, Philippe; Boschung, Alain; Seyfried, Markus

    2015-01-01

    We previously showed that the relative response factors of volatile compounds are predictable from either combustion enthalpies or their molecular formulae only [1]. We now extend this prediction to silylated derivatives by adding an increment in the ab initio calculation of combustion enthalpies. The accuracy of the experimental relative response factors database was also improved and its population increased to 490 values. In particular, more brominated compounds were measured, and their prediction accuracy was improved by adding a correction factor to the algorithm. The correlation coefficient between predicted and measured values increased from 0.936 to 0.972, leading to a mean prediction accuracy of ±6%. Thus, 93% of the relative response factor values were predicted with an accuracy of better than ±10%. The capabilities of the extended algorithm are exemplified by (i) the quick and accurate quantification of hydroxylated metabolites resulting from a biodegradation test after silylation and prediction of their relative response factors, without having the reference substances available; and (ii) rapid purity determinations of volatile compounds. This study confirms that gas chromatography with flame ionization detection, using predicted relative response factors, is one of the few techniques that enables quantification of volatile compounds without calibrating the instrument with the pure reference substance. PMID:26179324
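
    The structure of the prediction is that FID response per gram scales with combustion enthalpy per gram, expressed relative to a reference compound. The sketch below shows only that structure; the actual coefficients, correction increments, and reference substance must be taken from the paper, and the numbers shown are purely hypothetical.

    ```python
    def rrf_estimate(dh_comb, mw, dh_comb_ref, mw_ref):
        """Structural sketch of a relative response factor estimate: response
        per gram ~ combustion enthalpy per gram, relative to a reference.
        Real coefficients and reference compound must come from the paper."""
        return (dh_comb / mw) / (dh_comb_ref / mw_ref)

    # Hypothetical enthalpies (kJ/mol) and molar masses (g/mol):
    print(rrf_estimate(dh_comb=6100.0, mw=136.2, dh_comb_ref=5060.0, mw_ref=158.2))
    ```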

  6. Bankruptcy prediction based on financial ratios using Jordan Recurrent Neural Networks: a case study in Polish companies

    NASA Astrophysics Data System (ADS)

    Hardinata, Lingga; Warsito, Budi; Suparti

    2018-05-01

    The complexity of bankruptcy makes accurate bankruptcy prediction models difficult to achieve. Various prediction models have been developed to improve the accuracy of bankruptcy predictions. Machine learning has been widely used for prediction because of its adaptive capabilities. Artificial Neural Networks (ANN) are one family of machine learning methods that has proved able to complete inference tasks such as prediction and classification, especially in data mining. In this paper, we propose the implementation of Jordan Recurrent Neural Networks (JRNN) to classify and predict corporate bankruptcy based on financial ratios. The feedback interconnections in a JRNN enable the network to retain important information, allowing it to work more effectively. The result analysis showed that the JRNN works very well in bankruptcy prediction, with an average success rate of 81.3785%.
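
    What distinguishes a Jordan network from a standard (Elman-style) recurrent layer is that the previous output, not the hidden state, is fed back as context. A minimal numpy forward pass, with all sizes and weights illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_in, n_hid = 6, 10                   # e.g., 6 financial ratios as inputs
    W_in = rng.normal(0, 0.3, (n_hid, n_in))
    w_ctx = rng.normal(0, 0.3, n_hid)     # previous output -> hidden (Jordan)
    w_out = rng.normal(0, 0.3, n_hid)

    def jordan_forward(sequence):
        """The previous prediction is fed back as the context input."""
        y_prev, outputs = 0.0, []
        for x in sequence:
            h = np.tanh(W_in @ x + w_ctx * y_prev)   # context enters hidden
            y_prev = 1 / (1 + np.exp(-(w_out @ h)))  # bankruptcy probability
            outputs.append(y_prev)
        return outputs

    print(jordan_forward([rng.normal(size=n_in) for _ in range(4)]))
    ```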

  7. Blood autoantibody and cytokine profiles predict response to anti-tumor necrosis factor therapy in rheumatoid arthritis

    PubMed Central

    Hueber, Wolfgang; Tomooka, Beren H; Batliwalla, Franak; Li, Wentian; Monach, Paul A; Tibshirani, Robert J; Van Vollenhoven, Ronald F; Lampa, Jon; Saito, Kazuyoshi; Tanaka, Yoshiya; Genovese, Mark C; Klareskog, Lars; Gregersen, Peter K; Robinson, William H

    2009-01-01

    Introduction: Anti-TNF therapies have revolutionized the treatment of rheumatoid arthritis (RA), a common systemic autoimmune disease involving destruction of the synovial joints. However, in the practice of rheumatology, approximately one-third of patients demonstrate no clinical improvement in response to treatment with anti-TNF therapies, while another third demonstrate a partial response, and one-third an excellent and sustained response. Since no clinical or laboratory tests are available to predict response to anti-TNF therapies, great need exists for predictive biomarkers. Methods: Here we present a multi-step proteomics approach using arthritis antigen arrays, a multiplex cytokine assay, and conventional ELISA, with the objective of identifying a biomarker signature in three ethnically diverse cohorts of RA patients treated with the anti-TNF therapy etanercept. Results: We identified a 24-biomarker signature that enabled prediction of a positive clinical response to etanercept in all three cohorts (positive predictive values 58 to 72%; negative predictive values 63 to 78%). Conclusions: We identified a multi-parameter protein biomarker that enables pretreatment classification and prediction of etanercept responders, and tested this biomarker using three independent cohorts of RA patients. Although further validation in prospective and larger cohorts is needed, our observations demonstrate that multiplex characterization of autoantibodies and cytokines provides clinical utility for predicting response to the anti-TNF therapy etanercept in RA patients. PMID:19460157

  8. Predicting and interpreting identification errors in military vehicle training using multidimensional scaling.

    PubMed

    Bohil, Corey J; Higgins, Nicholas A; Keebler, Joseph R

    2014-01-01

    We compared methods for predicting and understanding the source of confusion errors during military vehicle identification training. Participants completed training to identify main battle tanks. They also completed card-sorting and similarity-rating tasks to express their mental representation of resemblance across the set of training items. We expected participants to selectively attend to a subset of vehicle features during these tasks, and we hypothesised that we could predict identification confusion errors based on the outcomes of the card-sort and similarity-rating tasks. Based on card-sorting results, we were able to predict about 45% of observed identification confusions. Based on multidimensional scaling of the similarity-rating data, we could predict more than 80% of identification confusions. These methods also enabled us to infer the dimensions receiving significant attention from each participant. This understanding of mental representation may be crucial in creating personalised training that directs attention to features that are critical for accurate identification. Participants completed military vehicle identification training and testing, along with card-sorting and similarity-rating tasks. The data enabled us to predict up to 84% of identification confusion errors and to understand the mental representation underlying these errors. These methods have potential to improve training and reduce identification errors leading to fratricide.
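
    A compact sketch of the similarity-rating pipeline described above, using scikit-learn's metric MDS; the rating matrix, embedding dimensionality, and confusion cutoff are placeholders, since the study's stimuli and data are not reproduced here.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# Placeholder similarity ratings for 6 vehicles (symmetric, self-similarity 1).
rng = np.random.default_rng(0)
sim = rng.uniform(0.0, 1.0, (6, 6))
sim = (sim + sim.T) / 2.0
np.fill_diagonal(sim, 1.0)

# Embed the dissimilarities in 2D, then call nearby pairs predicted confusions.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(1.0 - sim)

dist = squareform(pdist(coords))
cutoff = np.percentile(dist[np.triu_indices(6, k=1)], 25)  # arbitrary cutoff
pairs = [(i, j) for i in range(6) for j in range(i + 1, 6) if dist[i, j] < cutoff]
print("predicted confusion pairs:", pairs)
```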

  9. Modeling the viscosity of polydisperse suspensions: Improvements in prediction of limiting behavior

    NASA Astrophysics Data System (ADS)

    Mwasame, Paul M.; Wagner, Norman J.; Beris, Antony N.

    2016-06-01

    The present study develops a fully consistent extension of the approach pioneered by Farris ["Prediction of the viscosity of multimodal suspensions from unimodal viscosity data," Trans. Soc. Rheol. 12, 281-301 (1968)] to describe the viscosity of polydisperse suspensions significantly improving upon our previous model [P. M. Mwasame, N. J. Wagner, and A. N. Beris, "Modeling the effects of polydispersity on the viscosity of noncolloidal hard sphere suspensions," J. Rheol. 60, 225-240 (2016)]. The new model captures the Farris limit of large size differences between consecutive particle size classes in a suspension. Moreover, the new model includes a further generalization that enables its application to real, complex suspensions that deviate from ideal non-colloidal suspension behavior. The capability of the new model to predict the viscosity of complex suspensions is illustrated by comparison against experimental data.
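
    For context, the Farris limit that the extended model is built to capture can be stated compactly. The following is our reading of the multimodal rule, in which each size class behaves as if suspended in the effective medium formed by the liquid and all finer classes; it is not a formula quoted from the paper.

```latex
% Farris limit for N well-separated size classes: relative viscosities multiply.
% H(\phi) is the unimodal relative-viscosity function and \phi_i the volume
% fraction of class i in the suspension of medium plus all finer classes.
\eta_r \;=\; \prod_{i=1}^{N} H(\phi_i),
\qquad
\phi_i \;=\; \frac{V_i}{V_{\mathrm{medium}} + \sum_{j \le i} V_j}.
```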

  10. Higher Self-Control Capacity Predicts Lower Anxiety-Impaired Cognition during Math Examinations.

    PubMed

    Bertrams, Alex; Baumeister, Roy F; Englert, Chris

    2016-01-01

    We assumed that self-control capacity, self-efficacy, and self-esteem would enable students to keep attentional control during tests. Therefore, we hypothesized that the three personality traits would be negatively related to anxiety-impaired cognition during math examinations. Secondary school students (N = 158) completed measures of self-control capacity, self-efficacy, and self-esteem at the beginning of the school year. Five months later, anxiety-impaired cognition during math examinations was assessed. Higher self-control capacity, but neither self-efficacy nor self-esteem, predicted lower anxiety-impaired cognition 5 months later, over and above baseline anxiety-impaired cognition. Moreover, self-control capacity was indirectly related to math grades via anxiety-impaired cognition. The findings suggest that improving self-control capacity may enable students to deal with anxiety-related problems during school tests.

  11. Higher Self-Control Capacity Predicts Lower Anxiety-Impaired Cognition during Math Examinations

    PubMed Central

    Bertrams, Alex; Baumeister, Roy F.; Englert, Chris

    2016-01-01

    We assumed that self-control capacity, self-efficacy, and self-esteem would enable students to keep attentional control during tests. Therefore, we hypothesized that the three personality traits would be negatively related to anxiety-impaired cognition during math examinations. Secondary school students (N = 158) completed measures of self-control capacity, self-efficacy, and self-esteem at the beginning of the school year. Five months later, anxiety-impaired cognition during math examinations was assessed. Higher self-control capacity, but neither self-efficacy nor self-esteem, predicted lower anxiety-impaired cognition 5 months later, over and above baseline anxiety-impaired cognition. Moreover, self-control capacity was indirectly related to math grades via anxiety-impaired cognition. The findings suggest that improving self-control capacity may enable students to deal with anxiety-related problems during school tests. PMID:27065013

  12. Improving Prediction Accuracy of “Central Line-Associated Blood Stream Infections” Using Data Mining Models

    PubMed Central

    Noaman, Amin Y.; Jamjoom, Arwa; Al-Abdullah, Nabeela; Nasir, Mahreen; Ali, Anser G.

    2017-01-01

    Prediction of nosocomial infections among patients is an important part of clinical surveillance programs, enabling the responsible personnel to take preventive actions in advance. Designing a clinical surveillance program capable of predicting nosocomial infections is a challenging task for several reasons, including the high dimensionality of medical data, heterogeneous data representation, and the special knowledge required to extract patterns for prediction. In this paper, we present details of six data mining methods implemented using the cross-industry standard process for data mining to predict central line-associated blood stream infections. For our study, we selected datasets of healthcare-associated infections from the US National Healthcare Safety Network and consumer survey data from the Hospital Consumer Assessment of Healthcare Providers and Systems. Our experiments show that central line-associated blood stream infections (CLABSIs) can be successfully predicted using the AdaBoost method with an accuracy of up to 89.7%. This will help in implementing effective clinical surveillance programs for infection control and in improving the detection accuracy for CLABSIs, which in turn reduces the cost of patients' hospital stays and helps maintain patient safety. PMID:29085836
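
    A minimal scikit-learn sketch of the best-performing configuration reported above (AdaBoost); the NHSN and HCAHPS features are not available here, so a synthetic imbalanced dataset stands in.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the CLABSI dataset: imbalanced binary outcome.
X, y = make_classification(n_samples=500, n_features=20,
                           weights=[0.8, 0.2], random_state=0)

clf = AdaBoostClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```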

  13. Towards programmable plant genetic circuits.

    PubMed

    Medford, June I; Prasad, Ashok

    2016-07-01

    Synthetic biology enables the construction of genetic circuits with predictable gene functions in plants. Detailed quantitative descriptions of the transfer function or input-output function for genetic parts (promoters, 5' and 3' untranslated regions, etc.) are collected. These data are then used in computational simulations to determine their robustness and desired properties, thereby enabling the best components to be selected for experimental testing in plants. In addition, the process forms an iterative workflow which allows vast improvement to validated elements with sub-optimal function. These processes enable computational functions such as digital logic in living plants and follow the pathway of technological advances which took us from vacuum tubes to cell phones. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.

  14. Departure Queue Prediction for Strategic and Tactical Surface Scheduler Integration

    NASA Technical Reports Server (NTRS)

    Zelinski, Shannon; Windhorst, Robert

    2016-01-01

    A departure metering concept to be demonstrated at Charlotte Douglas International Airport (CLT) will integrate strategic and tactical surface scheduling components to enable the collaborative decision-making and improved-efficiency benefits that these two methods of scheduling respectively provide. This study analyzes the effect of tactical scheduling on strategic scheduler predictability. Strategic queue predictions and target gate pushback times to achieve a desired queue length are compared between fast-time simulations of CLT surface operations with and without tactical scheduling. The use of variable departure rates as a strategic scheduler input was shown to substantially improve queue predictions over static departure rates. With target queue length calibration, the strategic scheduler can be tuned to produce average delays within one minute of the tactical scheduler. However, root-mean-square differences between strategic and tactical delays were between 12 and 15 minutes due to the different methods the strategic and tactical schedulers use to predict takeoff times and generate gate pushback clearances. This demonstrates how difficult it is for the strategic scheduler to predict tactically assigned gate delays on an individual-flight basis, as the tactical scheduler adjusts the departure sequence to accommodate arrival interactions. Strategic/tactical scheduler compatibility may be improved by providing more arrival information to the strategic scheduler and stabilizing tactical scheduler changes to the runway sequence in response to arrivals.

  15. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    NASA Astrophysics Data System (ADS)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (a few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and they challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models, but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces a predictive metrology approach that has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict the dimensions of EUV resist patterns down to 18 nm half pitch from resist shrinkage patterns; these patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As the wafer goes through various processes, its associated cost multiplies, and it may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can therefore be very valuable in enabling timely, actionable decisions such as rework, scrap, or feeding predicted information (or information derived from it) forward or backward to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.

  16. A novel metric for quantification of homogeneous and heterogeneous tumors in PET for enhanced clinical outcome prediction

    NASA Astrophysics Data System (ADS)

    Rahmim, Arman; Schmidtlein, C. Ross; Jackson, Andrew; Sheikhbahaei, Sara; Marcus, Charles; Ashrafinia, Saeed; Soltani, Madjid; Subramaniam, Rathan M.

    2016-01-01

    Oncologic PET images provide valuable information that can enable enhanced prognosis of disease. Nonetheless, such information is simplified significantly in routine clinical assessment to meet workflow constraints. Examples of typical FDG PET metrics include: (i) SUVmax, (ii) total lesion glycolysis (TLG), and (iii) metabolic tumor volume (MTV). We have derived and implemented a novel metric for tumor quantification, inspired in essence by a model of generalized equivalent uniform dose as used in radiation therapy. The proposed metric, denoted generalized effective total uptake (gETU), is attractive as it encompasses the abovementioned commonly invoked metrics, and generalizes them, for both homogeneous and heterogeneous tumors, using a single parameter a. We evaluated this new metric for improved overall survival (OS) prediction on two different baseline FDG PET/CT datasets: (a) 113 patients with squamous cell cancer of the oropharynx, and (b) 72 patients with locally advanced pancreatic adenocarcinoma. Kaplan-Meier survival analysis was performed, where the subjects were subdivided into two groups using the median threshold, from which the hazard ratios (HR) were computed in Cox proportional hazards regression. For the oropharyngeal cancer dataset, MTV, TLG, SUVmax, SUVmean and SUVpeak produced HR values of 1.86, 3.02, 1.34, 1.36 and 1.62, while the proposed gETU metric for a = 0.25 (greater emphasis on volume information) enabled significantly enhanced OS prediction with HR = 3.94. For the pancreatic cancer dataset, MTV, TLG, SUVmax, SUVmean and SUVpeak resulted in HR values of 1.05, 1.25, 1.42, 1.45 and 1.52, while gETU at a = 3.2 (greater emphasis on SUV information) arrived at an improved HR value of 1.61. Overall, the proposed methodology allows placement of differing degrees of emphasis on tumor volume versus uptake for different types of tumors to enable enhanced clinical outcome prediction.
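
    A short sketch of the metric as we read it: a generalized (power) mean of voxel SUVs scaled by the metabolic tumor volume, so that a = 1 recovers TLG and large a weights toward SUVmax. The exact normalization used in the published definition may differ.

```python
import numpy as np

def getu(suv_voxels, voxel_volume_ml, a):
    """Generalized effective total uptake (our illustrative reading):
    MTV times the power mean of voxel SUVs. a = 1 gives MTV * SUVmean = TLG;
    large a tends toward MTV * SUVmax (a must be nonzero)."""
    suv = np.asarray(suv_voxels, dtype=float)
    mtv = suv.size * voxel_volume_ml            # metabolic tumor volume (ml)
    return mtv * np.mean(suv ** a) ** (1.0 / a)

suv = np.random.default_rng(0).gamma(4.0, 1.5, 500)   # placeholder voxel SUVs
print(getu(suv, voxel_volume_ml=0.064, a=0.25))       # volume-weighted
print(getu(suv, voxel_volume_ml=0.064, a=3.2))        # uptake-weighted
```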

  17. The Nexus of Place and Finance in the Analysis of Educational Attainment: A Spatial Econometric Approach

    ERIC Educational Resources Information Center

    Sutton, Farah

    2012-01-01

    This study examines the spatial distribution of educational attainment and then builds upon current predictive frameworks for understanding patterns of educational attainment by applying a spatial econometric method of analysis. The research from this study enables a new approach to the policy discussion on how to improve educational attainment…

  18. Probabilities and Predictions: Modeling the Development of Scientific Problem-Solving Skills

    ERIC Educational Resources Information Center

    Stevens, Ron; Johnson, David F.; Soller, Amy

    2005-01-01

    The IMMEX (Interactive Multi-Media Exercises) Web-based problem set platform enables the online delivery of complex, multimedia simulations, the rapid collection of student performance data, and has already been used in several genetic simulations. The next step is the use of these data to understand and improve student learning in a formative…

  19. Fractal structure enables temporal prediction in music.

    PubMed

    Rankin, Summer K; Fink, Philip W; Large, Edward W

    2014-10-01

    1/f serial correlations and statistical self-similarity (fractal structure) have been measured in various dimensions of musical compositions. Musical performances also display 1/f properties in expressive tempo fluctuations, and listeners predict tempo changes when synchronizing. Here the authors show that 1/f structure is sufficient for listeners to predict the onset times of upcoming musical events. These results reveal what information listeners use to anticipate events in complex, non-isochronous acoustic rhythms, and they call for innovative models of temporal synchronization. This finding could improve therapies for Parkinson's and related disorders and inform a deeper understanding of how endogenous neural rhythms anticipate events in complex, temporally structured communication signals.
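
    To make the stimulus structure concrete, the sketch below synthesizes a 1/f (fractal) series by spectral shaping and applies it as expressive tempo fluctuation around an isochronous beat. The synthesis method and all parameters are illustrative, not taken from the study.

```python
import numpy as np

def one_over_f(n, beta=1.0, seed=0):
    """Synthesize a 1/f^beta series by spectral shaping: random Fourier
    phases with amplitudes proportional to f^(-beta/2), then invert."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)
    amps = np.zeros_like(freqs)
    amps[1:] = freqs[1:] ** (-beta / 2.0)
    spectrum = amps * np.exp(1j * rng.uniform(0.0, 2 * np.pi, freqs.size))
    x = np.fft.irfft(spectrum, n=n)
    return (x - x.mean()) / x.std()

# Apply as expressive tempo fluctuation around a 500 ms beat (5% depth).
iois = 0.5 * (1.0 + 0.05 * one_over_f(256))   # inter-onset intervals, seconds
onsets = np.cumsum(iois)                      # event times a listener predicts
```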

  20. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2016-01-01

    Accurate taxi time prediction is required to enable efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently, NASA and American Airlines are jointly developing a decision-support tool called the Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers in making gate pushback decisions and improves the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated with actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance measurement metrics. Based on the taxi time prediction results, we also discuss how prediction accuracy is affected by the operational complexity at this airport and how the fast-time simulation model can be improved before implementing it with an airport scheduling algorithm in a real-time environment.

  1. Development of a Higher Fidelity Model for the Cascade Distillation Subsystem (CDS)

    NASA Technical Reports Server (NTRS)

    Perry, Bruce; Anderson, Molly

    2014-01-01

    Significant improvements have been made to the ACM model of the CDS, enabling accurate predictions of dynamic operations with fewer assumptions. The model has been utilized to predict how CDS performance would be impacted by changing operating parameters, revealing performance trade-offs and possibilities for improvement. CDS efficiency is driven by the THP coefficient of performance, which in turn is dependent on heat transfer within the system. Based on the remaining limitations of the simulation, priorities for further model development include:
    - Relaxing the assumption of total condensation
    - Incorporating dynamic simulation capability for the buildup of dissolved inert gases in condensers
    - Examining CDS operation with more complex feeds
    - Extending heat transfer analysis to all surfaces

  2. Rational Design of Mouse Models for Cancer Research.

    PubMed

    Landgraf, Marietta; McGovern, Jacqui A; Friedl, Peter; Hutmacher, Dietmar W

    2018-03-01

    The laboratory mouse is widely considered as a valid and affordable model organism to study human disease. Attempts to improve the relevance of murine models for the investigation of human pathologies led to the development of various genetically engineered, xenograft and humanized mouse models. Nevertheless, most preclinical studies in mice suffer from insufficient predictive value when compared with cancer biology and therapy response of human patients. We propose an innovative strategy to improve the predictive power of preclinical cancer models. Combining (i) genomic, tissue engineering and regenerative medicine approaches for rational design of mouse models with (ii) rapid prototyping and computational benchmarking against human clinical data will enable fast and nonbiased validation of newly generated models. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. The role of thermal and lubricant boundary layers in the transient thermal analysis of spur gears

    NASA Technical Reports Server (NTRS)

    El-Bayoumy, L. E.; Akin, L. S.; Townsend, D. P.; Choy, F. C.

    1989-01-01

    An improved convection heat-transfer model has been developed for the prediction of the transient tooth surface temperature of spur gears. The dissipative quality of the lubricating fluid is shown to be limited to the capacity extent of the thermal boundary layer. This phenomenon can be of significance in the determination of the thermal limit of gears accelerating to the point where gear scoring occurs. Steady-state temperature prediction is improved considerably through the use of a variable integration time step that substantially reduces computer time. Computer-generated plots of temperature contours enable the user to animate the propagation of the thermal wave as the gears come into and out of contact, thus contributing to better understanding of this complex problem. This model has a much better capability at predicting gear-tooth temperatures than previous models.

  4. Atmospheric radiance interpolation for the modeling of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Fuehrer, Perry; Healey, Glenn; Rauch, Brian; Slater, David; Ratkowski, Anthony

    2008-04-01

    The calibration of data from hyperspectral sensors to spectral radiance enables the use of physical models to predict measured spectra. Since environmental conditions are often unknown, material detection algorithms have emerged that utilize predicted spectra over ranges of environmental conditions. The predicted spectra are typically generated by a radiative transfer (RT) code such as MODTRAN™. Such techniques require the specification of a set of environmental conditions, which is particularly challenging in the LWIR, where temperature and atmospheric constituent profiles are required as inputs to the RT codes. We have developed an automated method for generating environmental conditions to obtain a desired sampling of spectra in the sensor radiance domain. Because sensor radiance spectra depend nonlinearly on the environmental parameters, our method eliminates the problems usually encountered when model conditions are specified by a uniform sampling of those parameters. It uses an initial set of radiance vectors, concatenated over a set of conditions, to define the mapping from environmental conditions to sensor spectral radiance. This approach enables a given number of model conditions to span the space of desired radiance spectra and improves both the accuracy and efficiency of detection algorithms that rely upon predicted spectra.

  5. Speckle lithography for fabricating Gaussian, quasi-random 2D structures and black silicon structures.

    PubMed

    Bingi, Jayachandra; Murukeshan, Vadakke Matham

    2015-12-18

    A laser speckle pattern is a granular structure formed by the random interference of coherent wavelets, and it is generally considered noise in optical systems, including photolithography. Contrary to this view, in this paper we use the speckle pattern to generate predictable, controlled Gaussian random structures and quasi-random structures photolithographically. The random structures made using this proposed speckle lithography technique are quantified based on speckle statistics, the radial distribution function (RDF) and the fast Fourier transform (FFT). Control over the speckle size, density and speckle clustering facilitates the successful fabrication of black silicon with different surface structures. The controllability and tunability of the randomness make this technique a robust method for fabricating predictable 2D Gaussian random structures and black silicon structures. These structures can enhance light trapping significantly in solar cells and hence enable improved energy harvesting. Further, this technique can enable efficient fabrication of disordered photonic structures and random-media-based devices.

  6. Predicting hospital visits from geo-tagged Internet search logs.

    PubMed

    Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H

    2016-01-01

    The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare-seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user's future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources: a crucial prerequisite for securing healthcare access for everyone in the days to come.

  7. Recurrent Neural Network Applications for Astronomical Time Series

    NASA Astrophysics Data System (ADS)

    Protopapas, Pavlos

    2017-06-01

    The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize to irregular time series. In this talk, I will describe two Recurrent Neural Network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), for predicting irregular time series. Feature engineering along with non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations, using the error estimates from astronomical light curves. In addition, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to correctly set hyperparameters for a stable and performant solution: we circumvent the difficulty of manual tuning by optimizing ESN hyperparameters using Bayesian optimization with Gaussian Process priors. This automates the tuning procedure, enabling users to employ the power of RNNs without needing an in-depth understanding of the tuning procedure.
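
    A bare-bones echo state network in numpy, to make the ESN half of the talk concrete. Reservoir size, spectral radius, and the ridge penalty are exactly the kinds of hyperparameters the Bayesian optimization described above would tune; the values here are placeholders, and irregular sampling is not handled.

```python
import numpy as np

class ESN:
    """Minimal echo state network for one-step-ahead prediction (sketch)."""

    def __init__(self, n_res=200, spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.uniform(-0.5, 0.5, n_res)
        w = rng.uniform(-0.5, 0.5, (n_res, n_res))
        self.w = w * (spectral_radius / np.max(np.abs(np.linalg.eigvals(w))))
        self.w_out = None

    def _states(self, u):
        x = np.zeros(len(self.w))
        states = []
        for v in u:                       # drive the fixed random reservoir
            x = np.tanh(self.w_in * v + self.w @ x)
            states.append(x.copy())
        return np.array(states)

    def fit(self, u, y, ridge=1e-6):      # ridge-regress the readout only
        s = self._states(u)
        self.w_out = np.linalg.solve(s.T @ s + ridge * np.eye(s.shape[1]), s.T @ y)

    def predict(self, u):
        return self._states(u) @ self.w_out

# Toy usage: predict x(t+1) from x(t) for a noisy sine series.
t = np.linspace(0, 60, 1500)
x = np.sin(t) + 0.05 * np.random.default_rng(1).normal(size=t.size)
esn = ESN()
esn.fit(x[:-1], x[1:])
print(np.mean((esn.predict(x[:-1]) - x[1:]) ** 2))  # training MSE
```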

  8. Improved Performance and Safety for High Energy Batteries Through Use of Hazard Anticipation and Capacity Prediction

    NASA Technical Reports Server (NTRS)

    Atwater, Terrill

    1993-01-01

    Prediction of the capacity remaining in used high-rate, high-energy batteries provides important information to the user. Knowledge of the capacity remaining in used batteries results in better utilization, which translates into improved readiness and cost savings through complete, efficient use. High-rate batteries, due to their chemical nature, are highly sensitive to misuse (i.e., over-discharge or very high-rate discharge). Battery failure due to misuse or manufacturing defects could be disastrous. Since high-rate, high-energy batteries are expensive and energetic, a reliable method of predicting both failures and remaining energy has been actively sought. Due to concerns over safety, the behavior of lithium/sulphur dioxide cells at different temperatures and current drains was examined. The main thrust of this effort was to determine failure conditions for incorporation into hazard anticipation circuitry. In addition, capacity prediction formulas have been developed from test data. A process that performs continuous, real-time hazard anticipation and capacity prediction was developed. The introduction of this process into microchip technology will enable the production of reliable, safe, and efficient high-energy batteries.

  9. Analysis of the predictive qualities of betting odds and FIFA World Ranking: evidence from the 2006, 2010 and 2014 Football World Cups.

    PubMed

    Wunderlich, Fabian; Memmert, Daniel

    2016-12-01

    The present study investigates a new framework that enables more detailed model-based predictions to be derived from ranking systems. These were compared to predictions from the bet market using data from the 2006, 2010, and 2014 World Cups. The results revealed that the FIFA World Ranking has essentially improved its predictive qualities compared to the bet market since its mode of calculation was changed in 2006. While both predictors were useful for obtaining accurate predictions in general, the world ranking was able to significantly outperform the bet market for the 2014 World Cup and when the data from the 2010 and 2014 World Cups were pooled. Our new framework can be extended in future research to more detailed prediction tasks (i.e., predicting the final score of a match or the tournament progress of a team).

  10. Improved Slip Casting Of Ceramic Models

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.; Vasquez, Peter; Hicks, Lana P.

    1994-01-01

    Improved technique of investment slip casting developed for making precise ceramic wind-tunnel models. Needed in wind-tunnel experiments to verify predictions of aerothermodynamical computer codes. Ceramic materials used because of their low heat conductivities and ability to survive high temperatures. Present improved slip-casting technique enables casting of highly detailed models from aqueous or nonaqueous solutions. Wet shell molds peeled off models to ensure precise and undamaged details. Used at NASA Langley Research Center to form superconducting ceramic components from nonaqueous slip solutions. Technique has many more applications when ceramic materials developed further for such high-strength/ temperature components as engine parts.

  11. Improving lung cancer prognosis assessment by incorporating synthetic minority oversampling technique and score fusion method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Shiju; Qian, Wei; Guan, Yubao

    2016-06-15

    Purpose: This study aims to investigate the potential to improve lung cancer recurrence risk prediction performance for stage I NSCLC patients by integrating oversampling, feature selection, and score fusion techniques, and to develop an optimal prediction model. Methods: A dataset involving 94 early-stage lung cancer patients was retrospectively assembled, which includes CT images, nine clinical and biological (CB) markers, and the outcome of 3-yr disease-free survival (DFS) after surgery. Among the 94 patients, 74 remained disease-free and 20 had cancer recurrence. Applying a computer-aided detection scheme, tumors were segmented from the CT images and 35 quantitative image (QI) features were initially computed. Two normalized Gaussian radial basis function network (RBFN) based classifiers were built based on the QI features and CB markers separately. To improve prediction performance, the authors applied a synthetic minority oversampling technique (SMOTE) and a BestFirst-based feature selection method to optimize the classifiers, and also tested fusion methods to combine the QI- and CB-based prediction results. Results: Using a leave-one-case-out cross-validation method, the computed areas under the receiver operating characteristic curve (AUCs) were 0.716 ± 0.071 and 0.642 ± 0.061 when using the QI- and CB-based classifiers, respectively. By fusing the scores generated by the two classifiers, the AUC significantly increased to 0.859 ± 0.052 (p < 0.05), with an overall prediction accuracy of 89.4%. Conclusions: This study demonstrated the feasibility of improving prediction performance by integrating SMOTE, feature selection, and score fusion techniques. Combining QI features and CB markers and performing SMOTE prior to feature selection in classifier training enabled the RBFN-based classifier to yield improved prediction accuracy.
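
    A schematic of the pipeline in scikit-learn terms, with stand-ins where the paper's components are not available: SelectKBest replaces the BestFirst search, an RBF-kernel SVM replaces the RBFN classifiers, and synthetic data replaces the 94-patient dataset. Score fusion is a simple average of the two classifiers' probabilities.

```python
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in: 94 cases, 35 image (QI) + 9 clinical/biological (CB)
# features, ~21% recurrence rate (mirroring the study's 20/94 split).
X, y = make_classification(n_samples=94, n_features=44, n_informative=10,
                           weights=[0.79, 0.21], random_state=0)
X_qi, X_cb = X[:, :35], X[:, 35:]

Xq_tr, Xq_te, Xc_tr, Xc_te, y_tr, y_te = train_test_split(
    X_qi, X_cb, y, test_size=0.3, stratify=y, random_state=0)

def block_scores(X_tr, X_te, k):
    """SMOTE -> univariate selection (stand-in for BestFirst) -> RBF SVM."""
    X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
    sel = SelectKBest(f_classif, k=k).fit(X_bal, y_bal)
    clf = SVC(probability=True, random_state=0).fit(sel.transform(X_bal), y_bal)
    return clf.predict_proba(sel.transform(X_te))[:, 1]

fused = 0.5 * block_scores(Xq_tr, Xq_te, k=10) + 0.5 * block_scores(Xc_tr, Xc_te, k=5)
print("fused AUC:", roc_auc_score(y_te, fused))
```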

  12. Genomic selection models double the accuracy of predicted breeding values for bacterial cold water disease resistance compared to a traditional pedigree-based model in rainbow trout aquaculture

    USDA-ARS?s Scientific Manuscript database

    Previously we have shown that bacterial cold water disease (BCWD) resistance in rainbow trout can be improved using traditional family-based selection, but progress has been limited to exploiting only between-family genetic variation. Genomic selection (GS) is a new alternative enabling exploitation...

  13. Design and Implementation of an Intelligent System to Predict the Student Graduation AGPA

    ERIC Educational Resources Information Center

    Ismail, Sameh; Abdulla, Shubair

    2015-01-01

    Since the Accumulated Grade-Point Average (AGPA) is crucial in the professional life of students, it is an interesting and challenging problem to create profiles for those students who are likely to graduate with a low AGPA. Identifying this kind of student accurately will enable the university staff to help them improve their ability by providing them…

  14. Inlet Flow Control and Prediction Technologies for Embedded Propulsion Systems

    NASA Technical Reports Server (NTRS)

    McMillan, Michelle L.; Mackie, Scott A.; Gissen, Abe; Vukasinovic, Bojan; Lakebrink, Matthew T.; Glezer, Ari; Mani, Mori; Mace, James L.

    2011-01-01

    Fail-safe, hybrid flow control (HFC) is a promising technology for meeting high-speed cruise efficiency, low-noise signature, and reduced fuel-burn goals for future Hybrid-Wing-Body (HWB) aircraft with embedded engines. This report details the development of HFC technology that enables improved inlet performance in HWB vehicles with highly integrated inlets and embedded engines without adversely affecting vehicle performance. In addition, new test techniques for evaluating Boundary-Layer-Ingesting (BLI)-inlet flow-control technologies developed and demonstrated through this program are documented, including the ability to generate a BLI-like inlet-entrance flow in a direct-connect wind-tunnel facility, as well as the use of D-optimal, statistically designed experiments to optimize test efficiency and enable interpretation of results. Validated improvements in numerical analysis tools and methods accomplished through this program are also documented, including Reynolds-Averaged Navier-Stokes CFD simulations of steady-state flow physics for baseline BLI-inlet diffuser flow, as well as that created by flow-control devices. Finally, numerical methods were employed in a ground-breaking attempt to directly simulate dynamic distortion. The advances in inlet technologies and prediction tools will help to meet and exceed "N+2" project goals for future HWB aircraft.

  15. Understanding socio-economic impacts of geohazards aided by cyber-enabled systems

    NASA Astrophysics Data System (ADS)

    Klose, C. D.; Webersik, C.

    2008-12-01

    Due to an increase in the number of geohazards worldwide, not only are impoverished regions in less developed countries such as Haiti vulnerable to risk, but so too are low-income regions in industrialized countries such as the USA. This was exemplified once again by Hurricanes Gustav, Hanna and Ike and their impact on the Caribbean countries during the summer of 2008. To date, extensive research has been conducted to improve the monitoring of coupled human-nature systems. However, there is little emphasis on improving and developing methodologies to (a) interpret multi-dimensional and complex data and (b) validate prediction and modeling results. This presentation tries to motivate more research initiatives that address the aforementioned issues, bringing together two academic disciplines, earth and social sciences, to research the relationship between natural and socio-economic processes. Results are presented where cyber-enabled methods based on artificial intelligence are applied to different geohazards and regions of the world. They include (1) modeling of public health risks associated with volcanic gas hazards, (2) prediction and validation of potential areas of mining-triggered earthquakes, and (3) modeling of socio-economic risks associated with tropical storms in Haiti and the Dominican Republic.

  16. Bigger data, collaborative tools and the future of predictive drug discovery

    NASA Astrophysics Data System (ADS)

    Ekins, Sean; Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.

    2014-10-01

    Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools, either as free websites or as software-as-a-service commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers, either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as they accumulate from high-throughput screening, and how they enable the user to draw insights, make predictions and move projects forward. We discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use examples from our own research on neglected diseases, collaborations, mobile apps and algorithm development to illustrate these ideas.

  17. Real-time assessments of water quality: expanding nowcasting throughout the Great Lakes

    USGS Publications Warehouse

    ,

    2013-01-01

    Nowcasts are systems that inform the public of current bacterial water-quality conditions at beaches on the basis of predictive models. During 2010–12, the U.S. Geological Survey (USGS) worked with 23 local and State agencies to improve existing operational beach nowcast systems at 4 beaches and expand the use of predictive models in nowcasts at an additional 45 beaches throughout the Great Lakes. The predictive models were specific to each beach, and the best model for each beach was based on a unique combination of environmental and water-quality explanatory variables. The variables used most often in models to predict Escherichia coli (E. coli) concentrations or the probability of exceeding a State recreational water-quality standard included turbidity, day of the year, wave height, wind direction and speed, antecedent rainfall for various time periods, and change in lake level over 24 hours. During validation of 42 beach models in 2012, the models performed better than the current method of assessing recreational water quality (the previous day's E. coli concentration). The USGS will continue to work with local agencies to improve nowcast predictions, enable technology transfer of predictive model development procedures, and implement more operational systems during 2013 and beyond.
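
    An illustrative nowcast-style model: logistic regression for the probability that E. coli exceeds the recreational standard, using the kinds of explanatory variables named above. The data and coefficients are synthetic; each real beach model uses its own variable combination.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic data for the kinds of explanatory variables named above.
rng = np.random.default_rng(1)
n = 300
X = np.column_stack([
    rng.gamma(2.0, 10.0, n),      # turbidity (NTU)
    rng.uniform(0.0, 1.5, n),     # wave height (m)
    rng.gamma(1.5, 8.0, n),       # 48-h antecedent rainfall (mm)
    rng.normal(0.0, 0.1, n),      # 24-h lake-level change (m)
])
logit = -3.0 + 0.04 * X[:, 0] + 1.2 * X[:, 1] + 0.05 * X[:, 2]
y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))  # exceedance events

model = LogisticRegression(max_iter=1000).fit(X, y)
print("P(exceedance) today:", model.predict_proba(X[:1])[0, 1])
```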

  18. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid, high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  19. Landing Gear Noise Prediction and Analysis for Tube-and-Wing and Hybrid-Wing-Body Aircraft

    NASA Technical Reports Server (NTRS)

    Guo, Yueping; Burley, Casey L.; Thomas, Russell H.

    2016-01-01

    Improvements and extensions to landing gear noise prediction methods are developed. New features include installation effects such as reflection from the aircraft, gear truck angle effect, local flow calculation at the landing gear locations, gear size effect, and directivity for various gear designs. These new features have not only significantly improved the accuracy and robustness of the prediction tools, but also have enabled applications to unconventional aircraft designs and installations. Systematic validations of the improved prediction capability are then presented, including parametric validations in functional trends as well as validations in absolute amplitudes, covering a wide variety of landing gear designs, sizes, and testing conditions. The new method is then applied to selected concept aircraft configurations in the portfolio of the NASA Environmentally Responsible Aviation Project envisioned for the timeframe of 2025. The landing gear noise levels are on the order of 2 to 4 dB higher than previously reported predictions due to increased fidelity in accounting for installation effects and gear design details. With the new method, it is now possible to reveal and assess the unique noise characteristics of landing gear systems for each type of aircraft. To address the inevitable uncertainties in predictions of landing gear noise models for future aircraft, an uncertainty analysis is given, using the method of Monte Carlo simulation. The standard deviation of the uncertainty in predicting the absolute level of landing gear noise is quantified and determined to be 1.4 EPNL dB.

  20. The circadian profile of epilepsy improves seizure forecasting.

    PubMed

    Karoly, Philippa J; Ung, Hoameng; Grayden, David B; Kuhlmann, Levin; Leyde, Kent; Cook, Mark J; Freestone, Dean R

    2017-08-01

    It is now established that epilepsy is characterized by periodic dynamics that increase seizure likelihood at certain times of day, and which are highly patient-specific. However, these dynamics are not typically incorporated into seizure prediction algorithms due to the difficulty of estimating patient-specific rhythms from relatively short-term or unreliable data sources. This work outlines a novel framework to develop and assess seizure forecasts, and demonstrates that the predictive power of forecasting models is improved by circadian information. The analyses used long-term, continuous electrocorticography from nine subjects, recorded for an average of 320 days each. We used a large amount of out-of-sample data (a total of 900 days for algorithm training, and 2879 days for testing), enabling the most extensive post hoc investigation into seizure forecasting. We compared the results of an electrocorticography-based logistic regression model, a circadian probability, and a combined electrocorticography and circadian model. For all subjects, clinically relevant seizure prediction results were significant, and the addition of circadian information (combined model) maximized performance across a range of outcome measures. These results represent a proof-of-concept for implementing a circadian forecasting framework, and provide insight into new approaches for improving seizure prediction algorithms. The circadian framework adds very little computational complexity to existing prediction algorithms, and can be implemented using current-generation implant devices, or even non-invasively via surface electrodes using a wearable application. The ability to improve seizure prediction algorithms through straightforward, patient-specific modifications provides promise for increased quality of life and improved safety for patients with epilepsy. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
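
    A schematic of the combined model as described: a logistic regression over electrocorticography-derived features augmented with a circadian term, here encoded as sine/cosine of the hour of day. The data, features, and effect sizes below are synthetic placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
ecog = rng.normal(size=(n, 8))                       # placeholder ECoG features
hour = rng.uniform(0.0, 24.0, n)
circ = np.column_stack([np.sin(2 * np.pi * hour / 24),
                        np.cos(2 * np.pi * hour / 24)])

# Synthetic labels with a circadian peak plus a weak ECoG effect.
logit = -4.0 + 1.5 * circ[:, 0] + 0.5 * ecog[:, 0]
y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

ecog_only = LogisticRegression(max_iter=1000).fit(ecog, y)
combined = LogisticRegression(max_iter=1000).fit(np.hstack([ecog, circ]), y)
print("ECoG-only AUC:",
      roc_auc_score(y, ecog_only.predict_proba(ecog)[:, 1]))
print("combined AUC:",
      roc_auc_score(y, combined.predict_proba(np.hstack([ecog, circ]))[:, 1]))
```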

  1. Evolving biomarkers improve prediction of long-term mortality in patients with stable coronary artery disease: the BIO-VILCAD score.

    PubMed

    Kleber, M E; Goliasch, G; Grammer, T B; Pilz, S; Tomaschitz, A; Silbernagel, G; Maurer, G; März, W; Niessner, A

    2014-08-01

    Algorithms to predict the future long-term risk of patients with stable coronary artery disease (CAD) are rare. The VIenna and Ludwigshafen CAD (VILCAD) risk score was one of the first scores specifically tailored for this clinically important patient population. The aim of this study was to refine risk prediction in stable CAD by creating a new prediction model encompassing various pathophysiological pathways. Therefore, we assessed the predictive power of 135 novel biomarkers for long-term mortality in patients with stable CAD. We included 1275 patients with stable CAD from the LUdwigshafen RIsk and Cardiovascular health study with a median follow-up of 9.8 years to investigate whether the predictive power of the VILCAD score could be improved by the addition of novel biomarkers. Additional biomarkers were selected in a bootstrapping procedure based on Cox regression to determine the most informative predictors of mortality. The final multivariable model encompassed nine clinical and biochemical markers: age, sex, left ventricular ejection fraction (LVEF), heart rate, N-terminal pro-brain natriuretic peptide, cystatin C, renin, 25OH-vitamin D3 and haemoglobin A1c. The extended VILCAD biomarker score achieved a significantly improved C-statistic (0.78 vs. 0.73; P = 0.035) and net reclassification index (14.9%; P < 0.001) compared to the original VILCAD score. Omitting LVEF, which might not be readily measurable in clinical practice, slightly reduced the accuracy of the new BIO-VILCAD score but still significantly improved risk classification (net reclassification improvement 12.5%; P < 0.001). The VILCAD biomarker score based on routine parameters complemented by novel biomarkers outperforms previous risk algorithms and allows more accurate classification of patients with stable CAD, enabling physicians to choose more personalized treatment regimens for their patients.

  2. Complexity Science Framework for Big Data: Data-enabled Science

    NASA Astrophysics Data System (ADS)

    Surjalal Sharma, A.

    2016-07-01

    The ubiquity of Big Data has stimulated the development of analytic tools to harness the potential for timely and improved modeling and prediction. While much of the data is available in near-real time and can be compiled to specify the current state of the system, the capability to make predictions is lacking. The main reason is the basic nature of Big Data: traditional techniques are challenged in their ability to cope with its velocity, volume and variability to make optimum use of the available information. Another aspect is the absence of an effective description of the time evolution or dynamics of the specific system, derived from the data. Once such dynamical models are developed, predictions can be made readily. This approach of "letting the data speak for itself" is distinct from first-principles models based on an understanding of the fundamentals of the system. The predictive capability comes from the data-derived dynamical model, with no modeling assumptions, and can address many issues such as causality and correlation. This approach provides a framework for addressing the challenges in Big Data, especially in the case of spatio-temporal time series data. The reconstruction of dynamics from time series data is based on the recognition that in most systems the different variables or degrees of freedom are coupled nonlinearly, and in the presence of dissipation the state space contracts, effectively reducing the number of variables, thus enabling a description of the system's dynamical evolution and consequently prediction of future states. The predictability is analysed from the intrinsic characteristics of the distribution functions, such as Hurst exponents and Hill estimators. In most systems the distributions have heavy tails, which imply a higher likelihood of extreme events. The characterization of the probabilities of extreme events is critical in many cases, e.g., natural hazards, for proper assessment of risk and mitigation strategies. Big Data with such new analytics can yield improved risk estimates. The challenges of scientific inference from complex and massive data are addressed by data-enabled science, also referred to as the Fourth Paradigm, after experiment, theory and simulation. An example of this approach is the modelling of dynamical and statistical features of natural systems without assumptions of specific processes. An effective use of the techniques of complexity science to yield the inherent features of a system from extensive observational data and large-scale numerical simulations is evident in the case of Earth's magnetosphere. The multiscale nature of the magnetosphere makes the numerical simulations a challenge, requiring very large computing resources. The reconstruction of dynamics from observational data can, however, yield the inherent characteristics using typical desktop computers. Such studies for other systems are in progress. The data-enabled approach using the framework of complexity science provides new techniques for modelling and prediction using Big Data. The studies of Earth's magnetosphere provide an example of the potential for a new approach to the development of quantitative analytic tools.
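
    Of the predictability diagnostics mentioned above, the Hurst exponent is easy to illustrate; below is a rough rescaled-range (R/S) estimator. The window sizes and the demo series are arbitrary, and serious use would need bias corrections.

```python
import numpy as np

def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
    """Rough rescaled-range (R/S) Hurst estimate: slope of log(R/S) vs log(w)."""
    x = np.asarray(x, dtype=float)
    rs_means = []
    for w in window_sizes:
        rs = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())       # cumulative deviations
            s = seg.std()
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        rs_means.append(np.mean(rs))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

# White noise gives H near 0.5; persistent series give H > 0.5.
print(hurst_rs(np.random.default_rng(1).normal(size=2048)))
```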

  3. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers

  4. On-Line Fringe Tracking and Prediction at IOTA

    NASA Technical Reports Server (NTRS)

    Wilson, Edward; Mah, Robert; Lau, Sonie (Technical Monitor)

    1999-01-01

    The Infrared/Optical Telescope Array (IOTA) is a multi-aperture Michelson interferometer located on Mt. Hopkins near Tucson, Arizona. To enable viewing of fainter targets, an on-line fringe tracking system is presently under development at NASA Ames Research Center. The system has been developed off-line using actual data from IOTA and is presently undergoing on-line implementation at IOTA. The system has two parts: (1) a fringe tracking system that identifies the center of a fringe packet by fitting a parametric model to the data; and (2) a fringe packet motion prediction system that uses characteristics of past fringe packets to predict fringe packet motion. Combined, this information will be used to optimize the scanning trajectory on-line, resulting in improved visibility of faint targets. Fringe packet identification is highly accurate and robust (99% of the 4000 fringe packets were identified correctly; the remaining 1% were either out of the scan range or too noisy to be seen) and is performed in 30-90 milliseconds on a Pentium II-based computer. Fringe packet prediction, currently performed using an adaptive linear predictor, delivers a 10% improvement over the baseline of predicting no motion.
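
    The abstract does not give the predictor's form beyond "adaptive linear"; a common realization is a normalized LMS filter over the last few packet-center positions, sketched below with an illustrative filter order and step size.

```python
import numpy as np

def nlms_predict(positions, order=4, mu=0.5):
    """Predict each next fringe-packet center from the previous `order`
    centers with a normalized-LMS adaptive linear filter (illustrative)."""
    w = np.zeros(order)
    preds = []
    for t in range(order, len(positions)):
        x = positions[t - order:t][::-1]              # most recent first
        y_hat = w @ x
        preds.append(y_hat)
        err = positions[t] - y_hat
        w += mu * err * x / (x @ x + 1e-12)           # normalized LMS update
    return np.array(preds)

# Toy usage: a slowly drifting packet center with scan-to-scan jitter.
rng = np.random.default_rng(2)
track = np.cumsum(rng.normal(0.0, 1.0, 200))
pred = nlms_predict(track)
print(np.mean((pred - track[4:]) ** 2))          # adaptive predictor MSE
print(np.mean((track[3:-1] - track[4:]) ** 2))   # baseline: predict no motion
```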

  5. Progressive Dictionary Learning with Hierarchical Predictive Structure for Scalable Video Coding.

    PubMed

    Dai, Wenrui; Shen, Yangmei; Xiong, Hongkai; Jiang, Xiaoqian; Zou, Junni; Taubman, David

    2017-04-12

    Dictionary learning has emerged as a promising alternative to the conventional hybrid coding framework. However, the rigid structure of sequential training and prediction degrades its performance in scalable video coding. This paper proposes a progressive dictionary learning framework with a hierarchical predictive structure for scalable video coding, especially in the low-bitrate region. For pyramidal layers, sparse representation based on a spatio-temporal dictionary is adopted to improve the coding efficiency of enhancement layers (ELs) with a guarantee of reconstruction performance. The overcomplete dictionary is trained to adaptively capture local structures along motion trajectories as well as exploit the correlations between neighboring layers of resolutions. Furthermore, progressive dictionary learning is developed to enable scalability in the temporal domain and restrict error propagation in a closed-loop predictor. Under the hierarchical predictive structure, online learning is leveraged to guarantee the training and prediction performance with an improved convergence rate. To accommodate the state-of-the-art scalable extension of H.264/AVC and the latest HEVC, standardized codec cores are utilized to encode the base and enhancement layers. Experimental results show that the proposed method outperforms the latest SHVC and HEVC simulcast over extensive test sequences with various resolutions.

  6. Population-Level Prediction of Type 2 Diabetes From Claims Data and Analysis of Risk Factors.

    PubMed

    Razavian, Narges; Blecker, Saul; Schmidt, Ann Marie; Smith-McLallen, Aaron; Nigam, Somesh; Sontag, David

    2015-12-01

    We present a new approach to population health, in which data-driven predictive models are learned for outcomes such as type 2 diabetes. Our approach enables risk assessment from readily available electronic claims data on large populations, without additional screening cost. The proposed model uncovers early- and late-stage risk factors. Using administrative claims, pharmacy records, healthcare utilization, and laboratory results of 4.1 million individuals between 2005 and 2009, an initial set of 42,000 variables was derived that together describe the full health status and history of every individual. Machine learning was then used to methodically enhance the predictive variable set and fit models predicting onset of type 2 diabetes in 2009-2011, 2010-2012, and 2011-2013. We compared the enhanced model with a parsimonious model consisting of known diabetes risk factors in a real-world environment, where missing values are common. Furthermore, we analyzed novel and known risk factors emerging from the model for different age groups at different stages before onset. The parsimonious model using 21 classic diabetes risk factors resulted in an area under the ROC curve (AUC) of 0.75 for diabetes prediction within a 2-year window following the baseline. The enhanced model increased the AUC to 0.80, with about 900 variables selected as predictive (p < 0.0001 for differences between AUCs). Similar improvements were observed for models predicting diabetes onset 1-3 years and 2-4 years after baseline. The enhanced model improved the positive predictive value by at least 50% and identified novel surrogate risk factors for type 2 diabetes, such as chronic liver disease (odds ratio [OR] 3.71), high alanine aminotransferase (OR 2.26), esophageal reflux (OR 1.85), and history of acute bronchitis (OR 1.45). Liver risk factors emerge later in the process of diabetes development than obesity-related factors such as hypertension and high hemoglobin A1c. In conclusion, population-level risk prediction for type 2 diabetes using readily available administrative data is feasible and has better prediction performance than classical diabetes risk prediction algorithms on very large populations with missing data. The new model enables intervention allocation at national scale quickly and accurately and recovers potentially novel risk factors at different stages before disease onset.

  7. Toward a Predictive Understanding of Earth’s Microbiomes to Address 21st Century Challenges

    PubMed Central

    Blaser, Martin J.; Cardon, Zoe G.; Cho, Mildred K.; Dangl, Jeffrey L.; Green, Jessica L.; Knight, Rob; Maxon, Mary E.; Northen, Trent R.; Pollard, Katherine S.

    2016-01-01

    Microorganisms have shaped our planet and its inhabitants for over 3.5 billion years. Humankind has had a profound influence on the biosphere, manifested as global climate and land use changes, and extensive urbanization in response to a growing population. The challenges we face to supply food, energy, and clean water while maintaining and improving the health of our population and ecosystems are significant. Given the extensive influence of microorganisms across our biosphere, we propose that a coordinated, cross-disciplinary effort is required to understand, predict, and harness microbiome function. From the parallelization of gene function testing to precision manipulation of genes, communities, and model ecosystems and development of novel analytical and simulation approaches, we outline strategies to move microbiome research into an era of causality. These efforts will improve prediction of ecosystem response and enable the development of new, responsible, microbiome-based solutions to significant challenges of our time. PMID:27178263

  8. Toward a Predictive Understanding of Earth's Microbiomes to Address 21st Century Challenges.

    PubMed

    Blaser, Martin J; Cardon, Zoe G; Cho, Mildred K; Dangl, Jeffrey L; Donohue, Timothy J; Green, Jessica L; Knight, Rob; Maxon, Mary E; Northen, Trent R; Pollard, Katherine S; Brodie, Eoin L

    2016-05-13

    Microorganisms have shaped our planet and its inhabitants for over 3.5 billion years. Humankind has had a profound influence on the biosphere, manifested as global climate and land use changes, and extensive urbanization in response to a growing population. The challenges we face to supply food, energy, and clean water while maintaining and improving the health of our population and ecosystems are significant. Given the extensive influence of microorganisms across our biosphere, we propose that a coordinated, cross-disciplinary effort is required to understand, predict, and harness microbiome function. From the parallelization of gene function testing to precision manipulation of genes, communities, and model ecosystems and development of novel analytical and simulation approaches, we outline strategies to move microbiome research into an era of causality. These efforts will improve prediction of ecosystem response and enable the development of new, responsible, microbiome-based solutions to significant challenges of our time. Copyright © 2016 Blaser et al.

  9. Integration of Multi-Modal Biomedical Data to Predict Cancer Grade and Patient Survival.

    PubMed

    Phan, John H; Hoffman, Ryan; Kothari, Sonal; Wu, Po-Yen; Wang, May D

    2016-02-01

    The big data era in biomedical research has resulted in large-cohort data repositories such as The Cancer Genome Atlas (TCGA). These repositories routinely contain hundreds of matched patient samples for genomic, proteomic, imaging, and clinical data modalities, enabling holistic and multi-modal integrative analysis of human disease. Using TCGA renal and ovarian cancer data, we conducted a novel investigation of multi-modal data integration by combining histopathological image and RNA-seq data. We compared the performance of two integrative prediction methods: majority vote and stacked generalization. Results indicate that integration of multiple data modalities improves prediction of cancer grade and outcome. Specifically, stacked generalization, a method that integrates multiple data modalities to produce a single prediction result, outperforms both single-data-modality prediction and majority vote. Moreover, stacked generalization reveals the contribution of each data modality (and of specific features within each data modality) to the final prediction result and may provide biological insights to explain prediction performance.
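
    A minimal sketch of the two integrative methods compared above, using scikit-learn's stock implementations of majority voting and stacked generalization. The base learners and synthetic data are placeholders; in the study each base model would be trained on one data modality.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (RandomForestClassifier, StackingClassifier,
                              VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for concatenated multi-modal features.
X, y = make_classification(n_samples=800, n_features=40, random_state=0)

base = [("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
        ("nb", GaussianNB())]

vote = VotingClassifier(estimators=base, voting="hard")     # majority vote
stack = StackingClassifier(estimators=base,                 # stacked generalization
                           final_estimator=LogisticRegression(max_iter=1000))

print("majority vote accuracy:", cross_val_score(vote, X, y, cv=5).mean())
print("stacking accuracy:     ", cross_val_score(stack, X, y, cv=5).mean())
```

    The stacker's meta-learner coefficients play the role described in the abstract: they expose how much each base model (and hence each modality) contributes to the final prediction.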

  10. Predictors and enablers of mental health nurses' family-focused practice.

    PubMed

    Grant, Anne; Reupert, Andrea; Maybery, Darryl; Goodyear, Melinda

    2018-06-27

    Family-focused practice improves outcomes for families where parents have a mental illness. However, there is limited understanding of the factors that predict and enable these practices. This study aimed to identify factors that predict and enable mental health nurses' family-focused practice. A sequential mixed-methods design was used. A total of 343 mental health nurses practicing in 12 mental health services (in acute inpatient and community settings) throughout Ireland completed the Family Focused Mental Health Practice Questionnaire, which measures family-focused behaviours and other factors that impact family-focused activities. Hierarchical multiple regression identified 14 predictors of family-focused practice. The most important predictors were nurses' skill and knowledge, their own parenting experience, and work setting (i.e., community). Fourteen nurses who achieved high scores on the questionnaire subsequently participated in semi-structured interviews to elaborate on enablers of family-focused practice. Participants described drawing on their parenting experiences to normalize parenting challenges, encouraging service users to disclose parenting concerns, and promoting trust. The opportunity to visit a service user's home allowed them to observe how the parent was coping and to forge a close relationship with them. Nurses' personal characteristics and work setting are key factors in determining family-focused practice. This study extends current research by clearly highlighting predictors of family-focused practice and reporting how various enablers promoted it. The capacity of nurses to support families has training, organizational, and policy implications within adult mental health services in Ireland and elsewhere. © 2018 Australian College of Mental Health Nurses Inc.

  11. Design of Oil-Lubricated Machine for Life and Reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.

    2007-01-01

    In the post-World War II era, the major technology drivers for improving the life, reliability, and performance of rolling-element bearings and gears have been the jet engine and the helicopter. By the late 1950s, most of the materials used for bearings and gears in the aerospace industry had been introduced into use. By the early 1960s, the life of most steels was increased over that experienced in the early 1940s, primarily by the introduction of vacuum degassing and vacuum melting processes in the late 1950s. The development of elastohydrodynamic (EHD) theory showed that most rolling bearings and gears have a thin film separating the contacting bodies during motion and it is that film which affects their lives. Computer programs modeling bearing and gear dynamics that incorporate probabilistic life prediction methods and EHD theory enable optimization of rotating machinery based on life and reliability. With improved manufacturing and processing, the potential improvement in bearing and gear life can be as much as 80 times that attainable in the early 1950s. The work presented summarizes the use of laboratory fatigue data for bearings and gears coupled with probabilistic life prediction and EHD theories to predict the life and reliability of a commercial turboprop gearbox. The resulting predictions are compared with field data.

  12. Matsu: An Elastic Cloud Connected to a SensorWeb for Disaster Response

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel

    2011-01-01

    This slide presentation reviews the use of cloud computing combined with the SensorWeb to aid disaster response planning. Included are an overview of the SensorWeb architecture, an overview of phase 1 of the EO-1 system, and the steps to transform it into an on-demand product cloud as part of the Open Cloud Consortium (OCC). The effectiveness of this system was demonstrated during the 2010 Namibia flood: by blending information from MODIS, TRMM, and river gauge data with a Google Earth view of Namibia, the SensorWeb enabled river surge predictions and could support planning for future disaster responses.

  13. Low abundance of the matrix arm of complex I in mitochondria predicts longevity in mice

    PubMed Central

    Miwa, Satomi; Jow, Howsun; Baty, Karen; Johnson, Amy; Czapiewski, Rafal; Saretzki, Gabriele; Treumann, Achim; von Zglinicki, Thomas

    2014-01-01

    Mitochondrial function is an important determinant of the ageing process; however, the mitochondrial properties that enable longevity are not well understood. Here we show that optimal assembly of mitochondrial complex I predicts longevity in mice. Using an unbiased high-coverage high-confidence approach, we demonstrate that electron transport chain proteins, especially the matrix arm subunits of complex I, are decreased in young long-living mice, which is associated with improved complex I assembly, higher complex I-linked state 3 oxygen consumption rates and decreased superoxide production, whereas the opposite is seen in old mice. Disruption of complex I assembly reduces oxidative metabolism with concomitant increase in mitochondrial superoxide production. This is rescued by knockdown of the mitochondrial chaperone, prohibitin. Disrupted complex I assembly causes premature senescence in primary cells. We propose that lower abundance of free catalytic complex I components supports complex I assembly, efficacy of substrate utilization and minimal ROS production, enabling enhanced longevity. PMID:24815183

  14. Speckle lithography for fabricating Gaussian, quasi-random 2D structures and black silicon structures

    PubMed Central

    Bingi, Jayachandra; Murukeshan, Vadakke Matham

    2015-01-01

    A laser speckle pattern is a granular structure formed by random coherent wavelet interference and is generally considered noise in optical systems, including photolithography. Contrary to this view, in this paper we use the speckle pattern to generate predictable and controlled Gaussian random structures and quasi-random structures photolithographically. The random structures made using the proposed speckle lithography technique are quantified based on speckle statistics, the radial distribution function (RDF), and the fast Fourier transform (FFT). Control over the speckle size, density, and speckle clustering facilitates the successful fabrication of black silicon with different surface structures. The controllability and tunability of the randomness make this technique a robust method for fabricating predictable 2D Gaussian random structures and black silicon structures. These structures can significantly enhance light trapping in solar cells and hence enable improved energy harvesting. Further, this technique can enable efficient fabrication of disordered photonic structures and random-media-based devices. PMID:26679513

  15. Development of the ARISTOTLE webware for cloud-based rarefied gas flow modeling

    NASA Astrophysics Data System (ADS)

    Deschenes, Timothy R.; Grot, Jonathan; Cline, Jason A.

    2016-11-01

    Rarefied gas dynamics are important for a wide variety of applications. Improving the ability of general users to predict these gas flows will enable optimization of current processes and discovery of future ones. Despite this potential, most rarefied simulation software is designed by and for experts, which has resulted in low adoption of the methods outside the immediate rarefied gas dynamics (RGD) community. This paper outlines an ongoing effort to create a rarefied gas dynamics simulation tool that can be used by a general audience. The tool leverages a direct simulation Monte Carlo (DSMC) library that is available to the entire community and a web-based simulation process that enables all users to take advantage of high-performance computing capabilities. First, the DSMC library and simulation architecture are described. Then the DSMC library is used to predict a number of representative transient gas flows relevant to the rarefied gas dynamics community. The paper closes with a summary and future directions.
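
    As a small worked example of when rarefied methods such as DSMC become necessary (separate from the ARISTOTLE tool itself), the sketch below computes the Knudsen number from the kinetic-theory mean free path; the gas properties and channel size are illustrative assumptions.

```python
import math

# Rarefaction check: mean free path from kinetic theory,
# lambda = k_B * T / (sqrt(2) * pi * d^2 * p), then Knudsen number Kn = lambda / L.
K_B = 1.380649e-23   # Boltzmann constant, J/K

def knudsen(T_kelvin, p_pascal, molecule_diameter_m, length_scale_m):
    mfp = K_B * T_kelvin / (math.sqrt(2) * math.pi
                            * molecule_diameter_m**2 * p_pascal)
    return mfp / length_scale_m

# Air-like gas (d ~ 3.7e-10 m) in a 10-micron channel at 1 kPa.
kn = knudsen(300.0, 1e3, 3.7e-10, 10e-6)
print(f"Kn = {kn:.2f}")  # Kn >~ 0.1: transitional regime, DSMC is appropriate
```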

  16. Application of TOPEX/Poseidon altimetry to ocean dynamics and geophysics

    NASA Technical Reports Server (NTRS)

    Douglas, Bruce; Cheney, R.; Miller, L.; Mcadoo, D.; Leetmaa, A.; Schopf, P.; Schwiderski, E. W.

    1991-01-01

    We will analyze the TOPEX/POSEIDON data using techniques developed for Geosat, although the more accurate TOPEX/POSEIDON data will enable a wider range of problems to be addressed. Our proposed investigations will have five distinct areas: (1) a description of global sea level variability; (2) tropical ocean dynamics; (3) coupled models for El Nino prediction; (4) structure of the lithosphere; and (5) global tide model improvement.

  17. Investigation of metabolic objectives in cultured hepatocytes.

    PubMed

    Uygun, Korkut; Matthew, Howard W T; Huang, Yinlun

    2007-06-15

    Using optimization-based methods to predict fluxes in metabolic flux balance models has been a successful approach for some microorganisms, enabling construction of in silico models and even inference of some regulatory motifs. However, this success has not been translated to mammalian cells. The lack of knowledge about metabolic objectives in mammalian cells is a major obstacle that prevents utilization of various metabolic engineering tools and methods for tissue engineering and biomedical purposes. In this work, we investigate and identify possible metabolic objectives for hepatocytes cultured in vitro. To achieve this goal, we present a special data-mining procedure for identifying metabolic objective functions in mammalian cells. This multi-level optimization-based algorithm enables identification of the major fluxes in the metabolic objective from MFA data in the absence of information about the critical active constraints of the system. Further, once the objective is determined, active flux constraints can also be identified and analyzed. This information can potentially be used in a predictive manner to improve cell culture results or clinical metabolic outcomes. Applying this method, we found that in vitro cultured hepatocytes maximize oxygen uptake, coupling of the urea and TCA cycles, and synthesis of serine and urea. Selecting these fluxes as the metabolic objective enables accurate prediction of the flux distribution in the system given a limited amount of flux data, thus presenting a workable in silico model for cultured hepatocytes. An overall homeostasis picture is also emergent in the findings.
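
    The flux-prediction idea underlying this work can be shown with a toy flux balance problem: a linear program that maximizes an assumed objective flux subject to steady-state mass balance. The three-reaction network below is invented for illustration and is far simpler than a hepatocyte model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis: maximize the "objective" flux v3 subject to
# steady state S v = 0 and flux bounds.
# Reactions: R1 (uptake -> A), R2 (A -> B), R3 (B -> objective product).
S = np.array([[1, -1,  0],    # metabolite A balance
              [0,  1, -1]])   # metabolite B balance
bounds = [(0, 10), (0, 100), (0, 100)]   # uptake capped at 10
c = [0, 0, -1]                           # linprog minimizes, so negate v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal flux distribution:", res.x)   # expected [10, 10, 10]
```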

  18. Improved prediction of higher heating value of biomass using an artificial neural network model based on proximate analysis.

    PubMed

    Uzun, Harun; Yıldız, Zeynep; Goldfarb, Jillian L; Ceylan, Selim

    2017-06-01

    As biomass becomes more integrated into our energy feedstocks, the ability to predict its combustion enthalpy from routine data such as carbon, ash, and moisture content enables rapid decisions about utilization. The present work constructs a novel artificial neural network model with a 3-3-1 tangent sigmoid architecture to predict biomass higher heating values from proximate analysis alone, requiring minimal specificity compared to models based on elemental composition. The model presented has a considerably higher correlation coefficient (0.963) and lower root mean square (0.375), mean absolute (0.328), and mean bias (0.010) errors than other models presented in the literature, which, at least when applied to the present data set, tend to under-predict the combustion enthalpy. Copyright © 2017 Elsevier Ltd. All rights reserved.
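
    The reported architecture can be sketched directly: a 3-3-1 tanh network mapping proximate-analysis inputs to higher heating value. The training data below are synthesized from a rough literature-style linear correlation, since the paper's dataset is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic proximate analyses (wt%): fixed carbon, volatile matter, ash.
FC = rng.uniform(5, 30, 300)
VM = rng.uniform(50, 85, 300)
ASH = rng.uniform(1, 20, 300)
X = np.column_stack([FC, VM, ASH])
# Rough literature-style correlation used only to synthesize targets (MJ/kg).
y = 0.3536 * FC + 0.1559 * VM - 0.0078 * ASH + rng.normal(0, 0.3, 300)

Xs = StandardScaler().fit_transform(X)
# 3 inputs -> 3 tanh hidden units -> 1 output, mirroring the 3-3-1 architecture.
model = MLPRegressor(hidden_layer_sizes=(3,), activation="tanh",
                     solver="lbfgs", max_iter=5000, random_state=0).fit(Xs, y)
print("R^2 on training data:", round(model.score(Xs, y), 3))
```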

  19. Efficient statistical mapping of avian count data

    USGS Publications Warehouse

    Royle, J. Andrew; Wikle, C.K.

    2005-01-01

    We develop a spatial modeling framework for count data that is efficient to implement in high-dimensional prediction problems. We consider spectral parameterizations for the spatially varying mean of a Poisson model. The spectral parameterization of the spatial process is very computationally efficient, enabling effective estimation and prediction in large problems using Markov chain Monte Carlo techniques. We apply this model to creating avian relative abundance maps from North American Breeding Bird Survey (BBS) data. Variation in the ability of observers to count birds is modeled as spatially independent noise, resulting in over-dispersion relative to the Poisson assumption. This approach represents an improvement over existing approaches to spatial modeling of BBS data, which are either inefficient for continental-scale modeling and prediction or fail to accommodate important distributional features of count data, thus leading to an inaccurate accounting of prediction uncertainty.
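
    A minimal sketch of the spectral idea described above: the spatially varying log-mean of a Poisson model is parameterized with a low-frequency Fourier basis and fit by penalized regression (standing in for the paper's MCMC machinery). The data and basis size are illustrative.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)
s = np.linspace(0, 1, 400)          # 1-D transect of survey locations

# Low-frequency Fourier basis for the spatially varying log-mean.
def fourier_basis(s, n_freq=4):
    cols = []
    for k in range(1, n_freq + 1):
        cols += [np.sin(2 * np.pi * k * s), np.cos(2 * np.pi * k * s)]
    return np.column_stack(cols)

X = fourier_basis(s)
true_log_mean = 1.0 + 0.8 * np.sin(2 * np.pi * s) + 0.3 * np.cos(4 * np.pi * s)
counts = rng.poisson(np.exp(true_log_mean))     # simulated route counts

fit = PoissonRegressor(alpha=1e-4, max_iter=1000).fit(X, counts)
smooth = fit.predict(X)                         # estimated abundance surface
print("max abs error on log scale:",
      float(np.abs(np.log(smooth) - true_log_mean).max()))
```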

  20. Integrating System Dynamics and Bayesian Networks with Application to Counter-IED Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, Kenneth D.; Brothers, Alan J.; Whitney, Paul D.

    2010-06-06

    The practice of choosing a single modeling paradigm for predictive analysis can limit the scope and relevance of predictions and their utility to decision-making processes. Considering multiple modeling methods simultaneously may improve this situation, but a better solution provides a framework for directly integrating different, potentially complementary modeling paradigms to enable more comprehensive modeling and predictions, and thus better-informed decisions. The primary challenges of this kind of model integration are to bridge language and conceptual gaps between modeling paradigms, and to determine whether natural and useful linkages can be made in a formal mathematical manner. To address these challenges in the context of two specific modeling paradigms, we explore mathematical and computational options for linking System Dynamics (SD) and Bayesian network (BN) models and incorporating data into the integrated models. We demonstrate that integrated SD/BN models can naturally be described as either state space equations or Dynamic Bayes Nets, which enables the use of many existing computational methods for simulation and data integration. To demonstrate, we apply our model integration approach to techno-social models of insurgent-led attacks and security force counter-measures centered on improvised explosive devices.
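
    One way to picture the state-space view described above is a scalar toy model: a System Dynamics stock equation serves as the state-transition step, and a Gaussian (Kalman-style) update plays the role of the Bayesian data integration. All parameters below are invented.

```python
import numpy as np

# Toy integration of an SD stock equation with a Bayesian measurement update,
# i.e., the state-space reading of an integrated SD/BN model.
dt, decay, inflow = 1.0, 0.1, 5.0    # SD model parameters (illustrative)
Q, R = 0.5, 4.0                      # process / observation noise variances

x, P = 20.0, 10.0                    # initial stock estimate and its variance
true_x = 20.0
rng = np.random.default_rng(0)
for t in range(10):
    # SD step (state equation): stock += dt * (inflow - decay * stock)
    true_x += dt * (inflow - decay * true_x) + rng.normal(0, np.sqrt(Q))
    x_pred = x + dt * (inflow - decay * x)
    P_pred = (1 - dt * decay) ** 2 * P + Q
    # BN step (observation update): fuse a noisy measurement of the stock
    z = true_x + rng.normal(0, np.sqrt(R))
    K = P_pred / (P_pred + R)
    x = x_pred + K * (z - x_pred)
    P = (1 - K) * P_pred
print(f"final estimate {x:.2f} (truth {true_x:.2f}), variance {P:.2f}")
```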

  1. Bigger Data, Collaborative Tools and the Future of Predictive Drug Discovery

    PubMed Central

    Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.

    2014-01-01

    Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software-as-a-service (SaaS) commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers, either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as they accumulate from high-throughput screening, and how they enable the user to draw insights, make predictions, and move projects forward. We discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use examples from our own research on neglected diseases, collaborations, mobile apps, and algorithm development to illustrate these ideas. PMID:24943138

  2. A Critical Review of Validation, Blind Testing, and Real-World Use of Alchemical Protein-Ligand Binding Free Energy Calculations.

    PubMed

    Abel, Robert; Wang, Lingle; Mobley, David L; Friesner, Richard A

    2017-01-01

    Protein-ligand binding is among the most fundamental phenomena underlying all molecular biology, and a greater ability to more accurately and robustly predict the binding free energy of a small molecule ligand for its cognate protein is expected to have vast consequences for improving the efficiency of pharmaceutical drug discovery. We briefly reviewed a number of scientific and technical advances that have enabled alchemical free energy calculations to recently emerge as a preferred approach, and critically considered proper validation and effective use of these techniques. In particular, we characterized a selection bias effect which may be important in prospective free energy calculations, and introduced a strategy to improve the accuracy of the free energy predictions. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  3. NASA Tools for Climate Impacts on Water Resources

    NASA Technical Reports Server (NTRS)

    Toll, David; Doorn, Brad

    2010-01-01

    Climate and environmental change are expected to fundamentally alter the nation's hydrological cycle and water availability. Satellite instruments provide global or near-global coverage, allowing for consistent, well-calibrated, equivalent-quality data on the Earth system. A major goal for NASA climate and environmental change research is to create multi-instrument data sets that span the multi-decadal time scales of climate change and to combine these data with those from modeling and surface-based observing systems to improve process understanding and predictions. NASA Earth science data and analyses will ultimately enable more accurate climate prediction and characterization of uncertainties. NASA's Applied Sciences Program works with other groups, including other federal agencies, to transition demonstrated observational capabilities to operational capabilities. A summary of some of NASA's tools for improved water resources management is presented.

  4. An ensemble framework for identifying essential proteins.

    PubMed

    Zhang, Xue; Xiao, Wangxin; Acencio, Marcio Luis; Lemke, Ney; Wang, Xujing

    2016-08-25

    Many centrality measures have been proposed to mine and characterize the correlations between network topological properties and protein essentiality. However, most of them show limited prediction accuracy, and the number of common predicted essential proteins by different methods is very small. In this paper, an ensemble framework is proposed which integrates gene expression data and protein-protein interaction networks (PINs). It aims to improve the prediction accuracy of basic centrality measures. The idea behind this ensemble framework is that different protein-protein interactions (PPIs) may show different contributions to protein essentiality. Five standard centrality measures (degree centrality, betweenness centrality, closeness centrality, eigenvector centrality, and subgraph centrality) are integrated into the ensemble framework respectively. We evaluated the performance of the proposed ensemble framework using yeast PINs and gene expression data. The results show that it can considerably improve the prediction accuracy of the five centrality measures individually. It can also remarkably increase the number of common predicted essential proteins among those predicted by each centrality measure individually and enable each centrality measure to find more low-degree essential proteins. This paper demonstrates that it is valuable to differentiate the contributions of different PPIs for identifying essential proteins based on network topological characteristics. The proposed ensemble framework is a successful paradigm to this end.
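
    A toy version of such an ensemble, assuming networkx is available: several centrality measures are computed on a stand-in network and their ranks averaged. The paper's weighting of interactions by gene co-expression is omitted here.

```python
import networkx as nx
from scipy.stats import rankdata

# Stand-in for a protein-protein interaction network.
G = nx.karate_club_graph()

# Rank nodes under several centrality measures and average the ranks.
measures = [nx.degree_centrality(G), nx.betweenness_centrality(G),
            nx.closeness_centrality(G),
            nx.eigenvector_centrality(G, max_iter=1000)]

nodes = list(G.nodes())
avg_rank = sum(rankdata([-m[n] for n in nodes]) for m in measures) / len(measures)
top = sorted(zip(avg_rank, nodes))[:5]
print("top candidate essential nodes:", [n for _, n in top])
```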

  5. PredictSNP: Robust and Accurate Consensus Classifier for Prediction of Disease-Related Mutations

    PubMed Central

    Bendl, Jaroslav; Stourac, Jan; Salanda, Ondrej; Pavelka, Antonin; Wieben, Eric D.; Zendulka, Jaroslav; Brezovsky, Jan; Damborsky, Jiri

    2014-01-01

    Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for the prediction of the effects of mutations on protein function are very important for analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement is hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicates, inconsistencies and mutations previously used in the training of evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into a consensus classifier PredictSNP, resulting in significantly improved prediction performance while returning results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp. PMID:24453961

  6. Adolescents' knowledge and beliefs about pregnancy: the impact of "ENABL".

    PubMed

    Arnold, E M; Smith, T E; Harrison, D F; Springer, D W

    2000-01-01

    Finding effective ways to prevent adolescent pregnancy is a concern of public health officials, educators, social workers, parents, and legislators. Numerous programs exist, but there is debate as to whether it is the specific program itself or other factors that are responsible for participants' successful outcomes. Using a quasi-experimental design, this study sought to determine which factors predicted changes in knowledge and beliefs among middle school students (N = 1,450) after exposure to Postponing Sexual Involvement (PSI), the curricular component of Education Now and Babies Later (ENABL), a pregnancy prevention program. It was found that the single most important predictor of improvement in knowledge and beliefs about pregnancy prevention was PSI itself, not background variables. The findings contradict some of the previous studies on factors impacting teenage pregnancy and lend support for the continued examination of ENABL as a promising component of pregnancy prevention efforts.

  7. Earth Science System of the Future: Observing, Processing, and Delivering Data Products Directly to Users

    NASA Technical Reports Server (NTRS)

    Crisp, David; Komar, George (Technical Monitor)

    2001-01-01

    Advancement of our predictive capabilities will require new scientific knowledge, improvement of our modeling capabilities, and new observation strategies to generate the complex data sets needed by coupled modeling networks. New observation strategies must support remote sensing from a variety of vantage points and will include "sensorwebs" of small satellites in low Earth orbit, large aperture sensors in Geostationary orbits, and sentinel satellites at L1 and L2 to provide day/night views of the entire globe. Onboard data processing and high speed computing and communications will enable near real-time tailoring and delivery of information products (i.e., predictions) directly to users.

  8. Paramagnetic fluorinated nanoemulsions for sensitive cellular fluorine-19 magnetic resonance imaging

    PubMed Central

    Kislukhin, Alexander A.; Xu, Hongyan; Adams, Stephen R.; Narsinh, Kazim H.; Tsien, Roger Y.; Ahrens, Eric T.

    2016-01-01

    Fluorine-19 magnetic resonance imaging (19F MRI) probes enable quantitative in vivo detection of cell therapies and inflammatory cells. Here, we describe the formulation of perfluorocarbon-based nanoemulsions with improved sensitivity for cellular MRI. Reduction of the 19F spin-lattice relaxation time (T1) enables rapid imaging and an improved signal-to-noise ratio, thereby improving cell detection sensitivity. We synthesized metal-binding β-diketones conjugated to linear perfluoropolyether (PFPE), formulated these fluorinated ligands as aqueous nanoemulsions, and then metalated them with various transition and lanthanide ions in the fluorous phase. Iron(III) tris-β-diketonate ('FETRIS') nanoemulsions with PFPE have low cytotoxicity (<20%) and superior MRI properties. Moreover, the 19F T1 can readily be reduced by an order of magnitude and tuned by stoichiometric modulation of the iron concentration. The resulting 19F MRI detection sensitivity is enhanced 3- to 5-fold over previously used tracers at 11.7 T, and is predicted to increase by at least 8-fold at the clinical field strength of 3 T. PMID:26974409

  9. Making ecological models adequate

    USGS Publications Warehouse

    Getz, Wayne M.; Marshall, Charles R.; Carlson, Colin J.; Giuggioli, Luca; Ryan, Sadie J.; Romañach, Stephanie; Boettiger, Carl; Chamberlain, Samuel D.; Larsen, Laurel; D'Odorico, Paolo; O'Sullivan, David

    2018-01-01

    Critical evaluation of the adequacy of ecological models is urgently needed to enhance their utility in developing theory and enabling environmental managers and policymakers to make informed decisions. Poorly supported management can have detrimental, costly or irreversible impacts on the environment and society. Here, we examine common issues in ecological modelling and suggest criteria for improving modelling frameworks. An appropriate level of process description is crucial to constructing the best possible model, given the available data and understanding of ecological structures. Model details unsupported by data typically lead to over-parameterisation and poor model performance. Conversely, a lack of mechanistic detail may limit a model's ability to predict ecological systems' responses to management. Ecological studies that employ models should follow a set of model adequacy assessment protocols that include asking a series of critical questions regarding state and control variable selection, the determinacy of data, and the sensitivity and validity of analyses. We also need to improve model elaboration, refinement and coarse-graining procedures to better understand the relevance and adequacy of our models and the role they play in advancing theory, improving hindcasting and forecasting, and enabling problem solving and management.

  10. Analyzing Medical Image Search Behavior: Semantics and Prediction of Query Results.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Kahn, Charles E; Müller, Henning

    2015-10-01

    Log files of information retrieval systems that record user behavior have been used to improve the outcomes of retrieval systems, understand user behavior, and predict events. In this article, a log file of the ARRS GoldMiner search engine containing 222,005 consecutive queries is analyzed. Time stamps are available for each query, as well as masked IP addresses, which makes it possible to identify queries from the same person. This article describes the ways in which physicians (or Internet searchers interested in medical images) search and proposes potential improvements through suggested query modifications. For example, many queries contain only a few terms and therefore are not specific; others contain spelling mistakes or non-medical terms that likely lead to poor or empty results. One goal of this report is to predict the number of results a query will return, since such a model allows search engines to automatically propose query modifications and thereby avoid result lists that are empty or too large. This prediction is made based on characteristics of the query terms themselves. Prediction of empty results has an accuracy above 88%, and thus can be used to automatically modify the query to avoid empty result sets for a user. The semantic analysis and data on reformulations done by users in the past can aid the development of better search systems, particularly to improve results for novice users. This paper therefore offers important insights into how people search and how to use this knowledge to improve the performance of specialized medical search engines.
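
    In the spirit of the result-count prediction described above, this sketch trains a classifier on simple characteristics of the query terms to flag queries likely to return empty result sets. The features and the tiny dataset are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy query log: 1 = the query returned no results.
queries = ["chest xray pneumonia", "asdfgh", "mri brain t2 flair lesion",
           "knee", "ultrsound abdomne", "ct angiogram pulmonary embolism"]
empty = np.array([0, 1, 0, 0, 1, 0])

def features(q):
    toks = q.split()
    return [len(toks),                          # number of terms
            np.mean([len(t) for t in toks]),    # average term length
            max(len(t) for t in toks)]          # longest term

X = np.array([features(q) for q in queries])
clf = LogisticRegression().fit(X, empty)
print("predicted P(empty) for 'mri brian':",
      round(clf.predict_proba([features("mri brian")])[0, 1], 2))
```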

  11. Canopy Temperature and Vegetation Indices from High-Throughput Phenotyping Improve Accuracy of Pedigree and Genomic Selection for Grain Yield in Wheat

    PubMed Central

    Rutkoski, Jessica; Poland, Jesse; Mondal, Suchismita; Autrique, Enrique; Pérez, Lorena González; Crossa, José; Reynolds, Matthew; Singh, Ravi

    2016-01-01

    Genomic selection can be applied prior to phenotyping, enabling shorter breeding cycles and greater rates of genetic gain relative to phenotypic selection. Traits measured using high-throughput phenotyping based on proximal or remote sensing could be useful for improving pedigree and genomic prediction model accuracies for traits not yet possible to phenotype directly. We tested whether using aerial measurements of canopy temperature, and green and red normalized difference vegetation index, as secondary traits in pedigree and genomic best linear unbiased prediction models could increase accuracy for grain yield in wheat, Triticum aestivum L., using 557 lines in five environments. Secondary traits on the training and test sets, and grain yield on the training set, were modeled as multivariate and compared to univariate models with grain yield on the training set only. Cross-validation accuracies were estimated within and across environments, with and without replication, and with and without correcting for days to heading. We observed that, within environment, with unreplicated secondary trait data, and without correcting for days to heading, secondary traits increased accuracies for grain yield by 56% in pedigree, and 70% in genomic, prediction models on average. Secondary traits increased accuracy slightly more when replicated, and considerably less when models corrected for days to heading. In across-environment prediction, trends were similar but less consistent. These results show that secondary traits measured in high-throughput could be used in pedigree and genomic prediction to improve accuracy. This approach could improve selection in wheat during early stages if validated in early-generation breeding plots. PMID:27402362
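
    The genomic prediction component can be sketched with ridge regression on a marker matrix, the workhorse behind RR-BLUP-style genomic BLUP; the paper's multivariate use of secondary traits would extend this, which the sketch does not attempt. Marker data and effects below are simulated.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic marker matrix (lines x SNPs, coded 0/1/2) and a polygenic trait.
M = rng.integers(0, 3, size=(557, 2000)).astype(float)
effects = rng.normal(0, 0.05, 2000)
grain_yield = M @ effects + rng.normal(0, 1.0, 557)

# Ridge regression on markers approximates RR-BLUP genomic prediction;
# cross-validated R^2 stands in for prediction accuracy.
acc = cross_val_score(Ridge(alpha=100.0), M, grain_yield, cv=5, scoring="r2")
print("mean cross-validated R^2:", round(acc.mean(), 2))
```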

  12. Verification and Validation Process for Progressive Damage and Failure Analysis Methods in the NASA Advanced Composites Consortium

    NASA Technical Reports Server (NTRS)

    Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl

    2017-01-01

    The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.

  13. Biomarkers to guide clinical therapeutics in rheumatology?

    PubMed

    Robinson, William H; Mao, Rong

    2016-03-01

    The use of biomarkers in rheumatology can help identify disease risk, improve diagnosis and prognosis, target therapy, assess response to treatment, and further our understanding of the underlying pathogenesis of disease. Here, we discuss the recent advances in biomarkers for rheumatic disorders, existing impediments to progress in this field, and the potential of biomarkers to enable precision medicine and thereby transform rheumatology. Although significant challenges remain, progress continues to be made in biomarker discovery and development for rheumatic diseases. The use of next-generation technologies, including large-scale sequencing, proteomic technologies, metabolomic technologies, mass cytometry, and other single-cell analysis and multianalyte analysis technologies, has yielded a slew of new candidate biomarkers. Nevertheless, these biomarkers still require rigorous validation and have yet to make their way into clinical practice and therapeutic development. This review focuses on advances in the biomarker field in the last 12 months as well as the challenges that remain. Better biomarkers, ideally mechanistic ones, are needed to guide clinical decision making in rheumatology. Although the use of next-generation techniques for biomarker discovery is making headway, it is imperative that the roadblocks in our search for new biomarkers are overcome to enable identification of biomarkers with greater diagnostic and predictive utility. Identification of biomarkers with robust diagnostic and predictive utility would enable precision medicine in rheumatology.

  14. A genome-scale metabolic flux model of Escherichia coli K–12 derived from the EcoCyc database

    PubMed Central

    2014-01-01

    Background Constraint-based models of Escherichia coli metabolic flux have played a key role in computational studies of cellular metabolism at the genome scale. We sought to develop a next-generation constraint-based E. coli model that achieved improved phenotypic prediction accuracy while being frequently updated and easy to use. We also sought to compare model predictions with experimental data to highlight open questions in E. coli biology. Results We present EcoCyc–18.0–GEM, a genome-scale model of the E. coli K–12 MG1655 metabolic network. The model is automatically generated from the current state of EcoCyc using the MetaFlux software, enabling the release of multiple model updates per year. EcoCyc–18.0–GEM encompasses 1445 genes, 2286 unique metabolic reactions, and 1453 unique metabolites. We demonstrate a three-part validation of the model that breaks new ground in breadth and accuracy: (i) Comparison of simulated growth in aerobic and anaerobic glucose culture with experimental results from chemostat culture and simulation results from the E. coli modeling literature. (ii) Essentiality prediction for the 1445 genes represented in the model, in which EcoCyc–18.0–GEM achieves an improved accuracy of 95.2% in predicting the growth phenotype of experimental gene knockouts. (iii) Nutrient utilization predictions under 431 different media conditions, for which the model achieves an overall accuracy of 80.7%. The model’s derivation from EcoCyc enables query and visualization via the EcoCyc website, facilitating model reuse and validation by inspection. We present an extensive investigation of disagreements between EcoCyc–18.0–GEM predictions and experimental data to highlight areas of interest to E. coli modelers and experimentalists, including 70 incorrect predictions of gene essentiality on glucose, 80 incorrect predictions of gene essentiality on glycerol, and 83 incorrect predictions of nutrient utilization. Conclusion Significant advantages can be derived from the combination of model organism databases and flux balance modeling represented by MetaFlux. Interpretation of the EcoCyc database as a flux balance model results in a highly accurate metabolic model and provides a rigorous consistency check for information stored in the database. PMID:24974895

  15. Comparison of four statistical and machine learning methods for crash severity prediction.

    PubMed

    Iranitalab, Amirfarrokh; Khattak, Aemal

    2017-11-01

    Crash severity prediction models enable different agencies to predict the severity of a reported crash with unknown severity or the severity of crashes expected to occur sometime in the future. This paper had three main objectives: comparison of the performance of four statistical and machine learning methods, including Multinomial Logit (MNL), Nearest Neighbor Classification (NNC), Support Vector Machines (SVM) and Random Forests (RF), in predicting traffic crash severity; developing a crash-costs-based approach for comparison of crash severity prediction methods; and investigating the effects of data clustering methods, comprising K-means Clustering (KC) and Latent Class Clustering (LCC), on the performance of crash severity prediction models. The 2012-2015 reported crash data from Nebraska, United States were obtained, and two-vehicle crashes were extracted as the analysis data. The dataset was split into training/estimation (2012-2014) and validation (2015) subsets. The four prediction methods were trained/estimated using the training/estimation dataset, and the correct prediction rates for each crash severity level, the overall correct prediction rate, and a proposed crash-costs-based accuracy measure were obtained for the validation dataset. The correct prediction rates and the proposed approach showed that NNC had the best prediction performance overall and for more severe crashes. RF and SVM had the next-best performance, and MNL was the weakest method. Data clustering did not affect the prediction results of SVM, but KC improved the prediction performance of MNL, NNC and RF, while LCC caused improvement in MNL and RF but weakened the performance of NNC. The overall correct prediction rate gave almost the exact opposite results compared to the proposed approach, showing that neglecting crash costs can lead to misjudgment in choosing the right prediction method. Copyright © 2017 Elsevier Ltd. All rights reserved.
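
    A toy version of the paper's cost-based comparison idea: the four families of classifiers are scored by a crash-cost-weighted error rather than raw accuracy. The data and the cost matrix are invented for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic three-level severity data.
X, y = make_classification(n_samples=3000, n_features=15, n_classes=3,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# cost[i, j]: cost of predicting class j when the truth is class i;
# missing a severe crash (last row) is penalized most heavily.
cost = np.array([[0, 1, 5],
                 [2, 0, 4],
                 [20, 10, 0]])

models = {"MNL": LogisticRegression(max_iter=2000),
          "NNC": KNeighborsClassifier(),
          "SVM": SVC(),
          "RF": RandomForestClassifier(random_state=0)}
for name, m in models.items():
    pred = m.fit(X_tr, y_tr).predict(X_te)
    print(name, "mean cost:", round(cost[y_te, pred].mean(), 3))
```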

  16. In silico prediction of cytochrome P450-mediated drug metabolism.

    PubMed

    Zhang, Tao; Chen, Qi; Li, Li; Liu, Limin Angela; Wei, Dong-Qing

    2011-06-01

    The application of combinatorial chemistry and high-throughput screening techniques enables large numbers of chemicals to be generated and tested simultaneously, facilitating drug development and discovery. At the same time, it raises the challenge of efficiently identifying potential drug candidates from thousands of compounds. One way to address this challenge is to consider drug pharmacokinetic properties, such as absorption, distribution, metabolism and excretion (ADME), in the early stages of drug development. Among the ADME properties, metabolism is of particular importance due to its strong association with drug efficacy and safety. This review focuses on in silico approaches for the prediction of cytochrome P450-mediated drug metabolism. We describe these predictive methods from two aspects, structure-based and data-based. Moreover, the applications and limitations of various methods are discussed. Finally, we provide further directions toward improving the predictive accuracy of these in silico methods.

  17. Binding ligand prediction for proteins using partial matching of local surface patches.

    PubMed

    Sael, Lee; Kihara, Daisuke

    2010-01-01

    Functional elucidation of uncharacterized protein structures is an important task in bioinformatics. We report our new approach for structure-based function prediction which captures local surface features of ligand binding pockets. Function of proteins, specifically, binding ligands of proteins, can be predicted by finding similar local surface regions of known proteins. To enable partial comparison of binding sites in proteins, a weighted bipartite matching algorithm is used to match pairs of surface patches. The surface patches are encoded with the 3D Zernike descriptors. Unlike the existing methods which compare global characteristics of the protein fold or the global pocket shape, the local surface patch method can find functional similarity between non-homologous proteins and binding pockets for flexible ligand molecules. The proposed method improves prediction results over global pocket shape-based method which was previously developed by our group.

  18. Binding Ligand Prediction for Proteins Using Partial Matching of Local Surface Patches

    PubMed Central

    Sael, Lee; Kihara, Daisuke

    2010-01-01

    Functional elucidation of uncharacterized protein structures is an important task in bioinformatics. We report our new approach for structure-based function prediction which captures local surface features of ligand binding pockets. Function of proteins, specifically, binding ligands of proteins, can be predicted by finding similar local surface regions of known proteins. To enable partial comparison of binding sites in proteins, a weighted bipartite matching algorithm is used to match pairs of surface patches. The surface patches are encoded with the 3D Zernike descriptors. Unlike the existing methods which compare global characteristics of the protein fold or the global pocket shape, the local surface patch method can find functional similarity between non-homologous proteins and binding pockets for flexible ligand molecules. The proposed method improves prediction results over global pocket shape-based method which was previously developed by our group. PMID:21614188
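
    The partial-matching step described in this record maps naturally onto the Hungarian algorithm; the sketch below pairs patch descriptors from two pockets with scipy's linear_sum_assignment. Random vectors stand in for the 3D Zernike descriptors.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
# Stand-ins for per-patch descriptor vectors of two binding pockets.
patches_a = rng.standard_normal((12, 20))
patches_b = rng.standard_normal((15, 20))

# Weighted bipartite matching: pair each patch in pocket A with a patch in
# pocket B so the total descriptor distance is minimal (partial match,
# since the pockets have different numbers of patches).
dist = cdist(patches_a, patches_b)            # pairwise patch distances
rows, cols = linear_sum_assignment(dist)      # Hungarian algorithm
score = dist[rows, cols].sum()
print(f"matched {len(rows)} patch pairs, total distance {score:.2f}")
```

    A small total distance would indicate similar pocket surfaces, and hence a candidate shared binding ligand.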

  19. Charting the future course of rural health and remote health in Australia: Why we need theory.

    PubMed

    Bourke, Lisa; Humphreys, John S; Wakerman, John; Taylor, Judy

    2010-04-01

    This paper argues that rural and remote health is in need of theoretical development. Based on the authors' discussions, reflections and critical analyses of literature, this paper proposes key reasons why rural and remote health warrants the development of theoretical frameworks. The paper cites five reasons why theory is needed: (i) theory provides an approach for how a topic is studied; (ii) theory articulates key assumptions in knowledge development; (iii) theory systematises knowledge, enabling it to be transferable; (iv) theory provides predictability; and (v) theory enables comprehensive understanding. This paper concludes with a call for theoretical development in both rural and remote health to expand its knowledge and be more relevant to improving health care for rural Australians.

  20. Trajectory-Based Takeoff Time Predictions Applied to Tactical Departure Scheduling: Concept Description, System Design, and Initial Observations

    NASA Technical Reports Server (NTRS)

    Engelland, Shawn A.; Capps, Alan

    2011-01-01

    Current aircraft departure release times are based on manual estimates of aircraft takeoff times. Uncertainty in takeoff time estimates may result in missed opportunities to merge into constrained en route streams and lead to lost throughput. However, technology exists to improve takeoff time estimates by using the aircraft surface trajectory predictions that enable air traffic control tower (ATCT) decision support tools. NASA's Precision Departure Release Capability (PDRC) is designed to use automated surface trajectory-based takeoff time estimates to improve en route tactical departure scheduling. This is accomplished by integrating an ATCT decision support tool with an en route tactical departure scheduling decision support tool. The PDRC concept and prototype software have been developed, and an initial test was completed at air traffic control facilities in Dallas/Fort Worth. This paper describes the PDRC operational concept, system design, and initial observations.

  1. Development and Life Prediction of Erosion Resistant Turbine Low Conductivity Thermal Barrier Coatings

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Miller, Robert A.; Kuczmarski, Maria A.

    2010-01-01

    Future rotorcraft propulsion systems are required to operate under highly-loaded conditions and in harsh sand erosion environments, thereby imposing significant material design and durability issues. The incorporation of advanced thermal barrier coatings (TBC) in high pressure turbine systems enables engine designs with higher inlet temperatures, thus improving the engine efficiency, power density and reliability. The impact and erosion resistance of turbine thermal barrier coating systems are crucial to the turbine coating technology application, because a robust turbine blade TBC system is a prerequisite for fully utilizing the potential coating technology benefit in the rotorcraft propulsion. This paper describes the turbine blade TBC development in addressing the coating impact and erosion resistance. Advanced thermal barrier coating systems with improved performance have also been validated in laboratory simulated engine erosion and/or thermal gradient environments. A preliminary life prediction modeling approach to emphasize the turbine blade coating erosion is also presented.

  2. OSPREY Predicts Resistance Mutations Using Positive and Negative Computational Protein Design.

    PubMed

    Ojewole, Adegoke; Lowegard, Anna; Gainza, Pablo; Reeve, Stephanie M; Georgiev, Ivelin; Anderson, Amy C; Donald, Bruce R

    2017-01-01

    Drug resistance in protein targets is an increasingly common phenomenon that reduces the efficacy of both existing and new antibiotics. However, knowledge of future resistance mutations during pre-clinical phases of drug development would enable the design of novel antibiotics that are robust against not only known resistant mutants, but also against those that have not yet been clinically observed. Computational structure-based protein design (CSPD) is a transformative field that enables the prediction of protein sequences with desired biochemical properties such as binding affinity and specificity to a target. The use of CSPD to predict previously unseen resistance mutations represents one of the frontiers of computational protein design. In a recent study (Reeve et al. Proc Natl Acad Sci U S A 112(3):749-754, 2015), we used our OSPREY (Open Source Protein REdesign for You) suite of CSPD algorithms to prospectively predict resistance mutations that arise in the active site of the dihydrofolate reductase enzyme from methicillin-resistant Staphylococcus aureus (SaDHFR) in response to selective pressure from an experimental competitive inhibitor. We demonstrated that our top predicted candidates are indeed viable resistant mutants. Since that study, we have significantly enhanced the capabilities of OSPREY with not only improved modeling of backbone flexibility, but also efficient multi-state design, fast sparse approximations, partitioned continuous rotamers for more accurate energy bounds, and a computationally efficient representation of molecular-mechanics and quantum-mechanical energy functions. Here, using SaDHFR as an example, we present a protocol for resistance prediction using the latest version of OSPREY. Specifically, we show how to use a combination of positive and negative design to predict active site escape mutations that maintain the enzyme's catalytic function but selectively ablate binding of an inhibitor.

  3. Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization

    NASA Astrophysics Data System (ADS)

    Lee, Kyungbook; Song, Seok Goo

    2017-09-01

    Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events (M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.

  4. Intermolecular shielding contributions studied by modeling the 13C chemical-shift tensors of organic single crystals with plane waves

    PubMed Central

    Johnston, Jessica C.; Iuliucci, Robbie J.; Facelli, Julio C.; Fitzgerald, George; Mueller, Karl T.

    2009-01-01

    In order to predict accurately the chemical shift of NMR-active nuclei in solid phase systems, magnetic shielding calculations must be capable of considering the complete lattice structure. Here we assess the accuracy of the density functional theory gauge-including projector augmented wave method, which uses pseudopotentials to approximate the nodal structure of the core electrons, to determine the magnetic properties of crystals by predicting the full chemical-shift tensors of all 13C nuclides in 14 organic single crystals from which experimental tensors have previously been reported. Plane-wave methods use periodic boundary conditions to incorporate the lattice structure, providing a substantial improvement for modeling the chemical shifts in hydrogen-bonded systems. Principal tensor components can now be predicted to an accuracy that approaches the typical experimental uncertainty. Moreover, methods that include the full solid-phase structure enable geometry optimizations to be performed on the input structures prior to calculation of the shielding. Improvement after optimization is noted here even when neutron diffraction data are used for determining the initial structures. After geometry optimization, the isotropic shift can be predicted to within 1 ppm. PMID:19831448

  5. Matching phenotypes to whole genomes: Lessons learned from four iterations of the personal genome project community challenges.

    PubMed

    Cai, Binghuang; Li, Biao; Kiga, Nikki; Thusberg, Janita; Bergquist, Timothy; Chen, Yun-Ching; Niknafs, Noushin; Carter, Hannah; Tokheim, Collin; Beleva-Guthrie, Violeta; Douville, Christopher; Bhattacharya, Rohit; Yeo, Hui Ting Grace; Fan, Jean; Sengupta, Sohini; Kim, Dewey; Cline, Melissa; Turner, Tychele; Diekhans, Mark; Zaucha, Jan; Pal, Lipika R; Cao, Chen; Yu, Chen-Hsin; Yin, Yizhou; Carraro, Marco; Giollo, Manuel; Ferrari, Carlo; Leonardi, Emanuela; Tosatto, Silvio C E; Bobe, Jason; Ball, Madeleine; Hoskins, Roger A; Repo, Susanna; Church, George; Brenner, Steven E; Moult, John; Gough, Julian; Stanke, Mario; Karchin, Rachel; Mooney, Sean D

    2017-09-01

    The advent of next-generation sequencing has dramatically decreased the cost for whole-genome sequencing and increased the viability for its application in research and clinical care. The Personal Genome Project (PGP) provides unrestricted access to genomes of individuals and their associated phenotypes. This resource enabled the Critical Assessment of Genome Interpretation (CAGI) to create a community challenge to assess the bioinformatics community's ability to predict traits from whole genomes. In the CAGI PGP challenge, researchers were asked to predict whether an individual had a particular trait or profile based on their whole genome. Several approaches were used to assess submissions, including ROC AUC (area under receiver operating characteristic curve), probability rankings, the number of correct predictions, and statistical significance simulations. Overall, we found that prediction of individual traits is difficult, relying on a strong knowledge of trait frequency within the general population, whereas matching genomes to trait profiles relies heavily upon a small number of common traits including ancestry, blood type, and eye color. When a rare genetic disorder is present, profiles can be matched when one or more pathogenic variants are identified. Prediction accuracy has improved substantially over the last 6 years due to improved methodology and a better understanding of features. © 2017 Wiley Periodicals, Inc.

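    As a rough illustration of the ROC AUC and significance-simulation scoring mentioned above, the sketch below uses scikit-learn on invented stand-in data; it is not the challenge's actual evaluation harness.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Made-up stand-ins: 1 = individual has the trait, 0 = does not,
# and a submission's probability that each individual has it.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_prob = np.array([0.9, 0.3, 0.6, 0.8, 0.4, 0.2, 0.5, 0.7])

auc = roc_auc_score(y_true, y_prob)
print("ROC AUC:", auc)

# A permutation test gives a simple significance estimate, in the spirit
# of the statistical-significance simulations mentioned above.
rng = np.random.default_rng(0)
null = [roc_auc_score(rng.permutation(y_true), y_prob) for _ in range(10000)]
p_value = np.mean(np.array(null) >= auc)
print("permutation p-value:", p_value)
```
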
  6. What must be the accuracy and target of optical sensor systems for patient monitoring?

    NASA Astrophysics Data System (ADS)

    Frank, Klaus H.; Kessler, Manfred D.

    2002-06-01

    Although treatment in the intensive care unit has improved in recent years, enabling more extensive surgical interventions and improving patients' survival rates, no adequate monitoring is available for imminent severe pathological events. Yet such monitoring is necessary for early or prophylactic treatment in order to avoid or reduce the severity of disease and protect the patient from sepsis or multiple organ failure. In these cases common monitoring is of limited value, because clinical, physiological, and laboratory parameters indicate either the state of the macro-circulation or late disturbances of the microcirculation, which arise earlier at the sub-cellular level. Optical sensor systems make it possible to reveal early variations in local capillary flow. The correlation between clinical parameters and changes in the state of oxygenation as a function of capillary flow disturbances is meaningful for further treatment. The target should be to develop a predictive parameter that is useful for the detection and follow-up of changes in circulation.

  7. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    NASA Astrophysics Data System (ADS)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    Condition Monitoring Systems (CMS) offer substantial potential economic benefits and enable prognostic maintenance for wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive-train CMS that enables the early detection of impending failure or damage. In variable-speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient at diagnosing machine faults under time-varying conditions. Current research in CMS for drive trains therefore focuses on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve the sensitivity and selectivity of fault detection/prediction, thereby reducing misdetection and false alarm rates. Stationary signal processing algorithms employed in vibration analysis have been reviewed at length in the literature. In this paper, an attempt is made to review recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable-speed wind turbines.

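    As a minimal illustration of why stationary (global FFT) analysis fails for variable-speed machines, the scipy sketch below applies a short-time Fourier transform to a synthetic chirp standing in for a shaft-order vibration; it is only a baseline time-frequency example, not one of the surveyed non-stationary algorithms.

```python
import numpy as np
from scipy.signal import chirp, stft

fs = 2000.0                      # sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)

# Simulated variable-speed vibration: a tone sweeping 20 -> 200 Hz,
# standing in for a shaft order whose frequency tracks rotor speed.
x = chirp(t, f0=20.0, f1=200.0, t1=t[-1], method="linear")
x += 0.2 * np.random.default_rng(1).standard_normal(t.size)

# A global FFT smears the sweep across a band; the STFT localizes it in time.
f, tau, Z = stft(x, fs=fs, nperseg=256)
ridge = f[np.abs(Z).argmax(axis=0)]   # dominant frequency in each frame
print("estimated frequency near start/end:", ridge[1], ridge[-2], "Hz")
```
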
  8. The Biophysics of Infection.

    PubMed

    Leake, Mark C

    2016-01-01

    Our understanding of the processes involved in infection has grown enormously in the past decade, due in part to emerging methods of biophysics. This new insight has been enabled through advances in interdisciplinary experimental technologies and theoretical methods at the cutting-edge interface of the life and physical sciences. For example, this has involved several state-of-the-art biophysical tools used in conjunction with molecular and cell biology approaches, which enable the investigation of infection in living cells. There are also new, emerging interfacial science tools that significantly improve the resolution of quantitative measurements in both space and time. These include single-molecule biophysics methods and super-resolution microscopy approaches. These new technological tools in particular have underpinned much new understanding of dynamic processes of infection at the molecular length scale. There have also been many valuable recent advances in theoretical biophysics that enable predictive modelling to generate new understanding of infection. Here, I discuss these advances, take stock of our knowledge of the biophysics of infection, and consider where future advances may lead.

  9. Thermal comfort modelling of body temperature and psychological variations of a human exercising in an outdoor environment

    NASA Astrophysics Data System (ADS)

    Vanos, Jennifer K.; Warland, Jon S.; Gillespie, Terry J.; Kenny, Natasha A.

    2012-01-01

    Human thermal comfort assessments pertaining to exercise in outdoor environments can improve urban and recreational planning. The current study applied a simple four-segment skin temperature approach to the COMFA (COMfort FormulA) outdoor energy balance model. Comparison of measured mean skin temperature ($\bar{T}_{Msk}$) with predicted $\bar{T}_{sk}$ indicates that the model accurately predicted $\bar{T}_{sk}$, showing significantly strong agreement (r = 0.859, P < 0.01) during outdoor exercise (cycling and running). The combined 5-min mean $\bar{T}_{sk}$ RMSE was 1.5°C, with cycling and running separately giving RMSEs of 1.4°C and 1.6°C, respectively, and no significant difference in residuals. Subjects' actual thermal sensation (ATS) votes displayed significantly strong rank correlation with budget scores calculated using both measured and predicted $\bar{T}_{sk}$ ($r_s$ = 0.507 and 0.517, respectively, P < 0.01). These results show improved predictive strength of subjects' ATS as compared with the original and updated COMFA models. This psychological improvement, plus the $\bar{T}_{sk}$ and $T_c$ validations, enables better application to a variety of outdoor spaces. This model can be used in future research studying linkages between thermal discomfort, subsequent decreases in physical activity, and negative health trends.

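    The agreement statistics quoted above (Pearson r, RMSE, Spearman rank correlation) are standard and can be reproduced as in the sketch below; the arrays are illustrative stand-ins, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Illustrative 5-min means: measured vs model-predicted mean skin temperature (degC).
t_measured  = np.array([33.1, 33.8, 34.2, 34.9, 35.3, 35.6])
t_predicted = np.array([33.5, 33.6, 34.6, 35.2, 35.1, 36.0])

r, p = pearsonr(t_measured, t_predicted)
rmse = np.sqrt(np.mean((t_predicted - t_measured) ** 2))
print(f"Pearson r = {r:.3f} (P = {p:.3f}), RMSE = {rmse:.2f} degC")

# Rank correlation between thermal-sensation votes and model budget scores.
ats_votes     = np.array([0, 1, 1, 2, 2, 3])            # actual thermal sensation
budget_scores = np.array([40, 55, 90, 120, 150, 210])   # W m^-2, illustrative
rho, p_rho = spearmanr(ats_votes, budget_scores)
print(f"Spearman r_s = {rho:.3f} (P = {p_rho:.3f})")
```
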
  10. Healthcare predictive analytics: An overview with a focus on Saudi Arabia.

    PubMed

    Alharthi, Hana

    2018-03-08

    Despite a newfound wealth of data and information, the healthcare sector is lacking in actionable knowledge. This is largely because healthcare data, though plentiful, tends to be inherently complex and fragmented. Health data analytics, with an emphasis on predictive analytics, is emerging as a transformative tool that can enable more proactive and preventative treatment options. This review considers the ways in which predictive analytics has been applied in the for-profit business sector to generate well-timed and accurate predictions of key outcomes, with a focus on key features that may be applicable to healthcare-specific applications. Published medical research presenting assessments of predictive analytics technology in medical applications is reviewed, with particular emphasis on how hospitals have integrated predictive analytics into their day-to-day healthcare services to improve quality of care. This review also highlights the numerous challenges of implementing predictive analytics in healthcare settings and concludes with a discussion of current efforts to implement healthcare data analytics in a developing country, Saudi Arabia. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  11. An improved method to detect correct protein folds using partial clustering.

    PubMed

    Zhou, Jianjun; Wishart, David S

    2013-01-16

    Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient "partial" clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either C(α) RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance.

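    The abstract does not spell out HS-Forest's internals, but the general idea of partial clustering, extracting representatives of dense neighborhoods without assigning every decoy to a cluster, can be sketched as below. The pairwise distance function is assumed to be supplied (C-alpha RMSD or a GDT-TS-based distance in practice); here a Euclidean toy metric stands in.

```python
import numpy as np

def partial_cluster(decoys, distance, radius, n_reps):
    """Pick representatives from dense neighborhoods without full clustering.

    decoys   : list of decoy structures (opaque objects)
    distance : pairwise distance function, e.g. C-alpha RMSD (assumed given)
    radius   : neighborhood cutoff defining "structurally similar"
    n_reps   : number of representatives to extract
    """
    remaining = list(range(len(decoys)))
    reps = []
    rng = np.random.default_rng(0)
    while remaining and len(reps) < n_reps:
        pivot = int(rng.choice(remaining))
        # Decoys within the cutoff of the pivot form one (partial) cluster.
        members = [j for j in remaining
                   if distance(decoys[pivot], decoys[j]) <= radius]
        reps.append((pivot, len(members)))        # representative + density
        remaining = [j for j in remaining if j not in members]
    # Denser neighborhoods are more likely to hold the correct fold.
    return sorted(reps, key=lambda r: -r[1])

# Toy usage: decoys as points in R^3, Euclidean distance as a stand-in metric.
pts = np.random.default_rng(1).normal(size=(200, 3))
reps = partial_cluster(pts, lambda a, b: float(np.linalg.norm(a - b)), 1.0, 5)
print(reps)
```
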
  12. An improved method to detect correct protein folds using partial clustering

    PubMed Central

    2013-01-01

    Background Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient "partial" clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. Results We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either Cα RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. Conclusions The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance. PMID:23323835

  13. Improved Time-Lapsed Angular Scattering Microscopy of Single Cells

    NASA Astrophysics Data System (ADS)

    Cannaday, Ashley E.

    By measuring angular scattering patterns from biological samples and fitting them with a Mie theory model, one can estimate the organelle size distribution within many cells. Quantitative organelle sizing of ensembles of cells using this method has been well established. Our goal is to develop the methodology to extend this approach to the single-cell level, measuring the angular scattering at multiple time points and estimating the non-nuclear organelle size distribution parameters. The diameters of individual organelle-size beads were successfully extracted using scattering measurements with a minimum deflection angle of 20 degrees. However, the accuracy of size estimates can be limited by the angular range detected. In particular, simulations by our group suggest that, for cell organelle populations with a broader size distribution, the accuracy of size prediction improves substantially if the minimum detection angle is 15 degrees or less. The system was therefore modified to collect scattering angles down to 10 degrees. To confirm experimentally that size predictions become more stable when lower scattering angles are detected, initial validations were performed on individual polystyrene beads ranging in diameter from 1 to 5 microns. We found that the lower minimum angle enabled the width of this delta-function size distribution to be predicted more accurately. Scattering patterns were then acquired and analyzed from single mouse squamous cell carcinoma cells at multiple time points. The scattering patterns exhibit angular dependencies that look unlike those of any single sphere size, but are well fit by a broad distribution of sizes, as expected. To determine the fluctuation level in the estimated size distribution due to measurement imperfections alone, formaldehyde-fixed cells were measured. Subsequent measurements on live (non-fixed) cells revealed an order of magnitude greater fluctuation in the estimated sizes compared to fixed cells. With our improved and better-understood approach to single-cell angular scattering, we are now capable of reliably detecting changes in organelle size predictions due to biological causes above our measurement error of 20 nm, which enables us to apply our system to future investigations of various single-cell biological processes.

  14. Maximal Predictability Approach for Identifying the Right Descriptors for Electrocatalytic Reactions.

    PubMed

    Krishnamurthy, Dilip; Sumaria, Vaidish; Viswanathan, Venkatasubramanian

    2018-02-01

    Density functional theory (DFT) calculations are being routinely used to identify new material candidates that approach activity near fundamental limits imposed by thermodynamics or scaling relations. DFT calculations are associated with inherent uncertainty, which limits the ability to delineate materials (distinguishability) that possess high activity. Development of error-estimation capabilities in DFT has enabled uncertainty propagation through activity-prediction models. In this work, we demonstrate an approach to propagating uncertainty through thermodynamic activity models, leading to a probability distribution of the computed activity and thereby its expectation value. A new metric, prediction efficiency, is defined, which provides a quantitative measure of the ability to distinguish the activity of materials and can be used to identify the optimal descriptor(s) ΔG_opt. We demonstrate the framework for four important electrochemical reactions: hydrogen evolution, chlorine evolution, oxygen reduction, and oxygen evolution. Future studies could utilize expected activity and prediction efficiency to significantly improve the prediction accuracy of highly active material candidates.

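    A minimal sketch of the underlying idea, propagating descriptor uncertainty through an activity model by Monte Carlo sampling, is shown below. The volcano-shaped activity function and all numbers are invented for illustration and are not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(0)

def volcano_activity(dg):
    # Illustrative volcano: activity peaks when the descriptor free
    # energy dG sits at its optimum (here 0 eV) and falls off linearly.
    return -np.abs(dg)

# DFT-computed descriptor for a candidate material, with an uncertainty
# estimate (e.g. from an error-estimation ensemble); values are made up.
dg_mean, dg_sigma = 0.15, 0.10   # eV

samples = rng.normal(dg_mean, dg_sigma, size=100000)
activity = volcano_activity(samples)

print("expected activity:", activity.mean())
print("95% interval:", np.percentile(activity, [2.5, 97.5]))

# Distinguishability of two candidates: probability that A beats B.
dg_b_mean, dg_b_sigma = 0.25, 0.10
act_b = volcano_activity(rng.normal(dg_b_mean, dg_b_sigma, size=100000))
print("P(A more active than B):", np.mean(activity > act_b))
```
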
  15. Crowdsourced assessment of common genetic contribution to predicting anti-TNF treatment response in rheumatoid arthritis.

    PubMed

    Sieberts, Solveig K; Zhu, Fan; García-García, Javier; Stahl, Eli; Pratap, Abhishek; Pandey, Gaurav; Pappas, Dimitrios; Aguilar, Daniel; Anton, Bernat; Bonet, Jaume; Eksi, Ridvan; Fornés, Oriol; Guney, Emre; Li, Hongdong; Marín, Manuel Alejandro; Panwar, Bharat; Planas-Iglesias, Joan; Poglayen, Daniel; Cui, Jing; Falcao, Andre O; Suver, Christine; Hoff, Bruce; Balagurusamy, Venkat S K; Dillenberger, Donna; Neto, Elias Chaibub; Norman, Thea; Aittokallio, Tero; Ammad-Ud-Din, Muhammad; Azencott, Chloe-Agathe; Bellón, Víctor; Boeva, Valentina; Bunte, Kerstin; Chheda, Himanshu; Cheng, Lu; Corander, Jukka; Dumontier, Michel; Goldenberg, Anna; Gopalacharyulu, Peddinti; Hajiloo, Mohsen; Hidru, Daniel; Jaiswal, Alok; Kaski, Samuel; Khalfaoui, Beyrem; Khan, Suleiman Ali; Kramer, Eric R; Marttinen, Pekka; Mezlini, Aziz M; Molparia, Bhuvan; Pirinen, Matti; Saarela, Janna; Samwald, Matthias; Stoven, Véronique; Tang, Hao; Tang, Jing; Torkamani, Ali; Vert, Jean-Phillipe; Wang, Bo; Wang, Tao; Wennerberg, Krister; Wineinger, Nathan E; Xiao, Guanghua; Xie, Yang; Yeung, Rae; Zhan, Xiaowei; Zhao, Cheng; Greenberg, Jeff; Kremer, Joel; Michaud, Kaleb; Barton, Anne; Coenen, Marieke; Mariette, Xavier; Miceli, Corinne; Shadick, Nancy; Weinblatt, Michael; de Vries, Niek; Tak, Paul P; Gerlag, Danielle; Huizinga, Tom W J; Kurreeman, Fina; Allaart, Cornelia F; Bridges, S Louis; Criswell, Lindsey; Moreland, Larry; Klareskog, Lars; Saevarsdottir, Saedis; Padyukov, Leonid; Gregersen, Peter K; Friend, Stephen; Plenge, Robert; Stolovitzky, Gustavo; Oliva, Baldo; Guan, Yuanfang; Mangravite, Lara M

    2016-08-23

    Rheumatoid arthritis (RA) affects millions world-wide. While anti-TNF treatment is widely used to reduce disease progression, treatment fails in approximately one-third of patients. No biomarker currently exists that identifies non-responders before treatment. A rigorous community-based assessment of the utility of SNP data for predicting anti-TNF treatment efficacy in RA patients was performed in the context of a DREAM Challenge (http://www.synapse.org/RA_Challenge). An open challenge framework enabled the comparative evaluation of predictions developed by 73 research groups using the most comprehensive available data and covering a wide range of state-of-the-art modelling methodologies. Despite a significant genetic heritability estimate of the treatment non-response trait (h^2 = 0.18, P value = 0.02), no significant genetic contribution to prediction accuracy is observed. The results formally confirm the expectations of the rheumatology community that SNP information does not significantly improve predictive performance relative to standard clinical traits, thereby justifying a refocusing of future efforts on collection of other data.

  16. Computational design of thermostabilizing point mutations for G protein-coupled receptors

    PubMed Central

    Popov, Petr; Peng, Yao; Shen, Ling; Stevens, Raymond C; Cherezov, Vadim; Liu, Zhi-Jie

    2018-01-01

    Engineering of GPCR constructs with improved thermostability is key for successful structural and biochemical studies of this transmembrane protein family, targeted by 40% of all therapeutic drugs. Here we introduce a comprehensive computational approach to effective prediction of stabilizing mutations in GPCRs, named CompoMug, which employs sequence-based analysis, structural information, and a derived machine learning predictor. Tested experimentally on the serotonin 5-HT2C receptor target, CompoMug predictions resulted in 10 new stabilizing mutations, with an apparent thermostability gain of ~8.8°C for the best single mutation and ~13°C for a triple mutant. Binding of antagonists confers further stabilization for the triple mutant receptor, with total gains of ~21°C as compared to wild-type apo 5-HT2C. The predicted mutations enabled crystallization and structure determination for the 5-HT2C receptor complexes in inactive and active-like states. While CompoMug already shows a high hit rate of 25% and utility in GPCR structural studies, further improvements are expected with the accumulation of structural and mutation data. PMID:29927385

  17. The NASA/industry Design Analysis Methods for Vibrations (DAMVIBS) program: Boeing Helicopters airframe finite element modeling

    NASA Technical Reports Server (NTRS)

    Gabel, R.; Lang, P.; Reed, D.

    1993-01-01

    Mathematical models based on the finite element method of structural analysis, as embodied in the NASTRAN computer code, are routinely used by the helicopter industry to calculate airframe static internal loads used for sizing structural members. Historically, less reliance has been placed on the vibration predictions based on these models. Beginning in the early 1980s, NASA's Langley Research Center initiated an industry-wide program with the objective of engendering the needed trust in vibration predictions using these models and establishing a body of modeling guides that would enable confident future prediction of airframe vibration as part of the regular design process. Emphasis in this paper is placed on the successful modeling of the Army/Boeing CH-47D, which showed reasonable correlation with test data. A principal finding indicates that improved dynamic analysis requires greater attention to detail, and perhaps a finer mesh, than the usual stress model, especially in the mass distribution. Post-program modeling efforts show improved correlation, placing key modal frequencies in the b/rev range to within 4 percent of the test frequencies.

  18. A Case Study Using Modeling and Simulation to Predict Logistics Supply Chain Issues

    NASA Technical Reports Server (NTRS)

    Tucker, David A.

    2007-01-01

    Optimization of critical supply chains to deliver thousands of parts, materials, sub-assemblies, and vehicle structures as needed is vital to the success of the Constellation Program. Thorough analysis needs to be performed on the integrated supply chain processes to plan, source, make, deliver, and return critical items efficiently. Process modeling provides simulation technology-based, predictive solutions for supply chain problems which enable decision makers to reduce costs, accelerate cycle time and improve business performance. For example, United Space Alliance, LLC utilized this approach in late 2006 to build simulation models that recreated shuttle orbiter thruster failures and predicted the potential impact of thruster removals on logistics spare assets. The main objective was the early identification of possible problems in providing thruster spares for the remainder of the Shuttle Flight Manifest. After extensive analysis the model results were used to quantify potential problems and led to improvement actions in the supply chain. Similarly the proper modeling and analysis of Constellation parts, materials, operations, and information flows will help ensure the efficiency of the critical logistics supply chains and the overall success of the program.

  19. Model swapping: A comparative performance signature for the prediction of flow duration curves in ungauged basins

    NASA Astrophysics Data System (ADS)

    Qamar, Muhammad Uzair; Azmat, Muhammad; Cheema, Muhammad Jehanzeb Masud; Shahid, Muhammad Adnan; Khushnood, Rao Arsalan; Ahmad, Sajjad

    2016-10-01

    The lack of donor basins for the prediction of flow duration curves (FDCs) in ungauged basins (PUB) is an important area of research that remains unresolved in the literature. We present a distance-based approach to predict FDCs at ungauged basins by quantifying the dissimilarity between FDCs and basin characteristics data. This enables us to bracket hydrologically similar basins and thus to estimate FDCs at ungauged basins. Generally, a single regression model is selected to make hydrological estimates at an ungauged basin. Building on established laws and theories of hydrology, we devise a method to improve the output of the selected model for an ungauged basin by swapping it with another model when the latter gives better coverage and statistical estimates of the ungauged basin's nearest neighbors. We report two examples to demonstrate the effectiveness of model swapping. Of the 124 basins used in the analysis, 34 basins in example 1 and 41 basins in example 2 fulfill the set criteria for model swapping, and their estimates are subsequently improved significantly.

  20. Appropriate clinical use of human leukocyte antigen typing for coeliac disease: an Australasian perspective

    PubMed Central

    Tye-Din, J A; Cameron, D J S; Daveson, A J; Day, A S; Dellsperger, P; Hogan, C; Newnham, E D; Shepherd, S J; Steele, R H; Wienholt, L; Varney, M D

    2015-01-01

    The past decade has seen human leukocyte antigen (HLA) typing emerge as a remarkably popular test for the diagnostic work-up of coeliac disease with high patient acceptance. Although limited in its positive predictive value for coeliac disease, the strong disease association with specific HLA genes imparts exceptional negative predictive value to HLA typing, enabling a negative result to exclude coeliac disease confidently. In response to mounting evidence that the clinical use and interpretation of HLA typing often deviates from best practice, this article outlines an evidence-based approach to guide clinically appropriate use of HLA typing, and establishes a reporting template for pathology providers to improve communication of results. PMID:25827511

  1. Final report for the DOE Early Career Award #DE-SC0003912

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jayaraman, Arthi

    This DOE-supported early career project was aimed at developing computational models, theory, and simulation methods that would then be used to predict assembly and morphology in polymer nanocomposites. In particular, the focus was on composites in active layers of devices, containing conducting polymers that act as electron donors and nanoscale additives that act as electron acceptors. During the course of this work, we developed first-of-their-kind molecular models to represent conducting polymers, enabling simulations at experimentally relevant length and time scales. We validated these models by comparison with experimentally observed morphologies. Furthermore, using these models and molecular dynamics simulations on graphics processing units (GPUs), we predicted the molecular-level design features in polymers and additives that lead to morphologies with optimal features for charge-carrier behavior in solar cells. Additionally, we computationally predicted new design rules for better dispersion of additives in polymers, which have been confirmed through experiments. Achieving dispersion in polymer nanocomposites is valuable for controlling the macroscopic properties of the composite. The results obtained during the course of this DOE-funded project enable the optimal design of higher-efficiency organic electronic and photovoltaic devices, improving everyday life through the engineering of such devices.

  2. Mechanical behaviour of a fibrous scaffold for ligament tissue engineering: finite elements analysis vs. X-ray tomography imaging.

    PubMed

    Laurent, Cédric P; Latil, Pierre; Durville, Damien; Rahouadj, Rachid; Geindreau, Christian; Orgéas, Laurent; Ganghoffer, Jean-François

    2014-12-01

    The use of biodegradable scaffolds seeded with cells in order to regenerate functional tissue-engineered substitutes offers an interesting alternative to common medical approaches for ligament repair. In particular, the finite element (FE) method makes it possible to predict and optimise both the macroscopic behaviour of these scaffolds and the local mechanical signals that control cell activity. In this study, we investigate the ability of a dedicated FE code to predict the geometrical evolution of a new braided and biodegradable polymer scaffold for ligament tissue engineering by comparing scaffold geometries obtained from FE simulations and from X-ray tomographic imaging during a tensile test. Moreover, we compare two types of FE simulations whose initial geometries are taken either from X-ray imaging or from a computed idealised configuration. We report that dedicated FE simulations from an idealised reference configuration can reasonably be used in the future to predict the global and local mechanical behaviour of the braided scaffold. A valuable and original dialogue between the fields of experimental and numerical characterisation of such fibrous media is thus achieved. In the future, this approach should enable more accurate characterisation of the local and global behaviour of tissue-engineering scaffolds. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Modeling Interdependent and Periodic Real-World Action Sequences

    PubMed Central

    Kurashima, Takeshi; Althoff, Tim; Leskovec, Jure

    2018-01-01

    Mobile health applications, including those that track activities such as exercise, sleep, and diet, are becoming widely used. Accurately predicting human actions in the real world is essential for targeted recommendations that could improve our health and for personalization of these applications. However, making such predictions is extremely difficult due to the complexities of human behavior, which consists of a large number of potential actions that vary over time, depend on each other, and are periodic. Previous work has not jointly modeled these dynamics and has largely focused on item consumption patterns instead of broader types of behaviors such as eating, commuting or exercising. In this work, we develop a novel statistical model, called TIPAS, for Time-varying, Interdependent, and Periodic Action Sequences. Our approach is based on personalized, multivariate temporal point processes that model time-varying action propensities through a mixture of Gaussian intensities. Our model captures short-term and long-term periodic interdependencies between actions through Hawkes process-based self-excitations. We evaluate our approach on two activity logging datasets comprising 12 million real-world actions (e.g., eating, sleep, and exercise) taken by 20 thousand users over 17 months. We demonstrate that our approach allows us to make successful predictions of future user actions and their timing. Specifically, TIPAS improves predictions of actions, and their timing, over existing methods across multiple datasets by up to 156%, and up to 37%, respectively. Performance improvements are particularly large for relatively rare and periodic actions such as walking and biking, improving over baselines by up to 256%. This demonstrates that explicit modeling of dependencies and periodicities in real-world behavior enables successful predictions of future actions, with implications for modeling human behavior, app personalization, and targeting of health interventions. PMID:29780977

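    TIPAS itself is considerably more elaborate, but its two named ingredients, a periodic (Gaussian-mixture) base rate and Hawkes-style self-excitation, can be sketched for a single action type as below; all parameters are invented.

```python
import numpy as np

def intensity(t, history, day=24.0):
    """Illustrative action intensity at time t (hours).

    Periodic base rate: a mixture of Gaussian bumps over the day
    (e.g. meals around 8h and 19h), plus Hawkes-style self-excitation
    that decays exponentially after each past event.
    """
    centers = np.array([8.0, 19.0])      # daily propensity peaks (assumed)
    weights = np.array([0.5, 0.8])
    width = 1.5
    tod = t % day                        # time of day
    base = np.sum(weights * np.exp(-0.5 * ((tod - centers) / width) ** 2))

    alpha, beta = 0.3, 2.0               # excitation strength / decay rate
    past = np.asarray([h for h in history if h < t])
    excitation = alpha * np.sum(np.exp(-beta * (t - past))) if past.size else 0.0
    return base + excitation

events = [7.8, 8.3, 19.1]                # earlier action times (hours)
for t in (8.5, 13.0, 19.5):
    print(f"lambda({t:4.1f} h) = {intensity(t, events):.3f}")
```
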
  4. Spectral Resolution and Coverage Impact on Advanced Sounder Information Content

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Liu, Xu; Zhou, Daniel K.; Smith, William L.

    2010-01-01

    Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Achieving such measurement improvements requires instrument system advancements. This presentation focuses on the impact of spectral resolution and coverage changes on remote sensing system information content, with a specific emphasis on thermodynamic state and trace species variables obtainable from advanced atmospheric sounders such as the Infrared Atmospheric Sounding Interferometer (IASI) and Cross-track Infrared Sounder (CrIS) systems on the MetOp and NPP/NPOESS series of satellites. Key words: remote sensing, advanced sounders, information content, IASI, CrIS

  5. Implementation of machine learning for high-volume manufacturing metrology challenges (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Timoney, Padraig; Kagalwala, Taher; Reis, Edward; Lazkani, Houssam; Hurley, Jonathan; Liu, Haibo; Kang, Charles; Isbester, Paul; Yellai, Naren; Shifrin, Michael; Etzioni, Yoav

    2018-03-01

    In recent years, the combination of device scaling, complex 3D device architecture and tightening process tolerances have strained the capabilities of optical metrology tools to meet process needs. Two main categories of approaches have been taken to address the evolving process needs. In the first category, new hardware configurations are developed to provide more spectral sensitivity. Most work in this category aims to enable next-generation optical metrology tools to keep pace with next-generation process needs. In the second category, new innovative algorithms have been pursued to increase the value of the existing measurement signal. These algorithms aim to boost sensitivity to the measurement parameter of interest, while reducing the impact of other factors that contribute to signal variability but are not influenced by the process of interest. This paper will evaluate the suitability of machine learning to address high volume manufacturing metrology requirements in both front end of line (FEOL) and back end of line (BEOL) sectors from advanced technology nodes. In the FEOL sector, initial feasibility has been demonstrated for predicting downstream inline fin CD values using machine learning. In this study, OCD spectra were acquired after an etch process that occurs earlier in the process flow than where the inline CD is measured. The fin hard mask etch process is known to impact the downstream inline CD value. Figure 1 shows the correlation of predicted CD vs downstream inline CD measurement obtained after the training of the machine learning algorithm. For BEOL, machine learning is shown to provide an additional source of information in prediction of electrical resistance from structures that are not compatible with direct copper height measurement. Figure 2 compares the trench height correlation to electrical resistance (Rs) and the correlation of predicted Rs to the e-test Rs value for a far back end of line (FBEOL) metallization level across 3 products. In the case of product C, it is found that the predicted Rs correlation to the e-test value is significantly improved by utilizing spectra acquired at the e-test structure. This paper will explore the considerations required to enable the use of machine-learning-derived metrology output for improved process monitoring and control. Further results from the FEOL and BEOL sectors will be presented, together with further discussion on the future proliferation of machine learning based metrology solutions in high volume manufacturing.

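    The abstract does not name the learning algorithm used, so the sketch below shows the FEOL use case generically: a scikit-learn regressor trained to predict downstream inline fin CD from earlier-step OCD spectra, with synthetic spectra standing in for tool data.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 300 wafers, 150-channel OCD spectra acquired after the
# fin hard-mask etch; the downstream inline CD depends on a few channels.
n_wafers, n_channels = 300, 150
spectra = rng.normal(size=(n_wafers, n_channels))
true_w = np.zeros(n_channels)
true_w[[10, 55, 120]] = [0.8, -0.5, 0.3]
inline_cd = 12.0 + spectra @ true_w + 0.1 * rng.normal(size=n_wafers)  # nm

X_tr, X_te, y_tr, y_te = train_test_split(spectra, inline_cd,
                                          test_size=0.25, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)

pred = model.predict(X_te)
r = np.corrcoef(pred, y_te)[0, 1]
print(f"predicted vs measured inline CD: r = {r:.3f}")
```
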
  6. Sequence fingerprints distinguish erroneous from correct predictions of intrinsically disordered protein regions.

    PubMed

    Saravanan, Konda Mani; Dunker, A Keith; Krishnaswamy, Sankaran

    2017-12-27

    More than 60 prediction methods for intrinsically disordered proteins (IDPs) have been developed over the years, many of which are accessible on the World Wide Web. Nearly all of these predictors give balanced accuracies in the ~65%-~80% range. Since predictors are not perfect, further studies are required to uncover the role of amino acid residues in native as compared with predicted IDP regions. In the present work, we make use of sequences of 100% predicted IDP regions, false-positive disorder predictions, and experimentally determined IDP regions to distinguish the characteristics of native versus predicted IDP regions. A higher occurrence of asparagine is observed in sequences of native IDP regions but not in sequences of false-positive predictions of IDP regions. The occurrences of certain combinations of amino acids at the pentapeptide level provide a distinguishing feature of IDPs with respect to globular proteins. The distinguishing features presented in this paper provide insights into the sequence fingerprints of amino acid residues in experimentally determined as compared with predicted IDP regions. These observations and additional work along these lines should enable improvements in the accuracy of disorder prediction algorithms.

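    The pentapeptide-level bookkeeping described above amounts to sliding a 5-residue window over each sequence set and comparing k-mer frequencies; a toy sketch follows, with invented sequences standing in for the native and false-positive sets.

```python
from collections import Counter

def pentapeptide_counts(sequences, k=5):
    """Count all overlapping k-mers (default pentapeptides) in a sequence set."""
    counts = Counter()
    for seq in sequences:
        for i in range(len(seq) - k + 1):
            counts[seq[i:i + k]] += 1
    return counts

# Toy stand-ins for native disordered regions vs false-positive predictions.
native_idp = ["NNSPQNNSGSEQNNSP", "QQNNPSSNNQQ"]
false_pos  = ["ALKVGDEIRALK", "GDEIKVVAL"]

native = pentapeptide_counts(native_idp)
fp = pentapeptide_counts(false_pos)

# k-mers enriched in the native set relative to the false-positive set.
enriched = {p: c for p, c in native.items() if c > fp.get(p, 0)}
print(sorted(enriched.items(), key=lambda x: -x[1])[:5])

# Single-residue composition shows the asparagine (N) bias noted above.
aa_native = Counter("".join(native_idp))
print("fraction N in native set:", aa_native["N"] / sum(aa_native.values()))
```
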
  7. Towards Cooperative Predictive Data Mining in Competitive Environments

    NASA Astrophysics Data System (ADS)

    Lisý, Viliam; Jakob, Michal; Benda, Petr; Urban, Štěpán; Pěchouček, Michal

    We study the problem of predictive data mining in a competitive multi-agent setting, in which each agent is assumed to have some partial knowledge required for correctly classifying a set of unlabelled examples. The agents are self-interested and therefore need to reason about the trade-offs between increasing their classification accuracy by collaborating with other agents and disclosing their private classification knowledge to other agents through such collaboration. We analyze the problem and propose a set of components which can enable cooperation in this otherwise competitive task. These components include measures for quantifying private knowledge disclosure, data-mining models suitable for multi-agent predictive data mining, and a set of strategies by which agents can improve their classification accuracy through collaboration. The overall framework and its individual components are validated on a synthetic experimental domain.

  8. Predictive factors for pharyngocutaneous fistulization after total laryngectomy: a Dutch Head and Neck Society audit.

    PubMed

    Lansaat, Liset; van der Noort, Vincent; Bernard, Simone E; Eerenstein, Simone E J; Plaat, Boudewijn E C; Langeveld, Ton A P M; Lacko, Martin; Hilgers, Frans J M; de Bree, Remco; Takes, Robert P; van den Brekel, Michiel W M

    2018-03-01

    Incidences of pharyngocutaneous fistulization (PCF) after total laryngectomy (TL) reported in the literature vary widely, ranging from 2.6 to 65.5%. Comparison between different centers might identify risk factors, but might also enable improvements in quality of care. To enable this on a national level, an audit in the 8 principal Dutch Head and Neck Centers (DHNC) was initiated. A retrospective chart review of all 324 patients undergoing laryngectomy in a 2-year (2012 and 2013) period was performed. The overall PCF%, PCF% per center, and factors predictive for PCF were identified. Furthermore, a prognostic model predicting the PCF% per center was developed. To provide additional data, a survey among the head and neck surgeons of the participating centers was carried out. The overall PCF% was 25.9. The multivariable prediction model revealed that previous treatment with (chemo)radiotherapy combined with a long interval between primary treatment and TL, previous tracheotomy, near-total pharyngectomy, neck dissection, and BMI < 18 were the best predictors of PCF. Early oral intake did not influence PCF rate. PCF% varied quite widely between centers, but to a large extent this could be explained by the prediction model. The PCF performance rate (the difference between the PCF% and the predicted PCF%) per DHNC shows, though, that not all differences are explained by factors established in the prediction model. However, these factors explain enough of the differences that, after compensating for them, hospital is no longer independently predictive of PCF. This nationwide audit has provided valid comparative PCF data confirming the known risk factors from the literature, which are important for counseling on PCF risk. The data show that variations in PCF% across the DHNCs are (in part) explainable by variations in these predictive factors. Since elective neck dissection is a major risk factor for PCF, it should only be performed on a well-founded indication.

  9. Improvement of Predictive Ability by Uniform Coverage of the Target Genetic Space

    PubMed Central

    Bustos-Korts, Daniela; Malosetti, Marcos; Chapman, Scott; Biddulph, Ben; van Eeuwijk, Fred

    2016-01-01

    Genome-enabled prediction provides breeders with the means to increase the number of genotypes that can be evaluated for selection. One of the major challenges in genome-enabled prediction is how to construct a training set of genotypes from a calibration set that represents the target population of genotypes, where the calibration set is composed of a training and validation set. A random sampling protocol of genotypes from the calibration set will lead to low quality coverage of the total genetic space by the training set when the calibration set contains population structure. As a consequence, predictive ability will be affected negatively, because some parts of the genotypic diversity in the target population will be under-represented in the training set, whereas other parts will be over-represented. Therefore, we propose a training set construction method that uniformly samples the genetic space spanned by the target population of genotypes, thereby increasing predictive ability. To evaluate our method, we constructed training sets alongside with the identification of corresponding genomic prediction models for four genotype panels that differed in the amount of population structure they contained (maize Flint, maize Dent, wheat, and rice). Training sets were constructed using uniform sampling, stratified-uniform sampling, stratified sampling and random sampling. We compared these methods with a method that maximizes the generalized coefficient of determination (CD). Several training set sizes were considered. We investigated four genomic prediction models: multi-locus QTL models, GBLUP models, combinations of QTL and GBLUPs, and Reproducing Kernel Hilbert Space (RKHS) models. For the maize and wheat panels, construction of the training set under uniform sampling led to a larger predictive ability than under stratified and random sampling. The results of our methods were similar to those of the CD method. For the rice panel, all training set construction methods led to similar predictive ability, a reflection of the very strong population structure in this panel. PMID:27672112

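    The paper's uniform-sampling construction is specific, but a generic greedy maximin selection pursues the same goal, spreading training genotypes evenly over the genetic space (here assumed to be summarized by leading marker principal components); the sketch below uses invented data.

```python
import numpy as np

def maximin_training_set(coords, n_train, seed=0):
    """Greedy maximin selection: each new genotype is the one farthest
    from the current training set, spreading coverage over the space.

    coords : (n_genotypes, n_dims) positions, e.g. leading marker PCs
    """
    rng = np.random.default_rng(seed)
    n = coords.shape[0]
    chosen = [int(rng.integers(n))]
    # Distance from every genotype to its nearest chosen genotype.
    d = np.linalg.norm(coords - coords[chosen[0]], axis=1)
    for _ in range(n_train - 1):
        nxt = int(np.argmax(d))
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(coords - coords[nxt], axis=1))
    return chosen

# Toy calibration set with two subpopulations (population structure).
rng = np.random.default_rng(1)
pcs = np.vstack([rng.normal(0, 1, (150, 2)), rng.normal(5, 1, (50, 2))])
train = maximin_training_set(pcs, n_train=30)
# Both clusters end up represented, unlike a purely random draw.
print("selected from small cluster:", sum(i >= 150 for i in train))
```
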
  10. Airspace Systems Program: Next Generation Air Transportation System, NextGen Systems Analysis, Integration and Evaluation Project. Version 1.0; Project Plan

    NASA Technical Reports Server (NTRS)

    Quon, Leighton

    2010-01-01

    The key objectives of the NASA ASP are to: Improve mobility, capacity, efficiency, and access of the airspace system. Improve collaboration, predictability, and flexibility for airspace users. Enable accurate modeling and simulation of air transportation systems. Accommodate operations of all classes of aircraft. Maintain system safety and environmental protection. In support of these program objectives, the major goal of the NextGen-SAIE Project is to enable the transition of key capacity and efficiency improvements to the NAS. Since many aspects of the NAS are unique to specific airport or airspace environments, demand on various parts of the NAS is not expected to increase equally as system demand grows. SAIE will provide systems-level analysis of NAS characteristics, constraints, and demands such that a suite of capacity-increasing concepts and technologies for system solutions is enabled and facilitated. The technical objectives in support of this goal are the following: Integration, evaluation, and transition of more mature concepts and technologies in an environment that faithfully emulates real-world complexities. Interoperability research and analysis of ASP technologies across ATM functions is performed to facilitate integration and take ASP concepts and technologies to higher Technology Readiness Levels (TRL). Analyses are conducted on the program's concepts to identify system benefits or impacts. System-level analysis is conducted to increase understanding of the characteristics and constraints of the airspace system and its domains.

  11. Development of a New Time-Resolved Laser-Induced Fluorescence Technique

    NASA Astrophysics Data System (ADS)

    Durot, Christopher; Gallimore, Alec

    2012-10-01

    We are developing a time-resolved laser-induced fluorescence (LIF) technique to interrogate the ion velocity distribution function (VDF) of EP thruster plumes down to the microsecond time scale. Better measurements of dynamic plasma processes will lead to improvements in simulation and prediction of thruster operation and erosion. We present the development of the new technique and results of initial tests. Signal-to-noise ratio (SNR) is often a challenge for LIF studies, and it is only more challenging for time-resolved measurements since a lock-in amplifier cannot be used with a long time constant. The new system uses laser modulation on the order of MHz, which enables the use of electronic filtering and phase-sensitive detection to improve SNR while preserving time-resolved information. Statistical averaging over many cycles to further improve SNR is done in the frequency domain. This technique can have significant advantages, including (1) larger spatial maps enabled by shorter data acquisition time and (2) the ability to average data without creating a phase reference by modifying the thruster operating condition with a periodic cutoff in discharge current, which can modify the ion velocity distribution.

  12. Robust Online Monitoring for Calibration Assessment of Transmitters and Instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Coble, Jamie B.; Shumaker, Brent

    Robust online monitoring (OLM) technologies are expected to enable the extension or elimination of periodic sensor calibration intervals in operating and new reactors. These advances in OLM technologies will improve the safety and reliability of current and planned nuclear power systems through improved accuracy and increased reliability of sensors used to monitor key parameters. In this article, we present an overview of research being performed within the Nuclear Energy Enabling Technologies (NEET)/Advanced Sensors and Instrumentation (ASI) program on the development of OLM algorithms that use sensor outputs, in combination with other available information, to 1) determine whether one or more sensors are out of calibration or failing and 2) replace a failing sensor with reliable, accurate sensor outputs. Algorithm development is focused on the following OLM functions:
    • Signal validation
    • Virtual sensing
    • Sensor response-time assessment
    These algorithms incorporate, at their base, a Gaussian Process-based uncertainty quantification (UQ) method. Various plant models (using kernel regression, GP, or hierarchical models) may be used to predict sensor responses under various plant conditions. These predicted responses can then be applied in fault detection (sensor output and response time) and in computing the correct value (virtual sensing) of a failing physical sensor. The methods being evaluated in this work can compute confidence levels along with the predicted sensor responses and, as a result, may have the potential to compensate for sensor drift in real time (online recalibration). Evaluation was conducted using data from multiple sources (laboratory flow loops and plant data). Ongoing research in this project is focused on further evaluation of the algorithms, optimization for accuracy and computational efficiency, and integration into a suite of tools for robust OLM that are applicable to monitoring sensor calibration state in nuclear power plants.

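    A minimal scikit-learn sketch of the Gaussian Process idea described above: learn a sensor's expected response from a correlated plant signal during a healthy period, flag readings outside the predictive band as possible drift, and use the GP mean as a virtual-sensor estimate. The data, kernel, and thresholds are illustrative assumptions, not the program's algorithms.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Training data from a healthy period: a correlated plant signal (e.g.
# normalized power level) and the monitored sensor's reading (illustrative).
x_train = rng.uniform(0.0, 1.0, size=(200, 1))
y_train = 300.0 + 40.0 * x_train.ravel() + 0.5 * rng.normal(size=200)

gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(0.25),
                              normalize_y=True).fit(x_train, y_train)

# New observations: the last reading simulates a drifted (failing) sensor.
x_new = np.array([[0.2], [0.5], [0.8]])
y_new = np.array([308.1, 320.3, 337.0])   # 337.0 is ~5 degC high

mean, std = gp.predict(x_new, return_std=True)
for xi, yi, m, s in zip(x_new.ravel(), y_new, mean, std):
    flag = "DRIFT?" if abs(yi - m) > 3.0 * s else "ok"
    # The GP mean doubles as the virtual-sensor estimate if the flag trips.
    print(f"x={xi:.1f} read={yi:.1f} expect={m:.1f}+/-{s:.2f} -> {flag}")
```
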
  13. High Performance Hydrometeorological Modeling, Land Data Assimilation and Parameter Estimation with the Land Information System at NASA/GSFC

    NASA Astrophysics Data System (ADS)

    Peters-Lidard, C. D.; Kumar, S. V.; Santanello, J. A.; Tian, Y.; Rodell, M.; Mocko, D.; Reichle, R.

    2008-12-01

    The Land Information System (LIS; http://lis.gsfc.nasa.gov; Kumar et al., 2006; Peters-Lidard et al., 2007) is a flexible land surface modeling framework that has been developed with the goal of integrating satellite- and ground-based observational data products and advanced land surface modeling techniques to produce optimal fields of land surface states and fluxes. The LIS software was the co-winner of NASA's 2005 Software of the Year award. LIS facilitates the integration of observations from Earth-observing systems and predictions and forecasts from Earth System and Earth science models into the decision-making processes of partnering agency and national organizations. Due to its flexible software design, LIS can serve both as a Problem Solving Environment (PSE) for hydrologic research to enable accurate global water and energy cycle predictions, and as a Decision Support System (DSS) to generate useful information for application areas including disaster management, water resources management, agricultural management, numerical weather prediction, air quality and military mobility assessment. LIS has evolved from two earlier efforts - North American Land Data Assimilation System (NLDAS; Mitchell et al. 2004) and Global Land Data Assimilation System (GLDAS; Rodell et al. 2004) that focused primarily on improving numerical weather prediction skills by improving the characterization of the land surface conditions. Both of these systems, now use specific configurations of the LIS software in their current implementations. LIS not only consolidates the capabilities of these two systems, but also enables a much larger variety of configurations with respect to horizontal spatial resolution, input datasets and choice of land surface model through 'plugins'. In addition to these capabilities, LIS has also been demonstrated for parameter estimation (Peters-Lidard et al., 2008; Santanello et al., 2007) and data assimilation (Kumar et al., 2008). Examples and case studies demonstrating the capabilities and impacts of LIS for hydrometeorological modeling, land data assimilation and parameter estimation will be presented.

  14. Genomic-Enabled Prediction of Ordinal Data with Bayesian Logistic Ordinal Regression.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Crossa, José; Burgueño, Juan; Eskridge, Kent

    2015-08-18

    Most genomic-enabled prediction models developed so far assume that the response variable is continuous and normally distributed. The exception is the probit model, developed for ordered categorical phenotypes. In statistical applications, because of the easy implementation of the Bayesian probit ordinal regression (BPOR) model, Bayesian logistic ordinal regression (BLOR) is rarely implemented in the context of genomic-enabled prediction [sample size (n) is much smaller than the number of parameters (p)]. For this reason, in this paper we propose a BLOR model using the Pólya-Gamma data augmentation approach that produces a Gibbs sampler with full conditional distributions similar to those of the BPOR model, with the advantage that the BPOR model is a particular case of the BLOR model. We evaluated the proposed model using simulation and two real data sets. Results indicate that our BLOR model is a good alternative for analyzing ordinal data in the context of genomic-enabled prediction with the probit or logit link. Copyright © 2015 Montesinos-López et al.

  15. KSC-IMG_6548re

    NASA Image and Video Library

    2010-03-04

    Cape Canaveral AFS, Fla. - A United Launch Alliance Delta IV rocket sits poised on its launch pad with the NASA/NOAA Geostationary Operational Environmental Satellite P (GOES P) at Space Launch Complex-37. GOES P will provide NOAA and NASA scientists with data to support weather, solar and space operations, and will enable future science improvements in weather prediction and remote sensing. Additionally, GOES-P will provide data on global climate changes and capability for search and rescue. Photo credit: Carleton Bailie, The Boeing Company

  16. The characteristic ultrasound features of specific types of ovarian pathology (Review)

    PubMed Central

    SAYASNEH, AHMAD; EKECHI, CHRISTINE; FERRARA, LAURA; KAIJSER, JEROEN; STALDER, CATRIONA; SUR, SHYAMALY; TIMMERMAN, DIRK; BOURNE, TOM

    2015-01-01

    Characterizing ovarian masses enables patients with malignancy to be appropriately triaged for treatment by subspecialist gynecological oncologists, which has been shown to optimize care and improve survival. Furthermore, correctly classifying benign masses facilitates the selection of patients with ovarian pathology who may either not require intervention, or be suitable for minimal access surgery if intervention is required. However, predicting whether a mass is benign or malignant is not the only clinically relevant information needed before deciding on appropriate treatment. Knowing the specific histology of a mass is becoming increasingly important as management options become more tailored to the individual patient. For example, predicting a mucinous borderline tumor creates the opportunity for fertility-sparing surgery and highlights the need for further gastrointestinal assessment. For benign disease, predicting the presence of an endometrioma and possible deeply infiltrating endometriosis is important when considering both who should perform the surgery and its extent. An examiner's subjective assessment of the morphological and vascular features of a mass using ultrasonography has been shown to be highly effective for predicting whether a mass is benign or malignant. Many masses also have features that enable a reliable diagnosis of the specific pathology of a particular mass to be made. In this narrative review we aim to describe the typical morphological features seen on ultrasound of different adnexal masses and illustrate these by showing representative ultrasound images. PMID:25406094

  17. A statistical model for predicting muscle performance

    NASA Astrophysics Data System (ADS)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized in both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th-order autoregressive (AR) model and statistical regression analysis. It was determined that an AR-derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-spaceflight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned in ways that enhance astronaut performance and safety. Potential ground-based medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing industry safety guidelines for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing injury.

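    The AR-pole feature described above can be sketched directly: fit a 5th-order AR model to an SEMG window by least squares and take the mean magnitude of the roots of the characteristic polynomial. The signal below is synthetic, and the fitting details are assumptions rather than the study's exact procedure.

```python
import numpy as np

def mean_ar_pole_magnitude(x, order=5):
    """Fit an AR(order) model by least squares and return the mean
    magnitude of its poles (roots of the characteristic polynomial)."""
    n = len(x)
    # Design matrix of lagged samples: x[t] ~ a1*x[t-1] + ... + ap*x[t-p]
    X = np.column_stack([x[order - k - 1:n - k - 1] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    # Poles are the roots of z^p - a1*z^(p-1) - ... - ap = 0.
    poles = np.roots(np.concatenate(([1.0], -a)))
    return np.abs(poles).mean()

# Synthetic SEMG-like window: band-limited noise.
rng = np.random.default_rng(0)
w = rng.normal(size=2048)
semg = np.convolve(w, np.ones(5) / 5.0, mode="valid")
print("mean AR pole magnitude:", mean_ar_pole_magnitude(semg))
```
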
  18. Integrating Milk Metabolite Profile Information for the Prediction of Traditional Milk Traits Based on SNP Information for Holstein Cows

    PubMed Central

    Melzer, Nina; Wittenburg, Dörte; Repsilber, Dirk

    2013-01-01

    In this study the benefit of metabolome-level analysis for the prediction of the genetic value of three traditional milk traits was investigated. Our proposed approach consists of three steps: First, milk metabolite profiles are used to predict three traditional milk traits of 1,305 Holstein cows. Two regression methods, both enabling variable selection, are applied to identify important milk metabolites in this step. Second, the prediction of these important milk metabolites from single nucleotide polymorphisms (SNPs) enables the detection of SNPs with significant genetic effects. Finally, these SNPs are used to predict the milk traits. The observed precision of the predicted genetic values was compared to the results observed for the classical genotype-phenotype prediction using all SNPs or a reduced SNP subset (reduced classical approach). To enable a comparison between SNP subsets, a special invariable evaluation design was implemented. SNPs close to or within known quantitative trait loci (QTL) were determined. This enabled us to determine whether the detected important SNP subsets were enriched in these regions. The results show that our approach can lead to genetic value prediction while requiring less than 1% of the total of 40,317 SNPs. Moreover, significantly more important SNPs in known QTL regions were detected using our approach than with the reduced classical approach. In conclusion, our approach allows a deeper insight into the associations between the different levels of the genotype-phenotype map (genotype-metabolome, metabolome-phenotype, genotype-phenotype). PMID:23990900

  19. Predictability of Top of Descent Location for Operational Idle-Thrust Descents

    NASA Technical Reports Server (NTRS)

    Stell, Laurel L.

    2010-01-01

    To enable arriving aircraft to fly optimized descents computed by the flight management system (FMS) in congested airspace, ground automation must accurately predict descent trajectories. To support development of the trajectory predictor and its uncertainty models, commercial flights executed idle-thrust descents at a specified descent speed, and the recorded data included the specified descent speed profile, aircraft weight, and the winds entered into the FMS as well as the radar data. The FMS computed the intended descent path assuming idle thrust after top of descent (TOD), and the controllers and pilots then endeavored to allow the FMS to fly the descent to the meter fix with minimal human intervention. The horizontal flight path, cruise and meter fix altitudes, and actual TOD location were extracted from the radar data. Using approximately 70 descents each in Boeing 757 and Airbus 319/320 aircraft, multiple regression estimated TOD location as a linear function of the available predictive factors. The cruise and meter fix altitudes, descent speed, and wind clearly improve goodness of fit. The aircraft weight improves fit for the Airbus descents but not for the B757. Except for a few statistical outliers, the residuals have absolute value less than 5 nmi. Thus, these predictive factors adequately explain the TOD location, which indicates the data do not include excessive noise.
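
    The estimation described above is ordinary multiple linear regression. A minimal sketch, assuming a predictor matrix whose columns hold quantities such as cruise altitude, meter-fix altitude, descent speed, wind, and weight (names illustrative):

```python
# Sketch: linear regression of TOD location on candidate predictors.
import numpy as np

def fit_tod_model(X, tod_nmi):
    """X: (n, p) matrix of predictors; tod_nmi: observed TOD distances."""
    A = np.column_stack([np.ones(len(X)), X])        # add intercept column
    coef, *_ = np.linalg.lstsq(A, tod_nmi, rcond=None)
    resid = tod_nmi - A @ coef
    return coef, resid

# Residuals with |r| > 5 nmi would flag the statistical outliers noted above.
```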

  20. A Binaural Grouping Model for Predicting Speech Intelligibility in Multitalker Environments

    PubMed Central

    Colburn, H. Steven

    2016-01-01

    Spatially separating speech maskers from target speech often leads to a large intelligibility improvement. Modeling this phenomenon has long been of interest to binaural-hearing researchers for uncovering brain mechanisms and for improving signal-processing algorithms in hearing-assistive devices. Much of the previous binaural modeling work focused on the unmasking enabled by binaural cues at the periphery, and little quantitative modeling has been directed toward the grouping or source-separation benefits of binaural processing. In this article, we propose a binaural model that focuses on grouping, specifically on the selection of time-frequency units that are dominated by signals from the direction of the target. The proposed model uses Equalization-Cancellation (EC) processing with a binary decision rule to estimate a time-frequency binary mask. EC processing is carried out to cancel the target signal and the energy change between the EC input and output is used as a feature that reflects target dominance in each time-frequency unit. The processing in the proposed model requires little computational resources and is straightforward to implement. In combination with the Coherence-based Speech Intelligibility Index, the model is applied to predict the speech intelligibility data measured by Marrone et al. The predicted speech reception threshold matches the pattern of the measured data well, even though the predicted intelligibility improvements relative to the colocated condition are larger than some of the measured data, which may reflect the lack of internal noise in this initial version of the model. PMID:27698261
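
    A toy version of the energy-change feature can be written directly on left/right STFTs. The sketch below cancels an assumed target direction with a delay-and-subtract step and thresholds the input/output energy ratio to form the binary mask; the STFT framing, the target delay, and the 6 dB threshold are illustrative assumptions, not the model's calibrated EC processing.

```python
# Illustrative sketch of an EC-style energy-change feature and binary mask.
import numpy as np

def ec_binary_mask(L, R, target_delay, threshold_db=6.0):
    """L, R: complex STFTs (freq x time); target_delay in samples (assumed)."""
    n_fft = 2 * (L.shape[0] - 1)                      # assumed FFT length
    k = np.arange(L.shape[0])[:, None]                # frequency-bin index
    phase = np.exp(-2j * np.pi * k * target_delay / n_fft)
    out = L * phase - R                               # cancel the target direction
    e_in = np.abs(L) ** 2 + np.abs(R) ** 2
    e_out = np.abs(out) ** 2
    # Large energy drop after cancellation => unit dominated by the target
    change_db = 10 * np.log10((e_in + 1e-12) / (e_out + 1e-12))
    return change_db > threshold_db                   # True = keep the unit
```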

  1. A Binaural Grouping Model for Predicting Speech Intelligibility in Multitalker Environments.

    PubMed

    Mi, Jing; Colburn, H Steven

    2016-10-03

    Spatially separating speech maskers from target speech often leads to a large intelligibility improvement. Modeling this phenomenon has long been of interest to binaural-hearing researchers for uncovering brain mechanisms and for improving signal-processing algorithms in hearing-assistive devices. Much of the previous binaural modeling work focused on the unmasking enabled by binaural cues at the periphery, and little quantitative modeling has been directed toward the grouping or source-separation benefits of binaural processing. In this article, we propose a binaural model that focuses on grouping, specifically on the selection of time-frequency units that are dominated by signals from the direction of the target. The proposed model uses Equalization-Cancellation (EC) processing with a binary decision rule to estimate a time-frequency binary mask. EC processing is carried out to cancel the target signal and the energy change between the EC input and output is used as a feature that reflects target dominance in each time-frequency unit. The processing in the proposed model requires little computational resources and is straightforward to implement. In combination with the Coherence-based Speech Intelligibility Index, the model is applied to predict the speech intelligibility data measured by Marrone et al. The predicted speech reception threshold matches the pattern of the measured data well, even though the predicted intelligibility improvements relative to the colocated condition are larger than some of the measured data, which may reflect the lack of internal noise in this initial version of the model. © The Author(s) 2016.

  2. Microwave Atmospheric Sounder on CubeSat

    NASA Astrophysics Data System (ADS)

    Padmanabhan, S.; Brown, S. E.; Kangaslahti, P.; Cofield, R.; Russell, D.; Stachnik, R. A.; Su, H.; Wu, L.; Tanelli, S.; Niamsuwan, N.

    2014-12-01

    To accurately predict how the distribution of extreme events may change in the future, we need to understand the mechanisms that influence such events in our current climate. Our current observing system is not well suited to observing extreme events globally, due to the sparse sampling and inhomogeneity of ground-based in-situ observations and the infrequent revisit time of satellite observations. Observations of weather extremes, such as extreme precipitation events, temperature extremes, and tropical and extra-tropical cyclones, with temporal resolution on the order of minutes and spatial resolution on the order of a few kilometers (<10 km), are required for improved forecasting of extreme weather events. We envision a suite of low-cost passive microwave sounding and imaging sensors on CubeSats that would work in concert with traditional flagship observational systems, such as those manifested on large environmental satellites (e.g., JPSS, WSF, GCOM-W), to monitor weather extremes. A 118/183 GHz sensor would enable observations of temperature and precipitation extremes over land and ocean, as well as of tropical and extra-tropical cyclones. This proposed project would enable low-cost, compact radiometer instrumentation at 118 and 183 GHz that would fit in a 6U CubeSat, with the objective of mass-producing this design to enable a suite of small satellites to image the key geophysical parameters needed to improve prediction of extreme weather events. We take advantage of past and current technology developments at JPL, viz. HAMSR (High Altitude Microwave Scanning Radiometer) and Advanced Component Technology (ACT'08), to enable low-mass, low-power high-frequency airborne radiometers. In this paper, we describe the design and implementation of the 118 GHz temperature sounder and 183 GHz humidity sounder on the 6U CubeSat. In addition, a summary of radiometer calibration and retrieval techniques for temperature and humidity will be discussed. The successful demonstration of this instrument on the 6U CubeSat would pave the way for the development of a constellation that could sample tropospheric temperature and humidity with fine temporal and spatial resolution.

  3. Airborne Deployment and Calibration of Microwave Atmospheric Sounder on 6U CubeSat

    NASA Astrophysics Data System (ADS)

    Padmanabhan, S.; Brown, S. T.; Lim, B.; Kangaslahti, P.; Russell, D.; Stachnik, R. A.

    2015-12-01

    To accurately predict how the distribution of extreme events may change in the future, we need to understand the mechanisms that influence such events in our current climate. Our current observing system is not well suited to observing extreme events globally, due to the sparse sampling and inhomogeneity of ground-based in-situ observations and the infrequent revisit time of satellite observations. Observations of weather extremes, such as extreme precipitation events, temperature extremes, and tropical and extra-tropical cyclones, with temporal resolution on the order of minutes and spatial resolution on the order of a few kilometers (<10 km), are required for improved forecasting of extreme weather events. We envision a suite of low-cost passive microwave sounding and imaging sensors on CubeSats that would work in concert with traditional flagship observational systems, such as those manifested on large environmental satellites (e.g., JPSS, WSF, GCOM-W), to monitor weather extremes. A 118/183 GHz sensor would enable observations of temperature and precipitation extremes over land and ocean, as well as of tropical and extra-tropical cyclones. This proposed project would enable low-cost, compact radiometer instrumentation at 118 and 183 GHz that would fit in a 6U CubeSat, with the objective of mass-producing this design to enable a suite of small satellites to image the key geophysical parameters needed to improve prediction of extreme weather events. We take advantage of past and current technology developments at JPL, viz. HAMSR (High Altitude Microwave Scanning Radiometer) and Advanced Component Technology (ACT'08), to enable low-mass, low-power high-frequency airborne radiometers. In this paper, we describe the design and implementation of the 118 GHz temperature sounder and 183 GHz humidity sounder on the 6U CubeSat. In addition, we will discuss the maiden airborne deployment of the instrument during the Plain Elevated Convection at Night (PECAN) experiment. The successful demonstration of this instrument on the 6U CubeSat would pave the way for the development of a constellation that could sample tropospheric temperature and humidity with fine temporal and spatial resolution.

  4. Prediction of survival without morbidity for infants born at under 33 weeks gestational age: a user-friendly graphical tool.

    PubMed

    Shah, Prakesh S; Ye, Xiang Y; Synnes, Anne; Rouvinez-Bouali, Nicole; Yee, Wendy; Lee, Shoo K

    2012-03-01

    To develop models and a graphical tool for predicting survival to discharge without major morbidity for infants with a gestational age (GA) at birth of 22-32 weeks, using infant information at birth. Retrospective cohort study. Canadian Neonatal Network data for 2003-2008 were utilised. Neonates born between 22 and 32 weeks gestation admitted to neonatal intensive care units in Canada. Survival to discharge without major morbidity was defined as survival without severe neurological injury (intraventricular haemorrhage grade 3 or 4 or periventricular leukomalacia), severe retinopathy (stage 3 or higher), necrotising enterocolitis (stage 2 or 3) or chronic lung disease. Of the 17 148 neonates who met the eligibility criteria, 65% survived without major morbidity. Sex and GA at birth were significant predictors. Birth weight (BW) had a significant but non-linear effect on survival without major morbidity. Although maternal characteristics, such as steroid use, improved the prediction of survival without major morbidity, sex, GA at birth and BW for GA predicted survival without major morbidity almost as accurately (area under the curve: 0.84). The graphical tool based on the models showed how GA and BW for GA interact to enable prediction of outcomes, especially for small and large for GA infants. This graphical tool provides an improved and easily interpretable method to predict survival without major morbidity for very preterm infants at the time of birth. These curves are especially useful for small and large for GA infants.

  5. Tracking, sensing and predicting flood wave propagation using nomadic satellite communication systems and hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Hostache, R.; Matgen, P.; Giustarini, L.; Tailliez, C.; Iffly, J.-F.

    2011-11-01

    The main objective of this study is to contribute to the development and the improvement of flood forecasting systems. Since hydrometric stations are often poorly distributed for monitoring the propagation of extreme flood waves, the study aims at evaluating the hydrometric value of the Global Navigation Satellite System (GNSS). Integrated with satellite telecommunication systems, drifting or anchored floaters equipped with navigation systems such as GPS and Galileo, enable the quasi-continuous measurement and near real-time transmission of water level and flow velocity data, from virtually any point in the world. The presented study investigates the effect of assimilating GNSS-derived water level and flow velocity measurements into hydraulic models in order to reduce the associated predictive uncertainty.

  6. Method for Prediction of the Power Output from Photovoltaic Power Plant under Actual Operating Conditions

    NASA Astrophysics Data System (ADS)

    Obukhov, S. G.; Plotnikov, I. A.; Surzhikova, O. A.; Savkin, K. D.

    2017-04-01

    Solar photovoltaic technology is one of the most rapidly growing renewable sources of electricity, with practical applications in various fields of human activity due to its high availability, huge potential, and environmental compatibility. An original simulation model of a photovoltaic power plant has been developed to simulate and investigate the plant's operating modes under actual operating conditions. The proposed model accounts for the impact of external climatic factors on the energy characteristics of the solar panels, which improves the accuracy of power output prediction. The data obtained by simulating photovoltaic power plant operation enable a well-reasoned choice of the required storage capacity and the determination of rational control algorithms for the energy complex.
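
    A common first-order way to fold climatic factors into an output estimate (not necessarily the authors' exact model) combines irradiance scaling with a temperature derate via the NOCT cell-temperature approximation; all parameter values below are illustrative.

```python
# Sketch: weather-dependent PV output from irradiance and air temperature.
def pv_power(g_wm2, t_air_c, p_stc_w=100e3, gamma=-0.004, noct_c=45.0):
    """g_wm2: plane-of-array irradiance (W/m^2); t_air_c: air temp (deg C);
    gamma: power temperature coefficient (1/K); all values assumed."""
    t_cell = t_air_c + g_wm2 * (noct_c - 20.0) / 800.0   # NOCT cell-temp model
    return p_stc_w * (g_wm2 / 1000.0) * (1.0 + gamma * (t_cell - 25.0))

print(pv_power(800.0, 30.0))  # derated output on a hot, bright day
```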

  7. Improved safety of retinal photocoagulation with a shaped beam and modulated pulse

    NASA Astrophysics Data System (ADS)

    Sramek, Christopher; Brown, Jefferson; Paulus, Yannis M.; Nomoto, Hiroyuki; Palanker, Daniel

    2010-02-01

    Shorter pulse durations help confine thermal damage during retinal photocoagulation, decrease treatment time, and minimize pain. However, the safe therapeutic window (the ratio of threshold powers for rupture and mild coagulation) decreases with shorter exposures. A ring-shaped beam enables safer photocoagulation than conventional beams by reducing the maximum temperature in the center of the spot. Similarly, temporal modulation of the pulse, decreasing its power over time, improves safety by maintaining a constant temperature for a significant portion of the pulse. Optimization of the beam and pulse shapes was performed using a computational model. In vivo experiments were performed to verify the predicted improvement. With each of these approaches, the pulse duration can be decreased by a factor of two, from 20 ms down to 10 ms, while maintaining the same therapeutic window.

  8. Ionosphere monitoring and forecast activities within the IAG working group "Ionosphere Prediction"

    NASA Astrophysics Data System (ADS)

    Hoque, Mainul; Garcia-Rigo, Alberto; Erdogan, Eren; Cueto Santamaría, Marta; Jakowski, Norbert; Berdermann, Jens; Hernandez-Pajares, Manuel; Schmidt, Michael; Wilken, Volker

    2017-04-01

    Ionospheric disturbances can affect technologies in space and on Earth, disrupting satellite and airline operations, communications networks, and navigation systems. As the world becomes ever more dependent on these technologies, ionospheric disturbances, as part of space weather, pose an increasing risk to economic vitality and national security. Therefore, knowledge of the ionospheric state in advance during space weather events is becoming more and more important. To promote scientific cooperation we recently formed a Working Group (WG) called "Ionosphere Predictions" within the International Association of Geodesy (IAG) under Sub-Commission 4.3 "Atmosphere Remote Sensing" of Commission 4 "Positioning and Applications". The general objective of the WG is to promote the development of ionosphere prediction algorithms/models based on the dependence of ionospheric characteristics on solar and magnetic conditions, combining data from different sensors to improve spatial and temporal resolution and sensitivity by taking advantage of different sounding geometries and latencies. The work presented here enables comparison of total electron content (TEC) prediction approaches/results from the different centers contributing to this WG: German Aerospace Center (DLR), Universitat Politècnica de Catalunya (UPC), Technische Universität München (TUM) and GMV. DLR developed a model-assisted TEC forecast algorithm that takes advantage of current trends in the TEC behavior at each grid point. Since this approach may fail during perturbations characterized by large TEC fluctuations or ionization fronts, the trend information is merged with the current background model, which provides a stable climatological TEC behavior. The presented solution is a first step toward regularly providing forecast TEC services via SWACI/IMPC by DLR. The UPC forecast model is based on applying linear regression to a temporal window of TEC maps in the Discrete Cosine Transform (DCT) domain. Performance tests are currently being conducted to improve UPC's predicted products for 1 and 2 days ahead. In addition, UPC is working to enable short-term predictions based on UPC real-time GIMs (labelled URTG) and to implement an improved prediction approach. TUM developed a forecast method based on a time series analysis of TEC products, which are either B-spline coefficients estimated by a Kalman filter or TEC grid maps derived from the B-spline coefficients. The forecast method uses a Fourier series expansion to extract trend functions from the estimated TEC product; these trend functions are then extrapolated to provide predicted TEC products. The forecast algorithm developed by GMV is based on estimating the ionospheric delay from previous epochs using GNSS data and on the main dependence of ionospheric delays on solar and magnetic conditions. Since ionospheric behavior is highly dependent on the region of the Earth, different region-based algorithmic modifications have been implemented in GMV's magicSBAS ionospheric algorithms to estimate and forecast ionospheric delays worldwide. The different TEC prediction approaches outlined here will help advance the forecasting of ionospheric ionization.
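
    To make the DCT-domain idea concrete, the sketch below transforms a short window of TEC maps, linearly extrapolates the largest coefficients one step ahead, and inverts the transform; the window length, the number of retained coefficients, and the per-coefficient linear fit are illustrative assumptions rather than UPC's operational settings.

```python
# Sketch: one-step TEC map forecast via linear extrapolation in the DCT domain.
import numpy as np
from scipy.fft import dctn, idctn

def forecast_tec(maps, keep=16):
    """maps: (t, lat, lon) array of past TEC maps; returns the next map."""
    coeffs = np.stack([dctn(m, norm="ortho") for m in maps])
    t = np.arange(len(maps))
    flat = coeffs.reshape(len(maps), -1)
    pred = np.zeros_like(coeffs[0])
    # Linear fit per retained coefficient, evaluated one step ahead
    for idx in np.argsort(np.abs(flat).mean(axis=0))[::-1][:keep]:
        slope, intercept = np.polyfit(t, flat[:, idx], 1)
        pred.flat[idx] = slope * len(maps) + intercept
    return idctn(pred, norm="ortho")
```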

  9. Predicting Human Preferences Using the Block Structure of Complex Social Networks

    PubMed Central

    Guimerà, Roger; Llorente, Alejandro; Moro, Esteban; Sales-Pardo, Marta

    2012-01-01

    With ever-increasing available data, predicting individuals' preferences and helping them locate the most relevant information has become a pressing need. Understanding and predicting preferences is also important from a fundamental point of view, as part of what has been called a "new" computational social science. Here, we propose a novel approach based on stochastic block models, which have been developed by sociologists as plausible models of complex networks of social interactions. Our model is in the spirit of predicting individuals' preferences based on the preferences of others but, rather than fitting a particular model, we rely on a Bayesian approach that samples over the ensemble of all possible models. We show that our approach is considerably more accurate than leading recommender algorithms, with major relative improvements between 38% and 99% over industry-level algorithms. Moreover, our approach sheds light on decision-making processes by identifying groups of individuals that have consistently similar preferences, and by enabling the analysis of the characteristics of those groups. PMID:22984533

  10. Computational predictions of energy materials using density functional theory

    NASA Astrophysics Data System (ADS)

    Jain, Anubhav; Shin, Yongwoo; Persson, Kristin A.

    2016-01-01

    In the search for new functional materials, quantum mechanics is an exciting starting point. The fundamental laws that govern the behaviour of electrons have the possibility, at the other end of the scale, to predict the performance of a material for a targeted application. In some cases, this is achievable using density functional theory (DFT). In this Review, we highlight DFT studies predicting energy-related materials that were subsequently confirmed experimentally. The attributes and limitations of DFT for the computational design of materials for lithium-ion batteries, hydrogen production and storage materials, superconductors, photovoltaics and thermoelectric materials are discussed. In the future, we expect that the accuracy of DFT-based methods will continue to improve and that growth in computing power will enable millions of materials to be virtually screened for specific applications. Thus, these examples represent a first glimpse of what may become a routine and integral step in materials discovery.

  11. Monte Carlo modeling of atomic oxygen attack of polymers with protective coatings on LDEF

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Degroh, Kim K.; Sechkar, Edward A.

    1992-01-01

    Characterization of the behavior of atomic oxygen interaction with materials on the Long Duration Exposure Facility (LDEF) will assist in understanding the mechanisms involved, and will lead to improved reliability in predicting in-space durability of materials based on ground laboratory testing. A computational simulation of atomic oxygen interaction with protected polymers was developed using Monte Carlo techniques. Through the use of assumed mechanistic behavior of atomic oxygen and results of both ground laboratory and LDEF data, a predictive Monte Carlo model was developed which simulates the oxidation processes that occur on polymers with applied protective coatings that have defects. The use of high atomic oxygen fluence-directed ram LDEF results has enabled mechanistic implications to be made by adjusting Monte Carlo modeling assumptions to match observed results based on scanning electron microscopy. Modeling assumptions, implications, and predictions are presented, along with comparison of observed ground laboratory and LDEF results.
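
    The flavor of such a simulation can be conveyed with a toy 2-D lattice model: atoms enter through a coating defect and random-walk until they react with, and remove, a polymer cell. The geometry, the single-pinhole defect, and the reaction probability below are illustrative assumptions, not the LDEF-calibrated mechanistic values.

```python
# Toy 2-D Monte Carlo sketch of atomic-oxygen erosion through a coating defect.
import numpy as np

def erode(width=60, depth=40, n_atoms=20000, p_react=0.1, seed=1):
    rng = np.random.default_rng(seed)
    polymer = np.ones((depth, width), dtype=bool)   # True = un-eroded cell
    pinhole = width // 2                            # single defect column
    steps = ((-1, 0), (1, 0), (0, 1), (0, -1))      # (dx, dy) moves
    for _ in range(n_atoms):
        x, y = pinhole, 0                           # atom enters at the defect
        while 0 <= x < width and 0 <= y < depth:
            if polymer[y, x] and rng.random() < p_react:
                polymer[y, x] = False               # cell oxidized away
                break                               # atom consumed
            dx, dy = steps[rng.integers(4)]         # scatter to a neighbour
            x, y = x + dx, y + dy                   # leaving the grid = escape
    return polymer                                  # eroded cavity under the defect
```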

  12. Fast, scalable prediction of deleterious noncoding variants from functional and population genomic data.

    PubMed

    Huang, Yi-Fei; Gulko, Brad; Siepel, Adam

    2017-04-01

    Many genetic variants that influence phenotypes of interest are located outside of protein-coding genes, yet existing methods for identifying such variants have poor predictive power. Here we introduce a new computational method, called LINSIGHT, that substantially improves the prediction of noncoding nucleotide sites at which mutations are likely to have deleterious fitness consequences, and which, therefore, are likely to be phenotypically important. LINSIGHT combines a generalized linear model for functional genomic data with a probabilistic model of molecular evolution. The method is fast and highly scalable, enabling it to exploit the 'big data' available in modern genomics. We show that LINSIGHT outperforms the best available methods in identifying human noncoding variants associated with inherited diseases. In addition, we apply LINSIGHT to an atlas of human enhancers and show that the fitness consequences at enhancers depend on cell type, tissue specificity, and constraints at associated promoters.

  13. Predictability of Extreme Climate Events via a Complex Network Approach

    NASA Astrophysics Data System (ADS)

    Muhkin, D.; Kurths, J.

    2017-12-01

    We analyse climate dynamics from a complex network approach. This leads to an inverse problem: Is there a backbone-like structure underlying the climate system? For this we propose a method to reconstruct and analyze a complex network from data generated by a spatio-temporal dynamical system. This approach enables us to uncover relations to global circulation patterns in oceans and atmosphere. This concept is then applied to Monsoon data; in particular, we develop a general framework to predict extreme events by combining a non-linear synchronization technique with complex networks. Applying this method, we uncover a new mechanism of extreme floods in the eastern Central Andes which could be used for operational forecasts. Moreover, we analyze the Indian Summer Monsoon (ISM) and identify two regions of high importance. By estimating an underlying critical point, this leads to an improved prediction of the onset of the ISM; this scheme was successful in 2016 and 2017.
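
    The synchronization technique referred to above is typically an event-based measure. A simplified sketch of an event-synchronization count between two event-time series (with a dynamic coincidence window, in the spirit of Quiroga-style measures) follows; the window rule and cutoff are simplified assumptions, not the authors' exact formulation.

```python
# Sketch: count events in series 1 that closely follow events in series 2.
import numpy as np

def event_sync(t1, t2, tau_max=3.0):
    """t1, t2: sorted arrays of event times; returns a synchronization count."""
    count = 0
    for i, a in enumerate(t1):
        for j, b in enumerate(t2):
            # Dynamic coincidence window: half the smallest neighbouring gap
            gaps = [a - t1[i - 1] if i > 0 else np.inf,
                    t1[i + 1] - a if i + 1 < len(t1) else np.inf,
                    b - t2[j - 1] if j > 0 else np.inf,
                    t2[j + 1] - b if j + 1 < len(t2) else np.inf]
            tau = min(min(gaps) / 2.0, tau_max)
            if 0 < a - b <= tau:        # event in t2 shortly precedes event in t1
                count += 1
    return count
```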

  14. Computational analysis of conserved RNA secondary structure in transcriptomes and genomes.

    PubMed

    Eddy, Sean R

    2014-01-01

    Transcriptomics experiments and computational predictions both enable systematic discovery of new functional RNAs. However, many putative noncoding transcripts arise instead from artifacts and biological noise, and current computational prediction methods have high false positive rates. I discuss prospects for improving computational methods for analyzing and identifying functional RNAs, with a focus on detecting signatures of conserved RNA secondary structure. An interesting new front is the application of chemical and enzymatic experiments that probe RNA structure on a transcriptome-wide scale. I review several proposed approaches for incorporating structure probing data into the computational prediction of RNA secondary structure. Using probabilistic inference formalisms, I show how all these approaches can be unified in a well-principled framework, which in turn allows RNA probing data to be easily integrated into a wide range of analyses that depend on RNA secondary structure inference. Such analyses include homology search and genome-wide detection of new structural RNAs.

  15. Improved evidence-based genome-scale metabolic models for maize leaf, embryo, and endosperm

    PubMed Central

    Seaver, Samuel M. D.; Bradbury, Louis M. T.; Frelin, Océane; Zarecki, Raphy; Ruppin, Eytan; Hanson, Andrew D.; Henry, Christopher S.

    2015-01-01

    There is a growing demand for genome-scale metabolic reconstructions for plants, fueled by the need to understand the metabolic basis of crop yield and by progress in genome and transcriptome sequencing. Methods are also required to enable the interpretation of plant transcriptome data to study how cellular metabolic activity varies under different growth conditions or even within different organs, tissues, and developmental stages. Such methods depend extensively on the accuracy with which genes have been mapped to the biochemical reactions in the plant metabolic pathways. Errors in these mappings lead to metabolic reconstructions with an inflated number of reactions and possible generation of unreliable metabolic phenotype predictions. Here we introduce a new evidence-based genome-scale metabolic reconstruction of maize, with significant improvements in the quality of the gene-reaction associations included within our model. We also present a new approach for applying our model to predict active metabolic genes based on transcriptome data. This method includes a minimal set of reactions associated with low-expression genes to enable activity of a maximum number of reactions associated with high-expression genes. We apply this method to construct an organ-specific model for the maize leaf, and tissue-specific models for maize embryo and endosperm cells. We validate our models using fluxomics data for the endosperm and embryo, demonstrating an improved capacity of our models to fit the available fluxomics data. All models are publicly available via the DOE Systems Biology Knowledgebase and PlantSEED, and our new method is generally applicable for the analysis of transcript profiles from any plant, paving the way for further in silico studies with a wide variety of plant genomes. PMID:25806041

  16. Improved evidence-based genome-scale metabolic models for maize leaf, embryo, and endosperm

    DOE PAGES

    Seaver, Samuel M.D.; Bradbury, Louis M.T.; Frelin, Océane; ...

    2015-03-10

    There is a growing demand for genome-scale metabolic reconstructions for plants, fueled by the need to understand the metabolic basis of crop yield and by progress in genome and transcriptome sequencing. Methods are also required to enable the interpretation of plant transcriptome data to study how cellular metabolic activity varies under different growth conditions or even within different organs, tissues, and developmental stages. Such methods depend extensively on the accuracy with which genes have been mapped to the biochemical reactions in the plant metabolic pathways. Errors in these mappings lead to metabolic reconstructions with an inflated number of reactions and possible generation of unreliable metabolic phenotype predictions. Here we introduce a new evidence-based genome-scale metabolic reconstruction of maize, with significant improvements in the quality of the gene-reaction associations included within our model. We also present a new approach for applying our model to predict active metabolic genes based on transcriptome data. This method includes a minimal set of reactions associated with low-expression genes to enable activity of a maximum number of reactions associated with high-expression genes. We apply this method to construct an organ-specific model for the maize leaf, and tissue-specific models for maize embryo and endosperm cells. We validate our models using fluxomics data for the endosperm and embryo, demonstrating an improved capacity of our models to fit the available fluxomics data. All models are publicly available via the DOE Systems Biology Knowledgebase and PlantSEED, and our new method is generally applicable for the analysis of transcript profiles from any plant, paving the way for further in silico studies with a wide variety of plant genomes.

  17. Evaluation of Penalized and Nonpenalized Methods for Disease Prediction with Large-Scale Genetic Data.

    PubMed

    Won, Sungho; Choi, Hosik; Park, Suyeon; Lee, Juyoung; Park, Changyi; Kwon, Sunghoon

    2015-01-01

    Owing to recent improvements in genotyping technology, large-scale genetic data can be utilized to identify disease susceptibility loci, and these successes have substantially improved our understanding of complex diseases. In spite of these successes, however, most of the genetic effects for many complex diseases were found to be very small, which has been a major hurdle in building disease prediction models. Recently, many statistical methods based on penalized regressions have been proposed to tackle the so-called "large P and small N" problem. Penalized regressions, including the least absolute selection and shrinkage operator (LASSO) and ridge regression, limit the parameter space, and this constraint enables the estimation of effects for a very large number of SNPs. Various extensions have been suggested, and, in this report, we compare their accuracy by applying them to several complex diseases. Our results show that penalized regressions are usually robust and provide better accuracy than the existing methods for at least the diseases under consideration.
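
    The sketch below contrasts L1 (LASSO-type) and L2 (ridge-type) penalized logistic regression in a simulated "large P, small N" genotype setting; the simulation, penalty strength, and solver choices are illustrative assumptions, not the paper's evaluated methods.

```python
# Minimal sketch: penalized logistic regression on simulated genotypes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n, p = 300, 5000                                  # small N, large P
X = rng.integers(0, 3, size=(n, p)).astype(float) # 0/1/2 genotype coding
beta = np.zeros(p); beta[:20] = 0.3               # 20 true risk SNPs (assumed)
y = (X @ beta + rng.normal(size=n) > beta.sum()).astype(int)

for penalty, solver in [("l1", "liblinear"), ("l2", "lbfgs")]:
    clf = LogisticRegression(penalty=penalty, C=0.1, solver=solver,
                             max_iter=1000).fit(X, y)
    print(penalty, "train accuracy:", clf.score(X, y))
```

    The penalty constrains the coefficient space so that far more parameters than samples can still be estimated stably, which is exactly the property the abstract highlights.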

  18. Risk Assessment and Risk Minimization in Nanomedicine: A Need for Predictive, Alternative, and 3Rs Strategies.

    PubMed

    Accomasso, Lisa; Cristallini, Caterina; Giachino, Claudia

    2018-01-01

    The use of nanomaterials in medicine has grown very rapidly, leading to concern about possible health risks. Surely, the application of nanotechnology in medicine has many significant potentialities, as it can improve human health in at least three different ways: by contributing to early disease diagnosis, improving treatment outcomes, and containing health care costs. However, toxicology or safety assessment is an integral part of any new medical technology, and the nanotechnologies are no exception. The principal aim of nanosafety studies in this frame is to enable the safer design of nanomedicines. The most urgent need is finding and validating novel approaches able to extrapolate acute in vitro results for the prediction of chronic in vivo effects, and to this purpose a few European initiatives have been launched. While a "safe-by-design" process may be considered utopian, "safer-by-design" is probably a reachable goal in the field of nanomedicine.

  19. Enabling comparative modeling of closely related genomes: Example genus Brucella

    DOE PAGES

    Faria, José P.; Edirisinghe, Janaka N.; Davis, James J.; ...

    2014-03-08

    For many scientific applications, it is highly desirable to be able to compare metabolic models of closely related genomes. In this study, we attempt to raise awareness of the fact that taking annotated genomes from public repositories and using them for metabolic model reconstructions is far from trivial due to annotation inconsistencies. We propose a protocol for comparative analysis of metabolic models on closely related genomes, using fifteen strains of the genus Brucella, which contains pathogens of both humans and livestock. This study led to the identification and subsequent correction of inconsistent annotations in the SEED database, as well as the identification of 31 biochemical reactions that are common to Brucella but are not originally identified by automated metabolic reconstructions. We are currently implementing this protocol for improving automated annotations within the SEED database, and these improvements have been propagated into PATRIC, Model-SEED, KBase and RAST. This method is an enabling step for the future creation of consistent annotation systems and high-quality model reconstructions that will support the prediction of accurate phenotypes such as pathogenicity, media requirements or type of respiration.

  20. Enabling comparative modeling of closely related genomes: Example genus Brucella

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faria, José P.; Edirisinghe, Janaka N.; Davis, James J.

    For many scientific applications, it is highly desirable to be able to compare metabolic models of closely related genomes. In this study, we attempt to raise awareness of the fact that taking annotated genomes from public repositories and using them for metabolic model reconstructions is far from trivial due to annotation inconsistencies. We propose a protocol for comparative analysis of metabolic models on closely related genomes, using fifteen strains of the genus Brucella, which contains pathogens of both humans and livestock. This study led to the identification and subsequent correction of inconsistent annotations in the SEED database, as well as the identification of 31 biochemical reactions that are common to Brucella but are not originally identified by automated metabolic reconstructions. We are currently implementing this protocol for improving automated annotations within the SEED database, and these improvements have been propagated into PATRIC, Model-SEED, KBase and RAST. This method is an enabling step for the future creation of consistent annotation systems and high-quality model reconstructions that will support the prediction of accurate phenotypes such as pathogenicity, media requirements or type of respiration.

  1. Machine Learning and Neurosurgical Outcome Prediction: A Systematic Review.

    PubMed

    Senders, Joeky T; Staples, Patrick C; Karhade, Aditya V; Zaki, Mark M; Gormley, William B; Broekman, Marike L D; Smith, Timothy R; Arnaout, Omar

    2018-01-01

    Accurate measurement of surgical outcomes is highly desirable to optimize surgical decision-making. An important element of surgical decision making is identification of the patient cohort that will benefit from surgery before the intervention. Machine learning (ML) enables computers to learn from previous data to make accurate predictions on new data. In this systematic review, we evaluate the potential of ML for neurosurgical outcome prediction. A systematic search in the PubMed and Embase databases was performed to identify all potential relevant studies up to January 1, 2017. Thirty studies were identified that evaluated ML algorithms used as prediction models for survival, recurrence, symptom improvement, and adverse events in patients undergoing surgery for epilepsy, brain tumor, spinal lesions, neurovascular disease, movement disorders, traumatic brain injury, and hydrocephalus. Depending on the specific prediction task evaluated and the type of input features included, ML models predicted outcomes after neurosurgery with a median accuracy and area under the receiver operating curve of 94.5% and 0.83, respectively. Compared with logistic regression, ML models performed significantly better and showed a median absolute improvement in accuracy and area under the receiver operating curve of 15% and 0.06, respectively. Some studies also demonstrated a better performance in ML models compared with established prognostic indices and clinical experts. In the research setting, ML has been studied extensively, demonstrating an excellent performance in outcome prediction for a wide range of neurosurgical conditions. However, future studies should investigate how ML can be implemented as a practical tool supporting neurosurgical care. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Subtropical high predictability establishes a promising way for monsoon and tropical storm predictions.

    PubMed

    Wang, Bin; Xiang, Baoqiang; Lee, June-Yi

    2013-02-19

    Monsoon rainfall and tropical storms (TSs) impose great impacts on society, yet their seasonal predictions are far from successful. The western Pacific Subtropical High (WPSH) is a prime circulation system affecting East Asian summer monsoon (EASM) and western North Pacific TS activities, but the sources of its variability and predictability have not been established. Here we show that the WPSH variation faithfully represents fluctuations of EASM strength (r = -0.92), the total TS days over the subtropical western North Pacific (r = -0.81), and the total number of TSs impacting East Asian coasts (r = -0.76) during 1979-2009. Our numerical experiment results establish that the WPSH variation is primarily controlled by central Pacific cooling/warming and a positive atmosphere-ocean feedback between the WPSH and the Indo-Pacific warm pool oceans. With a physically based empirical model and state-of-the-art dynamical models, we demonstrate that the WPSH is highly predictable; this predictability creates a promising way to predict monsoon and TS activity. Predictions using the WPSH predictability not only yield substantially improved skill in predicting EASM rainfall, but also enable skillful prediction of the TS activities that current dynamical models fail to capture. Our findings reveal that positive WPSH-ocean interaction can provide a source of climate predictability and highlight the importance of subtropical dynamics in understanding monsoon and TS predictability.

  3. Subtropical High predictability establishes a promising way for monsoon and tropical storm predictions

    PubMed Central

    Wang, Bin; Xiang, Baoqiang; Lee, June-Yi

    2013-01-01

    Monsoon rainfall and tropical storms (TSs) impose great impacts on society, yet their seasonal predictions are far from successful. The western Pacific Subtropical High (WPSH) is a prime circulation system affecting East Asian summer monsoon (EASM) and western North Pacific TS activities, but the sources of its variability and predictability have not been established. Here we show that the WPSH variation faithfully represents fluctuations of EASM strength (r = –0.92), the total TS days over the subtropical western North Pacific (r = –0.81), and the total number of TSs impacting East Asian coasts (r = –0.76) during 1979–2009. Our numerical experiment results establish that the WPSH variation is primarily controlled by central Pacific cooling/warming and a positive atmosphere-ocean feedback between the WPSH and the Indo-Pacific warm pool oceans. With a physically based empirical model and state-of-the-art dynamical models, we demonstrate that the WPSH is highly predictable; this predictability creates a promising way to predict monsoon and TS activity. Predictions using the WPSH predictability not only yield substantially improved skill in predicting EASM rainfall, but also enable skillful prediction of the TS activities that current dynamical models fail to capture. Our findings reveal that positive WPSH–ocean interaction can provide a source of climate predictability and highlight the importance of subtropical dynamics in understanding monsoon and TS predictability. PMID:23341624

  4. Graphics Processing Unit (GPU) Acceleration of the Goddard Earth Observing System Atmospheric Model

    NASA Technical Reports Server (NTRS)

    Putnam, Williama

    2011-01-01

    The Goddard Earth Observing System 5 (GEOS-5) is the atmospheric model used by the Global Modeling and Assimilation Office (GMAO) for a variety of applications, from long-term climate prediction at relatively coarse resolution, to data assimilation and numerical weather prediction, to very high-resolution cloud-resolving simulations. GEOS-5 is being ported to a graphics processing unit (GPU) cluster at the NASA Center for Climate Simulation (NCCS). By utilizing GPU co-processor technology, we expect to increase the throughput of GEOS-5 by at least an order of magnitude and accelerate the process of scientific exploration across all scales of global modeling, including: the large-scale, high-end application of non-hydrostatic, global, cloud-resolving modeling at 10- to 1-kilometer (km) global resolutions; intermediate-resolution seasonal climate and weather prediction at 50 to 25 km on small clusters of GPUs; and long-range, coarse-resolution climate modeling, enabled on a small box of GPUs for the individual researcher. After being ported to the GPU cluster, the primary physics components and the dynamical core of GEOS-5 have demonstrated a potential speedup of 15-40 times over conventional processor cores. Performance improvements of this magnitude reduce the required scalability of 1-km, global, cloud-resolving models from an unfathomable 6 million cores to an attainable 200,000 GPU-enabled cores.

  5. Efficient Prediction Structures for H.264 Multi View Coding Using Temporal Scalability

    NASA Astrophysics Data System (ADS)

    Guruvareddiar, Palanivel; Joseph, Biju K.

    2014-03-01

    Prediction structures with "disposable view components based" hierarchical coding have been proven to be efficient for H.264 multi-view coding. Though these prediction structures, along with QP cascading schemes, provide superior compression efficiency when compared to the traditional IBBP coding scheme, the temporal scalability requirements of the bit stream cannot be met to the fullest. On the other hand, a fully scalable bit stream, obtained by "temporal identifier based" hierarchical coding, provides a number of advantages including bit rate adaptation and improved error resilience, but lacks compression efficiency when compared to the former scheme. In this paper it is proposed to combine the two approaches such that a fully scalable bit stream can be realized with minimal reduction in compression efficiency compared to state-of-the-art "disposable view components based" hierarchical coding. Simulation results show that the proposed method enables full temporal scalability with a maximum BDPSNR reduction of only 0.34 dB. A novel method has also been proposed for the identification of the temporal identifier for legacy H.264/AVC base layer packets. Simulation results also show that this enables a scenario where the enhancement views can be extracted at a lower frame rate (1/2nd or 1/4th of the base view), with an average extraction time per view component of only 0.38 ms.
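
    The temporal-identifier idea rests on a dyadic layer assignment within each group of pictures. A minimal sketch follows (GOP size 8 is an assumed example, not the paper's configuration): dropping all packets whose id exceeds a chosen value halves the frame rate per layer removed.

```python
# Sketch: dyadic temporal-layer (temporal_id) assignment for a hierarchical GOP.
def temporal_id(frame_idx, gop_size=8):
    if frame_idx % gop_size == 0:
        return 0                       # key pictures form the base layer
    tid, step = 1, gop_size // 2
    while frame_idx % step:            # descend until the frame aligns
        tid += 1
        step //= 2
    return tid

print([temporal_id(i) for i in range(9)])  # [0, 3, 2, 3, 1, 3, 2, 3, 0]
```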

  6. Real-time distortion correction of spiral and echo planar images using the gradient system impulse response function.

    PubMed

    Campbell-Washburn, Adrienne E; Xue, Hui; Lederman, Robert J; Faranesh, Anthony Z; Hansen, Michael S

    2016-06-01

    MRI-guided interventions demand high frame rate imaging, making fast imaging techniques such as spiral imaging and echo planar imaging (EPI) appealing. In this study, we implemented a real-time distortion correction framework to enable the use of these fast acquisitions for interventional MRI. Distortions caused by gradient waveform inaccuracies were corrected using the gradient impulse response function (GIRF), which was measured by standard equipment and saved as a calibration file on the host computer. This file was used at runtime to calculate the predicted k-space trajectories for image reconstruction. Additionally, the off-resonance reconstruction frequency was modified in real time to interactively deblur spiral images. Real-time distortion correction for arbitrary image orientations was achieved in phantoms and healthy human volunteers. The GIRF-predicted k-space trajectories matched measured k-space trajectories closely for spiral imaging. Spiral and EPI image distortion was visibly improved using the GIRF-predicted trajectories. The GIRF calibration file showed no systematic drift in 4 months and was demonstrated to correct distortions after 30 min of continuous scanning despite gradient heating. Interactive off-resonance reconstruction was used to sharpen anatomical boundaries during continuous imaging. This real-time distortion correction framework will enable the use of these high frame rate imaging methods for MRI-guided interventions. Magn Reson Med 75:2278-2285, 2016. © 2015 Wiley Periodicals, Inc.
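
    The trajectory-prediction step itself is a small computation: convolve the nominal gradient waveform with the measured impulse response and integrate. The sketch below assumes a per-axis time-domain GIRF sampled at the gradient dwell time; the units and normalization are illustrative, not the framework's exact calibration format.

```python
# Sketch: predict the actual k-space trajectory from a nominal gradient
# waveform and a (assumed time-domain, per-axis) gradient impulse response.
import numpy as np

GAMMA = 42.577478e6  # 1H gyromagnetic ratio, Hz/T

def predict_kspace(g_nominal, girf, dt):
    """g_nominal: gradient waveform (T/m); girf: sampled impulse response
    (1/s, so that convolution * dt is dimensionless); dt: dwell time (s)."""
    g_actual = np.convolve(g_nominal, girf)[:len(g_nominal)] * dt
    return GAMMA * np.cumsum(g_actual) * dt       # k(t) in cycles/m
```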

  7. Real-time distortion correction of spiral and echo planar images using the gradient system impulse response function

    PubMed Central

    Campbell-Washburn, Adrienne E; Xue, Hui; Lederman, Robert J; Faranesh, Anthony Z; Hansen, Michael S

    2015-01-01

    Purpose MRI-guided interventions demand high frame-rate imaging, making fast imaging techniques such as spiral imaging and echo planar imaging (EPI) appealing. In this study, we implemented a real-time distortion correction framework to enable the use of these fast acquisitions for interventional MRI. Methods Distortions caused by gradient waveform inaccuracies were corrected using the gradient impulse response function (GIRF), which was measured by standard equipment and saved as a calibration file on the host computer. This file was used at runtime to calculate the predicted k-space trajectories for image reconstruction. Additionally, the off-resonance reconstruction frequency was modified in real-time to interactively de-blur spiral images. Results Real-time distortion correction for arbitrary image orientations was achieved in phantoms and healthy human volunteers. The GIRF predicted k-space trajectories matched measured k-space trajectories closely for spiral imaging. Spiral and EPI image distortion was visibly improved using the GIRF predicted trajectories. The GIRF calibration file showed no systematic drift in 4 months and was demonstrated to correct distortions after 30 minutes of continuous scanning despite gradient heating. Interactive off-resonance reconstruction was used to sharpen anatomical boundaries during continuous imaging. Conclusions This real-time distortion correction framework will enable the use of these high frame-rate imaging methods for MRI-guided interventions. PMID:26114951

  8. TACD: a transportable ant colony discrimination model for corporate bankruptcy prediction

    NASA Astrophysics Data System (ADS)

    Lalbakhsh, Pooia; Chen, Yi-Ping Phoebe

    2017-05-01

    This paper presents a transportable ant colony discrimination strategy (TACD) to predict corporate bankruptcy, a topic of vital importance that is attracting increasing interest in the field of economics. The proposed algorithm uses financial ratios to build a binary prediction model for companies with the two statuses of bankrupt and non-bankrupt. The algorithm takes advantage of an improved version of continuous ant colony optimisation (CACO) at the core, which is used to create an accurate, simple and understandable linear model for discrimination. This also enables the algorithm to work with continuous values, leading to more efficient learning and adaptation by avoiding data discretisation. We conduct a comprehensive performance evaluation on three real-world data sets under a stratified cross-validation strategy. In three different scenarios, TACD is compared with 11 other bankruptcy prediction strategies. We also discuss the efficiency of the attribute selection methods used in the experiments. In addition to its simplicity and understandability, statistical significance tests confirm the efficiency of TACD against the other prediction algorithms in both measures of AUC and accuracy.
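
    A heavily simplified sketch of the core idea, an archive-based continuous ACO searching for linear discriminant weights, is given below; the archive size, sampling rule, and accuracy-based scoring are assumptions for illustration and omit the paper's improvements to CACO.

```python
# Simplified archive-based continuous ACO for linear discriminant weights.
import numpy as np

def caco_discriminant(X, y, iters=200, archive=10, ants=20, seed=0):
    rng = np.random.default_rng(seed)
    Xb = np.column_stack([X, np.ones(len(X))])     # features + bias term
    acc = lambda w: ((Xb @ w > 0).astype(int) == y).mean()
    pop = rng.normal(size=(archive, Xb.shape[1])) # initial solution archive
    scores = np.array([acc(w) for w in pop])
    for _ in range(iters):
        sigma = pop.std(axis=0) + 1e-3             # spread guides sampling
        # Each "ant" samples a new weight vector around an archive member
        trials = pop[rng.integers(archive, size=ants)] \
                 + rng.normal(size=(ants, Xb.shape[1])) * sigma
        t_scores = np.array([acc(w) for w in trials])
        all_w = np.vstack([pop, trials])
        all_s = np.concatenate([scores, t_scores])
        keep = np.argsort(all_s)[::-1][:archive]   # keep the best solutions
        pop, scores = all_w[keep], all_s[keep]
    return pop[0], scores[0]                       # best weights, train accuracy
```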

  9. A collaborative environment for developing and validating predictive tools for protein biophysical characteristics

    NASA Astrophysics Data System (ADS)

    Johnston, Michael A.; Farrell, Damien; Nielsen, Jens Erik

    2012-04-01

    The exchange of information between experimentalists and theoreticians is crucial to improving the predictive ability of theoretical methods and hence our understanding of the related biology. However many barriers exist which prevent the flow of information between the two disciplines. Enabling effective collaboration requires that experimentalists can easily apply computational tools to their data, share their data with theoreticians, and that both the experimental data and computational results are accessible to the wider community. We present a prototype collaborative environment for developing and validating predictive tools for protein biophysical characteristics. The environment is built on two central components; a new python-based integration module which allows theoreticians to provide and manage remote access to their programs; and PEATDB, a program for storing and sharing experimental data from protein biophysical characterisation studies. We demonstrate our approach by integrating PEATSA, a web-based service for predicting changes in protein biophysical characteristics, into PEATDB. Furthermore, we illustrate how the resulting environment aids method development using the Potapov dataset of experimentally measured ΔΔGfold values, previously employed to validate and train protein stability prediction algorithms.

  10. Crowdsourced assessment of common genetic contribution to predicting anti-TNF treatment response in rheumatoid arthritis

    PubMed Central

    Sieberts, Solveig K.; Zhu, Fan; García-García, Javier; Stahl, Eli; Pratap, Abhishek; Pandey, Gaurav; Pappas, Dimitrios; Aguilar, Daniel; Anton, Bernat; Bonet, Jaume; Eksi, Ridvan; Fornés, Oriol; Guney, Emre; Li, Hongdong; Marín, Manuel Alejandro; Panwar, Bharat; Planas-Iglesias, Joan; Poglayen, Daniel; Cui, Jing; Falcao, Andre O.; Suver, Christine; Hoff, Bruce; Balagurusamy, Venkat S. K.; Dillenberger, Donna; Neto, Elias Chaibub; Norman, Thea; Aittokallio, Tero; Ammad-ud-din, Muhammad; Azencott, Chloe-Agathe; Bellón, Víctor; Boeva, Valentina; Bunte, Kerstin; Chheda, Himanshu; Cheng, Lu; Corander, Jukka; Dumontier, Michel; Goldenberg, Anna; Gopalacharyulu, Peddinti; Hajiloo, Mohsen; Hidru, Daniel; Jaiswal, Alok; Kaski, Samuel; Khalfaoui, Beyrem; Khan, Suleiman Ali; Kramer, Eric R.; Marttinen, Pekka; Mezlini, Aziz M.; Molparia, Bhuvan; Pirinen, Matti; Saarela, Janna; Samwald, Matthias; Stoven, Véronique; Tang, Hao; Tang, Jing; Torkamani, Ali; Vert, Jean-Phillipe; Wang, Bo; Wang, Tao; Wennerberg, Krister; Wineinger, Nathan E.; Xiao, Guanghua; Xie, Yang; Yeung, Rae; Zhan, Xiaowei; Zhao, Cheng; Calaza, Manuel; Elmarakeby, Haitham; Heath, Lenwood S.; Long, Quan; Moore, Jonathan D.; Opiyo, Stephen Obol; Savage, Richard S.; Zhu, Jun; Greenberg, Jeff; Kremer, Joel; Michaud, Kaleb; Barton, Anne; Coenen, Marieke; Mariette, Xavier; Miceli, Corinne; Shadick, Nancy; Weinblatt, Michael; de Vries, Niek; Tak, Paul P.; Gerlag, Danielle; Huizinga, Tom W. J.; Kurreeman, Fina; Allaart, Cornelia F.; Louis Bridges Jr., S.; Criswell, Lindsey; Moreland, Larry; Klareskog, Lars; Saevarsdottir, Saedis; Padyukov, Leonid; Gregersen, Peter K.; Friend, Stephen; Plenge, Robert; Stolovitzky, Gustavo; Oliva, Baldo; Guan, Yuanfang; Mangravite, Lara M.

    2016-01-01

    Rheumatoid arthritis (RA) affects millions world-wide. While anti-TNF treatment is widely used to reduce disease progression, treatment fails in ∼one-third of patients. No biomarker currently exists that identifies non-responders before treatment. A rigorous community-based assessment of the utility of SNP data for predicting anti-TNF treatment efficacy in RA patients was performed in the context of a DREAM Challenge (http://www.synapse.org/RA_Challenge). An open challenge framework enabled the comparative evaluation of predictions developed by 73 research groups using the most comprehensive available data and covering a wide range of state-of-the-art modelling methodologies. Despite a significant genetic heritability estimate of treatment non-response trait (h2=0.18, P value=0.02), no significant genetic contribution to prediction accuracy is observed. Results formally confirm the expectations of the rheumatology community that SNP information does not significantly improve predictive performance relative to standard clinical traits, thereby justifying a refocusing of future efforts on collection of other data. PMID:27549343

  11. Predicting Space Weather: Challenges for Research and Operations

    NASA Astrophysics Data System (ADS)

    Singer, H. J.; Onsager, T. G.; Rutledge, R.; Viereck, R. A.; Kunches, J.

    2013-12-01

    Society's growing dependence on technologies and infrastructure susceptible to the consequences of space weather has given rise to increased attention at the highest levels of government as well as inspired the need for both research and improved space weather services. In part, for these reasons, the number one goal of the recent National Research Council report on a Decadal Strategy for Solar and Space Physics is to 'Determine the origins of the Sun's activity and predict the variations in the space environment.' Prediction of conditions in our space environment is clearly a challenge for both research and operations, and we require the near-term development and validation of models that have sufficient accuracy and lead time to be useful to those impacted by space weather. In this presentation, we will provide new scientific results of space weather conditions that have challenged space weather forecasters, and identify specific areas of research that can lead to improved capabilities. In addition, we will examine examples of customer impacts and requirements as well as the challenges to the operations community to establish metrics that enable the selection and transition of models and observations that can provide the greatest economic and societal benefit.

  12. NASA Contributions to Improve Understanding of Extreme Events in the Global Energy and Water Cycle

    NASA Technical Reports Server (NTRS)

    Lapenta, William M.

    2008-01-01

    The U.S. Climate Change Science Program (CCSP) has established the water cycle goals of the Nation's climate change program. Accomplishing these goals will require, in part, an accurate accounting of the key reservoirs and fluxes associated with the global water and energy cycle, including their spatial and temporal variability, through integration of all necessary observations and research tools. To this end, in conjunction with NASA's Earth science research strategy, the overarching long-term NASA Energy and Water Cycle Study (NEWS) grand challenge can be summarized as documenting and enabling improved, observationally based predictions of the water and energy cycle consequences of Earth system variability and change. This challenge requires documenting and predicting trends in the rate of the Earth's water and energy cycling that correspond to climate change, as well as changes in the frequency and intensity of related naturally occurring meteorological and hydrologic events, which may vary as the climate changes in the future. The cycling of water and energy has obvious and significant implications for the health and prosperity of our society. Documenting and predicting water and energy cycle variations and extremes is necessary to realize this benefit to society.

  13. Next Generation Community Based Unified Global Modeling System Development and Operational Implementation Strategies at NCEP

    NASA Astrophysics Data System (ADS)

    Tallapragada, V.

    2017-12-01

    NOAA's Next Generation Global Prediction System (NGGPS) has provided a unique opportunity to develop and implement a non-hydrostatic global model based on the Geophysical Fluid Dynamics Laboratory (GFDL) Finite Volume Cubed-Sphere (FV3) dynamic core at the National Centers for Environmental Prediction (NCEP), representing a major advance in seamless prediction capabilities across all spatial and temporal scales. Model development efforts are centralized, with unified model development in the NOAA Environmental Modeling System (NEMS) infrastructure based on the Earth System Modeling Framework (ESMF). More sophisticated coupling among the various earth system components is being enabled within NEMS following National Unified Operational Prediction Capability (NUOPC) standards. The eventual unification of the global and regional models will enable operational global models to run at convection-resolving scales. Apart from the advanced non-hydrostatic dynamic core and coupling to various earth system components, advanced physics and data assimilation techniques are essential for improved forecast skill. NGGPS is spearheading ambitious physics and data assimilation strategies, concentrating on the creation of a Common Community Physics Package (CCPP) and the Joint Effort for Data Assimilation Integration (JEDI). Both initiatives are expected to be community developed, with emphasis on research transitioning to operations (R2O). The unified modeling system is being built to support the needs of both operations and research. Different layers of community partners have also been established, with specific roles and responsibilities for researchers, core development partners, trusted super-users, and operations. Stakeholders are engaged at all stages to help drive the direction of development, resource allocation, and prioritization. This talk presents the current and future plans for unified model development at NCEP for weather, sub-seasonal, and seasonal climate prediction applications, with special emphasis on the implementation of the NCEP FV3 Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) into operations by 2019.

  14. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    PubMed

    Davidich, Maria; Köster, Gerta

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim to construct such a realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations.
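
    To make the calibration targets concrete, a toy cellular automaton of the benchmark type is sketched below: pedestrians greedily descend a precomputed distance field ("floor field") toward a target. The geometry, source-target layout, and update rule are all invented for illustration and are much simpler than the calibrated model in the study.

    ```python
    # Toy floor-field cellular automaton: pedestrians step toward a target.
    import numpy as np

    W, H = 20, 10
    target = (19, 5)
    # Distance-to-target field; pedestrians move downhill on this field.
    dist = np.fromfunction(
        lambda x, y: np.hypot(x - target[0], y - target[1]), (W, H))

    peds = [(0, y) for y in range(2, 8)]        # source: left wall (hypothetical)
    occupied = set(peds)

    for step in range(40):                      # one step ~ one cell at free-flow speed
        moved = []
        for (x, y) in peds:
            nbrs = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                    if 0 <= x + dx < W and 0 <= y + dy < H]
            free = [c for c in nbrs if c not in occupied or c == (x, y)]
            nxt = min(free, key=lambda c: dist[c])   # greedy descent, stay if blocked
            occupied.discard((x, y)); occupied.add(nxt)
            moved.append(nxt)
        peds = moved
    print("final positions:", peds)
    ```

    Calibration in the paper's sense would mean tuning the source-target distribution, the appearance schedule, and the step rate (free-flow velocity) until simulated density evolution matches the measured data.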

  15. Identification and characterization of the furfural and 5-(hydroxymethyl)furfural degradation pathways of Cupriavidus basilensis HMF14

    PubMed Central

    Koopman, Frank; Wierckx, Nick; de Winde, Johannes H.; Ruijssenaars, Harald J.

    2010-01-01

    The toxic fermentation inhibitors in lignocellulosic hydrolysates pose significant problems for the production of second-generation biofuels and biochemicals. Among these inhibitors, 5-(hydroxymethyl)furfural (HMF) and furfural are specifically notorious. In this study, we describe the complete molecular identification and characterization of the pathway by which Cupriavidus basilensis HMF14 metabolizes HMF and furfural. The identification of this pathway enabled the construction of an HMF- and furfural-metabolizing Pseudomonas putida. The genetic information obtained furthermore enabled us to predict the HMF- and furfural-degrading capabilities of sequenced bacterial species that had not previously been connected to furanic aldehyde metabolism. These results pave the way for in situ detoxification of lignocellulosic hydrolysates, which is a major step toward improved efficiency of utilization of lignocellulosic feedstock. PMID:20194784

  16. Microstructural Modeling of Brittle Materials for Enhanced Performance and Reliability.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teague, Melissa Christine; Rodgers, Theron

    Brittle failure is often influenced by variable microstructure-scale stresses that are difficult to measure. Recent advances in photoluminescence spectroscopy (PLS), including improved confocal laser measurement and rapid spectroscopic data collection, have established the potential to map stresses with microscale spatial resolution (<2 microns). Advanced PLS was successfully used to investigate both residual and externally applied stresses in polycrystalline alumina at the microstructure scale. The measured average stresses matched those estimated from beam theory to within one standard deviation, validating the technique. Modeling the residual stresses within the microstructure produced general agreement with the experimentally measured results. Microstructure-scale modeling is primed to take advantage of advanced PLS to enable its refinement and validation, eventually enabling microstructure modeling to become a predictive tool for brittle materials.

  17. Improving Permafrost Hydrology Prediction Through Data-Model Integration

    NASA Astrophysics Data System (ADS)

    Wilson, C. J.; Andresen, C. G.; Atchley, A. L.; Bolton, W. R.; Busey, R.; Coon, E.; Charsley-Groffman, L.

    2017-12-01

    The CMIP5 Earth System Models were unable to adequately predict the fate of the 16 GT of permafrost carbon in a warming climate due to poor representation of Arctic ecosystem processes. The DOE Office of Science Next Generation Ecosystem Experiment (NGEE-Arctic) project aims to reduce uncertainty in the Arctic carbon cycle and its impact on the Earth's climate system through improved representation of the coupled physical, chemical, and biological processes that drive how much buried carbon will be converted to CO2 and CH4, how fast this will happen, which form will dominate, and the degree to which increased plant productivity will offset increased soil carbon emissions. These processes fundamentally depend on the permafrost thaw rate and its influence on surface and subsurface hydrology through thermal erosion, land subsidence, and changes to groundwater flow pathways as soil, bedrock, and alluvial pore ice and massive ground ice melt. LANL and its NGEE colleagues are co-developing data and models to better understand controls on permafrost degradation and improve prediction of the evolution of permafrost and its impact on Arctic hydrology. The LANL Advanced Terrestrial Simulator (ATS) was built using a state-of-the-art HPC software framework to enable the first fully coupled 3-dimensional surface-subsurface thermal-hydrology and land surface deformation simulations of the evolution of the physical Arctic environment. Here we show how field data, including hydrology, snow, vegetation, geochemistry, and soil properties, are informing the development and application of the ATS to improve understanding of controls on permafrost stability and permafrost hydrology. The ATS is being used to inform parameterizations of complex coupled physical, ecological, and biogeochemical processes for implementation in the DOE ACME land model, to better predict the role of changing Arctic hydrology in the global climate system. LA-UR-17-26566.

  18. Predicting Key Agronomic Soil Properties with UV-Vis Fluorescence Measurements Combined with Vis-NIR-SWIR Reflectance Spectroscopy: A Farm-Scale Study in a Mediterranean Viticultural Agroecosystem.

    PubMed

    Vaudour, Emmanuelle; Cerovic, Zoran G; Ebengo, Dav M; Latouche, Gwendal

    2018-04-10

    For adequate crop and soil management, rapid and accurate techniques for monitoring soil properties are particularly important when a farmer starts up his activities and needs a diagnosis of his cultivated fields. This study aimed to evaluate the potential of fluorescence, measured directly on 146 whole soil solid samples, for predicting key soil properties at the scale of a 6 ha Mediterranean wine estate with contrasting soils. UV-Vis fluorescence measurements were carried out in conjunction with reflectance measurements in the Vis-NIR-SWIR range. Combining PLSR predictions from Vis-NIR-SWIR reflectance spectra and from a set of fluorescence signals enabled us to improve the power of prediction of a number of key agronomic soil properties including SOC, Ntot, CaCO3, iron, fine particle-sizes (clay, fine silt, fine sand), CEC, pH and exchangeable Ca2+ with cross-validation RPD ≥ 2 and R² ≥ 0.75, while exchangeable K+, Na+, Mg2+, coarse silt and coarse sand contents were fairly predicted (1.42 ≤ RPD < 2 and 0.54 ≤ R² < 0.75). Predictions of SOC, Ntot, CaCO3, iron contents, and pH were still good (RPD ≥ 1.8, R² ≥ 0.68) when using a single fluorescence signal or index such as SFR_R or FERARI, highlighting the unexpected importance of red excitations and indices derived from plant studies. The predictive ability of single fluorescence indices or original signals was very significant for topsoil: this is very important for a farmer who wishes to update information on soil nutrients for the purpose of fertility diagnosis and particularly nitrogen fertilization. These results open encouraging perspectives for using miniaturized fluorescence devices enabling red excitation coupled with red or far-red fluorescence emissions directly in the field.
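
    A minimal sketch of the reported workflow, assuming synthetic spectra in place of real Vis-NIR-SWIR measurements: partial least squares regression with cross-validation, scored by the same RPD (standard deviation of the reference values divided by RMSE) and R² criteria quoted above.

    ```python
    # Hedged PLSR sketch: predict a soil property from spectra, report RPD and R2.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(1)
    X = rng.normal(size=(146, 600))      # 146 samples x 600 spectral bands (synthetic)
    y = X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=146)  # stand-in for SOC

    y_cv = cross_val_predict(PLSRegression(n_components=8), X, y, cv=10).ravel()
    rmse = np.sqrt(np.mean((y - y_cv) ** 2))
    rpd = y.std(ddof=1) / rmse           # ratio of performance to deviation
    r2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"RPD = {rpd:.2f}, R2 = {r2:.2f}  (RPD >= 2 and R2 >= 0.75 ~ 'good' above)")
    ```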

  19. Predicting Key Agronomic Soil Properties with UV-Vis Fluorescence Measurements Combined with Vis-NIR-SWIR Reflectance Spectroscopy: A Farm-Scale Study in a Mediterranean Viticultural Agroecosystem

    PubMed Central

    Vaudour, Emmanuelle; Cerovic, Zoran G.; Ebengo, Dav M.; Latouche, Gwendal

    2018-01-01

    For adequate crop and soil management, rapid and accurate techniques for monitoring soil properties are particularly important when a farmer starts up his activities and needs a diagnosis of his cultivated fields. This study aimed to evaluate the potential of fluorescence measured directly on 146 whole soil solid samples, for predicting key soil properties at the scale of a 6 ha Mediterranean wine estate with contrasting soils. UV-Vis fluorescence measurements were carried out in conjunction with reflectance measurements in the Vis-NIR-SWIR range. Combining PLSR predictions from Vis-NIR-SWIR reflectance spectra and from a set of fluorescence signals enabled us to improve the power of prediction of a number of key agronomic soil properties including SOC, Ntot, CaCO3, iron, fine particle-sizes (clay, fine silt, fine sand), CEC, pH and exchangeable Ca2+ with cross-validation RPD ≥ 2 and R² ≥ 0.75, while exchangeable K+, Na+, Mg2+, coarse silt and coarse sand contents were fairly predicted (1.42 ≤ RPD < 2 and 0.54 ≤ R² < 0.75). Predictions of SOC, Ntot, CaCO3, iron contents, and pH were still good (RPD ≥ 1.8, R² ≥ 0.68) when using a single fluorescence signal or index such as SFR_R or FERARI, highlighting the unexpected importance of red excitations and indices derived from plant studies. The predictive ability of single fluorescence indices or original signals was very significant for topsoil: this is very important for a farmer who wishes to update information on soil nutrient for the purpose of fertility diagnosis and particularly nitrogen fertilization. These results open encouraging perspectives for using miniaturized fluorescence devices enabling red excitation coupled with red or far-red fluorescence emissions directly in the field. PMID:29642640

  20. Electrical test prediction using hybrid metrology and machine learning

    NASA Astrophysics Data System (ADS)

    Breton, Mary; Chao, Robin; Muthinti, Gangadhara Raja; de la Peña, Abraham A.; Simon, Jacques; Cepler, Aron J.; Sendelbach, Matthew; Gaudiello, John; Emans, Susan; Shifrin, Michael; Etzioni, Yoav; Urenski, Ronen; Lee, Wei Ti

    2017-03-01

    Electrical test measurement in the back-end of line (BEOL) is crucial for wafer and die sorting as well as comparing intended process splits. Any in-line, nondestructive technique in the process flow to accurately predict these measurements can significantly improve mean-time-to-detect (MTTD) of defects and improve cycle times for yield and process learning. Measuring after BEOL metallization is commonly done for process control and learning, particularly with scatterometry (also called OCD (Optical Critical Dimension)), which can solve for multiple profile parameters such as metal line height or sidewall angle and does so within patterned regions. This gives scatterometry an advantage over inline microscopy-based techniques, which provide top-down information, since such techniques can be insensitive to sidewall variations hidden under the metal fill of the trench. But when faced with correlation to electrical test measurements that are specific to the BEOL processing, both techniques face the additional challenge of sampling. Microscopy-based techniques are sampling-limited by their small probe size, while scatterometry is traditionally limited (for microprocessors) to scribe targets that mimic device ground rules but are not necessarily designed to be electrically testable. A solution to this sampling challenge lies in a fast reference-based machine learning capability that allows for OCD measurement directly of the electrically-testable structures, even when they are not OCD-compatible. By incorporating such direct OCD measurements, correlation to, and therefore prediction of, resistance of BEOL electrical test structures is significantly improved. Improvements in prediction capability for multiple types of in-die electrically-testable device structures are demonstrated. To further improve the quality of the prediction of the electrical resistance measurements, hybrid metrology using the OCD measurements as well as X-ray metrology (XRF) is used. Hybrid metrology is the practice of combining information from multiple sources in order to enable or improve the measurement of one or more critical parameters. Here, the XRF measurements are used to detect subtle changes in barrier layer composition and thickness that can have second-order effects on the electrical resistance of the test structures. By accounting for such effects with the aid of the X-ray-based measurements, further improvement in the OCD correlation to electrical test measurements is achieved. Using both types of solution (incorporation of fast reference-based machine learning on non-OCD-compatible test structures, and hybrid metrology combining OCD with XRF technology), improvement in BEOL cycle-time learning could be accomplished through improved prediction capability.
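
    The gain from hybridization can be mimicked with a generic regression sketch: predict resistance from OCD-like profile features, then add XRF-like barrier features and compare. Everything below (features, model, data) is an invented stand-in, not the authors' machine-learning system.

    ```python
    # Hedged sketch: does adding XRF-like features improve resistance prediction?
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n = 400
    ocd = rng.normal(size=(n, 4))   # e.g. line height, CD, sidewall angle (synthetic)
    xrf = rng.normal(size=(n, 2))   # barrier thickness/composition proxies (synthetic)
    resistance = 3 * ocd[:, 0] - 2 * ocd[:, 1] + 0.8 * xrf[:, 0] + rng.normal(size=n)

    for name, X in [("OCD only", ocd), ("OCD + XRF", np.hstack([ocd, xrf]))]:
        score = cross_val_score(GradientBoostingRegressor(), X, resistance,
                                cv=5, scoring="r2").mean()
        print(f"{name}: R^2 = {score:.3f}")
    ```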

  1. Image-based computational fluid dynamics in the lung: virtual reality or new clinical practice?

    PubMed

    Burrowes, Kelly S; De Backer, Jan; Kumar, Haribalan

    2017-11-01

    The development and implementation of personalized medicine is paramount to improving the efficiency and efficacy of patient care. In the respiratory system, function is largely dictated by the choreographed movement of air and blood to the gas exchange surface. The passage of air begins in the upper airways, either via the mouth or nose, and terminates at the alveolar interface, while blood flows from the heart to the alveoli and back again. Computational fluid dynamics (CFD) is a well-established tool for predicting fluid flows and pressure distributions within complex systems. Traditionally CFD has been used to aid in the effective or improved design of a system or device; however, it has become increasingly exploited in biological and medical-based applications further broadening the scope of this computational technique. In this review, we discuss the advancement in application of CFD to the respiratory system and the contributions CFD is currently making toward improving precision medicine. The key areas CFD has been applied to in the pulmonary system are in predicting fluid transport and aerosol distribution within the airways. Here we focus our discussion on fluid flows and in particular on image-based clinically focused CFD in the ventilatory system. We discuss studies spanning from the paranasal sinuses through the conducting airways down to the level of the alveolar airways. The combination of imaging and CFD is enabling improved device design in aerosol transport, improved biomarkers of lung function in clinical trials, and improved predictions and assessment of surgical interventions in the nasal sinuses. WIREs Syst Biol Med 2017, 9:e1392. doi: 10.1002/wsbm.1392. © 2017 Wiley Periodicals, Inc.

  2. Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers

    NASA Astrophysics Data System (ADS)

    Kowalski, Benjamin Andrew

    Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of magnitude in exposure dose (~1 to ~10³ mJ cm⁻²) and three orders of magnitude in feature size (0.35 to 500 microns). The resulting insights enable guided, rational design of new material formulations with demonstrated performance improvement.
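
    The coupled consumption-and-diffusion picture can be illustrated with a toy one-dimensional reaction/diffusion integration; the illumination pattern, rate constants, and periodic boundary conditions below are all invented for illustration, not the dissertation's model.

    ```python
    # Toy 1-D reaction/diffusion: monomer is consumed where light drives
    # polymerization and diffuses in from dark fringes.
    import numpy as np

    nx, dx, dt = 200, 0.1, 0.005          # grid spacing (microns), time step (s)
    D, k = 0.5, 2.0                        # diffusivity, polymerization rate (assumed)
    x = np.arange(nx) * dx
    I = (np.sin(2 * np.pi * x / 5) > 0).astype(float)  # binary fringe pattern
    m = np.ones(nx)                        # monomer concentration (normalized)
    p = np.zeros(nx)                       # polymer concentration

    for _ in range(2000):                  # explicit scheme, periodic boundaries
        lap = (np.roll(m, 1) - 2 * m + np.roll(m, -1)) / dx**2
        rate = k * I * m                   # local polymerization where illuminated
        m += dt * (D * lap - rate)
        p += dt * rate

    print(f"index-contrast proxy (polymer max - min): {p.max() - p.min():.3f}")
    ```

    The time step satisfies the explicit stability condition D·dt/dx² ≤ 1/2; varying D and k against measured index responses is the kind of decoupled characterization the abstract describes.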

  3. Skeletal assessment with finite element analysis: relevance, pitfalls and interpretation.

    PubMed

    Campbell, Graeme Michael; Glüer, Claus-C

    2017-07-01

    Finite element models simulate the mechanical response of bone under load, enabling noninvasive assessment of strength. Models generated from quantitative computed tomography (QCT) incorporate the geometry and spatial distribution of bone mineral density (BMD) to simulate physiological and traumatic loads as well as orthopaedic implant behaviour. The present review discusses the current strengths and weakness of finite element models for application to skeletal biomechanics. In cadaver studies, finite element models provide better estimations of strength compared to BMD. Data from clinical studies are encouraging; however, the superiority of finite element models over BMD measures for fracture prediction has not been shown conclusively, and may be sex and site dependent. Therapeutic effects on bone strength are larger than for BMD; however, model validation has only been performed on untreated bone. High-resolution modalities and novel image processing methods may enhance the structural representation and predictive ability. Despite extensive use of finite element models to study orthopaedic implant stability, accurate simulation of the bone-implant interface and fracture progression remains a significant challenge. Skeletal finite element models provide noninvasive assessments of strength and implant stability. Improved structural representation and implant surface interaction may enable more accurate models of fragility in the future.

  4. Constraint-based modeling in microbial food biotechnology

    PubMed Central

    Rau, Martin H.

    2018-01-01

    Genome-scale metabolic network reconstruction offers a means to leverage the value of the exponentially growing genomics data and integrate it with other biological knowledge in a structured format. Constraint-based modeling (CBM) enables both the qualitative and quantitative analyses of the reconstructed networks. The rapid advancements in these areas can benefit both the industrial production of microbial food cultures and their application in food processing. CBM provides several avenues for improving our mechanistic understanding of physiology and genotype–phenotype relationships. This is essential for the rational improvement of industrial strains, which can further be facilitated through various model-guided strain design approaches. CBM of microbial communities offers a valuable tool for the rational design of defined food cultures, where it can catalyze hypothesis generation and provide unintuitive rationales for the development of enhanced community phenotypes and, consequently, novel or improved food products. In the industrial-scale production of microorganisms for food cultures, CBM may enable a knowledge-driven bioprocess optimization by rationally identifying strategies for growth and stability improvement. Through these applications, we believe that CBM can become a powerful tool for guiding the areas of strain development, culture development and process optimization in the production of food cultures. Nevertheless, in order to make the correct choice of the modeling framework for a particular application and to interpret model predictions in a biologically meaningful manner, one should be aware of the current limitations of CBM. PMID:29588387
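
    At the core of CBM is a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. A toy three-reaction network (invented for illustration), assuming SciPy's linprog:

    ```python
    # Toy flux-balance analysis: maximize biomass flux under mass balance.
    import numpy as np
    from scipy.optimize import linprog

    # Stoichiometric matrix S (rows: metabolites A, B; columns: reactions)
    # R1: -> A,  R2: A -> B,  R3: B -> biomass
    S = np.array([[1, -1,  0],
                  [0,  1, -1]])
    bounds = [(0, 10), (0, None), (0, None)]  # uptake R1 capped at 10
    c = np.array([0, 0, -1])                  # maximize v3 (linprog minimizes)

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("optimal fluxes:", res.x)           # expect [10, 10, 10]
    ```

    Real applications replace this toy S with a genome-scale reconstruction of thousands of reactions; the optimization itself is unchanged.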

  5. Whole genome sequences in pulse crops: a global community resource to expedite translational genomics and knowledge-based crop improvement.

    PubMed

    Bohra, Abhishek; Singh, Narendra P

    2015-08-01

    Unprecedented developments in legume genomics over the last decade have resulted in the acquisition of a wide range of modern genomic resources to underpin genetic improvement of grain legumes. These genome-enabled insights direct investigators in various ways that primarily include unearthing novel structural variations, retrieving lost genetic diversity, introducing novel/exotic alleles from wider gene pools, finely resolving complex quantitative traits and so forth. To this end, the ready availability of cost-efficient and high-density genotyping assays allows genome-wide prediction to be increasingly recognized as the key selection criterion in crop breeding. Further, the high-dimensional measurements of agronomically significant phenotypes obtained by using new-generation screening techniques will empower reference-based resequencing as well as allele mining and trait mapping methods to comprehensively associate genome diversity with phenome-scale variation. Besides stimulating forward genetic systems, accessibility to precisely delineated genomic segments reveals novel candidates for reverse genetic techniques like targeted genome editing. The shifting paradigm in plant genomics in turn necessitates optimization of crop breeding strategies to enable the most efficient integration of advanced omics knowledge and tools. We anticipate that crop improvement schemes will be bolstered remarkably by the rational deployment of these genome-guided approaches, ultimately resulting in expanded plant breeding capacities and improved crop performance.

  6. Planning for subacute care: predicting demand using acute activity data.

    PubMed

    Green, Janette P; McNamee, Jennifer P; Kobel, Conrad; Seraji, Md Habibur R; Lawrence, Suanne J

    2016-01-01

    Objective The aim of the present study was to develop a robust model that uses the concept of 'rehabilitation-sensitive' Diagnosis Related Groups (DRGs) in predicting demand for rehabilitation and geriatric evaluation and management (GEM) care following acute in-patient episodes provided in Australian hospitals. Methods The model was developed using statistical analyses of national datasets, informed by a panel of expert clinicians and jurisdictional advice. Logistic regression analysis was undertaken using acute in-patient data, published national hospital statistics and data from the Australasian Rehabilitation Outcomes Centre. Results The predictive model comprises tables of probabilities that patients will require rehabilitation or GEM care after an acute episode, with columns defined by age group and rows defined by grouped Australian Refined (AR)-DRGs. Conclusions The existing concept of rehabilitation-sensitive DRGs was revised and extended. When applied to national data, the model provided a conservative estimate of 83% of the activity actually provided. An example demonstrates the application of the model for service planning. What is known about the topic? Health service planning is core business for jurisdictions and local areas. With populations ageing and an acknowledgement of the underservicing of subacute care, it is timely to find improved methods of estimating demand for this type of care. Traditionally, age-sex standardised utilisation rates for individual DRGs have been applied to Australian Bureau of Statistics (ABS) population projections to predict the future need for subacute services. Improved predictions became possible when some AR-DRGs were designated 'rehabilitation-sensitive'. This improved methodology has been used in several Australian jurisdictions. What does this paper add? This paper presents a new tool, or model, to predict demand for rehabilitation and GEM services based on in-patient acute activity. In this model, the methodology based on rehabilitation-sensitive AR-DRGs has been extended by updating them to AR-DRG Version 7.0, quantifying the level of 'sensitivity' and incorporating the patient's age to improve the prediction of demand for subacute services. What are the implications for practitioners? The predictive model takes the form of tables of probabilities that patients will require rehabilitation or GEM care after an acute episode and can be applied to acute in-patient administrative datasets in any Australian jurisdiction or local area. The use of patient-level characteristics will enable service planners to improve their forecasting of demand for these services. Clinicians and jurisdictional representatives consulted during the project regarded the model favourably and believed that it was an improvement on currently available methods.
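
    Applied in practice, the model reduces to a lookup of probabilities by grouped AR-DRG and age band, summed over acute episodes to give expected subacute demand. A sketch with invented group codes and probabilities (placeholders, not values from the study):

    ```python
    # Hedged sketch: apply probability tables to an acute in-patient dataset.
    # (grouped AR-DRG, age band) -> P(rehab/GEM care after the acute episode)
    prob_rehab = {
        ("I18", "70+"): 0.42,       # hypothetical values for illustration
        ("I18", "50-69"): 0.21,
        ("B70", "70+"): 0.35,
    }

    episodes = [{"drg": "I18", "age_band": "70+"},
                {"drg": "B70", "age_band": "70+"},
                {"drg": "I18", "age_band": "50-69"}]

    # Expected number of subacute episodes = sum of per-episode probabilities.
    expected_demand = sum(prob_rehab.get((e["drg"], e["age_band"]), 0.0)
                          for e in episodes)
    print(f"expected subacute episodes: {expected_demand:.2f}")
    ```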

  7. Comprehensive curation and analysis of global interaction networks in Saccharomyces cerevisiae

    PubMed Central

    Reguly, Teresa; Breitkreutz, Ashton; Boucher, Lorrie; Breitkreutz, Bobby-Joe; Hon, Gary C; Myers, Chad L; Parsons, Ainslie; Friesen, Helena; Oughtred, Rose; Tong, Amy; Stark, Chris; Ho, Yuen; Botstein, David; Andrews, Brenda; Boone, Charles; Troyanskya, Olga G; Ideker, Trey; Dolinski, Kara; Batada, Nizar N; Tyers, Mike

    2006-01-01

    Background The study of complex biological networks and prediction of gene function has been enabled by high-throughput (HTP) methods for detection of genetic and protein interactions. Sparse coverage in HTP datasets may, however, distort network properties and confound predictions. Although a vast number of well substantiated interactions are recorded in the scientific literature, these data have not yet been distilled into networks that enable system-level inference. Results We describe here a comprehensive database of genetic and protein interactions, and associated experimental evidence, for the budding yeast Saccharomyces cerevisiae, as manually curated from over 31,793 abstracts and online publications. This literature-curated (LC) dataset contains 33,311 interactions, on the order of all extant HTP datasets combined. Surprisingly, HTP protein-interaction datasets currently achieve only around 14% coverage of the interactions in the literature. The LC network nevertheless shares attributes with HTP networks, including scale-free connectivity and correlations between interactions, abundance, localization, and expression. We find that essential genes or proteins are enriched for interactions with other essential genes or proteins, suggesting that the global network may be functionally unified. This interconnectivity is supported by a substantial overlap of protein and genetic interactions in the LC dataset. We show that the LC dataset considerably improves the predictive power of network-analysis approaches. The full LC dataset is available at the BioGRID () and SGD () databases. Conclusion Comprehensive datasets of biological interactions derived from the primary literature provide critical benchmarks for HTP methods, augment functional prediction, and reveal system-level attributes of biological networks. PMID:16762047

  8. Modelling Complexity: Making Sense of Leadership Issues in 14-19 Education

    ERIC Educational Resources Information Center

    Briggs, Ann R. J.

    2008-01-01

    Modelling of statistical data is a well established analytical strategy. Statistical data can be modelled to represent, and thereby predict, the forces acting upon a structure or system. For the rapidly changing systems in the world of education, modelling enables the researcher to understand, to predict and to enable decisions to be based upon…

  9. Material Characterization for Ductile Fracture Prediction

    NASA Technical Reports Server (NTRS)

    Hill, Michael R.

    2000-01-01

    The research summarized in this document provides valuable information for structural health evaluation of NASA infrastructure. Specifically, material properties are reported which will enable calibration of ductile fracture prediction methods for three high-toughness metallic materials and one aluminum alloy which can be found in various NASA facilities. The task of investigating these materials has also served to validate an overall methodology for ductile fracture prediction that is currently being employed at NASA. In facilitating the ability to incorporate various materials into the prediction scheme, we have provided data to enable demonstration of the overall generality of the approach.

  10. Advances in traction drive technology

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.; Anderson, N. E.; Rohn, D. A.

    1983-01-01

    Traction drives are traced from early uses as main transmissions in automobiles at the turn of the century to modern, high-powered traction drives capable of transmitting hundreds of horsepower. Recent advances in technology are described which enable today's traction drive to be a serious candidate for off-highway vehicles and helicopter applications. Improvements in materials, traction fluids, design techniques, power loss and life prediction methods will be highlighted. Performance characteristics of the Nasvytis fixed-ratio drive are given. Promising future drive applications, such as helicopter main transmissions and servo-control positioning mechanisms are also addressed.

  11. NASA Applied Sciences Program

    NASA Technical Reports Server (NTRS)

    Frederick, Martin

    2006-01-01

    This presentation highlights the NASA Applied Sciences Program. The goal of the program is to extend the results of scientific research and knowledge beyond the science community to contribute to NASA's partners' applications of national priority, such as agricultural efficiency, energy management and Homeland Security. Another purpose of the program's scientific research is to increase knowledge of the Earth-Sun system to enable improved predictions of climate, weather, and natural hazards. The program primarily optimizes benefits for citizens by contributing to partnering on applications that are used by state, local and tribal governments.

  12. Seismographs, sensors, and satellites: Better technology for safer communities

    USGS Publications Warehouse

    Groat, C.G.

    2004-01-01

    In the past 25 years, our ability to measure, monitor, and model the processes that lead to natural disasters has increased dramatically. Equally important has been the improvement in our technological capability to communicate information about hazards to those whose lives may be affected. These innovations in tracking and communicating the changes (floods, earthquakes, wildfires, volcanic eruptions) in our dynamic planet, supported by a deeper understanding of earth processes, enable us to expand our predictive capabilities and point the way to a safer future. © 2004 Elsevier Ltd. All rights reserved.

  13. NAPR: a Cloud-Based Framework for Neuroanatomical Age Prediction.

    PubMed

    Pardoe, Heath R; Kuzniecky, Ruben

    2018-01-01

    The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machines and (ii) Gaussian processes machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.
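
    A hedged sketch of the second of the two modeling approaches, Gaussian-process regression of age on cortical-thickness features, using random stand-in data rather than Freesurfer output:

    ```python
    # Hedged sketch: GP regression of age on cortical-thickness features.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    X = rng.normal(size=(300, 50))     # stand-in for mean thickness per region
    age = 40 + 10 * X[:, 0] + rng.normal(scale=5, size=300)   # synthetic ages

    X_tr, X_te, y_tr, y_te = train_test_split(X, age, random_state=0)
    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X_tr, y_tr)
    pred, sd = gpr.predict(X_te, return_std=True)
    print(f"MAE = {np.abs(pred - y_te).mean():.1f} years "
          f"(mean predictive SD = {sd.mean():.1f})")
    ```

    In the SaaS framing, a service like NAPR would host the fitted model server-side and accept user-supplied thickness maps for out-of-sample prediction.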

  14. Latin hypercube approach to estimate uncertainty in ground water vulnerability

    USGS Publications Warehouse

    Gurdak, J.J.; McCray, J.E.; Thyne, G.; Qi, S.L.

    2007-01-01

    A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability. © 2007 National Ground Water Association.
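
    The propagation step can be sketched directly: draw Latin hypercube samples for the regression coefficients (model error) and an explanatory variable (data error), push them through the logistic function, and read off prediction intervals. The distribution parameters below are illustrative only, not values from the study.

    ```python
    # Hedged sketch of LHS uncertainty propagation through a logistic model.
    import numpy as np
    from scipy.stats import qmc, norm

    sampler = qmc.LatinHypercube(d=3, seed=0)
    u = sampler.random(n=1000)                    # uniform LHS samples in [0,1)^3

    b0 = norm.ppf(u[:, 0], loc=-2.0, scale=0.3)   # intercept (model error)
    b1 = norm.ppf(u[:, 1], loc=0.8, scale=0.1)    # coefficient (model error)
    x = norm.ppf(u[:, 2], loc=1.5, scale=0.4)     # explanatory variable (data error)

    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))      # predicted vulnerability
    lo, hi = np.percentile(p, [2.5, 97.5])
    print(f"median P = {np.median(p):.2f}, 95% interval = [{lo:.2f}, {hi:.2f}]")
    ```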

  15. THE FUTURE OF TOXICOLOGY-PREDICTIVE TOXICOLOGY ...

    EPA Pesticide Factsheets

    A chemistry approach to predictive toxicology relies on structure−activity relationship (SAR) modeling to predict biological activity from chemical structure. Such approaches have proven capabilities when applied to well-defined toxicity end points or regions of chemical space. These approaches are less well-suited, however, to the challenges of global toxicity prediction, i.e., to predicting the potential toxicity of structurally diverse chemicals across a wide range of end points of regulatory and pharmaceutical concern. New approaches that have the potential to significantly improve capabilities in predictive toxicology are elaborating the “activity” portion of the SAR paradigm. Recent advances in two areas of endeavor are particularly promising. Toxicity data informatics relies on standardized data schema, developed for particular areas of toxicological study, to facilitate data integration and enable relational exploration and mining of data across both historical and new areas of toxicological investigation. Bioassay profiling refers to large-scale high-throughput screening approaches that use chemicals as probes to broadly characterize biological response space, extending the concept of chemical “properties” to the biological activity domain. The effective capture and representation of legacy and new toxicity data into mineable form and the large-scale generation of new bioassay data in relation to chemical toxicity, both employing chemical stru

  16. The Incremental Value of Subjective and Quantitative Assessment of 18F-FDG PET for the Prediction of Pathologic Complete Response to Preoperative Chemoradiotherapy in Esophageal Cancer.

    PubMed

    van Rossum, Peter S N; Fried, David V; Zhang, Lifei; Hofstetter, Wayne L; van Vulpen, Marco; Meijer, Gert J; Court, Laurence E; Lin, Steven H

    2016-05-01

    A reliable prediction of a pathologic complete response (pathCR) to chemoradiotherapy before surgery for esophageal cancer would enable investigators to study the feasibility and outcome of an organ-preserving strategy after chemoradiotherapy. So far no clinical parameters or diagnostic studies are able to accurately predict which patients will achieve a pathCR. The aim of this study was to determine whether subjective and quantitative assessment of baseline and postchemoradiation (18)F-FDG PET can improve the accuracy of predicting pathCR to preoperative chemoradiotherapy in esophageal cancer beyond clinical predictors. This retrospective study was approved by the institutional review board, and the need for written informed consent was waived. Clinical parameters along with subjective and quantitative parameters from baseline and postchemoradiation (18)F-FDG PET were derived from 217 esophageal adenocarcinoma patients who underwent chemoradiotherapy followed by surgery. The associations between these parameters and pathCR were studied in univariable and multivariable logistic regression analysis. Four prediction models were constructed and internally validated using bootstrapping to study the incremental predictive values of subjective assessment of (18)F-FDG PET, conventional quantitative metabolic features, and comprehensive (18)F-FDG PET texture/geometry features, respectively. The clinical benefit of (18)F-FDG PET was determined using decision-curve analysis. A pathCR was found in 59 (27%) patients. A clinical prediction model (corrected c-index, 0.67) was improved by adding (18)F-FDG PET-based subjective assessment of response (corrected c-index, 0.72). This latter model was slightly improved by the addition of 1 conventional quantitative metabolic feature only (i.e., postchemoradiation total lesion glycolysis; corrected c-index, 0.73), and even more by subsequently adding 4 comprehensive (18)F-FDG PET texture/geometry features (corrected c-index, 0.77). However, at a decision threshold of 0.9 or higher, representing a clinically relevant predictive value for pathCR at which one may be willing to omit surgery, there was no clear incremental value. Subjective and quantitative assessment of (18)F-FDG PET provides statistical incremental value for predicting pathCR after preoperative chemoradiotherapy in esophageal cancer. However, the discriminatory improvement beyond clinical predictors does not translate into a clinically relevant benefit that could change decision making. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
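
    The "corrected" c-indices above come from internal validation by bootstrapping; a generic optimism-correction sketch on synthetic stand-in data (not the study's variables) looks like this:

    ```python
    # Hedged sketch: bootstrap optimism correction of a c-index (AUC).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    X = rng.normal(size=(217, 5))                      # 217 patients, 5 predictors
    y = (X[:, 0] + rng.normal(size=217)) > 1           # stand-in for pathCR

    model = LogisticRegression().fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

    optimism = []
    for _ in range(200):                               # bootstrap resamples
        idx = rng.integers(0, len(y), len(y))
        m = LogisticRegression().fit(X[idx], y[idx])
        boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
        test = roc_auc_score(y, m.predict_proba(X)[:, 1])
        optimism.append(boot - test)                   # in-sample minus out-of-sample

    print(f"corrected c-index = {apparent - np.mean(optimism):.3f}")
    ```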

  17. Time-resolved fast-neutron radiography of air-water two-phase flows in a rectangular channel by an improved detection system

    NASA Astrophysics Data System (ADS)

    Zboray, Robert; Dangendorf, Volker; Mor, Ilan; Bromberger, Benjamin; Tittelmeier, Kai

    2015-07-01

    In a previous work, we demonstrated the feasibility of high-frame-rate, fast-neutron radiography of generic air-water two-phase flows in a 1.5 cm thick, rectangular flow channel. The experiments were carried out at the high-intensity, white-beam facility of the Physikalisch-Technische Bundesanstalt, Germany, using a multi-frame, time-resolved detector developed for fast neutron resonance radiography. The results were, however, not fully optimal, and we therefore decided to modify the detector and optimize it for the given application, as described in the present work. Furthermore, we managed to improve the image post-processing methodology and the noise suppression. Using the tailored detector and the improved post-processing, a significant increase in image quality and an order-of-magnitude reduction in exposure times, down to 3.33 ms, have been achieved with minimized motion artifacts. As in the previous study, different two-phase flow regimes such as bubbly, slug, and churn flows have been examined. The enhanced imaging quality enables an improved prediction of two-phase flow parameters like the instantaneous volumetric gas fraction, bubble size, and bubble velocities. Instantaneous velocity fields around the gas enclosures can also be predicted more robustly than previously using optical flow methods.

  18. Conclusions about Interventions, Programs, and Approaches for Improving Executive Functions that appear Justified and those that, despite much hype, do not

    PubMed Central

    Diamond, Adele; Ling, Daphne S.

    2015-01-01

    The ‘Executive Functions’ (EFs) of inhibitory control, working memory, and cognitive flexibility enable us to think before we act, resist temptations or habitual reactions, stay focused, mentally play with ideas, reason, problem-solve, flexibly adjust to changed demands or priorities, and see things from new and different perspectives. These skills are critical for success in all life’s aspects. They are sometimes more predictive than even IQ or socioeconomic status. Understandably, there is great interest in improving EFs. It’s now clear they can be improved at any age through training and practice, much as physical exercise hones physical fitness. It also appears though, despite claims to the contrary, that wide transfer does not seem to occur and aerobic exercise per se does little to improve EFs. Important questions remain including: Are benefits just ephemeral and superficial? How much can EFs be improved and how long can benefits be sustained? What are the best methods for improving EFs? What about an approach accounts for its success? Do the answers to any of these differ by individual characteristics such as age or gender? Since stress, sadness, loneliness, or poor health impair EFs, and the reverse enhances EFs, I predict that approaches that not only directly train EFs but also indirectly support EFs by addressing emotional, social, and physical needs will be the most successful at improving EFs. PMID:26749076

  19. An approximate model for cancellous bone screw fixation.

    PubMed

    Brown, C J; Sinclair, R A; Day, A; Hess, B; Procter, P

    2013-04-01

    This paper presents a finite element (FE) model to identify parameters that affect the performance of an improved cancellous bone screw fixation technique, and hence potentially improve fracture treatment. In cancellous bone of low apparent density, it can be difficult to achieve adequate screw fixation and hence provide stable fracture fixation that enables bone healing. Data from predictive FE models indicate that cements can have a significant potential to improve screw holding power in cancellous bone. These FE models are used to demonstrate the key parameters that determine pull-out strength in a variety of screw, bone and cement set-ups, and to compare the effectiveness of different configurations. The paper concludes that significant advantages, up to an order of magnitude, in screw pull-out strength in cancellous bone might be gained by the appropriate use of a currently approved calcium phosphate cement.

  20. The ergonomics of vertical turret lathe operation.

    PubMed

    Pratt, F M; Corlett, E N

    1970-12-01

    A study of the work load of 14 vertical turret lathe operators engaged on different work tasks in two factories is reported. For eight of these workers continuous heart rate recordings were made throughout the day. It was shown that in four cases improved technology was unlikely to lead to higher output and certain aspects of posture and equipment manipulation were major contributors to the limitations on increased output. The role of the work-rest schedule in increasing work loads was also demonstrated. Improvements in technology and methods to reduce the extent of certain work loads to enable heavy work to be done in shorter periods followed by light work or rest periods are given as means to modify and improve the output of these machines. Finally, the direction for the development of a predictive model for man-machine matching is introduced.

  1. Prediction of space sickness in astronauts from preflight fluid, electrolyte, and cardiovascular variables and Weightless Environment Training Facility (WETF) training

    NASA Technical Reports Server (NTRS)

    Simanonok, K.; Mosely, E.; Charles, J.

    1992-01-01

    Nine preflight variables related to fluid, electrolyte, and cardiovascular status from 64 first-time Shuttle crewmembers were differentially weighted by discriminant analysis to predict the incidence and severity of each crewmember's space sickness as rated by NASA flight surgeons. The nine variables are serum uric acid, red cell count, environmental temperature at the launch site, serum phosphate, urine osmolality, serum thyroxine, sitting systolic blood pressure, calculated blood volume, and serum chloride. Using two methods of cross-validation on the original samples (jackknife and a stratified random subsample), these variables enable the prediction of space sickness incidence (NONE or SICK) with 80 percent success, and of space sickness severity (NONE, MILD, MODERATE, or SEVERE) with 59 percent success by one method of cross-validation and 67 percent by the other. Addition of a tenth variable, hours spent in the Weightless Environment Training Facility (WETF), did not improve the prediction of space sickness incidence but did improve the prediction of space sickness severity, to 66 percent success by the first method of cross-validation and to 71 percent by the second. Results to date suggest the presence of predisposing physiologic factors for space sickness that implicate a fluid-shift etiology. The data also suggest that prior exposure to fluid shifts during WETF training may produce some circulatory pre-adaptation to fluid shifts in weightlessness that results in a reduction of space sickness severity.
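
    The weighting-plus-jackknife procedure corresponds to a discriminant analysis with leave-one-out cross-validation; a sketch on random placeholder data of the same dimensions (64 crewmembers, nine variables), not the study's measurements:

    ```python
    # Hedged sketch: LDA with jackknife (leave-one-out) cross-validation.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(5)
    X = rng.normal(size=(64, 9))       # stand-ins for uric acid, red cell count, ...
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=64)) > 0   # NONE vs SICK

    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                          cv=LeaveOneOut()).mean()
    print(f"jackknife (leave-one-out) accuracy: {acc:.0%}")
    ```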

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sigeti, David E.; Pelak, Robert A.

    We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used as a kind of 'plan B metric' in the case that the analysis shows that θ is close to 1/2, and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a general beta-function prior for θ, enabling sequential analysis in which a small number of new simulations may be done and the resulting posterior for θ used as a prior to inform the next stage of power analysis.
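
    The binomial/beta machinery is compact enough to state outright: with k "new code better" outcomes in n paired comparisons and a Beta(a, b) prior, the posterior for θ is Beta(a + k, b + n - k), and the confidence that an improvement occurred is P(θ > 1/2 | data). A worked sketch with hypothetical counts:

    ```python
    # Worked sketch of the beta-binomial update and hypothesis test.
    from scipy.stats import beta

    a, b = 1.0, 1.0        # uniform prior (a special case of the beta-function prior)
    n, k = 20, 15          # hypothetical: new code better in 15 of 20 comparisons

    posterior = beta(a + k, b + n - k)
    conf = posterior.sf(0.5)              # P(theta > 1/2 | data)
    print(f"P(improvement) = {conf:.3f}, posterior SD = {posterior.std():.3f}")
    ```

    The posterior standard deviation printed here is the quantity the authors propose as a "plan B metric" when θ sits close to 1/2; for sequential analysis, this posterior simply becomes the prior for the next batch of simulations.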

  3. Computational Fluid Dynamics of Whole-Body Aircraft

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh

    1999-01-01

    The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  4. Report for MaRIE Drivers Workshop on needs for energetic material's studies.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Specht, Paul Elliott

    Energetic materials (i.e. explosives, propellants, and pyrotechnics) have complex mesoscale features that influence their dynamic response. Direct measurement of the complex mechanical, thermal, and chemical response of energetic materials is critical for improving computational models and enabling predictive capabilities. Many of the physical phenomena of interest in energetic materials cover time and length scales spanning several orders of magnitude. Examples include chemical interactions in the reaction zone, the distribution and evolution of temperature fields, mesoscale deformation in heterogeneous systems, and phase transitions. This is particularly true for spontaneous phenomena, like thermal cook-off. The ability of MaRIE to capture multiple length scales and stochastic phenomena can significantly advance our understanding of energetic materials and yield more realistic, predictive models.

  5. Gallium arsenide solar cell efficiency: Problems and potential

    NASA Technical Reports Server (NTRS)

    Weizer, V. G.; Godlewski, M. P.

    1985-01-01

    Under ideal conditions the GaAs solar cell should be able to operate at an AMO efficiency exceeding 27 percent, whereas to date the best measured efficiencies barely exceed 19 percent. Of more concern is the fact that there has been no improvement in the past half decade, despite the expenditure of considerable effort. State-of-the-art GaAs efficiency is analyzed in an attempt to determine the feasibility of improving on the status quo, and the possible gains to be had in the planar cell are examined. An attempt is also made to predict the efficiency levels that could be achieved with a grating geometry. Both the N-base and the P-base GaAs cells in their planar configurations have the potential to operate at AMO efficiencies between 23 and 24 percent. For the former the enabling technology is essentially in hand, while for the latter the problem of passivating the emitter surface remains to be solved. In the dot-grating configuration, P-base efficiencies approaching 26 percent are possible with minor improvements in existing technology. N-base grating cell efficiencies comparable to those predicted for the P-base cell are achievable if the N surface can be sufficiently passivated.

  6. Design of an optical system for interrogation of implanted luminescent sensors and verification with silicone skin phantoms.

    PubMed

    Long, Ruiqi; McShane, Mike

    2012-09-01

    Implantable luminescent sensors are being developed for on-demand monitoring of blood glucose levels. For these sensors to be deployed in vivo, a matched external hardware system is needed. In this paper, we designed a compact, low-cost optical system with highly efficient photon delivery and collection using advanced optical modeling software. Compared to interrogation with a fiber bundle, the new system was predicted to improve interrogation efficiency by a factor of 200 for native sensors; an improvement of 37 times was predicted for sensors implanted at a depth of 1 mm in a skin-simulating phantom. A physical prototype was tested using silicone-based skin phantoms developed specifically to mimic the scattering and absorbing properties of human skin. The experimental evaluations revealed that the prototype device performed in agreement with expectations from simulation results, resulting in an overall improvement of over 2000 times. This efficient system enables use of a low-cost commercial spectrometer for recording sensor emission, which was not possible using only fiber optic delivery and collection, and will be used as a tool for in vivo studies with animal models or human subjects.

  7. Can high resolution 3D topographic surveys provide reliable grain size estimates in gravel bed rivers?

    NASA Astrophysics Data System (ADS)

    Pearson, E.; Smith, M. W.; Klaar, M. J.; Brown, L. E.

    2017-09-01

    High resolution topographic surveys such as those provided by Structure-from-Motion (SfM) contain a wealth of information that is not always exploited in the generation of Digital Elevation Models (DEMs). In particular, several authors have related sub-metre scale topographic variability (or 'surface roughness') to sediment grain size by deriving empirical relationships between the two. In fluvial applications, such relationships permit rapid analysis of the spatial distribution of grain size over entire river reaches, providing improved data to drive three-dimensional hydraulic models, allowing rapid geomorphic monitoring of sub-reach river restoration projects, and enabling more robust characterisation of riverbed habitats. However, comparison of previously published roughness-grain-size relationships shows substantial variability between field sites. Using a combination of over 300 laboratory and field-based SfM surveys, we demonstrate the influence of inherent survey error, irregularity of natural gravels, particle shape, grain packing structure, sorting, and form roughness on roughness-grain-size relationships. Roughness analysis from SfM datasets can accurately predict the diameter of smooth hemispheres, though natural, irregular gravels result in a higher roughness value for a given diameter and different grain shapes yield different relationships. A suite of empirical relationships is presented as a decision tree which improves predictions of grain size. By accounting for differences in patch facies, large improvements in D50 prediction are possible. SfM is capable of providing accurate grain size estimates, although further refinement is needed for poorly sorted gravel patches, for which c-axis percentiles are better predicted than b-axis percentiles.
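
    A minimal sketch of the calibration step described above, under the simplifying assumption of a single linear roughness-grain-size relationship (the paper's decision tree distinguishes facies, grain shape, and sorting); all numbers are illustrative:

    ```python
    # Hypothetical co-located calibration data: detrended sub-metre SfM
    # surface roughness (m) and sieved median grain size D50 (m).
    import numpy as np

    roughness = np.array([0.011, 0.018, 0.024, 0.033, 0.041])
    d50 = np.array([0.019, 0.031, 0.044, 0.060, 0.078])

    # Fit D50 = a * roughness + b, the usual form of published
    # roughness-grain-size relationships.
    a, b = np.polyfit(roughness, d50, 1)
    print(f"D50 ~= {a:.2f} * roughness + {b:.4f} m")

    # Apply the calibration to a patch surveyed only by SfM.
    print(f"Predicted D50: {a * 0.028 + b:.3f} m")
    ```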

  8. Incorporating discrete event simulation into quality improvement efforts in health care systems.

    PubMed

    Rutberg, Matthew Harris; Wenczel, Sharon; Devaney, John; Goldlust, Eric Jonathan; Day, Theodore Eugene

    2015-01-01

    Quality improvement (QI) efforts are an indispensable aspect of health care delivery, particularly in an environment of increasing financial and regulatory pressures. The ability to test predictions of proposed changes to flow, policy, staffing, and other process-level changes using discrete event simulation (DES) has shown significant promise and is well reported in the literature. This article describes how to incorporate DES into QI departments and programs in order to support QI efforts, develop high-fidelity simulation models, conduct experiments, make recommendations, and support adoption of results. The authors describe how DES-enabled QI teams can partner with clinical services and administration to plan, conduct, and sustain QI investigations. © 2013 by the American College of Medical Quality.

  9. Fault tolerant and lifetime control architecture for autonomous vehicles

    NASA Astrophysics Data System (ADS)

    Bogdanov, Alexander; Chen, Yi-Liang; Sundareswaran, Venkataraman; Altshuler, Thomas

    2008-04-01

    Increased vehicle autonomy, survivability, and utility can have an unprecedented impact on mission success and are among the most desirable improvements for modern autonomous vehicles. We propose a general architecture of intelligent resource allocation, reconfigurable control, and system restructuring for autonomous vehicles. The architecture is based on fault-tolerant control and lifetime-prediction principles, and it provides improved vehicle survivability, extended service intervals, and greater operational autonomy through a lower rate of time-critical mission failures and less dependence on supplies and maintenance. The architecture enables mission distribution, adaptation, and execution subject to vehicle and payload faults and the desired lifetime. The proposed architecture will allow missions to be managed more efficiently by weighing vehicle capabilities against mission objectives and replacing the vehicle only when necessary.

  10. Genomic selection models double the accuracy of predicted breeding values for bacterial cold water disease resistance compared to a traditional pedigree-based model in rainbow trout aquaculture.

    PubMed

    Vallejo, Roger L; Leeds, Timothy D; Gao, Guangtu; Parsons, James E; Martin, Kyle E; Evenhuis, Jason P; Fragomeni, Breno O; Wiens, Gregory D; Palti, Yniv

    2017-02-01

    Previously, we have shown that bacterial cold water disease (BCWD) resistance in rainbow trout can be improved using traditional family-based selection, but progress has been limited to exploiting only between-family genetic variation. Genomic selection (GS) is a new alternative that enables exploitation of within-family genetic variation. We compared three GS models [single-step genomic best linear unbiased prediction (ssGBLUP), weighted ssGBLUP (wssGBLUP), and BayesB] to predict genomic-enabled breeding values (GEBV) for BCWD resistance in a commercial rainbow trout population, and compared the accuracy of GEBV to traditional estimates of breeding values (EBV) from a pedigree-based BLUP (P-BLUP) model. We also assessed the impact of sampling design on the accuracy of GEBV predictions. For these comparisons, we used BCWD survival phenotypes recorded on 7893 fish from 102 families, of which 1473 fish from 50 families had genotypes [57 K single nucleotide polymorphism (SNP) array]. Naïve siblings of the training fish (n = 930 testing fish) were genotyped to predict their GEBV and mated to produce 138 progeny testing families. In the following generation, 9968 progeny were phenotyped to empirically assess the accuracy of GEBV predictions made on their non-phenotyped parents. The accuracy of GEBV from all tested GS models was substantially higher than that of the P-BLUP model EBV. The highest increase in accuracy relative to the P-BLUP model was achieved with BayesB (97.2 to 108.8%), followed by wssGBLUP at iterations 2 (94.4 to 97.1%) and 3 (88.9 to 91.2%) and ssGBLUP (83.3 to 85.3%). Reducing the training sample size to n = ~1000 had no negative impact on the accuracy (0.67 to 0.72), but with n = ~500 the accuracy dropped to 0.53 to 0.61 if the training and testing fish were full-sibs, and substantially lower, to 0.22 to 0.25, when they were not. Using progeny performance data, we showed that the accuracy of genomic predictions is substantially higher than estimates obtained from the traditional pedigree-based BLUP model for BCWD resistance. Overall, we found that, even with a much smaller training sample than in similar livestock studies, GS can substantially improve selection accuracy and genetic gains for this trait in a commercial rainbow trout breeding population.
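
    For reference, the GBLUP variants compared above share a standard linear mixed-model form; the statement below is generic (the study's ssGBLUP and wssGBLUP differ in how the relationship matrix is built and weighted):

    ```latex
    % Generic GBLUP mixed model: y phenotypes, b fixed effects,
    % u genomic breeding values, G the genomic relationship matrix.
    \mathbf{y} = \mathbf{X}\mathbf{b} + \mathbf{Z}\mathbf{u} + \mathbf{e},
    \qquad
    \mathbf{u} \sim N\!\left(\mathbf{0},\, \mathbf{G}\sigma_u^{2}\right),
    \qquad
    \mathbf{e} \sim N\!\left(\mathbf{0},\, \mathbf{I}\sigma_e^{2}\right)
    ```

    In single-step GBLUP, the genomic matrix is blended with the pedigree relationship matrix into a combined matrix, which is what allows the 7893 phenotyped and 1473 genotyped fish to be evaluated jointly.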

  11. Computational fluid dynamic modelling of cavitation

    NASA Technical Reports Server (NTRS)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating thermodynamic effects of cryogenic fluids. Extensions of the current two-dimensional steady-state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.

  12. Numerical Investigation of Vertical Plunging Jet Using a Hybrid Multifluid–VOF Multiphase CFD Solver

    DOE PAGES

    Shonibare, Olabanji Y.; Wardle, Kent E.

    2015-06-28

    A novel hybrid multiphase flow solver has been used to conduct simulations of a vertical plunging liquid jet. This solver combines a multifluid methodology with selective interface sharpening to enable simulation of both the initial jet impingement and the long-time entrained bubble plume phenomena. Models are implemented for variable bubble size capturing and dynamic switching of interface sharpened regions to capture transitions between the initially fully segregated flow types into the dispersed bubbly flow regime. It was found that the solver was able to capture the salient features of the flow phenomena under study, and areas for quantitative improvement have been explored and identified. In particular, a population balance approach is employed and detailed calibration of the underlying models with experimental data is required to enable quantitative prediction of bubble size and distribution to capture the transition between segregated and dispersed flow types with greater fidelity.

  13. Merlin: Computer-Aided Oligonucleotide Design for Large Scale Genome Engineering with MAGE.

    PubMed

    Quintin, Michael; Ma, Natalie J; Ahmed, Samir; Bhatia, Swapnil; Lewis, Aaron; Isaacs, Farren J; Densmore, Douglas

    2016-06-17

    Genome engineering technologies now enable precise manipulation of organism genotype, but can be limited in scalability by their design requirements. Here we describe Merlin (http://merlincad.org), an open-source web-based tool to assist biologists in designing experiments using multiplex automated genome engineering (MAGE). Merlin provides methods to generate pools of single-stranded DNA oligonucleotides (oligos) for MAGE experiments by performing free energy calculation and BLAST scoring on a sliding window spanning the targeted site. These oligos are designed not only to improve recombination efficiency, but also to minimize off-target interactions. The application further assists experiment planning by reporting predicted allelic replacement rates after multiple MAGE cycles, and enables rapid result validation by generating primer sequences for multiplexed allele-specific colony PCR. Here we describe the Merlin oligo and primer design procedures and validate their functionality compared to OptMAGE by eliminating seven AvrII restriction sites from the Escherichia coli genome.

  14. FY2016 Ceramic Fuels Development Annual Highlights

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mcclellan, Kenneth James

    Key challenges for the Advanced Fuels Campaign are the development of fuel technologies to enable major increases in fuel performance (safety, reliability, power and burnup) beyond current technologies, and development of characterization methods and predictive fuel performance models to enable more efficient development and licensing of advanced fuels. Ceramic fuel development activities for fiscal year 2016 fell within the areas of 1) National and International Technical Integration, 2) Advanced Accident Tolerant Ceramic Fuel Development, 3) Advanced Techniques and Reference Materials Development, and 4) Fabrication of Enriched Ceramic Fuels. High uranium density fuels were the focus of the ceramic fuels efforts. Accomplishments for FY16 primarily reflect the prioritization of identification and assessment of new ceramic fuels for light water reactors which have enhanced accident tolerance while also maintaining or improving normal operation performance, and exploration of advanced post irradiation examination techniques which will support more efficient testing and qualification of new fuel systems.

  15. Policy perspectives on the emerging pathways of personalized medicine

    PubMed Central

    Downing, Gregory J.

    2009-01-01

    Remarkable advances in the fundamental knowledge about the biological basis of disease and technical advances in methods to assess genomic information have led the health care system to the threshold of personalized medicine. It is now feasible to consider strategic application of genomic information to guide patient management by being predictive, preemptive, and preventive, and enabling patient participation in medical decisions. Early evidence of this transition has some hallmarks of disruptive innovation to existing health care practices. Presented here is an examination of the changes underway to enable this new concept in health care in the United States, to improve precision and quality of care through innovations aimed at individualized approaches to medical decision making. A broad range of public policy positions will need to be considered for the health care delivery enterprise to accommodate the promise of this new science and technology for the benefit of patients. PMID:20135895

  16. Predicting Microbial Fuel Cell Biofilm Communities and Bioreactor Performance using Artificial Neural Networks.

    PubMed

    Lesnik, Keaton Larson; Liu, Hong

    2017-09-19

    The complex interactions that occur in mixed-species bioelectrochemical reactors, like microbial fuel cells (MFCs), make accurate predictions of performance outcomes under untested conditions difficult. While direct correlations between any individual waste-stream characteristic or microbial community structure and reactor performance have not been established, the increase in sequencing data and readily available computational power enables the development of alternative approaches. In the current study, 33 MFCs were evaluated under a range of conditions including eight separate substrates and three different wastewaters. Artificial Neural Networks (ANNs) were used to establish mathematical relationships between wastewater/solution characteristics, biofilm communities, and reactor performance. ANN models that incorporated biotic interactions predicted reactor performance outcomes more accurately than those that did not. The average percent error of power density predictions was 16.01 ± 4.35%, while the average percent errors of Coulombic efficiency and COD removal rate predictions were 1.77 ± 0.57% and 4.07 ± 1.06%, respectively. Predictions of power density improved to within 5.76 ± 3.16% error by classifying taxonomic data at the family rather than the class level. Results suggest that the microbial communities and performance of bioelectrochemical systems can be accurately predicted using data-mining, machine-learning techniques.
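
    A minimal sketch of the modeling approach, assuming hypothetical feature columns (wastewater characteristics plus family-level relative abundances) and simulated data standing in for the 33-reactor dataset:

    ```python
    # Predict MFC power density from combined wastewater and biofilm
    # community features with a small feed-forward ANN (scikit-learn).
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Columns: COD (mg/L), conductivity (mS/cm), and relative abundances
    # of two hypothetical taxonomic families.
    X = rng.uniform([200, 1, 0, 0], [2000, 20, 0.6, 0.4], size=(33, 4))
    y = 50 + 0.1 * X[:, 0] + 800 * X[:, 2] + rng.normal(0, 20, 33)

    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(16, 8), solver="lbfgs",
                     max_iter=5000, random_state=0),
    )
    model.fit(X, y)
    print("Predicted power density (mW/m^2):", model.predict(X[:3]).round(1))
    ```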

  17. Low thrust chemical rocket technology

    NASA Technical Reports Server (NTRS)

    Schneider, Steven J.

    1992-01-01

    An ongoing technology program to improve the performance of low thrust chemical rockets for spacecraft on-board propulsion applications is reviewed. Improved performance and lifetime are sought through the development of new predictive tools to understand the combustion and flow physics, the introduction of high temperature materials and improved component designs to optimize performance, and the use of higher performance propellants. Improved predictive technology is sought through the comparison of both local and global predictions with experimental data. Predictions are based on both the RPLUS Navier-Stokes code with finite rate kinetics and the JANNAF methodology. Data were obtained with laser-based diagnostics along with global performance measurements. Results indicate that the modeling of the injector and the combustion process needs improvement in these codes, and that flow visualization with a technique such as 2-D laser-induced fluorescence (LIF) would aid in resolving issues of flow symmetry and shear layer combustion processes. High temperature material fabrication processes are under development, and small rockets are being designed, fabricated, and tested using these new materials. Rhenium coated with iridium for oxidation protection was produced by the chemical vapor deposition (CVD) process and enabled an 800 K increase in rocket operating temperature. Performance gains with this material in rockets using Earth-storable propellants (nitrogen tetroxide and monomethylhydrazine or hydrazine) were obtained through component redesign to eliminate fuel film cooling and its associated combustion inefficiency while managing head-end thermal soakback. Material interdiffusion and oxidation characteristics indicated that the requisite lifetimes of tens of hours were available for thruster applications. Rockets were designed, fabricated, and tested with thrusts of 22, 62, 440, and 550 N. Performance improvements of 10 to 20 seconds of specific impulse were demonstrated. Higher performance propellants were also evaluated: space-storable propellants, including liquid oxygen (LOX) as the oxidizer with nitrogen hydrides or hydrocarbons as fuels. Specifically, a LOX/hydrazine engine was designed, fabricated, and shown to have 95 percent of theoretical c-star, which translates into a projected vacuum specific impulse of 345 seconds at an area ratio of 204:1. Further performance improvement can be obtained by the use of LOX/hydrogen propellants, especially for manned spacecraft applications, and specific designs must be developed and advanced through flight qualification.

  18. BigFoot, a program to reduce risk for indirect drive laser fusion

    NASA Astrophysics Data System (ADS)

    Thomas, Cliff

    2016-10-01

    The conventional approach to inertial confinement fusion (ICF) is to maximize compressibility, or total areal density. To achieve high convergence (40), the laser pulse is shaped to launch a weak first shock, which is followed in turn by 2-3 stronger shocks. Importantly, this has an outsized effect on integrated target physics, as the time it takes the shocks to transit the shell is related to hohlraum wall motion and filling, and can contribute to difficulties achieving an implosion that is fast, tunable, and/or predictable. At its outset, this approach attempts to predict the tradeoff in capsule and hohlraum physics in a case that is challenging, and assumes the hotspot can still reach the temperature and density necessary to self-heat (4-5 keV and 0.1-0.2 g/cm2, respectively). Here, we consider an alternate route to fusion ignition, for which the benefits of predictability, control, and coupling could exceed the benefits of convergence. In this approach we avoid uncertainty and instead seek a target that is predictable. To simplify hohlraum physics and limit wall motion we keep the implosion time short (6-7 ns), and design the target to avoid laser-plasma instabilities. Whereas the previous focus was on density, it is now on making a 1D hotspot at low convergence (20) that is robust with respect to alpha heating (5-6 keV and 0.2-0.3 g/cm2). At present, we estimate the tradeoff between convergence and control is relatively flat, and advantages in coupling enable high velocity (450-500 um/ns) and high yield (1E17). Were the approach successful, we believe it could reduce barriers to progress, as further improvements could be made with small, incremental increases in areal density. Details regarding the "BigFoot" platform and pulse are reported, as well as initial experiments. Work that could enable additional improvements in laser power, laser control, and capsule stability will also be discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  19. MEMS and MOEMS for national security applications

    NASA Astrophysics Data System (ADS)

    Scott, Marion W.

    2003-01-01

    Major opportunities for microsystem insertion into commercial applications, such as telecommunications and medical prosthesis, are well known. Less well known are applications that ensure the security of our nation, the protection of its armed forces, and the safety of its citizens. Microsystems enable entirely new possibilities to meet National Security needs, which can be classed along three lines: anticipating security needs and threats, deterring the efficacy of identified threats, and defending against the application of these threats. In each of these areas, specific products that are enabled by MEMS and MOEMS are discussed. In the area of anticipating needs and threats, sensored microsystems designed for chem/bio/nuclear threats, and sensors for border and asset protection can significantly secure our borders, ports, and transportation systems. Key features for these applications include adaptive optics and spectroscopic capabilities. Microsystems to monitor soil and water quality can be used to secure critical infrastructure, food safety can be improved by in-situ identification of pathogens, and sensored buildings can ensure the architectural safety of our homes and workplaces. A challenge to commercializing these opportunities, and thus making them available for National Security needs, is developing predictable markets and predictable technology roadmaps. The integrated circuit manufacturing industry provides an example of predictable technology maturation and market insertion, primarily due to the existence of a "unit cell" that allows volume manufacturing. It is not clear that microsystems can follow an analogous path. The possible paths to affordable low-volume production, as well as the prospects of a microsystems unit cell, are discussed.

  1. Predicting vapor-liquid phase equilibria with augmented ab initio interatomic potentials.

    PubMed

    Vlasiuk, Maryna; Sadus, Richard J

    2017-06-28

    The ability of ab initio interatomic potentials to accurately predict vapor-liquid phase equilibria is investigated. Monte Carlo simulations are reported for the vapor-liquid equilibria of argon and krypton using recently developed accurate ab initio interatomic potentials. Seventeen interatomic potentials are studied, formulated from different combinations of two-body plus three-body terms. The simulation results are compared to either experimental or reference data for conditions ranging from the triple point to the critical point. It is demonstrated that the use of ab initio potentials enables systematic improvements to the accuracy of predictions via the addition of theoretically based terms. The contribution of three-body interactions is accounted for using the Axilrod-Teller-Muto plus other multipole contributions and the effective Marcelli-Wang-Sadus potentials. The results indicate that the predictive ability of recent interatomic potentials, obtained from quantum chemical calculations, is comparable to that of accurate empirical models. It is demonstrated that the Marcelli-Wang-Sadus potential can be used in combination with accurate two-body ab initio models for the computationally inexpensive and accurate estimation of vapor-liquid phase equilibria.
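
    For reference, the Axilrod-Teller-Muto triple-dipole term mentioned above has the closed form

    ```latex
    % Axilrod-Teller-Muto non-additive three-body dispersion energy
    E_{ijk}^{\mathrm{ATM}}
      = \nu \, \frac{1 + 3 \cos\gamma_i \cos\gamma_j \cos\gamma_k}
                    {\left( r_{ij}\, r_{ik}\, r_{jk} \right)^{3}}
    ```

    where ν is the triple-dipole dispersion coefficient, the r are the pair separations of atoms i, j, k, and the γ are the interior angles of the triangle they form.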

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlburg, Jill; Corones, James; Batchelor, Donald

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality, which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas, and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world's energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC integrated planning document (IPPA, 2000), represents a significant opportunity for the DOE Office of Science to further the understanding of fusion plasmas to a level unparalleled worldwide.

  3. Invited review: A position on the Global Livestock Environmental Assessment Model (GLEAM).

    PubMed

    MacLeod, M J; Vellinga, T; Opio, C; Falcucci, A; Tempio, G; Henderson, B; Makkar, H; Mottet, A; Robinson, T; Steinfeld, H; Gerber, P J

    2018-02-01

    The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods, and functionality, that is, the types of analysis that the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. The Global Livestock Environmental Assessment Model v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. The Global Livestock Environmental Assessment Model also contains a herd model that enables livestock statistics to be disaggregated and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.

  4. Computational design of auxotrophy-dependent microbial biosensors for combinatorial metabolic engineering experiments.

    PubMed

    Tepper, Naama; Shlomi, Tomer

    2011-01-21

    Combinatorial approaches in metabolic engineering work by generating genetic diversity in a microbial population followed by screening for strains with improved phenotypes. One of the most common goals in this field is the generation of a high-rate chemical-producing strain. A major hurdle with this approach is that many chemicals do not have easy-to-recognize attributes, making their screening expensive and time consuming. To address this problem, it was previously suggested to use microbial biosensors to facilitate the detection and quantification of chemicals of interest. Here, we present novel computational methods to: (i) rationally design microbial biosensors for chemicals of interest, based on substrate auxotrophy, that would enable their high-throughput screening; (ii) predict engineering strategies for coupling the synthesis of a chemical of interest with the production of a proxy metabolite for which high-throughput screening is possible via a designed biosensor. The biosensor design method is validated based on known genetic modifications in an array of E. coli strains auxotrophic to various amino acids. Predicted chemical production rates achievable via the biosensor-based approach are shown to potentially improve upon those predicted by current rational strain design approaches. (A Matlab implementation of the biosensor design method is available via http://www.cs.technion.ac.il/~tomersh/tools).

  5. Cardiovascular risk

    PubMed Central

    Payne, Rupert A

    2012-01-01

    Cardiovascular disease is a major, growing, worldwide problem. It is important that individuals at risk of developing cardiovascular disease can be effectively identified and appropriately stratified according to risk. This review examines what we understand by the term risk, traditional and novel risk factors, clinical scoring systems, and the use of risk for informing prescribing decisions. Many different cardiovascular risk factors have been identified. Established, traditional factors such as ageing are powerful predictors of adverse outcome, and in the case of hypertension and dyslipidaemia are the major targets for therapeutic intervention. Numerous novel biomarkers have also been described, such as inflammatory and genetic markers. These have yet to be shown to be of value in improving risk prediction, but may represent potential therapeutic targets and facilitate more targeted use of existing therapies. Risk factors have been incorporated into several cardiovascular disease prediction algorithms, such as the Framingham equation, SCORE and QRISK. These have relatively poor predictive power, and uncertainties remain with regards to aspects such as choice of equation, different risk thresholds and the roles of relative risk, lifetime risk and reversible factors in identifying and treating at-risk individuals. Nonetheless, such scores provide objective and transparent means of quantifying risk and their integration into therapeutic guidelines enables equitable and cost-effective distribution of health service resources and improves the consistency and quality of clinical decision making. PMID:22348281
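
    The scoring systems named above (Framingham, SCORE, QRISK) are typically built on a Cox proportional-hazards form; a generic version is shown below, with the caveat that coefficients, baseline survival, and covariates are cohort- and equation-specific:

    ```latex
    % Generic Cox-model risk equation underlying Framingham-type scores:
    % S_0(10) is the baseline 10-year survival, beta_i the coefficients,
    % x_i the individual's risk factors, xbar_i the cohort means.
    \text{10-year risk}
      = 1 - S_0(10)^{\exp\left( \sum_i \beta_i x_i - \sum_i \beta_i \bar{x}_i \right)}
    ```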

  6. Circulating predictive and diagnostic biomarkers for hepatitis B virus-associated hepatocellular carcinoma

    PubMed Central

    Van Hees, Stijn; Michielsen, Peter; Vanwolleghem, Thomas

    2016-01-01

    Chronic hepatitis B virus (HBV) infected patients have an almost 100-fold increased risk to develop hepatocellular carcinoma (HCC). HCC is the fifth most common and third most deadly cancer worldwide. Up to 50% of newly diagnosed HCC cases are attributed to HBV infection. Early detection improves survival and can be achieved through regular screening. Six-monthly abdominal ultrasound, either alone or in combination with alpha-fetoprotein serum levels, has been widely endorsed for this purpose. Both techniques however yield limited diagnostic accuracy, which is not improved when they are combined. Alternative circulating or histological markers to predict or diagnose HCC are therefore urgently needed. Recent advances in systems biology technologies have enabled the identification of several new putative circulating biomarkers. Although results from studies assessing combinations of these biomarkers are promising, evidence for their clinical utility remains low. In addition, most of the studies conducted so far show limitations in design. Attention must be paid for instance to different ethnicities and different etiologies when studying biomarkers for hepatocellular carcinoma. This review provides an overview on the current understandings and recent progress in the field of diagnostic and predictive circulating biomarkers for hepatocellular carcinoma in chronically infected HBV patients and discusses the future prospects. PMID:27729734

  7. Quasi-coarse-grained dynamics: modelling of metallic materials at mesoscales

    NASA Astrophysics Data System (ADS)

    Dongare, Avinash M.

    2014-12-01

    A computationally efficient modelling method called quasi-coarse-grained dynamics (QCGD) is developed to expand the capabilities of molecular dynamics (MD) simulations to model the behaviour of metallic materials at the mesoscale. This mesoscale method is based on solving the equations of motion for a chosen set of representative atoms from an atomistic microstructure and using scaling relationships for the atomic-scale interatomic potentials in MD simulations to define the interactions between representative atoms. The scaling relationships retain the atomic-scale degrees of freedom, and therefore the energetics, of the representative atoms as would be predicted in MD simulations. The total energetics of the system is retained by scaling the energetics and the atomic-scale degrees of freedom of these representative atoms to account for the missing atoms in the microstructure. This scaling of the energetics yields improved time steps for the QCGD simulations. The success of the QCGD method is demonstrated by the prediction of the structural energetics, high-temperature thermodynamics, deformation behaviour of interfaces, phase transformation behaviour, plastic deformation behaviour, heat generation during plastic deformation, and wave propagation behaviour, as would be predicted using MD simulations, for a reduced number of representative atoms. The reduced number of atoms and the improved time steps enable the modelling of metallic materials at the mesoscale in extreme environments.

  8. Musical Scales in Tone Sequences Improve Temporal Accuracy.

    PubMed

    Li, Min S; Di Luca, Massimiliano

    2018-01-01

    Predicting the time of stimulus onset is a key component in perception. Previous investigations of perceived timing have focused on the effect of stimulus properties such as rhythm and temporal irregularity, but the influence of non-temporal properties and their role in predicting stimulus timing has not been exhaustively considered. The present study aims to understand how a non-temporal pattern in a sequence of regularly timed stimuli could improve or bias the detection of temporal deviations. We presented interspersed sequences of 3, 4, 5, and 6 auditory tones where only the timing of the last stimulus could slightly deviate from isochrony. Participants reported whether the last tone was 'earlier' or 'later' relative to the expected regular timing. In two conditions, the tones composing the sequence were either organized into musical scales or they were random tones. In one experiment, all sequences ended with the same tone; in the other experiment, each sequence ended with a different tone. Results indicate higher discriminability of anisochrony with musical scales and with longer sequences, irrespective of the knowledge of the final tone. Such an outcome suggests that the predictability of non-temporal properties, as enabled by the musical scale pattern, can be a factor in determining the sensitivity of time judgments.

  9. Turbulence effects on volatilization rates of liquids and solutes

    USGS Publications Warehouse

    Lee, J.-F.; Chao, H.-P.; Chiou, C.T.; Manes, M.

    2004-01-01

    Volatilization rates of neat liquids (benzene, toluene, fluorobenzene, bromobenzene, ethylbenzene, m-xylene, o-xylene, o-dichlorobenzene, and 1-methylnaphthalene) and of solutes (phenol, m-cresol, benzene, toluene, ethylbenzene, o-xylene, and ethylene dibromide) from dilute water solutions have been measured in the laboratory over a wide range of air speeds and water-stirring rates. The overall transfer coefficients (KL) for individual solutes are independent of whether they are in single- or multi-solute solutions. The gas-film transfer coefficients (kG) for solutes in the two-film model, which have hitherto been estimated by extrapolation from reference coefficients, can now be determined directly from the volatilization rates of neat liquids through a new algorithm. The associated liquid-film transfer coefficients (kL) can then be obtained from measured KL and kG values and solute Henry's law constants (H). This approach provides a novel means for checking the precision of any kL and kG estimation methods for ultimate prediction of KL. The improved kG estimation enables accurate KL predictions for low-volatility (i.e., low-H) solutes, where KL and kG·H are essentially equal. In addition, the prediction of KL values for high-volatility (i.e., high-H) solutes, where KL ≈ kL, is also improved by using appropriate reference kL values.
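
    The two-film model referenced above combines the film coefficients and the (dimensionless) Henry's law constant as resistances in series:

    ```latex
    % Two-film resistance-in-series model for the overall
    % liquid-phase transfer coefficient K_L:
    \frac{1}{K_L} = \frac{1}{k_L} + \frac{1}{k_G H}
    ```

    This form reproduces the two limits used in the abstract: for low-H solutes the gas-film resistance dominates and KL ≈ kG·H, while for high-H solutes the liquid-film resistance dominates and KL ≈ kL.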

  10. 60 Years of Studying the Earth-Sun System from Space: Explorer 1

    NASA Astrophysics Data System (ADS)

    Zurbuchen, T.

    2017-12-01

    The era of space-based observation of the Earth-Sun system initiated with the Explorer-1 satellite has revolutionized our knowledge of the Earth, Sun, and the processes that connect them. The space-based perspective has not only enabled us to achieve a fundamentally new understanding of our home planet and the star that sustains us, but it has allowed for significant improvements in predictive capability that serves to protect life, health, and property. NASA has played a leadership role in the United States in creating both the technology and science that has enabled and benefited from these new capabilities, and works closely with partner agencies and around the world to synergistically address these global challenges which are of sufficient magnitude that no one nation or organization can address on their own. Three areas are at the heart of NASA's comprehensive science program: Discovering the secrets of the universe, searching for life elsewhere, and safeguarding and improving life on Earth. Together, these tenets will help NASA lead on a civilization scale. In this talk, a review of these 60 years of advances, a status of current activities, and thoughts about their evolution into the future will be presented.

  11. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

    Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift, a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by the addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include the addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, the addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and a modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and the addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  12. Simplified procedures for correlation of experimentally measured and predicted thrust chamber performance

    NASA Technical Reports Server (NTRS)

    Powell, W. B.

    1973-01-01

    Thrust chamber performance is evaluated in terms of an analytical model incorporating all the loss processes that occur in a real rocket motor. The important loss processes in the real thrust chamber were identified, and a methodology and recommended procedure for predicting real thrust chamber vacuum specific impulse were developed. Simplified equations for the calculation of vacuum specific impulse are developed to relate the delivered performance (both vacuum specific impulse and characteristic velocity) to the ideal performance as degraded by the losses corresponding to a specified list of loss processes. These simplified equations enable the various performance loss components, and the corresponding efficiencies, to be quantified separately (except that interaction effects are arbitrarily assigned in the process). The loss and efficiency expressions presented can be used to evaluate experimentally measured thrust chamber performance, to direct development effort into the areas most likely to yield improvements in performance, and as a basis to predict performance of related thrust chamber configurations.
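
    In JANNAF-style performance accounting of the kind described, the delivered vacuum specific impulse is commonly written as the ideal value degraded by multiplicative efficiencies, one per loss process; the particular set of loss terms below is representative rather than the paper's exact list:

    ```latex
    % Delivered vacuum specific impulse as ideal value times loss
    % efficiencies (mixing, vaporization, kinetics, divergence,
    % boundary layer); each eta <= 1.
    I_{sp,\mathrm{del}} = I_{sp,\mathrm{ideal}} \cdot
      \eta_{\mathrm{mix}} \, \eta_{\mathrm{vap}} \, \eta_{\mathrm{kin}} \,
      \eta_{\mathrm{div}} \, \eta_{\mathrm{BL}}
    ```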

  13. Advanced Earth-to-orbit propulsion technology program overview: Impact of civil space technology initiative

    NASA Technical Reports Server (NTRS)

    Stephenson, Frank W., Jr.

    1988-01-01

    The NASA Earth-to-Orbit (ETO) Propulsion Technology Program is dedicated to advancing rocket engine technologies for the development of fully reusable engine systems that will enable space transportation systems to achieve low cost, routine access to space. The program addresses technology advancements in the areas of engine life extension/prediction, performance enhancements, reduced ground operations costs, and in-flight fault tolerant engine operations. The primary objective is to acquire increased knowledge and understanding of rocket engine chemical and physical processes in order to evolve more realistic analytical simulations of engine internal environments, to derive more accurate predictions of steady and unsteady loads, and using improved structural analyses, to more accurately predict component life and performance, and finally to identify and verify more durable advanced design concepts. In addition, efforts were focused on engine diagnostic needs and advances that would allow integrated health monitoring systems to be developed for enhanced maintainability, automated servicing, inspection, and checkout, and ultimately, in-flight fault tolerant engine operations.

  14. Genome-Wide Association Analysis of Adaptation Using Environmentally Predicted Traits.

    PubMed

    van Heerwaarden, Joost; van Zanten, Martijn; Kruijer, Willem

    2015-10-01

    Current methods for studying the genetic basis of adaptation evaluate genetic associations with ecologically relevant traits or single environmental variables, under the implicit assumption that natural selection imposes correlations between phenotypes, environments and genotypes. In practice, observed trait and environmental data are manifestations of unknown selective forces and are only indirectly associated with adaptive genetic variation. In theory, improved estimation of these forces could enable more powerful detection of loci under selection. Here we present an approach in which we approximate adaptive variation by modeling phenotypes as a function of the environment and using the predicted trait in multivariate and univariate genome-wide association analysis (GWAS). Based on computer simulations and published flowering time data from the model plant Arabidopsis thaliana, we find that environmentally predicted traits lead to higher recovery of functional loci in multivariate GWAS and are more strongly correlated to allele frequencies at adaptive loci than individual environmental variables. Our results provide an example of the use of environmental data to obtain independent and meaningful information on adaptive genetic variation.
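
    A minimal sketch of the two-step procedure, on simulated data with hypothetical variable names (the study used flowering time and climate variables for Arabidopsis thaliana):

    ```python
    # Step 1: model the phenotype as a function of the environment;
    # Step 2: run a univariate GWAS on the environmentally predicted trait.
    import numpy as np
    from scipy import stats
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    n, n_snps = 300, 1000

    env = rng.normal(size=(n, 3))                  # e.g., temperature, precipitation
    geno = rng.binomial(2, 0.3, size=(n, n_snps))  # SNP genotypes (0/1/2)
    pheno = env @ np.array([0.8, -0.5, 0.3]) + 0.6 * geno[:, 0] \
        + rng.normal(size=n)

    pred_trait = LinearRegression().fit(env, pheno).predict(env)

    pvals = np.array([stats.linregress(geno[:, j], pred_trait).pvalue
                      for j in range(n_snps)])
    print("Top SNP:", int(pvals.argmin()), "p =", float(pvals.min()))
    ```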

  15. NetMHCpan-3.0; improved prediction of binding to MHC class I molecules integrating information from multiple receptor and peptide length datasets.

    PubMed

    Nielsen, Morten; Andreatta, Massimo

    2016-03-30

    Binding of peptides to MHC class I molecules (MHC-I) is essential for antigen presentation to cytotoxic T-cells. Here, we demonstrate how a simple alignment step allowing insertions and deletions in a pan-specific MHC-I binding machine-learning model enables combining information across both multiple MHC molecules and peptide lengths. This pan-allele/pan-length algorithm significantly outperforms state-of-the-art methods, and captures differences in the length profile of binders to different MHC molecules, leading to increased accuracy for ligand identification. Using this model, we demonstrate that percentile ranks, in contrast to affinity-based thresholds, are optimal for ligand identification due to uniform sampling of the MHC space. We have developed a neural network-based machine-learning algorithm leveraging information across multiple receptor specificities and ligand length scales, and demonstrated how this approach significantly improves the accuracy for prediction of peptide binding and identification of MHC ligands. The method is available at www.cbs.dtu.dk/services/NetMHCpan-3.0.
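
    A minimal sketch of percentile-rank-based ligand identification, assuming a hypothetical trained predictor whose scores for a large pool of random natural peptides serve as the background distribution:

    ```python
    # Rank a query peptide's predicted binding score against random-peptide
    # scores for the same MHC molecule; low %Rank => likely ligand.
    import numpy as np

    def percentile_rank(score, background_scores):
        """Percent of background peptides scoring at least as well."""
        return 100.0 * float(np.mean(background_scores >= score))

    rng = np.random.default_rng(0)
    background = rng.normal(0.2, 0.15, size=100_000)  # hypothetical scores

    query_score = 0.82
    pr = percentile_rank(query_score, background)
    # NetMHCpan-style usage applies a rank threshold, e.g. %Rank < 2.
    print(f"%Rank = {pr:.2f} -> "
          f"{'candidate ligand' if pr < 2.0 else 'non-binder'}")
    ```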

  16. Trimethylation enhancement using diazomethane (TrEnDi): rapid on-column quaternization of peptide amino groups via reaction with diazomethane significantly enhances sensitivity in mass spectrometry analyses via a fixed, permanent positive charge.

    PubMed

    Wasslen, Karl V; Tan, Le Hoa; Manthorpe, Jeffrey M; Smith, Jeffrey C

    2014-04-01

    Defining cellular processes relies heavily on elucidating the temporal dynamics of proteins. To this end, mass spectrometry (MS) is an extremely valuable tool; different MS-based quantitative proteomics strategies have emerged to map protein dynamics over the course of stimuli. Herein, we disclose our novel MS-based quantitative proteomics strategy with unique analytical characteristics. By passing ethereal diazomethane over peptides on strong cation exchange resin within a microfluidic device, peptides react to contain fixed, permanent positive charges. Modified peptides display improved ionization characteristics and dissociate via tandem mass spectrometry (MS(2)) to form strong a2 fragment ion peaks. Process optimization and determination of reactive functional groups enabled a priori prediction of MS(2) fragmentation patterns for modified peptides. The strategy was tested on digested bovine serum albumin (BSA) and successfully quantified a peptide that was not observable prior to modification. Our method ionizes peptides regardless of proton affinity, thus decreasing ion suppression and permitting predictable multiple reaction monitoring (MRM)-based quantitation with improved sensitivity.

  17. The Role of Genome Accessibility in Transcription Factor Binding in Bacteria.

    PubMed

    Gomes, Antonio L C; Wang, Harris H

    2016-04-01

    ChIP-seq enables genome-scale identification of regulatory regions that govern gene expression. However, the biological insights generated from ChIP-seq analysis have been limited to predictions of binding sites and cooperative interactions. Furthermore, ChIP-seq data often poorly correlate with in vitro measurements or predicted motifs, highlighting that binding affinity alone is insufficient to explain transcription factor (TF) binding in vivo. One possibility is that binding sites are not equally accessible across the genome. A more comprehensive biophysical representation of TF binding is required to improve our ability to understand, predict, and alter gene expression. Here, we show that genome accessibility is a key parameter that impacts TF binding in bacteria. We developed a thermodynamic model that parameterizes ChIP-seq coverage in terms of genome accessibility and binding affinity. The role of genome accessibility is validated using a large-scale ChIP-seq dataset of the M. tuberculosis regulatory network. We find that accounting for genome accessibility led to a model that explains 63% of the ChIP-seq profile variance, while a model based on motif score alone explains only 35% of the variance. Moreover, our framework enables de novo ChIP-seq peak prediction and is useful for inferring TF-binding peaks in new experimental conditions by reducing the need for additional experiments. We observe that the genome is more accessible in intergenic regions, and that increased accessibility is positively correlated with gene expression and anti-correlated with distance to the origin of replication. Our biophysically motivated model provides a more comprehensive description of TF binding in vivo from first principles, towards a better representation of gene regulation in silico, with promising applications in systems biology.
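
    A minimal form consistent with the description, in which expected coverage at a site factors into a local accessibility term and a Boltzmann-weighted occupancy term (the paper's exact parameterization may differ):

    ```latex
    % Expected ChIP-seq coverage C(x) at genomic position x:
    % a(x) in [0,1] is local accessibility, E(x) the sequence-dependent
    % binding energy (e.g., from a motif score), beta an inverse temperature.
    C(x) \;\propto\; a(x) \, \frac{e^{-\beta E(x)}}{1 + e^{-\beta E(x)}}
    ```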

  18. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  19. Predictive Engineering Tools for Injection-Molded Long-Carbon-Thermoplastic Composites: Weight and Cost Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Fifield, Leonard S.; Gandhi, Umesh N.

    This project proposed to integrate, optimize, and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, in collaboration with Toyota and Magna, PNNL developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost, including the material cost, for making the equivalent part in steel was determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klesmith, Justin R.; Bacik, John -Paul; Michalczyk, Ryszard

    Synthetic metabolic pathways often suffer from low specific productivity, and new methods that quickly assess pathway functionality for many thousands of variants are urgently needed. Here we present an approach that enables the rapid and parallel determination of sequence effects on flux for complete gene-encoding sequences. We show that this method can be used to determine the effects of over 8000 single point mutants of a pyrolysis oil catabolic pathway implanted in Escherichia coli. Experimental sequence-function data sets predicted whether fitness-enhancing mutations to the enzyme levoglucosan kinase resulted from enhanced catalytic efficiency or enzyme stability. A structure of one design incorporating 38 mutations elucidated the structural basis of high fitness mutations. One design incorporating 15 beneficial mutations supported a 15-fold improvement in growth rate and greater than 24-fold improvement in enzyme activity relative to the starting pathway. Lastly, this technique can be extended to improve a wide variety of designed pathways.

  1. Designing deep sequencing experiments: detecting structural variation and estimating transcript abundance.

    PubMed

    Bashir, Ali; Bansal, Vikas; Bafna, Vineet

    2010-06-18

    Massively parallel DNA sequencing technologies have enabled the sequencing of several individual human genomes. These technologies are also being used in novel ways for mRNA expression profiling, genome-wide discovery of transcription-factor binding sites, small RNA discovery, etc. The multitude of sequencing platforms, each with their unique characteristics, pose a number of design challenges regarding the technology to be used and the depth of sequencing required for a particular sequencing application. Here we describe a number of analytical and empirical results to address design questions for two applications: detection of structural variations from paired-end sequencing and estimating mRNA transcript abundance. For structural variation, our results provide explicit trade-offs between the detection and resolution of rearrangement breakpoints, and the optimal mix of paired-read insert lengths. Specifically, we prove that optimal detection and resolution of breakpoints is achieved using a mix of exactly two insert library lengths. Furthermore, we derive explicit formulae to determine these insert length combinations, enabling a 15% improvement in breakpoint detection at the same experimental cost. On empirical short read data, these predictions show good concordance with Illumina 200 bp and 2 kbp insert length libraries. For transcriptome sequencing, we determine the sequencing depth needed to detect rare transcripts from a small pilot study. With only 1 million reads, we derive corrections that enable almost perfect prediction of the underlying expression probability distribution, and use this to predict the sequencing depth required to detect low expressed genes with greater than 95% probability. Together, our results form a generic framework for many design considerations related to high-throughput sequencing. We provide software tools (http://bix.ucsd.edu/projects/NGS-DesignTools) to derive platform-independent guidelines for designing sequencing experiments (amount of sequencing, choice of insert length, mix of libraries) for novel applications of next generation sequencing.
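
    A simplified, Lander-Waterman-style illustration of the insert-length trade-off (detection only; the paper's formulae also treat breakpoint resolution, which favors short inserts and motivates the two-library mix):

    ```python
    # P(at least one read pair spans a fixed breakpoint), Poisson model:
    # a pair detects the breakpoint if it falls in the unsequenced gap.
    import math

    def detection_prob(n_pairs, insert_len, read_len, genome_len):
        gap = max(insert_len - 2 * read_len, 0)  # unsequenced insert gap
        lam = n_pairs * gap / genome_len         # expected spanning pairs
        return 1.0 - math.exp(-lam)

    G = 3.1e9                                    # human genome size (bp)
    for L in (200, 2000):                        # the two library types above
        p = detection_prob(n_pairs=10e6, insert_len=L, read_len=36,
                           genome_len=G)
        print(f"insert {L} bp: detection probability = {p:.3f}")
    ```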

  2. Predisposing characteristics, enabling resources and need as predictors of utilization and clinical outcomes for veterans receiving mental health services.

    PubMed

    Fasoli, DiJon R; Glickman, Mark E; Eisen, Susan V

    2010-04-01

    Though demand for mental health services (MHS) among US veterans is increasing, MHS utilization per veteran is decreasing. With health and social service needs competing for limited resources, it is important to understand the association between patient factors, MHS utilization, and clinical outcomes. We use a framework based on Andersen's behavioral model of health service utilization to examine predisposing characteristics, enabling resources, and clinical need as predictors of MHS utilization and clinical outcomes. This was a prospective observational study of veterans receiving inpatient or outpatient MHS through Veterans Administration programs. Clinician ratings (Global Assessment of Functioning [GAF]) and self-report assessments (Behavior and Symptom Identification Scale-24) were completed for 421 veterans at enrollment and 3 months later. Linear and logistic regression analyses were conducted to examine: (1) predisposing characteristics, enabling resources, and need as predictors of MHS inpatient, residential, and outpatient utilization and (2) the association between individual characteristics, utilization, and clinical outcomes. Being older, female, having greater clinical need, lacking enabling resources (employment, stable housing, and social support), and having easy access to treatment significantly predicted greater MHS utilization at 3-month follow-up. Less clinical need and no inpatient psychiatric hospitalization predicted better GAF and Behavior and Symptom Identification Scale-24 scores. White race and residential treatment also predicted better GAF scores. Neither enabling resources nor the number of outpatient mental health visits predicted clinical outcomes. This application of Andersen's behavioral model of health service utilization confirmed associations of some predisposing characteristics, need, and enabling resources with MHS utilization, but only predisposing characteristics, need, and utilization were associated with clinical outcomes.

  3. Recovery of speed of information processing in closed-head-injury patients.

    PubMed

    Zwaagstra, R; Schmidt, I; Vanier, M

    1996-06-01

    After severe traumatic brain injury, patients almost invariably demonstrate a slowing of reaction time, reflecting a slowing of central information processing. Methodological problems associated with the traditional method for the analysis of longitudinal data (MANOVA) severely complicate studies on cognitive recovery. It is argued that multilevel models are often better suited for the analysis of improvement over time in clinical settings. Multilevel models take into account individual differences in both overall performance level and recovery. These models enable individual predictions for the recovery of speed of information processing. Recovery is modelled in a group of closed-head-injury patients (N = 24). Recovery was predicted by age and severity of injury, as indicated by coma duration. Over a period up to 44 months post trauma, reaction times were found to decrease faster for patients with longer coma duration.
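
    As a sketch of the multilevel approach described above — a random intercept and a random recovery slope per patient, with coma duration and age as fixed effects — the following uses statsmodels on simulated data. The variable names and simulated values are placeholders, not the study's dataset.

```python
# Hedged sketch of a multilevel (mixed-effects) recovery model; the data
# below are simulated placeholders, not the study's measurements.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_patients, n_visits = 24, 6
rows = []
for pid in range(n_patients):
    coma = rng.uniform(1, 30)                        # coma duration (days)
    age = rng.uniform(18, 60)
    level = 600 + 5 * coma + rng.normal(0, 40)       # patient-specific level
    slope = -(3 + 0.3 * coma) + rng.normal(0, 1)     # patient-specific recovery
    for month in np.linspace(1, 44, n_visits):       # up to 44 months post trauma
        rt = level + 10 * slope * np.log(month) + rng.normal(0, 20)
        rows.append(dict(pid=pid, coma=coma, age=age, month=month, rt=rt))
df = pd.DataFrame(rows)

# Random intercept and random (log-)time slope per patient; coma duration
# enters as a fixed effect on both overall level and recovery rate.
model = smf.mixedlm("rt ~ np.log(month) * coma + age", df, groups="pid",
                    re_formula="~np.log(month)")
print(model.fit().summary())
```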

  4. Performance of protein-structure predictions with the physics-based UNRES force field in CASP11.

    PubMed

    Krupa, Paweł; Mozolewska, Magdalena A; Wiśniewska, Marta; Yin, Yanping; He, Yi; Sieradzan, Adam K; Ganzynkowicz, Robert; Lipska, Agnieszka G; Karczyńska, Agnieszka; Ślusarz, Magdalena; Ślusarz, Rafał; Giełdoń, Artur; Czaplewski, Cezary; Jagieła, Dawid; Zaborowski, Bartłomiej; Scheraga, Harold A; Liwo, Adam

    2016-11-01

    Participating as the Cornell-Gdansk group, we have used our physics-based coarse-grained UNited RESidue (UNRES) force field to predict protein structure in the 11th Community Wide Experiment on the Critical Assessment of Techniques for Protein Structure Prediction (CASP11). Our methodology involved extensive multiplexed replica exchange simulations of the target proteins with a recently improved UNRES force field to provide better reproductions of the local structures of polypeptide chains. All simulations were started from fully extended polypeptide chains, and no external information was included in the simulation process except for weak restraints on secondary structure to enable us to finish each prediction within the allowed 3-week time window. Because of the simplified UNRES representation of polypeptide chains, the use of enhanced sampling methods, code optimization and parallelization, and sufficient computational resources, we were able to treat, for the first time, all 55 human prediction targets with sizes from 44 to 595 amino acid residues, the average size being 251 residues. Complete structures of six single-domain proteins were predicted accurately, with the highest accuracy being attained for target T0769, for which the CαRMSD was 3.8 Å for 97 residues of the experimental structure. Correct structures were also predicted for 13 domains of multi-domain proteins with accuracy comparable to that of the best template-based modeling methods. With further improvements of the UNRES force field that are now underway, our physics-based coarse-grained approach to protein-structure prediction will eventually reach global prediction capacity and, consequently, reliability in simulating protein structure and dynamics that are important in biochemical processes. Freely available on the web at http://www.unres.pl/. Contact: has5@cornell.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Processing LiDAR Data to Predict Natural Hazards

    NASA Technical Reports Server (NTRS)

    Fairweather, Ian; Crabtree, Robert; Hager, Stacey

    2008-01-01

    ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.

  6. Pathway-Based Kernel Boosting for the Analysis of Genome-Wide Association Studies

    PubMed Central

    Manitz, Juliane; Burger, Patricia; Amos, Christopher I.; Chang-Claude, Jenny; Wichmann, Heinz-Erich; Kneib, Thomas; Bickeböller, Heike

    2017-01-01

    The analysis of genome-wide association studies (GWAS) benefits from the investigation of biologically meaningful gene sets, such as gene-interaction networks (pathways). We propose an extension to a successful kernel-based pathway analysis approach by integrating kernel functions into a powerful algorithmic framework for variable selection, to enable investigation of multiple pathways simultaneously. We employ genetic similarity kernels from the logistic kernel machine test (LKMT) as base-learners in a boosting algorithm. A model to explain case-control status is created iteratively by selecting pathways that improve its prediction ability. We evaluated our method in simulation studies adopting 50 pathways for different sample sizes and genetic effect strengths. Additionally, we included an exemplary application of kernel boosting to a rheumatoid arthritis and a lung cancer dataset. Simulations indicate that kernel boosting outperforms the LKMT in certain genetic scenarios. Applications to GWAS data on rheumatoid arthritis and lung cancer resulted in sparse models which were based on pathways interpretable in a clinical sense. Kernel boosting is highly flexible in terms of considered variables and overcomes the problem of multiple testing. Additionally, it enables the prediction of clinical outcomes. Thus, kernel boosting constitutes a new, powerful tool in the analysis of GWAS data and towards the understanding of biological processes involved in disease susceptibility. PMID:28785300
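
    A conceptual sketch of the boosting loop follows (not the authors' LKMT-based implementation): each base-learner is a kernel ridge fit on one pathway's genotypes, with a linear kernel standing in for the genetic similarity kernel, and each iteration greedily selects the pathway that most reduces the residual loss, yielding a sparse set of selected pathways.

```python
# Hedged sketch of pathway-kernel boosting; the L2 loss on 0/1 case-control
# labels and the linear kernel are simplifications of the published method.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def kernel_boost(snps_by_pathway, y, n_iter=50, nu=0.1):
    """snps_by_pathway: dict name -> (n_samples, n_snps) genotype matrix.
    Returns the ensemble prediction and the sparse set of selected pathways."""
    f = np.zeros_like(y, dtype=float)          # current ensemble prediction
    ensemble = []
    for _ in range(n_iter):
        resid = y - f                          # L2 loss -> fit residuals
        best = None
        for name, X in snps_by_pathway.items():
            base = KernelRidge(kernel="linear", alpha=1.0).fit(X, resid)
            sse = np.sum((resid - base.predict(X)) ** 2)
            if best is None or sse < best[0]:
                best = (sse, name, base)
        _, name, base = best
        f += nu * base.predict(snps_by_pathway[name])  # small learning rate
        ensemble.append((name, base))
    return f, sorted({n for n, _ in ensemble})
```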

  7. Pathway-Based Kernel Boosting for the Analysis of Genome-Wide Association Studies.

    PubMed

    Friedrichs, Stefanie; Manitz, Juliane; Burger, Patricia; Amos, Christopher I; Risch, Angela; Chang-Claude, Jenny; Wichmann, Heinz-Erich; Kneib, Thomas; Bickeböller, Heike; Hofner, Benjamin

    2017-01-01

    The analysis of genome-wide association studies (GWAS) benefits from the investigation of biologically meaningful gene sets, such as gene-interaction networks (pathways). We propose an extension to a successful kernel-based pathway analysis approach by integrating kernel functions into a powerful algorithmic framework for variable selection, to enable investigation of multiple pathways simultaneously. We employ genetic similarity kernels from the logistic kernel machine test (LKMT) as base-learners in a boosting algorithm. A model to explain case-control status is created iteratively by selecting pathways that improve its prediction ability. We evaluated our method in simulation studies adopting 50 pathways for different sample sizes and genetic effect strengths. Additionally, we included an exemplary application of kernel boosting to a rheumatoid arthritis and a lung cancer dataset. Simulations indicate that kernel boosting outperforms the LKMT in certain genetic scenarios. Applications to GWAS data on rheumatoid arthritis and lung cancer resulted in sparse models which were based on pathways interpretable in a clinical sense. Kernel boosting is highly flexible in terms of considered variables and overcomes the problem of multiple testing. Additionally, it enables the prediction of clinical outcomes. Thus, kernel boosting constitutes a new, powerful tool in the analysis of GWAS data and towards the understanding of biological processes involved in disease susceptibility.

  8. Shifts and disruptions in resource-use trait syndromes during the evolution of herbaceous crops.

    PubMed

    Milla, Rubén; Morente-López, Javier; Alonso-Rodrigo, J Miguel; Martín-Robles, Nieves; Chapin, F Stuart

    2014-10-22

    Trait-based ecology predicts that evolution in high-resource agricultural environments should select for suites of traits that enable fast resource acquisition and rapid canopy closure. However, crop breeding targets specific agronomic attributes rather than broad trait syndromes. Breeding for specific traits, together with evolution in high-resource environments, might lead to reduced phenotypic integration, according to predictions from the ecological literature. We provide the first comprehensive test of these hypotheses, based on a trait-screening programme of 30 herbaceous crops and their wild progenitors. During crop evolution plants became larger, which enabled them to compete more effectively for light, but they had poorly integrated phenotypes. In a subset of six herbaceous crop species investigated in greater depth, competitiveness for light increased during early plant domestication, whereas diminished phenotypic integration occurred later during crop improvement. Mass-specific leaf and root traits relevant to resource-use strategies (e.g. specific leaf area or tissue density of fine roots) changed during crop evolution, but in diverse and contrasting directions and magnitudes, depending on the crop species. Reductions in phenotypic integration and overinvestment in traits involved in competition for light may affect the chances of upgrading modern herbaceous crops to face current climatic and food security challenges. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  9. Overview of MST Research

    NASA Astrophysics Data System (ADS)

    Chapman, B. E.

    2017-10-01

    MST progress in advancing the RFP for (1) fusion plasma confinement with ohmic heating and minimal external magnetization, (2) predictive capability in toroidal confinement physics, and (3) basic plasma physics is summarized. Validation of key plasma models is a program priority, which is enhanced by programmable power supplies (PPS) to maximize inductive capability. The existing PPS enables access to very low plasma current, down to Ip = 0.02 MA. This greatly expands the Lundquist number range S = 10^4-10^8 and allows nonlinear, 3D MHD computation using NIMROD and DEBS with dimensionless parameters that overlap those of MST plasmas. A new, second PPS will allow simultaneous PPS control of the Bp and Bt circuits. The PPS also enables MST tokamak operation, thus far focused on disruptions and RMP suppression of runaway electrons. Gyrokinetic modeling with GENE predicts unstable TEM in improved-confinement RFP plasmas. Measured fluctuations have TEM properties including a density-gradient threshold larger than for tokamak plasmas. Turbulent energization of an electron tail occurs during sawtooth reconnection. Probe measurements hint that drift waves are also excited via the turbulent cascade in standard RFP plasmas. Exploration of basic plasma science frontiers in MST RFP and tokamak plasmas is proposed as part of WiPPL, a basic science user facility. Work supported by USDoE.

  10. Modelling directional solidification

    NASA Technical Reports Server (NTRS)

    Wilcox, William R.; Regel, Liya L.

    1994-01-01

    This grant, NAG8-831, was a continuation of a previous grant, NAG8-541. The long range goal of this program has been to develop an improved understanding of phenomena of importance to directional solidification, in order to enable explanation and prediction of differences in behavior between solidification on Earth and in space. Emphasis in the recently completed grant was on determining the influence of perturbations on directional solidification of InSb and InSb-GaSb alloys. In particular, the objective was to determine the influence of spin-up/spin-down (ACRT), electric current pulses and vibrations on compositional homogeneity and grain size.

  11. Fusing human and machine skills for remote robotic operations

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S.; Kim, Won S.; Venema, Steven C.; Bejczy, Antal K.

    1991-01-01

    The question of how computer assists can improve teleoperator trajectory tracking during both free and force-constrained motions is addressed. Computer graphics techniques which enable the human operator to both visualize and predict detailed 3D trajectories in real-time are reported. Man-machine interactive control procedures for better management of manipulator contact forces and positioning are also described. It is found that collectively, these novel advanced teleoperations techniques both enhance system performance and significantly reduce control problems long associated with teleoperations under time delay. Ongoing robotic simulations of the 1984 space shuttle Solar Maximum EVA Repair Mission are briefly described.

  12. Online Resource for Earth-Observing Satellite Sensor Calibration

    NASA Technical Reports Server (NTRS)

    McCorkel, J.; Czapla-Myers, J.; Thome, K.; Wenny, B.

    2015-01-01

    The Radiometric Calibration Test Site (RadCaTS) at Railroad Valley Playa, Nevada is being developed by the University of Arizona to enable improved accuracy and consistency for airborne and satellite sensor calibration. Primary instrumentation at the site consists of ground-viewing radiometers, a sun photometer, and a meteorological station. Measurements made by these instruments are used to calculate surface reflectance, atmospheric properties and a prediction for top-of-atmosphere reflectance and radiance. This work will leverage research for RadCaTS, and describe the requirements for an online database, associated data formats and quality control, and processing levels.

  13. Gas House Autonomous System Monitoring

    NASA Technical Reports Server (NTRS)

    Miller, Luke; Edsall, Ashley

    2015-01-01

    Gas House Autonomous System Monitoring (GHASM) will employ Integrated System Health Monitoring (ISHM) of cryogenic fluids in the High Pressure Gas Facility at Stennis Space Center. The preliminary focus of development incorporates the passive monitoring and eventual commanding of the Nitrogen System. ISHM offers generic system awareness, adept at using concepts rather than specific error cases. As an enabler for autonomy, ISHM provides capabilities inclusive of anomaly detection, diagnosis, and abnormality prediction. Advancing ISHM and Autonomous Operation functional capabilities enhances quality of data, optimizes safety, improves cost effectiveness, and has direct benefits to a wide spectrum of aerospace applications.

  14. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030

    PubMed Central

    Slotnick, Jeffrey P.; Khodadoust, Abdollah; Alonso, Juan J.; Darmofal, David L.; Gropp, William D.; Lurie, Elizabeth A.; Mavriplis, Dimitri J.; Venkatakrishnan, Venkat

    2014-01-01

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be ‘cleaner’ and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. PMID:25024413

  15. Progress in Space Weather Modeling and Observations Needed to Improve the Operational NAIRAS Model Aircraft Radiation Exposure Predictions

    NASA Astrophysics Data System (ADS)

    Mertens, C. J.; Kress, B. T.; Wiltberger, M. J.; Tobiska, W.; Xu, X.

    2011-12-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a prototype operational model for predicting commercial aircraft radiation exposure from galactic and solar cosmic rays. NAIRAS predictions are currently streaming live from the project's public website, and the exposure rate nowcast is also available on the SpaceWx smartphone app for iPhone, iPad, and Android. Cosmic rays are the primary source of human exposure to high linear energy transfer radiation at aircraft altitudes, which increases the risk of cancer and other adverse health effects. Thus, the NAIRAS model addresses an important national need with broad societal, public health and economic benefits. The processes responsible for the variability in the solar wind, interplanetary magnetic field, solar energetic particle spectrum, and the dynamical response of the magnetosphere to these space environment inputs, strongly influence the composition and energy distribution of the atmospheric ionizing radiation field. During the development of the NAIRAS model, new science questions were identified that must be addressed in order to obtain a more reliable and robust operational model of atmospheric radiation exposure. Addressing these science questions requires improvements in both space weather modeling and observations. The focus of this talk is to present these science questions, the proposed methodologies for addressing them, and the anticipated improvements to the operational predictions of atmospheric radiation exposure. The overarching goal of this work is to provide a decision support tool for the aviation industry that will enable an optimal balance to be achieved between minimizing health risks to passengers and aircrew while simultaneously minimizing costs to the airline companies.

  16. Incorporating redox processes improves prediction of carbon and nutrient cycling and greenhouse gas emission

    NASA Astrophysics Data System (ADS)

    Tang, Guoping; Zheng, Jianqiu; Yang, Ziming; Graham, David; Gu, Baohua; Mayes, Melanie; Painter, Scott; Thornton, Peter

    2016-04-01

    Among the coupled thermal, hydrological, geochemical, and biological processes, redox processes play major roles in carbon and nutrient cycling and greenhouse gas (GHG) emission. Increasingly, mechanistic representation of redox processes is acknowledged as necessary for accurate prediction of GHG emission in the assessment of land-atmosphere interactions. Simple organic substrates, Fe reduction, microbial reactions, and the Windermere Humic Aqueous Model (WHAM) were added to a reaction network used in the land component of an Earth system model. In conjunction with this amended reaction network, various temperature response functions used in ecosystem models were assessed for their ability to describe experimental observations from incubation tests with arctic soils. Incorporation of Fe reduction reactions improves the prediction of the lag time between CO2 and CH4 accumulation. The inclusion of the WHAM model enables us to approximately simulate the initial pH drop due to organic acid accumulation and then a pH increase due to Fe reduction without parameter adjustment. The CLM4.0, CENTURY, and Ratkowsky temperature response functions better described the observations than the Q10 method, Arrhenius equation, and ROTH-C. As electron acceptors between O2 and CO2 (e.g., Fe(III), SO4^2-) are often involved, our results support inclusion of these redox reactions for accurate prediction of CH4 production and consumption. Ongoing work includes improving the parameterization of organic matter decomposition to produce simple organic substrates, examining the influence of redox potential on methanogenesis under thermodynamically favorable conditions, and refining temperature response representation near the freezing point by additional model-experiment iterations. We will use the model to describe observed GHG emission at arctic and tropical sites.
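
    For reference, minimal forms of the temperature response functions compared above; the parameter values shown are common defaults, not the study's calibrated values.

```python
# Common textbook forms of the compared temperature response functions;
# Tref, Q10, Ea, and Tmin values below are illustrative defaults.
import numpy as np

def q10_factor(T, Tref=25.0, Q10=2.0):
    """Q10: the rate multiplies by Q10 for every 10 degC increase."""
    return Q10 ** ((T - Tref) / 10.0)

def arrhenius_factor(T, Tref=25.0, Ea=60e3, R=8.314):
    """Arrhenius: k ~ exp(-Ea / (R T)), normalized at Tref (inputs in degC)."""
    TK, TrefK = T + 273.15, Tref + 273.15
    return np.exp(-Ea / (R * TK)) / np.exp(-Ea / (R * TrefK))

def ratkowsky_factor(T, Tmin=-8.0, Tref=25.0):
    """Ratkowsky: sqrt(rate) is linear in T above a minimum temperature."""
    return np.clip(T - Tmin, 0.0, None) ** 2 / (Tref - Tmin) ** 2

for T in (-2, 4, 10, 25):
    print(T, q10_factor(T), arrhenius_factor(T), ratkowsky_factor(T))
```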

  17. Predictors of incident heart failure in patients after an acute coronary syndrome: The LIPID heart failure risk-prediction model.

    PubMed

    Driscoll, Andrea; Barnes, Elizabeth H; Blankenberg, Stefan; Colquhoun, David M; Hunt, David; Nestel, Paul J; Stewart, Ralph A; West, Malcolm J; White, Harvey D; Simes, John; Tonkin, Andrew

    2017-12-01

    Coronary heart disease is a major cause of heart failure. Availability of risk-prediction models that include both clinical parameters and biomarkers is limited. We aimed to develop such a model for prediction of incident heart failure. A multivariable risk-factor model was developed for prediction of first occurrence of heart failure death or hospitalization. A simplified risk score was derived that enabled subjects to be grouped into categories of 5-year risk varying from <5% to >20%. Among 7101 patients from the LIPID study (84% male), with median age 61 years (interquartile range 55-67 years), 558 (8%) died or were hospitalized because of heart failure. Older age, history of claudication or diabetes mellitus, body mass index >30 kg/m^2, LDL-cholesterol >2.5 mmol/L, heart rate >70 beats/min, white blood cell count, and the nature of the qualifying acute coronary syndrome (myocardial infarction or unstable angina) were associated with an increase in heart failure events. Coronary revascularization was associated with a lower event rate. Incident heart failure increased with higher concentrations of B-type natriuretic peptide >50 ng/L, cystatin C >0.93 nmol/L, D-dimer >273 nmol/L, high-sensitivity C-reactive protein >4.8 nmol/L, and sensitive troponin I >0.018 μg/L. Addition of biomarkers to the clinical risk model improved the model's C statistic from 0.73 to 0.77. The net reclassification improvement incorporating biomarkers into the clinical model using categories of 5-year risk was 23%. Adding a multibiomarker panel to conventional parameters markedly improved discrimination and risk classification for future heart failure events. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
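
    The published coefficients are not given in the abstract, so the following is a purely illustrative points-based sketch of how a simplified score of this kind maps risk factors to 5-year risk categories. Every point weight and cut-point below is hypothetical, NOT the LIPID model.

```python
# Hypothetical points-based score; variables follow the abstract, but all
# weights and category cut-points are invented for illustration only.
def heart_failure_points(age, diabetes, claudication, bmi, ldl, hr, bnp):
    pts = 0
    pts += max(0, (age - 55) // 5)       # older age
    pts += 2 if diabetes else 0          # history of diabetes mellitus
    pts += 2 if claudication else 0      # history of claudication
    pts += 1 if bmi > 30 else 0          # BMI > 30 kg/m^2
    pts += 1 if ldl > 2.5 else 0         # LDL-cholesterol, mmol/L
    pts += 1 if hr > 70 else 0           # heart rate, beats/min
    pts += 3 if bnp > 50 else 0          # B-type natriuretic peptide, ng/L
    return pts

def five_year_risk_category(points):
    if points <= 2:
        return "<5%"
    if points <= 5:
        return "5-10%"
    if points <= 8:
        return "10-20%"
    return ">20%"

print(five_year_risk_category(heart_failure_points(67, True, False, 32, 3.1, 76, 80)))
```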

  18. Improving Emotional Intelligence through Personality Development: The Effect of the Smart Phone Application based Dharma Life Program on Emotional Intelligence

    PubMed Central

    Poonamallee, Latha; Harrington, Alex M.; Nagpal, Manisha; Musial, Alec

    2018-01-01

    Emotional intelligence is established to predict success in leadership effectiveness in various contexts and has been linked to personality factors. This paper introduces the Dharma Life Program, a novel approach to improving emotional intelligence by targeting maladaptive personality traits and triggering neuroplasticity through the use of a smart-phone application and mentoring. The program uses neuroplasticity to enable users to create a more adaptive application of their maladaptive traits, thus improving their emotional intelligence. In this study, 26 participants underwent the Dharma Life Program in a leadership development setting. We assessed their emotional and social intelligence before and after the Dharma Life Program intervention using the Emotional and Social Competency Inventory (ESCI). The study found a significant improvement in the lowest three competencies and a significant improvement in almost all domains for the entire sample. Our findings suggest that the completion of the Dharma Life Program has a significant positive effect on Emotional and Social Competency scores and offers a new avenue for improving emotional intelligence competencies. PMID:29527182

  19. Improving Emotional Intelligence through Personality Development: The Effect of the Smart Phone Application based Dharma Life Program on Emotional Intelligence.

    PubMed

    Poonamallee, Latha; Harrington, Alex M; Nagpal, Manisha; Musial, Alec

    2018-01-01

    Emotional intelligence is established to predict success in leadership effectiveness in various contexts and has been linked to personality factors. This paper introduces the Dharma Life Program, a novel approach to improving emotional intelligence by targeting maladaptive personality traits and triggering neuroplasticity through the use of a smart-phone application and mentoring. The program uses neuroplasticity to enable users to create a more adaptive application of their maladaptive traits, thus improving their emotional intelligence. In this study, 26 participants underwent the Dharma Life Program in a leadership development setting. We assessed their emotional and social intelligence before and after the Dharma Life Program intervention using the Emotional and Social Competency Inventory (ESCI). The study found a significant improvement in the lowest three competencies and a significant improvement in almost all domains for the entire sample. Our findings suggest that the completion of the Dharma Life Program has a significant positive effect on Emotional and Social Competency scores and offers a new avenue for improving emotional intelligence competencies.

  20. Threshold models for genome-enabled prediction of ordinal categorical traits in plant breeding.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo; Eskridge, Kent; Crossa, José

    2014-12-23

    Categorical scores for disease susceptibility or resistance often are recorded in plant breeding. The aim of this study was to introduce genomic models for analyzing ordinal characters and to assess the predictive ability of genomic predictions for ordered categorical phenotypes using a threshold model counterpart of the Genomic Best Linear Unbiased Predictor (i.e., TGBLUP). The threshold model was used to relate a hypothetical underlying scale to the outward categorical response. We present an empirical application where a total of nine models, five without interaction and four with genomic × environment interaction (G×E) and genomic additive × additive × environment interaction (G×G×E), were used. We assessed the proposed models using data consisting of 278 maize lines genotyped with 46,347 single-nucleotide polymorphisms and evaluated for disease resistance [with ordinal scores from 1 (no disease) to 5 (complete infection)] in three environments (Colombia, Zimbabwe, and Mexico). Models with G×E captured a sizeable proportion of the total variability, which indicates the importance of introducing interaction to improve prediction accuracy. Relative to models based on main effects only, the models that included G×E achieved 9-14% gains in prediction accuracy; adding additive × additive interactions did not increase prediction accuracy consistently across locations. Copyright © 2015 Montesinos-López et al.
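
    A minimal ordered-probit sketch of the threshold-model idea follows: an ordinal score is linked to a latent liability that is linear in genomic covariates. The principal-component stand-in below is not TGBLUP's genomic relationship kernel, and all genotypes and scores are simulated placeholders.

```python
# Hedged sketch: ordered-probit threshold model on simulated genotypes.
# Not TGBLUP itself; PCs of the genotype matrix stand in for the genomic
# relationship structure, and all data are simulated.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n_lines, n_snps = 278, 500                  # sizes loosely echo the study
G = rng.binomial(2, 0.3, size=(n_lines, n_snps)).astype(float)
G -= G.mean(axis=0)                         # centered genotype matrix

# Latent liability -> ordinal 1-5 disease score via fixed thresholds.
beta = rng.normal(0, 0.05, n_snps)
liability = G @ beta + rng.normal(0, 1, n_lines)
cuts = np.quantile(liability, [0.2, 0.4, 0.6, 0.8])
score = np.digitize(liability, cuts) + 1    # ordinal categories 1..5

# Low-rank genomic covariates keep the maximum-likelihood fit well posed.
U, S, _ = np.linalg.svd(G, full_matrices=False)
X = U[:, :10] * S[:10]
endog = pd.Categorical(score, ordered=True)
fit = OrderedModel(endog, X, distr="probit").fit(method="bfgs", disp=False)
print(fit.params[-4:])                      # the four estimated thresholds
```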

  1. Computational prediction of virus-human protein-protein interactions using embedding kernelized heterogeneous data.

    PubMed

    Nourani, Esmaeil; Khunjush, Farshad; Durmuş, Saliha

    2016-05-24

    Pathogenic microorganisms exploit host cellular mechanisms and evade host defense mechanisms through molecular pathogen-host interactions (PHIs). Therefore, comprehensive analysis of these PHI networks should be an initial step for developing effective therapeutics against infectious diseases. Computational prediction of PHI data is gaining increasing demand because of scarcity of experimental data. Prediction of protein-protein interactions (PPIs) within PHI systems can be formulated as a classification problem, which requires the knowledge of non-interacting protein pairs. This is a restricting requirement since we lack datasets that report non-interacting protein pairs. In this study, we formulated the "computational prediction of PHI data" problem using kernel embedding of heterogeneous data. This eliminates the abovementioned requirement and enables us to predict new interactions without randomly labeling protein pairs as non-interacting. Domain-domain associations are used to filter the predicted results leading to 175 novel PHIs between 170 human proteins and 105 viral proteins. To compare our results with the state-of-the-art studies that use a binary classification formulation, we modified our settings to consider the same formulation. Detailed evaluations are conducted and our results provide more than 10 percent improvements for accuracy and AUC (area under the receiving operating curve) results in comparison with state-of-the-art methods.

  2. Tools for outcome prediction in patients with community acquired pneumonia.

    PubMed

    Khan, Faheem; Owens, Mark B; Restrepo, Marcos; Povoa, Pedro; Martin-Loeches, Ignacio

    2017-02-01

    Community-acquired pneumonia (CAP) is one of the most common causes of mortality world-wide. The mortality rate of patients with CAP is influenced by the severity of the disease, treatment failure and the requirement for hospitalization and/or intensive care unit (ICU) management, all of which may be predicted by biomarkers and clinical scoring systems. Areas covered: We review the recent literature examining the efficacy of established and newly-developed clinical scores, biological and inflammatory markers such as C-Reactive protein (CRP), procalcitonin (PCT) and Interleukin-6 (IL-6), whether used alone or in conjunction with clinical severity scores to assess the severity of CAP, predict treatment failure, guide acute in-hospital or ICU admission and predict mortality. Expert commentary: The early prediction of treatment failure using clinical scores and biomarkers plays a developing role in improving survival of patients with CAP by identifying high-risk patients requiring hospitalization or ICU admission; and may enable more efficient allocation of resources. However, it is likely that combinations of scoring systems and biomarkers will be of greater use than individual markers. Further larger studies are needed to corroborate the additive value of these markers to clinical prediction scores to provide a safer and more effective assessment tool for clinicians.

  3. MUFOLD-SS: New deep inception-inside-inception networks for protein secondary structure prediction.

    PubMed

    Fang, Chao; Shang, Yi; Xu, Dong

    2018-05-01

    Protein secondary structure prediction can provide important information for protein 3D structure prediction and protein functions. Deep learning offers a new opportunity to significantly improve prediction accuracy. In this article, a new deep neural network architecture, named the Deep inception-inside-inception (Deep3I) network, is proposed for protein secondary structure prediction and implemented as a software tool MUFOLD-SS. The input to MUFOLD-SS is a carefully designed feature matrix corresponding to the primary amino acid sequence of a protein, which consists of a rich set of information derived from individual amino acids, as well as the context of the protein sequence. Specifically, the feature matrix is a composition of physico-chemical properties of amino acids, PSI-BLAST profile, and HHBlits profile. MUFOLD-SS is composed of a sequence of nested inception modules and maps the input matrix to either eight states or three states of secondary structures. The architecture of MUFOLD-SS enables effective processing of local and global interactions between amino acids in making accurate prediction. In extensive experiments on multiple datasets, MUFOLD-SS outperformed the best existing methods and other deep neural networks significantly. MUFold-SS can be downloaded from http://dslsrv8.cs.missouri.edu/~cf797/MUFoldSS/download.html. © 2018 Wiley Periodicals, Inc.
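
    A schematic Keras sketch of the "inception-inside-inception" idea: parallel 1D convolutions of several widths, where each branch of the outer module is itself an inception module. Filter counts, depths, and the per-residue feature dimension below are illustrative, not MUFOLD-SS's actual configuration.

```python
# Schematic only: nested 1D inception blocks; all sizes are illustrative.
from tensorflow.keras import layers, Model, Input

def inception1d(x, f=32):
    """Parallel conv branches over the sequence, concatenated channel-wise."""
    b1 = layers.Conv1D(f, 1, padding="same", activation="relu")(x)
    b3 = layers.Conv1D(f, 3, padding="same", activation="relu")(x)
    b7 = layers.Conv1D(f, 7, padding="same", activation="relu")(x)
    return layers.Concatenate()([b1, b3, b7])

def inception_inside_inception(x, f=32):
    """Each 'branch' of the outer module is itself an inception module."""
    branches = [inception1d(x, f) for _ in range(3)]
    return layers.Concatenate()(branches)

inp = Input(shape=(None, 57))          # residues x per-residue features (placeholder)
h = inception_inside_inception(inp)
h = inception_inside_inception(h)
out = layers.Conv1D(8, 1, activation="softmax")(h)   # 8-state SS per residue
model = Model(inp, out)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```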

  4. DeSigN: connecting gene expression with therapeutics for drug repurposing and development.

    PubMed

    Lee, Bernard Kok Bang; Tiong, Kai Hung; Chang, Jit Kang; Liew, Chee Sun; Abdul Rahman, Zainal Ariff; Tan, Aik Choon; Khang, Tsung Fei; Cheong, Sok Ching

    2017-01-25

    The drug discovery and development pipeline is a long and arduous process that inevitably hampers rapid drug development. Therefore, strategies to improve the efficiency of drug development are urgently needed to enable effective drugs to enter the clinic. Precision medicine has demonstrated that genetic features of cancer cells can be used for predicting drug response, and emerging evidence suggests that gene-drug connections could be predicted more accurately by exploring the cumulative effects of many genes simultaneously. We developed DeSigN, a web-based tool for predicting drug efficacy against cancer cell lines using gene expression patterns. The algorithm correlates phenotype-specific gene signatures derived from differentially expressed genes with pre-defined gene expression profiles associated with drug response data (IC50) from 140 drugs. DeSigN successfully predicted the right drug sensitivity outcome in four published GEO studies. Additionally, it predicted bosutinib, a Src/Abl kinase inhibitor, as a sensitive inhibitor for oral squamous cell carcinoma (OSCC) cell lines. In vitro validation of bosutinib in OSCC cell lines demonstrated that, indeed, these cell lines were sensitive to bosutinib, with IC50 values of 0.8-1.2 μM. As further confirmation, we demonstrated experimentally that bosutinib has anti-proliferative activity in OSCC cell lines, demonstrating that DeSigN was able to robustly predict a drug that could be beneficial for tumour control. DeSigN is a robust method that is useful for the identification of candidate drugs using an input gene signature obtained from gene expression analysis. This user-friendly platform could be used to identify drugs with unanticipated efficacy against cancer cell lines of interest, and therefore could be used for the repurposing of drugs, thus improving the efficiency of drug development.
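
    A conceptual sketch of the signature-matching step (not the DeSigN backend): rank-correlate a query differential-expression signature against per-drug reference profiles and order drugs by correlation. The gene names, profiles, and values below are toy placeholders.

```python
# Hedged sketch of gene-signature vs drug-profile matching; toy data only.
from scipy.stats import spearmanr

def rank_drugs(query_signature, drug_profiles):
    """query_signature: dict gene -> log fold change from the user's data.
    drug_profiles: dict drug -> dict gene -> expression change associated
    with sensitivity (e.g., derived from IC50-stratified cell lines)."""
    results = []
    for drug, profile in drug_profiles.items():
        shared = sorted(set(query_signature) & set(profile))
        if len(shared) < 3:
            continue                      # too few genes to correlate
        q = [query_signature[g] for g in shared]
        p = [profile[g] for g in shared]
        rho, pval = spearmanr(q, p)
        results.append((drug, rho, pval))
    return sorted(results, key=lambda t: t[1], reverse=True)

# Hypothetical toy query and reference profiles:
query = {"EGFR": 2.1, "SRC": 1.8, "TP53": -1.2, "MYC": 0.9}
refs = {"bosutinib": {"EGFR": 1.5, "SRC": 2.0, "TP53": -0.8, "MYC": 0.4},
        "drugX": {"EGFR": -1.0, "SRC": -0.5, "TP53": 1.1, "MYC": -0.2}}
print(rank_drugs(query, refs))
```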

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, Michael D.; Olsen, Brett N.; Schlesinger, Paul H.

    In mammalian cells cholesterol is essential for membrane function, but in excess it can be cytotoxic. The cellular response to acute cholesterol loading involves biophysically based mechanisms that regulate cholesterol levels through modulation of the "activity", or accessibility, of cholesterol to extra-membrane acceptors. Experiments and united atom (UA) simulations show that at high concentrations of cholesterol, lipid bilayers thin significantly and cholesterol availability to external acceptors increases substantially. Such cholesterol activation is critical to its trafficking within cells. Here we aim to reduce the computational cost to enable simulation of large and complex systems involved in cholesterol regulation, such as those including oxysterols and cholesterol-sensing proteins. To accomplish this, we have modified the published MARTINI coarse-grained force field to improve its predictions of cholesterol-induced changes in both macroscopic and microscopic properties of membranes. Most notably, MARTINI fails to capture both the (macroscopic) area condensation and membrane thickening seen at less than 30% cholesterol and the thinning seen above 40% cholesterol. The thinning at high concentration is critical to cholesterol activation. Microscopic properties of interest include cholesterol-cholesterol radial distribution functions (RDFs), tilt angle, and accessible surface area. First, we develop an "angle-corrected" model wherein we modify the coarse-grained bond angle potentials based on atomistic simulations. This modification significantly improves prediction of macroscopic properties, most notably the thickening/thinning behavior, and also slightly improves microscopic property prediction relative to MARTINI. Second, we add to the angle correction a "volume correction" by also adjusting phospholipid bond lengths to achieve a more accurate volume per molecule. The angle + volume correction substantially further improves the quantitative agreement of the macroscopic properties (area per molecule and thickness) with united atom simulations. However, this improvement also reduces the accuracy of microscopic predictions like radial distribution functions and cholesterol tilt below that of either MARTINI or the angle-corrected model. Thus, while both of our force-field corrections improve MARTINI, the combined angle and volume correction should be used for problems involving sterol effects on the overall structure of the membrane, while our angle-corrected model should be used in cases where the properties of individual lipid and sterol models are critically important.

  6. Development of an aerodynamic measurement system for hypersonic rarefied flows

    NASA Astrophysics Data System (ADS)

    Ozawa, T.; Fujita, K.; Suzuki, T.

    2015-01-01

    A hypersonic rarefied wind tunnel (HRWT) has recently been developed at the Japan Aerospace Exploration Agency in order to improve the prediction of rarefied aerodynamics. Flow characteristics of hypersonic rarefied flows have been investigated experimentally and numerically. By conducting dynamic pressure measurements with pendulous models and pitot pressure measurements, we have probed flow characteristics in the test section. We have also improved understanding of hypersonic rarefied flows by integrating a numerical approach with the HRWT measurements. The development of the integration scheme between the HRWT and the numerical approach enables us to estimate hypersonic rarefied flow characteristics as well as to directly measure rarefied aerodynamics. Consequently, this wind tunnel is capable of generating 25 mm-core flows with a free-stream Mach number greater than 10 and a Knudsen number greater than 0.1.

  7. Comparison of transform coding methods with an optimal predictor for the data compression of digital elevation models

    NASA Technical Reports Server (NTRS)

    Lewis, Michael

    1994-01-01

    Statistical encoding techniques reduce the number of bits required to encode a set of symbols by exploiting the symbols' probabilities. Huffman encoding is an example of statistical encoding that has been used for error-free data compression. The degree of compression given by Huffman encoding in this application can be improved by the use of prediction methods. These replace the set of elevations by a set of corrections that have a more advantageous probability distribution. In particular, the method of Lagrange multipliers for minimization of the mean square error has been applied to local geometrical predictors. Using this technique, an 8-point predictor achieved about a 7 percent improvement over an existing simple triangular predictor.
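
    A minimal sketch of the scheme: replace elevations with prediction residuals, then Huffman-code the residuals. A simple previous-neighbour predictor stands in here for the optimized 8-point Lagrange-multiplier predictor described above.

```python
# Sketch: predict-then-Huffman compression of elevation data. The trivial
# previous-sample predictor is a stand-in for the paper's 8-point predictor.
import heapq
from collections import Counter

def residuals(elevations):
    """Predict each sample as the previous one; keep the corrections."""
    prev, out = 0, []
    for e in elevations:
        out.append(e - prev)
        prev = e
    return out

def huffman_code(symbols):
    """Classic heap-based Huffman construction: symbol -> bitstring."""
    freq = Counter(symbols)
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    idx = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, [w1 + w2, idx, merged])
        idx += 1
    return heap[0][2]

dem = [100, 101, 101, 103, 104, 104, 105, 104]   # toy elevation profile
res = residuals(dem)                             # mostly small values near zero
code = huffman_code(res)
bits = sum(len(code[r]) for r in res)
print(res, bits, "bits vs", 8 * len(dem), "raw (8-bit) bits")
```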

  8. Test of Special Relativity Using a Fiber Network of Optical Clocks.

    PubMed

    Delva, P; Lodewyck, J; Bilicki, S; Bookjans, E; Vallet, G; Le Targat, R; Pottie, P-E; Guerlin, C; Meynadier, F; Le Poncin-Lafitte, C; Lopez, O; Amy-Klein, A; Lee, W-K; Quintin, N; Lisdat, C; Al-Masoudi, A; Dörscher, S; Grebing, C; Grosche, G; Kuhl, A; Raupach, S; Sterr, U; Hill, I R; Hobson, R; Bowden, W; Kronjäger, J; Marra, G; Rolland, A; Baynes, F N; Margolis, H S; Gill, P

    2017-06-02

    Phase compensated optical fiber links enable high accuracy atomic clocks separated by thousands of kilometers to be compared with unprecedented statistical resolution. By searching for a daily variation of the frequency difference between four strontium optical lattice clocks in different locations throughout Europe connected by such links, we improve upon previous tests of time dilation predicted by special relativity. We obtain a constraint on the Robertson-Mansouri-Sexl parameter |α|≲1.1×10^{-8}, quantifying a violation of time dilation, thus improving by a factor of around 2 the best known constraint obtained with Ives-Stilwell type experiments, and by 2 orders of magnitude the best constraint obtained by comparing atomic clocks. This work is the first of a new generation of tests of fundamental physics using optical clocks and fiber links. As clocks improve, and as fiber links are routinely operated, we expect that the tests initiated in this Letter will improve by orders of magnitude in the near future.

  9. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    PubMed

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  10. Predicting fundamental frequency from mel-frequency cepstral coefficients to enable speech reconstruction.

    PubMed

    Shao, Xu; Milner, Ben

    2005-08-01

    This work proposes a method to reconstruct an acoustic speech signal solely from a stream of mel-frequency cepstral coefficients (MFCCs) as may be encountered in a distributed speech recognition (DSR) system. Previous methods for speech reconstruction have required, in addition to the MFCC vectors, fundamental frequency and voicing components. In this work the voicing classification and fundamental frequency are predicted from the MFCC vectors themselves using two maximum a posteriori (MAP) methods. The first method enables fundamental frequency prediction by modeling the joint density of MFCCs and fundamental frequency using a single Gaussian mixture model (GMM). The second scheme uses a set of hidden Markov models (HMMs) to link together a set of state-dependent GMMs, which enables a more localized modeling of the joint density of MFCCs and fundamental frequency. Experimental results on speaker-independent male and female speech show that accurate voicing classification and fundamental frequency prediction is attained when compared to hand-corrected reference fundamental frequency measurements. The use of the predicted fundamental frequency and voicing for speech reconstruction is shown to give very similar speech quality to that obtained using the reference fundamental frequency and voicing.
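
    A sketch of the first (single-GMM) scheme: fit a GMM to the joint density of MFCC vectors and fundamental frequency, then predict f0 from the conditional expectation given the MFCCs (the conditional-mean variant of the MAP idea). Feature dimensions and training data below are placeholders, not real speech features.

```python
# Hedged sketch of GMM-based f0 prediction from MFCCs; synthetic stand-in
# data replace real MFCC frames and reference f0 measurements.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

D = 13                                      # MFCC dimension (placeholder)
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, D))              # stand-in for real MFCC frames
f0 = 120 + 10 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 2, 2000)
gmm = GaussianMixture(n_components=8, covariance_type="full", random_state=0)
gmm.fit(np.column_stack([X, f0]))           # joint density of (mfcc, f0)

def predict_f0(mfcc):
    """Conditional mean: responsibility-weighted per-component regressions."""
    means, covs, w = gmm.means_, gmm.covariances_, gmm.weights_
    num, den = 0.0, 0.0
    for k in range(gmm.n_components):
        mx, my = means[k, :D], means[k, D]
        Sxx, Sxy = covs[k][:D, :D], covs[k][:D, D]
        resp = w[k] * multivariate_normal.pdf(mfcc, mx, Sxx)
        cond_mean = my + Sxy @ np.linalg.solve(Sxx, mfcc - mx)
        num += resp * cond_mean
        den += resp
    return num / den

print(predict_f0(X[0]), f0[0])
```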

  11. Structural and Computational Biology in the Design of Immunogenic Vaccine Antigens

    PubMed Central

    Liljeroos, Lassi; Malito, Enrico; Ferlenghi, Ilaria; Bottomley, Matthew James

    2015-01-01

    Vaccination is historically one of the most important medical interventions for the prevention of infectious disease. Previously, vaccines were typically made of rather crude mixtures of inactivated or attenuated causative agents. However, over the last 10–20 years, several important technological and computational advances have enabled major progress in the discovery and design of potently immunogenic recombinant protein vaccine antigens. Here we discuss three key breakthrough approaches that have potentiated structural and computational vaccine design. Firstly, genomic sciences gave birth to the field of reverse vaccinology, which has enabled the rapid computational identification of potential vaccine antigens. Secondly, major advances in structural biology, experimental epitope mapping, and computational epitope prediction have yielded molecular insights into the immunogenic determinants defining protective antigens, enabling their rational optimization. Thirdly, and most recently, computational approaches have been used to convert this wealth of structural and immunological information into the design of improved vaccine antigens. This review aims to illustrate the growing power of combining sequencing, structural and computational approaches, and we discuss how this may drive the design of novel immunogens suitable for future vaccines urgently needed to increase the global prevention of infectious disease. PMID:26526043

  12. Structural and Computational Biology in the Design of Immunogenic Vaccine Antigens.

    PubMed

    Liljeroos, Lassi; Malito, Enrico; Ferlenghi, Ilaria; Bottomley, Matthew James

    2015-01-01

    Vaccination is historically one of the most important medical interventions for the prevention of infectious disease. Previously, vaccines were typically made of rather crude mixtures of inactivated or attenuated causative agents. However, over the last 10-20 years, several important technological and computational advances have enabled major progress in the discovery and design of potently immunogenic recombinant protein vaccine antigens. Here we discuss three key breakthrough approaches that have potentiated structural and computational vaccine design. Firstly, genomic sciences gave birth to the field of reverse vaccinology, which has enabled the rapid computational identification of potential vaccine antigens. Secondly, major advances in structural biology, experimental epitope mapping, and computational epitope prediction have yielded molecular insights into the immunogenic determinants defining protective antigens, enabling their rational optimization. Thirdly, and most recently, computational approaches have been used to convert this wealth of structural and immunological information into the design of improved vaccine antigens. This review aims to illustrate the growing power of combining sequencing, structural and computational approaches, and we discuss how this may drive the design of novel immunogens suitable for future vaccines urgently needed to increase the global prevention of infectious disease.

  13. Autonomous Mission Operations for Sensor Webs

    NASA Astrophysics Data System (ADS)

    Underbrink, A.; Witt, K.; Stanley, J.; Mandl, D.

    2008-12-01

    We present interim results of a 2005 ROSES AIST project entitled, "Using Intelligent Agents to Form a Sensor Web for Autonomous Mission Operations", or SWAMO. The goal of the SWAMO project is to shift the control of spacecraft missions from a ground-based, centrally controlled architecture to a collaborative, distributed set of intelligent agents. The network of intelligent agents intends to reduce management requirements by utilizing model-based system prediction and autonomic model/agent collaboration. SWAMO agents are distributed throughout the Sensor Web environment, which may include multiple spacecraft, aircraft, ground systems, and ocean systems, as well as manned operations centers. The agents monitor and manage sensor platforms, Earth sensing systems, and Earth sensing models and processes. The SWAMO agents form a Sensor Web of agents via peer-to-peer coordination. Some of the intelligent agents are mobile and able to traverse between on-orbit and ground-based systems. Other agents in the network are responsible for encapsulating system models to perform prediction of future behavior of the modeled subsystems and components to which they are assigned. The software agents use semantic web technologies to enable improved information sharing among the operational entities of the Sensor Web. The semantics include ontological conceptualizations of the Sensor Web environment, plus conceptualizations of the SWAMO agents themselves. By conceptualizations of the agents, we mean knowledge of their state, operational capabilities, current operational capacities, Web Service search and discovery results, agent collaboration rules, etc. The need for ontological conceptualizations over the agents is to enable autonomous and autonomic operations of the Sensor Web. The SWAMO ontology enables automated decision making and responses to the dynamic Sensor Web environment and to end user science requests. The current ontology is compatible with Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) Sensor Model Language (SensorML) concepts and structures. The agents are currently deployed on the U.S. Naval Academy MidSTAR-1 satellite and are actively managing the power subsystem on-orbit without the need for human intervention.

  14. Replacement of SSE with NASA's POWER Project GIS-enabled Web Data Portal

    Atmospheric Science Data Center

    2018-04-30

    Replacement of SSE (Release 6) with NASA's Prediction of Worldwide Energy Resource (POWER) Project GIS-enabled Web Data Portal. The POWER Project is funded largely by the NASA Earth Applied Sciences program.

  15. Detecting causal drivers and empirical prediction of the Indian Summer Monsoon

    NASA Astrophysics Data System (ADS)

    Di Capua, G.; Vellore, R.; Raghavan, K.; Coumou, D.

    2017-12-01

    The Indian summer monsoon (ISM) is crucial for the economy, society and natural ecosystems on the Indian peninsula. Predicting the total seasonal rainfall at several months' lead time would help to plan effective water management strategies, improve flood and drought protection programs, and prevent humanitarian crises. However, the complexity and strong internal variability of the ISM circulation system make skillful seasonal forecasting challenging. Moreover, novel tools are needed to adequately identify the low-frequency and far-away processes which influence ISM behavior. We applied a Response-Guided Causal Precursor Detection (RGCPD) scheme, a novel empirical prediction method which unites a response-guided community detection scheme with a causal discovery algorithm (CEN). These tools allow us to assess causal pathways between different components of the ISM circulation system and with far-away regions in the tropics, mid-latitudes or Arctic. The scheme has successfully been used to identify causal precursors of the stratospheric polar vortex, enabling skillful predictions at (sub)seasonal timescales (Kretschmer et al. 2016, J. Clim.; Kretschmer et al. 2017, GRL). We analyze observed ISM monthly rainfall over the monsoon trough region. Applying causal discovery techniques, we identify several causal precursor communities in the fields of 2m-temperature, sea level pressure and snow depth over Eurasia. Specifically, our results suggest that surface temperature conditions in both tropical and Arctic regions contribute to ISM variability. A linear regression prediction model based on the identified set of communities has good hindcasting skill at 4-5 months lead times. Further, we separate El Nino, La Nina and ENSO-neutral years from each other and find that the causal precursors differ depending on the ENSO state. The ENSO-state-dependent causal precursors give even higher skill, especially for La Nina years when the ISM is relatively strong. These findings are promising results that might ultimately contribute both to improved understanding of the ISM circulation system and to improved seasonal ISM forecasts.

  16. Breeding and Genetics Symposium: networks and pathways to guide genomic selection.

    PubMed

    Snelling, W M; Cushman, R A; Keele, J W; Maltecca, C; Thomas, M G; Fortes, M R S; Reverter, A

    2013-02-01

    Many traits affecting profitability and sustainability of meat, milk, and fiber production are polygenic, with no single gene having an overwhelming influence on observed variation. No knowledge of the specific genes controlling these traits has been needed to make substantial improvement through selection. Significant gains have been made through phenotypic selection enhanced by pedigree relationships and continually improving statistical methodology. Genomic selection, recently enabled by assays for dense SNP located throughout the genome, promises to increase selection accuracy and accelerate genetic improvement by emphasizing the SNP most strongly correlated to phenotype although the genes and sequence variants affecting phenotype remain largely unknown. These genomic predictions theoretically rely on linkage disequilibrium (LD) between genotyped SNP and unknown functional variants, but familial linkage may increase effectiveness when predicting individuals related to those in the training data. Genomic selection with functional SNP genotypes should be less reliant on LD patterns shared by training and target populations, possibly allowing robust prediction across unrelated populations. Although the specific variants causing polygenic variation may never be known with certainty, a number of tools and resources can be used to identify those most likely to affect phenotype. Associations of dense SNP genotypes with phenotype provide a 1-dimensional approach for identifying genes affecting specific traits; in contrast, associations with multiple traits allow defining networks of genes interacting to affect correlated traits. Such networks are especially compelling when corroborated by existing functional annotation and established molecular pathways. The SNP occurring within network genes, obtained from public databases or derived from genome and transcriptome sequences, may be classified according to expected effects on gene products. As illustrated by functionally informed genomic predictions being more accurate than naive whole-genome predictions of beef tenderness, coupling evidence from livestock genotypes, phenotypes, gene expression, and genomic variants with existing knowledge of gene functions and interactions may provide greater insight into the genes and genomic mechanisms affecting polygenic traits and facilitate functional genomic selection for economically important traits.

  17. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    PubMed

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extensibility. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
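    The learning-curve monitoring described here can be sketched as follows; the synthetic dataset and logistic-regression classifier are stand-ins for real eCRF study data, assuming the goal is simply to see whether cross-validated performance has plateaued before recruiting further patients.

```python
# Hedged sketch of a learning curve used to judge whether additional patient
# inclusions still improve a diagnostic model. Data and classifier are
# illustrative, not CDM's internals.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
sizes, train_scores, val_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=np.linspace(0.1, 1.0, 8), cv=5, scoring="roc_auc")

for n, score in zip(sizes, val_scores.mean(axis=1)):
    print(f"{n:4d} patients -> cross-validated AUC {score:.3f}")
# A plateau suggests that further recruitment yields little predictive gain.
```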

  18. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
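    A minimal sketch of the rejection-sampling view of Bayesian updating on which BUS builds is given below; the toy measurement model and the threshold defining the rare event are assumptions, and the crude Monte Carlo estimate stands in for the more efficient FORM, IS, or SuS machinery discussed above.

```python
# Rejection-sampling view of Bayesian updating: prior samples are accepted
# with probability L(theta)/c, and the rare-event probability is then
# estimated from the accepted (posterior) samples. Toy problem only.
import numpy as np

rng = np.random.default_rng(2)
prior = rng.normal(0.0, 1.0, size=200_000)          # prior samples of theta

obs, sigma = 1.2, 0.5                               # one noisy measurement of theta
likelihood = np.exp(-0.5 * ((obs - prior) / sigma) ** 2)
c = likelihood.max()                                # constant with L(theta) <= c
accepted = prior[rng.random(prior.size) < likelihood / c]   # posterior samples

# Rare event: response x ~ N(theta, 1) exceeds the (assumed) threshold 5.
x = accepted + rng.normal(0.0, 1.0, size=accepted.size)
print(f"posterior samples: {accepted.size}, P(x > 5) ~ {np.mean(x > 5):.2e}")
```

    For genuinely small probabilities, the crude average in the last line becomes prohibitively expensive, which is exactly why the framework above replaces it with FORM, importance sampling, or Subset Simulation.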

  19. Enabling multiplexed testing of pooled donor cells through whole-genome sequencing.

    PubMed

    Chan, Yingleong; Chan, Ying Kai; Goodman, Daniel B; Guo, Xiaoge; Chavez, Alejandro; Lim, Elaine T; Church, George M

    2018-04-19

    We describe a method that enables the multiplex screening of a pool of many different donor cell lines. Our method accurately predicts each donor proportion from the pool without requiring the use of unique DNA barcodes as markers of donor identity. Instead, we take advantage of common single nucleotide polymorphisms, whole-genome sequencing, and an algorithm to calculate the proportions from the sequencing data. By testing using simulated and real data, we showed that our method robustly predicts the individual proportions from a mixed pool of numerous donors, thus enabling the multiplexed testing of diverse donor cells en masse. More information is available at https://pgpresearch.med.harvard.edu/poolseq/.
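    A hedged sketch of the underlying estimation idea (not the authors' exact algorithm): pooled alternate-allele fractions are modeled as a genotype-weighted mixture, and the donor proportions are recovered by non-negative least squares.

```python
# Illustrative deconvolution of donor proportions from pooled sequencing:
# observed allele fractions ~ genotype dosages x mixture weights, solved
# with non-negative least squares. All data are simulated.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_donors, n_snps = 8, 5000
genotypes = rng.integers(0, 3, size=(n_snps, n_donors)) / 2.0  # dosage fractions

true_w = rng.dirichlet(np.ones(n_donors))                   # hidden pool proportions
pool_af = genotypes @ true_w + rng.normal(0, 0.01, n_snps)  # noisy pooled fractions

w_hat, _ = nnls(genotypes, pool_af)
w_hat /= w_hat.sum()                                        # renormalise to proportions
print(np.round(true_w, 3), np.round(w_hat, 3), sep="\n")
```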

  20. Latent Patient Cluster Discovery for Robust Future Forecasting and New-Patient Generalization

    PubMed Central

    Masino, Aaron J.

    2016-01-01

    Commonly referred to as predictive modeling, the use of machine learning and statistical methods to improve healthcare outcomes has recently gained traction in biomedical informatics research. Given the vast opportunities enabled by large Electronic Health Records (EHR) data and powerful resources for conducting predictive modeling, we argue that it is yet crucial to first carefully examine the prediction task and then choose predictive methods accordingly. Specifically, we argue that there are at least three distinct prediction tasks that are often conflated in biomedical research: 1) data imputation, where a model fills in the missing values in a dataset, 2) future forecasting, where a model projects the development of a medical condition for a known patient based on existing observations, and 3) new-patient generalization, where a model transfers the knowledge learned from previously observed patients to newly encountered ones. Importantly, the latter two tasks—future forecasting and new-patient generalizations—tend to be more difficult than data imputation as they require predictions to be made on potentially out-of-sample data (i.e., data following a different predictable pattern from what has been learned by the model). Using hearing loss progression as an example, we investigate three regression models and show that the modeling of latent clusters is a robust method for addressing the more challenging prediction scenarios. Overall, our findings suggest that there exist significant differences between various kinds of prediction tasks and that it is important to evaluate the merits of a predictive model relative to the specific purpose of a prediction task. PMID:27636203

  1. Latent Patient Cluster Discovery for Robust Future Forecasting and New-Patient Generalization.

    PubMed

    Qian, Ting; Masino, Aaron J

    2016-01-01

    Commonly referred to as predictive modeling, the use of machine learning and statistical methods to improve healthcare outcomes has recently gained traction in biomedical informatics research. Given the vast opportunities enabled by large Electronic Health Records (EHR) data and powerful resources for conducting predictive modeling, we argue that it is yet crucial to first carefully examine the prediction task and then choose predictive methods accordingly. Specifically, we argue that there are at least three distinct prediction tasks that are often conflated in biomedical research: 1) data imputation, where a model fills in the missing values in a dataset, 2) future forecasting, where a model projects the development of a medical condition for a known patient based on existing observations, and 3) new-patient generalization, where a model transfers the knowledge learned from previously observed patients to newly encountered ones. Importantly, the latter two tasks-future forecasting and new-patient generalizations-tend to be more difficult than data imputation as they require predictions to be made on potentially out-of-sample data (i.e., data following a different predictable pattern from what has been learned by the model). Using hearing loss progression as an example, we investigate three regression models and show that the modeling of latent clusters is a robust method for addressing the more challenging prediction scenarios. Overall, our findings suggest that there exist significant differences between various kinds of prediction tasks and that it is important to evaluate the merits of a predictive model relative to the specific purpose of a prediction task.
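    The latent-cluster strategy can be sketched as below: cluster patients on early trajectory features, fit one forecasting model per cluster, and route new patients through the nearest cluster. The synthetic two-cluster data and the KMeans/linear-regression pairing are illustrative assumptions, not the authors' exact hearing-loss models.

```python
# Hedged sketch of latent-cluster forecasting for patient trajectories.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n_patients, n_visits = 300, 6
slopes = rng.choice([-2.0, -0.5], size=n_patients)          # two latent progressions
visits = np.arange(n_visits)
traj = 60 + slopes[:, None] * visits + rng.normal(0, 1, (n_patients, n_visits))

early, future = traj[:, :4], traj[:, 5]       # observe 4 visits, predict the 6th
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit(early)
models = {k: LinearRegression().fit(early[clusters.labels_ == k],
                                    future[clusters.labels_ == k])
          for k in range(2)}

new_patient = early[:1]                       # stand-in for an unseen patient
k = clusters.predict(new_patient)[0]          # new-patient generalization step
print(f"cluster {k}, forecast visit-6 value: {models[k].predict(new_patient)[0]:.1f}")
```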

  2. 2015 Army Science Planning and Strategy Meeting Series: Outcomes and Conclusions

    DTIC Science & Technology

    2017-12-21

    modeling and nanoscale characterization tools to enable efficient design of hybridized manufacturing; real-time, multiscale computational capability...to enable predictive analytics for expeditionary on-demand manufacturing • Discovery of design principles to enable programming advanced genetic...goals, significant research is needed to mature the fundamental materials science, processing and manufacturing sciences, design methodologies, data

  3. Insights from in-situ X-ray computed tomography during axial impregnation of unidirectional fiber beds

    DOE PAGES

    Larson, Natalie M.; Zok, Frank W.

    2017-12-27

    In-situ X-ray computed tomography during axial impregnation of unidirectional fiber beds is used to study coupled effects of fluid velocity, fiber movement and preferred flow channeling on permeability. Here, in order to interpret the experimental measurements, a new computational tool for predicting axial permeability of very large 2D arrays of non-uniformly packed fibers is developed. The results show that, when the impregnation velocity is high, full saturation is attained behind the flow front and the fibers rearrange into a less uniform configuration with higher permeability. In contrast, when the velocity is low, fluid flows preferentially in the narrowest channels between fibers, yielding unsaturated permeabilities that are lower than those in the saturated state. Lastly, these insights, combined with the new computational tool, will enable improved prediction of permeability, ultimately for use in optimization of composite manufacturing via liquid impregnation.
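    For orientation, the classical Kozeny-Carman relation gives a back-of-envelope estimate of saturated axial permeability for an aligned fiber bed; it is not the computational tool described above, and the Kozeny constant and fiber radius are assumed values that ignore the non-uniform-packing effects the study focuses on.

```python
# Classical Kozeny-Carman estimate of saturated axial permeability of an
# aligned fiber bed; k ~ 0.7 for axial flow is a commonly quoted value.
def axial_permeability(fiber_radius_m: float, vf: float, kozeny: float = 0.7) -> float:
    """Kozeny-Carman: K = r^2 (1 - Vf)^3 / (4 k Vf^2), returned in m^2."""
    return fiber_radius_m**2 * (1 - vf) ** 3 / (4 * kozeny * vf**2)

for vf in (0.45, 0.55, 0.65):   # fiber volume fractions
    print(f"Vf = {vf:.2f}: K = {axial_permeability(5e-6, vf):.2e} m^2")
```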

  4. [Molecular characterization of osteosarcomas].

    PubMed

    Baumhoer, D

    2013-11-01

    Osteosarcomas are rare, with an estimated incidence of 5-6 cases per one million inhabitants per year. As the prognosis has not improved significantly over the last 30 years and more than 30% of patients still die of the disease, a better understanding of the molecular tumorigenesis is urgently needed to identify prognostic and predictive biomarkers as well as potential therapeutic targets. Using genome-wide SNP chip analyses we were able to detect a genetic signature enabling a prognostic prediction for patients already at the time of initial diagnosis. Furthermore, we found the microRNA cluster 17-92 to be constitutively overexpressed in osteosarcomas. The microRNAs included here are intermingled in a complex network of several oncogenes and tumor suppressors that have been described to be deregulated in osteosarcomas. Therefore, the microRNA cluster 17-92 could represent a central regulator in the development of osteosarcomas.

  5. Sequence stratigraphy of the Triassic in the Barentsz Sea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skjold, L.JU.; Van Veen, P.M.; Gjelberg, J.

    1990-05-01

    A regional study of the Triassic in the Barentsz Sea (20-32°E, 71-74°N) revealed sequences that correlate seismically for hundreds of kilometers. Recent offshore drilling results enabled the authors to establish a biostratigraphic time framework. Comparisons with information from onshore outcrops (such as the Svalbard Archipelago) aided the piecing together of these superregional sequences. Seismic character analysis identified three units with composite progradational patterns (Induan, Olenekian, and Anisian). Fluvial, deltaic, and marine deposits can be distinguished and located relative to the paleocoastlines. Corresponding downlap surfaces suggest the development of condensed intervals, predicted to consist of organic-rich source rocks, as was later confirmed by drilling. Regional predictions based on this sequence-stratigraphic approach have proved valuable when correlating and evaluating well information. The sequences identified also help define third-order sea level curves for the area; these improve published curves thought to have global significance.

  6. Insights from in-situ X-ray computed tomography during axial impregnation of unidirectional fiber beds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, Natalie M.; Zok, Frank W.

    In-situ X-ray computed tomography during axial impregnation of unidirectional fiber beds is used to study coupled effects of fluid velocity, fiber movement and preferred flow channeling on permeability. Here, in order to interpret the experimental measurements, a new computational tool for predicting axial permeability of very large 2D arrays of non-uniformly packed fibers is developed. The results show that, when the impregnation velocity is high, full saturation is attained behind the flow front and the fibers rearrange into a less uniform configuration with higher permeability. In contrast, when the velocity is low, fluid flows preferentially in the narrowest channels between fibers, yielding unsaturated permeabilities that are lower than those in the saturated state. Lastly, these insights, combined with the new computational tool, will enable improved prediction of permeability, ultimately for use in optimization of composite manufacturing via liquid impregnation.

  7. Predictions for Swift Follow-up Observations of Advanced LIGO/Virgo Gravitational Wave Sources

    NASA Astrophysics Data System (ADS)

    Racusin, Judith; Evans, Phil; Connaughton, Valerie

    2015-04-01

    The likely detection of gravitational waves associated with the inspiral of neutron star binaries by the upcoming advanced LIGO/Virgo observatories will be complemented by searches for electromagnetic counterparts over large areas of the sky by Swift and other observatories. As short gamma-ray bursts (GRB) are the most likely electromagnetic counterpart candidates to these sources, we can make predictions based upon the last decade of GRB observations by Swift and Fermi. Swift is uniquely capable of accurately localizing new transients rapidly over large areas of the sky in single and tiled pointings, enabling ground-based follow-up. We describe simulations of the detectability of short GRB afterglows by Swift given existing and hypothetical tiling schemes with realistic observing conditions and delays, which guide the optimal observing strategy and improvements provided by coincident detection with observatories such as Fermi-GBM.

  8. The Mars Exploration Rover (MER) Transverse Impulse Rocket System (TIRS)

    NASA Technical Reports Server (NTRS)

    SanMartin, Alejandro Miguel; Bailey, Erik

    2005-01-01

    In a very short period of time the MER project successfully developed and tested a system, TIRS/DIMES, to improve the probability of success in the presence of large Martian winds. The successful development of TIRS/DIMES played a big role in the landing site selection process by enabling the landing of Spirit on Gusev crater, a site of very high scientific interest but with known high wind conditions. The performance of TIRS by Spirit at Gusev Crater was excellent. The velocity prediction error was small, and Big TIRS was fired, reducing the impact horizontal velocity from approximately 23 meters per second to approximately 11 meters per second, well within the airbag capabilities. The performance of TIRS by Opportunity at Meridiani was good. The velocity prediction error was rather large (approximately 6 meters per second, a less than 2-sigma value), but TIRS did not fire, which was the correct action.

  9. Numerical Analysis of Incipient Separation on 53 Deg Swept Diamond Wing

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.

    2015-01-01

    A systematic analysis of incipient separation and subsequent vortex formation from moderately swept blunt leading edges is presented for a 53 deg swept diamond wing. This work contributes to a collective body of knowledge generated within the NATO/STO AVT-183 Task Group titled 'Reliable Prediction of Separated Flow Onset and Progression for Air and Sea Vehicles'. The objective is to extract insights from the experimentally measured and numerically computed flow fields that might enable turbulence experts to further improve their models for predicting swept blunt leading-edge flow separation. Details of vortex formation are inferred from numerical solutions after establishing a good correlation of the global flow field and surface pressure distributions between wind tunnel measurements and computed flow solutions. From this, significant and sometimes surprising insights into the nature of incipient separation and part-span vortex formation are derived from the wealth of information available in the computational solutions.

  10. A Robust Compositional Architecture for Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Deney, Ewen; Farrell, Kimberley; Giannakopoulos, Dimitra; Jonsson, Ari; Frank, Jeremy; Bobby, Mark; Carpenter, Todd; Estlin, Tara

    2006-01-01

    Space exploration applications can benefit greatly from autonomous systems. Great distances, limited communications and high costs make direct operations impossible while mandating operations reliability and efficiency beyond what traditional commanding can provide. Autonomous systems can improve reliability and enhance spacecraft capability significantly. However, there is reluctance to utilize autonomous systems. In part this is due to general hesitation about new technologies, but a more tangible concern is the reliability and predictability of autonomous software. In this paper, we describe ongoing work aimed at increasing the robustness and predictability of autonomous software, with the ultimate goal of building trust in such systems. The work combines state-of-the-art technologies and capabilities in autonomous systems with advanced validation and synthesis techniques. The focus of this paper is on the autonomous system architecture that has been defined, and on how it enables the application of validation techniques for resulting autonomous systems.

  11. Cognitive orientation and genital infections in young women.

    PubMed

    Kreitler, S; Kreitler, H; Schwartz, R

    1991-01-01

    The purpose was to explore the psychological determinants of common genital infections in young women. The study was done in the framework of the cognitive orientation theory, which assumes that cognition guides behavior and provides predictions of behaviors and psychophysiological phenomena. We expected that beliefs of four types (about self, norms, goals, and general) would predict the occurrence and/or frequency of 17 gynecological symptoms (e.g., itching, swelling, different vaginal discharges, abscesses). The subjects were 195 female volunteers, undergraduates, about 23 years old, without gross gynecological disorders, mostly (87.7%) unmarried, mostly (83.6%) having had intercourse. They were administered anonymous questionnaires about demographic variables, about frequency and treatment of gynecological symptoms and 3 urological ones (for control), and about cognitive orientation referring to pretested themes (e.g., assertiveness, hypochondriasis). Stepwise discriminant and regression analyses showed that the belief types enabled predicting the occurrence and frequency of all symptoms, with a mean 34.5% improvement over the 50% chance level, accounting for 45.7-67.2% of the variance. The urological symptoms were also predicted, although at a lower level. Discussion focuses on the specificity of cognitive-motivational determinants and their role in producing conditions favoring physical pathology.

  12. Antimicrobial Resistance Prediction in PATRIC and RAST.

    PubMed

    Davis, James J; Boisvert, Sébastien; Brettin, Thomas; Kenyon, Ronald W; Mao, Chunhong; Olson, Robert; Overbeek, Ross; Santerre, John; Shukla, Maulik; Wattam, Alice R; Will, Rebecca; Xia, Fangfang; Stevens, Rick

    2016-06-14

    The emergence and spread of antimicrobial resistance (AMR) mechanisms in bacterial pathogens, coupled with the dwindling number of effective antibiotics, has created a global health crisis. Being able to identify the genetic mechanisms of AMR and predict the resistance phenotypes of bacterial pathogens prior to culturing could inform clinical decision-making and improve reaction time. At PATRIC (http://patricbrc.org/), we have been collecting bacterial genomes with AMR metadata for several years. In order to advance phenotype prediction and the identification of genomic regions relating to AMR, we have updated the PATRIC FTP server to enable access to genomes that are binned by their AMR phenotypes, as well as metadata including minimum inhibitory concentrations. Using this infrastructure, we custom built AdaBoost (adaptive boosting) machine learning classifiers for identifying carbapenem resistance in Acinetobacter baumannii, methicillin resistance in Staphylococcus aureus, and beta-lactam and co-trimoxazole resistance in Streptococcus pneumoniae with accuracies ranging from 88-99%. We also did this for isoniazid, kanamycin, ofloxacin, rifampicin, and streptomycin resistance in Mycobacterium tuberculosis, achieving accuracies ranging from 71-88%. This set of classifiers has been used to provide an initial framework for species-specific AMR phenotype and genomic feature prediction in the RAST and PATRIC annotation services.
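    In the spirit of the classifiers described above, the sketch below trains an AdaBoost model on simulated binary genomic features (e.g., presence/absence of gene variants or k-mers) to predict a resistance phenotype; the data and feature construction are invented for illustration and do not reproduce the PATRIC pipelines.

```python
# Hedged sketch: AdaBoost classifier for a resistant/susceptible phenotype
# from binary genomic features. Simulated data; a few loci drive resistance.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_genomes, n_features = 400, 1000
X = rng.integers(0, 2, size=(n_genomes, n_features))
y = (X[:, :5].sum(axis=1) + rng.random(n_genomes) > 3).astype(int)  # few causal loci

clf = AdaBoostClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```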

  13. Method and apparatus to predict the remaining service life of an operating system

    DOEpatents

    Greitzer, Frank L.; Kangas, Lars J.; Terrones, Kristine M.; Maynard, Melody A.; Pawlowski, Ronald A. , Ferryman; Thomas A.; Skorpik, James R.; Wilson, Bary W.

    2008-11-25

    A method and computer-based apparatus for monitoring the degradation of, predicting the remaining service life of, and/or planning maintenance for, an operating system are disclosed. Diagnostic information on degradation of the operating system is obtained through measurement of one or more performance characteristics by one or more sensors onboard and/or proximate the operating system. Though not required, it is preferred that the sensor data are validated to improve the accuracy and reliability of the service life predictions. The condition or degree of degradation of the operating system is presented to a user by way of one or more calculated, numeric degradation figures of merit that are trended against one or more independent variables using one or more mathematical techniques. Furthermore, more than one trendline and uncertainty interval may be generated for a given degradation figure of merit/independent variable data set. The trendline(s) and uncertainty interval(s) are subsequently compared to one or more degradation figure of merit thresholds to predict the remaining service life of the operating system. The present invention enables multiple mathematical approaches in determining which trendline(s) to use to provide the best estimate of the remaining service life.
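    The trendline-to-threshold idea can be illustrated with a minimal sketch: fit a linear trend to a degradation figure of merit and extrapolate to a failure threshold to estimate remaining service life. The threshold, degradation rate, and noise level below are assumptions, and real implementations would also propagate the uncertainty interval around the trendline.

```python
# Illustrative remaining-service-life estimate from a degradation trendline.
import numpy as np

hours = np.arange(0, 500, 25.0)
fom = 1.0 - 0.0012 * hours + np.random.default_rng(6).normal(0, 0.01, hours.size)

slope, intercept = np.polyfit(hours, fom, 1)    # fitted degradation trendline
threshold = 0.30                                # assumed figure-of-merit failure limit
t_fail = (threshold - intercept) / slope        # where the trendline crosses it
print(f"estimated remaining service life: {t_fail - hours[-1]:.0f} h")
```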

  14. Modelling seagrass growth and development to evaluate transplanting strategies for restoration.

    PubMed

    Renton, Michael; Airey, Michael; Cambridge, Marion L; Kendrick, Gary A

    2011-10-01

    Seagrasses are important marine plants that are under threat globally. Restoration by transplanting vegetative fragments or seedlings into areas where seagrasses have been lost is possible, but long-term trial data are limited. The goal of this study is to use available short-term data to predict long-term outcomes of transplanting seagrass. A functional-structural plant model of seagrass growth that integrates data collected from short-term trials and experiments is presented. The model was parameterized for the species Posidonia australis, a limited validation of the model against independent data and a sensitivity analysis were conducted and the model was used to conduct a preliminary evaluation of different transplanting strategies. The limited validation was successful, and reasonable long-term outcomes could be predicted, based only on short-term data. This approach for modelling seagrass growth and development enables long-term predictions of the outcomes to be made from different strategies for transplanting seagrass, even when empirical long-term data are difficult or impossible to collect. More validation is required to improve confidence in the model's predictions, and inclusion of more mechanism will extend the model's usefulness. Marine restoration represents a novel application of functional-structural plant modelling.

  15. Multiple Diseases in Carrier Probability Estimation: Accounting for Surviving All Cancers Other than Breast and Ovary in BRCAPRO

    PubMed Central

    Katki, Hormuzd A.; Blackford, Amanda; Chen, Sining; Parmigiani, Giovanni

    2008-01-01

    Mendelian models can predict who carries an inherited deleterious mutation of known disease genes based on family history. For example, the BRCAPRO model is commonly used to identify families who carry mutations of BRCA1 and BRCA2, based on familial breast and ovarian cancers. These models incorporate the age of diagnosis of diseases in relatives and current age or age of death. We develop a rigorous foundation for handling multiple diseases with censoring. We prove that any disease unrelated to mutations can be excluded from the model, unless it is sufficiently common and dependent on a mutation-related disease time. Furthermore, if a family member has a disease with higher probability density among mutation carriers, but the model does not account for it, then the carrier probability is deflated. However, even if a family only has diseases the model accounts for, if the model excludes a mutation-related disease, then the carrier probability will be inflated. In light of these results, we extend BRCAPRO to account for surviving all non-breast/ovary cancers as a single outcome. The extension also enables BRCAPRO to extract more useful information from male relatives. Using 1500 families from the Cancer Genetics Network, accounting for surviving other cancers improves BRCAPRO's concordance index from 0.758 to 0.762 (p = 0.046), improves its positive predictive value from 35% to 39% (p < 10^-6) without impacting its negative predictive value, and improves its overall calibration, although calibration slightly worsens for those with carrier probability < 10%. PMID:18407567

  16. Multiple diseases in carrier probability estimation: accounting for surviving all cancers other than breast and ovary in BRCAPRO.

    PubMed

    Katki, Hormuzd A; Blackford, Amanda; Chen, Sining; Parmigiani, Giovanni

    2008-09-30

    Mendelian models can predict who carries an inherited deleterious mutation of known disease genes based on family history. For example, the BRCAPRO model is commonly used to identify families who carry mutations of BRCA1 and BRCA2, based on familial breast and ovarian cancers. These models incorporate the age of diagnosis of diseases in relatives and current age or age of death. We develop a rigorous foundation for handling multiple diseases with censoring. We prove that any disease unrelated to mutations can be excluded from the model, unless it is sufficiently common and dependent on a mutation-related disease time. Furthermore, if a family member has a disease with higher probability density among mutation carriers, but the model does not account for it, then the carrier probability is deflated. However, even if a family only has diseases the model accounts for, if the model excludes a mutation-related disease, then the carrier probability will be inflated. In light of these results, we extend BRCAPRO to account for surviving all non-breast/ovary cancers as a single outcome. The extension also enables BRCAPRO to extract more useful information from male relatives. Using 1500 families from the Cancer Genetics Network, accounting for surviving other cancers improves BRCAPRO's concordance index from 0.758 to 0.762 (p=0.046), improves its positive predictive value from 35 to 39 per cent (p<10(-6)) without impacting its negative predictive value, and improves its overall calibration, although calibration slightly worsens for those with carrier probability<10 per cent. Copyright (c) 2008 John Wiley & Sons, Ltd.
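    At its core, a Mendelian carrier model is an application of Bayes' rule; the minimal worked example below (not BRCAPRO itself) shows how a prior carrier probability and family-history likelihoods combine into a posterior, and hence why omitting a mutation-related disease from the likelihoods shifts the resulting carrier probability. All numbers are illustrative.

```python
# Bayes' rule for a carrier probability given observed family history.
prior_carrier = 0.01
lik_history_if_carrier = 0.30       # P(observed family history | carrier), assumed
lik_history_if_noncarrier = 0.02    # P(observed family history | non-carrier), assumed

posterior = (prior_carrier * lik_history_if_carrier) / (
    prior_carrier * lik_history_if_carrier
    + (1 - prior_carrier) * lik_history_if_noncarrier)
print(f"posterior carrier probability: {posterior:.2%}")   # ~13%
```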

  17. Disease Management: The Need for a Focus on Broader Self-Management Abilities and Quality of Life

    PubMed Central

    Nieboer, Anna Petra

    2015-01-01

    The study objective was to investigate long-term effects of disease management programs (DMPs) on (1) health behaviors (smoking, physical exercise); (2) self-management abilities (self-efficacy, investment behavior, initiative taking); and (3) physical and mental quality of life among chronically ill patients. The study also examined whether (changes in) health behaviors and self-management abilities predicted quality of life. Questionnaires were sent to all 5076 patients participating in 18 Dutch DMPs in 2010 (T0; 2676 [53%] respondents). Two years later (T1), questionnaires were sent to 4350 patients still participating in DMPs (1722 [40%] respondents). Structured interviews were held with the 18 DMP project leaders. DMP implementation improved patients' health behavior and physical quality of life, but mental quality of life and self-management abilities declined over time. Changes in patients' investment behavior predicted physical quality of life at T1 (P<.001); physical activity, investment behavior (both P<.05), and self-efficacy (P<.01) at T0, and changes in self-efficacy and investment behavior (both P<.001) predicted patients' mental quality of life at T1. The long-term benefits of these DMPs include successful improvement of chronically ill patients' health behaviors and physical quality of life. However, these programs were not able to improve or maintain broader self-management abilities or mental quality of life, highlighting the need to focus on these abilities and overall quality of life. As coproducers of care, patients should be stimulated and enabled to manage their health and quality of life. (Population Health Management 2015;18:246–255) PMID:25607246

  18. Disease Management: The Need for a Focus on Broader Self-Management Abilities and Quality of Life.

    PubMed

    Cramm, Jane Murray; Nieboer, Anna Petra

    2015-08-01

    The study objective was to investigate long-term effects of disease management programs (DMPs) on (1) health behaviors (smoking, physical exercise); (2) self-management abilities (self-efficacy, investment behavior, initiative taking); and (3) physical and mental quality of life among chronically ill patients. The study also examined whether (changes in) health behaviors and self-management abilities predicted quality of life. Questionnaires were sent to all 5076 patients participating in 18 Dutch DMPs in 2010 (T0; 2676 [53%] respondents). Two years later (T1), questionnaires were sent to 4350 patients still participating in DMPs (1722 [40%] respondents). Structured interviews were held with the 18 DMP project leaders. DMP implementation improved patients' health behavior and physical quality of life, but mental quality of life and self-management abilities declined over time. Changes in patients' investment behavior predicted physical quality of life at T1 (P<.001); physical activity, investment behavior (both P<.05), and self-efficacy (P<.01) at T0, and changes in self-efficacy and investment behavior (both P<.001) predicted patients' mental quality of life at T1. The long-term benefits of these DMPs include successful improvement of chronically ill patients' health behaviors and physical quality of life. However, these programs were not able to improve or maintain broader self-management abilities or mental quality of life, highlighting the need to focus on these abilities and overall quality of life. As coproducers of care, patients should be stimulated and enabled to manage their health and quality of life.

  19. Climate extremes in the Pacific: improving seasonal prediction of tropical cyclones and extreme ocean temperatures to improve resilience

    NASA Astrophysics Data System (ADS)

    Kuleshov, Y.; Jones, D.; Spillman, C. M.

    2012-04-01

    Climate change and climate extremes have a major impact on Australia and Pacific Island countries. Of particular concern are tropical cyclones and extreme ocean temperatures, the first being the most destructive events for terrestrial systems, while the latter has the potential to devastate ocean ecosystems through coral bleaching. As a practical response to climate change, under the Pacific-Australia Climate Change Science and Adaptation Planning program (PACCSAP), we are developing enhanced web-based information tools for providing seasonal forecasts for climatic extremes in the Western Pacific. Tropical cyclones are the most destructive weather systems that impact on coastal areas. Interannual variability in the intensity and distribution of tropical cyclones is large, and presently greater than any trends that are ascribable to climate change. In a warming environment, predicting tropical cyclone occurrence from historical relationships is increasingly difficult, with predictors such as sea surface temperatures (SSTs) now frequently lying outside the range of past variability, meaning that it is not possible to find historical analogues for the seasonal conditions often faced by Pacific countries. Elevated SSTs are the primary trigger for mass coral bleaching events, which can lead to widespread damage and mortality on reef systems. Degraded coral reefs present many problems, including long-term loss of tourism and potential loss or degradation of fisheries. The monitoring and prediction of thermal stress events enables the support of a range of adaptive and management activities that could improve reef resilience to extreme conditions. Using the climate model POAMA (Predictive Ocean-Atmosphere Model for Australia), we aim to improve the accuracy of seasonal forecasts of tropical cyclone activity and extreme SSTs for the regions of the Western Pacific. Improved knowledge of extreme climatic events, with the assistance of tailored forecast tools, will help enhance the resilience and adaptive capacity of Australia and Pacific Island countries under climate change. Acknowledgement: The research discussed in this paper was conducted with the support of the PACCSAP program, supported by AusAID and the Department of Climate Change and Energy Efficiency, and delivered by the Bureau of Meteorology and CSIRO.

  20. Basic state lower-tropospheric humidity distribution: key to successful simulation and prediction of the Madden-Julian oscillation

    NASA Astrophysics Data System (ADS)

    Kim, D.; Ahn, M. S.; DeMott, C. A.; Jiang, X.; Klingaman, N. P.; Kim, H. M.; Lee, J. H.; Lim, Y.; Xavier, P. K.

    2017-12-01

    The Madden-Julian Oscillation (MJO) influences the global weather-climate system, thereby providing a source of predictability on intraseasonal timescales worldwide. An accurate representation of the MJO, however, is still one of the most challenging tasks for many contemporary global climate models (GCMs). Identifying aspects of the GCMs that are tightly linked to their MJO simulation capability is a step toward improving the GCM representation of the MJO. This study surveys recent modeling work which collectively evidences that the horizontal distribution of the basic state low-tropospheric humidity is crucial to a successful simulation and prediction of the MJO. Specifically, the simulated zonal and meridional gradients of the mean low-tropospheric humidity determine the magnitude of the moistening (drying) to the east (west) of the enhanced MJO convection, thereby enabling or disabling the eastward propagation of the MJO. Supporting this argument, many GCMs that fail to simulate the MJO also exhibit biases in the mean humidity that weaken the horizontal moisture gradient. Also, the MJO prediction skill of the S2S models is tightly related to the biases in the mean moisture gradient. Implications of the robust relationship between the MJO and the mean state for MJO modeling and prediction are discussed.

  1. Systematically evaluating read-across prediction and ...

    EPA Pesticide Factsheets

    Read-across is a popular data gap filling technique within category and analogue approaches for regulatory purposes. Acceptance of read-across remains an ongoing challenge with several efforts underway for identifying and addressing uncertainties. Here we demonstrate an algorithmic, automated approach to evaluate the utility of using in vitro bioactivity data (“bioactivity descriptors”, from EPA's ToxCast program) in conjunction with chemical descriptor information to derive local validity domains (specific sets of nearest neighbors) to facilitate read-across for a number of in vivo repeated dose toxicity study types. Over 3400 different chemical structure descriptors were generated for a set of 976 chemicals and supplemented with the outcomes from 821 in vitro assays. The read-across prediction for a given chemical was based on the similarity-weighted endpoint outcomes of its nearest neighbors. The approach enabled a performance baseline for read-across predictions of specific study outcomes to be established. Bioactivity descriptors were often found to be more predictive of in vivo toxicity outcomes than chemical descriptors or a combination of both. The approach shows promise as part of a screening assessment in the absence of prior knowledge. Future work will investigate to what extent encoding expert knowledge leads to an improvement in read-across prediction.
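    The similarity-weighted neighbor prediction can be sketched as follows; the cosine-similarity measure, the random descriptor matrix, and the k = 5 neighborhood are illustrative assumptions rather than the exact validity-domain construction used in the study.

```python
# Hedged sketch of similarity-weighted read-across: predict a binary study
# outcome for a target chemical from its k most similar analogues in a
# combined chemical + bioactivity descriptor space. Simulated data.
import numpy as np

def read_across(target, descriptors, outcomes, k=5):
    """Similarity-weighted vote of the k nearest neighbours (cosine similarity)."""
    sims = descriptors @ target / (
        np.linalg.norm(descriptors, axis=1) * np.linalg.norm(target) + 1e-12)
    nn = np.argsort(sims)[-k:]                  # k most similar analogues
    return np.average(outcomes[nn], weights=sims[nn])

rng = np.random.default_rng(7)
desc = rng.random((976, 50))                    # stand-in descriptor matrix
tox = (desc[:, 0] > 0.5).astype(float)          # toy in vivo outcome
pred = read_across(desc[0], desc[1:], tox[1:])
print(f"predicted probability of a positive outcome: {pred:.2f}")
```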

  2. Metabolic Engineering for the Production of Natural Products

    PubMed Central

    Pickens, Lauren B.; Tang, Yi; Chooi, Yit-Heng

    2014-01-01

    Natural products and natural product derived compounds play an important role in modern healthcare as frontline treatments for many diseases and as inspiration for chemically synthesized therapeutics. With advances in sequencing and recombinant DNA technology, many of the biosynthetic pathways responsible for the production of these chemically complex and pharmaceutically valuable compounds have been elucidated. With an ever expanding toolkit of biosynthetic components, metabolic engineering is an increasingly powerful method to improve natural product titers and generate novel compounds. Heterologous production platforms have enabled access to pathways from difficult to culture strains; systems biology and metabolic modeling tools have resulted in increasing predictive and analytic capabilities; advances in expression systems and regulation have enabled the fine-tuning of pathways for increased efficiency, and characterization of individual pathway components has facilitated the construction of hybrid pathways for the production of new compounds. These advances in the many aspects of metabolic engineering have not only yielded fascinating scientific discoveries but also make it an increasingly viable approach for the optimization of natural product biosynthesis. PMID:22432617

  3. Temporary Shell Proof-of-Concept Technique: Digital-Assisted Workflow to Enable Customized Immediate Function in Two Visits in Partially Edentulous Patients

    PubMed

    Pozzi, Alessandro; Arcuri, Lorenzo; Moy, Peter K

    2018-03-01

    The growing interest in minimally invasive implant placement and delivery of a prefabricated provisional prosthesis immediately, thus minimizing "time to teeth," has led to the development of numerous 3-dimensional (3D) planning software programs. Given the enhancements associated with fully digital workflows, such as better 3D soft-tissue visualization and virtual tooth rendering, computer-guided implant surgery and immediate function has become an effective and reliable procedure. This article describes how modern implant planning software programs provide a comprehensive digital platform that enables efficient interplay between the surgical and restorative aspects of implant treatment. These new technologies that streamline the overall digital workflow allow transformation of the digital wax-up into a personalized, CAD/CAM-milled provisional restoration. Thus, collaborative digital workflows provide a novel approach for time-efficient delivery of a customized, screw-retained provisional restoration on the day of implant surgery, resulting in improved predictability for immediate function in the partially edentulous patient.

  4. Multivariate Copula Analysis Toolbox (MvCAT): Describing dependence and underlying uncertainty using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir

    2017-06-01

    We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves the description of the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
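    As a hedged sketch of Bayesian copula inference (MvCAT itself uses a residual-based Gaussian likelihood and a hybrid-evolution MCMC), the example below samples the posterior of a Clayton copula parameter with a plain random-walk Metropolis algorithm on simulated pseudo-observations.

```python
# Random-walk Metropolis over a Clayton copula parameter theta, flat prior.
# Not MvCAT's algorithm; a minimal illustration of posterior sampling.
import numpy as np

def clayton_loglik(theta, u, v):
    if theta <= 0:
        return -np.inf
    s = u ** -theta + v ** -theta - 1.0
    return np.sum(np.log1p(theta) - (1 + theta) * (np.log(u) + np.log(v))
                  - (2 + 1 / theta) * np.log(s))

rng = np.random.default_rng(8)
# Simulate Clayton(theta = 2) data via the conditional-inversion method.
u = rng.random(500); w = rng.random(500)
v = ((w ** (-2 / (2 + 1)) - 1) * u ** -2 + 1) ** (-1 / 2)

theta, chain = 1.0, []
for _ in range(5000):
    prop = theta + 0.3 * rng.standard_normal()      # random-walk proposal
    if np.log(rng.random()) < clayton_loglik(prop, u, v) - clayton_loglik(theta, u, v):
        theta = prop
    chain.append(theta)
print(f"posterior mean theta: {np.mean(chain[1000:]):.2f}")   # near the true 2.0
```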

  5. Diagnostic Reasoning using Prognostic Information for Unmanned Aerial Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Roychoudhury, Indranil; Kulkarni, Chetan

    2015-01-01

    With the increasing popularity of unmanned aircraft, continuous monitoring of their systems, software, and health status is becoming more and more important to ensure safe, correct, and efficient operation and fulfillment of missions. The paper presents the integration of prognosis models and prognostic information with the R2U2 (REALIZABLE, RESPONSIVE, and UNOBTRUSIVE Unit) monitoring and diagnosis framework. This integration makes statistically reliable predictions of future health status available at a much earlier time, enabling autonomous decision making. The prognostic information can be used in the R2U2 model to improve diagnostic accuracy and enable decisions to be made at the present time to deal with events in the future. This is an advancement over the current state of the art, where temporal logic observers can only perform such valuation at the end of the time interval. The usefulness and effectiveness of this integrated diagnostics and prognostics framework were demonstrated using simulation experiments with the NASA Dragon Eye electric unmanned aircraft.

  6. A Novel Way to Measure and Predict Development: A Heuristic Approach to Facilitate the Early Detection of Neurodevelopmental Disorders.

    PubMed

    Marschik, Peter B; Pokorny, Florian B; Peharz, Robert; Zhang, Dajie; O'Muircheartaigh, Jonathan; Roeyers, Herbert; Bölte, Sven; Spittle, Alicia J; Urlesberger, Berndt; Schuller, Björn; Poustka, Luise; Ozonoff, Sally; Pernkopf, Franz; Pock, Thomas; Tammimies, Kristiina; Enzinger, Christian; Krieber, Magdalena; Tomantschger, Iris; Bartl-Pokorny, Katrin D; Sigafoos, Jeff; Roche, Laura; Esposito, Gianluca; Gugatschka, Markus; Nielsen-Saines, Karin; Einspieler, Christa; Kaufmann, Walter E

    2017-05-01

    Substantial research exists focusing on the various aspects and domains of early human development. However, there is a clear blind spot in early postnatal development when dealing with neurodevelopmental disorders, especially those that manifest themselves clinically only in late infancy or even in childhood. This early developmental period may represent an important timeframe to study these disorders but has historically received far less research attention. We believe that only a comprehensive interdisciplinary approach will enable us to detect and delineate specific parameters for specific neurodevelopmental disorders at a very early age to improve early detection/diagnosis, enable prospective studies and eventually facilitate randomised trials of early intervention. In this article, we propose a dynamic framework for characterising neurofunctional biomarkers associated with specific disorders in the development of infants and children. We have named this automated detection 'Fingerprint Model', suggesting one possible approach to accurately and early identify neurodevelopmental disorders.

  7. The role of markup for enabling interoperability in health informatics.

    PubMed

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage, both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact of these technologies on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion of future possibilities that big data and further standardization will enable.

  8. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuniga, Cristal; Li, Chien -Ting; Huelsman, Tyler

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Moreover, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine.

  9. How Structure Defines Affinity in Protein-Protein Interactions

    PubMed Central

    Erijman, Ariel; Rosenthal, Eran; Shifman, Julia M.

    2014-01-01

    Protein-protein interactions (PPI) in nature are conveyed by a multitude of binding modes involving various surfaces, secondary structure elements and intermolecular interactions. This diversity results in PPI binding affinities that span more than nine orders of magnitude. Several early studies attempted to correlate PPI binding affinities to various structure-derived features with limited success. The growing number of high-resolution structures, the appearance of more precise methods for measuring binding affinities and the development of new computational algorithms enable more thorough investigations in this direction. Here, we use a large dataset of PPI structures with documented binding affinities to calculate a number of structure-based features that could potentially define binding energetics. We explore how well each calculated biophysical feature alone correlates with binding affinity and determine the features that could be used to distinguish between high-, medium- and low-affinity PPIs. Furthermore, we test how various combinations of features could be applied to predict binding affinity and observe a slow improvement in correlation as more features are incorporated into the equation. In addition, we observe a considerable improvement in predictions if we exclude from our analysis low-resolution and NMR structures, revealing the importance of capturing exact intermolecular interactions in our calculations. Our analysis should facilitate prediction of new interactions on the genome scale, better characterization of signaling networks and design of novel binding partners for various target proteins. PMID:25329579
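    The single-feature-versus-combination analysis can be sketched as below; the stand-in features (buried surface area, interfacial hydrogen bonds) and their effect sizes are assumptions chosen only to illustrate the correlate-then-combine workflow, not values from the study's dataset.

```python
# Correlate individual structure-derived features with binding affinity,
# then combine them in a linear model. Synthetic stand-in data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(9)
n = 120
buried_sasa = rng.normal(1600, 400, n)       # A^2, hypothetical
h_bonds = rng.poisson(8, n).astype(float)
dg = -0.004 * buried_sasa - 0.3 * h_bonds + rng.normal(0, 1.2, n)  # kcal/mol

for name, feat in [("buried SASA", buried_sasa), ("H-bonds", h_bonds)]:
    print(f"{name}: r = {np.corrcoef(feat, dg)[0, 1]:+.2f}")

X = np.column_stack([buried_sasa, h_bonds])
r2 = LinearRegression().fit(X, dg).score(X, dg)
print(f"combined linear model R^2 = {r2:.2f}")
```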

  10. Combining correlative and mechanistic habitat suitability models to improve ecological compensation.

    PubMed

    Meineri, Eric; Deville, Anne-Sophie; Grémillet, David; Gauthier-Clerc, Michel; Béchet, Arnaud

    2015-02-01

    Only a few studies have shown positive impacts of ecological compensation on species dynamics affected by human activities. We argue that this is due to inappropriate methods used to forecast required compensation in environmental impact assessments. These assessments are mostly descriptive and only valid at limited spatial and temporal scales. However, habitat suitability models developed to predict the impacts of environmental changes on potential species' distributions should provide rigorous science-based tools for compensation planning. Here we describe the two main classes of predictive models: correlative models and individual-based mechanistic models. We show how these models can be used alone or synoptically to improve compensation planning. While correlative models are easier to implement, they tend to ignore underlying ecological processes and lack accuracy. On the contrary, individual-based mechanistic models can integrate biological interactions, dispersal ability and adaptation. Moreover, among mechanistic models, those considering animal energy balance are particularly efficient at predicting the impact of foraging habitat loss. However, mechanistic models require more field data compared to correlative models. Hence we present two approaches which combine both methods for compensation planning, especially in relation to the spatial scale considered. We show how the availability of biological databases and software enabling fast and accurate population projections could be advantageously used to assess ecological compensation requirement efficiently in environmental impact assessments. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.
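    A correlative habitat suitability model of the kind contrasted above reduces, in its simplest form, to a regression of presence/absence on environmental covariates; the covariates and coefficients in this sketch are invented for illustration, and a mechanistic, individual-based model would replace the fitted relationship with explicit process rules.

```python
# Minimal correlative species distribution model: logistic regression of
# presence/absence on environmental covariates, projected onto new conditions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(10)
temp = rng.normal(15, 4, 500)                 # mean temperature per site
wetness = rng.random(500)                     # soil-moisture index
p_presence = 1 / (1 + np.exp(-(-6 + 0.35 * temp + 2 * wetness)))
presence = rng.random(500) < p_presence

X = np.column_stack([temp, wetness])
sdm = LogisticRegression().fit(X, presence)
new_site = [[18.0, 0.7]]                      # projected future conditions
print(f"predicted suitability: {sdm.predict_proba(new_site)[0, 1]:.2f}")
```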

  11. Coupling Protein Side-Chain and Backbone Flexibility Improves the Re-design of Protein-Ligand Specificity.

    PubMed

    Ollikainen, Noah; de Jong, René M; Kortemme, Tanja

    2015-01-01

    Interactions between small molecules and proteins play critical roles in regulating and facilitating diverse biological functions, yet our ability to accurately re-engineer the specificity of these interactions using computational approaches has been limited. One main difficulty, in addition to inaccuracies in energy functions, is the exquisite sensitivity of protein-ligand interactions to subtle conformational changes, coupled with the computational problem of sampling the large conformational search space of degrees of freedom of ligands, amino acid side chains, and the protein backbone. Here, we describe two benchmarks for evaluating the accuracy of computational approaches for re-engineering protein-ligand interactions: (i) prediction of enzyme specificity altering mutations and (ii) prediction of sequence tolerance in ligand binding sites. After finding that current state-of-the-art "fixed backbone" design methods perform poorly on these tests, we develop a new "coupled moves" design method in the program Rosetta that couples changes to protein sequence with alterations in both protein side-chain and protein backbone conformations, and allows for changes in ligand rigid-body and torsion degrees of freedom. We show significantly increased accuracy in both predicting ligand specificity altering mutations and binding site sequences. These methodological improvements should be useful for many applications of protein-ligand design. The approach also provides insights into the role of subtle conformational adjustments that enable functional changes not only in engineering applications but also in natural protein evolution.

  12. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions

    DOE PAGES

    Zuniga, Cristal; Li, Chien -Ting; Huelsman, Tyler; ...

    2016-07-02

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Moreover, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine.

  13. Molecular classification of gastric adenocarcinoma: translating new insights from the cancer genome atlas research network.

    PubMed

    Sunakawa, Yu; Lenz, Heinz-Josef

    2015-04-01

    Gastric cancer is a heterogeneous cancer that may be classified into several distinct subtypes based on pathology and epidemiology, each with different initiating pathological processes and each possibly having a different tumor biology. A classification of gastric cancer is important for selecting patients who can benefit from targeted therapies and for predicting prognosis precisely. The Cancer Genome Atlas (TCGA) study corroborated previous reports on subtyping gastric cancer but also proposed a refined classification based on molecular characteristics. Adding the new molecular classification strategy to current classical subtyping may be a promising option, particularly stratification by Epstein-Barr virus (EBV) and microsatellite instability (MSI) status. According to the TCGA study, patients with EBV-positive gastric cancer may benefit from programmed cell death protein 1 (PD-1)/programmed death-ligand 1 (PD-L1) antibodies or phosphoinositide 3-kinase (PI3K) inhibitors, which are now in development. The discovery of predictive biomarkers should improve patient care and individualized medicine in gastric cancer management, since targeted therapies have the potential to change the treatment landscape, leading both to a better understanding of tumor heterogeneity and to better outcomes. Patient enrichment by predictive biomarkers for new treatment strategies will be critical to improving clinical outcomes. Additionally, liquid biopsies may enable real-time monitoring of molecular escape mechanisms, resulting in better treatment strategies.

  14. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions.

    PubMed

    Zuñiga, Cristal; Li, Chien-Ting; Huelsman, Tyler; Levering, Jennifer; Zielinski, Daniel C; McConnell, Brian O; Long, Christopher P; Knoshaug, Eric P; Guarnieri, Michael T; Antoniewicz, Maciek R; Betenbaugh, Michael J; Zengler, Karsten

    2016-09-01

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Furthermore, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine.

  15. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions1

    PubMed Central

    Zuñiga, Cristal; Li, Chien-Ting; Zielinski, Daniel C.; Guarnieri, Michael T.; Antoniewicz, Maciek R.; Zengler, Karsten

    2016-01-01

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Furthermore, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine. PMID:27372244

  16. Facing Global Challenges with Materials Innovation

    NASA Astrophysics Data System (ADS)

    Rizzo, Fernando

    2017-10-01

    The path of societal evolution has long been associated with a growing demand for natural resources and continuous environmental degradation. During the last decades this pace has accelerated considerably, despite general concern about the legacy being left for the next generations. Looking ahead, the predicted growth of the world population and the improvement of living conditions in most regions point to an increasing demand for energy generation, resulting in additional pressure on the Earth's sustainability. Materials have had a key role in decreasing the use of natural resources, by either improving the efficiency of existing technologies or enabling the development of radically new ones. The greenhouse effect (CO2 emissions) and the energy crisis are global challenges that can benefit from the development of new materials, both for the successful implementation of promising technologies and for the imperative replacement of fossil fuels by renewable sources.

  17. A Geosynchronous Lidar System for Atmospheric Winds and Moisture Measurements

    NASA Technical Reports Server (NTRS)

    Emmitt, G. D.

    2001-01-01

    An observing system comprising two lidars in geosynchronous orbit would enable the synoptic- and meso-scale measurement of atmospheric winds and moisture, both of which are key first-order variables of the Earth's weather equation. Simultaneous measurement of these parameters at fast revisit rates promises large advances in our weather prediction skill. Such capabilities would be unprecedented and would (a) yield greatly improved and finer-resolution initial conditions for models, (b) make existing costly and cumbersome measurement approaches obsolete, and (c) obviate the numerical techniques needed to correct data obtained with present observing systems. Additionally, simultaneous synoptic wind and moisture observations would lead to improvements in model parameterizations and in our knowledge of small-scale weather processes. Technology and science data product assessments are ongoing. Results will be presented during the conference.

  18. Taking a fresh look at boiling heat transfer on the road to improved nuclear economics and efficiency

    DOE PAGES

    Pointer, William David; Baglietto, Emilio

    2016-05-01

    Here, in the effort to reinvigorate innovation in the way we design, build, and operate the nuclear power generating stations of today and tomorrow, nothing can be taken for granted. Not even the seemingly familiar physics of boiling water. The Consortium for Advanced Simulation of Light Water Reactors, or CASL, is focused on the deployment of advanced modeling and simulation capabilities to enable the nuclear industry to reduce uncertainties in the prediction of multi-physics phenomena and continue to improve the performance of today’s Light Water Reactors and their fuel. An important part of the CASL mission is the development of a next-generation thermal-hydraulics simulation capability, integrating the history of engineering models based on experimental experience with the computing technology of the future.

  19. Materials International Space Station Experiment (MISSE): Overview, Accomplishments and Future Needs

    NASA Technical Reports Server (NTRS)

    deGroh, Kim K.; Jaworske, Donald A.; Pippin, Gary; Jenkins, Philip P.; Walters, Robert J.; Thibeault, Sheila A.; Palusinski, Iwona; Lorentzen, Justin R.

    2014-01-01

    Materials and devices used on the exterior of spacecraft in low Earth orbit (LEO) are subjected to environmental threats that can cause degradation in material properties, possibly threatening spacecraft mission success. These threats include: atomic oxygen (AO), ultraviolet and x-ray radiation, charged particle radiation, temperature extremes and thermal cycling, micrometeoroid and debris impacts, and contamination. Space environmental threats vary greatly based on spacecraft materials, thicknesses and stress levels, and the mission environment and duration. For more than a decade the Materials International Space Station Experiment (MISSE) has enabled the study of the long duration environmental durability of spacecraft materials in the LEO environment. The overall objective of MISSE is to test the stability and durability of materials and devices in the space environment in order to gain valuable knowledge on the performance of materials in space, as well as to enable lifetime predictions of new materials that may be used in future space flight. MISSE is a series of materials flight experiments, which are attached to the exterior of the International Space Station (ISS). Individual experiments were loaded onto suitcase-like trays, called Passive Experiment Containers (PECs). The PECs were transported to the ISS in the Space Shuttle cargo bay and attached to, and removed from, the ISS during extravehicular activities (EVAs). The PECs were retrieved after one or more years of space exposure and returned to Earth, enabling post-flight experiment evaluation. MISSE is a multi-organization project with participants from the National Aeronautics and Space Administration (NASA), the Department of Defense (DoD), industry and academia. MISSE has provided a platform for environmental durability studies for thousands of samples and numerous devices, and it has produced many tangible impacts. Ten PECs (and one smaller tray) have been flown, representing MISSE 1 through MISSE 8, yielding long-duration space environmental performance and durability data that enable material validation, processing recertification and space qualification; improved predictions of materials and component lifetimes in space; model verification and development; and correlation factors between space exposure and ground facilities, enabling more accurate in-space performance predictions based on ground-laboratory testing. A few of the many experiment results and observations, and their impacts, are provided. Those highlighted include examples of improved understanding of atomic oxygen scattering mechanisms, LEO coating durability results, and polymer erosion yields and their impacts on spacecraft design. The MISSE 2 Atomic Oxygen Scattering Chamber Experiment found that the peak flux of scattered AO occurs at 45 deg from normal incidence, rather than following the model-predicted cosine dependence. In addition, the erosion yield (E_y) of Kapton H for AO scattered off oxidized Al is 22% of the E_y of direct AO impingement. These results were used to help determine the degradation mechanism of a cesium iodide detector within the Hubble Space Telescope Cosmic Origins Spectrograph Experiment. The MISSE 6 Indium Tin Oxide (ITO) Degradation Experiment measured the surface electrical resistance of ram and wake ITO-coated samples. The data confirmed that ITO is a stable AO protective coating, and the results validated the durability of ITO conductive coatings for solar arrays for the Atmosphere-Space Transition 2 Explorer program. The MISSE 2, 6 and 7 Polymer Experiments have provided LEO AO E_y data on over 120 polymer and composite samples. The flight E_y values were found to range from 3.05 × 10^-26 cm^3/atom for the AO-resistant polymer CORIN to 9.14 × 10^-26 cm^3/atom for polyoxymethylene (POM). In addition, flying the same polymers on different missions has advanced the understanding of the AO E_y dependency on solar exposure for polymers containing fluorine. The MISSE polymer results are highly requested and have impacted spacecraft design for WorldView-2 & -3, the Global Precipitation Measurement-Microwave Imager, and other spacecraft. The flight data have enabled the development of an Atomic Oxygen Erosion Predictive Tool that allows the erosion prediction of new and non-flown polymers. The data have also been used to develop a new NASA Technical Standards Handbook "Spacecraft Polymers Atomic Oxygen Durability Handbook." Many intangible benefits have also been derived from MISSE. For example, over 40 students have collaborated on Glenn's MISSE experiments, which have resulted in greater than $80K in student scholarships and awards in national and international science fairs. Students have also given presentations and won poster competition awards at international space conferences.

  20. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030.

    PubMed

    Slotnick, Jeffrey P; Khodadoust, Abdollah; Alonso, Juan J; Darmofal, David L; Gropp, William D; Lurie, Elizabeth A; Mavriplis, Dimitri J; Venkatakrishnan, Venkat

    2014-08-13

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be 'cleaner' and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future.

  1. The prognostic value of CT radiomic features for patients with pulmonary adenocarcinoma treated with EGFR tyrosine kinase inhibitors

    PubMed Central

    Kim, Hyungjin; Park, Sang Joon; Kim, Miso; Kim, Tae Min; Kim, Dong-Wan; Heo, Dae Seog; Goo, Jin Mo

    2017-01-01

    Purpose: To determine whether radiomic features on CT can predict progression-free survival (PFS) in epidermal growth factor receptor (EGFR)-mutant adenocarcinoma patients treated with first-line EGFR tyrosine kinase inhibitors (TKIs), and to identify the incremental value of radiomic features over conventional clinical factors in PFS prediction. Methods: In this institutional review board-approved retrospective study, pretreatment contrast-enhanced CT and the first follow-up CT after initiation of TKIs were analyzed in 48 patients (M:F = 23:25; median age: 61 years). Radiomic features at baseline, at the first follow-up, and the percentage change between the two were determined. A Cox regression model was used to predict PFS with nonredundant radiomic features and clinical factors, respectively. The incremental value of radiomic features over the clinical factors in PFS prediction was also assessed by way of the concordance index. Results: Roundness (HR: 3.91; 95% CI: 1.72, 8.90; P = 0.001) and grey-level nonuniformity (HR: 3.60; 95% CI: 1.80, 7.18; P < 0.001) were independent predictors of PFS. Among clinical factors, patient age (HR: 2.11; 95% CI: 1.01, 4.39; P = 0.046), baseline tumor diameter (HR: 1.03; 95% CI: 1.01, 1.05; P = 0.002), and treatment response (HR: 0.46; 95% CI: 0.24, 0.87; P = 0.017) were independent predictors. The addition of radiomic features to clinical factors significantly improved predictive performance (concordance index: combined model = 0.77, clinical-only model = 0.69, P < 0.001). Conclusions: Radiomic features enable PFS estimation in EGFR-mutant adenocarcinoma patients treated with first-line EGFR TKIs. Radiomic features combined with clinical factors provide a significant improvement in prognostic performance compared with clinical factors alone. PMID:29099855
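
    The analysis pattern described here (Cox regression on radiomic plus clinical covariates, compared by concordance index) can be sketched with the lifelines library as below; the CSV file and all column names are invented placeholders, not the study's data.

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

df = pd.read_csv("radiomics_pfs.csv")  # hypothetical: one row per patient

clinical = ["age", "baseline_diameter_mm", "treatment_response"]
radiomic = ["roundness_change_pct", "glnu_change_pct"]

def fit_and_score(covariates):
    cph = CoxPHFitter()
    cph.fit(df[covariates + ["pfs_months", "progressed"]],
            duration_col="pfs_months", event_col="progressed")
    # Negate the hazard so that higher scores correspond to longer survival.
    return concordance_index(df["pfs_months"],
                             -cph.predict_partial_hazard(df),
                             df["progressed"])

print("clinical only:", fit_and_score(clinical))
print("combined:     ", fit_and_score(clinical + radiomic))
```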

  2. Machine-Learning-Based Electronic Triage More Accurately Differentiates Patients With Respect to Clinical Outcomes Compared With the Emergency Severity Index.

    PubMed

    Levin, Scott; Toerper, Matthew; Hamrock, Eric; Hinson, Jeremiah S; Barnes, Sean; Gardner, Heather; Dugas, Andrea; Linton, Bob; Kirsch, Tom; Kelen, Gabor

    2018-05-01

    Standards for emergency department (ED) triage in the United States rely heavily on subjective assessment and are limited in their ability to risk-stratify patients. This study seeks to evaluate an electronic triage system (e-triage) based on machine learning that predicts the likelihood of acute outcomes, enabling improved patient differentiation. A multisite, retrospective, cross-sectional study of 172,726 ED visits from urban and community EDs was conducted. E-triage is composed of a random forest model applied to triage data (vital signs, chief complaint, and active medical history) that predicts the need for critical care, an emergency procedure, and inpatient hospitalization in parallel, and translates risk to triage level designations. Predicted outcomes and secondary outcomes of elevated troponin and lactate levels were evaluated and compared with the Emergency Severity Index (ESI). E-triage predictions had an area under the curve ranging from 0.73 to 0.92 and demonstrated equivalent or improved identification of clinical patient outcomes compared with ESI at both EDs. E-triage provided a rationale for risk-based differentiation of the more than 65% of ED visits triaged to ESI level 3. Matching the ESI patient distribution for comparison, e-triage identified more than 10% (14,326 patients) of ESI level 3 patients requiring up-triage who had substantially increased risk of critical care or an emergency procedure (1.7% ESI level 3 versus 6.2% up-triaged) and hospitalization (18.9% versus 45.4%) across EDs. E-triage more accurately classifies ESI level 3 patients and highlights opportunities to use predictive analytics to support triage decision making. Further prospective validation is needed.
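
    A minimal sketch of the e-triage idea follows: a random forest over triage-time features predicting several acute outcomes in parallel. The data file, feature names, and outcome names are illustrative, not the study's actual variables.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("ed_visits.csv")  # hypothetical extract of triage data
X = df[["age", "heart_rate", "sbp", "resp_rate", "spo2", "temp_c"]]
Y = df[["critical_care", "emergency_procedure", "admitted"]]  # 0/1 outcomes

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

# RandomForestClassifier natively handles multi-output classification,
# so the three outcomes are predicted in parallel from one model.
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, Y_tr)

for i, outcome in enumerate(Y.columns):
    proba = rf.predict_proba(X_te)[i][:, 1]  # one probability array per outcome
    print(outcome, "AUC:", round(roc_auc_score(Y_te[outcome], proba), 3))
```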

  3. Predicting disulfide connectivity from protein sequence using multiple sequence feature vectors and secondary structure.

    PubMed

    Song, Jiangning; Yuan, Zheng; Tan, Hao; Huber, Thomas; Burrage, Kevin

    2007-12-01

    Disulfide bonds are primary covalent crosslinks between two cysteine residues in proteins that play critical roles in stabilizing protein structures and are commonly found in extracytoplasmic or secreted proteins. In protein folding prediction, the localization of disulfide bonds can greatly reduce the search in conformational space. Therefore, there is a great need to develop computational methods capable of accurately predicting disulfide connectivity patterns in proteins, which could have important applications. We have developed a novel method to predict disulfide connectivity patterns from protein primary sequence, using a support vector regression (SVR) approach based on multiple sequence feature vectors and secondary structure predicted by the PSIPRED program. The results indicate that our method achieves a prediction accuracy of 74.4% and 77.9%, respectively, when averaged over proteins with two to five disulfide bridges using 4-fold cross-validation, measured at the protein and cysteine-pair levels on a well-defined non-homologous dataset. We assessed the effects of different sequence encoding schemes on the prediction performance of disulfide connectivity. It has been shown that the sequence encoding scheme based on multiple sequence feature vectors coupled with predicted secondary structure can significantly improve prediction accuracy, enabling our method to outperform most other currently available predictors. Our work provides a complementary approach to current algorithms that should be useful in computationally assigning disulfide connectivity patterns and helps in the annotation of protein sequences generated by large-scale whole-genome projects. The prediction web server and Supplementary Material are accessible at http://foo.maths.uq.edu.au/~huber/disulfide
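
    The two-step structure of such predictors (score candidate cysteine pairs, then select a consistent connectivity pattern) can be sketched as below, with an SVR scorer and a maximum-weight matching standing in for the paper's full feature pipeline; all feature vectors and training scores are simulated placeholders.

```python
import itertools
import networkx as nx
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 40))  # placeholder pair feature vectors
y_train = rng.random(200)             # placeholder "bonded" scores
svr = SVR(kernel="rbf").fit(X_train, y_train)

cysteines = [10, 35, 62, 88]          # residue positions (illustrative)
G = nx.Graph()
for i, j in itertools.combinations(range(len(cysteines)), 2):
    pair_features = rng.normal(size=(1, 40))  # real encodings in practice
    G.add_edge(i, j, weight=float(svr.predict(pair_features)[0]))

# A maximum-weight matching pairs every cysteine at most once, giving a
# globally consistent connectivity pattern from the pairwise scores.
print(nx.max_weight_matching(G, maxcardinality=True))
```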

  4. Quality and price--impact on patient satisfaction.

    PubMed

    Pantouvakis, Angelos; Bouranta, Nancy

    2014-01-01

    The purpose of this paper is to synthesize existing quality-measurement models and apply them to healthcare by combining a Nordic service-quality model with an American service-performance model. Results are based on a questionnaire survey of 1,298 respondents. Service-quality dimensions were derived and related to satisfaction by employing a multinomial logistic model, which allows prediction and service improvement. Qualitative and empirical evidence indicates that customer satisfaction and service quality are multi-dimensional constructs, whose quality components, together with convenience and cost, influence the customer's overall satisfaction. The proposed model identifies important quality and satisfaction issues. It also enables comparisons between different responses across studies.
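
    A minimal sketch of relating quality dimensions to a multi-level satisfaction response with a multinomial logistic model follows; the survey file and all column names are invented placeholders.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("survey.csv")  # hypothetical: 1,298 questionnaire rows
X = df[["technical_quality", "functional_quality", "convenience", "cost"]]
y = df["satisfaction_level"]    # e.g. "low" / "medium" / "high"

# With more than two classes, scikit-learn fits a multinomial logistic model.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.classes_)  # one coefficient row per satisfaction level:
print(clf.coef_)
```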

  5. Real-Time Charging Strategies for an Electric Vehicle Aggregator to Provide Ancillary Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenzel, George; Negrete-Pincetic, Matias; Olivares, Daniel E.

    Real-time charging strategies, in the context of vehicle-to-grid (V2G) technology, are needed to enable the use of electric vehicle (EV) fleet batteries to provide ancillary services (AS). Here, we develop tools to manage charging and discharging in a fleet so that, when aggregated, it tracks an Automatic Generation Control (AGC) signal. We also propose a real-time controller that accounts for bidirectional charging efficiency and extend it to study the effect of looking ahead when implementing Model Predictive Control (MPC). Simulations show that the controller reduces tracking error compared with benchmark scheduling algorithms and improves regulation capacity and battery cycling.
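
    The AGC-tracking idea can be sketched as a small convex MPC problem with cvxpy, as below; the horizon, storage dynamics, and bounds are toy assumptions, and the paper's bidirectional-efficiency treatment is simplified to a single efficiency term.

```python
import cvxpy as cp
import numpy as np

H, dt = 12, 1 / 60                      # 12-step horizon, 1-minute steps (hours)
agc = np.sin(np.linspace(0, np.pi, H))  # placeholder AGC signal (MW)

p = cp.Variable(H)                      # net fleet power, + = discharge (MW)
e = cp.Variable(H + 1)                  # aggregate stored energy (MWh)

eta = 0.9                               # assumed single efficiency term
constraints = [e[0] == 5.0, e >= 0, e <= 10.0, cp.abs(p) <= 2.0]
for t in range(H):
    # Simple linear storage dynamics; the paper's model would split p into
    # separate charge/discharge terms with their own efficiencies.
    constraints.append(e[t + 1] == e[t] - p[t] * dt / eta)

prob = cp.Problem(cp.Minimize(cp.sum_squares(p - agc)), constraints)
prob.solve()
print("tracking RMSE (MW):", float(np.sqrt(prob.value / H)))
```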

  6. Modelling directional solidification

    NASA Technical Reports Server (NTRS)

    Wilcox, William R.

    1990-01-01

    The long-range goal is to develop an improved understanding of phenomena of importance to directional solidification, to enable explanation and prediction of differences in behavior between solidification on Earth and in space. Emphasis during the period of this grant was on experimentally determining the influence of convection and freezing-rate fluctuations on compositional homogeneity and crystalline perfection in the vertical Bridgman-Stockbarger technique. Heater temperature profiles, buoyancy-driven convection, and doping inhomogeneities were correlated using naphthalene doped with azulene. In addition, the influence of spin-up/spin-down on the compositional homogeneity and microstructure of indium gallium antimonide and the effect of imposed melting-freezing cycles on indium gallium antimonide are discussed.

  7. Real-Time Charging Strategies for an Electric Vehicle Aggregator to Provide Ancillary Services

    DOE PAGES

    Wenzel, George; Negrete-Pincetic, Matias; Olivares, Daniel E.; ...

    2017-03-13

    Real-time charging strategies, in the context of vehicle-to-grid (V2G) technology, are needed to enable the use of electric vehicle (EV) fleet batteries to provide ancillary services (AS). Here, we develop tools to manage charging and discharging in a fleet so that, when aggregated, it tracks an Automatic Generation Control (AGC) signal. We also propose a real-time controller that accounts for bidirectional charging efficiency and extend it to study the effect of looking ahead when implementing Model Predictive Control (MPC). Simulations show that the controller reduces tracking error compared with benchmark scheduling algorithms and improves regulation capacity and battery cycling.

  8. Treating cancer with selective CDK4/6 inhibitors.

    PubMed

    O'Leary, Ben; Finn, Richard S; Turner, Nicholas C

    2016-07-01

    Uncontrolled cellular proliferation, mediated by dysregulation of the cell-cycle machinery and activation of cyclin-dependent kinases (CDKs) to promote cell-cycle progression, lies at the heart of cancer as a pathological process. Clinical implementation of first-generation, nonselective CDK inhibitors, designed to inhibit this proliferation, was originally hampered by the high risk of toxicity and lack of efficacy noted with these agents. The emergence of a new generation of selective CDK4/6 inhibitors, including ribociclib, abemaciclib and palbociclib, has enabled tumour types in which CDK4/6 has a pivotal role in the G1-to-S-phase cell-cycle transition to be targeted with improved effectiveness, and fewer adverse effects. Results of pivotal phase III trials investigating palbociclib in patients with advanced-stage oestrogen receptor (ER)-positive breast cancer have demonstrated a substantial improvement in progression-free survival, with a well-tolerated toxicity profile. Mechanisms of acquired resistance to CDK4/6 inhibitors are beginning to emerge that, although unwelcome, might enable rational post-CDK4/6 inhibitor therapeutic strategies to be identified. Extending the use of CDK4/6 inhibitors beyond ER-positive breast cancer is challenging, and will likely require biomarkers that are predictive of a response, and the use of combination therapies in order to optimize CDK4/6 targeting.

  9. High Resolution Sensing and Control of Urban Water Networks

    NASA Astrophysics Data System (ADS)

    Bartos, M. D.; Wong, B. P.; Kerkez, B.

    2016-12-01

    We present a framework to enable high-resolution sensing, modeling, and control of urban watersheds using (i) a distributed sensor network based on low-cost cellular-enabled motes, (ii) hydraulic models powered by a cloud computing infrastructure, and (iii) automated actuation valves that allow infrastructure to be controlled in real time. This platform initiates two major advances. First, we achieve a high density of measurements in urban environments, with an anticipated 40+ sensors over each urban area of interest. In addition to new measurements, we also illustrate the design and evaluation of a "smart" control system for real-world hydraulic networks. This control system improves water quality and mitigates flooding by using real-time hydraulic models to adaptively control releases from retention basins. We evaluate the potential of this platform through two ongoing deployments: (i) a flood monitoring network in the Dallas-Fort Worth metropolitan area that detects and anticipates floods at the level of individual roadways, and (ii) a real-time hydraulic control system in the city of Ann Arbor, MI—soon to be one of the most densely instrumented urban watersheds in the United States. Through these applications, we demonstrate that distributed sensing and control of water infrastructure can improve flash flood predictions, emergency response, and stormwater contaminant mitigation.

  10. Prediction of recovery of motor function after stroke.

    PubMed

    Stinear, Cathy

    2010-12-01

    Stroke is a leading cause of disability. The ability to live independently after stroke depends largely on the reduction of motor impairment and the recovery of motor function. Accurate prediction of motor recovery assists rehabilitation planning and supports realistic goal setting by clinicians and patients. Initial impairment is negatively related to degree of recovery, but inter-individual variability makes accurate prediction difficult. Neuroimaging and neurophysiological assessments can be used to measure the extent of stroke damage to the motor system and predict subsequent recovery of function, but these techniques are not yet used routinely. The use of motor impairment scores and neuroimaging has been refined by two recent studies in which these investigations were used at multiple time points early after stroke. Voluntary finger extension and shoulder abduction within 5 days of stroke predicted subsequent recovery of upper-limb function. Diffusion-weighted imaging within 7 days detected the effects of stroke on caudal motor pathways and was predictive of lasting motor impairment. Thus, investigations done soon after stroke had good prognostic value. The potential prognostic value of cortical activation and neural plasticity has been explored for the first time by two recent studies. Functional MRI detected a pattern of cortical activation at the acute stage that was related to subsequent reduction in motor impairment. Transcranial magnetic stimulation enabled measurement of neural plasticity in the primary motor cortex, which was related to subsequent disability. These studies open interesting new lines of enquiry. Where next? The accuracy of prediction might be increased by taking into account the motor system's capacity for functional reorganisation in response to therapy, in addition to the extent of stroke-related damage. Improved prognostic accuracy could also be gained by combining simple tests of motor impairment with neuroimaging, genotyping, and neurophysiological assessment of neural plasticity. The development of algorithms to guide the sequential combinations of these assessments could also further increase accuracy, in addition to improving rehabilitation planning and outcomes.

  11. Eye Movements During Everyday Behavior Predict Personality Traits.

    PubMed

    Hoppe, Sabrina; Loetscher, Tobias; Morey, Stephanie A; Bulling, Andreas

    2018-01-01

    Besides allowing us to perceive our surroundings, eye movements are also a window into our mind and a rich source of information on who we are, how we feel, and what we do. Here we show that eye movements during an everyday task predict aspects of our personality. We tracked eye movements of 42 participants while they ran an errand on a university campus and subsequently assessed their personality traits using well-established questionnaires. Using a state-of-the-art machine learning method and a rich set of features encoding different eye movement characteristics, we were able to reliably predict four of the Big Five personality traits (neuroticism, extraversion, agreeableness, conscientiousness) as well as perceptual curiosity only from eye movements. Further analysis revealed new relations between previously neglected eye movement characteristics and personality. Our findings demonstrate a considerable influence of personality on everyday eye movement control, thereby complementing earlier studies in laboratory settings. Improving automatic recognition and interpretation of human social signals is an important endeavor, enabling innovative design of human-computer systems capable of sensing spontaneous natural user behavior to facilitate efficient interaction and personalization.

  12. Eye Movements During Everyday Behavior Predict Personality Traits

    PubMed Central

    Hoppe, Sabrina; Loetscher, Tobias; Morey, Stephanie A.; Bulling, Andreas

    2018-01-01

    Besides allowing us to perceive our surroundings, eye movements are also a window into our mind and a rich source of information on who we are, how we feel, and what we do. Here we show that eye movements during an everyday task predict aspects of our personality. We tracked eye movements of 42 participants while they ran an errand on a university campus and subsequently assessed their personality traits using well-established questionnaires. Using a state-of-the-art machine learning method and a rich set of features encoding different eye movement characteristics, we were able to reliably predict four of the Big Five personality traits (neuroticism, extraversion, agreeableness, conscientiousness) as well as perceptual curiosity only from eye movements. Further analysis revealed new relations between previously neglected eye movement characteristics and personality. Our findings demonstrate a considerable influence of personality on everyday eye movement control, thereby complementing earlier studies in laboratory settings. Improving automatic recognition and interpretation of human social signals is an important endeavor, enabling innovative design of human–computer systems capable of sensing spontaneous natural user behavior to facilitate efficient interaction and personalization. PMID:29713270

  13. Predicting Drug Safety and Communicating Risk: Benefits of a Bayesian Approach.

    PubMed

    Lazic, Stanley E; Edmunds, Nicholas; Pollard, Christopher E

    2018-03-01

    Drug toxicity is a major source of attrition in drug discovery and development. Pharmaceutical companies routinely use preclinical data to predict clinical outcomes and continue to invest in new assays to improve predictions. However, there are many open questions about how to make the best use of available data, combine diverse data, quantify risk, and communicate risk and uncertainty to enable good decisions. The costs of suboptimal decisions are clear: resources are wasted and patients may be put at risk. We argue that Bayesian methods provide answers to all of these problems and use hERG-mediated QT prolongation as a case study. Benefits of Bayesian machine learning models include intuitive probabilistic statements of risk that incorporate all sources of uncertainty, the option to include diverse data and external information, and visualizations that have a clear link between the output from a statistical model and what this means for risk. Furthermore, Bayesian methods are easy to use with modern software, making their adoption for safety screening straightforward. We include R and Python code to encourage the adoption of these methods.
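
    In the spirit of the authors' argument, here is a hedged sketch (assuming PyMC v4 or later) of a Bayesian logistic regression from a preclinical readout to clinical QT liability, producing the kind of probabilistic risk statement the paper advocates; all data are invented placeholders.

```python
import numpy as np
import pymc as pm

x = np.array([-1.2, -0.5, 0.1, 0.8, 1.5, 2.0])  # illustrative log safety margins
y = np.array([1, 1, 1, 0, 0, 0])                # 1 = QT liability observed

with pm.Model():
    alpha = pm.Normal("alpha", 0.0, 2.0)
    beta = pm.Normal("beta", 0.0, 2.0)
    p = pm.Deterministic("p", pm.math.sigmoid(alpha + beta * x))
    pm.Bernoulli("obs", p=p, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

# Posterior probability of liability for a new compound with margin 0.5,
# reported with its uncertainty rather than as a point estimate.
post = idata.posterior
p_new = 1 / (1 + np.exp(-(post["alpha"] + post["beta"] * 0.5)))
print("P(QT liability):", float(p_new.mean()), "+/-", float(p_new.std()))
```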

  14. Scientific Prediction and Prophetic Patenting in Drug Discovery.

    PubMed

    Curry, Stephen H; Schneiderman, Anne M

    2015-01-01

    Pharmaceutical patenting involves writing claims based both on discoveries already made and on prophecy of future developments in an ongoing project. This is necessitated by the very different timelines involved in the drug discovery and product development process on the one hand, and successful patenting on the other. If patents are sought too early, there is a risk that patent examiners will disallow claims because of lack of enablement. If patenting is delayed, claims are at risk of being denied on the basis of prior art, because the body of relevant known science will have developed significantly while the project was being pursued. This review examines the role of prophetic patenting in relation to the essential predictability of many aspects of drug discovery science, promoting the concepts of discipline-related and project-related prediction. It is especially directed towards patenting activities supporting commercialization of academia-based discoveries, where long project timelines occur and where experience, and resources to pay for patenting, are limited. The need for improved collaborative understanding among project scientists, technology transfer professionals in, for example, universities, patent attorneys, and patent examiners is emphasized.

  15. Genome-Wide Association Analysis of Adaptation Using Environmentally Predicted Traits

    PubMed Central

    van Zanten, Martijn

    2015-01-01

    Current methods for studying the genetic basis of adaptation evaluate genetic associations with ecologically relevant traits or single environmental variables, under the implicit assumption that natural selection imposes correlations between phenotypes, environments and genotypes. In practice, observed trait and environmental data are manifestations of unknown selective forces and are only indirectly associated with adaptive genetic variation. In theory, improved estimation of these forces could enable more powerful detection of loci under selection. Here we present an approach in which we approximate adaptive variation by modeling phenotypes as a function of the environment and using the predicted trait in multivariate and univariate genome-wide association analysis (GWAS). Based on computer simulations and published flowering time data from the model plant Arabidopsis thaliana, we find that environmentally predicted traits lead to higher recovery of functional loci in multivariate GWAS and are more strongly correlated to allele frequencies at adaptive loci than individual environmental variables. Our results provide an example of the use of environmental data to obtain independent and meaningful information on adaptive genetic variation. PMID:26496492
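
    The two-step approach can be sketched as below: regress the phenotype on environmental variables, then test each SNP against the environmentally predicted trait. All data are simulated placeholders.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n, n_snps = 500, 1000
env = rng.normal(size=(n, 3))                  # e.g. temperature, rainfall, ...
pheno = env @ np.array([0.5, -0.3, 0.2]) + rng.normal(0, 0.5, n)
geno = rng.binomial(2, 0.3, size=(n, n_snps))  # allele counts 0/1/2

# Step 1: model the phenotype as a function of the environment.
pred_trait = LinearRegression().fit(env, pheno).predict(env)

# Step 2: per-SNP association test against the predicted trait.
pvals = np.array([stats.linregress(geno[:, j], pred_trait).pvalue
                  for j in range(n_snps)])
print("top candidate SNPs:", np.argsort(pvals)[:5])
```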

  16. Modeling snowmelt infiltration in seasonally frozen ground

    NASA Astrophysics Data System (ADS)

    Budhathoki, S.; Ireson, A. M.

    2017-12-01

    In cold regions, freezing and thawing of the soil govern the soil hydraulic properties that shape surface and subsurface hydrological processes. The partitioning of snowmelt into infiltration and runoff also has important implications for integrated water resource management and flood risk. However, snowmelt infiltration into frozen soils is inadequately represented in most land-surface and hydrological models, creating the need for improved models and methods. Here we apply the Frozen Soil Infiltration Model, FroSIn, a novel algorithm for infiltration in frozen soils that can be implemented in physically based models of coupled flow and heat transport. In this study, we apply the model in a simple configuration to reproduce observations from field sites in the Canadian prairies, specifically St Denis and Brightwater Creek in Saskatchewan, Canada. We demonstrate the limitations of conventional approaches to simulating infiltration, which systematically over-predict runoff and under-predict infiltration. The findings show that FroSIn enables models to predict more reasonable infiltration volumes in frozen soils and to represent how infiltration-runoff partitioning is affected by antecedent soil moisture.

  17. Advanced Software for Analysis of High-Speed Rolling-Element Bearings

    NASA Technical Reports Server (NTRS)

    Poplawski, J. V.; Rumbarger, J. H.; Peters, S. M.; Galatis, H.; Flower, R.

    2003-01-01

    COBRA-AHS is a package of advanced software for analysis of rigid or flexible shaft systems supported by rolling-element bearings operating at high speeds under complex mechanical and thermal loads. These loads can include centrifugal and thermal loads generated by motions of bearing components. COBRA-AHS offers several improvements over prior commercial bearing-analysis programs: It includes innovative probabilistic fatigue-life-estimating software that provides for computation of three-dimensional stress fields and incorporates stress-based (in contradistinction to prior load-based) mathematical models of fatigue life. It interacts automatically with the ANSYS finite-element code to generate finite-element models for estimating distributions of temperature and temperature-induced changes in dimensions in iterative thermal/dimensional analyses: thus, for example, it can be used to predict changes in clearances and thermal lockup. COBRA-AHS provides an improved graphical user interface that facilitates the iterative cycle of analysis and design by providing analysis results quickly in graphical form, enabling the user to control interactive runs without leaving the program environment, and facilitating transfer of plots and printed results for inclusion in design reports. Additional features include roller-edge stress prediction and influence of shaft and housing distortion on bearing performance.

  18. An Improved Formulation of Hybrid Model Predictive Control With Application to Production-Inventory Systems.

    PubMed

    Nandola, Naresh N; Rivera, Daniel E

    2013-01-01

    We consider an improved model predictive control (MPC) formulation for linear hybrid systems described by mixed logical dynamical (MLD) models. The algorithm relies on a multiple-degree-of-freedom parametrization that enables the user to adjust the speed of setpoint tracking, measured disturbance rejection and unmeasured disturbance rejection independently in the closed-loop system. Consequently, controller tuning is more flexible and intuitive than relying on objective function weights (such as move suppression) traditionally used in MPC schemes. The controller formulation is motivated by the needs of non-traditional control applications that are suitably described by hybrid production-inventory systems. Two applications are considered in this paper: adaptive, time-varying interventions in behavioral health, and inventory management in supply chains under conditions of limited capacity. In the adaptive intervention application, a hypothetical intervention inspired by the Fast Track program, a real-life preventive intervention for reducing conduct disorder in at-risk children, is examined. In the inventory management application, the ability of the algorithm to judiciously alter production capacity under conditions of varying demand is presented. These case studies demonstrate that MPC for hybrid systems can be tuned for desired performance under demanding conditions involving noise and uncertainty.

  19. An Improved Formulation of Hybrid Model Predictive Control With Application to Production-Inventory Systems

    PubMed Central

    Nandola, Naresh N.; Rivera, Daniel E.

    2013-01-01

    We consider an improved model predictive control (MPC) formulation for linear hybrid systems described by mixed logical dynamical (MLD) models. The algorithm relies on a multiple-degree-of-freedom parametrization that enables the user to adjust the speed of setpoint tracking, measured disturbance rejection and unmeasured disturbance rejection independently in the closed-loop system. Consequently, controller tuning is more flexible and intuitive than relying on objective function weights (such as move suppression) traditionally used in MPC schemes. The controller formulation is motivated by the needs of non-traditional control applications that are suitably described by hybrid production-inventory systems. Two applications are considered in this paper: adaptive, time-varying interventions in behavioral health, and inventory management in supply chains under conditions of limited capacity. In the adaptive intervention application, a hypothetical intervention inspired by the Fast Track program, a real-life preventive intervention for reducing conduct disorder in at-risk children, is examined. In the inventory management application, the ability of the algorithm to judiciously alter production capacity under conditions of varying demand is presented. These case studies demonstrate that MPC for hybrid systems can be tuned for desired performance under demanding conditions involving noise and uncertainty. PMID:24348004

  20. Causes and consequences of occupational stress in emergency nurses, a longitudinal study.

    PubMed

    Adriaenssens, Jef; De Gucht, Veronique; Maes, Stan

    2015-04-01

    This longitudinal study examines the influence of changes over time in work and organisational characteristics on job satisfaction, work engagement, emotional exhaustion, turnover intention and psychosomatic distress in emergency room nurses. Organisational and job characteristics of nurses are important predictors of stress-health outcomes. Emergency room nurses are particularly exposed to stressful work-related events and unpredictable work conditions. The study was carried out in 15 emergency departments of Belgian general hospitals in 2008 (T1) and 18 months later (T2) (n = 170). Turnover rates between T1 and T2 were high. Important changes over time were found in predictors and outcomes. Changes in job demand, control and social support predicted job satisfaction, work engagement and emotional exhaustion. In addition, changes in reward, social harassment and work agreements predicted work engagement, emotional exhaustion and intention to leave, respectively. Work-related interventions are important to improve occupational health in emergency room nurses and should focus on lowering job demands, increasing job control, improving social support and a well-balanced reward system. Nursing managers should be aware of the causes and consequences of occupational stress in emergency room nurses in order to enable preventive interventions.

  1. Improved Dynamic Lightpath Provisioning for Large Wavelength-Division Multiplexed Backbones

    NASA Astrophysics Data System (ADS)

    Kong, Huifang; Phillips, Chris

    2007-07-01

    Technology already exists that would allow future optical networks to support automatic lightpath configuration in response to dynamic traffic demands. Given appropriate commercial drivers, it is possible to foresee carrier network operators migrating away from semipermanent provisioning to enable on-demand short-duration communications. However, with traditional lightpath reservation protocols, a portion of the lightpath is idly held during the signaling propagation phase, which can significantly reduce the lightpath bandwidth efficiency in large wavelength-division multiplexed backbones. This paper proposes a prebooking mechanism to improve the lightpath efficiency over traditional reactive two-way reservation protocols, consequently liberating network resources to support higher traffic loads. The prebooking mechanism predicts the time when the traffic will appear at the optical cross connects, and intelligently schedules the lightpath components such that resources are only consumed as necessary. We describe the proposed signaling procedure for both centralized and distributed control planes and analyze its performance. This paper also investigates the aggregated flow length characteristics with the self-similar incident traffic and examines the effects of traffic prediction on the blocking probability as well as the ability to support latency sensitive traffic in a wide-area environment.

  2. New Toxico-Cheminformatics & Computational Toxicology ...

    EPA Pesticide Factsheets

    EPA’s National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than traditionally employed in SAR modeling, and in ways that facilitate data mining and data read-across. The DSSTox Structure-Browser provides structure searchability across all published DSSTox toxicity-related inventory and is enabling linkages between previously isolated toxicity data resources. As of early March 2008, the public DSSTox inventory has been integrated into PubChem, allowing a user to take full advantage of PubChem structure-activity and bioassay clustering features. The most recent DSSTox version of the Carcinogenic Potency Database file (CPDBAS) illustrates ways in which various summary definitions of carcinogenic activity can be employed in modeling and data mining. Phase I of the ToxCast™ project is generating high-throughput screening data from several hundred biochemical and cell-based assays for a set of 320 chemicals, mostly pesticide actives, with rich toxicology profiles. Incorporating and expanding traditional SAR concepts into this new high-throughput, data-rich world poses conceptual and practical challenges but also holds great promise for improving predictive capabilities.

  3. Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline.

    PubMed

    Zhang, Jie; Li, Qingyang; Caselli, Richard J; Thompson, Paul M; Ye, Jieping; Wang, Yalin

    2017-06-01

    Alzheimer's Disease (AD) is the most common type of dementia. Identifying correct biomarkers may determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied to many computer vision and biomedical informatics problems. It aims to improve generalization performance by exploiting features shared among different tasks. However, most existing algorithms are formulated as supervised learning schemes, which suffer from either insufficient feature numbers or missing label information. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed of MMDL in comparison with other state-of-the-art algorithms.
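
    Stage 1's multi-source sparse coding can be illustrated with scikit-learn's DictionaryLearning, as in the sketch below; this is a generic stand-in for the paper's MMDL algorithm, with simulated data in place of the longitudinal imaging features.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
source_a = rng.normal(size=(100, 64))  # e.g. features from one time slot
source_b = rng.normal(size=(120, 64))  # e.g. features from another time slot
X = np.vstack([source_a, source_b])    # pooled multi-source feature matrix

# Learn a shared dictionary; the sparse codes can then serve as predictors.
dl = DictionaryLearning(n_components=16, alpha=1.0, random_state=0)
codes = dl.fit_transform(X)
print(codes.shape, "avg non-zeros per sample:", float((codes != 0).sum(1).mean()))
```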

  4. Clinical implementation of a knowledge based planning tool for prostate VMAT.

    PubMed

    Powis, Richard; Bird, Andrew; Brennan, Matthew; Hinks, Susan; Newman, Hannah; Reed, Katie; Sage, John; Webster, Gareth

    2017-05-08

    A knowledge-based planning tool has been developed and implemented for prostate VMAT radiotherapy plans, providing a target average rectum dose based on previously achievable values for similar rectum/PTV overlap. The purpose of this planning tool is to highlight sub-optimal clinical plans and to improve plan quality and consistency. A historical cohort of 97 VMAT prostate plans was interrogated using a RayStation script and used to develop a local model for predicting optimum average rectum dose based on individual anatomy. A preliminary validation study was performed whereby historical plans identified as "optimal" and "sub-optimal" by the local model were replanned in a blinded study by four experienced planners and compared to the original clinical plan to assess whether any improvement in rectum dose was observed. The predictive model was then incorporated into a RayStation script and used as part of the clinical planning process. Planners were asked to use the script during planning to provide a patient-specific prediction for optimum average rectum dose and to optimise the plan accordingly. Plans identified as "sub-optimal" in the validation study showed a statistically significant improvement in average rectum dose when replanned, whereas plans identified as "optimal" showed no improvement when replanned. This provided confidence that the local model can identify plans that are suboptimal in terms of rectal sparing. Clinical implementation of the knowledge-based planning tool reduced the population-averaged mean rectum dose by 5.6 Gy. There was a small but statistically significant increase in total MU and femoral head dose and a reduction in conformity index. These did not affect the clinical acceptability of the plans, and no significant changes to other plan quality metrics were observed. The knowledge-based planning tool has enabled substantial reductions in population-averaged mean rectum dose for prostate VMAT patients. This suggests plans are improved when planners receive quantitative feedback on plan quality against historical data.
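
    The core of such a tool is a regression of historically achievable mean rectum dose on rectum/PTV overlap, used to flag outlier plans. A minimal sketch follows, with invented numbers and an assumed tolerance; the actual model and thresholds in the paper may differ.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical plans: overlap (% of rectum volume) vs achieved mean dose (Gy).
overlap_pct = np.array([[2.0], [5.0], [8.0], [12.0], [15.0], [20.0]])
mean_rectum_gy = np.array([18.0, 24.0, 29.0, 35.0, 40.0, 47.0])  # illustrative

model = LinearRegression().fit(overlap_pct, mean_rectum_gy)

new_overlap, achieved = 10.0, 38.5
predicted = float(model.predict([[new_overlap]])[0])
if achieved > predicted + 2.0:  # tolerance in Gy (assumed)
    print(f"Plan flagged: achieved {achieved} Gy vs predicted {predicted:.1f} Gy")
```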

  5. Measuring Earth's radiation imbalance with RAVAN: A CubeSat mission to measure the driver of global climate change

    NASA Astrophysics Data System (ADS)

    Swartz, W. H.; Dyrud, L. P.; Wiscombe, W. J.; Lorentz, S. R.; Papadakis, S.; Summers, R. A.; Smith, A. W.; Wu, D. L.; Deglau, D. M.; Arnold, S. P.

    2013-12-01

    The Earth radiation imbalance (ERI) is the single most important quantity for predicting the course of climate change over the next century. It is also the single most important metric for any geo-engineering scheme. We review the current scientific understanding of ERI and present a recently funded CubeSat mission, the Radiometer Assessment using Vertically Aligned Nanotubes (RAVAN), that will demonstrate an affordable, accurate radiometer that directly measures Earth-leaving fluxes of total and solar-reflected radiation. Coupled with knowledge of the incoming radiation from the Sun, RAVAN directly gives ERI. The objective of RAVAN is to demonstrate that a compact spaceborne radiometer that is absolutely accurate to NIST-traceable standards can be built for low cost. The key technologies that enable a radiometer with all these attributes are: a gallium fixed-point blackbody as a built-in calibration source and a vertically aligned carbon nanotube (VACNT) absorber. VACNTs are the blackest known substance, making them ideal radiometer absorbers with order-of-magnitude improvements in spectral flatness and stability over the existing art. The Johns Hopkins University Applied Physics Laboratory heritage 3U Multi-Mission Nanosat will host RAVAN, providing the reliability, agility, and resources needed. RAVAN will pave the way for a constellation Earth radiation budget mission that can provide the measurements needed to enable vastly superior predictions of future climate change.

  6. Machine Learning Estimation of Atom Condensed Fukui Functions.

    PubMed

    Zhang, Qingyou; Zheng, Fangfang; Zhao, Tanfeng; Qu, Xiaohui; Aires-de-Sousa, João

    2016-02-01

    To enable the fast estimation of atom condensed Fukui functions, machine learning algorithms were trained with databases of DFT pre-calculated values for ca. 23,000 atoms in organic molecules. The problem was approached as the ranking of atom types with the Bradley-Terry (BT) model, and as the regression of the Fukui function. Random Forests (RF) were trained to predict the condensed Fukui function, to rank atoms in a molecule, and to classify atoms as high/low Fukui function. Atomic descriptors were based on counts of atom types in spheres around the kernel atom. The BT coefficients assigned to atom types enabled the identification (93-94% accuracy) of the atom with the highest Fukui function in pairs of atoms in the same molecule with differences ≥0.1. In whole molecules, the atom with the top Fukui function could be recognized in ca. 50% of the cases and, on average, about 3 of the top 4 atoms could be recognized in a shortlist of 4. Regression RF yielded predictions for test sets with R^2 = 0.68-0.69, improving the ability of BT coefficients to rank atoms in a molecule. Atom classification (as high/low Fukui function) was obtained with RF with a sensitivity of 55-61% and a specificity of 94-95%.
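
    The descriptor idea (counts of atom types in shells around the kernel atom, fed to a Random Forest regressor) can be sketched as below; topological bond-count shells computed with RDKit stand in for the paper's spheres, and the target Fukui values are random placeholders.

```python
import numpy as np
from rdkit import Chem
from sklearn.ensemble import RandomForestRegressor

def atom_descriptors(smiles, max_shell=3, symbols=("C", "N", "O", "H")):
    # Count atom types within each bond-distance shell around every atom.
    mol = Chem.AddHs(Chem.MolFromSmiles(smiles))
    dmat = Chem.GetDistanceMatrix(mol)  # topological (bond-count) distances
    feats = []
    for atom in mol.GetAtoms():
        row = []
        for shell in range(1, max_shell + 1):
            in_shell = [a for a in mol.GetAtoms()
                        if dmat[atom.GetIdx()][a.GetIdx()] == shell]
            row += [sum(a.GetSymbol() == s for a in in_shell) for s in symbols]
        feats.append(row)
    return np.array(feats)

X = atom_descriptors("CCO")                   # ethanol, hydrogens included
y = np.random.default_rng(0).random(len(X))   # placeholder Fukui values
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(rf.predict(X[:3]))
```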

  7. Upgrading HepG2 cells with adenoviral vectors that encode drug-metabolizing enzymes: application for drug hepatotoxicity testing.

    PubMed

    Gómez-Lechón, M José; Tolosa, Laia; Donato, M Teresa

    2017-02-01

    Drug attrition rates due to hepatotoxicity are an important safety issue considered in drug development. The HepG2 hepatoma cell line is currently being used for drug-induced hepatotoxicity evaluations, but its expression of drug-metabolizing enzymes is poor compared with hepatocytes. Different approaches have been proposed to upgrade HepG2 cells for more reliable drug-induced liver injury predictions. Areas covered: We describe the advantages and limitations of HepG2 cells transduced with adenoviral vectors that encode drug-metabolizing enzymes for safety risk assessments of bioactivable compounds. Adenoviral transduction facilitates efficient and controlled delivery of multiple drug-metabolizing activities to HepG2 cells at levels comparable to primary human hepatocytes, generating an 'artificial hepatocyte'. Furthermore, adenoviral transduction enables the design of tailored cells expressing particular metabolic capacities. Expert opinion: Upgraded HepG2 cells that recreate known inter-individual variations in hepatic CYP and conjugating activities due to both genetic (e.g., polymorphisms) and environmental (e.g., induction, inhibition) factors seem a suitable model for identifying bioactivable drugs and conducting hepatotoxicity risk assessments. This strategy should enable the generation of customized cells that reproduce human pheno- and genotypic CYP variability, representing a valuable human hepatic cell model for developing new, safer drugs and improving existing predictive toxicity assays.

  8. Inference of ecological and social drivers of human brain-size evolution.

    PubMed

    González-Forero, Mauricio; Gardner, Andy

    2018-05-01

    The human brain is unusually large. It has tripled in size from Australopithecines to modern humans [1] and has become almost six times larger than expected for a placental mammal of human size [2]. Brains incur high metabolic costs [3] and accordingly a long-standing question is why the large human brain has evolved [4]. The leading hypotheses propose benefits of improved cognition for overcoming ecological [5-7], social [8-10] or cultural [11-14] challenges. However, these hypotheses are typically assessed using correlative analyses, and establishing causes for brain-size evolution remains difficult [15,16]. Here we introduce a metabolic approach that enables causal assessment of social hypotheses for brain-size evolution. Our approach yields quantitative predictions for brain and body size from formalized social hypotheses given empirical estimates of the metabolic costs of the brain. Our model predicts the evolution of adult Homo sapiens-sized brains and bodies when individuals face a combination of 60% ecological, 30% cooperative and 10% between-group competitive challenges, and suggests that between-individual competition has been unimportant for driving human brain-size evolution. Moreover, our model indicates that brain expansion in Homo was driven by ecological rather than social challenges, and was perhaps strongly promoted by culture. Our metabolic approach thus enables causal assessments that refine, refute and unify hypotheses of brain-size evolution.

  9. Prediction of flood abnormalities for improved public safety using a modified adaptive neuro-fuzzy inference system.

    PubMed

    Aqil, M; Kita, I; Yano, A; Nishiyama, S

    2006-01-01

    It is widely accepted that an efficient flood alarm system may significantly improve public safety and mitigate the economic damage caused by inundations. In this paper, a modified adaptive neuro-fuzzy system is proposed. The new method employs a rule-correction-based algorithm in the backward-pass calculation in place of the error back-propagation algorithm used by the traditional neuro-fuzzy method. The final value obtained during the backward-pass calculation using the rule-correction algorithm is then treated as a mapping function of the learning mechanism of the modified neuro-fuzzy system. The effectiveness of the proposed identification technique is demonstrated through a simulation study on the flood series of the Citarum River in Indonesia. The first four years of data (1987 to 1990) were used for model training/calibration, while the remaining data (1991 to 2002) were used for testing the model. The number of antecedent flows to include in the input variables was determined by two statistical methods: autocorrelation and partial autocorrelation between the variables. Performance accuracy of the model was evaluated in terms of two statistical indices: mean absolute percentage error and root mean square error. The algorithm was developed in a decision support system environment to enable users to process the data. The decision support system is useful due to its interactive nature, flexibility in approach, and evolving graphical features, and can be adopted for any similar situation to predict streamflow. The main data processing includes gauging station selection, input generation, lead-time selection/generation, and length of prediction. The program enables users to process flood data, to train/test the model using various input options, and to visualize results. The program code consists of a set of files that can be modified to match other purposes. The results indicate that the modified neuro-fuzzy model applied to flood prediction reached encouraging results for the river basin under examination: the comparison of the predictions with the observed data was satisfactory, with testing-period errors varying between 2.632% and 5.560%. The program may therefore also serve as a tool for real-time flood monitoring and process control.
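
    The lag-selection step described above is simple to sketch. Below is a minimal stand-in using autocorrelation and partial autocorrelation to pick how many antecedent flows to feed the model; the synthetic series and significance rule are illustrative assumptions, not the paper's data or procedure.

        import numpy as np
        from statsmodels.tsa.stattools import acf, pacf

        rng = np.random.default_rng(1)
        flow = np.cumsum(rng.normal(size=500)) + 50  # stand-in daily flow series

        nlags = 10
        a = acf(flow, nlags=nlags)
        p = pacf(flow, nlags=nlags)

        # Keep lags whose partial autocorrelation exceeds the ~95% significance band.
        band = 1.96 / np.sqrt(len(flow))
        selected = [lag for lag in range(1, nlags + 1) if abs(p[lag]) > band]
        print("antecedent-flow lags to use as inputs:", selected)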

  10. Affordable non-traditional source data mining for context assessment to improve distributed fusion system robustness

    NASA Astrophysics Data System (ADS)

    Bowman, Christopher; Haith, Gary; Steinberg, Alan; Morefield, Charles; Morefield, Michael

    2013-05-01

    This paper describes methods to affordably improve the robustness of distributed fusion systems by opportunistically leveraging non-traditional data sources. Adaptive methods help find relevant data, create models, and characterize the model quality. These methods can also measure the conformity of this non-traditional data with fusion system products, including situation modeling and mission impact prediction. Non-traditional data can improve the quantity, quality, availability, timeliness, and diversity of the baseline fusion system sources and therefore can improve prediction and estimation accuracy and robustness at all levels of fusion. Techniques are described that automatically learn to characterize and search non-traditional contextual data, enabling operators to integrate the data with high-level fusion systems and ontologies. These techniques apply the extension of the Data Fusion & Resource Management Dual Node Network (DNN) technical architecture at Level 4. The DNN architecture supports effective assessment and management of the expanded portfolio of data sources, entities of interest, models, and algorithms, including data pattern discovery and context conformity. Affordable model-driven and data-driven data mining methods to discover unknown models from non-traditional and `big data' sources are used to automatically learn entity behaviors and correlations with fusion products [14, 15]. This paper describes our context assessment software development and demonstrates context assessment of non-traditional data against an intelligence, surveillance, and reconnaissance fusion product based upon an IED POI workflow.

  11. Tele-ICU and Patient Safety Considerations.

    PubMed

    Hassan, Erkan

    The tele-ICU is designed to leverage, not replace, the need for bedside clinical expertise in the diagnosis, treatment, and assessment of various critical illnesses. Tele-ICUs are primarily decentralized or centralized models with differing advantages and disadvantages. The centralized model has sufficiently powered published data to be associated with improved mortality and ICU length of stay in a cost-effective manner. Factors associated with improved clinical outcomes include improved compliance with best practices; providing off-hours implementation of the bedside physician's care plan; and identification of and rapid response to physiological instability (initial clinical review within 1 hour) and rapid response to alerts, alarms, or direct notification by bedside clinicians. With improved communication and frequent review of patients between the tele-ICU and the bedside clinicians, the bedside clinician can provide the care that only they can provide. Although technology continues to evolve at a rapid pace, technology alone will most likely not improve clinical outcomes. Technology will enable us to process real or near real-time data into complex and powerful predictive algorithms. However, the remote and bedside teams must work collaboratively to develop care processes to better monitor, prioritize, standardize, and expedite care to drive greater efficiencies and improve patient safety.

  12. Recent advances in understanding idiopathic pulmonary fibrosis

    PubMed Central

    Daccord, Cécile; Maher, Toby M.

    2016-01-01

    Despite major research efforts leading to the recent approval of pirfenidone and nintedanib, the dismal prognosis of idiopathic pulmonary fibrosis (IPF) remains unchanged. The elaboration of international diagnostic criteria and disease stratification models based on clinical, physiological, radiological, and histopathological features has improved the accuracy of IPF diagnosis and prediction of mortality risk. Nevertheless, given the marked heterogeneity in clinical phenotype and the considerable overlap of IPF with other fibrotic interstitial lung diseases (ILDs), about 10% of cases of pulmonary fibrosis remain unclassifiable. Moreover, currently available tools fail to detect early IPF, predict the highly variable course of the disease, and assess response to antifibrotic drugs. Recent advances in understanding the multiple interrelated pathogenic pathways underlying IPF have identified various molecular phenotypes resulting from complex interactions among genetic, epigenetic, transcriptional, post-transcriptional, metabolic, and environmental factors. These different disease endotypes appear to confer variable susceptibility to the condition, differing risks of rapid progression, and, possibly, altered responses to therapy. The development and validation of diagnostic and prognostic biomarkers are necessary to enable a more precise and earlier diagnosis of IPF and to improve prediction of future disease behaviour. The availability of approved antifibrotic therapies together with potential new drugs currently under evaluation also highlights the need for biomarkers able to predict and assess treatment responsiveness, thereby allowing individualised treatment based on risk of progression and drug response. This approach of disease stratification and personalised medicine is already used in the routine management of many cancers and provides a potential road map for guiding clinical care in IPF. PMID:27303645

  13. Multiscale modeling of PVDF matrix carbon fiber composites

    NASA Astrophysics Data System (ADS)

    Greminger, Michael; Haghiashtiani, Ghazaleh

    2017-06-01

    Self-sensing carbon fiber reinforced composites have the potential to enable structural health monitoring that is inherent to the composite material rather than requiring external or embedded sensors. It has been demonstrated that a self-sensing carbon fiber reinforced polymer composite can be created by using the piezoelectric polymer polyvinylidene difluoride (PVDF) as the matrix material and using a Kevlar layer to separate two carbon fiber layers. In this configuration, the electrically conductive carbon fiber layers act as electrodes and the Kevlar layer acts as a dielectric to prevent the electrical shorting of the carbon fiber layers. This composite material has been characterized experimentally for its effective d33 and d31 piezoelectric coefficients. However, for design purposes, it is desirable to obtain a predictive model of the effective piezoelectric coefficients for the final smart composite material. Also, the inverse problem can be solved to determine the degree of polarization obtained in the PVDF material during polarization by comparing the effective d33 and d31 values obtained in experiment to those predicted by the finite element model. In this study, a multiscale micromechanics and coupled piezoelectric-mechanical finite element modeling approach is introduced to predict the mechanical and piezoelectric performance of a plain weave carbon fiber reinforced PVDF composite. The modeling results show good agreement with the experimental results for the mechanical and electrical properties of the composite. In addition, the degree of polarization of the PVDF component of the composite is predicted using this multiscale modeling approach and shows that there is opportunity to drastically improve the smart composite’s performance by improving the polarization procedure.

  14. Comprehensive sequence-flux mapping of a levoglucosan utilization pathway in E. coli

    DOE PAGES

    Klesmith, Justin R.; Bacik, John -Paul; Michalczyk, Ryszard; ...

    2015-09-14

    Synthetic metabolic pathways often suffer from low specific productivity, and new methods that quickly assess pathway functionality for many thousands of variants are urgently needed. Here we present an approach that enables the rapid and parallel determination of sequence effects on flux for complete gene-encoding sequences. We show that this method can be used to determine the effects of over 8000 single point mutants of a pyrolysis oil catabolic pathway implanted in Escherichia coli. Experimental sequence-function data sets predicted whether fitness-enhancing mutations to the enzyme levoglucosan kinase resulted from enhanced catalytic efficiency or enzyme stability. A structure of one design incorporating 38 mutations elucidated the structural basis of high fitness mutations. One design incorporating 15 beneficial mutations supported a 15-fold improvement in growth rate and greater than 24-fold improvement in enzyme activity relative to the starting pathway. Lastly, this technique can be extended to improve a wide variety of designed pathways.

  15. MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.

    PubMed

    Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J

    2015-10-15

    Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. Contact: shallam@mail.ubc.ca. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  16. Blood biomarkers of kidney transplant rejection, an endless search?

    PubMed

    Jacquemont, Lola; Soulillou, Jean-Paul; Degauque, Nicolas

    2017-07-01

    The tailoring of immunosuppressive treatment is recognized as a promising strategy to improve long-term kidney graft outcome. To guide the standard care of transplant recipients, physicians need objective biomarkers that can identify ongoing graft pathology or low-intensity signals that will later evolve into accelerated transplant rejection. The early identification of 'high-risk/low-risk' patients enables adjustment of the standard of care, including the frequency of clinical visits and immunosuppression dosing. Given their ready availability and compatibility with a wide array of techniques, blood-based biomarkers have been widely scrutinized for use as potential predictive and diagnostic biomarkers. Areas covered: Here, the authors report on non-invasive biomarkers, such as modifications of immune cell subsets and mRNA and miRNA profiles, identified in the blood of kidney transplant recipients collected before or after transplantation. Expert commentary: Combined with functional tests, the identification of biomarkers will improve our understanding of pathological processes and will contribute to a global improvement in clinical management.

  17. Understanding and Mitigating Tip Leakage and Endwall Losses in High Pressure Ratio Cores

    NASA Technical Reports Server (NTRS)

    Christophel, Jesse

    2015-01-01

    Reducing endwall and tip secondary flow losses will be a key enabler for the next generation of commercial and military air transport and will be an improvement on the state-of-the-art in turbine loss reduction strategies. The objective of this research is three-fold: 1) to improve understanding of endwall secondary flow and tip clearance losses; 2) to develop novel technologies to mitigate these losses and test them in low-speed cascade and rig environments; and 3) to validate predictive tools. To accomplish these objectives, Pratt & Whitney (P&W) has teamed with Pennsylvania State University (PSU) to experimentally test new features designed by P&W. P&W will create new rim-cavity features to reduce secondary flow loss and improve purge flow cooling effectiveness, and new blade tip features to manage leakage flows and reduce tip leakage secondary flow loss. P&W is currently developing technologies in these two areas that are expected to be assimilated into the N+2/N+3 generation of commercial engines.

  18. Predicting growth of graphene nanostructures using high-fidelity atomistic simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarty, Keven F.; Zhou, Xiaowang; Ward, Donald K.

    2015-09-01

    In this project we developed the atomistic models needed to predict how graphene grows when carbon is deposited on metal and semiconductor surfaces. We first calculated energies of many carbon configurations using first principles electronic structure calculations and then used these energies to construct empirical bond-order potentials that enable comprehensive molecular dynamics simulation of growth. We validated our approach by comparing our predictions to experiments of graphene growth on Ir, Cu and Ge. The robustness of our understanding of graphene growth will enable high quality graphene to be grown on novel substrates, which will expand the number of potential types of graphene electronic devices.

  19. UAV-Based Hyperspectral Remote Sensing for Precision Agriculture: Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Angel, Y.; Parkes, S. D.; Turner, D.; Houborg, R.; Lucieer, A.; McCabe, M.

    2017-12-01

    Modern agricultural production relies on monitoring crop status by observing and measuring variables such as soil condition, plant health, fertilizer and pesticide effect, irrigation and crop yield. Managing all of these factors is a considerable challenge for crop producers. As such, providing integrated technological solutions that enable improved diagnostics of field condition to maximize profits, while minimizing environmental impacts, would be of much interest. Such challenges can be addressed by implementing remote sensing systems such as hyperspectral imaging to produce precise biophysical indicator maps across the various cycles of crop development. Recent progress in unmanned aerial vehicles (UAVs) has advanced traditional satellite-based capabilities, providing a capacity for high spatial, spectral and temporal response. However, while some hyperspectral sensors have been developed for use onboard UAVs, significant investment is required to develop a system and data processing workflow that retrieves accurately georeferenced mosaics. Here we explore the use of a pushbroom hyperspectral camera that is integrated on-board a multi-rotor UAV system to measure the surface reflectance in 272 distinct spectral bands across a wavelength range spanning 400-1000 nm, and outline the requirements for sensor calibration, integration onto a stable UAV platform enabling accurate positional data, flight planning, and development of data post-processing workflows for georeferenced mosaics. The provision of high-quality, geo-corrected imagery facilitates the development of metrics of vegetation health that can be used to identify potential problems such as production inefficiencies, diseases and nutrient deficiencies, and provides data-streams that enable improved crop management. Immense opportunities remain to be exploited in the implementation of UAV-based hyperspectral sensing (and its combination with other imaging systems) to provide a transferable and scalable integrated framework for crop growth monitoring and yield prediction. Finally, we consider some of the challenges and issues in translating the available technological capacity into a useful and useable image collection and processing flow-path that enables these potential applications to be better realized.
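
    Once a georeferenced reflectance mosaic is in hand, vegetation-health metrics reduce to simple band arithmetic. The sketch below computes one such metric (NDVI) from a hyperspectral cube; the random cube and the linear band-to-wavelength mapping are illustrative assumptions, not the instrument's actual calibration.

        import numpy as np

        cube = np.random.rand(100, 100, 272)          # stand-in reflectance mosaic (rows, cols, bands)
        wavelengths = np.linspace(400, 1000, 272)     # assumed linear band-to-wavelength mapping

        red = cube[:, :, np.argmin(np.abs(wavelengths - 670))]  # band nearest 670 nm
        nir = cube[:, :, np.argmin(np.abs(wavelengths - 800))]  # band nearest 800 nm

        ndvi = (nir - red) / (nir + red + 1e-9)       # guard against division by zero
        print("mean NDVI over the mosaic:", float(ndvi.mean()))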

  20. Quantification of hazard prediction ability at hazard prediction training (Kiken-Yochi Training: KYT) by free-response receiver-operating characteristic (FROC) analysis.

    PubMed

    Hashida, Masahiro; Kamezaki, Ryousuke; Goto, Makoto; Shiraishi, Junji

    2017-03-01

    The ability to predict hazards in scenes of a general X-ray examination room created for Kiken-Yochi training (KYT) is quantified by use of free-response receiver-operating characteristic (FROC) analysis, to determine whether the total number of years of clinical experience, involvement in general X-ray examinations, occupation, and training each have an impact on hazard prediction ability. Twenty-three radiological technologists (RTs) (years of experience: 2-28), four nurses (years of experience: 15-19), and six RT students observed 53 scenes of KYT: 26 scenes with hazardous points (points that might cause injury to patients) and 27 scenes without such points. Based on the results of these observations, we calculated the alternative free-response receiver-operating characteristic (AFROC) curve and the figure of merit (FOM) to quantify hazard prediction ability. The results showed that the total number of years of clinical experience did not have any impact on hazard prediction ability, whereas recent experience with general X-ray examinations greatly influenced this ability. In addition, hazard prediction ability varied with the occupations of the observers while they were observing the same KYT scenes. The hazard prediction ability of the radiologic technology students improved after they had undergone patient safety training. The proposed FROC observer study enabled the quantification and evaluation of hazard prediction capability, and the application of this approach to clinical practice may help to ensure the safety of examinations and treatment in the radiology department.

  1. Multi-Scale Hydrometeorological Modeling, Land Data Assimilation and Parameter Estimation with the Land Information System

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, Christa D.

    2011-01-01

    The Land Information System (LIS; http://lis.gsfc.nasa.gov) is a flexible land surface modeling framework that has been developed with the goal of integrating satellite- and ground-based observational data products and advanced land surface modeling techniques to produce optimal fields of land surface states and fluxes. As such, LIS represents a step towards the next generation land component of an integrated Earth system model. In recognition of LIS's object-oriented software design, use and impact in the land surface and hydrometeorological modeling community, the LIS software was selected as a co-winner of NASA's 2005 Software of the Year award. LIS facilitates the integration of observations from Earth-observing systems, and predictions and forecasts from Earth system and Earth science models, into the decision-making processes of partnering agencies and national organizations. Due to its flexible software design, LIS can serve both as a Problem Solving Environment (PSE) for hydrologic research to enable accurate global water and energy cycle predictions, and as a Decision Support System (DSS) to generate useful information for application areas including disaster management, water resources management, agricultural management, numerical weather prediction, air quality and military mobility assessment. LIS has evolved from two earlier efforts -- the North American Land Data Assimilation System (NLDAS) and the Global Land Data Assimilation System (GLDAS) -- that focused primarily on improving numerical weather prediction skill by improving the characterization of land surface conditions. Both GLDAS and NLDAS now use specific configurations of the LIS software in their current implementations. In addition, LIS was recently transitioned into operations at the US Air Force Weather Agency (AFWA) to ultimately replace their Agricultural Meteorology (AGRMET) system, and is also used routinely by NOAA's National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC) for their land data assimilation systems to support weather and climate modeling. LIS not only consolidates the capabilities of these two systems, but also enables a much larger variety of configurations with respect to horizontal spatial resolution, input datasets and choice of land surface model through "plugins". LIS has been coupled to the Weather Research and Forecasting (WRF) model to support studies of land-atmosphere coupling by enabling ensembles of land surface states to be tested against multiple representations of the atmospheric boundary layer. LIS has also been demonstrated for parameter estimation; these studies showed that sequences of remotely sensed soil moisture products can be used to derive soil hydraulic and texture properties given a sufficient dynamic range in the soil moisture retrievals and accurate precipitation inputs. LIS has also recently been demonstrated for multi-model data assimilation using an Ensemble Kalman Filter for sequential assimilation of soil moisture, snow, and temperature. Ongoing work has demonstrated the value of bias correction as part of the filter, and also that of joint calibration and assimilation. Examples and case studies demonstrating the capabilities and impacts of LIS for hydrometeorological modeling, assimilation and parameter estimation will be presented as advancements towards the next generation of integrated observation and modeling systems.

  2. Prediction of delayed retention of antibodies in hydrophobic interaction chromatography from sequence using machine learning.

    PubMed

    Jain, Tushar; Boland, Todd; Lilov, Asparouh; Burnina, Irina; Brown, Michael; Xu, Yingda; Vásquez, Maximiliano

    2017-12-01

    The hydrophobicity of a monoclonal antibody is an important biophysical property relevant for its developability into a therapeutic. In addition to characterizing heterogeneity, Hydrophobic Interaction Chromatography (HIC) is an assay that is often used to quantify the hydrophobicity of an antibody to assess downstream risks. Earlier studies have shown that retention times in this assay can be correlated to amino-acid or atomic propensities weighted by the surface areas obtained from protein 3-dimensional structures. The goal of this study is to develop models that enable prediction of delayed HIC retention times directly from sequence. We utilize the random forest machine learning approach to estimate the surface exposure of amino-acid side-chains in the variable region directly from the antibody sequence. We obtain mean absolute errors of 4.6% for the prediction of surface exposure. Using experimental HIC data along with the estimated surface areas, we derive an amino-acid propensity scale that enables prediction of antibodies likely to have delayed retention times in the assay. We achieve a cross-validation area under the curve of 0.85 for the receiver operating characteristic curve of our model. The low computational expense and high accuracy of this approach enable real-time assessment of hydrophobic character, allowing prioritization of antibodies during the discovery process and rational engineering to reduce hydrophobic liabilities. Structure data, aligned sequences, experimental data and prediction scores for test-cases, and R scripts used in this work are provided as part of the Supplementary Material. Contact: tushar.jain@adimab.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
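
    The classification step is easy to sketch. The following is a minimal stand-in (assumptions, not the paper's pipeline): a random forest trained on sequence-derived features, here random placeholders for propensity-weighted side-chain exposures, to flag antibodies likely to show delayed HIC retention.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        X = rng.random((300, 25))                      # per-antibody propensity-weighted features
        y = (X[:, :5].sum(axis=1) > 2.5).astype(int)   # synthetic "delayed retention" label

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
        print("cross-validated ROC AUC:", round(scores.mean(), 2))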

  3. Core-Noise Research

    NASA Technical Reports Server (NTRS)

    Hultgren, Lennart S.

    2012-01-01

    This presentation is a technical summary of and outlook for NASA-internal and NASA-sponsored external research on core noise funded by the Fundamental Aeronautics Program Subsonic Fixed Wing (SFW) Project. Sections of the presentation cover: the SFW system-level noise metrics for the 2015 (N+1), 2020 (N+2), and 2025 (N+3) timeframes; SFW strategic thrusts and technical challenges; SFW advanced subsystems that are broadly applicable to N+3 vehicle concepts, with an indication where further noise research is needed; the components of core noise (compressor, combustor and turbine noise) and a rationale for NASA's current emphasis on the combustor-noise component; the increase in the relative importance of core noise due to turbofan design trends; the need to understand and mitigate core-noise sources for high-efficiency small gas generators; and the current research activities in the core-noise area, with additional details given about forthcoming updates to NASA's Aircraft Noise Prediction Program (ANOPP) core-noise prediction capabilities, two NRA efforts (Honeywell International, Phoenix, AZ and University of Illinois at Urbana-Champaign, respectively) to improve the understanding of core-noise sources and noise propagation through the engine core, and an effort to develop oxide/oxide ceramic-matrix-composite (CMC) liners for broadband noise attenuation suitable for turbofan-core application. Core noise must be addressed to ensure that the N+3 noise goals are met. Focused, but long-term, core-noise research is carried out to enable the advanced high-efficiency small gas-generator subsystem, common to several N+3 conceptual designs, needed to meet NASA's technical challenges. Intermediate updates to prediction tools are implemented as the understanding of the source structure and engine-internal propagation effects is improved. The NASA Fundamental Aeronautics Program has the principal objective of overcoming today's national challenges in air transportation. The SFW Quiet-Aircraft Subproject aims to develop concepts and technologies to reduce perceived community noise attributable to aircraft with minimal impact on weight and performance. This reduction of aircraft noise is critical to enabling the anticipated large increase in future air traffic.

  4. The wave-based substructuring approach for the efficient description of interface dynamics in substructuring

    NASA Astrophysics Data System (ADS)

    Donders, S.; Pluymers, B.; Ragnarsson, P.; Hadjit, R.; Desmet, W.

    2010-04-01

    In the vehicle design process, design decisions are more and more based on virtual prototypes. Due to competitive and regulatory pressure, vehicle manufacturers are forced to improve product quality, to reduce time-to-market and to launch an increasing number of design variants on the global market. To speed up the design iteration process, substructuring and component mode synthesis (CMS) methods are commonly used, involving the analysis of substructure models and the synthesis of the substructure analysis results. Substructuring and CMS enable efficient decentralized collaboration across departments and allow engineers to benefit from the availability of parallel computing environments. However, traditional CMS methods become prohibitively inefficient when substructures are coupled along large interfaces, i.e. with a large number of degrees of freedom (DOFs) at the interface between substructures. The reason is that the analysis of substructures involves the calculation of a number of enrichment vectors, one for each interface DOF. Since large interfaces are common in vehicles (e.g. the continuous line connections that join the body with the windshield, roof or floor), this interface bottleneck poses a clear limitation in the vehicle noise, vibration and harshness (NVH) design process, and there is a need to describe the interface dynamics more efficiently. This paper presents a wave-based substructuring (WBS) approach, which reduces the interface representation between substructures in an assembly by expressing the interface DOFs in terms of a limited set of basis functions ("waves"). As the number of basis functions can be much lower than the number of interface DOFs, this greatly facilitates the substructure analysis procedure and results in faster design predictions. The waves are calculated once from a full nominal assembly analysis, but these nominal waves can be re-used for the assembly of modified components. The WBS approach thus enables efficient structural modification predictions of the global modes, so that efficient vibro-acoustic design modification, optimization and robust design become possible. The results show that wave-based substructuring offers a clear benefit for vehicle design modifications, by improving both the speed of component reduction processes and the efficiency and accuracy of design iteration predictions, as compared to conventional substructuring approaches.
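
    The core reduction idea can be shown in a few lines of linear algebra. The sketch below is a minimal stand-in under stated assumptions: interface deflections from a nominal assembly analysis are replaced by random snapshots, and the "waves" are taken as the dominant left singular vectors of those snapshots; this is not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(3)
        n_dof, n_snapshots, n_waves = 400, 30, 8

        snapshots = rng.normal(size=(n_dof, n_snapshots))   # stand-in interface deflection shapes
        W, _, _ = np.linalg.svd(snapshots, full_matrices=False)
        W = W[:, :n_waves]                                  # keep the dominant "waves"

        u = snapshots @ rng.normal(size=n_snapshots)        # a new interface deflection
        q = W.T @ u                                         # generalized wave coordinates
        u_approx = W @ q                                    # reconstructed interface motion

        rel_err = np.linalg.norm(u - u_approx) / np.linalg.norm(u)
        print(f"{n_dof} interface DOFs -> {n_waves} wave coordinates, rel. error {rel_err:.2%}")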

  5. Governing Laws of Complex System Predictability under Co-evolving Uncertainty Sources: Theory and Nonlinear Geophysical Applications

    NASA Astrophysics Data System (ADS)

    Perdigão, R. A. P.

    2017-12-01

    Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics along time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a-priori systematic dynamic evaluation is made of predictability evolution and its challenges, including aspects in the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.

  6. Competitive assessment of aerospace systems using system dynamics

    NASA Astrophysics Data System (ADS)

    Pfaender, Jens Holger

    Aircraft design has recently experienced a trend away from performance-centric design towards a more balanced approach with increased emphasis on engineering an economically successful system. This approach focuses on bringing forward a comprehensive economic and life-cycle cost analysis. Since the success of any system also depends on many external factors outside the control of the designer, this traditionally has been modeled as noise affecting the uncertainty of the design. However, this approach currently lacks a strategic treatment of the necessary early decisions affecting the probability of success of a given concept in a dynamic environment. This suggests that the introduction of a dynamic method into a life-cycle cost analysis should allow the analysis of the future attractiveness of such a concept in the presence of uncertainty. One way of addressing this is through the use of a competitive market model. However, existing market models do not focus on the dynamics of the market. Instead, they focus on modeling and predicting market share through logit regression models, which exhibit relatively poor predictive capabilities. The method proposed here takes a top-down approach that integrates a competitive model based on work in the field of system dynamics into the aircraft design process. Demonstrating such integration, which had not previously been achieved, is one of the primary contributions of this work. The integration is achieved through the use of surrogate models, in this case neural networks. This not only enabled the practical integration of analysis techniques, but also reduced the computational requirements so that interactive exploration as envisioned was actually possible. The example demonstration of this integration is built on the competition in the 250-seat large commercial aircraft market, exemplified by the Boeing 767-400ER and the Airbus A330-200. Both aircraft models were calibrated to existing performance and certification data and then integrated into the system dynamics market model. The market model was then calibrated with historical market data. This calibration showed a much improved predictive capability compared to the conventional logit regression models; an additional advantage of the dynamic model is that no additional explanatory variables were required to realize this improved capability. Furthermore, the resulting market model was integrated into a prediction profiler environment with a time-variant Monte-Carlo analysis, resulting in a unique trade-off environment. This environment was shown to allow interactive trade-offs between aircraft design decisions and economic considerations while allowing exploration of potential market success in the light of varying external market conditions and scenarios. The resulting method reduces decision-support uncertainty and identifies robust design decisions in future scenarios with a high likelihood of occurrence, with special focus on the path-dependent nature of the future implications of decisions. It was also possible to demonstrate the increased importance of design and technology choices for competitiveness in scenarios with drastic increases in commodity prices during the time period modeled. Another use of the existing outputs of the Monte-Carlo analysis was realized by showing them on a multivariate scatter plot.
This plot was then shown, by appropriate grouping of variables, to enable the top-down definition of an aircraft design, also known as inverse design. In other words, the designer can define strategic market and return-on-investment goals for a number of scenarios, for example the development of fuel prices, and then directly see which specific aircraft designs meet these goals.

  7. PROATES, a computer modelling system for power plant: its description and application to heat rate improvement within PowerGen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, C.H.; Ready, A.B.; Rea, J.

    1995-06-01

    Versions of the computer program PROATES (PROcess Analysis for Thermal Energy Systems) have been used since 1979 to analyse plant performance improvement proposals relating to existing plant and also to evaluate new plant designs. Several plant modifications have been made to improve performance based on the model predictions, and the predicted performance has been realised in practice. The program was born out of a need to model the overall steady-state performance of complex plant to enable proposals to change plant component items or operating strategy to be evaluated. To do this with confidence it is necessary to model the multiple thermodynamic interactions between the plant components. The modelling system is modular in concept, allowing the configuration of individual plant components to represent any particular power plant design. A library exists of physics-based modules which have been extensively validated and which provide representations of a wide range of boiler, turbine and CW system components. Changes to model data and construction are achieved via a user-friendly graphical model editing/analysis front-end, with results being presented via the computer screen or hard copy. The paper describes briefly the modelling system but concentrates mainly on the application of the modelling system to assess design re-optimisation, firing with different fuels and the re-powering of an existing plant.

  8. Assessing Water Level Changes in Lake, Reservoir, Wetland, and River Systems with Remote Sensing Tools and Hydrological Model

    NASA Astrophysics Data System (ADS)

    Ricko, M.; Birkett, C. M.; Beckley, B. D.

    2017-12-01

    The NASA/USDA Global Reservoir and Lake Monitor (G-REALM) offers surface water level products derived from multi-mission satellite radar altimetry for a subset of large reservoirs, lakes, and wetlands. These products complement the in situ networks by providing stage information at un-gauged locations and filling existing data gaps. The availability of both satellite-based rainfall (e.g., TRMM, GPCP) and surface water level products offers great opportunities to estimate and monitor additional hydrologic properties of lake/reservoir systems. A simple water balance model relating the net freshwater flux over a catchment basin to the lake/reservoir level has been previously utilized (Ricko et al., 2011). This approach enables the construction of a longer record of surface water level, improving the climate data record. As instrument technology and data availability evolve, the method can be used to estimate the water level of a greater number of water bodies, including much smaller targets. In addition, such information can improve water balance estimation in different lake, reservoir, wetland, and river systems, and be very useful for improved prediction of surface water availability. Connections to climatic variations on inter-annual to inter-decadal time-scales are explored here, with a focus on a future ability to predict changes in storage volume for water resources or natural hazards concerns.
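
    A simple water balance of the kind referenced above can be sketched in a few lines. The form and all parameter values below are illustrative assumptions, not the cited model: lake stage responds to net freshwater flux over the catchment, integrated with a daily Euler step.

        import numpy as np

        rng = np.random.default_rng(4)
        days = 365
        precip = np.clip(rng.normal(3.0, 2.0, days), 0, None)  # mm/day over lake and basin
        evap = np.full(days, 2.5)                               # mm/day evaporative loss
        runoff_ratio, outflow = 0.3, 0.8                        # assumed catchment parameters

        level = np.empty(days)
        level[0] = 100.0                                        # initial stage, mm above datum
        for t in range(1, days):
            net_flux = (precip[t] - evap[t]) * (1 + runoff_ratio) - outflow
            level[t] = level[t - 1] + net_flux                  # 1-day Euler integration

        print(f"end-of-year stage change: {level[-1] - level[0]:+.1f} mm")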

  9. Beamforming applied to surface EEG improves ripple visibility.

    PubMed

    van Klink, Nicole; Mol, Arjen; Ferrier, Cyrille; Hillebrand, Arjan; Huiskamp, Geertjan; Zijlmans, Maeike

    2018-01-01

    Surface EEG can show epileptiform ripples in people with focal epilepsy, but identification is impeded by the low signal-to-noise ratio of the electrode recordings. We used beamformer-based virtual electrodes to improve ripple identification. We analyzed ten minutes of interictal EEG of nine patients with refractory focal epilepsy. EEGs with more than 60 channels and 20 spikes were included. We computed ∼79 virtual electrodes using a scalar beamformer and marked ripples (80-250 Hz) co-occurring with spikes in physical and virtual electrodes. Ripple numbers in physical and virtual electrodes were compared, and sensitivity and specificity of ripples for the region of interest (ROI; based on clinical information) were determined. Five patients had ripples in the physical electrodes and eight in the virtual electrodes, with more ripples in virtual than in physical electrodes (101 vs. 57, p = .007). Ripples in virtual electrodes predicted the ROI better than physical electrodes (AUC 0.65 vs. 0.56, p = .03). Beamforming increased ripple visibility in surface EEG. Virtual ripples predicted the ROI better than physical ripples, although sensitivity was still poor. Beamforming can facilitate ripple identification in EEG. Ripple localization needs to be improved to enable its use for presurgical evaluation in people with epilepsy. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
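
    The virtual-electrode construction can be illustrated with generic LCMV unit-gain beamformer weights; this is a textbook formulation under stated assumptions (random stand-in data, a single assumed leadfield column), not the authors' exact pipeline.

        import numpy as np

        rng = np.random.default_rng(5)
        n_chan, n_samples = 64, 5000
        eeg = rng.normal(size=(n_chan, n_samples))         # stand-in surface EEG (channels x time)
        leadfield = rng.normal(size=(n_chan, 1))           # forward-model column for one source

        C = np.cov(eeg)                                    # sensor covariance
        C_inv = np.linalg.pinv(C + 1e-6 * np.eye(n_chan))  # regularized inverse

        # LCMV weights w = C^-1 l / (l^T C^-1 l): unit gain at the target source,
        # minimum variance (interference) from everywhere else.
        w = (C_inv @ leadfield) / (leadfield.T @ C_inv @ leadfield)

        virtual_electrode = (w.T @ eeg).ravel()            # time course at the virtual electrode
        print("virtual-electrode samples:", virtual_electrode.shape)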

  10. Short-term quality of life after subthalamic stimulation depends on non-motor symptoms in Parkinson's disease.

    PubMed

    Dafsari, Haidar Salimi; Weiß, Luisa; Silverdale, Monty; Rizos, Alexandra; Reddy, Prashanth; Ashkan, Keyoumars; Evans, Julian; Reker, Paul; Petry-Schmelzer, Jan Niklas; Samuel, Michael; Visser-Vandewalle, Veerle; Antonini, Angelo; Martinez-Martin, Pablo; Ray-Chaudhuri, K; Timmermann, Lars

    2018-02-24

    Subthalamic nucleus (STN) deep brain stimulation (DBS) improves quality of life (QoL), motor, and non-motor symptoms (NMS) in advanced Parkinson's disease (PD). However, considerable inter-individual variability has been observed for QoL outcome. We hypothesized that demographic and preoperative NMS characteristics can predict postoperative QoL outcome. In this ongoing, prospective, multicenter study (Cologne, Manchester, London) including 88 patients, we collected the following scales preoperatively and on follow-up 6 months postoperatively: PDQuestionnaire-8 (PDQ-8), NMSScale (NMSS), NMSQuestionnaire (NMSQ), Scales for Outcomes in PD (SCOPA)-motor examination, -complications, and -activities of daily living, levodopa equivalent daily dose. We dichotomized patients into "QoL responders"/"non-responders" and screened for factors associated with QoL improvement with (1) Spearman-correlations between baseline test scores and QoL improvement, (2) step-wise linear regressions with baseline test scores as independent and QoL improvement as dependent variables, (3) logistic regressions using aforementioned "responders/non-responders" as dependent variable. All outcomes improved significantly on follow-up. However, approximately 44% of patients were categorized as "QoL non-responders". Spearman-correlations, linear and logistic regression analyses were significant for NMSS and NMSQ but not for SCOPA-motor examination. Post-hoc, we identified specific NMS (flat moods, difficulties experiencing pleasure, pain, bladder voiding) as significant contributors to QoL outcome. Our results provide evidence that QoL improvement after STN-DBS depends on preoperative NMS characteristics. These findings are important in the advising and selection of individuals for DBS therapy. Future studies investigating motor and non-motor PD clusters may enable stratifying QoL outcomes and help predict patients' individual prospects of benefiting from DBS. Copyright © 2018. Published by Elsevier Inc.

  11. Assessing stability and performance of a digitally enabled supply chain: Retrospective of a pilot in Uttar Pradesh, India.

    PubMed

    Gilbert, Sarah Skye; Thakare, Neeraj; Ramanujapuram, Arun; Akkihal, Anup

    2017-04-19

    Immunization supply chains in low resource settings do not always reach children with necessary vaccines. Digital information systems can enable real time visibility of inventory and improve vaccine availability. In 2014, a digital, mobile/web-based information system was implemented in two districts of Uttar Pradesh, India. This retrospective investigates improvements and stabilization of supply chain performance following introduction of the digital information system. All data were collected via the digital information system between March 2014 and September 2015. Data included metadata and transaction logs providing information about users, facilities, and vaccines. Metrics evaluated include adoption (system access, timeliness and completeness), data quality (error rates), and performance (stock availability on immunization session days, replenishment response duration, rate of zero stock events). Stability was defined as the phase in which quality and performance metrics achieved equilibrium rates with minimal volatility. The analysis compared performance across different facilities and vaccines. Adoption appeared sufficiently high from the onset to commence stability measures of data quality and supply chain performance. Data quality stabilized from month 3 onwards, and supply chain performance stabilized from month 13 onwards. For data quality, error rates reduced by two thirds post stabilization. Although vaccine availability remained high throughout the pilot, the three lowest-performing facilities improved from 91.05% pre-stability to 98.70% post-stability (p<0.01; t-test). Average replenishment duration (as a corrective response to stock-out events) decreased 52.3% from 4.93 days to 2.35 days (p<0.01; t-test). Diphtheria-tetanus-pertussis vaccine was significantly less likely to be stocked out than any other material. The results suggest that given sufficient adoption, stability is sequentially achieved, beginning with data quality, and then performance. Identifying when a pilot stabilizes can enable more predictable, reliable cost estimates and outcome forecasts in the scale-up phase. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Advanced Concepts, Technologies and Flight Experiments for NASA's Earth Science Enterprise

    NASA Technical Reports Server (NTRS)

    Meredith, Barry D.

    2000-01-01

    Over the last 25 years, NASA Langley Research Center (LaRC) has established a tradition of excellence in scientific research and leading-edge system developments, which have contributed to improved scientific understanding of our Earth system. Specifically, LaRC advances knowledge of atmospheric processes to enable proactive climate prediction and, in that role, develops first-of-a-kind atmospheric sensing capabilities that permit a variety of new measurements to be made within a constrained enterprise budget. These advances are enabled by the timely development and infusion of new, state-of-the-art (SOA), active and passive instrument and sensor technologies. In addition, LaRC's center-of-excellence in structures and materials is being applied to the technological challenges of reducing measurement system size, mass, and cost through the development and use of space-durable materials; lightweight, multi-functional structures; and large deployable/inflatable structures. NASA Langley is engaged in advancing these technologies across the full range of readiness levels from concept, to components, to prototypes, to flight experiments, and on to actual science mission infusion. The purpose of this paper is to describe current activities and capabilities, recent achievements, and future plans of the integrated science, engineering, and technology team at Langley Research Center who are working to enable the future of NASA's Earth Science Enterprise.

  13. High Fidelity System Simulation of Multiple Components in Support of the UEET Program

    NASA Technical Reports Server (NTRS)

    Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton

    2006-01-01

    The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework. The scope emphasized the High Pressure Turbine and High Pressure Compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry, and was extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feed-forward" and "feedback" approaches to analysis zooming were employed to enable system simulation capability in NPSS. These approaches have certain benefits and applicability for specific applications: "feedback" zooming allows the flow-up of information from high-fidelity analysis to be used to update the NPSS model results by forcing the NPSS solver to converge to the high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used in circumstances where there is a clear physics-based strategy to flow up the high-fidelity analysis results to update the NPSS system model. The "feed-forward" zooming approach is more broadly useful for enabling detailed analysis at early stages of design for a specified set of critical operating points and using these analysis results to drive design decisions early in the development process.

  14. Reduce Manual Curation by Combining Gene Predictions from Multiple Annotation Engines, a Case Study of Start Codon Prediction

    PubMed Central

    Ederveen, Thomas H. A.; Overmars, Lex; van Hijum, Sacha A. F. T.

    2013-01-01

    Nowadays, prokaryotic genomes are sequenced faster than the capacity to manually curate gene annotations. Automated genome annotation engines (AGEs) provide users a straightforward and complete solution for predicting ORF coordinates and function. For many labs, the use of AGEs is therefore essential to decrease the time necessary for annotating a given prokaryotic genome. However, it is not uncommon for AGEs to provide different and sometimes conflicting predictions, and combining multiple AGEs might allow for more accurate predictions. Here we analyzed the ab initio open reading frame (ORF) calling performance of different AGEs based on curated genome annotations of eight strains from different bacterial species with GC% ranging from 35–52%. We present a case study which demonstrates a novel way of comparative genome annotation, using combinations of AGEs in a pre-defined order (or path) to predict ORF start codons. The AGE combinations are ordered from high to low specificity, where the specificity is based on the eight genome annotations. For each AGE combination we derive a so-called projected confidence value, the average specificity of ORF start codon prediction across the eight genomes. The projected confidence enables estimating the likelihood of a correct prediction for a particular ORF start codon by a particular AGE combination, pinpointing ORFs whose start codons are notoriously difficult to predict. We correctly predict start codons for 90.5±4.8% of the genes in a genome (based on the eight genomes) with an accuracy of 81.1±7.6%. Our consensus-path methodology allows a marked improvement over majority voting (9.7±4.4%): with an optimal path, ORF start prediction sensitivity is gained while maintaining a high specificity. PMID:23675487
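
    The consensus-path idea is simple to sketch. Below, the engine names, coordinates, and confidence values are hypothetical placeholders (the real path and confidences are derived from the eight curated genomes): AGE combinations are tried in order of decreasing projected confidence, and the first combination whose required engines agree supplies the start codon.

        # Each path entry: (engines that must agree, projected confidence).
        PATH = [
            ({"engineA", "engineB", "engineC"}, 0.95),
            ({"engineA", "engineB"}, 0.90),
            ({"engineB", "engineC"}, 0.86),
            ({"engineA"}, 0.78),
        ]

        def consensus_start(predictions):
            """predictions maps engine name -> predicted start coordinate for one ORF."""
            for engines, confidence in PATH:
                coords = {predictions.get(e) for e in engines}
                if None not in coords and len(coords) == 1:  # all required engines agree
                    return coords.pop(), confidence
            return None, 0.0                                  # no combination agreed

        start, conf = consensus_start({"engineA": 1042, "engineB": 1042, "engineC": 1042})
        print(f"start codon at {start} (projected confidence {conf:.0%})")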

  15. From Experiments to Simulations: Downscaling Measurements of Na+ Distribution at the Root-Soil Interface

    NASA Astrophysics Data System (ADS)

    Perelman, A.; Guerra, H. J.; Pohlmeier, A. J.; Vanderborght, J.; Lazarovitch, N.

    2017-12-01

    When salinity increases beyond a certain threshold, crop yield will decrease at a fixed rate, according to the Maas and Hoffman model (1976). Thus, it is highly important to predict salinization and its impact on crops. Current models do not consider the impact of the transpiration rate on plant salt tolerance, although it affects plant water uptake and thus salt accumulation around the roots, consequently influencing the plant's sensitivity to salinity. Better model parametrization can improve the prediction of real salinity effects on crop growth and yield. The aim of this research is to study Na+ distribution around roots at different scales using different non-invasive methods, and to examine how this distribution is affected by the transpiration rate and plant water uptake. Results from tomato plants that were grown on rhizoslides (a capillary paper growth system) showed that the Na+ concentration was higher at the root-substrate interface than in the bulk. Also, Na+ accumulation around the roots decreased under a low transpiration rate, supporting our hypothesis. The rhizoslides enabled the root growth rate and architecture to be studied under different salinity levels. The root system architecture was retrieved from photos taken during the experiment, enabling us to incorporate real root systems into a simulation. Magnetic resonance imaging (MRI) was used to observe correlations between root system architectures and Na+ distribution. The MRI provided fine resolution of the Na+ accumulation around a single root without disturbing the root system. With time, Na+ accumulated only where roots were found in the soil and later around specific roots. Rhizoslides allow the root systems of larger plants to be investigated, but this method is limited by the medium (paper) and the dimension (2D). The MRI can create a 3D image of Na+ accumulation in soil on a microscopic scale. These data are being used for model calibration, which is expected to enable the prediction of root water uptake in saline soils for different climatic conditions and different soil water availabilities.

  16. ESR Analysis of Polymer Photo-Oxidation

    NASA Technical Reports Server (NTRS)

    Kim, Soon Sam; Liang, Ranty Hing; Tsay, Fun-Dow; Gupta, Amitave

    1987-01-01

    Electron-spin resonance identifies polymer-degradation reactions and their kinetics. New technique enables derivation of kinetic model of specific chemical reactions involved in degradation of particular polymer. Detailed information provided by new method enables prediction of aging characteristics long before manifestation of macroscopic mechanical properties.

  17. Bayesian analysis of rare events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
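
    A minimal sketch of the rejection-sampling view that BUS reinterprets, assuming a one-dimensional Gaussian prior and likelihood: the acceptance event {u <= L(theta)/c} is exactly the "failure" domain whose (possibly very small) probability FORM, IS, or SuS would then estimate.

```python
# Rejection-sampling view underlying BUS (assumed 1-D Gaussian prior and
# likelihood; c must bound the unnormalized likelihood from above).
import numpy as np

rng = np.random.default_rng(0)

def likelihood(theta, obs=1.5, sigma=0.5):
    return np.exp(-0.5 * ((obs - theta) / sigma) ** 2)  # unnormalized, max = 1

c = 1.0  # upper bound on the likelihood

theta = rng.normal(0.0, 1.0, size=200_000)   # samples from the prior
u = rng.uniform(size=theta.size)
accept = u <= likelihood(theta) / c          # the "failure" event of BUS

posterior = theta[accept]                    # accepted draws follow the posterior
print("acceptance prob:", accept.mean())     # what FORM/IS/SuS would estimate
print("posterior mean :", posterior.mean())
```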

  18. Visualization of the Invisible, Explanation of the Unknown, Ruggedization of the Unstable: Sensitivity Analysis, Virtual Tryout and Robust Design through Systematic Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Zwickl, Titus; Carleer, Bart; Kubli, Waldemar

    2005-08-01

    In the past decade, sheet metal forming simulation became a well-established tool to predict the formability of parts. In the automotive industry, this has enabled significant reductions in the cost and time of vehicle design and development, and has helped to improve the quality and performance of vehicle parts. However, production stoppages for troubleshooting and unplanned die maintenance, as well as production quality fluctuations, continue to plague manufacturing cost and time. The focus therefore has shifted in recent times beyond mere feasibility to robustness of the product and process being engineered. Ensuring robustness is the next big challenge for the virtual tryout / simulation technology. We introduce new methods, based on systematic stochastic simulations, to visualize the behavior of the part during the whole forming process, in simulation as well as in production. Sensitivity analysis explains the response of the part to changes in influencing parameters. Virtual tryout allows quick exploration of changed designs and conditions. Robust design and manufacturing guarantees quality and process capability for the production process. While conventional simulations helped to reduce development time and cost by ensuring feasible processes, robustness engineering tools have the potential for far greater cost and time savings. Through examples we illustrate how expected and unexpected behavior of deep drawing parts may be tracked down, identified, and assigned to the influential parameters. With this knowledge, defects can be eliminated or springback compensated, for example; the response of the part to uncontrollable noise can be predicted and minimized. The newly introduced methods enable more reliable and predictable stamping processes in general.
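
    A generic Monte Carlo sensitivity sketch in the same spirit; the response function below is an invented stand-in for a full finite-element forming simulation.

```python
# Scatter the influencing parameters, run the (stand-in) forming simulation,
# and correlate each input with the response of interest.
import numpy as np

rng = np.random.default_rng(0)
n = 500
blank_thickness = rng.normal(1.0, 0.02, n)   # mm
friction = rng.normal(0.10, 0.01, n)
yield_stress = rng.normal(180.0, 8.0, n)     # MPa

def forming_response(t, mu, sy):
    """Stand-in for an FE simulation: returns a springback-like response."""
    return 2.0 - 1.5 * t + 4.0 * mu + 0.01 * sy

springback = forming_response(blank_thickness, friction, yield_stress)

for name, x in [("thickness", blank_thickness), ("friction", friction),
                ("yield stress", yield_stress)]:
    r = np.corrcoef(x, springback)[0, 1]     # rank of influence on the response
    print(f"{name}: correlation with springback = {r:.2f}")
```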

  19. Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado

    NASA Astrophysics Data System (ADS)

    Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.

    2015-12-01

    Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface - infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as the urban population is expected to continue increasing by 1.84% per year for the next 25 years, increasing the vulnerability of urban regions to damage and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national-scale hydrologic modeling enabled by supercomputing resources to understand larger system-wide hydrologic impacts and feedbacks. As such, it is important to address how urban landscapes can be represented in large-scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve flow prediction and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling the standard data needs necessary for fine-scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an urban annual population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high-resolution urban models for short-term flood prediction.

  20. High Spatiotemporal Resolution ECoG Recording of Somatosensory Evoked Potentials with Flexible Micro-Electrode Arrays.

    PubMed

    Kaiju, Taro; Doi, Keiichi; Yokota, Masashi; Watanabe, Kei; Inoue, Masato; Ando, Hiroshi; Takahashi, Kazutaka; Yoshida, Fumiaki; Hirata, Masayuki; Suzuki, Takafumi

    2017-01-01

    Electrocorticogram (ECoG) has great potential as a source signal, especially for clinical BMI. Until recently, ECoG electrodes were commonly used for identifying epileptogenic foci in clinical situations, and such electrodes were low-density and large. Increasing the number and density of recording channels could enable the collection of richer motor/sensory information, and may enhance the precision of decoding and increase opportunities for controlling external devices. Several reports have aimed to increase the number and density of channels. However, few studies have discussed the actual validity of high-density ECoG arrays. In this study, we developed novel high-density flexible ECoG arrays and conducted decoding analyses with monkey somatosensory evoked potentials (SEPs). Using MEMS technology, we made 96-channel Parylene electrode arrays with an inter-electrode distance of 700 μm and recording site area of 350 μm². The arrays were mainly placed onto the finger representation area in the somatosensory cortex of the macaque, and partially inserted into the central sulcus. With electrical finger stimulation, we successfully recorded and visualized finger SEPs with a high spatiotemporal resolution. We conducted offline analyses in which the stimulated fingers and intensity were predicted from recorded SEPs using a support vector machine. We obtained the following results: (1) Very high accuracy (~98%) was achieved with just a short segment of data (~15 ms from stimulus onset). (2) High accuracy (~96%) was achieved even when only a single channel was used. This result indicated placement optimality for decoding. (3) Higher channel counts generally improved prediction accuracy, but the efficacy was small for predictions with feature vectors that included time-series information. These results suggest that ECoG signals with high spatiotemporal resolution could enable greater decoding precision or external device control.
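
    The offline decoding step can be illustrated with a linear SVM on synthetic stand-in data; the channel count and window length below only mirror the study's setup, and the simulated SEPs are not real recordings.

```python
# Classify which finger was stimulated from a short SEP window with a linear
# SVM (synthetic data standing in for multi-channel ECoG time series).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 300, 96, 15   # ~15 ms window, 96 channels
fingers = rng.integers(0, 5, size=n_trials)     # which finger was stimulated

# Synthetic SEPs: each finger biases a different subset of channels.
X = rng.normal(size=(n_trials, n_channels, n_samples))
for i, f in enumerate(fingers):
    X[i, f * 19:(f + 1) * 19, :] += 1.0
X = X.reshape(n_trials, -1)                     # flatten to feature vectors

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
print("decoding accuracy:", cross_val_score(clf, X, fingers, cv=5).mean())
```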

  1. High Spatiotemporal Resolution ECoG Recording of Somatosensory Evoked Potentials with Flexible Micro-Electrode Arrays

    PubMed Central

    Kaiju, Taro; Doi, Keiichi; Yokota, Masashi; Watanabe, Kei; Inoue, Masato; Ando, Hiroshi; Takahashi, Kazutaka; Yoshida, Fumiaki; Hirata, Masayuki; Suzuki, Takafumi

    2017-01-01

    Electrocorticogram (ECoG) has great potential as a source signal, especially for clinical BMI. Until recently, ECoG electrodes were commonly used for identifying epileptogenic foci in clinical situations, and such electrodes were low-density and large. Increasing the number and density of recording channels could enable the collection of richer motor/sensory information, and may enhance the precision of decoding and increase opportunities for controlling external devices. Several reports have aimed to increase the number and density of channels. However, few studies have discussed the actual validity of high-density ECoG arrays. In this study, we developed novel high-density flexible ECoG arrays and conducted decoding analyses with monkey somatosensory evoked potentials (SEPs). Using MEMS technology, we made 96-channel Parylene electrode arrays with an inter-electrode distance of 700 μm and recording site area of 350 μm². The arrays were mainly placed onto the finger representation area in the somatosensory cortex of the macaque, and partially inserted into the central sulcus. With electrical finger stimulation, we successfully recorded and visualized finger SEPs with a high spatiotemporal resolution. We conducted offline analyses in which the stimulated fingers and intensity were predicted from recorded SEPs using a support vector machine. We obtained the following results: (1) Very high accuracy (~98%) was achieved with just a short segment of data (~15 ms from stimulus onset). (2) High accuracy (~96%) was achieved even when only a single channel was used. This result indicated placement optimality for decoding. (3) Higher channel counts generally improved prediction accuracy, but the efficacy was small for predictions with feature vectors that included time-series information. These results suggest that ECoG signals with high spatiotemporal resolution could enable greater decoding precision or external device control. PMID:28442997

  2. A new MRI land surface model HAL

    NASA Astrophysics Data System (ADS)

    Hosaka, M.

    2011-12-01

    A new land surface model, HAL, has been developed for MRI-ESM1. It is used for the CMIP simulations. HAL consists of three submodels in the current version: SiByl (vegetation), SNOWA (snow), and SOILA (soil). It also contains a land coupler, LCUP, which connects the submodels and an atmospheric model. The vegetation submodel SiByl has surface vegetation processes similar to JMA/SiB (Sato et al. 1987, Hirai et al. 2007). SiByl has 2 vegetation layers (canopy and grass) and calculates heat, moisture, and momentum fluxes between the land surface and the atmosphere. The snow submodel SNOWA can have any number of snow layers; the maximum is set to 8 for the CMIP5 experiments. Temperature, SWE, density, grain size, and the aerosol deposition contents of each layer are predicted. The snow properties, including the grain size, are predicted from snow metamorphism processes (Niwano et al., 2011), and the snow albedo is diagnosed from the aerosol mixing ratio, the snow properties, and the temperature (Aoki et al., 2011). The soil submodel SOILA can also have any number of soil layers, and is composed of 14 soil layers in the CMIP5 experiments. The temperature of each layer is predicted by solving heat conduction equations. The soil moisture is predicted by solving the Darcy equation, in which hydraulic conductivity depends on the soil moisture. The land coupler LCUP is designed to enable complicated configurations of the submodels. HAL can include competing submodels (precise, detailed ones alongside simpler ones), and they can run in the same simulation. LCUP enables a 2-step model validation: in the first step, we compare the results of the detailed submodels directly with in-situ observations; in the second step, we compare them with the results of the simpler submodels. When the performance of the detailed submodels is good, we can improve the simpler ones by using the detailed ones as reference models.
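
    As a toy illustration of the heat-conduction step the soil submodel solves, the sketch below marches an explicit finite-difference scheme over 14 layers; all material values are placeholders, not HAL's.

```python
# Explicit finite-difference soil heat conduction over 14 layers (as in the
# CMIP5 setup); diffusivity, layer thickness, and boundary values are assumed.
import numpy as np

n_layers, dz, dt = 14, 0.1, 60.0   # layer thickness (m), time step (s)
alpha = 5e-7                       # thermal diffusivity (m^2/s), placeholder
T = np.full(n_layers, 283.0)       # initial soil temperature profile (K)
T_surface = 290.0                  # upper boundary supplied by the atmosphere

r = alpha * dt / dz**2             # stability requires r < 0.5 (here 0.003)
for _ in range(1000):              # march forward in time
    Tn = T.copy()
    Tn[0] += r * (T_surface - 2 * T[0] + T[1])        # surface boundary
    Tn[1:-1] += r * (T[:-2] - 2 * T[1:-1] + T[2:])    # interior layers
    Tn[-1] += r * (T[-2] - T[-1])                     # zero-flux bottom boundary
    T = Tn

print("soil temperature profile (K):", T.round(2))
```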

  3. Ares I-X Launch Abort System, Crew Module, and Upper Stage Simulator Vibroacoustic Flight Data Evaluation, Comparison to Predictions, and Recommendations for Adjustments to Prediction Methodology and Assumptions

    NASA Technical Reports Server (NTRS)

    Smith, Andrew; Harrison, Phil

    2010-01-01

    The National Aeronautics and Space Administration (NASA) Constellation Program (CxP) has identified a series of tests to provide insight into the design and development of the Crew Launch Vehicle (CLV) and Crew Exploration Vehicle (CEV). Ares I-X was selected as the first suborbital development flight test to help meet CxP objectives. The Ares I-X flight test vehicle (FTV) is an early operational model of the CLV, with specific emphasis on the CLV and ground operation characteristics necessary to meet Ares I-X flight test objectives. The in-flight part of the test includes a trajectory to simulate maximum dynamic pressure during flight and a stage separation of the Upper Stage Simulator (USS) from the First Stage (FS). The in-flight test also includes recovery of the FS. The random vibration response from the Ares I-X flight will be reconstructed for a few specific locations that were instrumented with accelerometers. These recorded data will be helpful in validating and refining vibration prediction tools and methodology. Measured vibroacoustic environments associated with the liftoff and ascent phases of the Ares I-X mission will be compared with pre-flight vibration predictions. The measured flight data were provided as time histories, which will be converted into power spectral density plots for comparison with the maximum predicted environments. The maximum predicted environments are documented in the Vibroacoustics and Shock Environment Data Book, AI1-SYS-ACOv4.10. Vibration predictions made using the statistical energy analysis (SEA) program VAOne will also be incorporated in the comparisons. Ascent and liftoff measured acoustics will also be compared to predictions to assess whether any discrepancies between the predicted and measured vibration levels are attributable to inaccurate acoustic predictions. These comparisons will also help in assessing whether adjustments to prediction methodologies are needed to improve agreement between the predicted and measured flight data. Future assessments will incorporate hybrid methods in the VAOne analysis (i.e., boundary element methods (BEM) and finite element methods (FEM)). These hybrid methods will enable importing NASTRAN models, providing much more detailed modeling of the underlying beams and support structure of the Ares I-X test vehicle. Measured acoustic data will be incorporated into these analyses to improve correlation for additional post-flight analysis.
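
    The time-history-to-PSD conversion is standard; a minimal sketch using Welch's method on a synthetic acceleration record (real inputs would come from flight telemetry) might look like this.

```python
# Convert a measured acceleration time history into a power spectral density
# for comparison with a maximum predicted environment (synthetic signal).
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 10_000.0                                 # sample rate in Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
accel = np.random.default_rng(0).normal(scale=2.0, size=t.size)  # accel in g

# Welch's method averages windowed periodograms for a stable PSD estimate.
f, psd = welch(accel, fs=fs, nperseg=4096)    # PSD in g^2/Hz
print("overall GRMS:", np.sqrt(trapezoid(psd, f)))
```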

  4. Some modifications of Newton's method for the determination of the steady-state response of nonlinear oscillatory circuits

    NASA Astrophysics Data System (ADS)

    Grosz, F. B., Jr.; Trick, T. N.

    1982-07-01

    It is proposed that nondominant states should be eliminated from the Newton algorithm in the steady-state analysis of nonlinear oscillatory systems. This technique not only improves convergence, but also reduces the size of the sensitivity matrix so that less computation is required for each iteration. One or more periods of integration should be performed after each periodic state estimation before the sensitivity computations are made for the next periodic state estimation. These extra periods of integration between Newton iterations are found to allow the fast states due to parasitic effects to settle, which enables the Newton algorithm to make a better prediction. In addition, the reliability of the algorithm is improved in high Q oscillator circuits by both local and global damping in which the amount of damping is proportional to the difference between the initial and final state values.
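
    A toy version of the modified shooting method: Newton iteration on the period map, with extra free-running periods of integration between updates so fast parasitic states settle. The two-state forced oscillator below is illustrative, not one of the paper's circuits.

```python
# Shooting method for the periodic steady state: find x0 with F(x0) = 0, where
# F(x0) = (state after one period) - x0, using finite-difference sensitivities.
import numpy as np
from scipy.integrate import solve_ivp

T = 1.0  # forcing period

def rhs(t, x):
    # Forced, lightly damped oscillator; x[1] plays the role of a fast state.
    return [x[1], -x[0] - 0.1 * x[1] + np.cos(2 * np.pi * t / T)]

def period_map(x0):
    """Integrate one full period starting from state x0."""
    return solve_ivp(rhs, (0, T), x0, rtol=1e-9, atol=1e-12).y[:, -1]

def residual(x0):
    return period_map(x0) - x0  # zero exactly at the periodic steady state

x = np.zeros(2)
for _ in range(8):
    x = period_map(period_map(x))   # extra settling periods between updates
    F = residual(x)
    J = np.empty((2, 2))            # finite-difference sensitivity matrix
    for j in range(2):
        xp = x.copy()
        xp[j] += 1e-6
        J[:, j] = (residual(xp) - F) / 1e-6
    x = x - np.linalg.solve(J, F)   # Newton update of the periodic state

print("periodic initial state:", x)
```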

  5. Telescope Scientist on the Advanced X-ray Astrophysics Observatory

    NASA Technical Reports Server (NTRS)

    VanSpeybroeck, L.; Smith, Carl M. (Technical Monitor)

    2002-01-01

    This period included many scientific observations made with the Chandra Observatory. The results, as is well known, are spectacular. Fortunately, the High Resolution Mirror Assembly (HRMA) performance continues to be essentially identical to that predicted from ground calibration data. The Telescope Scientist Team has improved the mirror model to provide a more accurate description to the Chandra observers and enable them to reduce the systematic errors and uncertainties in their data reduction. We also have made considerable progress in improving the scattering model. There also has been progress in the scientific program. At this time 58 distant clusters of galaxies have been observed. We are performing a systematic analysis of this rather large data set for the purpose of determining absolute distances utilizing the Sunyaev Zel'dovich effect. These observations also have been used to study the evolution of the cluster baryon mass function and the cosmological constraints which result from this evolution.

  6. Ultrathin Injectable Sensors of Temperature, Thermal Conductivity, and Heat Capacity for Cardiac Ablation Monitoring

    PubMed Central

    Koh, Ahyeon; Gutbrod, Sarah R.; Meyers, Jason D.; Lu, Chaofeng; Webb, Richard Chad; Shin, Gunchul; Li, Yuhang; Kang, Seung-Kyun; Huang, Yonggang

    2016-01-01

    Knowledge of the distributions of temperature in cardiac tissue during and after ablation is important in advancing a basic understanding of this process, and for improving its efficacy in treating arrhythmias. Technologies that enable real-time temperature detection and thermal characterization in the transmural direction can help to predict the depths and sizes of lesion that form. Herein, materials and designs for an injectable device platform that supports precision sensors of temperature and thermal transport properties distributed along the length of an ultrathin and flexible needle-type polymer substrate are introduced. The resulting system can insert into the myocardial tissue, in a minimally invasive manner, to monitor both radiofrequency ablation and cryoablation, in a manner that has no measurable effects on the natural mechanical motions of the heart. The measurement results exhibit excellent agreement with thermal simulations, thereby providing improved insights into lesion transmurality. PMID:26648177

  7. Ovulation induction in normogonadotropic anovulation (PCOS).

    PubMed

    van Santbrink, Evert J P; Fauser, Bart C J M

    2006-06-01

    Treatment of normogonadotropic anovulatory infertility (World Health Organization class 2, or WHO2) is by induction of ovulation using clomiphene citrate (CC), followed by follicle-stimulating hormone (FSH) in cases of treatment failure. Not all patients will become ovulatory or will conceive with this treatment. Others, exhibiting multifollicular instead of monofollicular development, may encounter complications such as ovarian hyperstimulation and multiple pregnancy. Recently introduced alternative treatment interventions, such as insulin-sensitizing drugs, aromatase inhibitors, or laparoscopic electrocautery of the ovaries, may offer the possibility of improving the efficacy of the classical ovulation induction algorithm. Based on initial patient characteristics, it may be possible to identify specific patient subgroups with altered chances of success or complications while using one of these interventions. Regarding CC and FSH ovulation induction, this has been performed using multivariate prediction models. This approach may enable us to improve safety, cost-effectiveness, and patient convenience in future ovulation induction.

  8. Low-Cost CdTe/Silicon Tandem Solar Cells

    DOE PAGES

    Tamboli, Adele C.; Bobela, David C.; Kanevce, Ana; ...

    2017-09-06

    Achieving higher photovoltaic efficiency in single-junction devices is becoming increasingly difficult, but tandem modules offer the possibility of significant efficiency improvements. Through device modeling, we show that four-terminal CdTe/Si tandem solar modules offer the prospect of 25%-30% module efficiency, and technoeconomic analysis predicts that these efficiency gains can be realized at costs per Watt that are competitive with CdTe and Si single-junction alternatives. The cost per Watt of the modeled tandems is lower than that of crystalline silicon, but slightly higher than that of CdTe alone. However, these higher-power modules reduce area-related balance-of-system costs, providing increased value especially in area-constrained applications. This avenue for high-efficiency photovoltaics enables improved performance on a near-term timeframe, as well as a path to further reduced levelized cost of electricity as module and cell processes continue to advance.

  9. Low-Cost CdTe/Silicon Tandem Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamboli, Adele C.; Bobela, David C.; Kanevce, Ana

    Achieving higher photovoltaic efficiency in single-junction devices is becoming increasingly difficult, but tandem modules offer the possibility of significant efficiency improvements. Through device modeling, we show that four-terminal CdTe/Si tandem solar modules offer the prospect of 25%-30% module efficiency, and technoeconomic analysis predicts that these efficiency gains can be realized at costs per Watt that are competitive with CdTe and Si single-junction alternatives. The cost per Watt of the modeled tandems is lower than that of crystalline silicon, but slightly higher than that of CdTe alone. However, these higher-power modules reduce area-related balance-of-system costs, providing increased value especially in area-constrained applications. This avenue for high-efficiency photovoltaics enables improved performance on a near-term timeframe, as well as a path to further reduced levelized cost of electricity as module and cell processes continue to advance.

  10. CEMCAN Software Enhanced for Predicting the Properties of Woven Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.

    2000-01-01

    Major advancements are needed in current high-temperature materials to meet the requirements of future space and aeropropulsion structural components. Ceramic matrix composites (CMC's) are one class of materials that are being evaluated as candidate materials for many high-temperature applications. Past efforts to improve the performance of CMC's focused primarily on improving the properties of the fiber, interfacial coatings, and matrix constituents as individual phases. Design and analysis tools must take into consideration the complex geometries, microstructures, and fabrication processes involved in these composites and must allow the composite properties to be tailored for optimum performance. Major accomplishments during the past year include the development and inclusion of woven CMC micromechanics methodology into the CEMCAN (Ceramic Matrix Composites Analyzer) computer code. The code enables one to calibrate a consistent set of constituent properties as a function of temperature with the aid of experimentally measured data.

  11. The effect of action video game playing on sensorimotor learning: Evidence from a movement tracking task.

    PubMed

    Gozli, Davood G; Bavelier, Daphne; Pratt, Jay

    2014-10-12

    Research on the impact of action video game playing has revealed performance advantages on a wide range of perceptual and cognitive tasks. It is not known, however, if playing such games confers similar advantages in sensorimotor learning. To address this issue, the present study used a manual motion-tracking task that allowed for a sensitive measure of both accuracy and improvement over time. When the target motion pattern was consistent over trials, gamers improved at a faster rate and eventually outperformed non-gamers. Performance between the two groups, however, did not differ initially. When the target motion was inconsistent, changing on every trial, results revealed no difference between gamers and non-gamers. Together, our findings suggest that video game playing confers no reliable benefit in sensorimotor control, but it does enhance sensorimotor learning, enabling superior performance in tasks with consistent and predictable structure. Copyright © 2014. Published by Elsevier B.V.

  12. Thrombus Formation at High Shear Rates.

    PubMed

    Casa, Lauren D C; Ku, David N

    2017-06-21

    The final common pathway in myocardial infarction and ischemic stroke is occlusion of blood flow from a thrombus forming under high shear rates in arteries. A high-shear thrombus forms rapidly and is distinct from the slow formation of coagulation that occurs in stagnant blood. Thrombosis at high shear rates depends primarily on the long protein von Willebrand factor (vWF) and platelets, with hemodynamics playing an important role in each stage of thrombus formation, including vWF binding, platelet adhesion, platelet activation, and rapid thrombus growth. The prediction of high-shear thrombosis is a major area of biofluid mechanics in which point-of-care testing and computational modeling are promising future directions for clinically relevant research. Further research in this area will enable identification of patients at high risk for arterial thrombosis, improve prevention and treatment based on shear-dependent biological mechanisms, and improve blood-contacting device design to reduce thrombosis risk.

  13. Improving Thin Bed Identification in Sarawak Basin Field using Short Time Fourier Transform Half Cepstrum (STFTHC) method

    NASA Astrophysics Data System (ADS)

    Nizarul, O.; Hermana, M.; Bashir, Y.; Ghosh, D. P.

    2016-02-01

    In delineating complex subsurface geological features, a broad band of frequencies is needed to unveil the often hidden features of a hydrocarbon basin, such as thin bedding. The ability to resolve thin geological horizons in seismic data is recognized to be of fundamental importance for hydrocarbon exploration, seismic interpretation, and reserve prediction. For thin bedding, high frequency content is needed to enable tuning, which can be achieved by applying a bandwidth extension technique. This paper shows an application of the Short Time Fourier Transform Half Cepstrum (STFTHC) method, a frequency bandwidth expansion technique for non-stationary seismic signals, in increasing the temporal resolution to uncover thin beds and improve characterization of the basin. A wedge model and synthetic seismic data are used to evaluate the algorithm, and real data from the Sarawak basin are used to show the effectiveness of this method in enhancing resolution.
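
    For background only, the sketch below computes a generic short-time real cepstrum (log-magnitude spectrum followed by an inverse FFT, per STFT frame); it is not the authors' exact STFTHC algorithm, whose details are in the paper.

```python
# Generic short-time real cepstrum of a synthetic trace: one cepstrum per
# STFT frame. This illustrates the ingredients of cepstral methods only.
import numpy as np
from scipy.signal import stft

fs = 500.0                                   # Hz, assumed seismic sample rate
t = np.arange(0.0, 2.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

f, times, Z = stft(trace, fs=fs, nperseg=128)      # short-time spectra
log_mag = np.log(np.abs(Z) + 1e-12)                # log magnitude per frame
cepstrum = np.fft.irfft(log_mag, axis=0)           # quefrency domain per frame
print("cepstrum shape (quefrency x frames):", cepstrum.shape)
```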

  14. Changes in Lung Function and Chylous Effusions in Patients With Lymphangioleiomyomatosis Treated With Sirolimus

    PubMed Central

    Taveira-DaSilva, Angelo M.; Hathaway, Olanda; Stylianou, Mario; Moss, Joel

    2011-01-01

    Background Lymphangioleiomyomatosis (LAM) is a disorder that affects women and is characterized by cystic lung destruction, chylous effusions, lymphangioleiomyomas, and angiomyolipomas. It is caused by proliferation of abnormal smooth muscle–like cells. Sirolimus is a mammalian target of rapamycin inhibitor that has been reported to decrease the size of neoplastic growths in animal models of tuberous sclerosis complex and to reduce the size of angiomyolipomas and stabilize lung function in humans. Objective To assess whether sirolimus therapy is associated with improvement in lung function and a decrease in the size of chylous effusions and lymphangioleiomyomas in patients with LAM. Design Observational study. Setting The National Institutes of Health Clinical Center. Patients 19 patients with rapidly progressing LAM or chylous effusions. Intervention Treatment with sirolimus. Measurements Lung function and the size of chylous effusions and lymphangioleiomyomas before and during sirolimus therapy. Results Over a mean of 2.5 years before beginning sirolimus therapy, the mean (±SE) FEV1 decreased by 2.8% ± 0.8% predicted and diffusing capacity of the lung for carbon monoxide (DLCO) decreased by 4.8% ± 0.9% predicted per year. In contrast, over a mean of 2.6 years of sirolimus therapy, the mean (± SE) FEV1 increased by 1.8% ± 0.5% predicted and DLCO increased by 0.8% ± 0.5% predicted per year (P < 0.001). After beginning sirolimus therapy, 12 patients with chylous effusions and 11 patients with lymphangioleiomyomas experienced almost complete resolution of these conditions. In 2 of the 12 patients, sirolimus therapy enabled discontinuation of pleural fluid drainage. Limitations This was an observational study. The resolution of effusions may have affected improvements in lung function. Conclusion Sirolimus therapy is associated with improvement or stabilization of lung function and reduction in the size of chylous effusions and lymphangioleiomyomas in patients with LAM. Primary Funding Source Intramural Research Program, National Heart, Lung, and Blood Institute, National Institutes of Health. PMID:21690594

  15. Enabling Technologies for Ceramic Hot Section Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkat Vedula; Tania Bhatia

    Silicon-based ceramics are attractive materials for use in gas turbine engine hot sections due to their high temperature mechanical and physical properties as well as lower density than metals. The advantages of utilizing ceramic hot section components include weight reduction, improved efficiency, enhanced power output, and lower emissions as a result of reducing or eliminating cooling. Potential gas turbine ceramic components for industrial, commercial and/or military high temperature turbine applications include combustor liners, vanes, rotors, and shrouds. These components require materials that can withstand high temperatures and pressures for long duration under steam-rich environments. For Navy applications, ceramic hot section components have the potential to increase the operation range. The weight saved by utilizing a lighter gas turbine can be used to increase fuel storage capacity, while a more efficient gas turbine consumes less fuel. Both improvements enable a longer operation range for Navy ships and aircraft. Ceramic hot section components will also be beneficial to the Navy's Growth Joint Strike Fighter (JSF) and VAATE (Versatile Affordable Advanced Turbine Engines) initiatives in terms of reduced weight, cooling air savings, and capability/cost index (CCI). For DOE applications, ceramic hot section components provide an avenue to achieve low emissions while improving efficiency. Combustors made of ceramic material can withstand higher wall temperatures and require less cooling air. The ability of ceramics to withstand high temperatures enables novel combustor designs that have reduced NOx, smoke, and CO levels. In the turbine section, ceramic vanes and blades do not require the sophisticated cooling schemes currently used for metal components. The saved cooling air could be used to further improve efficiency and power output. The objectives of this contract were to develop technologies critical for ceramic hot section components for gas turbine engines. Significant technical progress has been made towards maturation of the EBC and CMC technologies for incorporation into the gas turbine engine hot section. Promising EBC candidates for longer life and/or higher temperature applications relative to current state-of-the-art BSAS-based EBCs have been identified. These next-generation coating systems have been scaled up from coupons to components and are currently being field tested in a Solar Centaur 50S engine. CMC combustor liners were designed, fabricated, and tested in an FT8 sector rig to demonstrate the benefits of a high temperature material system. Pretest predictions made through the use of perfectly stirred reactor models showed a 2-3x benefit in CO emissions for CMC versus metallic liners. The sector-rig test validated the pretest predictions with a >2x benefit in CO at the same NOx levels at various load conditions. The CMC liners also survived several trip shutdowns, thereby validating the CMC design methodology. Significant technical progress has been made towards incorporation of ceramic matrix composite (CMC) and environmental barrier coating (EBC) technologies into the gas turbine engine hot section. The second phase of the program focused on the demonstration of a reverse-flow annular CMC combustor. This has included overcoming the challenges of design and fabrication of CMCs into 'complex' shapes; developing processing to apply EBCs to 'engine hardware'; testing of an advanced combustor enabled by CMCs in a PW206 rig; and the validation of performance benefits against a metal baseline. The rig test validated many of the pretest predictions, with a 40-50% reduction in pattern factor compared to the baseline and reductions in NOx levels at maximum power conditions. The next steps are to develop an understanding of the life-limiting mechanisms in EBC and CMC materials, develop a design system for EBC-coated CMCs, and conduct durability testing in an engine environment.

  16. The Gravity Field of Mars From MGS, Mars Odyssey, and MRO Radio Science

    NASA Technical Reports Server (NTRS)

    Genova, Antonio; Goossens, Sander; Lemoine, Frank G.; Mazarico, Erwan; Smith, David E.; Zuber, Maria T.

    2015-01-01

    The Mars Global Surveyor (MGS), Mars Odyssey (ODY), and Mars Reconnaissance Orbiter (MRO) missions have enabled NASA to conduct reconnaissance and exploration of Mars from orbit for sixteen consecutive years. The radio systems on these spacecraft enabled radio science in orbit around Mars, improving knowledge of the static structure of the Martian gravitational field. The continuity of the radio tracking data, which cover more than a solar cycle, also provides useful information to characterize the temporal variability of the gravity field, relevant to the planet's internal dynamics and the structure and dynamics of the atmosphere [1]. MGS operated for more than 7 years, between 1999 and 2006, in a frozen sun-synchronous, near-circular, polar orbit with the periapsis at approximately 370 km altitude. ODY and MRO have been orbiting Mars in two separate sun-synchronous orbits at different local times and altitudes. ODY began its mapping phase in 2002 with the periapsis at approximately 390 km altitude and 4-5 pm Local Solar Time (LST), whereas the MRO science mission started in November 2006 with the periapsis at approximately 255 km altitude and 3 pm LST. The 16 years of radio tracking data provide useful information on the atmospheric density in the Martian upper atmosphere. We used ODY and MRO radio data to recover the long-term periodicity of the major atmospheric constituents -- CO2, O, and He -- at the orbit altitudes of these two spacecraft [2]. The improved atmospheric model provides a better prediction of the annual and semi-annual variability of the dominant species. Therefore, the inclusion of the recovered model leads to improved orbit determination and an improved gravity field model of Mars with MGS, ODY, and MRO radio tracking data.

  17. Quantifying predictive capability of electronic health records for the most harmful breast cancer

    NASA Astrophysics Data System (ADS)

    Wu, Yirong; Fan, Jun; Peissig, Peggy; Berg, Richard; Tafti, Ahmad Pahlavan; Yin, Jie; Yuan, Ming; Page, David; Cox, Jennifer; Burnside, Elizabeth S.

    2018-03-01

    Improved prediction of the "most harmful" breast cancers that cause the most substantive morbidity and mortality would enable physicians to target more intense screening and preventive measures at those women who have the highest risk; however, such prediction models for the "most harmful" breast cancers have rarely been developed. Electronic health records (EHRs) represent an underused data source that has great research and clinical potential. Our goal was to quantify the value of EHR variables in the "most harmful" breast cancer risk prediction. We identified 794 subjects who had breast cancer with primary non-benign tumors with their earliest diagnosis on or after 1/1/2004 from an existing personalized medicine data repository, including 395 "most harmful" breast cancer cases and 399 "least harmful" breast cancer cases. For these subjects, we collected EHR data comprised of 6 components: demographics, diagnoses, symptoms, procedures, medications, and laboratory results. We developed two regularized prediction models, Ridge Logistic Regression (Ridge-LR) and Lasso Logistic Regression (Lasso-LR), to predict the "most harmful" breast cancer one year in advance. The area under the ROC curve (AUC) was used to assess model performance. We observed that the AUCs of Ridge-LR and Lasso-LR models were 0.818 and 0.839, respectively. For both the Ridge-LR and Lasso-LR models, the predictive performance of the whole EHR variables was significantly higher than that of each individual component (p<0.001). In conclusion, EHR variables can be used to predict the "most harmful" breast cancer, providing the possibility to personalize care for those women at the highest risk in clinical practice.
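
    The two regularized models correspond to L2- and L1-penalized logistic regression; a minimal scikit-learn sketch on synthetic stand-in data follows (the study's EHR features are not public, so the numbers here mean nothing clinically).

```python
# Ridge-LR (L2) and Lasso-LR (L1) logistic regression evaluated by AUC,
# mirroring the paper's setup on synthetic stand-in data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=794, n_features=500,
                           n_informative=40, random_state=0)

ridge_lr = LogisticRegression(penalty="l2", C=1.0, max_iter=5000)
lasso_lr = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)

for name, model in [("Ridge-LR", ridge_lr), ("Lasso-LR", lasso_lr)]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(name, "AUC:", round(auc, 3))
```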

  18. Practical approach to subject-specific estimation of knee joint contact force.

    PubMed

    Knarr, Brian A; Higginson, Jill S

    2015-08-20

    Compressive forces experienced at the knee can significantly contribute to cartilage degeneration. Musculoskeletal models enable predictions of the internal forces experienced at the knee, but validation is often not possible, as experimental data detailing loading at the knee joint are limited. Recently available data reporting compressive knee force through direct measurement using instrumented total knee replacements offer a unique opportunity to evaluate the accuracy of models. Previous studies have highlighted the importance of subject-specificity in increasing the accuracy of model predictions; however, these techniques may be unrealistic outside of a research setting. Therefore, the goal of our work was to identify a practical approach for accurate prediction of tibiofemoral knee contact force (KCF). Four methods for prediction of knee contact force were compared: (1) standard static optimization, (2) uniform muscle coordination weighting, (3) subject-specific muscle coordination weighting and (4) subject-specific strength adjustments. Walking trials for three subjects with instrumented knee replacements were used to evaluate the accuracy of model predictions. Predictions utilizing subject-specific muscle coordination weighting yielded the best agreement with experimental data; however, this method required in vivo data for weighting factor calibration. Including subject-specific strength adjustments improved models' predictions compared to standard static optimization, with errors in peak KCF less than 0.5 body weight for all subjects. Overall, combining clinical assessments of muscle strength with standard tools available in the OpenSim software package, such as inverse kinematics and static optimization, appears to be a practical method for predicting joint contact force that can be implemented for many applications. Copyright © 2015 Elsevier Ltd. All rights reserved.
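
    Method (1), standard static optimization, reduces at each instant to a small constrained minimization; the toy two-muscle example below uses invented moment arms and strengths (it is not OpenSim code), and method (4) would correspond to rescaling the maximum forces from clinical strength testing.

```python
# Toy static optimization: find muscle activations that reproduce a required
# joint moment while minimizing summed squared activation.
import numpy as np
from scipy.optimize import minimize

r = np.array([0.04, 0.05])           # moment arms (m), invented
F_max = np.array([3000.0, 2500.0])   # max isometric forces (N); method (4)
                                     # would scale these from strength testing
M_required = 60.0                    # knee moment (N*m) from inverse dynamics

def cost(a):                         # a: muscle activations in [0, 1]
    return np.sum(a ** 2)

cons = {"type": "eq", "fun": lambda a: r @ (a * F_max) - M_required}
res = minimize(cost, x0=[0.2, 0.2], bounds=[(0, 1), (0, 1)], constraints=cons)

muscle_forces = res.x * F_max        # these forces drive the contact estimate
print("activations:", res.x.round(3), "muscle forces (N):", muscle_forces.round(1))
```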

  19. Quantifying predictive capability of electronic health records for the most harmful breast cancer.

    PubMed

    Wu, Yirong; Fan, Jun; Peissig, Peggy; Berg, Richard; Tafti, Ahmad Pahlavan; Yin, Jie; Yuan, Ming; Page, David; Cox, Jennifer; Burnside, Elizabeth S

    2018-02-01

    Improved prediction of the "most harmful" breast cancers that cause the most substantive morbidity and mortality would enable physicians to target more intense screening and preventive measures at those women who have the highest risk; however, such prediction models for the "most harmful" breast cancers have rarely been developed. Electronic health records (EHRs) represent an underused data source that has great research and clinical potential. Our goal was to quantify the value of EHR variables in the "most harmful" breast cancer risk prediction. We identified 794 subjects who had breast cancer with primary non-benign tumors with their earliest diagnosis on or after 1/1/2004 from an existing personalized medicine data repository, including 395 "most harmful" breast cancer cases and 399 "least harmful" breast cancer cases. For these subjects, we collected EHR data comprised of 6 components: demographics, diagnoses, symptoms, procedures, medications, and laboratory results. We developed two regularized prediction models, Ridge Logistic Regression (Ridge-LR) and Lasso Logistic Regression (Lasso-LR), to predict the "most harmful" breast cancer one year in advance. The area under the ROC curve (AUC) was used to assess model performance. We observed that the AUCs of Ridge-LR and Lasso-LR models were 0.818 and 0.839 respectively. For both the Ridge-LR and Lasso-LR models, the predictive performance of the whole EHR variables was significantly higher than that of each individual component (p<0.001). In conclusion, EHR variables can be used to predict the "most harmful" breast cancer, providing the possibility to personalize care for those women at the highest risk in clinical practice.

  20. Practical approach to subject-specific estimation of knee joint contact force

    PubMed Central

    Knarr, Brian A.; Higginson, Jill S.

    2015-01-01

    Compressive forces experienced at the knee can significantly contribute to cartilage degeneration. Musculoskeletal models enable predictions of the internal forces experienced at the knee, but validation is often not possible, as experimental data detailing loading at the knee joint are limited. Recently available data reporting compressive knee force through direct measurement using instrumented total knee replacements offer a unique opportunity to evaluate the accuracy of models. Previous studies have highlighted the importance of subject-specificity in increasing the accuracy of model predictions; however, these techniques may be unrealistic outside of a research setting. Therefore, the goal of our work was to identify a practical approach for accurate prediction of tibiofemoral knee contact force (KCF). Four methods for prediction of knee contact force were compared: (1) standard static optimization, (2) uniform muscle coordination weighting, (3) subject-specific muscle coordination weighting and (4) subject-specific strength adjustments. Walking trials for three subjects with instrumented knee replacements were used to evaluate the accuracy of model predictions. Predictions utilizing subject-specific muscle coordination weighting yielded the best agreement with experimental data; however, this method required in vivo data for weighting factor calibration. Including subject-specific strength adjustments improved models’ predictions compared to standard static optimization, with errors in peak KCF less than 0.5 body weight for all subjects. Overall, combining clinical assessments of muscle strength with standard tools available in the OpenSim software package, such as inverse kinematics and static optimization, appears to be a practical method for predicting joint contact force that can be implemented for many applications. PMID:25952546

  1. Predicting the Future: Delivery Room Planning of Congenital Heart Disease Diagnosed by Fetal Echocardiography.

    PubMed

    Donofrio, Mary T

    2018-05-01

    Advances in prenatal imaging have improved the examination of the fetal cardiovascular system. Fetal echocardiography facilitates the prenatal diagnosis of congenital heart disease (CHD) and, through sequential examination, allows assessment of fetal cardiac hemodynamics, predicting the evolution of anatomical and functional cardiovascular abnormalities in utero and during the transition to a postnatal circulation at delivery. This approach allows detailed diagnosis with prenatal counseling and enables planning to define perinatal management, selecting the fetuses at risk of postnatal hemodynamic instability who are likely to require a specialized delivery plan. The prenatal diagnosis and management of critical neonatal CHD has been shown to play an important role in improving the outcome of newborns with these conditions, allowing timely stabilization of the circulation prior to cardiac intervention or surgery, thus reducing the risk of perioperative morbidity and mortality. Diagnostic protocols aimed at risk-stratifying severity and potential postnatal compromise in fetuses with CHD have been developed to identify those who may require special intervention at birth or within the first days of life. In addition, new methodologies are being studied to improve the accuracy of prediction of disease severity. Perinatal management of neonates with a prenatal diagnosis of CHD requires close collaboration between obstetric, neonatal, and cardiology services. In this article, the management of fetuses with CHD will be discussed, along with a summary of the in utero and fetal echocardiographic findings used for risk stratification of newborns with CHD and a review of the basic principles used for planning neonatal resuscitation and initial transitional care of these complex newborns.

  2. Integrating Crop Growth Models with Whole Genome Prediction through Approximate Bayesian Computation.

    PubMed

    Technow, Frank; Messina, Carlos D; Totir, L Radu; Cooper, Mark

    2015-01-01

    Genomic selection, enabled by whole genome prediction (WGP) methods, is revolutionizing plant breeding. Existing WGP methods have been shown to deliver accurate predictions in the most common settings, such as prediction of across-environment performance for traits with additive gene effects. However, prediction of traits with non-additive gene effects and prediction of genotype by environment interaction (G×E) continue to be challenging. Previous attempts to increase prediction accuracy for these particularly difficult tasks employed prediction methods that are purely statistical in nature. Augmenting the statistical methods with biological knowledge has been largely overlooked thus far. Crop growth models (CGMs) attempt to represent the impact of functional relationships between plant physiology and the environment in the formation of yield and similar output traits of interest. Thus, they can explain the impact of G×E and certain types of non-additive gene effects on the expressed phenotype. Approximate Bayesian computation (ABC), a novel and powerful computational procedure, allows the incorporation of CGMs directly into the estimation of whole genome marker effects in WGP. Here we provide a proof of concept study for this novel approach and demonstrate its use with synthetic data sets. We show that this novel approach can be considerably more accurate than the benchmark WGP method GBLUP in predicting performance in environments represented in the estimation set as well as in previously unobserved environments for traits determined by non-additive gene effects. We conclude that this proof of concept demonstrates that using ABC for incorporating biological knowledge in the form of CGMs into WGP is a very promising and novel approach to improving prediction accuracy for some of the most challenging scenarios in plant breeding and applied genetics.
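
    The ABC step can be caricatured in a few lines: marker-effect draws from the prior are retained only if, pushed through a stand-in CGM, they reproduce the observed phenotypes. All models and numbers below are illustrative, not the paper's.

```python
# Toy ABC-rejection sketch: keep the prior draws of marker effects whose
# CGM-simulated phenotypes lie closest to the observations.
import numpy as np

rng = np.random.default_rng(0)
n_lines, n_markers = 50, 10
genotypes = rng.integers(0, 2, size=(n_lines, n_markers))

def cgm(trait):
    """Stand-in CGM: yield saturates in the physiological trait it receives."""
    return trait / (1.0 + np.abs(trait))

true_effects = rng.normal(0, 0.1, n_markers)
observed = cgm(genotypes @ true_effects) + rng.normal(0, 0.01, n_lines)

draws = rng.normal(0, 0.1, size=(20_000, n_markers))   # prior samples
distances = np.array([np.mean((cgm(genotypes @ b) - observed) ** 2)
                      for b in draws])
eps = np.quantile(distances, 0.001)      # ABC tolerance: keep closest 0.1%
posterior = draws[distances <= eps]      # approximate posterior of effects
print("posterior mean marker effects:", posterior.mean(axis=0).round(3))
```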

  3. Integrating Crop Growth Models with Whole Genome Prediction through Approximate Bayesian Computation

    PubMed Central

    Technow, Frank; Messina, Carlos D.; Totir, L. Radu; Cooper, Mark

    2015-01-01

    Genomic selection, enabled by whole genome prediction (WGP) methods, is revolutionizing plant breeding. Existing WGP methods have been shown to deliver accurate predictions in the most common settings, such as prediction of across environment performance for traits with additive gene effects. However, prediction of traits with non-additive gene effects and prediction of genotype by environment interaction (G×E), continues to be challenging. Previous attempts to increase prediction accuracy for these particularly difficult tasks employed prediction methods that are purely statistical in nature. Augmenting the statistical methods with biological knowledge has been largely overlooked thus far. Crop growth models (CGMs) attempt to represent the impact of functional relationships between plant physiology and the environment in the formation of yield and similar output traits of interest. Thus, they can explain the impact of G×E and certain types of non-additive gene effects on the expressed phenotype. Approximate Bayesian computation (ABC), a novel and powerful computational procedure, allows the incorporation of CGMs directly into the estimation of whole genome marker effects in WGP. Here we provide a proof of concept study for this novel approach and demonstrate its use with synthetic data sets. We show that this novel approach can be considerably more accurate than the benchmark WGP method GBLUP in predicting performance in environments represented in the estimation set as well as in previously unobserved environments for traits determined by non-additive gene effects. We conclude that this proof of concept demonstrates that using ABC for incorporating biological knowledge in the form of CGMs into WGP is a very promising and novel approach to improving prediction accuracy for some of the most challenging scenarios in plant breeding and applied genetics. PMID:26121133

  4. Control and Optimization of Electric Ship Propulsion Systems with Hybrid Energy Storage

    NASA Astrophysics Data System (ADS)

    Hou, Jun

    Electric ships experience large propulsion-load fluctuations on their drive shaft due to encountered waves and the rotational motion of the propeller, affecting the reliability of the shipboard power network and causing wear and tear. This dissertation explores new solutions to address these fluctuations by integrating a hybrid energy storage system (HESS) and developing energy management strategies (EMS). Advanced electric propulsion drive concepts are developed to improve energy efficiency, performance, and system reliability by integrating HESS, developing advanced control solutions and system integration strategies, and creating tools (including models and a testbed) for design and optimization of hybrid electric drive systems. A ship dynamics model which captures the underlying physical behavior of the electric ship propulsion system is developed to support control development and system optimization. To evaluate the effectiveness of the proposed control approaches, a state-of-the-art testbed has been constructed which includes a system controller, Li-Ion battery and ultra-capacitor (UC) modules, a high-speed flywheel, electric motors with their power electronic drives, DC/DC converters, and rectifiers. The feasibility and effectiveness of HESS are investigated and analyzed. Two different HESS configurations, namely battery/UC (B/UC) and battery/flywheel (B/FW), are studied and analyzed to provide insights into the advantages and limitations of each configuration. Battery usage, loss analysis, and sensitivity to battery aging are also analyzed for each configuration. In order to enable real-time application and achieve the desired performance, a model predictive control (MPC) approach is developed, in which a state-of-charge (SOC) reference for the flywheel (B/FW) or the UC (B/UC) is used to address the limitations imposed by short predictive horizons, which would otherwise ignore the benefit of operating the flywheel or UC near its high-efficiency range. Given the multi-frequency characteristics of load fluctuations, a filter-based control strategy is developed to illustrate the importance of coordination within the HESS. Without proper control strategies, the HESS solution could be worse than a single energy storage system solution. The proposed HESS, when introduced into an existing shipboard electrical propulsion system, will interact with the power generation systems. A model-based analysis is performed to evaluate the interactions of the multiple power sources when a hybrid energy storage system is introduced. The study has revealed undesirable interactions when the controls are not coordinated properly, leading to the conclusion that a proper EMS is needed. Knowledge of the propulsion-load torque is essential for the proposed system-level EMS, but this load torque is immeasurable in most marine applications. To address this issue, a model-based approach is developed so that load torque estimation and prediction can be incorporated into the MPC. In order to evaluate the effectiveness of the proposed approach, an input observer with linear prediction is developed as an alternative approach to obtain the load estimation and prediction. Comparative studies are performed to illustrate the importance of load torque estimation and prediction, and demonstrate the effectiveness of the proposed approach in terms of improved efficiency, enhanced reliability, and reduced wear and tear. Finally, the real-time MPC algorithm has been implemented on a physical testbed.
Three different efforts have been made to enable real-time implementation: a specially tailored problem formulation, an efficient optimization algorithm and a multi-core hardware implementation. Compared to the filter-based strategy, the proposed real-time MPC achieves superior performance, in terms of the enhanced system reliability, improved HESS efficiency, and extended battery life.
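
    The filter-based coordination idea above can be sketched in a few lines: a low-pass filter assigns the slow component of the propulsion load to the battery and the fast residual to the flywheel or UC. This is an illustrative sketch only; the time constant, sampling step, and toy load trace are hypothetical, not values from the dissertation.

```python
import numpy as np

def split_load(p_load, dt=0.1, tau=5.0):
    """Split a fluctuating propulsion-load power trace between a battery
    (slow component) and a flywheel/UC (fast residual) using a first-order
    low-pass filter. Time constant tau is a hypothetical tuning choice."""
    alpha = dt / (tau + dt)                  # discrete first-order filter gain
    p_batt = np.empty_like(p_load)
    p_batt[0] = p_load[0]
    for k in range(1, len(p_load)):
        # battery follows the smoothed load; UC/flywheel absorbs the residual
        p_batt[k] = p_batt[k - 1] + alpha * (p_load[k] - p_batt[k - 1])
    return p_batt, p_load - p_batt

# toy load: mean propulsion power plus a wave-induced fluctuation (watts)
t = np.arange(0, 60, 0.1)
load = 2e6 + 3e5 * np.sin(2 * np.pi * 0.2 * t)
p_batt, p_fast = split_load(load)
```

    In the dissertation's comparison, the MPC replaces this fixed frequency split with an optimization over a receding horizon; the filter serves as the baseline strategy.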

  5. SU-E-T-170: Characterization of the Location, Extent, and Proximity to Critical Structures of Target Volumes Provides Detail for Improved Outcome Predictions Among Pancreatic Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Z; Moore, J; Rosati, L

    Purpose: In radiotherapy, the size, location, and proximity of the target to critical structures influence treatment decisions. It has been shown that proximity of the target predicts dosimetric sparing of critical structures. Beyond dosimetry, the precise location of disease has further implications, such as tumor invasion or proximity to major arteries that inhibit surgery. Knowing which patients can be converted to surgical candidates by radiation may have high impact on future treat/no-treat decisions. We propose a method to improve our characterization of the location of pancreatic cancer and the extent of the treatment volume with respect to nearby arteries, with the goal of developing features to improve clinical predictions and decisions. Methods: Oncospace is a local learning health system that systematically captures clinical outcomes and all aspects of radiotherapy treatment plans, including overlap volume histograms (OVH), a measure of spatial relationships between two structures. Minimum and maximum distances between PTV and OARs based on OVH, PTV volume, anatomic location by ICD-9 code, and surgical outcome were queried. Normalized distance to center from the left and right kidneys was calculated to indicate tumor location and laterality. Distance to critical arteries (celiac, superior mesenteric, common hepatic) was validated against surgical status (borderline resectable; locally advanced converted to resectable). Results: 205 pancreas stereotactic body radiotherapy patients treated from 2009 to 2015 were queried. Tumor location/laterality based on kidney OVH shows strong agreement between location by OVH and location by ICD-9. Compared to the locally advanced group, the borderline resectable group showed larger geometric distance from critical arteries (p=0.03). Conclusion: Our platform enabled analysis of shape/size-location relationships. These data suggest that PTV volume, together with the distances between PTVs and surrounding OARs and major arteries, may be promising for improving characterization of treatment anatomy, refining our ability to predict outcomes and support decision making. Elekta, Toshiba.
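
    The minimum/maximum PTV-to-structure distances queried above can be illustrated with a distance-transform computation on binary voxel masks. This is a simplified stand-in for the OVH features in Oncospace, assuming co-registered 3-D masks; function and parameter names are hypothetical.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def min_max_distance(ptv_mask, struct_mask, voxel_mm=(2.0, 2.0, 2.0)):
    """Minimum and maximum distance (mm) from a structure (e.g., an artery
    or OAR) to a PTV, in the spirit of an overlap volume histogram feature.
    Assumes boolean 3-D masks on a common grid; overlapping voxels yield 0."""
    # Euclidean distance from every voxel to the nearest PTV voxel
    dist_to_ptv = distance_transform_edt(~ptv_mask, sampling=voxel_mm)
    d = dist_to_ptv[struct_mask]       # distances evaluated at structure voxels
    return d.min(), d.max()
```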

  6. Smart EV Energy Management System to Support Grid Services

    NASA Astrophysics Data System (ADS)

    Wang, Bin

    Under smart grid scenarios, advanced sensing and metering technologies have been applied to the legacy power grid to improve system observability and real-time situational awareness. Meanwhile, an increasing amount of distributed energy resources (DERs), such as renewable generation, electric vehicles (EVs), and battery energy storage systems (BESS), is being integrated into the power system. The integration of EVs, which can be modeled as controllable mobile energy devices, brings both challenges and opportunities to grid planning and energy management, due to the intermittency of renewable generation, uncertainties in EV driver behavior, and related factors. This dissertation aims to solve the real-time EV energy management problem in order to improve overall grid efficiency, reliability, and economics, using online and predictive optimization strategies. Most previous research on EV energy management strategies and algorithms is based on simplified models with the unrealistic assumption that EV charging behaviors are perfectly known or follow known distributions (arrival time, departure time, energy consumption, etc.). These approaches fail to obtain optimal solutions in real-time because of system uncertainties. Moreover, there is a lack of data-driven strategies that perform online and predictive scheduling of EV charging behaviors under microgrid scenarios. Therefore, we develop an online predictive EV scheduling framework, based on real-world data, that considers uncertainties in renewable generation, building load, and EV driver behavior. A kernel-based estimator is developed to predict charging session parameters in real-time with improved estimation accuracy. The efficacy of various optimization strategies supported by this framework, including valley-filling, cost reduction, and event-based control, has been demonstrated. In addition, existing simulation-based approaches do not consider a variety of practical concerns in implementing such a smart EV energy management system, including driver preferences, communication protocols, data models, and customized integration of existing standards to provide grid services. This dissertation therefore also addresses these issues by designing and implementing a scalable system architecture to capture user preferences, enable multi-layer communication and control, and ultimately improve system reliability and interoperability.
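
    The kernel-based estimation of charging-session parameters can be sketched with a Nadaraya-Watson regressor: predict, say, stay duration from arrival hour, weighting historical sessions by a Gaussian kernel. This is a minimal sketch under assumed features; the bandwidth and toy data are hypothetical, not the dissertation's calibrated estimator.

```python
import numpy as np

def nw_predict(x_query, x_hist, y_hist, h=1.0):
    """Nadaraya-Watson kernel estimator: predict a charging-session parameter
    (e.g., stay duration in hours) from a feature such as arrival hour, using
    historical sessions. Bandwidth h is a hypothetical tuning choice."""
    w = np.exp(-0.5 * ((x_hist - x_query) / h) ** 2)   # Gaussian kernel weights
    return np.sum(w * y_hist) / np.sum(w)

# toy history: (arrival hour, stay duration in hours)
arrivals  = np.array([8.0, 8.5, 9.0, 13.0, 18.0])
durations = np.array([9.0, 8.5, 8.0, 4.0, 2.0])
print(nw_predict(8.25, arrivals, durations))   # morning arrival -> long stay
```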

  7. Establishment of a standard operating procedure for predicting the time of calving in cattle

    PubMed Central

    Sauter-Louis, Carola; Braunert, Anna; Lange, Dorothee; Weber, Frank; Zerbe, Holm

    2011-01-01

    Precise calving monitoring is essential for minimizing the effects of dystocia in cows and calves. We conducted two studies in healthy cows that compared seven clinical signs (relaxation of the broad pelvic ligaments, vaginal secretion, udder hyperplasia, udder edema, teat filling, tail relaxation, and vulva edema), alone and in combination, for predicting the time of parturition. Relaxation of the broad pelvic ligaments combined with teat filling gave the best values for predicting either calving or no calving within 12 h. For the proposed parturition score (PS), a threshold of 4 PS points was identified below which calving within the next 12 h could be ruled out with a probability of 99.3% in cows (95.5% in heifers). Above this threshold, intermittent calving monitoring every 3 h and a progesterone rapid blood test (PRBT) are recommended. By combining the PS and PRBT (if PS ≥ 4), the prediction of calving within the next 12 h improved from 14.9% to 53.1%, and the probability of ruling out calving was 96.8%. The PRBT was compared to the results of an enzyme immunoassay (sensitivity, 90.2%; specificity, 74.9%). The standard operating procedure developed in this study, which combines the PS and PRBT, will enable veterinarians to rule out or predict calving within a 12 h period in cows with high accuracy under field conditions. PMID:21586878
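
    The two-stage decision logic of the standard operating procedure can be expressed as a short rule, shown below. This is a paraphrase of the abstract for illustration, with hypothetical labels; it is not veterinary guidance.

```python
def calving_within_12h(ps_points, prbt_indicates_calving=None):
    """Two-stage rule sketched from the abstract: a parturition score (PS)
    below 4 effectively rules out calving within 12 h; at PS >= 4 a
    progesterone rapid blood test (PRBT) refines the prediction."""
    if ps_points < 4:
        return "calving within 12 h ruled out (99.3% in cows, 95.5% in heifers)"
    if prbt_indicates_calving is None:
        return "PS >= 4: run PRBT and re-check every 3 h"
    return ("calving likely within 12 h" if prbt_indicates_calving
            else "calving within 12 h ruled out (96.8%)")
```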

  8. Prediction of Multiple-Trait and Multiple-Environment Genomic Data Using Recommender Systems.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Crossa, José; Montesinos-López, José C; Mota-Sanchez, David; Estrada-González, Fermín; Gillberg, Jussi; Singh, Ravi; Mondal, Suchismita; Juliana, Philomin

    2018-01-04

    In genomic-enabled prediction, the task of improving the accuracy of the prediction of lines in environments is difficult because the available information is generally sparse and usually has low correlations between traits. In current genomic selection, although researchers have a large amount of information and appropriate statistical models to process it, there is still limited computing efficiency to do so. Although some statistical models are usually mathematically elegant, many of them are also computationally inefficient, and they are impractical for many traits, lines, environments, and years because they need to sample from huge normal multivariate distributions. For these reasons, this study explores two recommender systems: item-based collaborative filtering (IBCF) and the matrix factorization algorithm (MF) in the context of multiple traits and multiple environments. The IBCF and MF methods were compared with two conventional methods on simulated and real data. Results of the simulated and real data sets show that the IBCF technique was slightly better in terms of prediction accuracy than the two conventional methods and the MF method when the correlation was moderately high. The IBCF technique is very attractive because it produces good predictions when there is high correlation between items (environment-trait combinations) and its implementation is computationally feasible, which can be useful for plant breeders who deal with very large data sets. Copyright © 2018 Montesinos-Lopez et al.
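
    The IBCF idea described above can be sketched as follows: treat each environment-trait combination as an "item" column, compute cosine similarity between columns on their co-observed lines, and predict the target column as a similarity-weighted average of its nearest neighbor columns. A minimal, hypothetical version, not the authors' implementation; it assumes the neighbor columns are observed for the lines being predicted.

```python
import numpy as np

def ibcf_predict(R, target_item, k=2):
    """Item-based collaborative filtering (IBCF) sketch. Rows of R are lines
    (genotypes); columns are items (environment-trait combinations); missing
    performance values are np.nan."""
    obs = ~np.isnan(R)
    sims = {}
    for j in range(R.shape[1]):
        if j == target_item:
            continue
        both = obs[:, j] & obs[:, target_item]   # lines scored on both items
        if both.sum() < 2:
            continue
        a, b = R[both, j], R[both, target_item]
        sims[j] = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))  # cosine
    top = sorted(sims, key=sims.get, reverse=True)[:k]   # k nearest items
    w = np.array([sims[j] for j in top])
    return R[:, top] @ w / w.sum()   # similarity-weighted average prediction
```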

  9. Prediction of Multiple-Trait and Multiple-Environment Genomic Data Using Recommender Systems

    PubMed Central

    Montesinos-López, Osval A.; Montesinos-López, Abelardo; Crossa, José; Montesinos-López, José C.; Mota-Sanchez, David; Estrada-González, Fermín; Gillberg, Jussi; Singh, Ravi; Mondal, Suchismita; Juliana, Philomin

    2018-01-01

    In genomic-enabled prediction, the task of improving the accuracy of the prediction of lines in environments is difficult because the available information is generally sparse and usually has low correlations between traits. In current genomic selection, although researchers have a large amount of information and appropriate statistical models to process it, there is still limited computing efficiency to do so. Although some statistical models are usually mathematically elegant, many of them are also computationally inefficient, and they are impractical for many traits, lines, environments, and years because they need to sample from huge normal multivariate distributions. For these reasons, this study explores two recommender systems: item-based collaborative filtering (IBCF) and the matrix factorization algorithm (MF) in the context of multiple traits and multiple environments. The IBCF and MF methods were compared with two conventional methods on simulated and real data. Results of the simulated and real data sets show that the IBCF technique was slightly better in terms of prediction accuracy than the two conventional methods and the MF method when the correlation was moderately high. The IBCF technique is very attractive because it produces good predictions when there is high correlation between items (environment–trait combinations) and its implementation is computationally feasible, which can be useful for plant breeders who deal with very large data sets. PMID:29097376

  10. Implementation of new pavement performance prediction models in PMIS : report

    DOT National Transportation Integrated Search

    2012-08-01

    Pavement performance prediction models and maintenance and rehabilitation (M&R) optimization processes enable managers and engineers to plan and prioritize pavement M&R activities in a cost-effective manner. This report describes TxDOT's effort...

  11. Modelling directional solidification

    NASA Technical Reports Server (NTRS)

    Wilcox, William R.

    1987-01-01

    An improved understanding of the phenomena important to directional solidification is sought, to enable explanation and prediction of differences in behavior between solidification on Earth and in space. Emphasis is now on experimentally determining the influence of convection and freezing-rate fluctuations on compositional homogeneity and crystalline perfection. A correlation is sought between heater temperature profiles, buoyancy-driven convection, and doping inhomogeneities using naphthalene doped with anthracene. The influence of spin-up/spin-down on the compositional homogeneity and microstructure of indium gallium antimonide is determined, as is the effect of imposed melting-freezing cycles on indium gallium antimonide. The mechanism behind the increase in grain size caused by using spin-up/spin-down in directional solidification of mercury cadmium telluride is sought.

  12. Life prediction and constitutive models for engine hot section anisotropic materials

    NASA Technical Reports Server (NTRS)

    Swanson, G. A.

    1984-01-01

    The development of directionally solidified and single crystal alloys is perhaps the most important recent advancement in hot section materials technology. The objective is to develop knowledge that enables the designer to exploit anisotropic gas turbine parts to their full potential. The two single crystal alloys selected were PWA 1480 and Alloy 185. The coatings selected were an overlay coating, PWA 286, and an aluminide diffusion coating, PWA 273. The constitutive specimens were solid and cylindrical; the fatigue specimens were hollow and cylindrical. Two substrate thicknesses (0.4 and 1.5 mm) are utilized; specimens of both thicknesses will be coated and then tested for tensile, creep, and fatigue properties.

  13. Select Methodology for Validating Advanced Satellite Measurement Systems

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Zhou, Daniel K.; Liu, Xi; Smith, William L.

    2008-01-01

    Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Measurement system validation is crucial to achieving this goal and maximizing research and operational utility of resultant data. Field campaigns including satellite under-flights with well calibrated FTS sensors aboard high-altitude aircraft are an essential part of the validation task. This presentation focuses on an overview of validation methodology developed for assessment of high spectral resolution infrared systems, and includes results of preliminary studies performed to investigate the performance of the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the MetOp-A satellite.

  14. Multilayer Insulation Ascent Venting Model

    NASA Technical Reports Server (NTRS)

    Tramel, R. W.; Sutherlin, S. G.; Johnson, W. L.

    2017-01-01

    The thermal and venting transient experienced by tank-applied multilayer insulation (MLI) in the Earth-to-orbit environment is very dynamic and not well characterized. This new predictive code is a first principles-based engineering model which tracks the time history of the mass and temperature (internal energy) of the gas in each MLI layer. A continuum-based model is used for early portions of the trajectory while a kinetic theory-based model is used for the later portions of the trajectory, and the models are blended based on a reference mean free path. This new capability should improve understanding of the Earth-to-orbit transient and enable better insulation system designs for in-space cryogenic propellant systems.
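
    The blending of the continuum and kinetic-theory models can be illustrated with a weight that switches smoothly on the Knudsen number (mean free path over layer spacing). The abstract states only that the two models are blended on a reference mean free path; the blend form, constants, and callable interfaces below are hypothetical.

```python
import numpy as np

def blended_flux(p, T, layer_gap_m, continuum_model, kinetic_model):
    """Blend a continuum-flow model with a kinetic-theory model for interlayer
    gas venting. continuum_model and kinetic_model are hypothetical callables
    returning a venting flux for pressure p (Pa) and temperature T (K)."""
    k_B, d = 1.380649e-23, 3.7e-10       # Boltzmann constant; ~N2 molecule diameter (m)
    mfp = k_B * T / (np.sqrt(2) * np.pi * d**2 * p)   # hard-sphere mean free path (m)
    kn = mfp / layer_gap_m               # Knudsen number for the MLI layer gap
    w = 1.0 / (1.0 + kn)                 # weight -> 1 in continuum, -> 0 in free-molecular
    return w * continuum_model(p, T) + (1.0 - w) * kinetic_model(p, T)
```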

  15. The basis function approach for modeling autocorrelation in ecological data

    USGS Publications Warehouse

    Hefley, Trevor J.; Broms, Kristin M.; Brost, Brian M.; Buderman, Frances E.; Kay, Shannon L.; Scharf, Henry; Tipton, John; Williams, Perry J.; Hooten, Mevin B.

    2017-01-01

    Analyzing ecological data often requires modeling the autocorrelation created by spatial and temporal processes. Many seemingly disparate statistical methods used to account for autocorrelation can be expressed as regression models that include basis functions. Basis functions also enable ecologists to modify a wide range of existing ecological models in order to account for autocorrelation, which can improve inference and predictive accuracy. Furthermore, understanding the properties of basis functions is essential for evaluating the fit of spatial or time-series models, detecting a hidden form of collinearity, and analyzing large data sets. We present important concepts and properties related to basis functions and illustrate several tools and techniques ecologists can use when modeling autocorrelation in ecological data.
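
    The core move described above, augmenting a regression design matrix with basis-function columns so the fit absorbs spatial or temporal autocorrelation, can be shown in a few lines. A minimal sketch with Gaussian bases on a toy time series; the knot placement and width are hypothetical choices.

```python
import numpy as np

def gaussian_basis(t, centers, width):
    """Gaussian radial basis functions evaluated at times t."""
    return np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 200)
y = np.sin(t) + 0.3 * rng.standard_normal(200)   # autocorrelated toy series

centers = np.linspace(0, 10, 8)                  # basis "knots" along time
X = np.column_stack([np.ones_like(t), gaussian_basis(t, centers, width=1.0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # ordinary least squares
y_hat = X @ beta                                 # smooth fit capturing the trend
```

    The same design-matrix augmentation applies to spatial coordinates, which is why so many seemingly distinct spatial models reduce to basis-function regressions.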

  16. Seizure Forecasting and the Preictal State in Canine Epilepsy.

    PubMed

    Varatharajah, Yogatheesan; Iyer, Ravishankar K; Berry, Brent M; Worrell, Gregory A; Brinkmann, Benjamin H

    2017-02-01

    The ability to predict seizures may enable patients with epilepsy to better manage their medications and activities, potentially reducing side effects and improving quality of life. Forecasting epileptic seizures remains a challenging problem, but machine learning methods using intracranial electroencephalographic (iEEG) measures have shown promise. A machine-learning-based pipeline was developed to process iEEG recordings and generate seizure warnings. Results support the ability to forecast seizures at rates greater than a Poisson random predictor for all feature sets and machine learning algorithms tested. In addition, subject-specific neurophysiological changes in multiple features are reported preceding lead seizures, providing evidence supporting the existence of a distinct and identifiable preictal state.
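
    A forecasting pipeline of the kind described above can be sketched as windowed feature extraction followed by a probabilistic classifier whose output triggers warnings. The band-power features, classifier, and threshold here are illustrative stand-ins, not the study's feature sets or algorithms.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def band_power_features(window, fs=400):
    """Crude spectral band powers from one iEEG window (1-D array)."""
    f = np.fft.rfftfreq(len(window), 1.0 / fs)
    p = np.abs(np.fft.rfft(window)) ** 2
    bands = [(1, 4), (4, 8), (8, 13), (13, 30), (30, 100)]   # Hz
    return np.array([p[(f >= lo) & (f < hi)].sum() for lo, hi in bands])

def train_forecaster(windows, labels):
    """Fit preictal (1) vs interictal (0) on labeled windows; warnings would
    be issued when predicted preictal probability exceeds a chosen threshold."""
    X = np.array([band_power_features(w) for w in windows])
    return LogisticRegression(max_iter=1000).fit(np.log(X + 1e-12), labels)
```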

  17. SEIZURE FORECASTING AND THE PREICTAL STATE IN CANINE EPILEPSY

    PubMed Central

    Varatharajah, Yogatheesan; Iyer, Ravishankar K.; Berry, Brent M.; Worrell, Gregory A.; Brinkmann, Benjamin H.

    2017-01-01

    The ability to predict seizures may enable patients with epilepsy to better manage their medications and activities, potentially reducing side effects and improving quality of life. Forecasting epileptic seizures remains a challenging problem, but machine learning methods using intracranial electroencephalographic (iEEG) measures have shown promise. A machine-learning-based pipeline was developed to process iEEG recordings and generate seizure warnings. Results support the ability to forecast seizures at rates greater than a Poisson random predictor for all feature sets and machine learning algorithms tested. In addition, subject-specific neurophysiological changes in multiple features are reported preceding lead seizures, providing evidence supporting the existence of a distinct and identifiable preictal state. PMID:27464854

  18. Syntactic Prediction in Language Comprehension: Evidence From Either…or

    PubMed Central

    Staub, Adrian; Clifton, Charles

    2006-01-01

    Readers’ eye movements were monitored as they read sentences in which two noun phrases or two independent clauses were connected by the word or (NP-coordination and S-coordination, respectively). The word either could be present or absent earlier in the sentence. When either was present, the material immediately following or was read more quickly, across both sentence types. In addition, there was evidence that readers misanalyzed the S-coordination structure as an NP-coordination structure only when either was absent. The authors interpret the results as indicating that the word either enabled readers to predict the arrival of a coordination structure; this predictive activation facilitated processing of this structure when it ultimately arrived, and in the case of S-coordination sentences, enabled readers to avoid the incorrect NP-coordination analysis. The authors argue that these results support parsing theories according to which the parser can build predictable syntactic structure before encountering the corresponding lexical input. PMID:16569157

  19. A Review of Emerging Technologies for the Management of Diabetes Mellitus.

    PubMed

    Zarkogianni, Konstantia; Litsa, Eleni; Mitsis, Konstantinos; Wu, Po-Yen; Kaddi, Chanchala D; Cheng, Chih-Wen; Wang, May D; Nikita, Konstantina S

    2015-12-01

    High prevalence of diabetes mellitus (DM) along with the poor health outcomes and the escalated costs of treatment and care poses the need to focus on prevention, early detection and improved management of the disease. The aim of this paper is to present and discuss the latest accomplishments in sensors for glucose and lifestyle monitoring along with clinical decision support systems (CDSSs) facilitating self-disease management and supporting healthcare professionals in decision making. A critical literature review analysis is conducted focusing on advances in: 1) sensors for physiological and lifestyle monitoring, 2) models and molecular biomarkers for predicting the onset and assessing the progress of DM, and 3) modeling and control methods for regulating glucose levels. Glucose and lifestyle sensing technologies are continuously evolving with current research focusing on the development of noninvasive sensors for accurate glucose monitoring. A wide range of modeling, classification, clustering, and control approaches have been deployed for the development of the CDSS for diabetes management. Sophisticated multiscale, multilevel modeling frameworks taking into account information from behavioral down to molecular level are necessary to reveal correlations and patterns indicating the onset and evolution of DM. Integration of data originating from sensor-based systems and electronic health records combined with smart data analytics methods and powerful user centered approaches enable the shift toward preventive, predictive, personalized, and participatory diabetes care. The potential of sensing and predictive modeling approaches toward improving diabetes management is highlighted and related challenges are identified.

  20. An application of a relational database system for high-throughput prediction of elemental compositions from accurate mass values.

    PubMed

    Sakurai, Nozomu; Ara, Takeshi; Kanaya, Shigehiko; Nakamura, Yukiko; Iijima, Yoko; Enomoto, Mitsuo; Motegi, Takeshi; Aoki, Koh; Suzuki, Hideyuki; Shibata, Daisuke

    2013-01-15

    High-accuracy mass values detected by high-resolution mass spectrometry enable prediction of elemental compositions and are thus used for metabolite annotation in metabolomic studies. Here, we report an application of a relational database that significantly improves the rate of elemental composition prediction. By searching a database of pre-calculated elemental compositions with fixed kinds and numbers of atoms, the approach eliminates the redundant evaluations of the same formula that occur in repeated calculations with other tools. When our approach is compared with HR2, one of the fastest tools available, our database search times were at least 109 times shorter than those of HR2. When a solid-state drive (SSD) was used, the search time was 488 times shorter at 5 ppm mass tolerance and 1833 times shorter at 0.1 ppm. Even when the HR2 search was performed with 8 threads on a high-spec Windows 7 PC, the database search times were at least 26 and 115 times shorter without and with the SSD, respectively. These improvements were even larger on a low-spec Windows XP PC. We constructed a web service, 'MFSearcher', to query the database in a RESTful manner. Available for free at http://webs2.kazusa.or.jp/mfsearcher. The web service is implemented in Java, MySQL, Apache and Tomcat, with all major browsers supported. sakurai@kazusa.or.jp Supplementary data are available at Bioinformatics online.
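
    The key idea, pre-compute formula masses once and answer each query with an indexed range scan instead of re-enumerating formulas, can be demonstrated with an in-memory SQLite table. The element ranges and schema below are a tiny illustrative subset, not MFSearcher's actual schema or API.

```python
import sqlite3
from itertools import product

# Monoisotopic masses for a small element subset (C, H, O, N only)
MASS = {"C": 12.0, "H": 1.0078250319, "O": 15.9949146221, "N": 14.0030740052}

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE formulas (c INT, h INT, o INT, n INT, mass REAL)")
rows = [(c, h, o, n, c*MASS["C"] + h*MASS["H"] + o*MASS["O"] + n*MASS["N"])
        for c, h, o, n in product(range(31), range(61), range(16), range(11))]
con.executemany("INSERT INTO formulas VALUES (?,?,?,?,?)", rows)
con.execute("CREATE INDEX idx_mass ON formulas(mass)")  # enables fast window lookup

def search(mz, ppm=5.0):
    """Return candidate compositions within a ppm window of the query mass."""
    tol = mz * ppm * 1e-6
    return con.execute("SELECT c,h,o,n FROM formulas WHERE mass BETWEEN ? AND ?",
                       (mz - tol, mz + tol)).fetchall()

print(search(180.063388))   # glucose C6H12O6 should appear among the hits
```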

  1. A Review of Emerging Technologies for the Management of Diabetes Mellitus

    PubMed Central

    Zarkogianni, Konstantia; Litsa, Eleni; Mitsis, Konstantinos; Wu, Po-Yen; Kaddi, Chanchala D.; Cheng, Chih-Wen; Wang, May D.; Nikita, Konstantina S.

    2016-01-01

    Objective High prevalence of diabetes mellitus (DM) along with the poor health outcomes and the escalated costs of treatment and care poses the need to focus on prevention, early detection and improved management of the disease. The aim of this paper is to present and discuss the latest accomplishments in sensors for glucose and lifestyle monitoring along with clinical decision support systems (CDSSs) facilitating self-disease management and supporting healthcare professionals in decision making. Methods A critical literature review analysis is conducted focusing on advances in: 1) sensors for physiological and lifestyle monitoring, 2) models and molecular biomarkers for predicting the onset and assessing the progress of DM, and 3) modeling and control methods for regulating glucose levels. Results Glucose and lifestyle sensing technologies are continuously evolving with current research focusing on the development of noninvasive sensors for accurate glucose monitoring. A wide range of modeling, classification, clustering, and control approaches have been deployed for the development of the CDSS for diabetes management. Sophisticated multiscale, multilevel modeling frameworks taking into account information from behavioral down to molecular level are necessary to reveal correlations and patterns indicating the onset and evolution of DM. Conclusion Integration of data originating from sensor-based systems and electronic health records combined with smart data analytics methods and powerful user centered approaches enable the shift toward preventive, predictive, personalized, and participatory diabetes care. Significance The potential of sensing and predictive modeling approaches toward improving diabetes management is highlighted and related challenges are identified. PMID:26292334

  2. Can Fish Morphological Characteristics be Used to Re-design Hydroelectric Turbines?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cada, G. F.; Richmond, Marshall C.

    2011-07-19

    Safe fish passage affects not only migratory species, but also populations of resident fish by altering biomass, biodiversity, and gene flow. Consequently, it is important to estimate turbine passage survival of a wide range of susceptible fish. Although fish-friendly turbines show promise for reducing turbine passage mortality, experimental data on their beneficial effects are limited to only a few species, mainly salmon and trout. For thousands of untested species and sizes of fish, the particular causes of turbine passage mortality and the benefits of fish-friendly turbine designs remain unknown. It is not feasible to measure the turbine-passage survival of every species of fish in every hydroelectric turbine design. We are attempting to predict fish mortality based on an improved understanding of turbine-passage stresses (pressure, shear stress, turbulence, strike) and information about the morphological, behavioral, and physiological characteristics of different fish taxa that make them susceptible to the stresses. Computational fluid dynamics and blade strike models of the turbine environment are re-examined in light of laboratory and field studies of fish passage effects. Comparisons of model-predicted stresses to measured injuries and mortalities will help identify fish survival thresholds and the aspects of turbines that are most in need of re-design. The coupled model and fish morphology evaluations will enable us to make predictions of turbine-passage survival among untested fish species, for both conventional and advanced turbines, and to guide the design of hydroelectric turbines to improve fish passage survival.

  3. Improved predictive modeling of white LEDs with accurate luminescence simulation and practical inputs with TracePro opto-mechanical design software

    NASA Astrophysics Data System (ADS)

    Tsao, Chao-hsi; Freniere, Edward R.; Smith, Linda

    2009-02-01

    The use of white LEDs for solid-state lighting to address applications in the automotive, architectural and general illumination markets is just emerging. LEDs promise greater energy efficiency and lower maintenance costs. However, there is a significant amount of design and cost optimization to be done while companies continue to improve semiconductor manufacturing processes and begin to apply more efficient and better color rendering luminescent materials such as phosphor and quantum dot nanomaterials. In the last decade, accurate and predictive opto-mechanical software modeling has enabled adherence to performance, consistency, cost, and aesthetic criteria without the cost and time associated with iterative hardware prototyping. More sophisticated models that include simulation of optical phenomena, such as luminescence, promise to yield designs that are more predictive, giving design engineers and materials scientists more control over the design process to quickly reach optimum performance, manufacturability, and cost criteria. A design case study is presented in which a phosphor formulation and excitation source are first optimized for a white light. The phosphor formulation, the excitation source, and other LED components are then optically and mechanically modeled and ray traced, and finally the performance is analyzed. A blue LED source is characterized by its relative spectral power distribution and angular intensity distribution. YAG:Ce phosphor is characterized by relative absorption, excitation and emission spectra, quantum efficiency, and bulk absorption coefficient. Bulk scatter properties are characterized by wavelength-dependent scatter coefficients, anisotropy, and bulk absorption coefficient.

  4. Updating Sea Spray Aerosol Emissions in the Community Multiscale Air Quality Model

    NASA Astrophysics Data System (ADS)

    Gantt, B.; Bash, J. O.; Kelly, J.

    2014-12-01

    Sea spray aerosols (SSA) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. In this study, the Community Multiscale Air Quality (CMAQ) model is updated to enhance fine-mode SSA emissions, include sea surface temperature (SST) dependency, and revise surf zone emissions. Based on evaluation with several regional and national observational datasets in the continental U.S., the updated emissions generally improve surface concentration predictions of primary aerosols composed of sea salt and of secondary aerosols affected by sea-salt chemistry at coastal and near-coastal sites. Specifically, the updated emissions lead to better predictions of the magnitude and coastal-to-inland gradient of sodium, chloride, and nitrate concentrations at Bay Regional Atmospheric Chemistry Experiment (BRACE) sites near Tampa, FL. Adding SST dependency to the SSA emission parameterization leads to increased sodium concentrations in the southeast U.S. and decreased concentrations along the Pacific coast and northeastern U.S., bringing predictions into closer agreement with observations at most Interagency Monitoring of Protected Visual Environments (IMPROVE) and Chemical Speciation Network (CSN) sites. Model comparison with California Research at the Nexus of Air Quality and Climate Change (CalNex) observations will also be discussed, with particular focus on the South Coast Air Basin, where clean marine air mixes with anthropogenic pollution in a complex environment. These SSA emission updates enable more realistic simulation of chemical processes in coastal environments, both in clean marine air masses and in mixtures of clean marine and polluted conditions.
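
    The shape of an SST-dependent emission adjustment can be illustrated as a temperature-dependent scale factor applied to a base flux, as in published cubic-polynomial SST parameterizations. The coefficients below are hypothetical placeholders, not the values used in the CMAQ update described above.

```python
def sst_scaling(sst_celsius):
    """Illustrative SST-dependent scale factor for a base sea spray aerosol
    emission flux; cubic-polynomial form with placeholder coefficients."""
    t = sst_celsius
    return max(0.0, 0.3 + 0.1 * t - 0.0076 * t**2 + 0.00021 * t**3)

def ssa_emission(base_flux, sst_celsius):
    # warmer water increases fine-mode SSA emission in this sketch
    return base_flux * sst_scaling(sst_celsius)
```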

  5. Reliable and fast quantitative analysis of active ingredient in pharmaceutical suspension using Raman spectroscopy.

    PubMed

    Park, Seok Chan; Kim, Minjung; Noh, Jaegeun; Chung, Hoeil; Woo, Youngah; Lee, Jonghwa; Kemper, Mark S

    2007-06-12

    The concentration of acetaminophen in a turbid pharmaceutical suspension has been measured successfully using Raman spectroscopy. The spectrometer was equipped with a large spot probe which enabled the coverage of a representative area during sampling. This wide area illumination (WAI) scheme (coverage area 28.3 mm²) for Raman data collection proved to be more reliable for the compositional determination of these pharmaceutical suspensions, especially when the samples were turbid. The reproducibility of measurement using the WAI scheme was compared to that of using a conventional small-spot scheme which employed a much smaller illumination area (about 100 μm spot size). A layer of isobutyric anhydride was placed in front of the sample vials to correct the variation in the Raman intensity due to the fluctuation of laser power. Corrections were accomplished using the isolated carbonyl band of isobutyric anhydride. The acetaminophen concentrations of prediction samples were accurately estimated using a partial least squares (PLS) calibration model. The prediction accuracy was maintained even with changes in laser power. It was noted that the prediction performance was somewhat degraded for turbid suspensions with high acetaminophen contents. When comparing the results of reproducibility obtained with the WAI scheme and those obtained using the conventional scheme, it was concluded that the quantitative determination of the active pharmaceutical ingredient (API) in turbid suspensions is much improved when employing a larger laser coverage area. This is presumably due to the improvement in representative sampling.
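
    The calibration workflow, normalize each spectrum by the internal-standard carbonyl band to cancel laser-power drift, then fit a PLS model, can be sketched as follows. Band indices and the component count are hypothetical; this is not the authors' exact procedure.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def correct_and_fit(spectra, concentrations, ref_band, n_components=3):
    """spectra: (n_samples, n_channels) Raman intensities; ref_band: slice of
    channels covering the isobutyric anhydride carbonyl band (internal
    standard). Normalizing by this band cancels laser-power fluctuation
    before a PLS calibration for API concentration is fitted."""
    ref = spectra[:, ref_band].mean(axis=1, keepdims=True)  # internal standard
    X = spectra / ref                                       # power-corrected spectra
    pls = PLSRegression(n_components=n_components).fit(X, concentrations)
    return pls, X
```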

  6. Antimicrobial resistance prediction in PATRIC and RAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, James J.; Boisvert, Sebastien; Brettin, Thomas

    The emergence and spread of antimicrobial resistance (AMR) mechanisms in bacterial pathogens, coupled with the dwindling number of effective antibiotics, has created a global health crisis. Being able to identify the genetic mechanisms of AMR and predict the resistance phenotypes of bacterial pathogens prior to culturing could inform clinical decision-making and improve reaction time. At PATRIC (http://patricbrc.org/), we have been collecting bacterial genomes with AMR metadata for several years. In order to advance phenotype prediction and the identification of genomic regions relating to AMR, we have updated the PATRIC FTP server to enable access to genomes that are binned by their AMR phenotypes, as well as metadata including minimum inhibitory concentrations. Using this infrastructure, we custom built AdaBoost (adaptive boosting) machine learning classifiers for identifying carbapenem resistance in Acinetobacter baumannii, methicillin resistance in Staphylococcus aureus, and beta-lactam and co-trimoxazole resistance in Streptococcus pneumoniae with accuracies ranging from 88–99%. We also did this for isoniazid, kanamycin, ofloxacin, rifampicin, and streptomycin resistance in Mycobacterium tuberculosis, achieving accuracies ranging from 71–88%. Lastly, this set of classifiers has been used to provide an initial framework for species-specific AMR phenotype and genomic feature prediction in the RAST and PATRIC annotation services.
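
    An AdaBoost genome classifier of the kind described can be sketched with k-mer count features over genome sequences. The k-mer size, estimator settings, and feature representation below are hypothetical illustrations, not the published PATRIC configuration.

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import cross_val_score

def amr_classifier(genome_seqs, phenotypes, k=8):
    """genome_seqs: list of DNA strings; phenotypes: 1 = resistant,
    0 = susceptible. Represents each genome by k-mer counts and fits an
    AdaBoost classifier; cross-validated accuracy is printed as a check."""
    kmers = CountVectorizer(analyzer="char", ngram_range=(k, k), lowercase=False)
    X = kmers.fit_transform(genome_seqs)          # sparse k-mer count matrix
    clf = AdaBoostClassifier(n_estimators=200)
    print(cross_val_score(clf, X, phenotypes, cv=5).mean())
    return clf.fit(X, phenotypes)
```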

  7. Modelling seagrass growth and development to evaluate transplanting strategies for restoration

    PubMed Central

    Renton, Michael; Airey, Michael; Cambridge, Marion L.; Kendrick, Gary A.

    2011-01-01

    Background and Aims Seagrasses are important marine plants that are under threat globally. Restoration by transplanting vegetative fragments or seedlings into areas where seagrasses have been lost is possible, but long-term trial data are limited. The goal of this study is to use available short-term data to predict long-term outcomes of transplanting seagrass. Methods A functional–structural plant model of seagrass growth that integrates data collected from short-term trials and experiments is presented. The model was parameterized for the species Posidonia australis, a limited validation of the model against independent data and a sensitivity analysis were conducted and the model was used to conduct a preliminary evaluation of different transplanting strategies. Key Results The limited validation was successful, and reasonable long-term outcomes could be predicted, based only on short-term data. Conclusions This approach for modelling seagrass growth and development enables long-term predictions of the outcomes to be made from different strategies for transplanting seagrass, even when empirical long-term data are difficult or impossible to collect. More validation is required to improve confidence in the model's predictions, and inclusion of more mechanism will extend the model's usefulness. Marine restoration represents a novel application of functional–structural plant modelling. PMID:21821624

  8. A new predictive multi-zone model for HCCI engine combustion

    DOE PAGES

    Bissoli, Mattia; Frassoldati, Alessio; Cuoci, Alberto; ...

    2016-06-30

    Here, this work introduces a new predictive multi-zone model for the description of combustion in Homogeneous Charge Compression Ignition (HCCI) engines. The model exploits the existing OpenSMOKE++ computational suite to handle detailed kinetic mechanisms, providing reliable predictions of the in-cylinder auto-ignition processes. All the elements with a significant impact on combustion performance and emissions, such as turbulence, heat and mass exchanges, crevices, residual burned gases, and thermal and feed stratification, are taken into account. Compared to other computational approaches, this model improves the description of mixture stratification phenomena by coupling a wall heat transfer model derived from CFD application with a proper turbulence model. Furthermore, the calibration of this multi-zone model requires only three parameters, which can be derived from a non-reactive CFD simulation: these adaptive variables depend only on the engine geometry and remain fixed across a wide range of operating conditions, allowing the prediction of auto-ignition, pressure traces, and pollutants. This computational framework enables the use of detailed kinetic mechanisms, as well as Rate of Production Analysis (RoPA) and Sensitivity Analysis (SA), to investigate the complex chemistry involved in the auto-ignition and pollutant formation processes. In the final sections of the paper, these capabilities are demonstrated through comparison with experimental data.

  9. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction

    PubMed Central

    Venkatesan, R.

    2016-01-01

    Effective prediction of defect-prone software modules will enable software developers to allocate resources efficiently and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Software testing is a critical task in this process: the aim is to save time and budget by detecting defects as early as possible and delivering a defect-free product to customers. The testing phase should therefore be operated carefully and effectively to release a defect-free (bug-free) software product. To improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets. PMID:27738649

  10. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction.

    PubMed

    Kumudha, P; Venkatesan, R

    Effective prediction of defect-prone software modules will enable software developers to allocate resources efficiently and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Software testing is a critical task in this process: the aim is to save time and budget by detecting defects as early as possible and delivering a defect-free product to customers. The testing phase should therefore be operated carefully and effectively to release a defect-free (bug-free) software product. To improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets.
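
    The RBFNN base model in these two records can be sketched as k-means-selected hidden centers, a Gaussian hidden layer, and least-squares output weights. This illustrates only the RBF network itself; the papers' ADBBO optimizer and cost-sensitive loss are not reproduced, and the hyperparameters are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

class SimpleRBFN:
    """Minimal radial basis function network for binary defect prediction:
    k-means picks hidden-unit centers, a Gaussian kernel forms the hidden
    layer, and output weights are solved by least squares."""
    def __init__(self, n_centers=10, gamma=1.0):
        self.n_centers, self.gamma = n_centers, gamma

    def _hidden(self, X):
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-self.gamma * d2)          # Gaussian activations

    def fit(self, X, y):
        self.centers = KMeans(self.n_centers, n_init=10).fit(X).cluster_centers_
        H = self._hidden(X)
        self.w, *_ = np.linalg.lstsq(H, y.astype(float), rcond=None)
        return self

    def predict(self, X):
        return (self._hidden(X) @ self.w > 0.5).astype(int)  # 1 = defect-prone
```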

  11. Combining PubMed knowledge and EHR data to develop a weighted bayesian network for pancreatic cancer prediction.

    PubMed

    Zhao, Di; Weng, Chunhua

    2011-10-01

    In this paper, we propose a novel method that combines PubMed knowledge and Electronic Health Records to develop a weighted Bayesian Network Inference (BNI) model for pancreatic cancer prediction. We selected 20 common risk factors associated with pancreatic cancer and used PubMed knowledge to weigh the risk factors. A keyword-based algorithm was developed to extract and classify PubMed abstracts into three categories that represented positive, negative, or neutral associations between each risk factor and pancreatic cancer. Then we designed a weighted BNI model by adding the normalized weights into a conventional BNI model. We used this model to extract the EHR values for patients with or without pancreatic cancer, which then enabled us to calculate the prior probabilities for the 20 risk factors in the BNI. The software iDiagnosis was designed to use this weighted BNI model for predicting pancreatic cancer. In an evaluation using a case-control dataset, the weighted BNI model significantly outperformed the conventional BNI and two other classifiers (k-Nearest Neighbor and Support Vector Machine). We conclude that the weighted BNI using PubMed knowledge and EHR data shows remarkable accuracy improvement over existing representative methods for pancreatic cancer prediction. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Combining PubMed Knowledge and EHR Data to Develop a Weighted Bayesian Network for Pancreatic Cancer Prediction

    PubMed Central

    Zhao, Di; Weng, Chunhua

    2011-01-01

    In this paper, we propose a novel method that combines PubMed knowledge and Electronic Health Records to develop a weighted Bayesian Network Inference (BNI) model for pancreatic cancer prediction. We selected 20 common risk factors associated with pancreatic cancer and used PubMed knowledge to weigh the risk factors. A keyword-based algorithm was developed to extract and classify PubMed abstracts into three categories that represented positive, negative, or neutral associations between each risk factor and pancreatic cancer. Then we designed a weighted BNI model by adding the normalized weights into a conventional BNI model. We used this model to extract the EHR values for patients with or without pancreatic cancer, which then enabled us to calculate the prior probabilities for the 20 risk factors in the BNI. The software iDiagnosis was designed to use this weighted BNI model for predicting pancreatic cancer. In an evaluation using a case-control dataset, the weighted BNI model significantly outperformed the conventional BNI and two other classifiers (k-Nearest Neighbor and Support Vector Machine). We conclude that the weighted BNI using PubMed knowledge and EHR data shows remarkable accuracy improvement over existing representative methods for pancreatic cancer prediction. PMID:21642013
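
    The weighting idea in these two records, let literature-derived weights modulate how strongly each risk factor moves the posterior, can be illustrated with a weighted naive-Bayes-style update. This is a generic stand-in for the paper's weighted BNI; all numbers and factor names are hypothetical.

```python
import numpy as np

def weighted_posterior(prior, likelihood_ratios, evidence, weights):
    """Each factor's likelihood ratio P(x|cancer)/P(x|no cancer) is raised to
    its normalized literature weight, so strongly supported factors shift the
    log-odds more. Returns the posterior disease probability."""
    log_odds = np.log(prior / (1 - prior))
    for factor, present in evidence.items():
        lr = likelihood_ratios[factor][present]
        log_odds += weights[factor] * np.log(lr)
    return 1.0 / (1.0 + np.exp(-log_odds))

likelihood_ratios = {"smoking": {True: 2.0, False: 0.8},
                     "diabetes": {True: 1.6, False: 0.9}}
weights = {"smoking": 0.7, "diabetes": 0.3}   # normalized from literature evidence
print(weighted_posterior(0.01, likelihood_ratios,
                         {"smoking": True, "diabetes": False}, weights))
```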

  13. Antimicrobial resistance prediction in PATRIC and RAST

    DOE PAGES

    Davis, James J.; Boisvert, Sebastien; Brettin, Thomas; ...

    2016-06-14

    The emergence and spread of antimicrobial resistance (AMR) mechanisms in bacterial pathogens, coupled with the dwindling number of effective antibiotics, has created a global health crisis. Being able to identify the genetic mechanisms of AMR and predict the resistance phenotypes of bacterial pathogens prior to culturing could inform clinical decision-making and improve reaction time. At PATRIC (http://patricbrc.org/), we have been collecting bacterial genomes with AMR metadata for several years. In order to advance phenotype prediction and the identification of genomic regions relating to AMR, we have updated the PATRIC FTP server to enable access to genomes that are binned bymore » their AMR phenotypes, as well as metadata including minimum inhibitory concentrations. Using this infrastructure, we custom built AdaBoost (adaptive boosting) machine learning classifiers for identifying carbapenem resistance in Acinetobacter baumannii, methicillin resistance in Staphylococcus aureus, and beta-lactam and co-trimoxazole resistance in Streptococcus pneumoniae with accuracies ranging from 88–99%. We also did this for isoniazid, kanamycin, ofloxacin, rifampicin, and streptomycin resistance in Mycobacterium tuberculosis, achieving accuracies ranging from 71–88%. Lastly, this set of classifiers has been used to provide an initial framework for species-specific AMR phenotype and genomic feature prediction in the RAST and PATRIC annotation services.« less

  14. Predictive power of Koplik's spots for the diagnosis of measles.

    PubMed

    Zenner, Dominik; Nacul, Luis

    2012-03-12

    Measles is a major cause of mortality globally. In many countries, management of measles is based on clinical suspicion, but the predictive value of clinical diagnosis depends on knowledge and population prevalence of measles. In the pre-vaccine era with high measles incidence, Koplik's spots (KS) were said to be "pathognomonic". This study prospectively evaluated test properties and diagnostic odds ratios (OR) of KS. Data including KS status were prospectively collected for a six-month period on all suspected measles cases reported to the North-West London Health Protection Unit. Saliva test kits were sent to all cases and KS test properties were analysed against measles confirmation by PCR or IgM testing (gold standard). The positive predictive value (PPV) of clinically suspecting measles was 50%. Using KS as diagnostic tool improved the PPV to 80% and the presence of KS was associated with confirmed measles in the multi-variable analysis (OR 7.2, 95% Confidence Interval 2.1-24.9, p=0.001). We found that Koplik's spots were highly predictive of confirmed measles and could be a good clinical tool to enable prompt measles management and control measures, as action often needs to be taken in the absence of laboratory confirmation. We suggest that current clinical case definitions might benefit from the inclusion of KS.
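
    The PPV arithmetic behind this result follows directly from Bayes' rule, as the sketch below checks. The sensitivity and specificity values used are illustrative placeholders (the abstract reports PPVs and the OR, not KS sensitivity/specificity); "prevalence" here is the fraction of suspected cases that are true measles.

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value from Bayes' rule."""
    tp = sensitivity * prevalence                  # true positive mass
    fp = (1 - specificity) * (1 - prevalence)      # false positive mass
    return tp / (tp + fp)

# With 50% of suspected cases truly measles (the pre-test PPV reported above),
# an illustrative sign with sensitivity 0.7 and specificity 0.8 would give:
print(ppv(0.7, 0.8, 0.5))   # ~0.78, comparable to the 80% reported for KS
```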

  15. Assessing predictive services' 7-day fire potential outlook

    Treesearch

    Karin Riley; Crystal Stonesifer; Dave Calkin; Haiganoush Preisler

    2015-01-01

    The Predictive Services program was created under the National Wildfire Coordinating Group in 2001 to address the need for long- and short-term decision support information for fire managers and operations personnel. The primary mission of Predictive Services is to integrate fire weather, fire danger, and resource availability to enable strategic fire suppression...

  16. Blind prediction of noncanonical RNA structure at atomic accuracy.

    PubMed

    Watkins, Andrew M; Geniesse, Caleb; Kladwang, Wipapat; Zakrevsky, Paul; Jaeger, Luc; Das, Rhiju

    2018-05-01

    Prediction of RNA structure from nucleotide sequence remains an unsolved grand challenge of biochemistry and requires distinct concepts from protein structure prediction. Despite extensive algorithmic development in recent years, modeling of noncanonical base pairs of new RNA structural motifs has not been achieved in blind challenges. We report a stepwise Monte Carlo (SWM) method with a unique add-and-delete move set that enables predictions of noncanonical base pairs of complex RNA structures. A benchmark of 82 diverse motifs establishes the method's general ability to recover noncanonical pairs ab initio, including multistrand motifs that have been refractory to prior approaches. In a blind challenge, SWM models predicted nucleotide-resolution chemical mapping and compensatory mutagenesis experiments for three in vitro selected tetraloop/receptors with previously unsolved structures (C7.2, C7.10, and R1). As a final test, SWM blindly and correctly predicted all noncanonical pairs of a Zika virus double pseudoknot during a recent community-wide RNA-Puzzle. Stepwise structure formation, as encoded in the SWM method, enables modeling of noncanonical RNA structure in a variety of previously intractable problems.

  17. Jet Noise Modeling for Suppressed and Unsuppressed Aircraft in Simulated Flight

    NASA Technical Reports Server (NTRS)

    Stone, James R.; Krejsa, Eugene A.; Clark, Bruce J; Berton, Jeffrey J.

    2009-01-01

    This document describes the development of further extensions and improvements to the jet noise model developed by Modern Technologies Corporation (MTC) for the National Aeronautics and Space Administration (NASA). The noise component extraction and correlation approach, first used successfully by MTC in developing a noise prediction model for two-dimensional mixer ejector (2DME) nozzles under the High Speed Research (HSR) Program, has been applied to dual-stream nozzles, then extended and improved in earlier tasks under this contract. Under Task 6, the coannular jet noise model was formulated and calibrated with limited scale-model data, mainly at high bypass ratio, including a limited-range prediction of the effects of mixing-enhancement nozzle-exit chevrons on jet noise. Under Task 9 this model was extended to a wider range of conditions, particularly those appropriate for a supersonic business jet, with an improvement in simulated flight-effects modeling and a generalization of the suppressor model. In the present task, further comparisons are made over a still wider range of conditions from more test facilities. The model is also further generalized to cover single-stream nozzles of otherwise similar configuration. The evolution of this prediction/analysis/correlation approach has thus been, in a sense, backward, from the complex to the simple; but from this approach a very robust capability is emerging. From these studies, some observations also emerge relative to theoretical considerations. The purpose of this task is to develop an analytical, semi-empirical jet noise prediction method applicable to takeoff, sideline, and approach noise of subsonic and supersonic cruise aircraft over a wide size range. The product of this task is a more consistent and robust model for the Footprint/Radius (FOOTPR) code than the Task 9 model. The model is validated for a wider range of cases and statistically quantified for the various reference facilities, documenting the possible role of facility effects. Although the comparisons that can be accomplished within the limited resources of this task are not comprehensive, they provide a broad enough sampling to enable NASA to make an informed decision on how much further effort should be expended on such comparisons. The improved, finalized model is incorporated into the FOOTPR code. MTC has also supported the adaptation of this code for incorporation in NASA's Aircraft Noise Prediction Program (ANOPP).

  18. Computational and human observer image quality evaluation of low dose, knowledge-based CT iterative reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eck, Brendan L.; Fahmi, Rachid; Miao, Jun

    2015-10-15

    Purpose: Aims in this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, P_C. Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detection in IMR was greater than FBP in all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit and model complexity according to AICc. With parameters fixed, the model reasonably predicted detectability of human observers in blended FBP-IMR images. Semianalytic internal noise computation gave results equivalent to Monte Carlo, greatly speeding parameter estimation. Using Model-k4, the authors found an average detectability improvement of 2.7 ± 0.4 times that of FBP. IMR showed greater improvements in detectability with larger signals and relatively consistent improvements across signal contrast and x-ray dose. In the phantom tested, Model-k4 predicted an 82% dose reduction compared to FBP, verified with physical CT scans at 80% reduced dose. Conclusions: IMR improves detectability over FBP and may enable significant dose reductions. A channelized Hotelling observer with internal noise proportional to channel output standard deviation agreed well with human observers across a wide range of variables, even across reconstructions with drastically different image characteristics. Utility of the model observer was demonstrated by predicting the effect of image processing (blending), analyzing detectability improvements with IMR across dose, size, and contrast, and in guiding real CT scan dose reduction experiments. Such a model observer can be applied in optimizing parameters in advanced iterative reconstruction algorithms as well as guiding dose reduction protocols in physical CT experiments.
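
    A self-contained sketch of the observer described above: Laguerre-Gauss channels reduce each image to a few channel outputs, a Hotelling template is formed from the channel covariance, and internal noise proportional to channel-output standard deviation (the Model-k4 idea) inflates the covariance diagonal. The channel width and noise constant are hypothetical; this is not the study's calibrated implementation.

```python
import numpy as np

def laguerre_gauss_channels(size, n_channels=5, a=15.0):
    """Rotationally symmetric Laguerre-Gauss channel profiles on a 2-D grid;
    a (pixels) sets the channel width. Returns (n_channels, size*size)."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
    r2 = 2 * np.pi * (x**2 + y**2) / a**2
    L = [np.ones_like(r2), 1 - r2]          # Laguerre polynomials, recursively
    for n in range(1, n_channels - 1):
        L.append(((2*n + 1 - r2) * L[n] - n * L[n-1]) / (n + 1))
    g = np.exp(-r2 / 2)
    return np.stack([g * L[n] for n in range(n_channels)]).reshape(n_channels, -1)

def cho_dprime(signal_imgs, noise_imgs, channels, internal_noise_k=0.0):
    """Channelized Hotelling observer detectability d' from stacks of
    signal-present and signal-absent images (n, size, size)."""
    vs = signal_imgs.reshape(len(signal_imgs), -1) @ channels.T
    vn = noise_imgs.reshape(len(noise_imgs), -1) @ channels.T
    S = 0.5 * (np.cov(vs.T) + np.cov(vn.T))
    S += np.diag((internal_noise_k * vn.std(axis=0)) ** 2)  # internal noise
    dv = vs.mean(0) - vn.mean(0)
    w = np.linalg.solve(S, dv)                               # Hotelling template
    return dv @ w / np.sqrt(w @ S @ w)                       # = sqrt(dv' S^-1 dv)
```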

  19. Super H-mode: theoretical prediction and initial observations of a new high performance regime for tokamak operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Philip B.; Solomon, Wayne M.; Burrell, Keith H.

    2015-07-21

    A new “Super H-mode” regime is predicted, which enables pedestal height and predicted fusion performance substantially higher than for standard H-mode operation. This new regime is predicted to exist by the EPED pedestal model, which calculates criticality constraints for peeling-ballooning and kinetic ballooning modes and combines them to predict the pedestal height and width. EPED usually predicts a single (“H-mode”) pedestal solution for each set of input parameters; however, in strongly shaped plasmas above a critical density, multiple pedestal solutions are found, including the standard “H-mode” solution and a “Super H-mode” solution at substantially larger pedestal height and width. The Super H-mode regime is predicted to be accessible by controlling the trajectory of the density, and to increase fusion performance for ITER as well as for DEMO designs with strong shaping. A set of experiments on DIII-D has identified the predicted Super H-mode regime and finds pedestal height and width, and their variation with density, in good agreement with theoretical predictions from the EPED model. Finally, the very high pedestal enables operation at high global beta and high confinement, including the highest normalized beta achieved on DIII-D with a quiescent edge.

  20. Predictive Feedback and Conscious Visual Experience

    PubMed Central

    Panichello, Matthew F.; Cheung, Olivia S.; Bar, Moshe

    2012-01-01

    The human brain continuously generates predictions about the environment based on learned regularities in the world. These predictions actively and efficiently facilitate the interpretation of incoming sensory information. We review evidence that, as a result of this facilitation, predictions directly influence conscious experience. Specifically, we propose that predictions enable rapid generation of conscious percepts and bias the contents of awareness in situations of uncertainty. The possible neural mechanisms underlying this facilitation are discussed. PMID:23346068

  1. Beyond Climate and Weather Science: Expanding the Forecasting Family to Serve Societal Needs

    NASA Astrophysics Data System (ADS)

    Barron, E. J.

    2009-05-01

    The ability to "anticipate" the future is what makes information from the Earth sciences valuable to society - whether it is the prediction of severe weather or the future availability of water resources in response to climate change. An improved ability to anticipate or forecast has the potential to serve society by simultaneously improving our ability to (1) promote economic vitality, (2) enable environmental stewardship, (3) protect life and property, as well as (4) improve our fundamental knowledge of the earth system. The potential is enormous, yet many appear ready to move quickly toward specific mitigation and adaptation strategies assuming that the science is settled. Five important weaknesses must be addressed first: (1) the formation of a true "climate services" function and capability, (2) the deliberate investment in expanding the family of forecasting elements to incorporate a broader array of environmental factors and impacts, (3) the investment in the sciences that connect climate to society, (4) a deliberate focus on the problems associated with scale, in particular the difference between the scale of predictive models and the scale associated with societal decisions, and (5) the evolution from climate services and model predictions to the equivalent of "environmental intelligence centers." The objective is to bring the discipline of forecasting to a broader array of environmental challenges. Assessments of the potential impacts of global climate change on societal sectors such as water, human health, and agriculture provide good examples of this challenge. We have the potential to move from a largely reactive mode in addressing adverse health outcomes, for example, to one in which the ties between climate, land cover, infectious disease vectors, and human health are used to forecast and predict adverse human health conditions. The potential exists for a revolution in forecasting that entrains a much broader set of societal needs and solutions. The argument is made that (for example) the current capability in the prediction of environmental health is similar to the capability (and potential) of weather forecasting in the 1960s.

  2. A new risk prediction model for critical care: the Intensive Care National Audit & Research Centre (ICNARC) model.

    PubMed

    Harrison, David A; Parry, Gareth J; Carpenter, James R; Short, Alasdair; Rowan, Kathy

    2007-04-01

    To develop a new model to improve risk prediction for admissions to adult critical care units in the UK. Prospective cohort study. The setting was 163 adult, general critical care units in England, Wales, and Northern Ireland, December 1995 to August 2003. Patients were 216,626 critical care admissions. None. The performance of different approaches to modeling physiologic measurements was evaluated, and the best methods were selected to produce a new physiology score. This physiology score was combined with other information relating to the critical care admission (age, diagnostic category, source of admission, and cardiopulmonary resuscitation before admission) to develop a risk prediction model. Modeling interactions between diagnostic category and physiology score enabled the inclusion of groups of admissions that are frequently excluded from risk prediction models. The new model showed good discrimination (mean c index 0.870) and fit (mean Shapiro's R 0.665, mean Brier's score 0.132) in 200 repeated validation samples and performed well when compared with recalibrated versions of existing published risk prediction models in the cohort of patients eligible for all models. The hypothesis of perfect fit was rejected for all models, including the Intensive Care National Audit & Research Centre (ICNARC) model, as is to be expected in such a large cohort. The ICNARC model demonstrated better discrimination and overall fit than existing risk prediction models, even following recalibration of these models. We recommend it be used to replace previously published models for risk adjustment in the UK.
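
    The two summary statistics quoted for the model are easy to reproduce; a minimal sketch assuming scikit-learn, with invented toy numbers: for a binary outcome the c index coincides with the area under the ROC curve, and Brier's score is the mean squared error of the predicted probabilities.

        import numpy as np
        from sklearn.metrics import roc_auc_score, brier_score_loss

        def discrimination_and_fit(y_true, p_pred):
            """c index (ROC AUC for a binary outcome) and Brier's score."""
            return roc_auc_score(y_true, p_pred), brier_score_loss(y_true, p_pred)

        # toy usage with made-up outcomes and predicted risks
        y = np.array([0, 0, 1, 1, 0, 1])
        p = np.array([0.10, 0.30, 0.80, 0.60, 0.20, 0.90])
        print(discrimination_and_fit(y, p))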

  3. Massive metrology using fast e-beam technology improves OPC model accuracy by >2x at faster turnaround time

    NASA Astrophysics Data System (ADS)

    Zhao, Qian; Wang, Lei; Wang, Jazer; Wang, ChangAn; Shi, Hong-Fei; Guerrero, James; Feng, Mu; Zhang, Qiang; Liang, Jiao; Guo, Yunbo; Zhang, Chen; Wallow, Tom; Rio, David; Wang, Lester; Wang, Alvin; Wang, Jen-Shiang; Gronlund, Keith; Lang, Jun; Koh, Kar Kit; Zhang, Dong Qing; Zhang, Hongxin; Krishnamurthy, Subramanian; Fei, Ray; Lin, Chiawen; Fang, Wei; Wang, Fei

    2018-03-01

    Classical SEM metrology, CD-SEM, uses a low data rate and extensive frame averaging to achieve high-quality SEM imaging for high-precision metrology. The drawbacks include prolonged data collection time and larger photoresist shrinkage due to excess electron dosage. This paper introduces a novel e-beam metrology system based on a high data rate, large probe current, and ultra-low noise electron optics design. At the same level of metrology precision, this high speed e-beam metrology system could significantly shorten data collection time and reduce electron dosage. In this work, the data collection speed is higher than 7,000 images per hour. Moreover, a novel large field of view (LFOV) capability at high resolution was enabled by an advanced electron deflection system design. The area coverage by LFOV is >100x larger than classical SEM. Superior metrology precision throughout the whole image has been achieved, and high quality metrology data could be extracted from the full field. This new capability will further improve metrology data collection speed to meet the need for large volumes of metrology data in OPC model calibration for next-generation technology. The shrinking EPE (Edge Placement Error) budget places more stringent requirements on OPC model accuracy, which is increasingly limited by metrology errors. In the current practice of the metrology data collection and processing to model calibration flow, CD-SEM throughput becomes a bottleneck that limits the amount of metrology measurements available for OPC model calibration, impacting pattern coverage and model accuracy, especially for 2D pattern prediction. To address the trade-off between metrology sampling and model accuracy constrained by the cycle time requirement, this paper employs the high speed e-beam metrology system and a new computational software solution to take full advantage of the large-volume data and significantly reduce both systematic and random metrology errors. The new computational software enables users to generate a large quantity of highly accurate EP (Edge Placement) gauges and significantly improve design pattern coverage, with up to a 5X gain in model prediction accuracy on complex 2D patterns. Overall, this work showed a >2x improvement in OPC model accuracy at a faster model turnaround time.

  4. Replacement of the initial steps of ethanol metabolism in Saccharomyces cerevisiae by ATP-independent acetylating acetaldehyde dehydrogenase

    PubMed Central

    Kozak, Barbara U.; van Rossum, Harmen M.; Niemeijer, Matthijs S.; van Dijk, Marlous; Benjamin, Kirsten; Wu, Liang; Daran, Jean-Marc G.; Pronk, Jack T.

    2016-01-01

    In Saccharomyces cerevisiae ethanol dissimilation is initiated by its oxidation and activation to cytosolic acetyl-CoA. The associated consumption of ATP strongly limits yields of biomass and acetyl-CoA-derived products. Here, we explore the implementation of an ATP-independent pathway for acetyl-CoA synthesis from ethanol that, in theory, enables biomass yield on ethanol that is up to 40% higher. To this end, all native yeast acetaldehyde dehydrogenases (ALDs) were replaced by heterologous acetylating acetaldehyde dehydrogenase (A-ALD). Engineered Ald− strains expressing different A-ALDs did not immediately grow on ethanol, but serial transfer in ethanol-grown batch cultures yielded growth rates of up to 70% of the wild-type value. Mutations in ACS1 were identified in all independently evolved strains and deletion of ACS1 enabled slow growth of non-evolved Ald− A-ALD strains on ethanol. Acquired mutations in A-ALD genes improved affinity—Vmax/Km for acetaldehyde. One of five evolved strains showed a significant 5% increase of its biomass yield in ethanol-limited chemostat cultures. Increased production of acetaldehyde and other by-products was identified as possible cause for lower than theoretically predicted biomass yields. This study proves that the native yeast pathway for conversion of ethanol to acetyl-CoA can be replaced by an engineered pathway with the potential to improve biomass and product yields. PMID:26818854

  5. Dementia in Latin America

    PubMed Central

    Parra, Mario A.; Baez, Sandra; Allegri, Ricardo; Nitrini, Ricardo; Lopera, Francisco; Slachevsky, Andrea; Custodio, Nilton; Lira, David; Piguet, Olivier; Kumfor, Fiona; Huepe, David; Cogram, Patricia; Bak, Thomas; Manes, Facundo

    2018-01-01

    The demographic structure of Latin American countries (LAC) is fast approaching that of developed countries, and the predicted prevalence of dementia in the former already exceeds that of the latter. Dementia has been declared a global challenge, yet regions around the world show differences in both the nature and magnitude of such a challenge. This article provides evidence and insights on barriers which, if overcome, would enable the harmonization of strategies to tackle the dementia challenge in LAC. First, we analyze the lack of available epidemiologic data, the need for standardizing clinical practice and improving physician training, and the existing barriers regarding resources, culture, and stigmas. We discuss how these are preventing timely care and research. Regarding specific health actions, most LAC have minimal mental health facilities and do not have specific mental health policies or budgets specific to dementia. In addition, local regulations may need to consider the regional context when developing treatment and prevention strategies. The support needed nationally and internationally to enable a smooth and timely transition of LAC to a position that integrates global strategies is highlighted. We focus on shared issues of poverty, cultural barriers, and socioeconomic vulnerability. We identify avenues for collaboration aimed to study unique populations, improve valid assessment methods, and generate opportunities for translational research, thus establishing a regional network. The issues identified here point to future specific actions aimed at tackling the dementia challenge in LAC. PMID:29305437

  6. Global biosurveillance: enabling science and technology. Workshop background and motivation: international scientific engagement for global security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Helen H

    2011-01-18

    Through discussion the conference aims to: (1) Identify core components of a comprehensive global biosurveillance capability; (2) Determine the scientific and technical bases to support such a program; (3) Explore the improvement in biosurveillance to enhance regional and global disease outbreak prediction; (4) Recommend an engagement approach to establishing an effective international community and regional or global network; (5) Propose implementation strategies and the measures of effectiveness; and (6) Identify the challenges that must be overcome in the next 3-5 years in order to establish an initial global biosurveillance capability that will have significant positive impact on BioNP as well as public health and/or agriculture. There is also a look back at the First Biothreat Nonproliferation Conference from December 2007. Whereas the first conference was an opportunity for problem solving to enhance and identify new paradigms for biothreat nonproliferation, this conference is moving towards integrated comprehensive global biosurveillance. Main reasons for global biosurveillance are: (1) Rapid assessment of unusual disease outbreak; (2) Early warning of emerging, re-emerging and engineered biothreat enabling reduced morbidity and mortality; (3) Enhanced crop and livestock management; (4) Increase understanding of host-pathogen interactions and epidemiology; (5) Enhanced international transparency for infectious disease research supporting BWC goals; and (6) Greater sharing of technology and knowledge to improve global health.

  7. DECHLORINATION-CONTROLLED POLYCHLORINATED DIBENZOFURAN FROM MUNICIPAL WASTE INCINERATORS

    EPA Science Inventory

    The ability to predict polychlorinated dibenzofuran (PCDF) isomer patterns from municipal waste incinerators (MWIs) enables an understanding of PCDF formation that may provide preventive measures. This work develops a model for the pattern prediction, assuming that the peak rati...

  8. Predicting damage in concrete due to expansive aggregates : modeling to enable sustainable material design.

    DOT National Transportation Integrated Search

    2012-04-01

    A poroelastic model is developed that can predict stress and strain distributions and, thus, ostensibly damage likelihood in concrete under freezing conditions caused by aggregates with undesirable combinations of geometry and constitutive proper...

  9. Using Toxicological Evidence from QSAR Models in Practice

    EPA Science Inventory

    The new generation of QSAR models provides supporting documentation in addition to the predicted toxicological value. Such information enables the toxicologist to explore the properties of chemical substances and to review and increase the reliability of toxicity predictions. Thi...

  10. Changing the approach to treatment choice in epilepsy using big data.

    PubMed

    Devinsky, Orrin; Dilley, Cynthia; Ozery-Flato, Michal; Aharonov, Ranit; Goldschmidt, Ya'ara; Rosen-Zvi, Michal; Clark, Chris; Fritz, Patty

    2016-03-01

    A UCB-IBM collaboration explored the application of machine learning to large claims databases to construct an algorithm for antiepileptic drug (AED) choice for individual patients. Claims data were collected between January 2006 and September 2011 for patients with epilepsy > 16 years of age. A subset of patient claims with a valid index date of AED treatment change (new, add, or switch) were used to train the AED prediction model by retrospectively evaluating an index date treatment for subsequent treatment change. Based on the trained model, a model-predicted AED regimen with the lowest likelihood of treatment change was assigned to each patient in the group of test claims, and outcomes were evaluated to test model validity. The model had 72% area under the receiver operating characteristic curve, indicating good predictive power. Patients who were given the model-predicted AED regimen had significantly longer survival rates (time until a treatment change event) and lower expected health resource utilization on average than those who received another treatment. The actual prescribed AED regimen at the index date matched the model-predicted AED regimen in only 13% of cases; there were large discrepancies in the frequency of use of certain AEDs/combinations between model-predicted AED regimens and those actually prescribed. Chances of treatment success were improved if patients received the model-predicted treatment. Using the model's prediction system may enable personalized, evidence-based epilepsy care, accelerating the match between patients and their ideal therapy, thereby delivering significantly better health outcomes for patients and providing health-care savings by applying resources more efficiently. Our goal will be to strengthen the predictive power of the model by integrating diverse data sets and potentially moving to prospective data collection. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
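
    The decision rule described here can be sketched abstractly: fit a classifier for the risk of a treatment-change event at the index date, then recommend the candidate regimen with the lowest predicted risk. This is a hypothetical reconstruction under stated assumptions, not the collaboration's actual pipeline; all names and the model choice are illustrative.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier

        def fit_change_model(X, regimen_onehot, changed):
            """X: patient features at the index date; regimen_onehot: encoded AED
            regimen; changed: 1 if a treatment change followed, else 0."""
            model = GradientBoostingClassifier()
            model.fit(np.hstack([X, regimen_onehot]), changed)
            return model

        def recommend(model, x_patient, candidate_regimens):
            """Return the regimen with the lowest predicted change probability."""
            rows = np.array([np.concatenate([x_patient, r])
                             for r in candidate_regimens])
            p_change = model.predict_proba(rows)[:, 1]
            best = int(np.argmin(p_change))
            return candidate_regimens[best], float(p_change[best])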

  11. Computer Aided Evaluation of Higher Education Tutors' Performance

    ERIC Educational Resources Information Center

    Xenos, Michalis; Papadopoulos, Thanos

    2007-01-01

    This article presents a method for computer-aided tutor evaluation: Bayesian Networks are used for organizing the collected data about tutors and for enabling accurate estimations and predictions about future tutor behavior. The model provides indications about each tutor's strengths and weaknesses, which enables the evaluator to exploit strengths…
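
    A minimal sketch of the idea using the pgmpy library (in some pgmpy versions the model class is named DiscreteBayesianNetwork): a toy network ties two invented tutor indicators to a rating node, and inference under partial evidence yields a prediction about future behavior. The structure, names, and probabilities are all illustrative, not the article's model.

        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        model = BayesianNetwork([("Responsiveness", "Rating"), ("Clarity", "Rating")])
        cpd_resp = TabularCPD("Responsiveness", 2, [[0.3], [0.7]])
        cpd_clar = TabularCPD("Clarity", 2, [[0.4], [0.6]])
        cpd_rating = TabularCPD(
            "Rating", 2,
            [[0.9, 0.6, 0.5, 0.1],    # P(Rating=low  | parent combinations)
             [0.1, 0.4, 0.5, 0.9]],   # P(Rating=high | parent combinations)
            evidence=["Responsiveness", "Clarity"], evidence_card=[2, 2])
        model.add_cpds(cpd_resp, cpd_clar, cpd_rating)

        # Predict the future rating given partial evidence about a tutor.
        infer = VariableElimination(model)
        print(infer.query(["Rating"], evidence={"Responsiveness": 1}))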

  12. Predicting Bacteria Removal by Enhanced Stormwater Control Measures (SCMs) at the Watershed Scale

    NASA Astrophysics Data System (ADS)

    Wolfand, J.; Bell, C. D.; Boehm, A. B.; Hogue, T. S.; Luthy, R. G.

    2017-12-01

    Urban stormwater is a major cause of water quality impairment, resulting in surface waters that fail to meet water quality standards and support their designated uses. Fecal indicator bacteria are present in high concentrations in stormwater and are strictly regulated in receiving waters; yet, their fate and transport in urban stormwater is poorly understood. Stormwater control measures (SCMs) are often used to treat, infiltrate, and release urban runoff, but field measurements show that the removal of bacteria by these structural solutions is limited (median log removal = 0.24, n = 370). Researchers have therefore looked to improve bacterial removal by enhancing SCMs through alterations in flow regimes or adding geomedia such as biochar. The present research seeks to develop a model to predict removal of fecal indicator bacteria by enhanced SCMs at the watershed scale in a semi-arid climate. Using the highly developed Ballona Creek watershed (290 km2) located in Los Angeles County as a case study, a hydrologic model is coupled with a stochastic water quality model to predict E. coli concentration near the outfall of the Ballona Creek, Santa Monica Bay. A hydrologic model was developed using EPA SWMM, calibrated for flow from water year 1998-2006 (NSE = 0.94; R2 = 0.94), and validated from water year 2007-2015 (NSE = 0.90; R2 = 0.93). This bacterial loading model was then linked to EPA SUSTAIN and a SCM bacterial removal script to simulate log removal of bacteria by various SCMs and predict bacterial concentrations in Ballona Creek. Preliminary results suggest small enhancements to SCMs that improve bacterial removal (<0.5 log removal) may offer large benefits to surface water quality and enable communities such as Los Angeles to meet their regulatory requirements.
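
    For reference, the Nash-Sutcliffe efficiency (NSE) quoted for the hydrologic calibration is simple to compute; a minimal sketch: NSE = 1 is a perfect fit, and NSE <= 0 means the model predicts no better than the mean of the observations.

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
            observed = np.asarray(observed, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
                (observed - observed.mean()) ** 2)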

  13. Comprehensive lipid analysis: a powerful metanomic tool for predictive and diagnostic medicine.

    PubMed

    Watkins, S M

    2000-09-01

    The power and accuracy of predictive diagnostics stand to improve dramatically as a result of lipid metanomics. The high definition of data obtained with this approach allows multiple rather than single metabolites to be used in markers for a group. Since as many as 40 fatty acids are quantified from each lipid class, and up to 15 lipid classes can be quantified easily, more than 600 individual lipid metabolites can be measured routinely for each sample. Because these analyses are comprehensive, only the most appropriate and unique metabolites are selected for their predictive value. Thus, comprehensive lipid analysis promises to greatly improve predictive diagnostics for phenotypes that directly or peripherally involve lipids. A broader and possibly more exciting aspect of this technology is the generation of metabolic profiles that are not simply markers for disease, but metabolic maps that can be used to identify specific genes or activities that cause or influence the disease state. Metanomics is, in essence, functional genomics from metabolite analysis. By defining the metabolic basis for phenotype, researchers and clinicians will have an extraordinary opportunity to understand and treat disease. Much in the same way that gene chips allow researchers to observe the complex expression response to a stimulus, metanomics will enable researchers to observe the complex metabolic interplay responsible for defining phenotype. By extending this approach beyond the observation of individual dysregulations, medicine will begin to profile not single diseases, but health. As health is the proper balance of all vital metabolic pathways, comprehensive or metanomic analysis lends itself very well to identifying the metabolite distributions necessary for optimum health. Comprehensive and quantitative analysis of lipids would provide this degree of diagnostic power to researchers and clinicians interested in mining metabolic profiles for biological meaning.

  14. Optimization of classification and regression analysis of four monoclonal antibodies from Raman spectra using collaborative machine learning approach.

    PubMed

    Le, Laetitia Minh Maï; Kégl, Balázs; Gramfort, Alexandre; Marini, Camille; Nguyen, David; Cherti, Mehdi; Tfaili, Sana; Tfayli, Ali; Baillet-Guffroy, Arlette; Prognon, Patrice; Chaminade, Pierre; Caudron, Eric

    2018-07-01

    The use of monoclonal antibodies (mAbs) constitutes one of the most important strategies to treat patients suffering from cancers such as hematological malignancies and solid tumors. These antibodies are prescribed by the physician and prepared by hospital pharmacists. An analytical control enables the quality of the preparations to be ensured. The aim of this study was to explore the development of a rapid analytical method for quality control. The method used four mAbs (Infliximab, Bevacizumab, Rituximab and Ramucirumab) at various concentrations and was based on recording Raman data and coupling them to a traditional chemometric and machine learning approach for data analysis. Compared to a conventional linear approach, prediction errors are reduced with a data-driven approach using statistical machine learning methods, in which preprocessing and predictive models are jointly optimized. An additional original aspect of the work involved submitting the problem to a collaborative data challenge platform called Rapid Analytics and Model Prototyping (RAMP). This allowed solutions from about 300 data scientists to be used in collaborative work. Using machine learning, the prediction of the four mAb samples was considerably improved. The best predictive model showed a combined error of 2.4% versus 14.6% with the linear approach. The concentration and classification errors were 5.8% and 0.7%; only three spectra were misclassified out of the 429 spectra in the test set. This large improvement obtained with machine learning techniques was uniform across all molecules but maximal for Bevacizumab, with an 88.3% reduction in combined error (2.1% versus 17.9%). Copyright © 2018 Elsevier B.V. All rights reserved.
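
    A minimal scikit-learn sketch of the jointly optimized preprocessing-plus-model idea (the steps, classifier, and grid values are placeholders, not the winning challenge solution): cross-validated grid search tunes the preprocessing and the predictive model together rather than fixing the preprocessing first.

        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.linear_model import RidgeClassifier
        from sklearn.model_selection import GridSearchCV

        pipe = Pipeline([
            ("scale", StandardScaler()),   # spectrum normalization
            ("pca", PCA()),                # dimensionality reduction
            ("clf", RidgeClassifier()),    # classification of the mAb
        ])
        grid = GridSearchCV(
            pipe,
            {"pca__n_components": [10, 20, 40], "clf__alpha": [0.1, 1.0, 10.0]},
            cv=5)
        # grid.fit(X, y)  # X: Raman spectra (one row per spectrum), y: mAb labels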

  15. Barriers affecting successful technology enablement of supply chain: An Indian perspective

    NASA Astrophysics Data System (ADS)

    Arora, R.; Haleem, A.; Farooquie, J. A.

    2018-03-01

    In order to compete, organizations need to focus on improving their supply chains, and technology acts as a major enabler. Technology enablement of the supply chain has not always been successful and has been examined by many researchers. The purpose of this paper is to conduct a systematic literature review of technology-enabled supply chains from a strategic viewpoint. The literature is examined from two perspectives. First, it studies the growing interest in technology-enabled supply chains in India. Second, it studies barriers affecting technology enablement of the supply chain. The literature review identifies that a technology-enabled supply chain helps improve performance via effective decision making, monitoring of the entire supply chain, faster reaction to customer service problems, etc. The research emphasizes the importance of 12 barriers affecting technology enablement. This research will serve as a guide for practitioners seeking to successfully implement technology, and it fills a gap in the existing literature by highlighting and consolidating the significant research work done in the past.

  16. Perceiving expatriate coworkers as foreigners encourages aid: social categorization and procedural justice together improve intergroup cooperation and dual identity.

    PubMed

    Leonardelli, Geoffrey J; Toh, Soo Min

    2011-01-01

    We propose that social categorization can encourage particular forms of intergroup cooperation because it differentiates a group in need from a group that can give aid. Moreover, social categorization is most likely to occur when individuals perceive procedural justice (i.e., fair treatment) from authorities in a superordinate group that includes the individuals' subgroup. Two field studies investigating relations between local and foreign coworkers tested not only this prediction, but also whether high social categorization and procedural justice would yield a dual identity, in which group members identify simultaneously with their social category and the superordinate group. Both studies supported our predictions: Local employees engaged a dual identity and offered knowledge to aid a foreign coworker's adjustment more often when local-foreign categorization and procedural justice from organizational authorities were high than when these variables were low. These discoveries point to controllable mechanisms that enable intergroup cooperation, and our findings have important implications for intergroup aid, expatriate adjustment, immigration, and multiculturalism.

  17. Assessment of computational prediction of tail buffeting

    NASA Technical Reports Server (NTRS)

    Edwards, John W.

    1990-01-01

    Assessments of the viability of computational methods and the computer resource requirements for the prediction of tail buffeting are made. Issues involved in the use of Euler and Navier-Stokes equations in modeling vortex-dominated and buffet flows are discussed and the requirement for sufficient grid density to allow accurate, converged calculations is stressed. Areas in need of basic fluid dynamics research are highlighted: vorticity convection; vortex breakdown; dynamic turbulence modeling for free shear layers; unsteady flow separation for moderately swept, rounded leading-edge wings; and vortex flows about wings at high subsonic speeds. An estimate of the computer run time for a buffeting response calculation for a full span F-15 aircraft indicates that an improvement in computer and/or algorithm efficiency of three orders of magnitude is needed to enable routine use of such methods. Attention is also drawn to significant uncertainties in the estimates, in particular with regard to nonlinearities contained within the modeling and the question of the repeatability or randomness of buffeting response.

  18. Neural Network Assisted Inverse Dynamic Guidance for Terminally Constrained Entry Flight

    PubMed Central

    Chen, Wanchun

    2014-01-01

    This paper presents a neural network assisted entry guidance law that is designed by applying Bézier approximation. It is shown that a fully constrained approximation of a reference trajectory can be made by using the Bézier curve. Applying this approximation, an inverse dynamic system for an entry flight is solved to generate the guidance command. The guidance solution thus obtained satisfies the terminal constraints for position, flight path, and azimuth angle. In order to satisfy the terminal velocity constraint as well, a prediction of the terminal velocity is required, based on which the approximated Bézier curve is adjusted. An artificial neural network is used for this prediction of the terminal velocity. The method enables faster implementation in achieving fully constrained entry flight. Results from simulations indicate improved performance of the neural network assisted method. The scheme is expected to have prospects for further research on automated onboard control of terminal velocity for both reentry and terminal guidance laws. PMID:24723821
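
    For concreteness, evaluating a Bézier curve by de Casteljau's algorithm is compact; a minimal sketch with invented control points. Adjusting the final control points reshapes the end of the approximated trajectory (as done here to meet the predicted terminal velocity) while the earlier points preserve the initial constraints.

        import numpy as np

        def bezier_point(ctrl, t):
            """Evaluate a Bezier curve at t in [0, 1]; ctrl is (n+1, dim)."""
            pts = np.asarray(ctrl, dtype=float).copy()
            while len(pts) > 1:                      # de Casteljau reduction
                pts = (1 - t) * pts[:-1] + t * pts[1:]
            return pts[0]

        ctrl = np.array([[0.0, 0.0], [0.3, 1.0], [0.7, 1.2], [1.0, 0.5]])
        print(bezier_point(ctrl, 0.5))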

  19. Neuroimaging in epilepsy.

    PubMed

    Sidhu, Meneka Kaur; Duncan, John S; Sander, Josemir W

    2018-05-17

    Epilepsy neuroimaging is important for detecting the seizure onset zone, predicting and preventing deficits from surgery and illuminating mechanisms of epileptogenesis. An aspiration is to integrate imaging and genetic biomarkers to enable personalized epilepsy treatments. The ability to detect lesions, particularly focal cortical dysplasia and hippocampal sclerosis, is increased using ultra high-field imaging and postprocessing techniques such as automated volumetry, T2 relaxometry, voxel-based morphometry and surface-based techniques. Statistical analysis of PET and single photon emission computer tomography (STATISCOM) are superior to qualitative analysis alone in identifying focal abnormalities in MRI-negative patients. These methods have also been used to study mechanisms of epileptogenesis and pharmacoresistance.Recent language fMRI studies aim to localize, and also lateralize language functions. Memory fMRI has been recommended to lateralize mnemonic function and predict outcome after surgery in temporal lobe epilepsy. Combinations of structural, functional and post-processing methods have been used in multimodal and machine learning models to improve the identification of the seizure onset zone and increase understanding of mechanisms underlying structural and functional aberrations in epilepsy.

  20. High Precision Rovibrational Spectroscopy of OH+

    NASA Astrophysics Data System (ADS)

    Markus, Charles R.; Hodges, James N.; Perry, Adam J.; Kocheril, G. Stephen; Müller, Holger S. P.; McCall, Benjamin J.

    2016-02-01

    The molecular ion OH+ has long been known to be an important component of the interstellar medium. Its relative abundance can be used to indirectly measure cosmic ray ionization rates of hydrogen, and it is the first intermediate in the interstellar formation of water. To date, only a limited number of pure rotational transitions have been observed in the laboratory making it necessary to indirectly calculate rotational levels from high-precision rovibrational spectroscopy. We have remeasured 30 transitions in the fundamental band with MHz-level precision, in order to enable the prediction of a THz spectrum of OH+. The ions were produced in a water cooled discharge of O2, H2, and He, and the rovibrational transitions were measured with the technique Noise Immune Cavity Enhanced Optical Heterodyne Velocity Modulation Spectroscopy. These values have been included in a global fit of field free data to a 3Σ- linear molecule effective Hamiltonian to determine improved spectroscopic parameters which were used to predict the pure rotational transition frequencies.
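
    The full 3Σ- effective Hamiltonian is beyond a short example, but the fit-then-predict workflow can be illustrated with a toy centrifugally distorted rotor that ignores fine structure; the frequencies below are invented, and only the shape of the calculation mirrors the record's global fit.

        import numpy as np
        from scipy.optimize import curve_fit

        def rot_line(j_lower, B, D):
            """Toy rotational transition J -> J+1: nu = 2B(J+1) - 4D(J+1)^3."""
            j1 = j_lower + 1
            return 2 * B * j1 - 4 * D * j1**3

        j_lower = np.array([0, 1, 2, 3])
        nu_obs = np.array([971.8, 1943.3, 2914.1, 3883.9])  # GHz, made-up values
        (B, D), _ = curve_fit(rot_line, j_lower, nu_obs, p0=[486.0, 1e-3])
        print(rot_line(4, B, D))  # predicted frequency of the next line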

  1. Graph Learning in Knowledge Bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Sean; Wang, Daisy Zhe

    The amount of text data has been growing exponentially in recent years, giving rise to automatic information extraction methods that store text annotations in a database. The current state-of-the-art structured prediction methods, however, are likely to contain errors, and it is important to be able to manage the overall uncertainty of the database. On the other hand, the advent of crowdsourcing has enabled humans to aid machine algorithms at scale. As part of this project we introduced pi-CASTLE, a system that optimizes and integrates human and machine computing as applied to a complex structured prediction problem involving conditional random fields (CRFs). We proposed strategies grounded in information theory to select a token subset, formulate questions for the crowd to label, and integrate these labelings back into the database using a method of constrained inference. On both a text segmentation task over academic citations and a named entity recognition task over tweets, we showed an order of magnitude improvement in accuracy gain over baseline methods.
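
    One information-theoretic selection strategy of the kind described is maximum-marginal-entropy sampling; this minimal sketch is an illustration in that spirit, not pi-CASTLE's actual criterion.

        import numpy as np

        def select_tokens_for_crowd(marginals, k):
            """Pick the k tokens whose per-token label marginals (e.g., from CRF
            inference) have the highest Shannon entropy, i.e., where the machine
            is least certain and a crowd label is most informative."""
            p = np.clip(np.asarray(marginals, dtype=float), 1e-12, 1.0)
            entropy = -(p * np.log(p)).sum(axis=1)
            return np.argsort(entropy)[::-1][:k]

        m = [[0.98, 0.01, 0.01], [0.40, 0.35, 0.25], [0.60, 0.30, 0.10]]
        print(select_tokens_for_crowd(m, 1))  # -> [1], the most uncertain token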

  2. Theory of chaotic orbital variations confirmed by Cretaceous geological evidence

    NASA Astrophysics Data System (ADS)

    Ma, Chao; Meyers, Stephen R.; Sageman, Bradley B.

    2017-02-01

    Variations in the Earth’s orbit and spin vector are a primary control on insolation and climate; their recognition in the geological record has revolutionized our understanding of palaeoclimate dynamics, and has catalysed improvements in the accuracy and precision of the geological timescale. Yet the secular evolution of the planetary orbits beyond 50 million years ago remains highly uncertain, and the chaotic dynamical nature of the Solar System predicted by theoretical models has yet to be rigorously confirmed by well constrained (radioisotopically calibrated and anchored) geological data. Here we present geological evidence for a chaotic resonance transition associated with interactions between the orbits of Mars and the Earth, using an integrated radioisotopic and astronomical timescale from the Cretaceous Western Interior Basin of what is now North America. This analysis confirms the predicted chaotic dynamical behaviour of the Solar System, and provides a constraint for refining numerical solutions for insolation, which will enable a more precise and accurate geological timescale to be produced.

  3. Interatomic Potentials for Structure Simulation of Alkaline-Earth Cuprates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eremin, N.N.; Leonyuk, L.I.; Urusov, V.S.

    2001-05-01

    A specific potential model of interionic interactions was derived in which the crystal structures of alkaline-earth cuprates were satisfactorily described and some of their physical properties were predicted. It was found that a harmonic three-particle O-Cu-O potential and some Morse-type contributions to the simple Buckingham-type Cu-O repulsive potential enable one to substantially improve the results of crystal structure modeling for cuprates. The obtained potential set seems to be well transferable for different cuprates, despite the variety in linkages of the CuO{sub 4} groups. In the present work this potential set model was applied in the crystal structure modeling for Ca{sub 2}CuO{sub 3}, CaCuO{sub 2}, SrCuO{sub 3}, (Sr{sub 1.19}Ca{sub 0.73})Cu{sub 2}O{sub 4}, and BaCuO{sub 2}. Some elastic and energetic properties of the compounds in question were predicted.

  4. Can we Predict Disease Course with Clinical Factors?

    PubMed

    Vegh, Zsuzsanna; Kurti, Zsuzsanna; Golovics, Petra A; Lakatos, Peter L

    2018-01-01

    The disease phenotype at diagnosis and the disease course of Crohn's disease (CD) and ulcerative colitis (UC) show remarkable heterogeneity across patients. This review aims to summarize the currently available evidence on clinical and some environmental predictive factors, which clinicians should evaluate in everyday practice together with other laboratory and imaging data to prevent disease progression, enable a more personalized therapy, and avoid negative disease outcomes. In recent population-based epidemiological and referral cohort studies, the evolution of disease phenotype of CD and UC varied significantly. Most CD and severe UC patients still require hospitalization or surgery/colectomy during follow-up. A change in the natural history of inflammatory bowel diseases (IBD), with improved outcomes in parallel with tailored positioning of aggressive immunomodulator and biological therapy, has been suspected. According to the currently available literature, it is of major importance to refer IBD cases at risk for adverse disease outcomes as early in the disease course as possible. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. Inlet Flow Control and Prediction Technologies for Embedded Propulsion Systems

    NASA Technical Reports Server (NTRS)

    McMillan, Michelle L.; Gissen, Abe; Vukasinovic, Bojan; Lakebrink, Matthew T.; Glezer, Ari; Mani, Mori; Mace, James

    2010-01-01

    Fail-safe inlet flow control may enable high-speed cruise efficiency, low noise signature, and reduced fuel-burn goals for hybrid wing-body aircraft. The objectives of this program are to develop flow control and prediction methodologies for boundary-layer ingesting (BLI) inlets used in these aircraft. This report covers the second of a three year program. The approach integrates experiments and numerical simulations. Both passive and active flow-control devices were tested in a small-scale wind tunnel. Hybrid actuation approaches, combining a passive microvane and active synthetic jet, were tested in various geometric arrangements. Detailed flow measurements were taken to provide insight into the flow physics. Results of the numerical simulations were correlated against experimental data. The sensitivity of results to grid resolution and turbulence models was examined. Aerodynamic benefits from microvanes and microramps were assessed when installed in an offset BLI inlet. Benefits were quantified in terms of recovery and distortion changes. Microvanes were more effective than microramps at improving recovery and distortion.

  6. Theory of chaotic orbital variations confirmed by Cretaceous geological evidence.

    PubMed

    Ma, Chao; Meyers, Stephen R; Sageman, Bradley B

    2017-02-22

    Variations in the Earth's orbit and spin vector are a primary control on insolation and climate; their recognition in the geological record has revolutionized our understanding of palaeoclimate dynamics, and has catalysed improvements in the accuracy and precision of the geological timescale. Yet the secular evolution of the planetary orbits beyond 50 million years ago remains highly uncertain, and the chaotic dynamical nature of the Solar System predicted by theoretical models has yet to be rigorously confirmed by well constrained (radioisotopically calibrated and anchored) geological data. Here we present geological evidence for a chaotic resonance transition associated with interactions between the orbits of Mars and the Earth, using an integrated radioisotopic and astronomical timescale from the Cretaceous Western Interior Basin of what is now North America. This analysis confirms the predicted chaotic dynamical behaviour of the Solar System, and provides a constraint for refining numerical solutions for insolation, which will enable a more precise and accurate geological timescale to be produced.

  7. Bootstrap study of genome-enabled prediction reliabilities using haplotype blocks across Nordic Red cattle breeds.

    PubMed

    Cuyabano, B C D; Su, G; Rosa, G J M; Lund, M S; Gianola, D

    2015-10-01

    This study compared the accuracy of genome-enabled prediction models using individual single nucleotide polymorphisms (SNP) or haplotype blocks as covariates when using either a single breed or a combined population of Nordic Red cattle. The main objective was to compare predictions of breeding values of complex traits using a combined training population with haplotype blocks, with predictions using a single breed as training population and individual SNP as predictors. To compare the prediction reliabilities, bootstrap samples were taken from the test data set. With the bootstrapped samples of prediction reliabilities, we built and graphed confidence ellipses to allow comparisons. Finally, measures of statistical distances were used to calculate the gain in predictive ability. Our analyses are innovative in the context of assessment of predictive models, allowing a better understanding of prediction reliabilities and providing a statistical basis to effectively calibrate whether one prediction scenario is indeed more accurate than another. An ANOVA indicated that use of haplotype blocks produced significant gains mainly when Bayesian mixture models were used but not when Bayesian BLUP was fitted to the data. Furthermore, when haplotype blocks were used to train prediction models in a combined Nordic Red cattle population, we obtained up to a statistically significant 5.5% average gain in prediction accuracy, over predictions using individual SNP and training the model with a single breed. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
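
    A minimal sketch of bootstrapping a prediction reliability over a test set, with a percentile interval standing in for the study's confidence ellipses (here reliability is taken as the correlation between predicted and realized values; the study's exact statistic may differ).

        import numpy as np

        def bootstrap_reliability(y_true, y_pred, n_boot=2000, seed=1):
            """Point estimate and 95% percentile interval of corr(y_true, y_pred)."""
            y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
            rng = np.random.default_rng(seed)
            n = len(y_true)
            stats = np.empty(n_boot)
            for b in range(n_boot):
                idx = rng.integers(0, n, n)          # resample with replacement
                stats[b] = np.corrcoef(y_true[idx], y_pred[idx])[0, 1]
            stats.sort()
            return (np.corrcoef(y_true, y_pred)[0, 1],
                    (stats[int(0.025 * n_boot)], stats[int(0.975 * n_boot)]))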

  8. Maximization of the annual energy production of wind power plants by optimization of layout and yaw-based wake control: Maximization of wind plant AEP by optimization of layout and wake control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gebraad, Pieter; Thomas, Jared J.; Ning, Andrew

    This paper presents a wind plant modeling and optimization tool that enables the maximization of wind plant annual energy production (AEP) using yaw-based wake steering control and layout changes. The tool is an extension of a wake engineering model describing the steady-state effects of yaw on wake velocity profiles and power productions of wind turbines in a wind plant. To make predictions of a wind plant's AEP, necessary extensions of the original wake model include coupling it with a detailed rotor model and a control policy for turbine blade pitch and rotor speed. This enables the prediction of power production with wake effects throughout a range of wind speeds. We use the tool to perform an example optimization study on a wind plant based on the Princess Amalia Wind Park. In this case study, combined optimization of layout and wake steering control increases AEP by 5%. The power gains from wake steering control are highest for region 1.5 inflow wind speeds, and they continue to be present to some extent for the above-rated inflow wind speeds. The results show that layout optimization and wake steering are complementary because significant AEP improvements can be achieved with wake steering in a wind plant layout that is already optimized to reduce wake losses.
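
    The AEP objective itself is simple once the plant model supplies power per inflow bin; a minimal sketch (names illustrative): an optimizer searches yaw offsets and turbine positions to maximize this quantity, which is why layout changes and wake steering can be optimized jointly.

        import numpy as np

        HOURS_PER_YEAR = 8760.0

        def annual_energy_production(plant_power, wind_probability):
            """plant_power[i, j]: plant power (W) for wind-speed bin i and
            direction bin j, including wake and yaw-steering effects;
            wind_probability[i, j]: long-term frequency of that bin."""
            return HOURS_PER_YEAR * np.sum(plant_power * wind_probability)  # Wh/yr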

  9. Development and external validation of nomograms to predict the risk of skeletal metastasis at the time of diagnosis and skeletal metastasis-free survival in nasopharyngeal carcinoma.

    PubMed

    Yang, Lin; Xia, Liangping; Wang, Yan; He, Shasha; Chen, Haiyang; Liang, Shaobo; Peng, Peijian; Hong, Shaodong; Chen, Yong

    2017-09-06

    The skeletal system is the most common site of distant metastasis in nasopharyngeal carcinoma (NPC); various prognostic factors have been reported for skeletal metastasis, though most studies have focused on a single factor. We aimed to establish nomograms to effectively predict skeletal metastasis at initial diagnosis (SMAD) and skeletal metastasis-free survival (SMFS) in NPC. A total of 2685 patients with NPC who received bone scintigraphy (BS) and/or 18F-deoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) and 2496 patients without skeletal metastasis were retrospectively assessed to develop individual nomograms for SMAD and SMFS. The models were validated externally using separate cohorts of 1329 and 1231 patients treated at two other institutions. Five independent prognostic factors were included in each nomogram. The SMAD nomogram had a significantly higher c-index than the TNM staging system (training cohort, P = 0.005; validation cohort, P < 0.001). The SMFS nomogram had significantly higher c-index values in the training and validation sets than the TNM staging system (P < 0.001 and P = 0.005, respectively). Three proposed risk stratification groups were created using the nomograms, and enabled significant discrimination of SMFS for each risk group. The prognostic nomograms established in this study enable accurate stratification of distinct risk groups for skeletal metastasis, which may improve counseling and facilitate individualized management of patients with NPC.

  10. Simulation of image formation in x-ray coded aperture microscopy with polycapillary optics.

    PubMed

    Korecki, P; Roszczynialski, T P; Sowa, K M

    2015-04-06

    In x-ray coded aperture microscopy with polycapillary optics (XCAMPO), the microstructure of focusing polycapillary optics is used as a coded aperture and enables depth-resolved x-ray imaging at a resolution better than the focal spot dimensions. Improvements in the resolution and development of 3D encoding procedures require a simulation model that can predict the outcome of XCAMPO experiments. In this work we introduce a model of image formation in XCAMPO which enables calculation of XCAMPO datasets for arbitrary positions of the object relative to the focal plane as well as to incorporate optics imperfections. In the model, the exit surface of the optics is treated as a micro-structured x-ray source that illuminates a periodic object. This makes it possible to express the intensity of XCAMPO images as a convolution series and to perform simulations by means of fast Fourier transforms. For non-periodic objects, the model can be applied by enforcing artificial periodicity and setting the spatial period larger than the field of view. Simulations are verified by comparison with experimental data.
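
    In its simplest form, the convolution-series idea reduces to circular convolution computed with FFTs; a minimal numpy sketch (a single convolution term, not the authors' full series):

        import numpy as np

        def periodic_image(source, obj):
            """Circular convolution of a structured-source intensity map with a
            periodic object map. Padding the object so its period exceeds the
            field of view handles effectively non-periodic objects."""
            return np.real(np.fft.ifft2(np.fft.fft2(source) * np.fft.fft2(obj)))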

  11. Understanding pressurized metered dose inhaler performance.

    PubMed

    Ivey, James W; Vehring, Reinhard; Finlay, Warren H

    2015-06-01

    Deepening the current understanding of the factors governing the performance of the pressurized metered dose inhaler (pMDI) has the potential to benefit patients by providing improved drugs for current indications as well as by enabling new areas of therapy. Although a great deal of work has been conducted to this end, our knowledge of the physical mechanisms that drive pMDI performance remains incomplete. This review focuses on research into the influence of device and formulation variables on pMDI performance metrics. Literature in the areas of dose metering, atomization and aerosol evolution and deposition is covered, with an emphasis on studies of a more fundamental nature. Simple models which may be of use to those developing pMDI products are summarized. Although researchers have had good success utilizing an empirically developed knowledge base to predict pMDI performance, such knowledge may not be applicable when pursuing innovations in device or formulation technology. Developing a better understanding of the underlying mechanisms is a worthwhile investment for those working to enable the next generation of pMDI products.

  12. Systems Vaccinology: Enabling rational vaccine design with systems biological approaches

    PubMed Central

    Hagan, Thomas; Nakaya, Helder I.; Subramaniam, Shankar; Pulendran, Bali

    2015-01-01

    Vaccines have drastically reduced the mortality and morbidity of many diseases. However, vaccines have historically been developed empirically, and recent development of vaccines against current pandemics such as HIV and malaria has been met with difficulty. The advent of high-throughput technologies, coupled with systems biological methods of data analysis, has enabled researchers to interrogate the entire complement of a variety of molecular components within cells, and characterize the myriad interactions among them in order to model and understand the behavior of the system as a whole. In the context of vaccinology, these tools permit exploration of the molecular mechanisms by which vaccines induce protective immune responses. Here we review the recent advances, challenges, and potential of systems biological approaches in vaccinology. If the challenges facing this developing field can be overcome, systems vaccinology promises to empower the identification of early predictive signatures of vaccine response, as well as novel and robust correlates of protection from infection. Such discoveries, along with the improved understanding of immune responses to vaccination they impart, will play an instrumental role in development of the next generation of rationally designed vaccines. PMID:25858860

  13. Climate Model Diagnostic Analyzer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  14. A wearable device for monitoring and prevention of repetitive ankle sprain.

    PubMed

    Attia, Mohammed; Taher, Mona F

    2015-01-01

    This study presents the design and implementation of a wearable wireless device, connected to a smart phone, which monitors and prevents repetitive ankle sprain due to chronic ankle instability (CAI). The device prevents this common foot injury by electrical stimulation of the peroneal muscles using surface electrodes, which causes dorsiflexion of the foot. This is done after measuring ankle kinematics using inertial motion sensors and predicting ankle sprain. The prototype implemented here has a fast response time of 7 msec, which enables prevention of ankle sprain before ligament damage occurs. Wireless communication between the components of the device, together with their small size, low cost, and low power consumption, makes it unobtrusive and easy to wear without hindering normal activities. The device connects via Bluetooth to an Android smart phone application for continuous data logging and reporting to keep track of the incidences of possible ankle sprain and correction. This is a significant feature of this device since it enables monitoring of patients with CAI and quantifying progression of the condition or improvement in the case of treatment.

  15. Crop improvement using life cycle datasets acquired under field conditions.

    PubMed

    Mochida, Keiichi; Saisho, Daisuke; Hirayama, Takashi

    2015-01-01

    Crops are exposed to various environmental stresses in the field throughout their life cycle. Modern plant science has provided remarkable insights into the molecular networks of plant stress responses in laboratory conditions, but the responses of different crops to environmental stresses in the field need to be elucidated. Recent advances in omics analytical techniques and information technology have enabled us to integrate data from a spectrum of physiological metrics of field crops. The interdisciplinary efforts of plant science and data science enable us to explore factors that affect crop productivity and identify stress tolerance-related genes and alleles. Here, we describe recent advances in technologies that are key components for data driven crop design, such as population genomics, chronological omics analyses, and computer-aided molecular network prediction. Integration of the outcomes from these technologies will accelerate our understanding of crop phenology under practical field situations and identify key characteristics to represent crop stress status. These elements would help us to genetically engineer "designed crops" to prevent yield shortfalls because of environmental fluctuations due to future climate change.

  16. High-resolution vertical profiles of groundwater electrical conductivity (EC) and chloride from direct-push EC logs

    NASA Astrophysics Data System (ADS)

    Bourke, Sarah A.; Hermann, Kristian J.; Hendry, M. Jim

    2017-11-01

    Elevated groundwater salinity associated with produced water, leaching from landfills or secondary salinity can degrade arable soils and potable water resources. Direct-push electrical conductivity (EC) profiling enables rapid, relatively inexpensive, high-resolution in-situ measurements of subsurface salinity, without requiring core collection or installation of groundwater wells. However, because the direct-push tool measures the bulk EC of both solid and liquid phases (ECa), incorporation of ECa data into regional or historical groundwater data sets requires the prediction of pore water EC (ECw) or chloride (Cl-) concentrations from measured ECa. Statistical linear regression and physically based models for predicting ECw and Cl- from ECa profiles were tested on a brine plume in central Saskatchewan, Canada. A linear relationship between ECa/ECw and porosity was more accurate for predicting ECw and Cl- concentrations than a power-law relationship (Archie's Law). Despite clay contents of up to 96%, the addition of terms to account for electrical conductance in the solid phase did not improve model predictions. In the absence of porosity data, statistical linear regression models adequately predicted ECw and Cl- concentrations from direct-push ECa profiles (ECw = 5.48 ECa + 0.78, R2 = 0.87; Cl- = 1,978 ECa - 1,398, R2 = 0.73). These statistical models can be used to predict ECw in the absence of lithologic data and will be particularly useful for initial site assessments. The more accurate linear physically based model can be used to predict ECw and Cl- as porosity data become available and the site-specific ECw-Cl- relationship is determined.
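
    Applying the reported regressions to new ECa data is direct; a minimal sketch using the coefficients quoted above (units are assumed to follow the original calibration, e.g., EC in mS/cm and chloride in mg/L):

        import numpy as np

        def predict_pore_water(ec_a):
            """Statistical linear regressions quoted in the abstract."""
            ec_a = np.asarray(ec_a, dtype=float)
            ec_w = 5.48 * ec_a + 0.78       # R2 = 0.87
            cl = 1978.0 * ec_a - 1398.0     # R2 = 0.73
            return ec_w, cl

        print(predict_pore_water([1.0, 2.5]))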

  17. Genome wide selection in Citrus breeding.

    PubMed

    Gois, I B; Borém, A; Cristofani-Yaly, M; de Resende, M D V; Azevedo, C F; Bastianel, M; Novelli, V M; Machado, M A

    2016-10-17

    Genome wide selection (GWS) is essential for the genetic improvement of perennial species such as Citrus because of its ability to increase gain per unit time and to enable the efficient selection of characteristics with low heritability. This study assessed GWS efficiency in a population of Citrus and compared it with selection based on phenotypic data. A total of 180 individual trees from a cross between Pera sweet orange (Citrus sinensis Osbeck) and Murcott tangor (Citrus sinensis Osbeck x Citrus reticulata Blanco) were evaluated for 10 characteristics related to fruit quality. The hybrids were genotyped using 5287 DArT_seq™ (diversity arrays technology) molecular markers and their effects on phenotypes were predicted using the random regression best linear unbiased predictor (rr-BLUP) method. The predictive ability, prediction bias, and accuracy of GWS were estimated to verify its effectiveness for phenotype prediction. The proportion of genetic variance explained by the markers was also computed. The heritability of the traits, as determined by markers, was 16-28%. The predictive ability of these markers ranged from 0.53 to 0.64, and the regression coefficients between predicted and observed phenotypes were close to unity. Over 35% of the genetic variance was accounted for by the markers. Accuracy estimates with GWS were lower than those obtained by phenotypic analysis; however, GWS was superior in terms of genetic gain per unit time. Thus, GWS may be useful for Citrus breeding as it can predict phenotypes early and accurately, and reduce the length of the selection cycle. This study demonstrates the feasibility of genomic selection in Citrus.
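
    The rr-BLUP estimator named here has a compact closed form; a minimal sketch, where lam (the ratio of residual variance to marker-effect variance) must be estimated separately, for example by REML:

        import numpy as np

        def rr_blup(Z, y, lam):
            """Ridge-regression BLUP of marker effects:
            solve (Zc'Zc + lam*I) u = Zc'(y - mean(y)), Zc = centered genotypes."""
            Zc = Z - Z.mean(axis=0)
            u = np.linalg.solve(Zc.T @ Zc + lam * np.eye(Z.shape[1]),
                                Zc.T @ (y - y.mean()))
            return u

        def gebv(Z_new, train_mean, u, y_mean):
            """Genomic estimated breeding values for new genotypes, centered
            with the training-set marker means."""
            return y_mean + (Z_new - train_mean) @ u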

  18. Towards a National Space Weather Predictive Capability

    NASA Astrophysics Data System (ADS)

    Fox, N. J.; Lindstrom, K. L.; Ryschkewitsch, M. G.; Anderson, B. J.; Gjerloev, J. W.; Merkin, V. G.; Kelly, M. A.; Miller, E. S.; Sitnov, M. I.; Ukhorskiy, A. Y.; Erlandson, R. E.; Barnes, R. J.; Paxton, L. J.; Sotirelis, T.; Stephens, G.; Comberiate, J.

    2014-12-01

    National needs for space weather information and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF, and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition and to develop the tools required by those who rely on this information. In this presentation we will review datasets, tools, and models that have resulted from research by scientists at JHU/APL, and examine how they could be applied to support space weather applications in coordination with other community assets and capabilities.

  19. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777
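
    As an illustration of the kind of quantitative exposure-response model the abstract describes, the sketch below fits a Hill equation to a hypothetical toxicity-biomarker readout and then uses the fitted model to predict response at an unmeasured exposure. The data, the choice of a Hill model, and all parameter values are illustrative assumptions, not results from the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical dose-response readout: fraction of a toxicity
        # biomarker activated at each exposure concentration (µM).
        dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
        resp = np.array([0.02, 0.05, 0.14, 0.33, 0.60, 0.82, 0.93])

        def hill(c, emax, ec50, h):
            """Hill equation: emax * c^h / (ec50^h + c^h)."""
            return emax * c**h / (ec50**h + c**h)

        # Fit the three Hill parameters by nonlinear least squares.
        params, _ = curve_fit(hill, dose, resp, p0=[1.0, 1.0, 1.0])
        emax, ec50, h = params
        print(f"Emax={emax:.2f}  EC50={ec50:.2f} µM  Hill slope={h:.2f}")

        # The fitted model now predicts response at unmeasured exposures,
        # e.g. for a candidate point-of-departure screen.
        print(f"predicted response at 0.5 µM: {hill(0.5, *params):.2f}")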
