Sample records for data-based predictive control

  1. Data-Based Predictive Control with Multirate Prediction Step

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is that its computational requirements increase with the length of the prediction horizon. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.
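The core idea above, deriving a predictive model from input-output data alone, can be sketched in a few lines. The plant below is a hypothetical first-order system, and the least-squares Markov-parameter fit is a generic system-identification step standing in for the paper's multirate formulation, not a reproduction of it:

```python
import numpy as np

# Hypothetical plant, unknown to the "designer": y[k] = 0.8*y[k-1] + 0.5*u[k-1].
rng = np.random.default_rng(0)
N = 500
u = rng.standard_normal(N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]

# Fit the first q Markov parameters h_i in y[k] ~ sum_{i=1..q} h_i * u[k-i]
# by least squares on rows of past inputs (most recent input first).
q = 20
U = np.array([u[k - q:k][::-1] for k in range(q, N)])
h, *_ = np.linalg.lstsq(U, y[q:], rcond=None)
# For this plant the true values are h_i = 0.5 * 0.8**(i-1); the fitted h
# can then drive multi-step-ahead output predictions from inputs alone.
```

A receding-horizon controller would stack these Markov parameters into a prediction matrix and minimize the predicted tracking error over the horizon; the sketch stops at the identification step.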

  2. Adaptive Data-based Predictive Control for Short Take-off and Landing (STOL) Aircraft

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan Spencer; Acosta, Diana Michelle; Phan, Minh Q.

    2010-01-01

    Data-based Predictive Control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. The characteristics of adaptive data-based predictive control are particularly appropriate for the control of nonlinear and time-varying systems, such as Short Take-off and Landing (STOL) aircraft. STOL is a capability of interest to NASA because conceptual Cruise Efficient Short Take-off and Landing (CESTOL) transport aircraft offer the ability to reduce congestion in the terminal area by utilizing existing shorter runways at airports, as well as to lower community noise by flying steep approach and climb-out patterns that reduce the noise footprint of the aircraft. In this study, adaptive data-based predictive control is implemented as an integrated flight-propulsion controller for the outer-loop control of a CESTOL-type aircraft. Results show that the controller successfully tracks velocity while attempting to maintain a constant flight path angle, using longitudinal command, thrust and flap setting as the control inputs.

  3. Predicting Loss-of-Control Boundaries Toward a Piloting Aid

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan; Stepanyan, Vahram; Krishnakumar, Kalmanje

    2012-01-01

    This work presents an approach to predicting loss-of-control with the goal of providing the pilot a decision aid focused on maintaining the pilot's control action within predicted loss-of-control boundaries. The predictive architecture combines quantitative loss-of-control boundaries, a data-based predictive control boundary estimation algorithm and an adaptive prediction method to estimate Markov model parameters in real-time. The data-based loss-of-control boundary estimation algorithm estimates the boundary of a safe set of control inputs that will keep the aircraft within the loss-of-control boundaries for a specified time horizon. The adaptive prediction model generates estimates of the system Markov Parameters, which are used by the data-based loss-of-control boundary estimation algorithm. The combined algorithm is applied to a nonlinear generic transport aircraft to illustrate the features of the architecture.

  4. Prediction-based sampled-data H∞ controller design for attitude stabilisation of a rigid spacecraft with disturbances

    NASA Astrophysics Data System (ADS)

    Zhu, Baolong; Zhang, Zhiping; Zhou, Ding; Ma, Jie; Li, Shunli

    2017-08-01

    This paper investigates the H∞ control problem of the attitude stabilisation of a rigid spacecraft with external disturbances using prediction-based sampled-data control strategy. Aiming to achieve a 'virtual' closed-loop system, a type of parameterised sampled-data controller is designed by introducing a prediction mechanism. The resultant closed-loop system is equivalent to a hybrid system featured by a continuous-time and an impulsive differential system. By using a time-varying Lyapunov functional, a generalised bounded real lemma (GBRL) is first established for a kind of impulsive differential system. Based on this GBRL and Lyapunov functional approach, a sufficient condition is derived to guarantee the closed-loop system to be asymptotically stable and to achieve a prescribed H∞ performance. In addition, the controller parameter tuning is cast into a convex optimisation problem. Simulation and comparative results are provided to illustrate the effectiveness of the developed control scheme.

  5. Choosing the appropriate forecasting model for predictive parameter control.

    PubMed

    Aleti, Aldeida; Moser, Irene; Meedeniya, Indika; Grunske, Lars

    2014-01-01

    All commonly used stochastic optimisation algorithms have to be parameterised to perform effectively. Adaptive parameter control (APC) is an effective method used for this purpose. APC repeatedly adjusts parameter values during the optimisation process for optimal algorithm performance. The assignment of parameter values for a given iteration is based on previously measured performance. In recent research, time series prediction has been proposed as a method of projecting the probabilities to use for parameter value selection. In this work, we examine the suitability of a variety of prediction methods for the projection of future parameter performance based on previous data. All considered prediction methods make assumptions that the time series data must conform to for the prediction method to provide accurate projections. Looking specifically at parameters of evolutionary algorithms (EAs), we find that all standard EA parameters with the exception of population size conform largely to the assumptions made by the considered prediction methods. Evaluating the performance of these prediction methods, we find that linear regression provides the best results by a very small and statistically insignificant margin. Regardless of the prediction method, predictive parameter control outperforms state-of-the-art parameter control methods when the performance data adheres to the assumptions made by the prediction method. When a parameter's performance data does not adhere to the assumptions made by the forecasting method, the use of prediction does not have a notable adverse impact on the algorithm's performance.
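As a rough illustration of the prediction step, the sketch below fits an ordinary least-squares line to a short invented performance history for each parameter value and projects one iteration ahead; the series, parameter names, and the conversion of forecasts into selection probabilities are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

# Hypothetical performance history for two parameter values (e.g. two
# mutation rates), one quality measurement per iteration.
history = {
    "rate=0.01": [0.50, 0.55, 0.58, 0.62, 0.66],
    "rate=0.10": [0.70, 0.66, 0.61, 0.58, 0.55],
}

def forecast_next(series):
    """Project the next value with ordinary least-squares linear regression."""
    t = np.arange(len(series))
    slope, intercept = np.polyfit(t, series, 1)
    return slope * len(series) + intercept

predicted = {k: forecast_next(v) for k, v in history.items()}
# Turn the forecasts into selection probabilities (clipped to stay positive).
quality = np.clip(list(predicted.values()), 1e-9, None)
probs = quality / quality.sum()
```

Here the improving parameter value receives the larger selection probability, which is the essence of predictive parameter control.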

  6. Prediction of Safety Stock Using Fuzzy Time Series (FTS) and Technology of Radio Frequency Identification (RFID) for Stock Control at Vendor Managed Inventory (VMI)

    NASA Astrophysics Data System (ADS)

    Mashuri, Chamdan; Suryono; Suseno, Jatmiko Endro

    2018-02-01

    This research applied Fuzzy Time Series (FTS) and Radio Frequency Identification (RFID) technology to predict safety stock for stock control in Vendor Managed Inventory (VMI). Well-controlled stock increases company revenue and minimizes cost. The paper discusses an information system for safety stock prediction developed in the PHP programming language. The input data consisted of demand, acquired automatically, online and in real time using RFID, then sent to a server and stored in an online database. The acquired data were then predicted using the FTS algorithm, which defines the universe of discourse, determines the fuzzy sets, and partitions the universe of discourse through to the final step. The prediction results were displayed on the information system's dashboard. Using 60 demand data points, the predicted demand was 450.331 and the safety stock was 135.535. The prediction was validated against error deviation using a Mean Square Percent Error of 15%, showing that FTS is good enough at predicting demand and safety stock for stock control. For deeper analysis, the researchers varied the demand data and the universe of discourse U in FTS to obtain a range of results on the test data used.
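A minimal first-order FTS forecast along the lines the abstract describes (partition the universe of discourse, fuzzify the series, build fuzzy logical relationship groups, defuzzify with interval midpoints) might look as follows; the demand series and interval count are invented for illustration:

```python
import numpy as np

# Illustrative demand series and universe-of-discourse partition.
demand = [120, 132, 125, 140, 152, 149, 160, 171, 165, 180]
n_intervals = 5
lo, hi = min(demand) - 5, max(demand) + 5
edges = np.linspace(lo, hi, n_intervals + 1)
mids = (edges[:-1] + edges[1:]) / 2

def fuzzify(x):
    """Index of the interval (fuzzy set A_i) containing x."""
    return min(np.searchsorted(edges, x, side="right") - 1, n_intervals - 1)

# First-order fuzzy logical relationship groups: A_i -> {A_j, ...}.
states = [fuzzify(x) for x in demand]
groups = {}
for cur, nxt in zip(states, states[1:]):
    groups.setdefault(cur, set()).add(nxt)

# Forecast for the next step: average midpoint of the sets that the
# current state has historically transitioned to.
last = states[-1]
forecast = float(np.mean([mids[j] for j in sorted(groups.get(last, {last}))]))
```

A safety-stock figure would then be derived from the forecast and the observed demand variability; that step is omitted here.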

  7. Dynamic Simulation of Human Gait Model With Predictive Capability.

    PubMed

    Sun, Jinming; Wu, Shaoli; Voglewede, Philip A

    2018-03-01

    In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control instead of exclusive classical feedback control theory that controls based on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model developed includes two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model that has nine degrees-of-freedom (DOF). The plant model is validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compare the predicted output to the reference, and optimize the control input so that the predicted error is minimal. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait. The simulation results show that the developed model is able to simulate the kinematic output close to experimental data.
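The PD fallback mentioned for the two uncontrolled joints follows the textbook law u = Kp·e + Kd·ė. A minimal sketch on a hypothetical single joint modelled as a double integrator, with illustrative gains (not the paper's model or values):

```python
# PD set-point tracking for one joint modelled as a double integrator
# (theta'' = torque). Gains chosen for critical damping: wn = 10 rad/s.
kp, kd = 100.0, 20.0
dt, steps = 0.001, 5000
theta, omega = 0.0, 0.0          # joint angle [rad] and angular velocity [rad/s]
target = 0.5                     # desired joint angle [rad]
for _ in range(steps):
    torque = kp * (target - theta) + kd * (0.0 - omega)
    omega += torque * dt         # explicit Euler integration of the dynamics
    theta += omega * dt
```

Unlike the MPC loop, this law reacts only to the current error and error rate, which is exactly the "past error" limitation the abstract contrasts with predictive control.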

  8. Prediction of Regulation Reserve Requirements in California ISO Control Area based on BAAL Standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Makarov, Yuri V.; Samaan, Nader A.

    This paper presents new methodologies developed at Pacific Northwest National Laboratory (PNNL) to estimate regulation capacity requirements in the California ISO control area. Two approaches have been developed: (1) an approach based on statistical analysis of actual historical area control error (ACE) and regulation data, and (2) an approach based on the balancing authority ACE limit (BAAL) control performance standard. The approaches predict regulation reserve requirements on a day-ahead basis, including upward and downward requirements, for each operating hour of a day. California ISO data have been used to test the performance of the proposed algorithms. Results show that the software tool allows saving up to 30% on regulation procurement costs.

  9. Adaptive MPC based on MIMO ARX-Laguerre model.

    PubMed

    Ben Abdelwahed, Imen; Mbarek, Abdelkader; Bouzrara, Kais

    2017-03-01

    This paper proposes a method for synthesizing an adaptive predictive controller using a reduced-complexity model. The latter is given by the projection of the ARX model on Laguerre bases. The resulting model, entitled MIMO ARX-Laguerre, is characterized by an easy recursive representation. The adaptive predictive control law is computed based on multi-step-ahead finite-element predictors, identified directly from experimental input/output data. The model is tuned at each iteration by online identification algorithms for both the model parameters and the Laguerre poles. The proposed approach avoids the time-consuming numerical optimization algorithms associated with most common linear predictive control strategies, which makes it suitable for real-time implementation. The method is used to synthesize and test, in numerical simulations, adaptive predictive controllers for the CSTR process benchmark. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
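The online tuning described above can be illustrated with a standard recursive least-squares (RLS) update for a simple ARX(1,1) model; the plant coefficients, forgetting factor, and noise-free setting are assumptions for the sketch, and the paper's Laguerre-pole adaptation is not reproduced:

```python
import numpy as np

# Recursive least squares for the ARX(1,1) model
#   y[k] = a*y[k-1] + b*u[k-1],   theta = [a, b] updated at each sample.
rng = np.random.default_rng(1)
a_true, b_true = 0.7, 1.2        # illustrative "unknown" plant coefficients
theta = np.zeros(2)
P = np.eye(2) * 1000.0           # large initial covariance: no prior confidence
lam = 0.99                       # forgetting factor, tracks slowly varying plants
y_prev, u_prev = 0.0, 0.0
for _ in range(300):
    u = rng.standard_normal()
    y = a_true * y_prev + b_true * u_prev
    phi = np.array([y_prev, u_prev])           # regressor of past output/input
    k_gain = P @ phi / (lam + phi @ P @ phi)   # RLS gain
    theta = theta + k_gain * (y - phi @ theta) # correct by prediction error
    P = (P - np.outer(k_gain, phi @ P)) / lam  # covariance update
    y_prev, u_prev = y, u
```

Each loop iteration is O(n²) in the number of parameters, which is the kind of lightweight update that makes real-time adaptive predictive control feasible.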

  10. Prediction based active ramp metering control strategy with mobility and safety assessment

    NASA Astrophysics Data System (ADS)

    Fang, Jie; Tu, Lili

    2018-04-01

    Ramp metering (RM) is one of the most direct and efficient motorway traffic flow management measures for improving traffic conditions. However, for lack of traffic condition prediction, earlier studies did not quantitatively evaluate the impact of the applied RM control on traffic flow dynamics. In this study, an RM control algorithm adopting the Model Predictive Control (MPC) framework to predict and assess future traffic conditions is presented, taking into consideration both the current traffic conditions and the RM-controlled future traffic states. The designed RM control algorithm targets optimizing network mobility and safety performance. The algorithm is evaluated in a field-data-based simulation. A comparison of the scenario controlled by the presented algorithm with the uncontrolled scenario shows that the proposed RM control algorithm can effectively relieve congestion in the traffic network with no significant compromise in safety.

  11. Model predictive control of the solid oxide fuel cell stack temperature with models based on experimental data

    NASA Astrophysics Data System (ADS)

    Pohjoranta, Antti; Halinen, Matias; Pennanen, Jari; Kiviaho, Jari

    2015-03-01

    Generalized predictive control (GPC) is applied to control the maximum temperature in a solid oxide fuel cell (SOFC) stack and the temperature difference over the stack. GPC is a model predictive control method and the models utilized in this work are ARX-type (autoregressive with extra input), multiple input-multiple output, polynomial models that were identified from experimental data obtained from experiments with a complete SOFC system. The proposed control is evaluated by simulation with various input-output combinations, with and without constraints. A comparison with conventional proportional-integral-derivative (PID) control is also made. It is shown that if only the stack maximum temperature is controlled, a standard PID controller can be used to obtain output performance comparable to that obtained with the significantly more complex model predictive controller. However, in order to control the temperature difference over the stack, both the stack minimum and the maximum temperature need to be controlled and this cannot be done with a single PID controller. In such a case the model predictive controller provides a feasible and effective solution.

  12. Preserving privacy whilst maintaining robust epidemiological predictions.

    PubMed

    Werkman, Marleen; Tildesley, Michael J; Brooks-Pollock, Ellen; Keeling, Matt J

    2016-12-01

    Mathematical models are invaluable tools for quantifying potential epidemics and devising optimal control strategies in case of an outbreak. State-of-the-art models increasingly require detailed individual farm-based and sensitive data, which may not be available due to either lack of capacity for data collection or privacy concerns. However, in many situations, aggregated data are available for use. In this study, we systematically investigate the accuracy of predictions made by mathematical models initialised with varying data aggregations, using the UK 2001 Foot-and-Mouth Disease Epidemic as a case study. We consider the scenario when the only data available are aggregated into spatial grid cells, and develop a metapopulation model where individual farms in a single subpopulation are assumed to behave uniformly and transmit randomly. We also adapt this standard metapopulation model to capture heterogeneity in farm size and composition, using farm census data. Our results show that homogeneous models based on aggregated data overestimate final epidemic size but can perform well for predicting spatial spread. Recognising heterogeneity in farm sizes improves predictions of the final epidemic size, identifying risk areas, determining the likelihood of epidemic take-off and identifying the optimal control strategy. In conclusion, in cases where individual farm-based data are not available, models can still generate meaningful predictions, although care must be taken in their interpretation and use. Copyright © 2016. Published by Elsevier B.V.

  13. Intelligent Control of Micro Grid: A Big Data-Based Control Center

    NASA Astrophysics Data System (ADS)

    Liu, Lu; Wang, Yanping; Liu, Li; Wang, Zhiseng

    2018-01-01

    In this paper, the structure of a micro grid system with a big data-based control center is introduced. Energy data from distributed generation, storage and load are analyzed through the control center; from the results, new trends are predicted and applied as feedback to optimize the control. Therefore, each step of micro grid operation can be adjusted and organized in a form of comprehensive management. A framework for real-time data collection, data processing and data analysis is proposed by employing big data technology. Consequently, integrated distributed generation and an optimized energy storage and transmission process can be implemented in the micro grid system.

  14. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions.

    PubMed

    Kaufman, Leyla V; Wright, Mark G

    2017-07-07

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.

  15. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions

    PubMed Central

    Kaufman, Leyla V.; Wright, Mark G.

    2017-01-01

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments. PMID:28686180

  16. A Wavelet Neural Network Optimal Control Model for Traffic-Flow Prediction in Intelligent Transport Systems

    NASA Astrophysics Data System (ADS)

    Huang, Darong; Bai, Xing-Rong

    Based on wavelet transform and neural network theory, a traffic-flow prediction model for use in the optimal control of an Intelligent Transport system is constructed. First, the scale coefficients and wavelet coefficients are extracted from the online measured raw traffic-flow data via the wavelet transform. Secondly, an Artificial Neural Network traffic-flow prediction model is constructed and trained, using the coefficient sequences as inputs and the raw data as outputs. Simultaneously, the operating principle of the optimal control system for the traffic-flow forecasting model, the network topological structure and the data transmission model are designed. Finally, a simulated example shows that the technique is effective and accurate. The theoretical results indicate that the wavelet neural network prediction model and algorithms have broad prospects for practical application.

  17. Structured Kernel Subspace Learning for Autonomous Robot Navigation.

    PubMed

    Kim, Eunwoo; Choi, Sungjoon; Oh, Songhwai

    2018-02-14

    This paper considers two important problems for autonomous robot navigation in a dynamic environment, where the goal is to predict pedestrian motion and control a robot with the prediction for safe navigation. While there are several methods for predicting the motion of a pedestrian and controlling a robot to avoid incoming pedestrians, it is still difficult to safely navigate in a dynamic environment due to challenges, such as the varying quality and complexity of training data with unwanted noises. This paper addresses these challenges simultaneously by proposing a robust kernel subspace learning algorithm based on the recent advances in nuclear-norm and l1-norm minimization. We model the motion of a pedestrian and the robot controller using Gaussian processes. The proposed method efficiently approximates a kernel matrix used in Gaussian process regression by learning a low-rank structured matrix (with symmetric positive semi-definiteness) to find an orthogonal basis, which eliminates the effects of erroneous and inconsistent data. Based on structured kernel subspace learning, we propose a robust motion model and motion controller for safe navigation in dynamic environments. We evaluate the proposed robust kernel learning in various tasks, including regression, motion prediction, and motion control problems, and demonstrate that the proposed learning-based systems are robust against outliers and outperform existing regression and navigation methods.

  18. Novel Formulation of Adaptive MPC as EKF Using ANN Model: Multiproduct Semibatch Polymerization Reactor Case Study.

    PubMed

    Kamesh, Reddi; Rani, Kalipatnapu Yamuna

    2017-12-01

    In this paper, a novel formulation for nonlinear model predictive control (MPC) has been proposed incorporating the extended Kalman filter (EKF) control concept using a purely data-driven artificial neural network (ANN) model based on measurements for supervisory control. The proposed scheme consists of two modules focusing on online parameter estimation based on past measurements and control estimation over control horizon based on minimizing the deviation of model output predictions from set points along the prediction horizon. An industrial case study for temperature control of a multiproduct semibatch polymerization reactor posed as a challenge problem has been considered as a test bed to apply the proposed ANN-EKFMPC strategy at supervisory level as a cascade control configuration along with proportional integral controller [ANN-EKFMPC with PI (ANN-EKFMPC-PI)]. The proposed approach is formulated incorporating all aspects of MPC including move suppression factor for control effort minimization and constraint-handling capability including terminal constraints. The nominal stability analysis and offset-free tracking capabilities of the proposed controller are proved. Its performance is evaluated by comparison with a standard MPC-based cascade control approach using the same adaptive ANN model. The ANN-EKFMPC-PI control configuration has shown better controller performance in terms of temperature tracking, smoother input profiles, as well as constraint-handling ability compared with the ANN-MPC with PI approach for two products in summer and winter. The proposed scheme is found to be versatile although it is based on a purely data-driven model with online parameter estimation.

  19. Does Marital Status Predict the Odds of Suicidal Death in Taiwan? A Seven-Year Population-Based Study

    ERIC Educational Resources Information Center

    Yeh, Jui-Yuan; Xirasagar, Sudha; Liu, Tsai-Ching; Li, Chong-Yi; Lin, Herng-Ching

    2008-01-01

    Using nationwide, 7-year population-based data for 1997-2003, we examined marital status to see if it predicted suicide among the ethnic Chinese population of Taiwan. Using cause of death data, with a case-control design, two groups--total adult suicide deaths, n = 17,850, the study group, and adult deaths other than suicide, n = 71,400 (randomly…

  20. A predictive estimation method for carbon dioxide transport by data-driven modeling with a physically-based data model

    NASA Astrophysics Data System (ADS)

    Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun

    2017-11-01

    In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems.
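The Ogata-Banks solution adopted as the data model has a compact closed form for a continuous source at x = 0 under steady flow. A direct transcription, with illustrative units and without guarding against overflow of exp(vx/D) far downstream:

```python
from math import erfc, exp, sqrt

def ogata_banks(x, t, v, D, c0=1.0):
    """Ogata-Banks solution of the 1-D advection-dispersion equation for a
    continuous source of concentration c0 at x = 0:
        C(x,t) = (c0/2) * [erfc((x - v*t)/(2*sqrt(D*t)))
                           + exp(v*x/D) * erfc((x + v*t)/(2*sqrt(D*t)))]
    with steady flow velocity v and dispersion coefficient D."""
    s = 2.0 * sqrt(D * t)
    return 0.5 * c0 * (erfc((x - v * t) / s)
                       + exp(v * x / D) * erfc((x + v * t) / s))
```

At the source the boundary condition C(0, t) = c0 holds exactly, and concentration decays monotonically with distance; a data-driven ensemble as in the study would fit v and D to monitoring data and average multiple such estimates.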

  1. Predictive onboard flow control for packet switching satellites

    NASA Technical Reports Server (NTRS)

    Bobinsky, Eric A.

    1992-01-01

    We outline two alternate approaches to predicting the onset of congestion in a packet switching satellite, and argue that predictive, rather than reactive, flow control is necessary for the efficient operation of such a system. The first method discussed is based on standard statistical techniques, which are used to periodically calculate a probability of near-term congestion based on arrival rate statistics. If this probability exceeds a preset threshold, the satellite would transmit a rate-reduction signal to all active ground stations. The second method discussed would utilize a neural network to periodically predict the occurrence of buffer overflow based on input data which would include, in addition to arrival rates, the distributions of packet lengths, source addresses, and destination addresses.

  2. Assessment of quantitative structure-activity relationship of toxicity prediction models for Korean chemical substance control legislation

    PubMed Central

    Kim, Kwang-Yon; Shin, Seong Eun; No, Kyoung Tai

    2015-01-01

    Objectives For successful adoption of legislation controlling registration and assessment of chemical substances, it is important to obtain sufficient toxicological experimental evidence and other related information. It is also essential to obtain a sufficient number of predicted risk and toxicity results. Particularly, methods used in predicting toxicities of chemical substances during acquisition of required data ultimately become an economic method for future dealings with new substances. Although the need for such methods is gradually increasing, the required information about reliability and applicability range has not been systematically provided. Methods There are various representative environmental and human toxicity models based on quantitative structure-activity relationships (QSAR). Here, we secured the 10 representative QSAR-based prediction models and the information that can make predictions about substances that are expected to be regulated. We used models that predict and confirm usability of the information expected to be collected and submitted according to the legislation. After collecting and evaluating each predictive model and relevant data, we prepared methods quantifying the scientific validity and reliability, which are essential conditions for using predictive models. Results We calculated predicted values for the models. Furthermore, we deduced and compared adequacies of the models using the Alternative non-testing method assessed for Registration, Evaluation, Authorization, and Restriction of Chemicals Substances scoring system, and deduced the applicability domains for each model. Additionally, we calculated and compared inclusion rates of substances expected to be regulated, to confirm the applicability. Conclusions We evaluated and compared the data, adequacy, and applicability of our selected QSAR-based toxicity prediction models, and included them in a database. Based on this data, we aimed to construct a system that can be used with predicted toxicity results. Furthermore, by presenting the suitability of individual predicted results, we aimed to provide a foundation that could be used in actual assessments and regulations. PMID:26206368

  3. Inferential modeling and predictive feedback control in real-time motion compensation using the treatment couch during radiotherapy

    NASA Astrophysics Data System (ADS)

    Qiu, Peng; D'Souza, Warren D.; McAvoy, Thomas J.; Liu, K. J. Ray

    2007-09-01

    Tumor motion induced by respiration presents a challenge to the reliable delivery of conformal radiation treatments. Real-time motion compensation represents the technologically most challenging clinical solution but has the potential to overcome the limitations of existing methods. The performance of a real-time couch-based motion compensation system is mainly dependent on two aspects: the ability to infer the internal anatomical position and the performance of the feedback control system. In this paper, we propose two novel methods for the two aspects respectively, and then combine the proposed methods into one system. To accurately estimate the internal tumor position, we present partial-least squares (PLS) regression to predict the position of the diaphragm using skin-based motion surrogates. Four radio-opaque markers were placed on the abdomen of patients who underwent fluoroscopic imaging of the diaphragm. The coordinates of the markers served as input variables and the position of the diaphragm served as the output variable. PLS resulted in lower prediction errors compared with standard multiple linear regression (MLR). The performance of the feedback control system depends on the system dynamics and dead time (delay between the initiation and execution of the control action). While the dynamics of the system can be inverted in a feedback control system, the dead time cannot be inverted. To overcome the dead time of the system, we propose a predictive feedback control system by incorporating forward prediction using least-mean-square (LMS) and recursive least square (RLS) filtering into the couch-based control system. Motion data were obtained using a skin-based marker. The proposed predictive feedback control system was benchmarked against pure feedback control (no forward prediction) and resulted in a significant performance gain. 
Finally, we combined the PLS inference model and the predictive feedback control to evaluate the overall performance of the feedback control system. Our results show that, with the tumor motion unknown but inferred from skin-based markers through the PLS model, the predictive feedback control system was able to effectively compensate for intra-fraction motion.
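The forward-prediction idea used to overcome the dead time can be sketched with a least-mean-squares adaptive filter. The sketch below is illustrative only: the synthetic sinusoidal breathing trace, the 10 Hz sampling, the 8-tap filter, the 5-sample dead time, and the step size are all assumptions, not values from the paper.

```python
import math

def lms_predict(signal, delay=5, taps=8, mu=0.02):
    """Train an LMS filter online to predict signal[n + delay] from the
    latest `taps` samples; returns the sequence of forward predictions."""
    w = [0.0] * taps
    preds = []
    for n in range(taps - 1, len(signal) - delay):
        x = signal[n - taps + 1 : n + 1]                 # current input window
        y_hat = sum(wi * xi for wi, xi in zip(w, x))     # delay-step-ahead prediction
        preds.append(y_hat)
        e = signal[n + delay] - y_hat                    # prediction error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]   # LMS weight update
    return preds

# Synthetic respiratory-like motion: a 0.25 Hz sinusoid sampled at 10 Hz.
motion = [math.sin(2 * math.pi * 0.25 * 0.1 * n) for n in range(600)]
preds = lms_predict(motion)
```

An RLS filter plays the same role with faster convergence at higher per-step cost; in either case it is the predicted position, rather than the delayed measurement, that the couch controller would act on.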

  4. Testing a hydraulic trait based model of stomatal control: results from a controlled drought experiment on aspen (Populus tremuloides, Michx.) and ponderosa pine (Pinus ponderosa, Douglas)

    NASA Astrophysics Data System (ADS)

    Love, D. M.; Venturas, M.; Sperry, J.; Wang, Y.; Anderegg, W.

    2017-12-01

Modeling approaches for tree stomatal control often rely on empirical fitting to provide accurate estimates of whole-tree transpiration (E) and assimilation (A), and are limited in their predictive power by the data envelope used to calibrate model parameters. Optimization-based models hold promise as a means to predict stomatal behavior under novel climate conditions. We designed an experiment to test a hydraulic-trait-based optimization model, which predicts stomatal conductance from a gain/risk approach. Optimal stomatal conductance is expected to maximize the potential carbon gain by photosynthesis and minimize the risk to hydraulic transport imposed by cavitation. The modeled risk to the hydraulic network is assessed from cavitation vulnerability curves, a commonly measured physiological trait in woody plant species. Over a growing season, garden-grown plots of aspen (Populus tremuloides, Michx.) and ponderosa pine (Pinus ponderosa, Douglas) were subjected to three distinct drought treatments (moderate, severe, severe with rehydration) relative to a control plot to test model predictions. Model outputs of predicted E, A, and xylem pressure can be directly compared to both continuous data (whole-tree sap flux, soil moisture) and point measurements (leaf-level E, A, xylem pressure). The model also predicts the level of whole-tree hydraulic impairment expected to increase mortality risk; this threshold is used to estimate survivorship in the drought treatment plots. The model can be run at two scales, either entirely from climate (meteorological inputs, irrigation) or using the physiological measurements as a starting point. These data will be used to study model performance and utility, and to aid in developing the model for larger-scale applications.

  5. Using Historical Data to Automatically Identify Air-Traffic Control Behavior

    NASA Technical Reports Server (NTRS)

    Lauderdale, Todd A.; Wu, Yuefeng; Tretto, Celeste

    2014-01-01

This project seeks to develop statistics-based machine learning models that characterize the types of errors present when current systems are used to predict future aircraft states. These models will be data-driven, based on large quantities of historical data. Once these models are developed, they will be used to infer situations in the historical data where an air-traffic controller intervened on an aircraft's route, even when there is no direct recording of that action.

  6. Data-driven modeling and predictive control for boiler-turbine unit using fuzzy clustering and subspace methods.

    PubMed

    Wu, Xiao; Shen, Jiong; Li, Yiguo; Lee, Kwang Y

    2014-05-01

This paper develops a novel data-driven fuzzy modeling strategy and predictive controller for a boiler-turbine unit using fuzzy clustering and subspace identification (SID) methods. To deal with the nonlinear behavior of the boiler-turbine unit, fuzzy clustering is used to provide an appropriate division of the operating region and to develop the structure of the fuzzy model. Then, by combining the input data with the corresponding fuzzy membership functions, the SID method is extended to extract the local state-space model parameters. Owing to the advantages of both methods, the resulting fuzzy model can represent the boiler-turbine unit very closely, and a fuzzy model predictive controller is designed based on this model. As an alternative approach, a direct data-driven fuzzy predictive control is also developed following the same clustering and subspace methods, where intermediate subspace matrices developed during the identification procedure are utilized directly as the predictor. Simulation results show the advantages and effectiveness of the proposed approach. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Anticipatory Monitoring and Control of Complex Systems using a Fuzzy based Fusion of Support Vector Regressors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miltiadis Alamaniotis; Vivek Agarwal

This paper places itself in the realm of anticipatory systems and envisions monitoring and control methods capable of making predictions over system-critical parameters. Anticipatory systems allow intelligent control of complex systems by predicting their future state. In the current work, an intelligent model aimed at implementing anticipatory monitoring and control in the energy industry is presented and tested. More particularly, a set of support vector regressors (SVRs) is trained using both historical and observed data. The trained SVRs are used to predict the future value of the system based on current operational system parameters. The predicted values are then input to a fuzzy-logic-based module where they are fused to obtain a single value, i.e., the final system output prediction. The methodology is tested on real turbine degradation datasets. The outcome of the approach presented in this paper highlights its superiority over single support vector regressors. In addition, it is shown that appropriate selection of fuzzy sets and fuzzy rules plays an important role in improving system performance.
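The fusion step can be illustrated with a minimal sketch: each regressor's output is weighted by how strongly its recent absolute error belongs to a "reliable" fuzzy set, and the weighted mean is taken as the final prediction. The triangular membership function and all numbers below are assumptions for illustration, not the paper's actual fuzzy sets or rules.

```python
def tri_membership(x, a, b, c):
    """Triangular fuzzy membership with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuse(predictions, recent_abs_errors):
    """Fuse several regressors' predictions: weight each one by the degree
    to which its recent absolute error is 'small', then average."""
    weights = [tri_membership(e, -1.0, 0.0, 1.0) for e in recent_abs_errors]
    total = sum(weights)
    if total == 0.0:                         # no reliable regressor: plain mean
        return sum(predictions) / len(predictions)
    return sum(w * p for w, p in zip(weights, predictions)) / total

# Three hypothetical SVR outputs whose recent absolute errors were 0.1, 0.2, 0.9:
fused = fuse([10.2, 9.8, 12.0], [0.1, 0.2, 0.9])  # dominated by the accurate SVRs
```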

  8. Prediction of clinical depression scores and detection of changes in whole-brain using resting-state functional MRI data with partial least squares regression

    PubMed Central

    Shimizu, Yu; Yoshimoto, Junichiro; Takamura, Masahiro; Okada, Go; Okamoto, Yasumasa; Yamawaki, Shigeto; Doya, Kenji

    2017-01-01

    In diagnostic applications of statistical machine learning methods to brain imaging data, common problems include data high-dimensionality and co-linearity, which often cause over-fitting and instability. To overcome these problems, we applied partial least squares (PLS) regression to resting-state functional magnetic resonance imaging (rs-fMRI) data, creating a low-dimensional representation that relates symptoms to brain activity and that predicts clinical measures. Our experimental results, based upon data from clinically depressed patients and healthy controls, demonstrated that PLS and its kernel variants provided significantly better prediction of clinical measures than ordinary linear regression. Subsequent classification using predicted clinical scores distinguished depressed patients from healthy controls with 80% accuracy. Moreover, loading vectors for latent variables enabled us to identify brain regions relevant to depression, including the default mode network, the right superior frontal gyrus, and the superior motor area. PMID:28700672
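The core of PLS regression can be shown in a compact single-component NIPALS-style sketch: the weight vector is the covariance direction between X and y, and the regression is done on the resulting score. This is a toy illustration with invented data; a real analysis would center the features, extract several components, and possibly use the kernel variants mentioned in the abstract.

```python
def pls1_fit(X, y):
    """Single-component PLS1: weight w proportional to X'y (the covariance
    direction), score t = Xw, and least-squares coefficient b of y on t.
    Assumes X and y are already mean-centered."""
    n, d = len(X), len(X[0])
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(d)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]                                      # unit weights
    t = [sum(X[i][j] * w[j] for j in range(d)) for i in range(n)]  # scores
    b = sum(ti * yi for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return w, b

def pls1_predict(X, w, b):
    return [b * sum(xj * wj for xj, wj in zip(row, w)) for row in X]

# Perfectly collinear toy features -- the case where plain MLR is ill-conditioned:
w, b = pls1_fit([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]], [1.0, 2.0, 3.0])
```

On collinear inputs like these, ordinary least squares has no unique solution, while the PLS score direction absorbs the redundancy; that is the property that matters for high-dimensional, correlated rs-fMRI features.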

  9. Invasive Species Distribution Modeling (iSDM): Are absence data and dispersal constraints needed to predict actual distributions?

    Treesearch

    Tomáš Václavík; Ross K. Meentemeyer

    2009-01-01

    Species distribution models (SDMs) based on statistical relationships between occurrence data and underlying environmental conditions are increasingly used to predict spatial patterns of biological invasions and prioritize locations for early detection and control of invasion outbreaks. However, invasive species distribution models (iSDMs) face special challenges...

  10. Testing the predictive power of the transtheoretical model of behavior change applied to dietary fat intake

    PubMed Central

    Wright, Julie A.; Velicer, Wayne F.; Prochaska, James O.

    2009-01-01

This study evaluated how well predictions from the transtheoretical model (TTM) generalized from smoking to diet. Longitudinal data were used from a randomized controlled trial on reducing dietary fat consumption in adults (n = 1207) recruited from primary care practices. Predictive power was evaluated by making a priori predictions of the magnitude of change expected in the TTM constructs of temptation, pros and cons, and 10 processes of change when an individual transitions between the stages of change. Generalizability was evaluated by testing predictions based on smoking data. Three sets of predictions were made for each stage: Precontemplation (PC), Contemplation (C) and Preparation (PR), based on stage transition categories of no progress, progress and regression determined by stage at baseline versus stage at the 12-month follow-up. Univariate analysis of variance between stage transition groups was used to calculate the effect size [omega squared (ω2)]. For diet predictions based on diet data, there was a high degree of confirmation: 92%, 95% and 92% for PC, C and PR, respectively. For diet predictions based on smoking data, 77%, 79% and 85% were confirmed, respectively, suggesting a moderate degree of generalizability. This study provides revised effect size estimates for future theory testing of the TTM applied to dietary fat. PMID:18400785

  11. A predictive estimation method for carbon dioxide transport by data-driven modeling with a physically-based data model.

    PubMed

    Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun

    2017-11-01

In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems. Copyright © 2017 Elsevier B.V. All rights reserved.
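The Ogata-Banks solution adopted as the physically-based data model has a standard closed form for a continuous source at x = 0 in steady one-dimensional flow. The sketch below is a generic implementation of that textbook formula; the velocity, dispersion, and time values in the example are illustrative, not parameters fitted at the EIT site.

```python
import math

def ogata_banks(x, t, v, D, c0=1.0):
    """Ogata-Banks solution of the 1-D advection-dispersion equation:
        C(x, t) = (c0 / 2) * [erfc((x - v*t) / (2*sqrt(D*t)))
                  + exp(v*x / D) * erfc((x + v*t) / (2*sqrt(D*t)))]
    Note: for large v*x/D the second term needs asymptotic handling to
    avoid overflow; this plain form suffices for moderate Peclet numbers."""
    s = 2.0 * math.sqrt(D * t)
    term1 = math.erfc((x - v * t) / s)
    term2 = math.exp(v * x / D) * math.erfc((x + v * t) / s)
    return 0.5 * c0 * (term1 + term2)

# Breakthrough at x = 1 for v = 0.1, D = 0.1 (illustrative units): the
# normalized concentration rises monotonically from 0 toward c0 over time.
curve = [ogata_banks(1.0, t, 0.1, 0.1) for t in (0.1, 1.0, 10.0, 1000.0)]
```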

  12. Effects of external loads on balance control during upright stance: experimental results and model-based predictions.

    PubMed

    Qu, Xingda; Nussbaum, Maury A

    2009-01-01

    The purpose of this study was to identify the effects of external loads on balance control during upright stance, and to examine the ability of a new balance control model to predict these effects. External loads were applied to 12 young, healthy participants, and effects on balance control were characterized by center-of-pressure (COP) based measures. Several loading conditions were studied, involving combinations of load mass (10% and 20% of individual body mass) and height (at or 15% of stature above the whole-body COM). A balance control model based on an optimal control strategy was used to predict COP time series. It was assumed that a given individual would adopt the same neural optimal control mechanisms, identified in a no-load condition, under diverse external loading conditions. With the application of external loads, COP mean velocity in the anterior-posterior direction and RMS distance in the medial-lateral direction increased 8.1% and 10.4%, respectively. Predicted COP mean velocity and RMS distance in the anterior-posterior direction also increased with external loading, by 11.1% and 2.9%, respectively. Both experimental COP data and model-based predictions provided the same general conclusion, that application of larger external loads and loads more superior to the whole body center of mass lead to less effective postural control and perhaps a greater risk of loss of balance or falls. Thus, it can be concluded that the assumption about consistency in control mechanisms was partially supported, and it is the mechanical changes induced by external loads that primarily affect balance control.
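The COP-based measures named in the abstract are simple functionals of the COP time series. A generic one-direction sketch (with an invented toy trace, not the study's data) is:

```python
def cop_measures(cop, dt):
    """Two standard center-of-pressure measures for one direction:
    mean velocity (mean absolute displacement per second) and the
    RMS distance of the COP about its mean position."""
    n = len(cop)
    mean_velocity = sum(abs(cop[i + 1] - cop[i]) for i in range(n - 1)) / ((n - 1) * dt)
    mean = sum(cop) / n
    rms_distance = (sum((c - mean) ** 2 for c in cop) / n) ** 0.5
    return mean_velocity, rms_distance

# Toy sway trace sampled at 1 Hz (illustrative only):
v_mean, rms_d = cop_measures([0.0, 1.0, 0.0, 1.0, 0.0], dt=1.0)
```

The reported increases in anterior-posterior mean velocity and medial-lateral RMS distance are exactly these quantities computed per direction on the measured and model-predicted COP series.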

  13. A model for prediction of STOVL ejector dynamics

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1989-01-01

A semi-empirical control-volume approach to ejector modeling for transient performance prediction is presented. This new approach is motivated by the need for a predictive real-time ejector sub-system simulation for Short Take-Off and Vertical Landing (STOVL) integrated flight and propulsion controls design applications. Emphasis is placed on discussion of the approximate characterization of the mixing process central to thrust-augmenting ejector operation. The proposed ejector model suggests transient flow predictions are possible with a model based on steady-flow data. A practical test case is presented to illustrate model calibration.

  14. Experimental evaluation of a recursive model identification technique for type 1 diabetes.

    PubMed

    Finan, Daniel A; Doyle, Francis J; Palerm, Cesar C; Bevier, Wendy C; Zisser, Howard C; Jovanovic, Lois; Seborg, Dale E

    2009-09-01

    A model-based controller for an artificial beta cell requires an accurate model of the glucose-insulin dynamics in type 1 diabetes subjects. To ensure the robustness of the controller for changing conditions (e.g., changes in insulin sensitivity due to illnesses, changes in exercise habits, or changes in stress levels), the model should be able to adapt to the new conditions by means of a recursive parameter estimation technique. Such an adaptive strategy will ensure that the most accurate model is used for the current conditions, and thus the most accurate model predictions are used in model-based control calculations. In a retrospective analysis, empirical dynamic autoregressive exogenous input (ARX) models were identified from glucose-insulin data for nine type 1 diabetes subjects in ambulatory conditions. Data sets consisted of continuous (5-minute) glucose concentration measurements obtained from a continuous glucose monitor, basal insulin infusion rates and times and amounts of insulin boluses obtained from the subjects' insulin pumps, and subject-reported estimates of the times and carbohydrate content of meals. Two identification techniques were investigated: nonrecursive, or batch methods, and recursive methods. Batch models were identified from a set of training data, whereas recursively identified models were updated at each sampling instant. Both types of models were used to make predictions of new test data. For the purpose of comparison, model predictions were compared to zero-order hold (ZOH) predictions, which were made by simply holding the current glucose value constant for p steps into the future, where p is the prediction horizon. Thus, the ZOH predictions are model free and provide a base case for the prediction metrics used to quantify the accuracy of the model predictions. In theory, recursive identification techniques are needed only when there are changing conditions in the subject that require model adaptation. 
Thus, the identification and validation techniques were performed with both "normal" data and data collected during conditions of reduced insulin sensitivity. The latter were achieved by having the subjects self-administer a medication, prednisone, for 3 consecutive days. The recursive models were allowed to adapt to this condition of reduced insulin sensitivity, while the batch models were only identified from normal data. Data from nine type 1 diabetes subjects in ambulatory conditions were analyzed; six of these subjects also participated in the prednisone portion of the study. For normal test data, the batch ARX models produced 30-, 45-, and 60-minute-ahead predictions that had average root mean square error (RMSE) values of 26, 34, and 40 mg/dl, respectively. For test data characterized by reduced insulin sensitivity, the batch ARX models produced 30-, 60-, and 90-minute-ahead predictions with average RMSE values of 27, 46, and 59 mg/dl, respectively; the recursive ARX models demonstrated similar performance with corresponding values of 27, 45, and 61 mg/dl, respectively. The identified ARX models (batch and recursive) produced more accurate predictions than the model-free ZOH predictions, but only marginally. For test data characterized by reduced insulin sensitivity, RMSE values for the predictions of the batch ARX models were 9, 5, and 5% more accurate than the ZOH predictions for prediction horizons of 30, 60, and 90 minutes, respectively. In terms of RMSE values, the 30-, 60-, and 90-minute predictions of the recursive models were more accurate than the ZOH predictions, by 10, 5, and 2%, respectively. In this experimental study, the recursively identified ARX models resulted in predictions of test data that were similar, but not superior, to the batch models. Even for the test data characteristic of reduced insulin sensitivity, the batch and recursive models demonstrated similar prediction accuracy. 
The predictions of the identified ARX models were only marginally more accurate than the model-free ZOH predictions. Given the simplicity of the ARX models and the computational ease with which they are identified, however, even modest improvements may justify the use of these models in a model-based controller for an artificial beta cell. 2009 Diabetes Technology Society.
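The model-free zero-order-hold baseline used for comparison is easy to state exactly: the prediction for p steps ahead is simply the current glucose value held constant. A sketch with invented glucose samples (at the 5-minute spacing described, p = 6 would correspond to 30 minutes ahead):

```python
def zoh_predict(series, p):
    """p-step-ahead zero-order-hold predictions: hold the current value.
    Element i is the prediction of series[i + p]."""
    return series[: len(series) - p]

def rmse(pred, actual):
    """Root mean square error between paired prediction/target sequences."""
    return (sum((a - b) ** 2 for a, b in zip(pred, actual)) / len(pred)) ** 0.5

# Invented 5-minute glucose samples in mg/dl (not study data):
glucose = [100, 104, 110, 118, 124, 128, 130, 129, 126, 122]
p = 2                                   # 10 minutes ahead in this toy series
baseline_error = rmse(zoh_predict(glucose, p), glucose[p:])
```

An identified ARX model has to beat this baseline to justify its complexity; the abstract reports margins of only a few percent over it.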

  15. Initial Evaluations of LoC Prediction Algorithms Using the NASA Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Krishnakumar, Kalmanje; Stepanyan, Vahram; Barlow, Jonathan; Hardy, Gordon; Dorais, Greg; Poolla, Chaitanya; Reardon, Scott; Soloway, Donald

    2014-01-01

Flying near the edge of the safe operating envelope is an inherently unsafe proposition. Edge of the envelope here implies that small changes or disturbances in system state or system dynamics can take the system out of the safe envelope in a short time and could result in loss-of-control events. This study evaluated approaches to predicting loss-of-control safety margins as the aircraft gets closer to the edge of the safe operating envelope. The goal of the approach is to provide the pilot aural, visual, and tactile cues focused on maintaining the pilot's control action within predicted loss-of-control boundaries. Our predictive architecture combines quantitative loss-of-control boundaries, an adaptive prediction method that estimates Markov model parameters and associated stability margins in real time, and a real-time data-based algorithm that estimates predictive control margins. The combined architecture is applied to a nonlinear transport-class aircraft. Evaluations of various feedback cues with both test and commercial pilots in the NASA Ames Vertical Motion Simulator (VMS) were conducted in the summer of 2013. The paper presents results of this evaluation, focused on the effectiveness of these approaches and cues in preventing pilots from entering a loss-of-control event.

  16. Data-Driven Nonlinear Subspace Modeling for Prediction and Control of Molten Iron Quality Indices in Blast Furnace Ironmaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ping; Song, Heda; Wang, Hong

The blast furnace (BF) in ironmaking is a nonlinear dynamic process with complicated physical-chemical reactions, in which multi-phase and multi-field coupling and large time delays occur during operation. In BF operation, the molten iron temperature (MIT) as well as the Si, P and S contents of the molten iron are the most essential molten iron quality (MIQ) indices, whose measurement, modeling and control have always been important issues in the metallurgical engineering and automation fields. This paper develops a novel data-driven nonlinear state-space modeling approach for the prediction and control of multivariate MIQ indices by integrating hybrid modeling and control techniques. First, to improve modeling efficiency, a data-driven hybrid method combining canonical correlation analysis and correlation analysis is proposed to identify the most influential controllable variables, out of the multitude of factors that affect the MIQ indices, as the modeling inputs. Then, a Hammerstein model for the prediction of MIQ indices is established using the LS-SVM-based nonlinear subspace identification method. This model is further simplified by using the piecewise cubic Hermite interpolating polynomial method to fit the complex nonlinear kernel function. Compared to the original Hammerstein model, the simplified model not only significantly reduces the computational complexity, but also retains almost the same reliability and accuracy for a stable prediction of MIQ indices. Finally, to verify the practicability of the developed model, it is applied in designing a genetic-algorithm-based nonlinear predictive controller for multivariate MIQ indices by directly taking the established model as the predictor. Industrial experiments show the advantages and effectiveness of the proposed approach.

  17. User-Preference-Driven Model Predictive Control of Residential Building Loads and Battery Storage for Demand Response: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Xin; Baker, Kyri A.; Christensen, Dane T.

This paper presents a user-preference-driven home energy management system (HEMS) for demand response (DR) with residential building loads and battery storage. The HEMS is based on a multi-objective model predictive control algorithm, where the objectives include energy cost, thermal comfort, and carbon emission. A multi-criterion decision making method originating from social science is used to quickly determine user preferences based on a brief survey and derive the weights of different objectives used in the optimization process. Besides the residential appliances used in the traditional DR programs, a home battery system is integrated into the HEMS to improve the flexibility and reliability of the DR resources. Simulation studies have been performed on field data from a residential building stock data set. Appliance models and usage patterns were learned from the data to predict the DR resource availability. Results indicate the HEMS was able to provide a significant amount of load reduction with less than 20% prediction error in both heating and cooling cases.

  18. User-Preference-Driven Model Predictive Control of Residential Building Loads and Battery Storage for Demand Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Xin; Baker, Kyri A; Isley, Steven C

This paper presents a user-preference-driven home energy management system (HEMS) for demand response (DR) with residential building loads and battery storage. The HEMS is based on a multi-objective model predictive control algorithm, where the objectives include energy cost, thermal comfort, and carbon emission. A multi-criterion decision making method originating from social science is used to quickly determine user preferences based on a brief survey and derive the weights of different objectives used in the optimization process. Besides the residential appliances used in the traditional DR programs, a home battery system is integrated into the HEMS to improve the flexibility and reliability of the DR resources. Simulation studies have been performed on field data from a residential building stock data set. Appliance models and usage patterns were learned from the data to predict the DR resource availability. Results indicate the HEMS was able to provide a significant amount of load reduction with less than 20% prediction error in both heating and cooling cases.

  19. Literature-based condition-specific miRNA-mRNA target prediction.

    PubMed

    Oh, Minsik; Rhee, Sungmin; Moon, Ji Hwan; Chae, Heejoon; Lee, Sunwon; Kang, Jaewoo; Kim, Sun

    2017-01-01

miRNAs are small non-coding RNAs that regulate gene expression by binding to the 3'-UTR of genes. Many recent studies have reported that miRNAs play important biological roles by regulating specific mRNAs or genes. Many sequence-based target prediction algorithms have been developed to predict miRNA targets. However, these methods are not designed for condition-specific target predictions and produce many false positives; thus, expression-based target prediction algorithms have been developed for condition-specific target predictions. A typical strategy to utilize expression data is to leverage the negative control roles of miRNAs on genes. To control false positives, a stringent cutoff value is typically set, but in this case these methods tend to reject many true target relationships, i.e., produce false negatives. To overcome these limitations, additional information should be utilized. The literature is probably the best resource that we can utilize. Recent literature mining systems compile millions of articles with experiments designed for specific biological questions, and the systems provide a function to search for specific information. To utilize the literature information, we used a literature mining system, BEST, that automatically extracts information from the literature in PubMed and that allows the user to perform searches of the literature with any English words. By integrating omics data analysis methods and BEST, we developed Context-MMIA, a miRNA-mRNA target prediction method that combines expression data analysis results and the literature information extracted based on the user-specified context. In the pathway enrichment analysis using genes included in the top 200 miRNA targets, Context-MMIA outperformed the four existing target prediction methods that we tested. In another test of whether prediction methods can reproduce experimentally validated target relationships, Context-MMIA again outperformed the four existing target prediction methods.
In summary, Context-MMIA allows the user to specify a context of the experimental data to predict miRNA targets, and we believe that Context-MMIA is very useful for predicting condition-specific miRNA targets.

  20. Intelligent Prediction of Fan Rotation Stall in Power Plants Based on Pressure Sensor Data Measured In-Situ

    PubMed Central

    Xu, Xiaogang; Wang, Songling; Liu, Jinlian; Liu, Xinyu

    2014-01-01

    Blower and exhaust fans consume over 30% of electricity in a thermal power plant, and faults of these fans due to rotation stalls are one of the most frequent reasons for power plant outage failures. To accurately predict the occurrence of fan rotation stalls, we propose a support vector regression machine (SVRM) model that predicts the fan internal pressures during operation, leaving ample time for rotation stall detection. We train the SVRM model using experimental data samples, and perform pressure data prediction using the trained SVRM model. To prove the feasibility of using the SVRM model for rotation stall prediction, we further process the predicted pressure data via wavelet-transform-based stall detection. By comparison of the detection results from the predicted and measured pressure data, we demonstrate that the SVRM model can accurately predict the fan pressure and guarantee reliable stall detection with a time advance of up to 0.0625 s. This superior pressure data prediction capability leaves significant time for effective control and prevention of fan rotation stall faults. This model has great potential for use in intelligent fan systems with stall prevention capability, which will ensure safe operation and improve the energy efficiency of power plants. PMID:24854057

  1. ERTS-1 data collection systems used to predict wheat disease severities. [Riley County, Kansas

    NASA Technical Reports Server (NTRS)

    Kanemasu, E. T.; Schimmelpfenning, H.; Choy, E. C.; Eversmeyer, M. G.; Lenhert, D.

    1974-01-01

The author has identified the following significant results. The feasibility of using the data collection system on ERTS-1 to predict wheat leaf rust severity and the resulting yield loss was tested. Ground-based data collection platforms (DCPs), placed in two commercial wheat fields in Riley County, Kansas, transmitted to the satellite such meteorological information as maximum and minimum temperature, relative humidity, and hours of free moisture. Meteorological data received from the two DCPs from April 23 to 29 were used to estimate the disease progress curve. Values from the curve were used to predict the percentage decrease in wheat yields resulting from leaf rust. The actual decrease in yield was obtained by applying a zinc and maneb spray (5.6 kg/ha) to control leaf rust, then comparing yields of the controlled (healthy) and the noncontrolled (rusted) areas. In each field a 9% decrease in yield was predicted from the DCP-derived data; actual decreases were 12% and 9%.

  2. MetaDP: a comprehensive web server for disease prediction of 16S rRNA metagenomic datasets.

    PubMed

    Xu, Xilin; Wu, Aiping; Zhang, Xinlei; Su, Mingming; Jiang, Taijiao; Yuan, Zhe-Ming

    2016-01-01

High-throughput sequencing-based metagenomics has garnered considerable interest in recent years. Numerous methods and tools have been developed for the analysis of metagenomic data. However, it is still a daunting task to install a large number of tools and complete a complicated analysis, especially for researchers with minimal bioinformatics backgrounds. To address this problem, we constructed an automated software tool named MetaDP for 16S rRNA sequencing data analysis, including data quality control, operational taxonomic unit clustering, diversity analysis, and disease risk prediction modeling. Furthermore, a support vector machine-based prediction model for irritable bowel syndrome (IBS) was built by applying MetaDP to microbial 16S sequencing data from 108 children. The success of the IBS prediction model suggests that the platform may also be applied to other diseases related to gut microbes, such as obesity, metabolic syndrome, or intestinal cancer, among others (http://metadp.cn:7001/).

  3. Short-term PV/T module temperature prediction based on PCA-RBF neural network

    NASA Astrophysics Data System (ADS)

    Li, Jiyong; Zhao, Zhendong; Li, Yisheng; Xiao, Jing; Tang, Yunfeng

    2018-02-01

To address the nonlinearity and large inertia of temperature control in a PV/T system, short-term prediction of the PV/T module temperature is proposed, so that the PV/T system controller can act in advance according to the short-term forecast and thereby optimize the control effect. Based on an analysis of the correlation between the PV/T module temperature and meteorological factors, and of the temperature at adjacent points in the time series, the principal component analysis (PCA) method is used to pre-process the original input sample data. Combined with RBF neural network theory, simulation results show that the PCA step gives the network model higher prediction accuracy and stronger generalization performance than an RBF neural network without principal component extraction.
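The PCA preprocessing step can be sketched in pure Python via power iteration on the sample covariance matrix. The RBF network that consumes the resulting scores is not shown, and the rank-one toy data are invented for illustration.

```python
def pca_first_component(X, iters=200):
    """Return the first principal direction of X and the centered data's
    scores along it, using power iteration on the covariance matrix."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]    # center the data
    cov = [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / n for b in range(d)]
           for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):                   # power iteration: v <- C v / ||C v||
        v = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    scores = [sum(Xc[i][j] * v[j] for j in range(d)) for i in range(n)]
    return v, scores

# Rank-one toy data whose principal direction is proportional to (1, 2):
v, scores = pca_first_component([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0], [4.0, 8.0]])
```

Feeding the low-dimensional scores, rather than the raw correlated inputs, to the network is what reduces redundancy and improves generalization in the scheme the abstract describes.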

  4. A support vector machine based control application to the experimental three-tank system.

    PubMed

    Iplikci, Serdar

    2010-07-01

    This paper presents a support vector machine (SVM) approach to generalized predictive control (GPC) of multiple-input multiple-output (MIMO) nonlinear systems. The higher generalization potential of SVMs, together with their avoidance of local minima, motivated us to employ SVM algorithms for modeling MIMO systems. Based on the SVM model, detailed and compact formulations for calculating the predictions and gradient information used in the computation of the optimal control action are given in the paper. The proposed MIMO SVM-based GPC method has been verified on an experimental three-tank liquid level control system. Experimental results have shown that the proposed method can handle the control task successfully for different reference trajectories. Moreover, a detailed discussion of data gathering, model selection, and the effects of the control parameters is given.
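
    A one-step version of SVM-based predictive control can be sketched as follows: an SVR model is identified from input-output data of a toy first-order plant (not the paper's three-tank rig), and the control input is found by numerically minimizing a GPC-style cost. The plant, cost weights, and input bounds are assumptions.

```python
# Sketch of SVM-based predictive control on a toy plant: learn a one-step
# model from input-output data, then pick the control minimizing a GPC cost.
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.svm import SVR

rng = np.random.default_rng(2)
u = rng.uniform(-1, 1, 300)
y = np.zeros(301)
for k in range(300):                      # unknown "plant": y+ = 0.8 y + 0.5 u
    y[k + 1] = 0.8 * y[k] + 0.5 * u[k]

X = np.column_stack([y[:-1], u])          # regressors: current output and input
model = SVR(kernel="rbf", C=100.0).fit(X, y[1:])

def gpc_cost(u_now, y_now, ref, lam=0.01):
    """Squared tracking error plus weighted control effort, one step ahead."""
    y_next = model.predict([[y_now, u_now]])[0]
    return (ref - y_next) ** 2 + lam * u_now ** 2

res = minimize_scalar(lambda v: gpc_cost(v, y_now=0.0, ref=0.3),
                      bounds=(-1, 1), method="bounded")
u_opt = res.x                             # control action applied this sample
```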

  5. Preliminary results of the FVS gypsy moth event monitor using remeasurement plot data from Northern West Virginia

    Treesearch

    Matthew P. Perkowski; John R. Brooks; Kurt W. Gottschalk

    2008-01-01

    Predictions based on the Gypsy Moth Event Monitor were compared to remeasurement plot data from stands receiving gypsy moth defoliation. These stands were part of a silvicultural treatment study located in northern West Virginia that included a sanitation thinning, a presalvage thinning, and paired no-treatment controls. In all cases the event monitor underpredicted...

  6. Departments of Defense and Agriculture Team Up to Develop New Insecticides for Mosquito Control

    DTIC Science & Technology

    2010-01-01

    archives of insecticide data by quantitative structure-activity relationship (QSAR) modeling to predict and synthesize new insecticides. This...blood-sucking arthropods. The key thrust of IIBBL's approach involves QSAR-based modeling of fast-acting pyrethroid insecticides to predict and

  7. Discovering Pediatric Asthma Phenotypes on the Basis of Response to Controller Medication Using Machine Learning.

    PubMed

    Ross, Mindy K; Yoon, Jinsung; van der Schaar, Auke; van der Schaar, Mihaela

    2018-01-01

    Pediatric asthma has variable underlying inflammation and symptom control. Approaches to addressing this heterogeneity, such as clustering methods to find phenotypes and predict outcomes, have been investigated. However, clustering based on the relationship between treatment and clinical outcome has not been performed, and machine learning approaches for long-term outcome prediction in pediatric asthma have not been studied in depth. Our objectives were to use our novel machine learning algorithm, predictor pursuit (PP), to discover pediatric asthma phenotypes on the basis of asthma control in response to controller medications, to predict longitudinal asthma control among children with asthma, and to identify features associated with asthma control within each discovered pediatric phenotype. We applied PP to the Childhood Asthma Management Program study data (n = 1,019) to discover phenotypes on the basis of asthma control between assigned controller therapy groups (budesonide vs. nedocromil). We confirmed PP's ability to discover phenotypes using the Asthma Clinical Research Network/Childhood Asthma Research and Education network data. We next predicted children's asthma control over time and compared PP's performance with that of traditional prediction methods. Last, we identified clinical features most correlated with asthma control in the discovered phenotypes. Four phenotypes were discovered in both datasets: allergic not obese (A+/O-), obese not allergic (A-/O+), allergic and obese (A+/O+), and not allergic not obese (A-/O-). Of the children with well-controlled asthma in the Childhood Asthma Management Program dataset, we found more nonobese children treated with budesonide than with nedocromil (P = 0.015) and more obese children treated with nedocromil than with budesonide (P = 0.008). Within the obese group, more A+/O+ children's asthma was well controlled with nedocromil than with budesonide (P = 0.022) or with placebo (P = 0.011). 
The PP algorithm performed significantly better (P < 0.001) than traditional machine learning algorithms for both short- and long-term asthma control prediction. Asthma control and bronchodilator response were the features most predictive of short-term asthma control, regardless of type of controller medication or phenotype. Bronchodilator response and serum eosinophils were the most predictive features of long-term asthma control, regardless of type of controller medication or phenotype. Advanced statistical machine learning approaches can be powerful tools for discovery of phenotypes based on treatment response and can aid in asthma control prediction in complex medical conditions such as asthma.

  8. Real-time and simultaneous control of artificial limbs based on pattern recognition algorithms.

    PubMed

    Ortiz-Catalan, Max; Håkansson, Bo; Brånemark, Rickard

    2014-07-01

    The prediction of simultaneous limb motions is a highly desirable feature for the control of artificial limbs. In this work, we investigate different classification strategies for individual and simultaneous movements based on pattern recognition of myoelectric signals. Our results suggest that any classifier can be potentially employed in the prediction of simultaneous movements if arranged in a distributed topology. On the other hand, classifiers inherently capable of simultaneous predictions, such as the multi-layer perceptron (MLP), were found to be more cost effective, as they can be successfully employed in their simplest form. In the prediction of individual movements, the one-vs-one (OVO) topology was found to improve classification accuracy across different classifiers and it was therefore used to benchmark the benefits of simultaneous control. As opposed to previous work reporting only offline accuracy, the classification performance and the resulting controllability are evaluated in real time using the motion test and target achievement control (TAC) test, respectively. We propose a simultaneous classification strategy based on MLP that outperformed a top classifier for individual movements (LDA-OVO), thus improving the state-of-the-art classification approach. Furthermore, all the presented classification strategies and data collected in this study are freely available in BioPatRec, an open source platform for the development of advanced prosthetic control strategies.
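
    The two classifier topologies compared above can be sketched with scikit-learn on synthetic features: linear discriminant analysis in a one-vs-one arrangement versus a single multi-layer perceptron. The data, feature count, and network size are illustrative assumptions, not the study's myoelectric recordings.

```python
# Sketch of the compared topologies: LDA one-vs-one vs. a single MLP,
# trained on synthetic "EMG feature" windows with 4 movement classes.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.multiclass import OneVsOneClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.standard_normal((400, 8))         # 8 features per signal window (synthetic)
y = rng.integers(0, 4, 400)               # 4 individual movement classes

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

lda_ovo = OneVsOneClassifier(LinearDiscriminantAnalysis()).fit(Xtr, ytr)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                    random_state=0).fit(Xtr, ytr)

acc_ovo = lda_ovo.score(Xte, yte)         # OVO topology accuracy
acc_mlp = mlp.score(Xte, yte)             # inherently multi-output classifier
```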

  9. Reference governors for controlled belt restraint systems

    NASA Astrophysics Data System (ADS)

    van der Laan, E. P.; Heemels, W. P. M. H.; Luijten, H.; Veldpaus, F. E.; Steinbuch, M.

    2010-07-01

    Today's restraint systems typically include a number of airbags, and a three-point seat belt with load limiter and pretensioner. For the class of real-time controlled restraint systems, the restraint actuator settings are continuously manipulated during the crash. This paper presents a novel control strategy for these systems. The control strategy developed here is based on a combination of model predictive control and reference management, in which a non-linear device - a reference governor (RG) - is added to a primal closed-loop controlled system. This RG determines an optimal setpoint in terms of injury reduction and constraint satisfaction by solving a constrained optimisation problem. Prediction of the vehicle motion, required to predict future constraint violation, is included in the design and is based on past crash data, using linear regression techniques. Simulation results with MADYMO models show that, with ideal sensors and actuators, a significant reduction (45%) of the peak chest acceleration can be achieved, without prior knowledge of the crash. Furthermore, it is shown that the algorithms are sufficiently fast to be implemented online.

  10. Predicting the outbreak of hand, foot, and mouth disease in Nanjing, China: a time-series model based on weather variability

    NASA Astrophysics Data System (ADS)

    Liu, Sijun; Chen, Jiaping; Wang, Jianming; Wu, Zhuchao; Wu, Weihua; Xu, Zhiwei; Hu, Wenbiao; Xu, Fei; Tong, Shilu; Shen, Hongbing

    2017-10-01

    Hand, foot, and mouth disease (HFMD) is a significant public health issue in China, and accurate prediction of epidemics can improve the effectiveness of HFMD control. This study aims to develop a weather-based forecasting model for HFMD using information on climatic variables and HFMD surveillance in Nanjing, China. Daily data on HFMD cases and meteorological variables between 2010 and 2015 were acquired from the Nanjing Center for Disease Control and Prevention and the China Meteorological Data Sharing Service System, respectively. A multivariate seasonal autoregressive integrated moving average (SARIMA) model was developed and validated by dividing the HFMD infection data into two datasets: the data from 2010 to 2013 were used to construct the model and those from 2014 to 2015 were used to validate it. Moreover, weekly predictions were made for the data between 1 January 2014 and 31 December 2015, and leave-1-week-out prediction was used to validate the model's predictive performance. SARIMA (2,0,0)52 with average temperature at a lag of 1 week appeared to be the best model (R² = 0.936, BIC = 8.465), which also showed non-significant autocorrelations in its residuals. In the validation of the constructed model, the predicted values matched the observed values reasonably well between 2014 and 2015, with a high agreement rate between predicted and observed values (sensitivity 80%, specificity 96.63%). This study suggests that the SARIMA model with average temperature could be used as an important tool for early detection and prediction of HFMD outbreaks in Nanjing, China.
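
    The structure of such a weather-based seasonal model can be sketched with a plain least-squares stand-in for SARIMA: regress weekly counts on their recent lags, the 52-week seasonal lag, and last week's mean temperature. All data below are synthetic, and the regression is an illustration of the model structure rather than the paper's fitted SARIMA.

```python
# Numpy-only sketch: weekly case counts regressed on AR lags, the seasonal
# lag (52 weeks), and lag-1 temperature; fit on earlier years, test on the last.
import numpy as np

rng = np.random.default_rng(4)
weeks = 260                                # 5 years of weekly data (synthetic)
t = np.arange(weeks)
temp = 15 + 10 * np.sin(2 * np.pi * t / 52) + rng.standard_normal(weeks)
cases = 50 + 2 * temp + 5 * np.sin(2 * np.pi * t / 52) + rng.standard_normal(weeks)

# Design matrix: intercept, AR(2) terms, seasonal lag 52, temperature at lag 1.
X = np.array([[1, cases[k-1], cases[k-2], cases[k-52], temp[k-1]]
              for k in range(52, weeks)])
y = cases[52:]

beta, *_ = np.linalg.lstsq(X[:-52], y[:-52], rcond=None)   # fit on earlier years
pred = X[-52:] @ beta                                       # predict the final year
mae = float(np.mean(np.abs(pred - y[-52:])))
```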

  11. Overview of the NASA Wallops Flight Facility Mobile Range Control System

    NASA Technical Reports Server (NTRS)

    Davis, Rodney A.; Semancik, Susan K.; Smith, Donna C.; Stancil, Robert K.

    1999-01-01

    The NASA GSFC's Wallops Flight Facility (WFF) Mobile Range Control System (MRCS) is based on the functionality of the WFF Range Control Center at Wallops Island, Virginia. The MRCS provides real time instantaneous impact predictions, real time flight performance data, and other critical information needed by mission and range safety personnel in support of range operations at remote launch sites. The MRCS integrates a PC telemetry processing system (TELPro), a PC radar processing system (PCDQS), multiple Silicon Graphics display workstations (IRIS), and communication links within a mobile van for worldwide support of orbital, suborbital, and aircraft missions. This paper describes the MRCS configuration; the TELPro's capability to provide single/dual telemetry tracking and vehicle state data processing; the PCDQS' capability to provide real time positional data and instantaneous impact prediction for up to 8 data sources; and the IRIS' user interface for setup/display options. With portability, PC-based data processing, high resolution graphics, and flexible multiple source support, the MRCS system is proving to be responsive to the ever-changing needs of a variety of increasingly complex missions.

  12. Atmospheric Ozone 1985. Assessment of our understanding of the processes controlling its present distribution and change, volume 3

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Topics addressed include: assessment models; model predictions of ozone changes; ozone and temperature trends; trace gas effects on climate; kinetics and photochemical data base; spectroscopic data base (infrared to microwave); instrument intercomparisons and assessments; and monthly mean distribution of ozone and temperature.

  13. Modeling and control for closed environment plant production systems

    NASA Technical Reports Server (NTRS)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
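
    The cost function described above, a weighted control effort plus the squared error between system response and reference, can be sketched for a toy scalar system. The dynamics, horizon, and weights are illustrative assumptions, not the paper's crop models.

```python
# Sketch of a model-based predictive controller's cost: simulate the predicted
# trajectory over a short horizon and penalize tracking error plus control effort.
import numpy as np
from scipy.optimize import minimize

A, B = 0.9, 0.1                  # toy scalar dynamics: x+ = A x + B u
ref, x0 = 22.0, 18.0             # reference (e.g. temperature set point) and state
horizon, r_weight = 5, 0.01      # prediction horizon, control-effort weight

def cost(u_seq):
    x, J = x0, 0.0
    for u in u_seq:
        x = A * x + B * u        # predicted trajectory under the control sequence
        J += (x - ref) ** 2 + r_weight * u ** 2
    return J

res = minimize(cost, np.zeros(horizon))
u_plan = res.x                   # planned adjustments; apply u_plan[0], then re-solve
```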

  14. Risk Prediction for Epithelial Ovarian Cancer in 11 United States–Based Case-Control Studies: Incorporation of Epidemiologic Risk Factors and 17 Confirmed Genetic Loci

    PubMed Central

    Clyde, Merlise A.; Palmieri Weber, Rachel; Iversen, Edwin S.; Poole, Elizabeth M.; Doherty, Jennifer A.; Goodman, Marc T.; Ness, Roberta B.; Risch, Harvey A.; Rossing, Mary Anne; Terry, Kathryn L.; Wentzensen, Nicolas; Whittemore, Alice S.; Anton-Culver, Hoda; Bandera, Elisa V.; Berchuck, Andrew; Carney, Michael E.; Cramer, Daniel W.; Cunningham, Julie M.; Cushing-Haugen, Kara L.; Edwards, Robert P.; Fridley, Brooke L.; Goode, Ellen L.; Lurie, Galina; McGuire, Valerie; Modugno, Francesmary; Moysich, Kirsten B.; Olson, Sara H.; Pearce, Celeste Leigh; Pike, Malcolm C.; Rothstein, Joseph H.; Sellers, Thomas A.; Sieh, Weiva; Stram, Daniel; Thompson, Pamela J.; Vierkant, Robert A.; Wicklund, Kristine G.; Wu, Anna H.; Ziogas, Argyrios; Tworoger, Shelley S.; Schildkraut, Joellen M.

    2016-01-01

    Previously developed models for predicting absolute risk of invasive epithelial ovarian cancer have included a limited number of risk factors and have had low discriminatory power (area under the receiver operating characteristic curve (AUC) < 0.60). Because of this, we developed and internally validated a relative risk prediction model that incorporates 17 established epidemiologic risk factors and 17 genome-wide significant single nucleotide polymorphisms (SNPs) using data from 11 case-control studies in the United States (5,793 cases; 9,512 controls) from the Ovarian Cancer Association Consortium (data accrued from 1992 to 2010). We developed a hierarchical logistic regression model for predicting case-control status that included imputation of missing data. We randomly divided the data into an 80% training sample and used the remaining 20% for model evaluation. The AUC for the full model was 0.664. A reduced model without SNPs performed similarly (AUC = 0.649). Both models performed better than a baseline model that included age and study site only (AUC = 0.563). The best predictive power was obtained in the full model among women younger than 50 years of age (AUC = 0.714); however, the addition of SNPs increased the AUC the most for women older than 50 years of age (AUC = 0.638 vs. 0.616). Adapting this improved model to estimate absolute risk and evaluating it in prospective data sets is warranted. PMID:27698005
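
    The 80/20 train/evaluation scheme with an AUC endpoint can be sketched as follows, using synthetic epidemiologic factors and SNP dosages rather than the consortium data, and a plain (non-hierarchical) logistic model without imputation.

```python
# Sketch of risk-model evaluation: fit logistic regression on an 80% training
# split of combined epidemiologic + SNP features, report AUC on the held-out 20%.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 2000
X = np.column_stack([rng.standard_normal((n, 17)),     # 17 epidemiologic factors
                     rng.integers(0, 3, (n, 17))])     # 17 SNP genotypes (0/1/2)
logit = X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 17]       # synthetic true risk
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, train_size=0.8, random_state=0)
model = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
auc = roc_auc_score(yte, model.predict_proba(Xte)[:, 1])
```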

  15. Control surface hinge moment prediction using computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Simpson, Christopher David

    The following research determines the feasibility of predicting control surface hinge moments using various computational methods. A detailed analysis is conducted using a 2D GA(W)-1 airfoil with a 20% plain flap. Simple hinge moment prediction methods are tested, including empirical Datcom relations and XFOIL. Steady-state and time-accurate turbulent, viscous, Navier-Stokes solutions are computed using Fun3D. Hinge moment coefficients are computed. Mesh construction techniques are discussed. An adjoint-based mesh adaptation case is also evaluated. An NACA 0012 45-degree swept horizontal stabilizer with a 25% elevator is also evaluated using Fun3D. Results are compared with experimental wind-tunnel data obtained from references. Finally, the costs of various solution methods are estimated. Results indicate that while a steady-state Navier-Stokes solution can accurately predict control surface hinge moments for small angles of attack and deflection angles, a time-accurate solution is necessary to accurately predict hinge moments in the presence of flow separation. The ability to capture the unsteady vortex shedding behavior present in moderate to large control surface deflections is found to be critical to hinge moment prediction accuracy. Adjoint-based mesh adaptation is shown to give hinge moment predictions similar to a globally-refined mesh for a steady-state 2D simulation.

  16. Serious injury prediction algorithm based on large-scale data and under-triage control.

    PubMed

    Nishimoto, Tetsuya; Mukaigawa, Kosuke; Tominaga, Shigeru; Lubbe, Nils; Kiuchi, Toru; Motomura, Tomokazu; Matsumoto, Hisashi

    2017-01-01

    The present study was undertaken to construct an algorithm for an advanced automatic collision notification system based on national traffic accident data compiled by Japanese police. While US research into the development of a serious-injury prediction algorithm is based on a logistic regression algorithm using the National Automotive Sampling System/Crashworthiness Data System, the present injury prediction algorithm was based on comprehensive police data covering all accidents that occurred across Japan. The particular focus of this research is to improve the rescue of injured vehicle occupants in traffic accidents, and the present algorithm assumes the use of onboard event data recorder data, from which risk factors such as pseudo delta-V, vehicle impact location, seatbelt wearing or non-wearing, involvement in a single-impact or multiple-impact crash, and the occupant's age can be derived. As a result, a simple and handy algorithm suited for onboard vehicle installation was constructed from a sample of half of the available police data. The other half of the police data was applied to validation testing of this new algorithm using receiver operating characteristic analysis. An additional validation was conducted using in-depth investigation of accident injuries in collaboration with prospective host emergency care institutes. The validated algorithm, named the TOYOTA-Nihon University algorithm, proved to be as useful as the US URGENCY and other existing algorithms. Furthermore, an under-triage control analysis found that the present algorithm could achieve an under-triage rate of less than 10% by setting a threshold of 8.3%.
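
    The under-triage control step can be sketched as a threshold sweep on a fitted risk model: lower the decision threshold until fewer than 10% of serious injuries are missed. The crash features and risk model below are synthetic stand-ins, not the police data or the TOYOTA-Nihon University algorithm.

```python
# Sketch of under-triage control: sweep the decision threshold of an injury-risk
# model until the miss rate among serious injuries drops below 10%.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 5000
delta_v = rng.uniform(0, 80, n)                # pseudo delta-V [km/h] (synthetic)
belted = rng.integers(0, 2, n)                 # seatbelt worn (synthetic)
logit = 0.08 * delta_v - 1.5 * belted - 3.0    # synthetic true injury risk
serious = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([delta_v, belted])
risk = LogisticRegression(max_iter=1000).fit(X, serious).predict_proba(X)[:, 1]

threshold = None
for thr in np.linspace(0.5, 0.0, 501):         # lowering the threshold flags more cases
    under_triage = np.mean(risk[serious == 1] < thr)   # serious injuries missed
    if under_triage < 0.10:
        threshold = float(thr)
        break
```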

  17. Machine Learning–Based Differential Network Analysis: A Study of Stress-Responsive Transcriptomes in Arabidopsis

    PubMed Central

    Ma, Chuang; Xin, Mingming; Feldmann, Kenneth A.; Wang, Xiangfeng

    2014-01-01

    Machine learning (ML) is an intelligent data mining technique that builds a prediction model based on the learning of prior knowledge to recognize patterns in large-scale data sets. We present an ML-based methodology for transcriptome analysis via comparison of gene coexpression networks, implemented as an R package called machine learning–based differential network analysis (mlDNA) and apply this method to reanalyze a set of abiotic stress expression data in Arabidopsis thaliana. The mlDNA first used a ML-based filtering process to remove nonexpressed, constitutively expressed, or non-stress-responsive “noninformative” genes prior to network construction, through learning the patterns of 32 expression characteristics of known stress-related genes. The retained “informative” genes were subsequently analyzed by ML-based network comparison to predict candidate stress-related genes showing expression and network differences between control and stress networks, based on 33 network topological characteristics. Comparative evaluation of the network-centric and gene-centric analytic methods showed that mlDNA substantially outperformed traditional statistical testing–based differential expression analysis at identifying stress-related genes, with markedly improved prediction accuracy. To experimentally validate the mlDNA predictions, we selected 89 candidates out of the 1784 predicted salt stress–related genes with available SALK T-DNA mutagenesis lines for phenotypic screening and identified two previously unreported genes, mutants of which showed salt-sensitive phenotypes. PMID:24520154

  18. Base drag prediction on missile configurations

    NASA Technical Reports Server (NTRS)

    Moore, F. G.; Hymer, T.; Wilcox, F.

    1993-01-01

    New wind tunnel data have been taken, and a new empirical model has been developed for predicting base drag on missile configurations. The new wind tunnel data were taken at NASA-Langley in the Unitary Wind Tunnel at Mach numbers from 2.0 to 4.5, angles of attack to 16 deg, fin control deflections up to 20 deg, fin thickness/chord of 0.05 to 0.15, and fin locations from 'flush with the base' to two chord-lengths upstream of the base. The empirical model uses these data along with previous wind tunnel data, estimating base drag as a function of all these variables as well as boat-tail and power-on/power-off effects. The new model yields improved accuracy, compared to wind tunnel data. The new model also is more robust due to inclusion of additional variables. On the other hand, additional wind tunnel data are needed to validate or modify the current empirical model in areas where data are not available.

  19. A Theoretical and Experimental Analysis of the Outside World Perception Process

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1978-01-01

    The outside scene is often an important source of information for manual control tasks; important examples are car driving and aircraft control. This paper deals with modelling this visual scene perception process on the basis of linear perspective geometry and relative motion cues. Model predictions utilizing psychophysical threshold data from baseline experiments and the literature for a variety of visual approach tasks are compared with experimental data. Both the performance and workload results illustrate that the model provides a meaningful description of the outside world perception process, with a useful predictive capability.

  20. The Prediction of the Gas Utilization Ratio Based on TS Fuzzy Neural Network and Particle Swarm Optimization

    PubMed Central

    Jiang, Haihe; Yin, Yixin; Xiao, Wendong; Zhao, Baoyong

    2018-01-01

    Gas utilization ratio (GUR) is an important indicator that is used to evaluate the energy consumption of blast furnaces (BFs). Currently, the existing methods cannot predict the GUR accurately. In this paper, we present a novel data-driven model for predicting the GUR. The proposed approach utilized both the TS fuzzy neural network (TS-FNN) and the particle swarm algorithm (PSO) to predict the GUR. The particle swarm algorithm (PSO) is applied to optimize the parameters of the TS-FNN in order to decrease the error caused by the inaccurate initial parameter. This paper also applied the box graph (Box-plot) method to eliminate the abnormal value of the raw data during the data preprocessing. This method can deal with the data which does not obey the normal distribution which is caused by the complex industrial environments. The prediction results demonstrate that the optimization model based on PSO and the TS-FNN approach achieves higher prediction accuracy compared with the TS-FNN model and SVM model and the proposed approach can accurately predict the GUR of the blast furnace, providing an effective way for the on-line blast furnace distribution control. PMID:29461469
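
    The PSO parameter-tuning step can be sketched on a toy two-parameter prediction model in place of the full TS fuzzy neural network. The inertia and acceleration coefficients below are common defaults, not the paper's settings, and the data are synthetic.

```python
# Minimal particle swarm optimization sketch: particles search for model
# parameters (a, b) that minimize prediction error on a toy GUR-style relation.
import numpy as np

rng = np.random.default_rng(7)
x = rng.random(100)
y_true = 2.0 * x + 1.0                       # target relation to recover

def rmse(params):
    a, b = params
    return float(np.sqrt(np.mean((a * x + b - y_true) ** 2)))

n_particles, dims, iters = 20, 2, 100
pos = rng.uniform(-5, 5, (n_particles, dims))
vel = np.zeros((n_particles, dims))
pbest = pos.copy()                           # each particle's best position
pbest_val = np.array([rmse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()     # swarm's best position

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([rmse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

best_error = rmse(gbest)                     # should approach zero
```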

  1. The Prediction of the Gas Utilization Ratio based on TS Fuzzy Neural Network and Particle Swarm Optimization.

    PubMed

    Zhang, Sen; Jiang, Haihe; Yin, Yixin; Xiao, Wendong; Zhao, Baoyong

    2018-02-20

    Gas utilization ratio (GUR) is an important indicator that is used to evaluate the energy consumption of blast furnaces (BFs). Currently, the existing methods cannot predict the GUR accurately. In this paper, we present a novel data-driven model for predicting the GUR. The proposed approach utilized both the TS fuzzy neural network (TS-FNN) and the particle swarm algorithm (PSO) to predict the GUR. The particle swarm algorithm (PSO) is applied to optimize the parameters of the TS-FNN in order to decrease the error caused by the inaccurate initial parameter. This paper also applied the box graph (Box-plot) method to eliminate the abnormal value of the raw data during the data preprocessing. This method can deal with the data which does not obey the normal distribution which is caused by the complex industrial environments. The prediction results demonstrate that the optimization model based on PSO and the TS-FNN approach achieves higher prediction accuracy compared with the TS-FNN model and SVM model and the proposed approach can accurately predict the GUR of the blast furnace, providing an effective way for the on-line blast furnace distribution control.

  2. Testing by artificial intelligence: computational alternatives to the determination of mutagenicity.

    PubMed

    Klopman, G; Rosenkranz, H S

    1992-08-01

    In order to develop methods for evaluating the predictive performance of computer-driven structure-activity methods (SAR) as well as to determine the limits of predictivity, we investigated the behavior of two Salmonella mutagenicity data bases: (a) a subset from the Genetox Program and (b) one from the U.S. National Toxicology Program (NTP). For molecules common to the two data bases, the experimental concordance was 76% when "marginals" were included and 81% when they were excluded. Three SAR methods were evaluated: CASE, MULTICASE and CASE/Graph Indices (CASE/GI). The programs "learned" the Genetox data base and used it to predict NTP molecules that were not present in the Genetox compilation. The concordances were 72, 80 and 47% respectively. Obviously, the MULTICASE version is superior and approaches the 85% interlaboratory variability observed for the Salmonella mutagenicity assays when the latter was carried out under carefully controlled conditions.

  3. Simulating the Historical Process To Create Laboratory Exercises That Teach Research Methods.

    ERIC Educational Resources Information Center

    Alcock, James

    1994-01-01

    Explains how controlling student access to data can be used as a strategy enabling students to take the role of a research geologist. Students develop models based on limited data and conduct field tests by comparing their predictions with the additional data. (DDR)

  4. Machine learning derived risk prediction of anorexia nervosa.

    PubMed

    Guo, Yiran; Wei, Zhi; Keating, Brendan J; Hakonarson, Hakon

    2016-01-20

    Anorexia nervosa (AN) is a complex psychiatric disease with a moderate to strong genetic contribution. In addition to conventional genome-wide association (GWA) studies, researchers have been using machine learning methods in conjunction with genomic data to predict risk of diseases in which genetics play an important role. In this study, we collected whole genome genotyping data on 3940 AN cases and 9266 controls from the Genetic Consortium for Anorexia Nervosa (GCAN), the Wellcome Trust Case Control Consortium 3 (WTCCC3), the Price Foundation Collaborative Group and the Children's Hospital of Philadelphia (CHOP), and applied machine learning methods for predicting AN disease risk. The prediction performance is measured by area under the receiver operating characteristic curve (AUC), indicating how well the model distinguishes cases from unaffected control subjects. A logistic regression model with the lasso penalty technique generated an AUC of 0.693, while Support Vector Machines and Gradient Boosted Trees reached AUCs of 0.691 and 0.623, respectively. Using different sample sizes, our results suggest that larger datasets are required to optimize the machine learning models and achieve higher AUC values. To our knowledge, this is the first attempt to assess AN risk based on genome-wide genotype-level data. Future integration of genomic, environmental and family-based information is likely to improve the AN risk evaluation process, eventually benefitting AN patients and families in the clinical setting.
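
    The lasso-penalized logistic model mentioned above can be sketched with synthetic genotype dosages; the L1 penalty shrinks most SNP weights to zero, and AUC measures case/control separation. The SNP count, effect sizes, and regularization strength are illustrative assumptions.

```python
# Sketch of lasso-penalized logistic regression for genotype-based risk:
# L1 regularization keeps only a sparse subset of SNPs in the model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
n, n_snps = 1500, 200
X = rng.integers(0, 3, (n, n_snps)).astype(float)   # genotype dosages 0/1/2 (synthetic)
logit = 0.6 * X[:, 0] - 0.6 * X[:, 1] + 0.4 * X[:, 2] - 0.8  # 3 causal SNPs
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(Xtr, ytr)
auc = roc_auc_score(yte, lasso.predict_proba(Xte)[:, 1])
n_used = int(np.count_nonzero(lasso.coef_))         # SNPs retained by the penalty
```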

  5. Towards a Semantically-Enabled Control Strategy for Building Simulations: Integration of Semantic Technologies and Model Predictive Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.

    State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend in enabling control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information based on the models through use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.

  6. 76 FR 61566 - Significant New Use Rules on Certain Chemical Substances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-05

    ... foam control agents. Based on EcoSAR analysis of test data on analogous epoxides, EPA predicts toxicity... control; and an unscheduled DNA synthesis in mammalian cells in culture (OPPTS Test Guideline 870.5550) in...) under section 5(a)(2) of the Toxic Substances Control Act (TSCA) for 36 chemical substances which were...

  7. Modelling and model predictive control for a bicycle-rider system

    NASA Astrophysics Data System (ADS)

    Chu, T. D.; Chen, C. K.

    2018-01-01

    This study proposes a bicycle-rider control model based on model predictive control (MPC). First, a bicycle-rider model with leaning motion of the rider's upper body is developed. The initial simulation data of the bicycle rider are then used to identify the linear model of the system in state-space form for MPC design. Control characteristics of the proposed controller are assessed by simulating the roll-angle tracking control. In this riding task, the MPC uses steering and leaning torques as the control inputs to control the bicycle along a reference roll angle. The simulation results in different cases have demonstrated the applicability and performance of the MPC for bicycle-rider modelling.
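
    The identification step described above, fitting a linear state-space model from logged simulation data for subsequent MPC design, can be sketched by least squares on a toy second-order system; the matrices are illustrative, not the bicycle-rider model.

```python
# Sketch of linear system identification for MPC: estimate A and B in
# x_{k+1} = A x_k + B u_k from logged state/input data by least squares.
import numpy as np

rng = np.random.default_rng(9)
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])   # toy "simulation" dynamics
B_true = np.array([[0.0], [0.5]])

x = np.zeros((201, 2))
u = rng.uniform(-1, 1, (200, 1))              # exciting input sequence
for k in range(200):
    x[k + 1] = A_true @ x[k] + B_true @ u[k]

# Stack [x_k, u_k] -> x_{k+1} and solve for [A B] in one least-squares pass.
Z = np.hstack([x[:-1], u])
theta, *_ = np.linalg.lstsq(Z, x[1:], rcond=None)
A_hat, B_hat = theta[:2].T, theta[2:].T       # recovered state-space matrices
```

With noise-free data the estimates recover the true matrices exactly; the identified model would then be handed to an MPC design in state-space form, as the abstract describes.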

  8. Assessment of Rifle Marksmanship Skill Using Sensor-Based Measures. CRESST Report 755

    ERIC Educational Resources Information Center

    Nagashima, Sam O.; Chung, Gregory K. W. K.; Espinosa, Paul D.; Berka, Chris; Baker, Eva L.

    2009-01-01

    The goal of this report was to test the use of sensor-based skill measures in evaluating performance differences in rifle marksmanship. Ten shots were collected from 30 novices and 9 experts. Three measures for breath control and one for trigger control were used to predict skill classification. The data were fitted with a logistic regression…

  9. Job Design and Ethnic Differences in Working Women’s Physical Activity

    PubMed Central

    Grzywacz, Joseph G.; Crain, A. Lauren; Martinson, Brian C.; Quandt, Sara A.

    2014-01-01

    Objective To document the role job control and schedule control play in shaping women’s physical activity, and how it delineates educational and racial variability in associations of job and social control with physical activity. Methods Prospective data were obtained from a community-based sample of working women (N = 302). Validated instruments measured job control and schedule control. Steps per day were assessed using New Lifestyles 800 activity monitors. Results Greater job control predicted more steps per day, whereas greater schedule control predicted fewer steps. Small indirect associations between ethnicity and physical activity were observed among women with a trade school degree or less but not for women with a college degree. Conclusions Low job control created barriers to physical activity among working women with a trade school degree or less. Greater schedule control predicted less physical activity, suggesting women do not use time “created” by schedule flexibility for personal health enhancement. PMID:24034681

  10. Job design and ethnic differences in working women's physical activity.

    PubMed

    Grzywacz, Joseph G; Crain, A Lauren; Martinson, Brian C; Quandt, Sara A

    2014-01-01

    To document the role job control and schedule control play in shaping women's physical activity, and how it delineates educational and racial variability in associations of job and social control with physical activity. Prospective data were obtained from a community-based sample of working women (N = 302). Validated instruments measured job control and schedule control. Steps per day were assessed using New Lifestyles 800 activity monitors. Greater job control predicted more steps per day, whereas greater schedule control predicted fewer steps. Small indirect associations between ethnicity and physical activity were observed among women with a trade school degree or less but not for women with a college degree. Low job control created barriers to physical activity among working women with a trade school degree or less. Greater schedule control predicted less physical activity, suggesting women do not use time "created" by schedule flexibility for personal health enhancement.

  11. [Application of ARIMA model to predict number of malaria cases in China].

    PubMed

    Hui-Yu, H; Hua-Qin, S; Shun-Xian, Z; Lin, A I; Yan, L U; Yu-Chun, C; Shi-Zhu, L I; Xue-Jiao, T; Chun-Li, Y; Wei, H U; Jia-Xu, C

    2017-08-15

    Objective To study the application of the autoregressive integrated moving average (ARIMA) model to predict the monthly reported malaria cases in China, so as to provide a reference for prevention and control of malaria. Methods SPSS 24.0 software was used to construct the ARIMA models based on the monthly reported malaria cases of the time series of 2006-2015 and 2011-2015, respectively. The data of malaria cases from January to December, 2016 were used as validation data to compare the accuracy of the two ARIMA models. Results The models of the monthly reported cases of malaria in China were ARIMA (2, 1, 1) (1, 1, 0)12 and ARIMA (1, 0, 0) (1, 1, 0)12, respectively. The comparison between the predictions of the two models and the actual situation of malaria cases showed that the ARIMA model based on the data of 2011-2015 had a higher forecasting accuracy than the model based on the data of 2006-2015. Conclusion The establishment and prediction of an ARIMA model is a dynamic process, which needs to be adjusted continually according to the accumulated data; in addition, major changes in the epidemic characteristics of infectious diseases must be considered.
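
    The seasonal ARIMA models above were fitted in SPSS. As a rough illustration of the same idea, the sketch below fits only a plain AR(1) model to a synthetic monthly series by least squares and iterates it for a 12-month forecast; a full ARIMA(1,0,0)(1,1,0)12 fit would add seasonal differencing and is best done with a statistics package. All data and parameter values here are synthetic assumptions, not surveillance data.

```python
import numpy as np

def fit_ar1(series):
    """Fit y[t] = c + phi*y[t-1] + e[t] by ordinary least squares."""
    y_prev, y_curr = series[:-1], series[1:]
    X = np.column_stack([np.ones_like(y_prev), y_prev])
    c, phi = np.linalg.lstsq(X, y_curr, rcond=None)[0]
    return c, phi

def forecast_ar1(last_value, c, phi, steps):
    """Iterate the fitted recursion to produce multi-step-ahead forecasts."""
    preds, y = [], last_value
    for _ in range(steps):
        y = c + phi * y
        preds.append(y)
    return np.array(preds)

# Synthetic monthly series from a known AR(1) process (illustrative only)
rng = np.random.default_rng(0)
y = np.empty(240)
y[0] = 50.0
for t in range(1, 240):
    y[t] = 10.0 + 0.8 * y[t - 1] + rng.normal(0.0, 0.5)

c, phi = fit_ar1(y)
next_year = forecast_ar1(y[-1], c, phi, steps=12)
```

    The recovered coefficient phi should sit near the generating value 0.8, and the 12-step forecast relaxes toward the process mean.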

  12. Adaptive State Predictor Based Human Operator Modeling on Longitudinal and Lateral Control

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Gregory, Irene M.; Hempley, Lucas E.

    2015-01-01

    Control-theoretic modeling of the human operator dynamic behavior in manual control tasks has a long and rich history. In the last two decades, there has been a renewed interest in modeling the human operator. There has also been significant work on techniques used to identify the pilot model of a given structure. The purpose of this research is to attempt to go beyond pilot identification based on collected experimental data and to develop a predictor of pilot behavior. An experiment was conducted to categorize these interactions of the pilot with an adaptive controller compensating during control surface failures. A general linear in-parameter model structure is used to represent a pilot. Three different estimation methods are explored. A gradient descent estimator (GDE), a least squares estimator with exponential forgetting (LSEEF), and a least squares estimator with bounded gain forgetting (LSEBGF) used the experiment data to predict pilot stick input. Previous results have found that the GDE and LSEEF methods are fairly accurate in predicting longitudinal stick input from commanded pitch. This paper discusses the accuracy of each of the three methods - GDE, LSEEF, and LSEBGF - to predict both pilot longitudinal and lateral stick input from the flight director's commanded pitch and bank attitudes.
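
    The least-squares-with-forgetting estimators discussed above can be illustrated with a generic recursive least squares update. The sketch below is a textbook RLS loop with an exponential forgetting factor applied to a synthetic linear-in-parameters model; it is not the paper's pilot model, data, or tuning.

```python
import numpy as np

def rls_forgetting(Phi, y, lam=0.98, delta=100.0):
    """Recursive least squares with exponential forgetting factor lam.
    Phi: (T, n) regressor matrix; y: (T,) measured output (e.g. stick input)."""
    n = Phi.shape[1]
    theta = np.zeros(n)          # parameter estimate
    P = delta * np.eye(n)        # covariance-like gain matrix
    for phi_t, y_t in zip(Phi, y):
        K = P @ phi_t / (lam + phi_t @ P @ phi_t)   # gain vector
        theta = theta + K * (y_t - phi_t @ theta)   # correct with prediction error
        P = (P - np.outer(K, phi_t @ P)) / lam      # discount old data
    return theta

# Identify a hypothetical linear-in-parameters model from synthetic data
rng = np.random.default_rng(1)
Phi = rng.normal(size=(500, 3))
true_theta = np.array([0.8, -0.3, 0.5])
y = Phi @ true_theta + 0.01 * rng.normal(size=500)
est = rls_forgetting(Phi, y)
```

    The forgetting factor lam < 1 keeps the estimator responsive to time-varying behavior, at the cost of higher estimate variance.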

  13. Real-time sensing of fatigue crack damage for information-based decision and control

    NASA Astrophysics Data System (ADS)

    Keller, Eric Evans

    Information-based decision and control for structures that are subject to failure by fatigue cracking is based on the following notion: Maintenance, usage scheduling, and control parameter tuning can be optimized through real-time knowledge of the current state of fatigue crack damage. Additionally, if the material properties of a mechanical structure can be identified within a smaller range, then the remaining life prediction of that structure will be substantially more accurate. Information-based decision systems can rely on physical models, estimation of material properties, exact knowledge of usage history, and sensor data to synthesize an accurate snapshot of the current state of damage and the likely remaining life of a structure under given assumed loading. The work outlined in this thesis is structured to enhance the development of information-based decision and control systems. This is achieved by constructing a test facility for laboratory experiments on real-time damage sensing. This test facility makes use of a methodology that has been formulated for fatigue crack model parameter estimation and significantly improves the quality of predictions of remaining life. Specifically, the thesis focuses on development of an on-line fatigue crack damage sensing and life prediction system that is built upon the disciplines of Systems Sciences and Mechanics of Materials. A major part of the research effort has been expended to design and fabricate a test apparatus which allows: (i) measurement and recording of statistical data for fatigue crack growth in metallic materials via different sensing techniques; and (ii) identification of stochastic model parameters for prediction of fatigue crack damage.
To this end, this thesis describes the test apparatus and the associated instrumentation based on four different sensing techniques, namely, traveling optical microscopy, ultrasonic flaw detection, Alternating Current Potential Drop (ACPD), and fiber-optic extensometry-based compliance, for crack length measurements.

  14. Geomorphically based predictive mapping of soil thickness in upland watersheds

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.; Rasmussen, Craig

    2009-09-01

    The hydrologic response of upland watersheds is strongly controlled by soil (regolith) thickness. Despite the need to quantify soil thickness for input into hydrologic models, there is currently no widely used, geomorphically based method for doing so. In this paper we describe and illustrate a new method for predictive mapping of soil thicknesses using high-resolution topographic data, numerical modeling, and field-based calibration. The model framework works directly with input digital elevation model data to predict soil thicknesses assuming a long-term balance between soil production and erosion. Erosion rates in the model are quantified using one of three geomorphically based sediment transport models: nonlinear slope-dependent transport, nonlinear area- and slope-dependent transport, and nonlinear depth- and slope-dependent transport. The model balances soil production and erosion locally to predict a family of solutions corresponding to a range of values of two unconstrained model parameters. A small number of field-based soil thickness measurements can then be used to calibrate the local value of those unconstrained parameters, thereby constraining which solution is applicable at a particular study site. As an illustration, the model is used to predictively map soil thicknesses in two small, ˜0.1 km2, drainage basins in the Marshall Gulch watershed, a semiarid drainage basin in the Santa Catalina Mountains of Pima County, Arizona. Field observations and calibration data indicate that the nonlinear depth- and slope-dependent sediment transport model is the most appropriate transport model for this site. The resulting framework provides a generally applicable, geomorphically based tool for predictive mapping of soil thickness using high-resolution topographic data sets.
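
    The local balance between soil production and erosion can be made concrete with the commonly assumed exponential soil production function P(h) = P0·exp(-h/h0); setting production equal to a given erosion rate E and solving for thickness gives h = h0·ln(P0/E). The sketch below implements this closed form with illustrative parameter values, not the calibrated values from the Marshall Gulch site.

```python
import numpy as np

def steady_state_soil_thickness(erosion_rate, p0=1e-4, h0=0.5):
    """Thickness h at which soil production p0*exp(-h/h0) balances the
    local erosion rate: h = h0*ln(p0/E), floored at zero where erosion
    meets or exceeds the maximum (bare-bedrock) production rate.
    p0 (m/yr) and h0 (m) are illustrative, not site-calibrated, values."""
    h = h0 * np.log(p0 / np.asarray(erosion_rate, dtype=float))
    return np.clip(h, 0.0, None)

# Soil thins as the erosion rate approaches the maximum production rate
rates = np.array([1e-5, 5e-5, 1e-4])  # m/yr
h = steady_state_soil_thickness(rates)
```

    In the full model the erosion rate itself comes from a slope- or depth-dependent sediment transport law evaluated on the digital elevation model, so h varies across the landscape.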

  15. Predictable and reliable ECG monitoring over IEEE 802.11 WLANs within a hospital.

    PubMed

    Park, Juyoung; Kang, Kyungtae

    2014-09-01

    Telecardiology provides mobility for patients who require constant electrocardiogram (ECG) monitoring. However, its safety is dependent on the predictability and robustness of data delivery, which must overcome errors in the wireless channel through which the ECG data are transmitted. We report here a framework that can be used to gauge the applicability of IEEE 802.11 wireless local area network (WLAN) technology to ECG monitoring systems in terms of delay constraints and transmission reliability. For this purpose, a medical-grade WLAN architecture achieved predictable delay through the combination of a medium access control mechanism based on the point coordination function provided by IEEE 802.11 and an error control scheme based on Reed-Solomon coding and block interleaving. The size of the jitter buffer needed was determined by this architecture to avoid service dropout caused by buffer underrun, through analysis of variations in transmission delay. Finally, we assessed this architecture in terms of service latency and reliability by modeling the transmission of uncompressed two-lead electrocardiogram data from the MIT-BIH Arrhythmia Database and highlight the applicability of this wireless technology to telecardiology.
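
    The error control scheme described above combines Reed-Solomon coding with block interleaving. The sketch below shows only the interleaving half: symbols are written into a block row by row and read out column by column, so that a burst of channel errors is spread across many codewords and the Reed-Solomon decoder sees only a few symbol errors per codeword (the RS code itself is omitted here).

```python
def block_interleave(symbols, rows, cols):
    """Write symbols row-by-row into a rows x cols block, read column-by-column."""
    assert len(symbols) == rows * cols
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(symbols, rows, cols):
    """Inverse permutation: map column-order symbols back into row order."""
    assert len(symbols) == rows * cols
    out = [None] * (rows * cols)
    for i, s in enumerate(symbols):
        c, r = divmod(i, rows)
        out[r * cols + c] = s
    return out

data = list(range(12))
scrambled = block_interleave(data, rows=3, cols=4)
assert block_deinterleave(scrambled, rows=3, cols=4) == data
```

    A burst wiping out several consecutive transmitted symbols lands in different rows after deinterleaving, which is what keeps per-codeword error counts within the RS correction capability.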

  16. Does marital status predict the odds of suicidal death in Taiwan? A seven-year population-based study.

    PubMed

    Yeh, Jui-Yuan; Xirasagar, Sudha; Liu, Tsai-Ching; Li, Chong-Yi; Lin, Herng-Ching

    2008-06-01

    Using nationwide, 7-year population-based data for 1997-2003, we examined marital status to see if it predicted suicide among the ethnic Chinese population of Taiwan. Using cause-of-death data with a case-control design, two groups were studied: total adult suicide deaths (n = 17,850, the study group) and adult deaths other than suicide (n = 71,400, randomly selected from age-, sex-, and geographic-region-matched controls, four per suicide). Using multiple logistic regression analysis including an age-marital status interaction, adjusted estimates show divorced status to be the most detrimental for suicide propensity, with males showing a stronger effect size. Females never married, aged below 35 and 65-plus, and widowed 65-plus had lower suicide odds.

  17. Mated vertical ground vibration test

    NASA Technical Reports Server (NTRS)

    Ivey, E. W.

    1980-01-01

    The Mated Vertical Ground Vibration Test (MVGVT) was considered to provide an experimental base in the form of structural dynamic characteristics for the shuttle vehicle. This data base was used in developing high confidence analytical models for the prediction and design of loads, pogo controls, and flutter criteria under various payloads and operational missions. The MVGVT boost and launch program evolution, test configurations, and their suspensions are described. Test results are compared with predicted analytical results.

  18. A Cloud Based Framework For Monitoring And Predicting Subsurface System Behaviour

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Rodzianko, A.; Johnson, D. V.; Soltanian, M. R.; Dwivedi, D.; Dafflon, B.; Tran, A. P.; Versteeg, O. J.

    2015-12-01

    Subsurface system behavior is driven and controlled by the interplay of physical, chemical, and biological processes that occur at multiple temporal and spatial scales. Capabilities to monitor, understand, and predict this behavior in an effective and timely manner are needed both for scientific purposes and for effective subsurface system management. Such capabilities require three elements: models, data, and an enabling cyberinfrastructure that allows users to apply these models and data effectively. Under a DOE Office of Science funded STTR award, Subsurface Insights and LBNL have designed and implemented a cloud-based predictive assimilation framework (PAF) which automatically ingests, quality-controls, and stores heterogeneous physical and chemical subsurface data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of subsurface systems. PAF is implemented as a modular cloud-based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. Server-side, PAF uses ZF2 (a PHP web application framework) and Python, and both open-source (ODM2) and in-house-developed data models. Client-side, PAF uses CSS and JS to allow for interactive data visualization and analysis. Client-side modularity of the system (which allows for a responsive interface) is achieved by implementing each core capability of PAF (such as data visualization, user configuration and control, electrical geophysical monitoring, and email/SMS alerts on data streams) as a SPA (Single Page Application). One of the recent enhancements is the full integration of a number of flow, mass transport, and parameter estimation codes (e.g., MODFLOW, MT3DMS, PHT3D, TOUGH, PFLOTRAN) in this framework. This integration allows for autonomous and user-controlled modeling of hydrological and geochemical processes.
In our presentation we will discuss our software architecture and present the results of using these codes, as well as the overall performance of our framework, using hydrological, geochemical and geophysical data from the LBNL SFA2 Rifle field site.

  19. Research on time series data prediction based on clustering algorithm - A case study of Yuebao

    NASA Astrophysics Data System (ADS)

    Lu, Xu; Zhao, Tianzhong

    2017-08-01

    Forecasting is a prerequisite for making scientific decisions: based on past information about a phenomenon, combined with the factors affecting it, scientific methods are used to forecast its future development trend, making forecasting an important way for people to know the world. This is particularly important in the prediction of financial data, because proper financial data forecasts can provide a great deal of help to financial institutions in their strategy implementation, strategic alignment, and risk control. However, current forecasts of financial data generally predict the overall data, neglecting customer behavior and other factors in the forecasting process, even though these are important factors influencing the change of financial data. Given this situation, this paper analyzed the data of Yuebao: according to user attributes and operating characteristics, 567 users of Yuebao were classified, and the data of Yuebao were further predicted for each class of users. The results showed that the forecasting model in this paper can meet the demands of forecasting.

  20. Modelling Influence and Opinion Evolution in Online Collective Behaviour

    PubMed Central

    Gend, Pascal; Rentfrow, Peter J.; Hendrickx, Julien M.; Blondel, Vincent D.

    2016-01-01

    Opinion evolution and judgment revision are mediated through social influence. Based on a large crowdsourced in vitro experiment (n = 861), it is shown how a consensus model can be used to predict opinion evolution in online collective behaviour. It is the first time the predictive power of a quantitative model of opinion dynamics is tested against a real dataset. Unlike previous research on the topic, the model was validated on data which did not serve to calibrate it. This avoids favoring more complex models over simpler ones and prevents overfitting. The model is parametrized by the influenceability of each individual, a factor representing to what extent individuals incorporate external judgments. The prediction accuracy depends on prior knowledge of the participants’ past behaviour. Several situations reflecting data availability are compared. When the data are scarce, data from previous participants are used to predict how a new participant will behave. Judgment revision includes unpredictable variations which limit the potential for prediction. A first measure of unpredictability is proposed. The measure is based on a specific control experiment. More than two thirds of the prediction errors are found to occur due to unpredictability of the human judgment revision process rather than to model imperfection. PMID:27336834

  1. Effects of inductive bias on computational evaluations of ligand-based modeling and on drug discovery

    NASA Astrophysics Data System (ADS)

    Cleves, Ann E.; Jain, Ajay N.

    2008-03-01

    Inductive bias is the set of assumptions that a person or procedure makes in making a prediction based on data. Different methods for ligand-based predictive modeling have different inductive biases, with a particularly sharp contrast between 2D and 3D similarity methods. A unique aspect of ligand design is that the data that exist to test methodology have been largely man-made, and that this process of design involves prediction. By analyzing the molecular similarities of known drugs, we show that the inductive bias of the historic drug discovery process has a very strong 2D bias. In studying the performance of ligand-based modeling methods, it is critical to account for this issue in dataset preparation, use of computational controls, and in the interpretation of results. We propose specific strategies to explicitly address the problems posed by inductive bias considerations.

  2. Integrating the Base of Aircraft Data (BADA) in CTAS Trajectory Synthesizer

    NASA Technical Reports Server (NTRS)

    Abramson, Michael; Ali, Kareem

    2012-01-01

    The Center-Terminal Radar Approach Control (TRACON) Automation System (CTAS), developed at NASA Ames Research Center for assisting controllers in the management and control of air traffic in the extended terminal area, supports the modeling of more than four hundred aircraft types. However, 90% of them are supported indirectly by mapping them to one of a relatively few aircraft types for which CTAS has detailed drag and engine thrust models. On the other hand, the Base of Aircraft Data (BADA), developed and maintained by Eurocontrol, supports more than 300 aircraft types, about one third of which are directly supported, i.e. they have validated performance data. All these data were made available for CTAS by integrating BADA version 3.8 into CTAS Trajectory Synthesizer (TS). Several validation tools were developed and used to validate the integrated code and to evaluate the accuracy of trajectory predictions generated using CTAS "native" and BADA Aircraft Performance Models (APM) comparing them with radar track data. Results of these comparisons indicate that the two models have different strengths and weaknesses. The BADA APM can improve the accuracy of CTAS predictions at least for some aircraft types, especially small aircraft, and for some flight phases, especially climb.

  3. Sustained sensorimotor control as intermittent decisions about prediction errors: computational framework and application to ground vehicle steering.

    PubMed

    Markkula, Gustav; Boer, Erwin; Romano, Richard; Merat, Natasha

    2018-06-01

    A conceptual and computational framework is proposed for modelling of human sensorimotor control and is exemplified for the sensorimotor task of steering a car. The framework emphasises control intermittency and extends on existing models by suggesting that the nervous system implements intermittent control using a combination of (1) motor primitives, (2) prediction of sensory outcomes of motor actions, and (3) evidence accumulation of prediction errors. It is shown that approximate but useful sensory predictions in the intermittent control context can be constructed without detailed forward models, as a superposition of simple prediction primitives, resembling neurobiologically observed corollary discharges. The proposed mathematical framework allows straightforward extension to intermittent behaviour from existing one-dimensional continuous models in the linear control and ecological psychology traditions. Empirical data from a driving simulator are used in model-fitting analyses to test some of the framework's main theoretical predictions: it is shown that human steering control, in routine lane-keeping and in a demanding near-limit task, is better described as a sequence of discrete stepwise control adjustments, than as continuous control. Results on the possible roles of sensory prediction in control adjustment amplitudes, and of evidence accumulation mechanisms in control onset timing, show trends that match the theoretical predictions; these warrant further investigation. The results for the accumulation-based model align with other recent literature, in a possibly converging case against the type of threshold mechanisms that are often assumed in existing models of intermittent control.
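
    The accumulate-then-adjust idea in this framework can be caricatured in a few lines: integrate the prediction error as evidence, and when it crosses a threshold, issue one discrete stepwise control adjustment and reset. The sketch below is a toy version with made-up parameter values, not the fitted model from the driving-simulator study.

```python
import numpy as np

def intermittent_control(errors, dt=0.01, gain=1.0, threshold=1.0, k=0.2):
    """Toy accumulate-then-act loop: integrate the prediction error as
    evidence; when the accumulator crosses a threshold, apply one discrete
    stepwise control adjustment proportional to the current error, then
    reset and resume accumulating. All parameter values are illustrative."""
    accumulator, control, adjustments = 0.0, 0.0, []
    for e in errors:
        accumulator += gain * e * dt
        if abs(accumulator) >= threshold:
            control += k * e              # discrete stepwise adjustment
            adjustments.append(control)
            accumulator = 0.0             # restart evidence accumulation
    return adjustments

# A sustained error produces a sparse train of stepwise adjustments,
# not continuous control
steps = intermittent_control(np.full(1000, 0.55))
```

    The key qualitative property is sparseness: control output changes in occasional discrete steps whose timing is governed by evidence accumulation rather than by a fixed sampling clock.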

  4. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite the recent progress in statistical analysis of real-time PCR, few publications have integrated these advancements in real-time PCR based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. For the first method, external calibration curves are established for the transgene based on serially-diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group T-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, but rather is based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination.
These statistical methods allow the real-time PCR-based transgene copy number estimation to be more reliable and precise with a proper statistical estimation. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four different statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays including transfection efficiency analysis and pathogen quantification.
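
    The first experimental design above, an external calibration curve built from serially diluted templates, can be sketched as a simple linear regression of Ct against log10 copy number, inverted to estimate an unknown sample. The numbers below are synthetic and noise-free for illustration; an ideal 100%-efficient assay has a slope of about -3.32 cycles per decade.

```python
import numpy as np

def fit_standard_curve(log10_copies, ct_values):
    """Fit the calibration line Ct = intercept + slope*log10(copies)."""
    slope, intercept = np.polyfit(log10_copies, ct_values, 1)
    return slope, intercept

def estimate_copies(ct, slope, intercept):
    """Invert the calibration line to estimate copy number from a Ct value."""
    return 10.0 ** ((ct - intercept) / slope)

# Synthetic, noise-free standards spanning 10^3..10^7 template copies
logs = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
cts = 38.0 - 3.32 * logs                 # ideal-efficiency calibration data
slope, intercept = fit_standard_curve(logs, cts)

unknown_ct = 38.0 - 3.32 * 5.5           # pretend this Ct was measured
copies = estimate_copies(unknown_ct, slope, intercept)
```

    With real replicate data the same regression yields residuals and confidence intervals, which is exactly the statistical machinery the paper argues is needed for unambiguous copy number calls.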

  5. Predictive Scheduling for Electric Vehicles Considering Uncertainty of Load and User Behaviors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Bin; Huang, Rui; Wang, Yubo

    2016-05-02

    Uncoordinated Electric Vehicle (EV) charging can create unexpected load in the local distribution grid, which may degrade power quality and system reliability. The uncertainty of EV load, user behaviors, and other base load in the distribution grid is one of the challenges that impede optimal control of the EV charging problem. Previous research did not fully solve this problem, owing to the lack of real-world EV charging data and of proper stochastic models to describe these behaviors. In this paper, we propose a new predictive EV scheduling algorithm (PESA) inspired by Model Predictive Control (MPC), which includes a dynamic load estimation module and a predictive optimization module. The user-related EV load and base load are dynamically estimated based on historical data. At each time interval, the predictive optimization program is computed for optimal schedules given the estimated parameters. Only the first element of the algorithm outputs is implemented, according to the MPC paradigm. The current-multiplexing function in each Electric Vehicle Supply Equipment (EVSE) is considered, and accordingly a virtual load is modeled to handle the uncertainties of future EV energy demands. The system is validated with real-world EV charging data collected on the UCLA campus, and the experimental results indicate that the proposed model not only reduces load variation by up to 40% but also maintains a high level of robustness. Finally, the IEC 61850 standard is utilized to standardize the data models involved, enabling more reliable and large-scale implementation.
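
    The receding-horizon structure described above, where a schedule is optimized over a forecast window but only the first element is implemented, can be sketched with a simple water-filling planner that flattens the combined base-plus-EV load. This is an illustrative stand-in with made-up numbers, not the paper's PESA formulation.

```python
import numpy as np

def plan_first_interval(base_load, energy_needed, max_rate):
    """One receding-horizon step: water-fill the EV's remaining energy
    demand over the forecast window so the combined load is as flat as
    possible, then return only the first interval's charging rate,
    per the MPC paradigm. base_load is the forecast over the horizon."""
    lo, hi = 0.0, float(base_load.max()) + max_rate + energy_needed
    for _ in range(60):                       # bisect on the water level
        level = 0.5 * (lo + hi)
        charge = np.clip(level - base_load, 0.0, max_rate)
        if charge.sum() > energy_needed:
            hi = level
        else:
            lo = level
    return float(np.clip(lo - base_load[0], 0.0, max_rate))

# Hypothetical 6-interval base-load forecast: the mid-window valley
# absorbs the 4 units of charging, so nothing is scheduled right now
base = np.array([5.0, 4.0, 2.0, 2.0, 4.0, 5.0])
rate_now = plan_first_interval(base, energy_needed=4.0, max_rate=3.0)
```

    At the next interval the forecasts are re-estimated and the whole optimization is re-run, which is how the scheme absorbs uncertainty in user behavior and base load.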

  6. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These sources often dominate component-level reliability or risk of failure probability. While the consequences of failure are often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.

  7. Multiaxial Fatigue Life Prediction Based on Nonlinear Continuum Damage Mechanics and Critical Plane Method

    NASA Astrophysics Data System (ADS)

    Wu, Z. R.; Li, X.; Fang, L.; Song, Y. D.

    2018-04-01

    A new multiaxial fatigue life prediction model has been proposed in this paper. The concepts of nonlinear continuum damage mechanics and critical plane criteria were incorporated in the proposed model. The shear strain-based damage control parameter was chosen to account for multiaxial fatigue damage under constant amplitude loading. Fatigue tests were conducted on nickel-based superalloy GH4169 tubular specimens at a temperature of 400 °C under proportional and nonproportional loading. The proposed method was checked against the multiaxial fatigue test data of GH4169. Most of the prediction results are within a factor-of-two scatter band of the test results.

  8. AAA gunner model based on observer theory [predicting a gunner's tracking response]

    NASA Technical Reports Server (NTRS)

    Kou, R. S.; Glass, B. C.; Day, C. N.; Vikmanis, M. M.

    1978-01-01

    The Luenberger observer theory is used to develop a predictive model of a gunner's tracking response in antiaircraft artillery systems. This model is composed of an observer, a feedback controller and a remnant element. An important feature of the model is that the structure is simple, hence a computer simulation requires only a short execution time. A parameter identification program based on the least squares curve fitting method and the Gauss-Newton gradient algorithm is developed to determine the parameter values of the gunner model. Thus, a systematic procedure exists for identifying model parameters for a given antiaircraft tracking task. Model predictions of tracking errors are compared with human tracking data obtained from manned simulation experiments. Model predictions are in excellent agreement with the empirical data for several flyby and maneuvering target trajectories.

  9. Prediction of Low Community Sanitation Coverage Using Environmental and Sociodemographic Factors in Amhara Region, Ethiopia

    PubMed Central

    Oswald, William E.; Stewart, Aisha E. P.; Flanders, W. Dana; Kramer, Michael R.; Endeshaw, Tekola; Zerihun, Mulat; Melaku, Birhanu; Sata, Eshetu; Gessesse, Demelash; Teferi, Tesfaye; Tadesse, Zerihun; Guadie, Birhan; King, Jonathan D.; Emerson, Paul M.; Callahan, Elizabeth K.; Moe, Christine L.; Clasen, Thomas F.

    2016-01-01

    This study developed and validated a model for predicting the probability that communities in Amhara Region, Ethiopia, have low sanitation coverage, based on environmental and sociodemographic conditions. Community sanitation coverage was measured between 2011 and 2014 through trachoma control program evaluation surveys. Information on environmental and sociodemographic conditions was obtained from available data sources and linked with community data using a geographic information system. Logistic regression was used to identify predictors of low community sanitation coverage (< 20% versus ≥ 20%). The selected model was geographically and temporally validated. Model-predicted probabilities of low community sanitation coverage were mapped. Among 1,502 communities, 344 (22.90%) had coverage below 20%. The selected model included measures for high topsoil gravel content, an indicator for low-lying land, population density, altitude, and rainfall and had reasonable predictive discrimination (area under the curve = 0.75, 95% confidence interval = 0.72, 0.78). Measures of soil stability were strongly associated with low community sanitation coverage, controlling for community wealth, and other factors. A model using available environmental and sociodemographic data predicted low community sanitation coverage for areas across Amhara Region with fair discrimination. This approach could assist sanitation programs and trachoma control programs, scaling up or in hyperendemic areas, to target vulnerable areas with additional activities or alternate technologies. PMID:27430547

  10. Novel hyperspectral prediction method and apparatus

    NASA Astrophysics Data System (ADS)

    Kemeny, Gabor J.; Crothers, Natalie A.; Groth, Gard A.; Speck, Kathy A.; Marbach, Ralf

    2009-05-01

    Both the power and the challenge of hyperspectral technologies lie in the very large amount of data produced by spectral cameras. While off-line methodologies allow the collection of gigabytes of data, extended data analysis sessions are required to convert the data into useful information. In contrast, real-time monitoring, such as on-line process control, requires that compression of spectral data and analysis occur at a sustained full camera data rate. Efficient, high-speed practical methods for calibration and prediction are therefore sought to optimize the value of hyperspectral imaging. A novel method of matched filtering known as science-based multivariate calibration (SBC) was developed for hyperspectral calibration. Classical (MLR) and inverse (PLS, PCR) methods are combined by spectroscopically measuring the spectral "signal" and statistically estimating the spectral "noise." The accuracy of the inverse model is thus combined with the easy interpretability of the classical model. The SBC method is optimized for hyperspectral data in the Hyper-Cal™ software used for the present work. The prediction algorithms can then be downloaded into a dedicated FPGA-based High-Speed Prediction Engine™ module. Spectral pretreatments and calibration coefficients are stored on interchangeable SD memory cards, and predicted compositions are produced on a USB interface at real-time camera output rates. Applications include minerals, pharmaceuticals, food processing and remote sensing.

  11. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    PubMed

    Davidich, Maria; Köster, Gerta

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: ideally, such models should be based on observations made both in controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: for instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim at constructing such a realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies the key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario, and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations.

  12. Inference for multivariate regression model based on multiply imputed synthetic data generated via posterior predictive sampling

    NASA Astrophysics Data System (ADS)

    Moura, Ricardo; Sinha, Bimal; Coelho, Carlos A.

    2017-06-01

    The recent popularity of synthetic data as a Statistical Disclosure Control technique has enabled the development of several methods for generating and analyzing such data, but these almost always rely on asymptotic distributions and are consequently not adequate for small-sample datasets. Thus, a likelihood-based exact inference procedure is derived for the matrix of regression coefficients of the multivariate regression model, for multiply imputed synthetic data generated via Posterior Predictive Sampling. Since it is based on exact distributions, this procedure may be used even for small-sample datasets. Simulation studies compare the results obtained from the proposed exact inferential procedure with those obtained from an adaptation of Reiter's combination rule to multiply imputed synthetic datasets, and an application to the 2000 Current Population Survey is discussed.

  13. Glycemic Control Indices and Their Aggregation in the Prediction of Nocturnal Hypoglycemia From Intermittent Blood Glucose Measurements.

    PubMed

    Sampath, Sivananthan; Tkachenko, Pavlo; Renard, Eric; Pereverzev, Sergei V

    2016-11-01

    Despite the risk associated with nocturnal hypoglycemia (NH), there are only a few methods aiming at the prediction of such events from intermittent blood glucose monitoring data. One of the first methods that can potentially be used for NH prediction is based on the low blood glucose index (LBGI) and is offered, for example, in Accu-Chek® Connect as a hypoglycemia risk indicator. Nowadays, however, there are other glucose control indices (GCI) that could be used for NH prediction in the same spirit as the LBGI. In the present study we propose a general approach for combining NH predictors constructed from different GCI. The approach is based on a recently developed strategy for aggregating ranking algorithms in machine learning. The NH predictors have been calibrated and tested on data extracted from clinical trials performed in the EU FP7-funded project DIAdvisor. Then, to show the portability of the method, we tested it on another dataset received from the EU Horizon 2020-funded project AMMODIT. We exemplify the proposed approach by aggregating NH predictors constructed from four GCI associated with hypoglycemia. Even though these predictors had been preliminarily optimized to perform well on the considered dataset, our aggregation approach allows a further performance improvement. On the dataset where portability was demonstrated, the aggregating predictor exhibited the following performance: sensitivity 77%, specificity 83.4%, positive predictive value 80.2%, negative predictive value 80.6%, which is higher than what is conventionally considered acceptable. The proposed approach shows potential for use in telemedicine systems for NH prediction. © 2016 Diabetes Technology Society.
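The LBGI mentioned above is a standard glucose control index due to Kovatchev and colleagues. Assuming the usual formulation on readings in mg/dL (a log-power transform that symmetrizes the blood glucose scale around roughly 112.5 mg/dL, then an average of the squared deviations on the hypoglycemic side only), it can be computed as in this sketch; the overnight readings are hypothetical.

```python
import math

def lbgi(bg_values_mgdl):
    """Low Blood Glucose Index: symmetrize the BG scale, then average the
    scaled squared deviations that fall on the hypoglycemic (f < 0) side."""
    total = 0.0
    for bg in bg_values_mgdl:
        f = 1.509 * (math.log(bg) ** 1.084 - 5.381)
        if f < 0:                      # only hypoglycemic risk contributes
            total += 10.0 * f * f
    return total / len(bg_values_mgdl)

# Hypothetical overnight self-monitoring readings (mg/dL)
night_low  = [70, 65, 80, 75, 90]      # readings skewed toward hypoglycemia
night_safe = [110, 120, 130, 115, 125]
print("LBGI (risky night): %.2f" % lbgi(night_low))
print("LBGI (safe night):  %.2f" % lbgi(night_safe))
```

A threshold on this index gives the simplest NH predictor; the paper's contribution is to aggregate several such GCI-based predictors rather than rely on any single one.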

  14. Utility of genetic and non-genetic risk factors in predicting coronary heart disease in Singaporean Chinese.

    PubMed

    Chang, Xuling; Salim, Agus; Dorajoo, Rajkumar; Han, Yi; Khor, Chiea-Chuen; van Dam, Rob M; Yuan, Jian-Min; Koh, Woon-Puay; Liu, Jianjun; Goh, Daniel Yt; Wang, Xu; Teo, Yik-Ying; Friedlander, Yechiel; Heng, Chew-Kiat

    2017-01-01

    Background Although numerous phenotype-based equations for predicting risk of 'hard' coronary heart disease are available, data on the utility of genetic information for such risk prediction are lacking in Chinese populations. Design Case-control study nested within the Singapore Chinese Health Study. Methods A total of 1306 subjects comprising 836 men (267 incident cases and 569 controls) and 470 women (128 incident cases and 342 controls) were included. A Genetic Risk Score comprising 156 single nucleotide polymorphisms that have been robustly associated with coronary heart disease or its risk factors (p < 5 × 10⁻⁸) in at least two independent cohorts of genome-wide association studies was built. For each gender, three base models were used: the recalibrated Adult Treatment Panel III (ATP III) model (M1); the ATP III model fitted using Singapore Chinese Health Study data (M2); and M3: M2 + C-reactive protein + creatinine. Results The Genetic Risk Score was significantly associated with incident 'hard' coronary heart disease (p for men: 1.70 × 10⁻¹⁰ to 1.73 × 10⁻⁹; p for women: 0.001). The inclusion of the Genetic Risk Score in the prediction models improved discrimination in both genders (c-statistics: 0.706-0.722 vs. 0.663-0.695 from base models for men; 0.788-0.790 vs. 0.765-0.773 for women). In addition, the inclusion of the Genetic Risk Score also improved risk classification, with a net gain of cases being reclassified to higher risk categories (men: 12.4%-16.5%; women: 10.2% (M3)), while not significantly reducing the classification accuracy in controls. Conclusions The Genetic Risk Score is an independent predictor of incident 'hard' coronary heart disease in our ethnic Chinese population. Inclusion of genetic factors in coronary heart disease prediction models could significantly improve risk prediction performance.

  15. USAF Flight Test Investigation of Focused Sonic Booms: Project Have Bears

    NASA Technical Reports Server (NTRS)

    Downing, Micah; Zamot, Noel; Moss, Chris; Morin, Daniel; Wolski, Ed; Chung, Sukhwan; Plotkin, Kenneth; Maglieri, Domenic

    1996-01-01

    Supersonic operations by military aircraft generate sonic booms that can affect people, animals and structures. A substantial experimental database exists on sonic booms for aircraft in steady flight, and confidence in the predictive techniques has been established. All the focused sonic boom data in existence today were collected during the 1960s and 1970s as part of the information base for the US Supersonic Transport program and the French Jericho studies for the Concorde. These experiments formed the database used to develop sonic boom propagation and prediction theories for focusing. There is renewed interest in high-speed transports for civilian application. Moreover, today's fighter aircraft have better performance capabilities, and supersonic flights are more common during air combat maneuvers. Most of the existing data on focused booms relate to high-speed civil operations such as transitional linear accelerations and mild turns. However, military aircraft operating in training areas perform more drastic maneuvers such as dives and high-g turns. An update and confirmation of USAF prediction capabilities are required to demonstrate the ability to predict and control sonic boom impacts, especially those produced by air combat maneuvers.

  16. High capacity reversible watermarking for audio by histogram shifting and predicted error expansion.

    PubMed

    Wang, Fei; Xie, Zhaoxin; Chen, Zuo

    2014-01-01

    Being reversible, the watermarking information embedded in audio signals can be extracted while the original audio data achieve lossless recovery. Currently, the few reversible audio watermarking algorithms are confronted with the following problems: relatively low signal-to-noise ratio (SNR) of the embedded audio; a large amount of auxiliary embedded location information; and the absence of accurate capacity control. In this paper, we present a novel reversible audio watermarking scheme based on improved prediction-error expansion and histogram shifting. First, we use a differential evolution algorithm to optimize the prediction coefficients and then apply prediction-error expansion to output the stego data. Second, in order to reduce the location map bit length, we introduce a histogram shifting scheme. Meanwhile, the prediction error modification threshold for a given embedding capacity can be computed by the proposed scheme. Experiments show that this algorithm improves the SNR of embedded audio signals and the embedding capacity, drastically reduces location map bit length, and enhances capacity control capability.
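The embedding step described above can be illustrated with a minimal prediction-error-expansion sketch. It uses a fixed two-sample average predictor rather than the differential-evolution-optimized coefficients of the paper, and it omits the histogram-shifting and location-map machinery; its only purpose is to show why the scheme is exactly reversible: the error e becomes 2e + b, so the payload bit is the parity of the expanded error and the original error is recovered by halving.

```python
def pee_embed(samples, bits):
    """Embed one bit per sample: predict each sample from its two (already
    processed) predecessors, double the prediction error, add the bit."""
    s = list(samples)
    k = 0
    for i in range(2, len(s)):
        if k >= len(bits):
            break
        p = (s[i - 1] + s[i - 2]) // 2
        e = s[i] - p
        s[i] = p + 2 * e + bits[k]
        k += 1
    return s

def pee_extract(stego, n_bits):
    """Invert the embedding: walk right-to-left so each prediction sees the
    same neighbor values the encoder used."""
    s = list(stego)
    bits = []
    last = min(2 + n_bits, len(s)) - 1
    for i in range(last, 1, -1):
        p = (s[i - 1] + s[i - 2]) // 2
        e2 = s[i] - p                  # expanded error 2e + b
        b = e2 & 1                     # payload bit is its parity
        bits.append(b)
        s[i] = p + (e2 - b) // 2       # restore the original sample
    return s, bits[::-1]

audio = [100, 102, 101, 99, 104, 107, 105, 103, 108, 110]
payload = [1, 0, 1, 1, 0, 0, 1]
stego = pee_embed(audio, payload)
recovered, extracted = pee_extract(stego, len(payload))
print(recovered == audio, extracted == payload)
```

In a real coder the expansion is applied only where the error is small (hence the threshold and histogram shift in the paper), since expanding large errors would distort the audio audibly.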

  17. Robust PBPK/PD-Based Model Predictive Control of Blood Glucose.

    PubMed

    Schaller, Stephan; Lippert, Jorg; Schaupp, Lukas; Pieber, Thomas R; Schuppert, Andreas; Eissing, Thomas

    2016-07-01

    Automated glucose control (AGC) has not yet reached the point where it can be applied clinically [3]. Challenges are the accuracy of subcutaneous (SC) glucose sensors, physiological lag times, and both inter- and intraindividual variability. To address the above issues, we developed a novel scheme for MPC that can be applied to AGC. An individualizable generic whole-body physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) model of glucose, insulin, and glucagon metabolism has been used as the predictive kernel. The high level of mechanistic detail represented by the model takes full advantage of the potential of MPC and may make long-term prediction possible, as it captures at least some relevant sources of variability [4]. Robustness against uncertainties was increased by a control cascade relying on proportional-integral-derivative-based offset control. The performance of this AGC scheme was evaluated in silico and retrospectively using data from clinical trials. This analysis revealed that our approach handles sensor noise with a MARD of 10%-14%, as well as model uncertainties and disturbances. The results suggest that PBPK/PD models are well suited for MPC in a glucose control setting, and that their predictive power in combination with the integrated database-driven (a priori individualizable) model framework will help overcome current challenges in the development of AGC systems. This study provides a new, generic, and robust mechanistic approach to AGC using a PBPK platform with extensive a priori (database) knowledge for individualization.

  18. Super short term forecasting of photovoltaic power generation output in micro grid

    NASA Astrophysics Data System (ADS)

    Gong, Cheng; Ma, Longfei; Chi, Zhongjun; Zhang, Baoqun; Jiao, Ran; Yang, Bing; Chen, Jianshu; Zeng, Shuang

    2017-01-01

    A prediction model combining data mining and support vector machines (SVM) was built. It provides information on photovoltaic (PV) power generation output for the economic operation and optimal control of a microgrid, and it reduces the influence of PV fluctuation on the power system. Because PV output depends on radiation intensity, ambient temperature, cloudiness, etc., data mining was brought in. This technique can handle large amounts of historical data and eliminate superfluous data by using a fuzzy classifier of daily weather type and grey relational degree. The SVM model was then built to take the information produced by the data mining stage as its input. The prediction model was tested on measured data from a small PV station. The numerical example shows that the prediction model is fast and accurate.

  19. Roughness Based Crossflow Transition Control: A Computational Assessment

    NASA Technical Reports Server (NTRS)

    Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan; Streett, Craig L.; Carpenter, Mark H.

    2009-01-01

    A combination of parabolized stability equations and secondary instability theory has been applied to a low-speed swept airfoil model with a chord Reynolds number of 7.15 million, with the goals of (i) evaluating this methodology in the context of transition prediction for a known configuration for which roughness-based crossflow transition control has been demonstrated under flight conditions and (ii) analyzing the mechanism of transition delay via the introduction of discrete roughness elements (DRE). Roughness-based transition control involves controlled seeding of suitable, subdominant crossflow modes so as to weaken the growth of naturally occurring, linearly more unstable crossflow modes. Therefore, a synthesis of receptivity, linear and nonlinear growth of stationary crossflow disturbances, and the ensuing development of high-frequency secondary instabilities is desirable to understand the experimentally observed transition behavior. With further validation, such higher-fidelity prediction methodology could be utilized to assess the potential for crossflow transition control at even higher Reynolds numbers, where experimental data are currently unavailable.

  20. Model-Based Control of Observer Bias for the Analysis of Presence-Only Data in Ecology

    PubMed Central

    Warton, David I.; Renner, Ian W.; Ramp, Daniel

    2013-01-01

    Presence-only data, where information is available concerning species presence but not species absence, are subject to bias due to observers being more likely to visit and record sightings at some locations than others (hereafter “observer bias”). In this paper, we describe and evaluate a model-based approach to accounting for observer bias directly – by modelling presence locations as a function of known observer bias variables (such as accessibility variables) in addition to environmental variables, then conditioning on a common level of bias to make predictions of species occurrence free of such observer bias. We implement this idea using point process models with a LASSO penalty, a new presence-only method related to maximum entropy modelling, that implicitly addresses the “pseudo-absence problem” of where to locate pseudo-absences (and how many). The proposed method of bias-correction is evaluated using systematically collected presence/absence data for 62 plant species endemic to the Blue Mountains near Sydney, Australia. It is shown that modelling and controlling for observer bias significantly improves the accuracy of predictions made using presence-only data, and usually improves predictions as compared to pseudo-absence or “inventory” methods of bias correction based on absences from non-target species. Future research will consider the potential for improving the proposed bias-correction approach by estimating the observer bias simultaneously across multiple species. PMID:24260167

  1. Algorithm for cellular reprogramming.

    PubMed

    Ronquist, Scott; Patterson, Geoff; Muir, Lindsey A; Lindsly, Stephen; Chen, Haiming; Brown, Markus; Wicha, Max S; Bloch, Anthony; Brockett, Roger; Rajapakse, Indika

    2017-11-07

    The day we understand the time evolution of subcellular events at a level of detail comparable to physical systems governed by Newton's laws of motion seems far away. Even so, quantitative approaches to cellular dynamics add to our understanding of cell biology. With data-guided frameworks we can develop better predictions about, and methods for, control over specific biological processes and system-wide cell behavior. Here we describe an approach for optimizing the use of transcription factors (TFs) in cellular reprogramming, based on a device commonly used in optimal control. We construct an approximate model for the natural evolution of a cell-cycle-synchronized population of human fibroblasts, based on data obtained by sampling the expression of 22,083 genes at several time points during the cell cycle. To arrive at a model of moderate complexity, we cluster gene expression based on division of the genome into topologically associating domains (TADs) and then model the dynamics of TAD expression levels. Based on this dynamical model and additional data, such as known TF binding sites and activity, we develop a methodology for identifying the top TF candidates for a specific cellular reprogramming task. Our data-guided methodology identifies a number of TFs previously validated for reprogramming and/or natural differentiation and predicts some potentially useful combinations of TFs. Our findings highlight the immense potential of dynamical models, mathematics, and data-guided methodologies for improving strategies for control over biological processes. Copyright © 2017 the Author(s). Published by PNAS.
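The control-theoretic idea above, scoring candidate TFs by how far their perturbation moves a modeled expression state toward a target cell state, can be sketched with a toy linear system. Everything here is invented for illustration (a 3-dimensional "TAD" state, a drift matrix, and per-TF effect vectors); the actual model operates on TAD-level expression dynamics fitted to time-course data.

```python
def step(A, x, u=None):
    """One step of the linear expression dynamics x' = A x (+ u)."""
    y = [sum(a * xi for a, xi in zip(row, x)) for row in A]
    if u is not None:
        y = [yi + ui for yi, ui in zip(y, u)]
    return y

def dist(x, y):
    return sum((a - b) ** 2 for a, b in zip(x, y)) ** 0.5

# Toy 3-"TAD" system: A captures the natural drift of expression levels.
A = [[0.9, 0.05, 0.0],
     [0.0, 0.95, 0.05],
     [0.05, 0.0, 0.9]]

x0     = [1.0, 0.2, 0.1]    # current state (e.g. fibroblast-like profile)
target = [0.1, 0.2, 1.0]    # desired state (e.g. target cell type)

# Hypothetical per-TF effect vectors: how overexpressing each TF shifts TADs.
tf_effects = {"TF_a": [-0.3, 0.0, 0.1],
              "TF_b": [0.0, 0.2, 0.0],
              "TF_c": [-0.2, 0.0, 0.4]}

# Greedy scoring: which single TF moves the next state closest to the target?
scores = {name: dist(step(A, x0, u), target) for name, u in tf_effects.items()}
best = min(scores, key=scores.get)
print("best single-TF candidate:", best)
```

The paper's methodology is of course richer (time-varying dynamics, TF binding and activity data, combinations of TFs), but the ranking-by-controllability intuition is the same.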

  2. Exhaled Breath Markers for Nonimaging and Noninvasive Measures for Detection of Multiple Sclerosis.

    PubMed

    Broza, Yoav Y; Har-Shai, Lior; Jeries, Raneen; Cancilla, John C; Glass-Marmor, Lea; Lejbkowicz, Izabella; Torrecilla, José S; Yao, Xuelin; Feng, Xinliang; Narita, Akimitsu; Müllen, Klaus; Miller, Ariel; Haick, Hossam

    2017-11-15

    Multiple sclerosis (MS) is the most common chronic neurological disease affecting young adults. MS diagnosis is based on clinical characteristics and confirmed by examination of the cerebrospinal fluid (CSF) or by magnetic resonance imaging (MRI) of the brain or spinal cord or both. However, neither of the current diagnostic procedures is adequate as a routine tool to determine disease state. Thus, diagnostic biomarkers are needed. In the current study, a novel approach that could meet these expectations is presented. The approach is based on noninvasive analysis of volatile organic compounds (VOCs) in breath. Exhaled breath was collected from 204 participants: 146 MS patients and 58 healthy controls. Analysis was performed by gas chromatography-mass spectrometry (GC-MS) and a nanomaterial-based sensor array. Predictive models were derived from the sensors using artificial neural networks (ANNs). GC-MS analysis revealed significant differences in VOC abundance between MS patients and controls. Sensor data analysis on training sets was able to discriminate in binary comparisons between MS patients and controls with accuracies up to 90%. Blinded sets showed 95% positive predictive value (PPV) between MS in remission and control, 100% sensitivity with 100% negative predictive value (NPV) between MS not treated (NT) and control, and 86% NPV between relapse and control. Possible links between VOC biomarkers and MS pathogenesis were established. Preliminary results suggest the applicability of a new nanotechnology-based method for MS diagnostics.

  3. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  4. L70 life prediction for solid state lighting using Kalman Filter and Extended Kalman Filter based models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lall, Pradeep; Wei, Junchao; Davis, Lynn

    2013-08-08

    Solid-state lighting (SSL) luminaires containing light emitting diodes (LEDs) have the potential of seeing excessive temperatures when being transported across country or being stored in non-climate-controlled warehouses. They are also being used in outdoor applications in desert environments that see little or no humidity but will experience extremely high temperatures during the day. This makes it important to increase our understanding of what effects high temperature exposure for a prolonged period of time will have on the usability and survivability of these devices. Traditional light sources "burn out" at end-of-life; for an incandescent bulb, the lamp life is defined by B50 life. However, LEDs have no filament to "burn": they continually degrade, and the light output eventually decreases below useful levels, causing failure. Presently, the TM-21 test standard is used to predict the L70 life of LEDs from LM-80 test data. Several failure mechanisms may be active in an LED at a single time, causing lumen depreciation, and the underlying TM-21 model may not capture the failure physics in the presence of multiple failure mechanisms. Correlation of lumen maintenance with the underlying physics of degradation at the system level is needed. In this paper, Kalman Filter (KF) and Extended Kalman Filter (EKF) models have been used to develop a 70-percent lumen maintenance (L70) life prediction model for LEDs used in SSL luminaires. Ten-thousand-hour LM-80 test data for various LEDs have been used for model development. The system state at each future time has been computed based on the state space at the preceding time step, the system dynamics matrix, control vector, control matrix, measurement matrix, measured vector, process noise and measurement noise. The future state of lumen depreciation has been estimated based on a second-order Kalman Filter model and a Bayesian framework. The measured state variable has been related to the underlying damage using physics-based models. L70 life predictions from the KF- and EKF-based models have been compared with TM-21 model predictions and experimental data.
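A minimal version of the level-plus-rate ("second-order") Kalman filter described above can be sketched as follows. The LM-80-style data, noise levels, and filter tuning are hypothetical, and the real models add EKF dynamics and physics-based damage mapping; this only shows the predict/update cycle and the extrapolation of the filtered state to L70.

```python
import random

def kalman_track(measurements, dt, q=1e-8, r=1e-4):
    """Second-order Kalman filter: state = [lumen level, depreciation rate]."""
    x = [measurements[0], 0.0]           # initial state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]         # state covariance
    for z in measurements[1:]:
        # Predict: x' = F x with F = [[1, dt], [0, 1]];  P' = F P F^T + Q
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # Update with measurement z of the lumen level (H = [1, 0])
        S = P[0][0] + r
        K = [P[0][0] / S, P[1][0] / S]
        innov = z - x[0]
        x = [x[0] + K[0] * innov, x[1] + K[1] * innov]
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
    return x

# Hypothetical LM-80-style readings every 1000 h: ~1% depreciation per 1000 h.
random.seed(1)
data = [1.0 - 1e-5 * t + random.gauss(0, 0.0005) for t in range(0, 10001, 1000)]

lumen, rate = kalman_track(data, dt=1000.0)
# Extrapolate the filtered state to L70 (time at which output reaches 70%).
l70_hours = 10000 + (0.70 - lumen) / rate
print("estimated L70 life: %.0f h" % l70_hours)
```

With the synthetic depreciation rate used here, the true L70 would be 30,000 h; the filter's estimate lands in that neighborhood despite the measurement noise.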

  5. An insula-frontostriatal network mediates flexible cognitive control by adaptively predicting changing control demands

    PubMed Central

    Jiang, Jiefeng; Beck, Jeffrey; Heller, Katherine; Egner, Tobias

    2015-01-01

    The anterior cingulate and lateral prefrontal cortices have been implicated in implementing context-appropriate attentional control, but the learning mechanisms underlying our ability to flexibly adapt the control settings to changing environments remain poorly understood. Here we show that human adjustments to varying control demands are captured by a reinforcement learner with a flexible, volatility-driven learning rate. Using model-based functional magnetic resonance imaging, we demonstrate that volatility of control demand is estimated by the anterior insula, which in turn optimizes the prediction of forthcoming demand in the caudate nucleus. The caudate's prediction of control demand subsequently guides the implementation of proactive and reactive attentional control in dorsal anterior cingulate and dorsolateral prefrontal cortices. These data enhance our understanding of the neuro-computational mechanisms of adaptive behaviour by connecting the classic cingulate-prefrontal cognitive control network to a subcortical control-learning mechanism that infers future demands by flexibly integrating remote and recent past experiences. PMID:26391305

  6. Optimization of Control Strategies for Non-Domiciliated Triatoma dimidiata, Chagas Disease Vector in the Yucatán Peninsula, Mexico

    PubMed Central

    Barbu, Corentin; Dumonteil, Eric; Gourbière, Sébastien

    2009-01-01

    Background Chagas disease is the most important vector-borne disease in Latin America. Regional initiatives based on residual insecticide spraying have successfully controlled domiciliated vectors in many regions. Non-domiciliated vectors remain responsible for a significant transmission risk, and their control is now a key challenge for disease control. Methodology/Principal Findings A mathematical model was developed to predict the temporal variations in abundance of non-domiciliated vectors inside houses. Demographic parameters were estimated by fitting the model to two years of field data from the Yucatan peninsula, Mexico. The predictive value of the model was tested on an independent data set before simulations examined the efficacy of control strategies based on residual insecticide spraying, insect screens, and bednets. The model accurately fitted and predicted field data in the absence and presence of insecticide spraying. Pyrethroid spraying was found effective when 50 mg/m2 were applied yearly within a two-month period matching the immigration season. The >80% reduction in bug abundance was not improved by larger doses or more frequent interventions, and it decreased drastically for different timing and lower frequencies of intervention. Alternatively, the use of insect screens consistently reduced bug abundance proportionally to the reduction of the vector immigration rate. Conclusion/Significance Control of non-domiciliated vectors can hardly be achieved by insecticide spraying, because it would require yearly application and an accurate understanding of the temporal pattern of immigration. Insect screens appear to offer an effective and sustainable alternative, which may be part of multi-disease interventions for the integrated control of neglected vector-borne diseases. PMID:19365542

  7. Improvement of mindfulness skills during Mindfulness-Based Cognitive Therapy predicts long-term reductions of neuroticism in persons with recurrent depression in remission.

    PubMed

    Spinhoven, Philip; Huijbers, Marloes J; Ormel, Johan; Speckens, Anne E M

    2017-04-15

    This study examined whether changes in mindfulness skills following Mindfulness-Based Cognitive Therapy (MBCT) are predictive of long-term changes in personality traits. Using data from the MOMENT study, we included 278 participants with recurrent depression in remission allocated to MBCT. Mindfulness skills were measured with the FFMQ at baseline, after treatment and at 15-month follow-up, and personality traits with the NEO-PI-R at baseline and follow-up. For 138 participants, complete repeated assessments of mindfulness and personality traits were available. Following MBCT, participants manifested significant improvement of mindfulness skills. Moreover, at 15-month follow-up participants showed significantly lower levels of neuroticism and higher levels of conscientiousness. Large improvements in mindfulness skills after treatment predicted the long-term changes in neuroticism but not in conscientiousness, while controlling for use of maintenance antidepressant medication, baseline depression severity and change in depression severity during follow-up (IDS-C). In particular, improvements in the facet of acting with awareness predicted lower levels of neuroticism. Sensitivity analyses with multiple data imputation yielded similar results. This was an uncontrolled clinical study with substantial attrition, based on data from two randomized controlled trials. The design of the present study precludes establishing whether there is any causal association between changes in mindfulness and subsequent changes in neuroticism. MBCT could be a viable intervention to directly target one of the most important risk factors for the onset and maintenance of recurrent depression and other mental disorders, i.e. neuroticism. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Output MSE and PSNR prediction in DCT-based lossy compression of remote sensing images

    NASA Astrophysics Data System (ADS)

    Kozhemiakin, Ruslan A.; Abramov, Sergey K.; Lukin, Vladimir V.; Vozel, Benoit; Chehdi, Kacem

    2017-10-01

    The amount and size of remote sensing (RS) images acquired by modern systems are so large that the data have to be compressed in order to transfer, store, and disseminate them. Lossy compression is becoming more popular in such situations, but it has to be applied carefully, keeping the introduced distortions at an acceptable level so as not to lose valuable information contained in the data. The introduced losses therefore have to be controlled and predicted, which is problematic for many coders. In this paper, we analyze the possibility of predicting the mean square error (MSE) or, equivalently, the PSNR for coders based on the discrete cosine transform (DCT), applied either to single-channel RS images or to multichannel data in a component-wise manner. The proposed approach is based on the direct dependence between the distortions introduced by DCT coefficient quantization and the losses in the compressed data. A further innovation is the possibility of employing only a limited number (percentage) of blocks for which the DCT coefficients have to be calculated; this accelerates prediction and makes it considerably faster than the compression itself. There are two other advantages of the proposed approach. First, it is applicable to both uniform and non-uniform quantization of DCT coefficients. Second, the approach is quite general, since it works for several analyzed DCT-based coders. The simulation results are obtained for standard test images and then verified on real-life RS data.
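One way to see the "direct dependence" the abstract relies on: for an orthonormal DCT, the energy of the coefficient quantization error equals the spatial-domain MSE (Parseval's relation), so the MSE and PSNR can be predicted from coefficient errors alone, without inverse-transforming. The 1-D 8-sample blocks and uniform quantizer below are illustrative simplifications of the 2-D coders analyzed in the paper.

```python
import math, random

N = 8

def dct(x):
    """Orthonormal 1-D DCT-II of an 8-sample block."""
    out = []
    for k in range(N):
        a = math.sqrt((1 if k == 0 else 2) / N)
        out.append(a * sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                           for n in range(N)))
    return out

def idct(c):
    """Inverse (DCT-III) with the same orthonormal scaling."""
    return [sum(math.sqrt((1 if k == 0 else 2) / N) * c[k]
                * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                for k in range(N))
            for n in range(N)]

def quantize(c, step):
    return [step * round(ck / step) for ck in c]

random.seed(0)
blocks = [[random.gauss(128, 30) for _ in range(N)] for _ in range(200)]
step = 10.0

pred_err = act_err = 0.0
for b in blocks:
    c = dct(b)
    cq = quantize(c, step)
    # Predicted: coefficient-domain quantization error (no inverse transform).
    pred_err += sum((ci - qi) ** 2 for ci, qi in zip(c, cq))
    # Actual: full compress/decompress round trip.
    act_err += sum((xi - yi) ** 2 for xi, yi in zip(b, idct(cq)))

pred_mse = pred_err / (len(blocks) * N)
act_mse = act_err / (len(blocks) * N)
psnr = 10 * math.log10(255.0 ** 2 / pred_mse)
print("predicted MSE %.3f, actual MSE %.3f, predicted PSNR %.1f dB"
      % (pred_mse, act_mse, psnr))
```

The paper's speed-up follows directly from this equality: since per-block error statistics are all that is needed, they can be estimated from a limited percentage of blocks instead of all of them.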

  9. The backend design of an environmental monitoring system upon real-time prediction of groundwater level fluctuation under the hillslope.

    PubMed

    Lin, Hsueh-Chun; Hong, Yao-Ming; Kan, Yao-Chiang

    2012-01-01

    The groundwater level is a critical factor in evaluating hillside landslides. A monitoring system built on a real-time prediction platform with online analytical functions is important for forecasting the groundwater level from instantaneously monitored data when heavy precipitation raises the groundwater level under the hillslope and causes instability. This study designs the backend of an environmental monitoring system with efficient machine-learning algorithms and a knowledge bank for predicting groundwater level fluctuation. A Web-based platform with a model-view-controller (MVC) architecture is established, using Web services and an engineering data warehouse to support online analytical processing and to feed risk assessment parameters back for real-time prediction. The proposed system incorporates models for hydrological computation, machine learning, Web services, and online prediction to satisfy a variety of risk assessment requirements and hazard prevention approaches. Rainfall data monitored from the potential landslide areas at Lu-Shan, Nantou, and Li-Shan, Taichung, in Taiwan, are applied to examine the system design.

  10. Handling qualities effects of display latency

    NASA Technical Reports Server (NTRS)

    King, David W.

    1993-01-01

    Display latency is the time delay between aircraft response and the corresponding response of the cockpit displays. Currently, there is no explicit specification for allowable display lags to ensure acceptable aircraft handling qualities in instrument flight conditions. This paper examines the handling qualities effects of display latency between 70 and 400 milliseconds for precision instrument flight tasks of the V-22 Tiltrotor aircraft. Display delay effects on the pilot control loop are analytically predicted through a second order pilot crossover model of the V-22 lateral axis, and handling qualities trends are evaluated through a series of fixed-base piloted simulation tests. The results show that the effects of display latency for flight path tracking tasks are driven by the stability characteristics of the attitude control loop. The data indicate that the loss of control damping due to latency can be simply predicted from knowledge of the aircraft's stability margins, control system lags, and required control bandwidths. Based on the relationship between attitude control damping and handling qualities ratings, latency design guidelines are presented. In addition, this paper presents a design philosophy, supported by simulation data, for using flight director display augmentation to suppress the effects of display latency for delays up to 300 milliseconds.
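The record's claim that damping loss due to latency can be predicted from stability margins can be illustrated with the standard phase-margin relation: a pure delay tau subtracts omega*tau radians of phase at the loop crossover frequency, so the delay that consumes the entire phase margin is a simple quotient. A hedged sketch — the numbers below are illustrative, not V-22 data:

```python
import numpy as np

def latency_phase_loss(tau, crossover_hz):
    """Phase (radians) removed at the crossover frequency by a pure
    time delay tau: delta_phi = omega_c * tau."""
    return 2 * np.pi * crossover_hz * tau

def max_latency(phase_margin_deg, crossover_hz):
    # Delay that would consume the entire phase margin, i.e. the
    # point at which the attitude loop loses all damping margin.
    return np.deg2rad(phase_margin_deg) / (2 * np.pi * crossover_hz)
```

For example, a loop with a 45-degree phase margin and a 0.5 Hz crossover tolerates at most 250 ms of total delay before the margin is exhausted, which is consistent in spirit with the 300 ms guideline cited in the paper.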

  11. A wavelet-based technique to predict treatment outcome for Major Depressive Disorder.

    PubMed

    Mumtaz, Wajid; Xia, Likun; Mohd Yasin, Mohd Azhar; Azhar Ali, Syed Saad; Malik, Aamir Saeed

    2017-01-01

    Treatment management for Major Depressive Disorder (MDD) has been challenging. However, electroencephalogram (EEG)-based predictions of antidepressant treatment outcome may help during antidepressant selection and ultimately improve the quality of life of MDD patients. In this study, a machine learning (ML) method involving pretreatment EEG data was proposed to perform such predictions for Selective Serotonin Reuptake Inhibitors (SSRIs). For this purpose, experimental data were acquired from 34 MDD patients and 30 healthy controls. A feature matrix was then constructed from the time-frequency decomposition of the EEG data based on wavelet transform (WT) analysis, termed the EEG data matrix. Because the resultant EEG data matrix had high dimensionality, dimension reduction was performed with a rank-based feature selection method according to a receiver operating characteristic (ROC) criterion. The most significant features were identified and further utilized during the training and testing of a classification model, a logistic regression (LR) classifier. Finally, the LR model was validated with 100 iterations of 10-fold cross-validation (10-CV). The classification results were compared with short-time Fourier transform (STFT) analysis and empirical mode decomposition (EMD). The wavelet features extracted from frontal and temporal EEG data were found statistically significant. In comparison with other time-frequency approaches such as STFT and EMD, WT analysis showed the highest classification accuracy: accuracy = 87.5%, sensitivity = 95%, and specificity = 80%. In conclusion, significant wavelet coefficients extracted from frontal and temporal pre-treatment EEG data in the delta and theta frequency bands may predict antidepressant treatment outcome for MDD patients.
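The rank-based, ROC-criterion feature selection step described above can be sketched with a nonparametric AUC computed from ranks (the Mann-Whitney U statistic). This is a from-scratch illustration of the selection criterion only — the wavelet features themselves and the LR classifier are assumed to exist elsewhere in the pipeline:

```python
import numpy as np

def auc(scores, labels):
    """Rank-based AUC (Mann-Whitney U): the probability that a
    randomly chosen positive sample outranks a random negative one."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

def rank_features(X, y, keep=5):
    # Score each feature column by how far its AUC is from chance
    # (0.5) and keep the most discriminative ones.
    scores = np.array([abs(auc(X[:, j], y) - 0.5) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:keep]
```

The selected columns would then feed the logistic regression classifier inside each cross-validation fold.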

  12. Predicting the Reasons of Customer Complaints: A First Step Toward Anticipating Quality Issues of In Vitro Diagnostics Assays with Machine Learning.

    PubMed

    Aris-Brosou, Stephane; Kim, James; Li, Li; Liu, Hui

    2018-05-15

    Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but lead a customer to complain. We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim is therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. QC data from five select in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix these before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. ©Stephane Aris-Brosou, James Kim, Li Li, Hui Liu. 
Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 15.05.2018.
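The adaptive-boosting classifier used in this record can be illustrated with a from-scratch AdaBoost over decision stumps. This is a sketch of the algorithm's mechanics under simplifying assumptions (binary labels in {-1,+1}, stump weak learners); the study itself presumably used library implementations and richer QC features:

```python
import numpy as np

def train_stump(X, y, w):
    # Exhaustively pick the (feature, threshold, polarity) with the
    # lowest weighted error
    best = (np.inf, 0, 0.0, 1)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, t, pol)
    return best

def adaboost(X, y, rounds=10):
    """AdaBoost with decision stumps; y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    model = []
    for _ in range(rounds):
        err, j, t, pol = train_stump(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # weak learner weight
        pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
        w *= np.exp(-alpha * y * pred)          # upweight mistakes
        w /= w.sum()
        model.append((alpha, j, t, pol))
    return model

def predict(model, X):
    s = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1)
            for a, j, t, p in model)
    return np.sign(s)
```

Cross-validated error of such an ensemble is what the study reports as "close to zero" for some assays.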

  13. Predicting the Reasons of Customer Complaints: A First Step Toward Anticipating Quality Issues of In Vitro Diagnostics Assays with Machine Learning

    PubMed Central

    Kim, James; Li, Li; Liu, Hui

    2018-01-01

    Background Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but lead a customer to complain. Objective We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim is therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. Methods QC data from five select in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. Results The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. Conclusions This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix these before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. PMID:29764796

  14. Optimal strategy analysis based on robust predictive control for inventory system with random demand

    NASA Astrophysics Data System (ADS)

    Saputra, Aditya; Widowati, Sutrisno

    2017-12-01

    In this paper, the optimal strategy for a single-product, single-supplier inventory system with random demand is analyzed using robust predictive control with an additive random parameter. We formulate the system's dynamics as a linear state-space model with an additive random parameter. To determine and analyze the optimal strategy, we use a robust predictive control approach that yields the optimal product volume to purchase from the supplier in each time period so that the expected cost is minimal. A numerical simulation is performed with generated random inventory data in MATLAB, where the inventory level must be controlled as closely as possible to a chosen set point. The results show that the robust predictive control model provides the optimal purchase volumes and that the inventory level followed the given set point.
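The set-point-tracking behavior described above can be sketched with a certainty-equivalence receding-horizon policy for the inventory balance x[k+1] = x[k] + u[k] - d[k]. This is a deliberately simplified one-step sketch, not the paper's robust formulation: the demand distribution, set point, and non-negativity constraint on orders are assumptions.

```python
import numpy as np

def receding_horizon_order(inventory, setpoint, demand_forecast):
    """One step of a certainty-equivalence predictive policy: order
    the amount that steers predicted inventory to the set point,
    substituting the expected demand for the random one (orders >= 0)."""
    return max(0.0, setpoint - inventory + demand_forecast)

def simulate(setpoint=100.0, periods=50, seed=1):
    # Inventory balance: x[k+1] = x[k] + u[k] - d[k], d ~ N(20, 4)
    rng = np.random.default_rng(seed)
    inv, levels = setpoint, []
    for _ in range(periods):
        demand = rng.normal(20.0, 4.0)
        order = receding_horizon_order(inv, setpoint, 20.0)
        inv = inv + order - demand
        levels.append(inv)
    return np.array(levels)
```

Under this policy the inventory fluctuates around the set point with the variance of the demand noise, which mirrors the "inventory level followed the given set point" result.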

  15. Development of Rock Engineering Systems-Based Models for Flyrock Risk Analysis and Prediction of Flyrock Distance in Surface Blasting

    NASA Astrophysics Data System (ADS)

    Faramarzi, Farhad; Mansouri, Hamid; Farsangi, Mohammad Ali Ebrahimi

    2014-07-01

    The environmental effects of blasting must be controlled in order to comply with regulatory limits. Because of safety concerns, the risk of damage to infrastructure, equipment, and property, and the need for good fragmentation, flyrock control is crucial in blasting operations. If measures to decrease flyrock are taken, the flyrock distance is limited and, in return, the risk of damage can be reduced or eliminated. This paper deals with modeling the level of risk associated with flyrock, as well as flyrock distance prediction, based on the rock engineering systems (RES) methodology. In the proposed models, 13 parameters affecting blast-induced flyrock are considered as inputs, and the flyrock distance and associated level of risk as outputs. The ease of measurement was also taken into account when selecting input parameters. Data for 47 blasts carried out at the Sungun copper mine, western Iran, were used to predict the level of risk and flyrock distance corresponding to each blast. The results showed that, for these 47 blasts, the estimated risk levels are mostly in accordance with the measured flyrock distances. Furthermore, a comparison was made between the results of the flyrock distance predictive RES-based model, a multivariate regression analysis model (MVRM), and a dimensional analysis model. For the RES-based model, R^2 and root mean square error (RMSE) are 0.86 and 10.01, respectively, whereas for the MVRM and dimensional analysis, R^2 and RMSE are (0.84 and 12.20) and (0.76 and 13.75), respectively. These results confirm the better performance of the RES-based model over the other proposed models.
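The R^2 and RMSE figures used to compare the three models above are standard goodness-of-fit measures; for reference, they can be computed as follows (a generic sketch, not tied to the paper's data):

```python
import numpy as np

def r2_rmse(y_true, y_pred):
    """Coefficient of determination and root mean square error."""
    resid = y_true - y_pred
    rmse = np.sqrt(np.mean(resid ** 2))
    # R^2 = 1 - SS_residual / SS_total
    r2 = 1 - np.sum(resid ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return r2, rmse
```

A higher R^2 together with a lower RMSE — as reported for the RES-based model (0.86, 10.01) versus the MVRM (0.84, 12.20) and dimensional analysis (0.76, 13.75) — indicates the better-performing predictor.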

  16. Prediction Study on Anti-Slide Control of Railway Vehicle Based on RBF Neural Networks

    NASA Astrophysics Data System (ADS)

    Yang, Lijun; Zhang, Jimin

    During railway vehicle braking, the anti-slide control system detects the operating status of each wheel-set, e.g., speed difference and deceleration. Once the detected value on some wheel-set exceeds a pre-defined threshold, the brake effort on that wheel-set is adjusted automatically to avoid locking. This method helps guarantee safe vehicle operation and avoid wheel-set flats; however, it cannot adapt itself to variation in rail adhesion. While wheel-sets slide, the operating status forms a chaotic time series with a certain underlying law, and can be predicted over a short time using that law and experimental data. The predicted values can be used as input reference signals for the vehicle anti-slide control system, to judge and control the slide status of the wheel-sets. In this article, an RBF neural network is used to predict wheel-set slide status multiple steps ahead, with the weight vector adjusted by an online self-adaptive algorithm and the centers and normalization parameters of the hidden-layer activation functions computed with the K-means clustering algorithm. Multi-step prediction simulations show that the predicted signal, with appropriate precision, can be used by the anti-slide system to actively trace and adjust the wheel-set slide tendency, so as to adapt to wheel-rail adhesion variation and reduce the risk of wheel-set locking.
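The RBF-network structure described in this record — K-means-placed centers, Gaussian hidden units, trained output weights — can be sketched as below. This is a minimal batch-trained illustration; the paper's online self-adaptive weight update and the specific wheel-set signals are not reproduced here.

```python
import numpy as np

def kmeans_centers(X, k, iters=50, seed=0):
    # Plain Lloyd's k-means to place the RBF centers
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

class RBFNet:
    """Gaussian RBF network: k-means centers, shared width from the
    mean nearest-neighbor center spacing, output weights by linear
    least squares."""
    def fit(self, X, y, k=10, seed=0):
        self.c = kmeans_centers(X, k, seed=seed)
        d = np.sqrt(((self.c[:, None] - self.c[None]) ** 2).sum(-1))
        np.fill_diagonal(d, np.inf)
        self.sigma = d.min(axis=1).mean()
        self.w, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)
        return self

    def _phi(self, X):
        d2 = ((X[:, None] - self.c[None]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.sigma ** 2))

    def predict(self, X):
        return self._phi(X) @ self.w
```

Multi-step prediction would feed the network's own outputs back in as inputs for the next step.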

  17. Study on SOC wavelet analysis for LiFePO4 battery

    NASA Astrophysics Data System (ADS)

    Liu, Xuepeng; Zhao, Dongmei

    2017-08-01

    Improving the prediction accuracy of state of charge (SOC) can reduce the conservatism and complexity of control strategies such as scheduling, optimization, and planning for LiFePO4 battery systems. Based on an analysis of the relationship between historical SOC data and external stress factors, an SOC estimation-correction prediction model based on wavelet analysis is established. A high-precision wavelet neural network prediction model implements the estimation (forecast) step, while measured external stress data are used to update the parameter estimates in the model, implementing the correction step; this allows the forecast model to adapt to the operating point of the LiFePO4 battery over a variable operating region under rated charge and discharge conditions. Test results show that the method obtains a high-precision prediction model even when the input and output of the LiFePO4 battery change frequently.
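The correction step of an estimation-correction loop — folding each new measurement into the model's parameters — can be illustrated with a recursive least-squares (RLS) update. This is a generic stand-in for the paper's wavelet-neural-network correction link, shown only to make the predict/correct structure concrete; the forgetting factor and linear-in-parameters model are assumptions.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One recursive least-squares correction step: fold a new
    measured sample (x, y) into parameters theta with forgetting
    factor lam, so the model tracks a drifting operating point."""
    Px = P @ x
    k = Px / (lam + x @ Px)             # Kalman-style gain
    theta = theta + k * (y - x @ theta)  # correct with the residual
    P = (P - np.outer(k, Px)) / lam      # update covariance
    return theta, P
```

In the paper's scheme, the prediction step would come from the wavelet neural network and the correction step from measured external stress data; here both roles are played by the linear model for clarity.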

  18. Analysis of clinically important factors on the performance of advanced hydraulic, microprocessor-controlled exo-prosthetic knee joints based on 899 trial fittings

    PubMed Central

    Hahn, Andreas; Lang, Michael; Stuckart, Claudia

    2016-01-01

    The objective of this work is to evaluate whether clinically important factors may predict an individual's capability to utilize the functional benefits provided by an advanced hydraulic, microprocessor-controlled exo-prosthetic knee component. This retrospective cross-sectional cohort analysis investigated the data of above-knee amputees captured during routine trial fittings. Prosthetists rated the performance indicators showing the functional benefits of the advanced maneuvering capabilities of the device. Subjects were asked to rate their perception. Simple and multiple linear and logistic regression was applied. Data from 899 subjects with demographics typical for the population were evaluated. Ability to vary gait speed, perform toileting, and ascend stairs were identified as the most sensitive performance predictors. Prior C-Leg users showed benefits during advanced maneuvering. Variables showed plausible and meaningful effects but could not claim predictive power. Mobility grade showed the largest effect but also failed to be predictive. Clinical parameters such as etiology, age, mobility grade, and others analyzed here do not suffice to predict individual potential. Daily walking distance may pose a threshold value and be part of a predictive instrument. Decisions based solely on single parameters such as mobility grade rating or walking distance seem to be questionable. PMID:27828871

  19. Analysis of clinically important factors on the performance of advanced hydraulic, microprocessor-controlled exo-prosthetic knee joints based on 899 trial fittings.

    PubMed

    Hahn, Andreas; Lang, Michael; Stuckart, Claudia

    2016-11-01

    The objective of this work is to evaluate whether clinically important factors may predict an individual's capability to utilize the functional benefits provided by an advanced hydraulic, microprocessor-controlled exo-prosthetic knee component. This retrospective cross-sectional cohort analysis investigated the data of above-knee amputees captured during routine trial fittings. Prosthetists rated the performance indicators showing the functional benefits of the advanced maneuvering capabilities of the device. Subjects were asked to rate their perception. Simple and multiple linear and logistic regression was applied. Data from 899 subjects with demographics typical for the population were evaluated. Ability to vary gait speed, perform toileting, and ascend stairs were identified as the most sensitive performance predictors. Prior C-Leg users showed benefits during advanced maneuvering. Variables showed plausible and meaningful effects but could not claim predictive power. Mobility grade showed the largest effect but also failed to be predictive. Clinical parameters such as etiology, age, mobility grade, and others analyzed here do not suffice to predict individual potential. Daily walking distance may pose a threshold value and be part of a predictive instrument. Decisions based solely on single parameters such as mobility grade rating or walking distance seem to be questionable.

  20. Network of listed companies based on common shareholders and the prediction of market volatility

    NASA Astrophysics Data System (ADS)

    Li, Jie; Ren, Da; Feng, Xu; Zhang, Yongjie

    2016-11-01

    In this paper, we build a network of listed companies in the Chinese stock market based on common shareholding data from 2003 to 2013. We analyze the evolution of topological characteristics of the network (e.g., average degree, diameter, average path length and clustering coefficient) with respect to the time sequence. Additionally, we consider the economic implications of topological characteristic changes on market volatility and use them to make future predictions. Our study finds that the network diameter significantly predicts volatility. After adding control variables used in traditional financial studies (volume, turnover and previous volatility), network topology still significantly influences volatility and improves the predictive ability of the model.
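The network diameter that this record identifies as the significant volatility predictor is the longest shortest path in the graph of companies linked by common shareholders. A self-contained sketch using breadth-first search (assuming an unweighted, connected, undirected graph given as an adjacency dictionary):

```python
from collections import deque

def diameter(adj):
    """Graph diameter: the maximum over all nodes of the farthest
    BFS distance.  adj maps each node to a list of its neighbors;
    the graph is assumed connected and undirected."""
    def farthest(s):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return max(dist.values())
    return max(farthest(s) for s in adj)
```

In the study's setting, nodes would be listed companies and an edge would join two companies that share a common shareholder; the diameter of that graph is then used as a regressor for future market volatility.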

  1. Improving spatial prediction of Schistosoma haematobium prevalence in southern Ghana through new remote sensors and local water access profiles.

    PubMed

    Kulinkina, Alexandra V; Walz, Yvonne; Koch, Magaly; Biritwum, Nana-Kwadwo; Utzinger, Jürg; Naumova, Elena N

    2018-06-04

    Schistosomiasis is a water-related neglected tropical disease. In many endemic low- and middle-income countries, insufficient surveillance and reporting lead to poor characterization of the demographic and geographic distribution of schistosomiasis cases. Hence, modeling is relied upon to predict areas of high transmission and to inform control strategies. We hypothesized that utilizing remotely sensed (RS) environmental data in combination with water, sanitation, and hygiene (WASH) variables could improve on the current predictive modeling approaches. Schistosoma haematobium prevalence data, collected from 73 rural Ghanaian schools, were used in a random forest model to investigate the predictive capacity of 15 environmental variables derived from RS data (Landsat 8, Sentinel-2, and Global Digital Elevation Model) with fine spatial resolution (10-30 m). Five methods of variable extraction were tested to determine the spatial linkage between school-based prevalence and the environmental conditions of potential transmission sites, including applying the models to known human water contact locations. Lastly, measures of local water access and groundwater quality were incorporated into RS-based models to assess the relative importance of environmental and WASH variables. Predictive models based on environmental characterization of specific locations where people contact surface water bodies offered some improvement as compared to the traditional approach based on environmental characterization of locations where prevalence is measured. A water index (MNDWI) and topographic variables (elevation and slope) were important environmental risk factors, while overall, groundwater iron concentration predominated in the combined model that included WASH variables. The study helps to understand localized drivers of schistosomiasis transmission. 
Specifically, unsatisfactory water quality in boreholes perpetuates reliance on surface water bodies, indirectly increasing schistosomiasis risk and resulting in rapid reinfection (up to 40% prevalence six months after preventive chemotherapy). Considering WASH-related risk factors in schistosomiasis prediction can help shift the focus of control strategies from treating symptoms to reducing exposure.

  2. Modeling a full-scale primary sedimentation tank using artificial neural networks.

    PubMed

    Gamal El-Din, A; Smith, D W

    2002-05-01

    Modeling the performance of full-scale primary sedimentation tanks has commonly been done using regression-based models, which are empirical relationships derived strictly from observed daily average influent and effluent data. Another approach to modeling a sedimentation tank is to use a hydraulic efficiency model that utilizes tracer studies to characterize the performance of model sedimentation tanks based on eddy diffusion. However, using hydraulic efficiency models to predict the dynamic behavior of a full-scale sedimentation tank is very difficult, as such models have been developed in controlled studies of model tanks. In this paper, another type of model, the artificial neural network modeling approach, is used to predict the dynamic response of a full-scale primary sedimentation tank. The neural model consists of two separate networks: one uses flow and influent total suspended solids data to predict the effluent total suspended solids from the tank, and the other predicts the effluent chemical oxygen demand using the flow and influent chemical oxygen demand as inputs. An extensive sampling program was conducted to collect a data set for training and validating the networks. A systematic model-building process allowed the identification of a parsimonious neural model that is able to learn (and not memorize) from past data and generalize very well to unseen data that were used to validate the model. The results seem very promising. The potential of using the model as part of a real-time process control system is also discussed.
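The kind of small feed-forward network used in this record can be illustrated with a minimal one-hidden-layer MLP regressor trained by batch gradient descent. This is a generic sketch, not the paper's model: the tanh hidden layer, layer sizes, and learning rate are all assumptions, and the real networks mapped flow and influent measurements to effluent TSS and COD.

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.1, epochs=3000, seed=0):
    """Minimal one-hidden-layer MLP regressor (tanh units), trained
    by full-batch gradient descent; returns a prediction function."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # hidden activations
        out = H @ W2 + b2
        err = out - y[:, None]            # prediction residual
        gW2 = H.T @ err / len(X); gb2 = err.mean(0)
        dH = (err @ W2.T) * (1 - H ** 2)  # backprop through tanh
        gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xn: (np.tanh(Xn @ W1 + b1) @ W2 + b2).ravel()
```

The study's two-network design would amount to training one such model per effluent quantity (TSS and COD).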

  3. Data-driven modeling, control and tools for cyber-physical energy systems

    NASA Astrophysics Data System (ADS)

    Behl, Madhur

    Energy systems are experiencing a gradual but substantial change in moving away from being non-interactive and manually-controlled systems to utilizing tight integration of both cyber (computation, communications, and control) and physical representations guided by first principles based models, at all scales and levels. Furthermore, peak power reduction programs like demand response (DR) are becoming increasingly important as the volatility on the grid continues to increase due to regulation, integration of renewables and extreme weather conditions. In order to shield themselves from the risk of price volatility, end-user electricity consumers must monitor electricity prices and be flexible in the ways they choose to use electricity. This requires the use of control-oriented predictive models of an energy system's dynamics and energy consumption. Such models are needed for understanding and improving the overall energy efficiency and operating costs. However, learning dynamical models using grey/white box approaches is very cost and time prohibitive since it often requires significant financial investments in retrofitting the system with several sensors and hiring domain experts for building the model. We present the use of data-driven methods for making model capture easy and efficient for cyber-physical energy systems. We develop Model-IQ, a methodology for analysis of uncertainty propagation for building inverse modeling and controls. Given a grey-box model structure and real input data from a temporary set of sensors, Model-IQ evaluates the effect of the uncertainty propagation from sensor data to model accuracy and to closed-loop control performance. We also developed a statistical method to quantify the bias in the sensor measurement and to determine near optimal sensor placement and density for accurate data collection for model training and control. 
Using a real building test-bed, we show how performing an uncertainty analysis can reveal trends about inverse model accuracy and control performance, which can be used to make informed decisions about sensor requirements and data accuracy. We also present DR-Advisor, a data-driven demand response recommender system for the building's facilities manager which provides suitable control actions to meet the desired load curtailment while maintaining operations and maximizing the economic reward. We develop a model-based control with regression trees algorithm (mbCRT), which allows us to perform closed-loop control for DR strategy synthesis for large commercial buildings. Our data-driven control synthesis algorithm outperforms rule-based demand response methods for a large DoE commercial reference building, leading to a significant load curtailment of 380 kW and over $45,000 in savings, which is 37.9% of the building's summer energy bill. The performance of DR-Advisor is also evaluated for 8 buildings on Penn's campus, where it achieves 92.8% to 98.9% prediction accuracy. We also compare DR-Advisor with other data-driven methods; it ranks 2nd on ASHRAE's benchmarking data set for energy prediction.

  4. Bullet trajectory predicts the need for damage control: an artificial neural network model.

    PubMed

    Hirshberg, Asher; Wall, Matthew J; Mattox, Kenneth L

    2002-05-01

    Effective use of damage control in trauma hinges on an early decision to use it. Bullet trajectory has never been studied as a marker for damage control. We hypothesize that this decision can be predicted by an artificial neural network (ANN) model based on the bullet trajectory and the patient's blood pressure. A multilayer perceptron ANN predictive model was developed from a data set of 312 patients with single abdominal gunshot injuries. Input variables were the bullet path, trajectory patterns, and admission systolic pressure. The output variable was either a damage control laparotomy or intraoperative death. The best performing ANN was implemented on prospectively collected data from 34 patients. The model achieved a correct classification rate of 0.96 and area under the receiver operating characteristic curve of 0.94. External validation showed the model to have a sensitivity of 88% and specificity of 96%. Model implementation on the prospectively collected data had a correct classification rate of 0.91. Sensitivity analysis showed that systolic pressure, bullet path across the midline, and trajectory involving the right upper quadrant were the three most important input variables. Bullet trajectory is an important, hitherto unrecognized, factor that should be incorporated into the decision to use damage control.

  5. Predictive assimilation framework to support contaminated site understanding and remediation

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Bianchi, M.; Hubbard, S. S.

    2014-12-01

    Subsurface system behavior at contaminated sites is driven and controlled by the interplay of physical, chemical, and biological processes occurring at multiple temporal and spatial scales. Effective remediation and monitoring planning requires an understanding of this complexity that is current, predictive (with some level of confidence) and actionable. We present and demonstrate a predictive assimilation framework (PAF). This framework automatically ingests, quality-controls and stores near real-time environmental data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of the subsurface system. PAF is implemented as a cloud-based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. Access to and interaction with PAF is through a standard browser. PAF is designed to be modular so that it can ingest and process different data streams depending on the site. We present an implementation of PAF that uses data from a highly instrumented site (the DOE Rifle Subsurface Biogeochemistry Field Observatory in Rifle, Colorado), for which PAF automatically ingests hydrological data and forward-models groundwater flow in the saturated zone.

  6. Aerodynamic Parameter Estimation for the X-43A (Hyper-X) from Flight Data

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Derry, Stephen D.; Smith, Mark S.

    2005-01-01

    Aerodynamic parameters were estimated based on flight data from the third flight of the X-43A hypersonic research vehicle, also called Hyper-X. Maneuvers were flown using multiple orthogonal phase-optimized sweep inputs applied as simultaneous control surface perturbations at Mach 8, 7, 6, 5, 4, and 3 during the vehicle descent. Aerodynamic parameters, consisting of non-dimensional longitudinal and lateral stability and control derivatives, were estimated from flight data at each Mach number. Multi-step inputs at nearly the same flight conditions were also flown to assess the prediction capability of the identified models. Prediction errors were found to be comparable in magnitude to the modeling errors, which indicates accurate modeling. Aerodynamic parameter estimates were plotted as a function of Mach number, and compared with estimates from the pre-flight aerodynamic database, which was based on wind-tunnel tests and computational fluid dynamics. Agreement between flight estimates and values computed from the aerodynamic database was excellent overall.

  7. Predicting clinical symptoms of attention deficit hyperactivity disorder based on temporal patterns between and within intrinsic connectivity networks.

    PubMed

    Wang, Xun-Heng; Jiao, Yun; Li, Lihua

    2017-10-24

    Attention deficit hyperactivity disorder (ADHD) is a common brain disorder with high prevalence in school-age children. Previously developed machine learning-based methods have discriminated patients with ADHD from normal controls by providing label information of the disease for individuals. Inattention and impulsivity are the two most significant clinical symptoms of ADHD. However, predicting clinical symptoms (i.e., inattention and impulsivity) is a challenging task based on neuroimaging data. The goal of this study is twofold: to build predictive models for clinical symptoms of ADHD based on resting-state fMRI and to mine brain networks for predictive patterns of inattention and impulsivity. To achieve this goal, a cohort of 74 boys with ADHD and a cohort of 69 age-matched normal controls were recruited from the ADHD-200 Consortium. Both structural and resting-state fMRI images were obtained for each participant. Temporal patterns between and within intrinsic connectivity networks (ICNs) were applied as raw features in the predictive models. Specifically, sample entropy was taken as an intra-ICN feature, and phase synchronization (PS) was used as an inter-ICN feature. The predictive models were based on the least absolute shrinkage and selection operator (LASSO) algorithm. The performance of the predictive model for inattention is r=0.79 (p<10^-8), and the performance of the predictive model for impulsivity is r=0.48 (p<10^-8). The ICN-related predictive patterns may provide valuable information for investigating the brain network mechanisms of ADHD. In summary, the predictive models for clinical symptoms could be beneficial for personalizing ADHD medications. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.
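The intra-ICN feature used in this record, sample entropy, measures the irregularity of a time series: it is the negative log of the conditional probability that sequences matching for m points (within tolerance r times the standard deviation) also match for m+1 points. A compact numpy sketch of a common variant (self-matches excluded; the paper's exact parameter choices are not reproduced):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D series: -log(A/B), where B counts
    template pairs matching for m points (Chebyshev distance within
    r * std) and A counts those still matching for m + 1 points."""
    x = np.asarray(x, float)
    tol = r * x.std()

    def match_count(mm):
        emb = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(emb[:, None] - emb[None]), axis=-1)
        return (d <= tol).sum() - len(emb)   # exclude self-matches

    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B)
```

A regular signal (e.g., a slow sinusoid) yields a low sample entropy, while white noise yields a high one — which is what makes it a useful irregularity feature for resting-state ICN time courses.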

  8. Generalized Predictive and Neural Generalized Predictive Control of Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul G.

    2000-01-01

    The research work presented in this thesis addresses the problem of robust control of uncertain linear and nonlinear systems using the Neural network-based Generalized Predictive Control (NGPC) methodology. A brief overview of predictive control and its comparison with Linear Quadratic (LQ) control is given to emphasize the advantages and drawbacks of predictive control methods. It is shown that the Generalized Predictive Control (GPC) methodology overcomes the drawbacks associated with traditional LQ control as well as conventional predictive control methods. It is shown that, in spite of the model-based nature of GPC, it has good robustness properties, being a special case of receding horizon control. The conditions for choosing tuning parameters for GPC to ensure closed-loop stability are derived. A neural network-based GPC architecture is proposed for the control of linear and nonlinear uncertain systems. A methodology to account for parametric uncertainty in the system is proposed using the on-line training capability of a multi-layer neural network. Several simulation examples and results from real-time experiments are given to demonstrate the effectiveness of the proposed methodology.

  9. Polygenic risk score in postmortem diagnosed sporadic early-onset Alzheimer's disease.

    PubMed

    Chaudhury, Sultan; Patel, Tulsi; Barber, Imelda S; Guetta-Baranes, Tamar; Brookes, Keeley J; Chappell, Sally; Turton, James; Guerreiro, Rita; Bras, Jose; Hernandez, Dena; Singleton, Andrew; Hardy, John; Mann, David; Morgan, Kevin

    2018-02-01

    Sporadic early-onset Alzheimer's disease (sEOAD) exhibits the symptoms of late-onset Alzheimer's disease but lacks the familial aspect of the early-onset familial form. The genetics of Alzheimer's disease (AD) identifies APOE ε4 as the greatest risk factor; however, AD is a complex disease involving both environmental risk factors and multiple genetic loci. Polygenic risk scores (PRSs) accumulate the total risk of a phenotype in an individual based on variants present in their genome. We determined whether sEOAD cases had a higher PRS compared to controls. A cohort of sEOAD cases was genotyped on the NeuroX array, and PRSs were generated using PRSice. The target data set consisted of 408 sEOAD cases and 436 controls. The base data set was collated by the International Genomics of Alzheimer's Project consortium, with association data from 17,008 late-onset Alzheimer's disease cases and 37,154 controls; because the two forms share a phenotype, these data can be used for identifying sEOAD cases. PRSs were generated using all single nucleotide polymorphisms common to the base and target data sets; PRSs were also generated using only single nucleotide polymorphisms within a 500 kb region surrounding the APOE gene. Sex and number of APOE ε2 or ε4 alleles were used as variables for logistic regression and combined with the PRS. The results show that the PRS is higher on average in sEOAD cases than in controls, although the distributions overlap across the cohort. The PRS generated with PRSice identified cases and controls with 72.9% accuracy, greater than the APOE locus alone (65.2%). Predictive ability was further improved with logistic regression, identifying cases and controls with 75.5% accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
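
    At its core, a PRS of the kind PRSice computes is a weighted sum of risk-allele dosages. A toy sketch follows; the per-SNP log odds ratio weights and genotypes are hypothetical, and real PRSice additionally performs clumping and p-value thresholding.

```python
import math

def polygenic_risk_score(genotype, weights):
    """Weighted sum of risk-allele dosages (0, 1, or 2 copies per SNP),
    the core quantity behind PRSice-style scores."""
    return sum(weights[snp] * dosage
               for snp, dosage in genotype.items() if snp in weights)

# Hypothetical per-SNP log odds ratio weights from a base GWAS.
weights = {"rs429358": math.log(3.0),
           "rs11136000": math.log(0.85),
           "rs3851179": math.log(0.88)}

case = {"rs429358": 2, "rs11136000": 0, "rs3851179": 1}      # toy genotypes
control = {"rs429358": 0, "rs11136000": 2, "rs3851179": 2}

score_case = polygenic_risk_score(case, weights)
score_control = polygenic_risk_score(control, weights)
```

    The study's observation that case and control score distributions overlap corresponds to individual scores like these being noisy even when the group means differ.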

  10. BCI Competition IV – Data Set I: Learning Discriminative Patterns for Self-Paced EEG-Based Motor Imagery Detection

    PubMed Central

    Zhang, Haihong; Guan, Cuntai; Ang, Kai Keng; Wang, Chuanchu

    2012-01-01

    Detecting motor imagery activities versus non-control in brain signals is the basis of self-paced brain-computer interfaces (BCIs), but also poses a considerable challenge to signal processing due to the complex and non-stationary characteristics of motor imagery as well as non-control. This paper presents a self-paced BCI based on a robust learning mechanism that extracts and selects spatio-spectral features for differentiating multiple EEG classes. It also employs a non-linear regression and post-processing technique for predicting the time-series of class labels from the spatio-spectral features. The method was validated in the BCI Competition IV on Dataset I where it produced the lowest prediction error of class labels continuously. This report also presents and discusses analysis of the method using the competition data set. PMID:22347153

  11. Identifying malaria vector breeding habitats with remote sensing data and terrain-based landscape indices in Zambia.

    PubMed

    Clennon, Julie A; Kamanga, Aniset; Musapa, Mulenga; Shiff, Clive; Glass, Gregory E

    2010-11-05

    Malaria, caused by the parasite Plasmodium falciparum, is a significant source of morbidity and mortality in southern Zambia. In the Mapanza Chiefdom, where transmission is seasonal, Anopheles arabiensis is the dominant malaria vector. The ability to predict larval habitats can help focus control measures. A survey was conducted in March-April 2007, at the end of the rainy season, to identify and map locations of water pooling and the occurrence of anopheline larval habitats; this was repeated in October 2007 at the end of the dry season and in March-April 2008 during the next rainy season. Logistic regression and generalized linear mixed modeling were applied to assess the predictive value of terrain-based landscape indices along with LandSat imagery to identify aquatic habitats and, especially, those with anopheline mosquito larvae. Approximately two hundred aquatic habitat sites were identified, with 69% positive for anopheline mosquitoes. Nine species of anopheline mosquitoes were identified, of which 19% were An. arabiensis. Terrain-based landscape indices combined with LandSat predicted sites with water, sites with anopheline mosquitoes, and sites specifically with An. arabiensis. These models were especially successful at ruling out potential locations, but had limited ability in predicting which anopheline species inhabited aquatic sites. Terrain indices derived from 90 meter Shuttle Radar Topography Mission (SRTM) digital elevation model (DEM) data were better at predicting water drainage patterns and characterizing the landscape than those derived from the 30 m Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) DEM. 
The low number of aquatic habitats available and the ability to locate the limited number of aquatic habitat locations for surveillance, especially those containing anopheline larvae, suggest that larval control may be a cost-effective control measure in the fight against malaria in Zambia and other regions with seasonal transmission. This work shows that, in areas of seasonal malaria transmission, incorporating terrain-based landscape models into the planning stages of vector control allows for the exclusion of significant portions of the landscape that would be unsuitable for water to accumulate and for occupation by mosquito larvae. With the increasing free availability of satellite imagery such as SRTM and LandSat, the development of satellite imagery-based prediction models is becoming more accessible to vector management coordinators.

  12. Polygenic risk score analysis of pathologically confirmed Alzheimer disease.

    PubMed

    Escott-Price, Valentina; Myers, Amanda J; Huentelman, Matt; Hardy, John

    2017-08-01

    Previous estimates of the utility of polygenic risk score analysis for the prediction of Alzheimer disease have given area under the curve (AUC) estimates of <80%. However, these have been based on the genetic analysis of clinical case-control series. Here, we apply the same analytic approaches to a pathological case-control series and show a predictive AUC of 84%. We suggest that this analysis has clinical utility and that there is limited room for further improvement using genetic data. Ann Neurol 2017;82:311-314. © 2017 American Neurological Association.
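
    The AUC figures quoted above have a simple rank interpretation: the probability that a randomly chosen case receives a higher score than a randomly chosen control. A minimal empirical-AUC sketch with toy scores:

```python
def auc(case_scores, control_scores):
    """Empirical AUC: probability that a randomly chosen case scores
    higher than a randomly chosen control (ties count one half)."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            wins += 1.0 if c > k else 0.5 if c == k else 0.0
    return wins / (len(case_scores) * len(control_scores))

perfect = auc([2.0, 3.0], [0.0, 1.0])    # complete separation
chance = auc([1.0], [1.0])               # indistinguishable scores
partial = auc([0.9, 0.4], [0.8, 0.1])    # some overlap
```

    On this reading, the reported 84% means a pathologically confirmed case outscores a control about five times out of six.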

  13. Intelligent processing for thick composites

    NASA Astrophysics Data System (ADS)

    Shin, Daniel Dong-Ok

    2000-10-01

    Manufacturing thick composite parts is associated with adverse curing conditions such as large in-plane temperature gradients and exotherms. The problem is further aggravated because the manufacturer's cure cycle and existing cure control systems do not adequately counter such effects. In response, a forecast-based thermal control system was developed to provide better cure control for thick composites. An accurate cure kinetics model is crucial for correctly identifying the amount of heat generated in composite process simulation. A new technique for identifying cure parameters for Hercules AS4/3502 prepreg is presented, based on normalizing the DSC data. The proposed method uses an autocatalytic cure kinetics model, whose parameters are determined from dynamic and isothermal DSC data. Existing models were also used to determine kinetic parameters but proved inadequate because of the material's temperature-dependent final degree of cure. The model predictions from the new technique showed good agreement with both isothermal and dynamic DSC data. The final degree of cure was also in good agreement with experimental data. A realistic cure simulation model including bleeder ply analysis and compaction was validated with Hercules AS4/3501-6 based laminates. The nonsymmetrical temperature distribution resulting from the presence of bleeder plies agreed well with the model prediction. Some of the discrepancies in the predicted compaction behavior were attributed to inaccurate viscosity and permeability models. The temperature prediction was quite good for the 3 cm laminate. The validated process simulation model, along with the cure kinetics model for AS4/3502 prepreg, was integrated into the thermal control system. The 3 cm Hercules AS4/3501-6 and AS4/3502 laminates were fabricated. The resulting cure cycles satisfied all imposed requirements by minimizing exotherms and temperature gradients. 
Although the duration of the cure cycles increased, this was inevitable since longer times were required to maintain an acceptable temperature gradient. The derived cure cycles were slightly different from those anticipated by the offline simulation. Nevertheless, the system adapted to unanticipated events to satisfy the cure requirements.
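
    An autocatalytic cure kinetics model of the kind described above can be sketched as a one-line ODE integrated in time. All parameter values below are illustrative placeholders, not the fitted AS4/3502 constants.

```python
import math

def cure_degree(T, t_end, dt=1.0, A=1e5, E=60e3, m=0.5, n=1.5, alpha0=0.01):
    """Forward-Euler integration of an autocatalytic cure model
    d(alpha)/dt = k(T) * alpha**m * (1 - alpha)**n with an Arrhenius
    rate k(T) = A * exp(-E / (R * T)) at constant temperature T (K).
    Parameter values are illustrative, not the fitted AS4/3502 set."""
    R = 8.314                      # J/(mol K)
    k = A * math.exp(-E / (R * T))
    alpha, t = alpha0, 0.0
    while t < t_end:
        alpha = min(1.0, alpha + dt * k * alpha ** m * (1.0 - alpha) ** n)
        t += dt
    return alpha

a_hot = cure_degree(450.0, 3600.0)    # one-hour hold, hotter cure
a_cool = cure_degree(400.0, 3600.0)   # same hold, cooler cure
```

    The Arrhenius rate is what couples temperature control to heat generation: a hotter hold advances the degree of cure (and hence exothermic heat release) much faster, which is exactly what the forecast-based controller must anticipate.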

  14. Prediction of Traffic Complexity and Controller Workload in Mixed Equipage NextGen Environments

    NASA Technical Reports Server (NTRS)

    Lee, Paul U.; Prevot, Thomas

    2012-01-01

    Controller workload is a key factor in limiting en route air traffic capacity. Past efforts to quantify and predict workload have resulted in identifying objective metrics that correlate well with subjective workload ratings during current air traffic control operations. Although these metrics provide a reasonable statistical fit to existing data, they do not provide a good mechanism for estimating controller workload for future air traffic concepts and environments that make different assumptions about automation, enabling technologies, and controller tasks. One such future environment is characterized by en route airspace with a mixture of aircraft equipped with and without Data Communications (Data Comm). In this environment, aircraft with Data Comm will impact controller workload less than aircraft requiring voice communication, altering the close correlation between aircraft count and controller workload that exists in current air traffic operations. This paper outlines a new trajectory-based complexity (TBX) calculation that was presented to controllers during a human-in-the-loop simulation. The results showed that TBX accurately estimated the workload in a mixed Data Comm equipage environment and the resulting complexity values were understood and readily interpreted by the controllers. The complexity was represented as a "modified aircraft count" that weighted different complexity factors and summed them in such a way that the controllers could effectively treat them as aircraft count. The factors were also relatively easy to tune without an extensive data set. The results showed that the TBX approach is well suited for presenting traffic complexity in future air traffic environments.
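
    The "modified aircraft count" idea, weighting complexity factors and summing them into aircraft-count units, can be sketched as follows. The factor names and all weights are hypothetical, not the tuned TBX values.

```python
def tbx_complexity(n_voice, n_datacomm, factors, weights=None):
    """Trajectory-based complexity expressed as a 'modified aircraft
    count': each Data Comm aircraft counts for less than a voice
    aircraft, and weighted complexity factors are added on top so the
    total still reads in aircraft-count units. All weights here are
    hypothetical, not the tuned TBX values."""
    if weights is None:
        weights = {"datacomm_discount": 0.4,   # Data Comm workload credit
                   "conflict": 1.5,            # predicted conflicts
                   "transition": 0.5}          # climbing/descending aircraft
    base = n_voice + (1.0 - weights["datacomm_discount"]) * n_datacomm
    extra = sum(weights.get(name, 0.0) * value
                for name, value in factors.items())
    return base + extra

c = tbx_complexity(n_voice=10, n_datacomm=5,
                   factors={"conflict": 2, "transition": 4})
```

    Because the output stays in aircraft-count units, a controller can compare it directly against familiar sector-count thresholds, which is the usability property the study emphasizes.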

  15. Prediction of L70 lumen maintenance and chromaticity for LEDs using extended Kalman filter models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lall, Pradeep; Wei, Junchao; Davis, Lynn

    2013-09-30

    Solid-state lighting (SSL) luminaires containing light emitting diodes (LEDs) have the potential of seeing excessive temperatures when being transported across country or being stored in non-climate controlled warehouses. They are also being used in outdoor applications in desert environments that see little or no humidity but will experience extremely high temperatures during the day. This makes it important to increase our understanding of what effects high temperature exposure for a prolonged period of time will have on the usability and survivability of these devices. Traditional light sources “burn out” at end-of-life. For an incandescent bulb, the lamp life is defined by B50 life. However, the LEDs have no filament to “burn”. The LEDs continually degrade and the light output decreases eventually below useful levels, causing failure. Presently, the TM-21 test standard is used to predict the L70 life of LEDs from LM-80 test data. Several failure mechanisms may be active in an LED at a single time, causing lumen depreciation. The underlying TM-21 model may not capture the failure physics in the presence of multiple failure mechanisms. Correlation of lumen maintenance with the underlying physics of degradation at the system level is needed. In this paper, Kalman Filter (KF) and Extended Kalman Filter (EKF) models have been used to develop a 70-percent lumen maintenance life prediction model for LEDs used in SSL luminaires. Ten-thousand hour LM-80 test data for various LEDs have been used for model development. The system state at each future time has been computed based on the state space at the preceding time step, system dynamics matrix, control vector, control matrix, measurement matrix, measured vector, process noise, and measurement noise. The future state of the lumen depreciation has been estimated based on a second-order Kalman Filter model and a Bayesian framework. The measured state variable has been related to the underlying damage using physics-based models. 
Life prediction of L70 life for the LEDs used in SSL luminaires from the KF and EKF based models has been compared with the TM-21 model predictions and experimental data.
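
    A second-order (two-state) Kalman filter over lumen maintenance readings, as described above, can be sketched as follows. The noise parameters, the synthetic data, and the linear extrapolation to the 70% crossing are illustrative; the paper relates the filtered state to physics-based damage models, and TM-21 fits an exponential instead.

```python
def kalman_lumen(measurements, dt=1000.0, q=1e-8, r=1e-4):
    """Two-state Kalman filter over lumen maintenance readings taken
    every dt hours. State = [lumen level, degradation rate per hour];
    the process model assumes a locally constant rate. q and r are
    illustrative process/measurement noise values."""
    x = [measurements[0], 0.0]
    P = [[1.0, 0.0], [0.0, 1.0]]
    for z in measurements[1:]:
        # Predict: level advances by dt * rate; covariance by F P F^T + Q.
        x = [x[0] + dt * x[1], x[1]]
        P00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
        P01 = P[0][1] + dt * P[1][1]
        P10 = P[1][0] + dt * P[1][1]
        P11 = P[1][1] + q
        # Update with a measurement of the level only (H = [1, 0]).
        S = P00 + r
        K0, K1 = P00 / S, P10 / S
        innov = z - x[0]
        x = [x[0] + K0 * innov, x[1] + K1 * innov]
        P = [[(1 - K0) * P00, (1 - K0) * P01],
             [P10 - K1 * P00, P11 - K1 * P01]]
    return x

def l70_life(state, t_now):
    """Linear extrapolation of the filtered state to the 70% lumen
    maintenance crossing (TM-21 instead fits an exponential)."""
    level, rate = state
    return float("inf") if rate >= 0 else t_now + (0.70 - level) / rate

# Noiseless synthetic LM-80-style data: 2e-5 per hour linear decay.
meas = [1.0 - 0.02 * i for i in range(11)]      # readings every 1000 h
state = kalman_lumen(meas)
est_l70 = l70_life(state, t_now=10000.0)
```

    The filter's value over curve fitting is that each new LM-80 reading updates both the level and the degradation-rate estimate, so the L70 projection tightens as data accumulate.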

  16. Prediction of Lumen Output and Chromaticity Shift in LEDs Using Kalman Filter and Extended Kalman Filter Based Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lall, Pradeep; Wei, Junchao; Davis, J Lynn

    2014-06-24

    Solid-state lighting (SSL) luminaires containing light emitting diodes (LEDs) have the potential of seeing excessive temperatures when being transported across country or being stored in non-climate controlled warehouses. They are also being used in outdoor applications in desert environments that see little or no humidity but will experience extremely high temperatures during the day. This makes it important to increase our understanding of what effects high temperature exposure for a prolonged period of time will have on the usability and survivability of these devices. Traditional light sources “burn out” at end-of-life. For an incandescent bulb, the lamp life is defined by B50 life. However, the LEDs have no filament to “burn”. The LEDs continually degrade and the light output decreases eventually below useful levels, causing failure. Presently, the TM-21 test standard is used to predict the L70 life of LEDs from LM-80 test data. Several failure mechanisms may be active in an LED at a single time, causing lumen depreciation. The underlying TM-21 model may not capture the failure physics in the presence of multiple failure mechanisms. Correlation of lumen maintenance with the underlying physics of degradation at the system level is needed. In this paper, Kalman Filter (KF) and Extended Kalman Filter (EKF) models have been used to develop a 70-percent lumen maintenance life prediction model for LEDs used in SSL luminaires. Ten-thousand hour LM-80 test data for various LEDs have been used for model development. The system state at each future time has been computed based on the state space at the preceding time step, system dynamics matrix, control vector, control matrix, measurement matrix, measured vector, process noise, and measurement noise. The future state of the lumen depreciation has been estimated based on a second-order Kalman Filter model and a Bayesian framework. 
Life prediction of L70 life for the LEDs used in SSL luminaires from the KF and EKF based models has been compared with the TM-21 model predictions and experimental data.

  17. Development and Validation of a Predictive Model to Identify Individuals Likely to Have Undiagnosed Chronic Obstructive Pulmonary Disease Using an Administrative Claims Database.

    PubMed

    Moretz, Chad; Zhou, Yunping; Dhamane, Amol D; Burslem, Kate; Saverno, Kim; Jain, Gagan; Devercelli, Giovanna; Kaila, Shuchita; Ellis, Jeffrey J; Hernandez, Gemzel; Renda, Andrew

    2015-12-01

    Despite the importance of early detection, delayed diagnosis of chronic obstructive pulmonary disease (COPD) is relatively common. Approximately 12 million people in the United States have undiagnosed COPD. Diagnosis of COPD is essential for the timely implementation of interventions, such as smoking cessation programs, drug therapies, and pulmonary rehabilitation, which are aimed at improving outcomes and slowing disease progression. To develop and validate a predictive model to identify patients likely to have undiagnosed COPD using administrative claims data. A predictive model was developed and validated utilizing a retrospective cohort of patients with and without a COPD diagnosis (cases and controls), aged 40-89 years, with a minimum of 24 months of continuous health plan enrollment (Medicare Advantage Prescription Drug [MAPD] and commercial plans), identified between January 1, 2009, and December 31, 2012, using Humana's claims database. Stratified random sampling based on plan type (commercial or MAPD) and index year was performed to ensure that cases and controls had a similar distribution of these variables. Cases and controls were compared to identify demographic, clinical, and health care resource utilization (HCRU) characteristics associated with a COPD diagnosis. Stepwise logistic regression (SLR), neural networking, and decision trees were used to develop a series of models. The models were trained, validated, and tested on randomly partitioned subsets of the sample (Training, Validation, and Test data subsets). Measures used to evaluate and compare the models included the area under the receiver operating characteristic (ROC) curve (AUC), sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV). The optimal model was selected based on the AUC for the Test data subset. A total of 50,880 cases and 50,880 controls were included, with MAPD patients comprising 92% of the study population. 
Compared with controls, cases had a statistically significantly higher comorbidity burden and HCRU (including hospitalizations, emergency room visits, and medical procedures). The optimal predictive model was generated using SLR, which included 34 variables that were statistically significantly associated with a COPD diagnosis. After adjusting for covariates, anticholinergic bronchodilators (OR = 3.336) and tobacco cessation counseling (OR = 2.871) were found to have a large influence on the model. The final predictive model had an AUC of 0.754, sensitivity of 60%, specificity of 78%, PPV of 73%, and an NPV of 66%. This claims-based predictive model provides an acceptable level of accuracy in identifying patients likely to have undiagnosed COPD in a large national health plan. Identification of patients with undiagnosed COPD may enable timely management and lead to improved health outcomes and reduced COPD-related health care expenditures.
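
    The reported sensitivity (60%), specificity (78%), PPV (73%), and NPV (66%) are mutually consistent for a balanced case/control sample, which can be checked directly from the confusion-matrix definitions:

```python
def classification_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, and NPV from confusion-matrix
    counts, the measures used to compare the candidate models."""
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn)}

# Per 100 cases and 100 controls (the study sample was balanced),
# sensitivity 60% and specificity 78% give tp=60, fn=40, tn=78, fp=22.
m = classification_metrics(tp=60, fp=22, tn=78, fn=40)
```

    Note that PPV and NPV depend on the case/control ratio: applied to a population where undiagnosed COPD is rarer than 50%, the same model would have a lower PPV.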

  18. Predictions of heading date in bread wheat (Triticum aestivum L.) using QTL-based parameters of an ecophysiological model

    PubMed Central

    Bogard, Matthieu; Ravel, Catherine; Paux, Etienne; Bordes, Jacques; Balfourier, François; Chapman, Scott C.; Le Gouis, Jacques; Allard, Vincent

    2014-01-01

    Prediction of wheat phenology facilitates the selection of cultivars with specific adaptations to a particular environment. However, while QTL analysis for heading date can identify major genes controlling phenology, the results are limited to the environments and genotypes tested. Moreover, while ecophysiological models allow accurate predictions in new environments, they may require substantial phenotypic data to parameterize each genotype. Also, the model parameters are rarely related to all underlying genes, and all the possible allelic combinations that could be obtained by breeding cannot be tested with models. In this study, a QTL-based model is proposed to predict heading date in bread wheat (Triticum aestivum L.). Two parameters of an ecophysiological model (Vsat and Pbase, representing genotype vernalization requirements and photoperiod sensitivity, respectively) were optimized for 210 genotypes grown in 10 contrasting location × sowing date combinations. Multiple linear regression models predicting Vsat and Pbase with 11 and 12 associated genetic markers accounted for 71% and 68% of the variance of these parameters, respectively. QTL-based Vsat and Pbase estimates were able to predict heading date of an independent validation data set (88 genotypes in six location × sowing date combinations) with a root mean square error of prediction of 5 to 8.6 days, explaining 48 to 63% of the variation in heading date. The QTL-based model proposed in this study may be used for agronomic purposes and to assist breeders in suggesting locally adapted ideotypes for wheat phenology. PMID:25148833

  19. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    PubMed Central

    Mathew, Cherian; Obst, Matthias; Vicario, Saverio; Haines, Robert; Williams, Alan R.; de Jong, Yde; Goble, Carole

    2014-01-01

    Abstract The compilation and cleaning of data needed for analyses and prediction of species distributions is a time consuming process requiring a solid understanding of data formats and service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of underlying service infrastructures. The workflow can be freely used both locally and through a web-portal which does not require additional software installations by users. PMID:25535486

  20. Demand response-enabled model predictive HVAC load control in buildings using real-time electricity pricing

    NASA Astrophysics Data System (ADS)

    Avci, Mesut

    A practical cost and energy efficient model predictive control (MPC) strategy is proposed for HVAC load control under dynamic real-time electricity pricing. The MPC strategy is built based on a proposed model that jointly minimizes the total energy consumption and hence, cost of electricity for the user, and the deviation of the inside temperature from the consumer's preference. An algorithm that assigns temperature set-points (reference temperatures) to price ranges based on the consumer's discomfort tolerance index is developed. A practical parameter prediction model is also designed for mapping between the HVAC load and the inside temperature. The prediction model and the produced temperature set-points are integrated as inputs into the MPC controller, which is then used to generate signal actions for the AC unit. To investigate and demonstrate the effectiveness of the proposed approach, a simulation based experimental analysis is presented using real-life pricing data. An actual prototype for the proposed HVAC load control strategy is then built and a series of prototype experiments are conducted similar to the simulation studies. The experiments reveal that the MPC strategy can lead to significant reductions in overall energy consumption and cost savings for the consumer. Results suggest that by providing an efficient response strategy for the consumers, the proposed MPC strategy can enable the utility providers to adopt efficient demand management policies using real-time pricing. Finally, a cost-benefit analysis is performed to display the economic feasibility of implementing such a controller as part of a building energy management system, and the payback period is identified considering cost of prototype build and cost savings to help the adoption of this controller in the building HVAC control industry.
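
    The receding-horizon idea behind the proposed HVAC MPC can be sketched with a brute-force search over on/off plans for a toy first-order thermal model. All model constants, prices, and set-points below are illustrative, not identified from a real building.

```python
import itertools

def mpc_hvac_step(T_now, T_out, prices, T_ref, horizon=4,
                  a=0.9, b=-2.0, c=0.1, beta=0.5):
    """One receding-horizon step for an on/off AC unit.
    Thermal model: T[k+1] = a*T[k] + b*u[k] + c*T_out (b < 0: cooling).
    Stage cost: electricity price * u + beta * squared tracking error.
    All constants are illustrative, not identified from a building."""
    best_u, best_cost = 0, float("inf")
    for plan in itertools.product([0, 1], repeat=horizon):
        T, cost = T_now, 0.0
        for k, u in enumerate(plan):
            T = a * T + b * u + c * T_out
            cost += prices[k] * u + beta * (T - T_ref[k]) ** 2
        if cost < best_cost:
            best_cost, best_u = cost, plan[0]
    return best_u   # apply the first move only, then re-solve next step

u_cheap = mpc_hvac_step(30.0, 35.0, prices=[1.0] * 4, T_ref=[24.0] * 4)
u_pricey = mpc_hvac_step(30.0, 35.0, prices=[100.0] * 4, T_ref=[24.0] * 4)
```

    The demand-response behavior falls out of the cost trade-off: with cheap electricity the optimizer cools toward the set-point, while a real-time price spike makes it tolerate comfort deviation instead.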

  1. Machine learning classification with confidence: application of transductive conformal predictors to MRI-based diagnostic and prognostic markers in depression.

    PubMed

    Nouretdinov, Ilia; Costafreda, Sergi G; Gammerman, Alexander; Chervonenkis, Alexey; Vovk, Vladimir; Vapnik, Vladimir; Fu, Cynthia H Y

    2011-05-15

    There is rapidly accumulating evidence that the application of machine learning classification to neuroimaging measurements may be valuable for the development of diagnostic and prognostic prediction tools in psychiatry. However, current methods do not produce a measure of the reliability of the predictions. Knowing the risk of the error associated with a given prediction is essential for the development of neuroimaging-based clinical tools. We propose a general probabilistic classification method to produce measures of confidence for magnetic resonance imaging (MRI) data. We describe the application of transductive conformal predictor (TCP) to MRI images. TCP generates the most likely prediction and a valid measure of confidence, as well as the set of all possible predictions for a given confidence level. We present the theoretical motivation for TCP, and we have applied TCP to structural and functional MRI data in patients and healthy controls to investigate diagnostic and prognostic prediction in depression. We verify that TCP predictions are as accurate as those obtained with more standard machine learning methods, such as support vector machine, while providing the additional benefit of a valid measure of confidence for each prediction. Copyright © 2010 Elsevier Inc. All rights reserved.
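
    The TCP machinery rests on conformal p-values: a label is kept in the prediction region whenever its nonconformity p-value exceeds the significance level. A minimal sketch with made-up nonconformity scores (a full TCP would recompute scores transductively for each tentative label):

```python
def conformal_p_value(calibration_scores, test_score):
    """Fraction of calibration nonconformity scores at least as large
    as the test score (the test example counts itself)."""
    ge = sum(1 for s in calibration_scores if s >= test_score) + 1
    return ge / (len(calibration_scores) + 1)

def tcp_predict(calibration, test_scores, confidence):
    """Conformal prediction region: keep every label whose p-value
    exceeds the significance level 1 - confidence. `calibration` maps
    label -> nonconformity scores of training examples of that label;
    `test_scores` maps label -> nonconformity of the test example
    when tentatively assigned that label."""
    eps = 1.0 - confidence
    return [label for label, score in test_scores.items()
            if conformal_p_value(calibration[label], score) > eps]

# Made-up nonconformity scores (lower = more typical of the label).
cal = {"patient": [0.10, 0.20, 0.15, 0.30],
       "control": [0.50, 0.60, 0.40, 0.70]}
scores = {"patient": 0.12, "control": 0.90}

narrow = tcp_predict(cal, scores, confidence=0.75)   # single confident label
wide = tcp_predict(cal, scores, confidence=0.95)     # hedges to both labels
```

    This illustrates the validity property the abstract highlights: demanding higher confidence can only enlarge the prediction set, never silently increase the error rate.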

  2. On the comparison of stochastic model predictive control strategies applied to a hydrogen-based microgrid

    NASA Astrophysics Data System (ADS)

    Velarde, P.; Valverde, L.; Maestre, J. M.; Ocampo-Martinez, C.; Bordons, C.

    2017-03-01

    In this paper, a performance comparison among three well-known stochastic model predictive control approaches, namely multi-scenario, tree-based, and chance-constrained model predictive control, is presented. To this end, three predictive controllers have been designed and implemented in a real renewable-hydrogen-based microgrid. The experimental set-up includes a PEM electrolyzer, lead-acid batteries, and a PEM fuel cell as the main equipment. The experimental results show significant differences in the behavior of the plant components, mainly in terms of energy use, for each implemented technique. The effectiveness, performance, advantages, and disadvantages of these techniques are extensively discussed and analyzed to give valid criteria for selecting an appropriate stochastic predictive controller.

  3. Analyses of the most influential factors for vibration monitoring of planetary power transmissions in pellet mills by adaptive neuro-fuzzy technique

    NASA Astrophysics Data System (ADS)

    Milovančević, Miloš; Nikolić, Vlastimir; Anđelković, Boban

    2017-01-01

    Vibration-based structural health monitoring is widely recognized as an attractive strategy for early damage detection in civil structures. Vibration monitoring and prediction are important for any system, since they can avert many unpredictable system behaviors. If vibration monitoring is properly managed, it can ensure economic and safe operations. Potential for further improvement of vibration monitoring lies in the improvement of current control strategies. One option is the introduction of model predictive control. Multistep-ahead predictive models of vibration are a starting point for creating a successful model predictive strategy. For the purpose of this article, predictive models are created for vibration monitoring of planetary power transmissions in pellet mills. The models were developed using a novel method based on ANFIS (adaptive neuro-fuzzy inference system). The aim of this study is to investigate the potential of ANFIS for selecting the most relevant variables for predictive models of vibration monitoring of pellet mill power transmissions. The vibration data are collected by PIC (Programmable Interface Controller) microcontrollers. The goal of the predictive vibration monitoring of planetary power transmissions in pellet mills is to indicate deterioration in the vibration of the power transmissions before an actual failure occurs. The ANFIS process for variable selection was implemented in order to detect the predominant variables affecting the prediction of vibration monitoring. It was also used to select the minimal input subset of variables from the initial set of input variables - current and lagged variables (up to 11 steps) of vibration. The obtained results could be used to simplify predictive methods so as to avoid multiple input variables. Models with fewer inputs were preferred to limit overfitting between training and testing data. 
While the obtained results are promising, further work is required in order to get results that could be directly applied in practice.

  4. Rate-Based Model Predictive Control of Turbofan Engine Clearance

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan A.

    2006-01-01

    An innovative model predictive control strategy is developed for control of nonlinear aircraft propulsion systems and sub-systems. At the heart of the controller is a rate-based linear parameter-varying model that propagates the state derivatives across the prediction horizon, extending prediction fidelity to transient regimes where conventional models begin to lose validity. The new control law is applied to a demanding active clearance control application, where the objectives are to tightly regulate blade tip clearances and also anticipate and avoid detrimental blade-shroud rub occurrences by optimally maintaining a predefined minimum clearance. Simulation results verify that the rate-based controller is capable of satisfying the objectives during realistic flight scenarios where both a conventional Jacobian-based model predictive control law and an unconstrained linear-quadratic optimal controller are incapable of doing so. The controller is evaluated using a variety of different actuators, illustrating the efficacy and versatility of the control approach. It is concluded that the new strategy has promise for this and other nonlinear aerospace applications that place high importance on the attainment of control objectives during transient regimes.

  5. Further validation of artificial neural network-based emissions simulation models for conventional and hybrid electric vehicles.

    PubMed

    Tóth-Nagy, Csaba; Conley, John J; Jarrett, Ronald P; Clark, Nigel N

    2006-07-01

    With the advent of hybrid electric vehicles, computer-based vehicle simulation becomes more useful to the engineer and designer trying to optimize the complex combination of control strategy, power plant, drive train, vehicle, and driving conditions. With the desire to incorporate emissions as a design criterion, researchers at West Virginia University have developed artificial neural network (ANN) models for predicting emissions from heavy-duty vehicles. The ANN models were trained on engine and exhaust emissions data collected from transient dynamometer tests of heavy-duty diesel engines, and then used to predict emissions based on engine speed and torque data from simulated operation of a tractor truck and a hybrid electric bus. Simulated vehicle operation was performed with the ADVISOR software package. Predicted emissions (carbon dioxide [CO2] and oxides of nitrogen [NO(x)]) were then compared with actual emissions data collected from chassis dynamometer tests of similar vehicles. This paper expands on previous research to include different driving cycles for the hybrid electric bus and varying weights of the conventional truck. Results showed that different hybrid control strategies had a significant effect on engine behavior (and, thus, emissions) and may affect emissions during different driving cycles. The ANN models underpredicted emissions of CO2 and NO(x) in the case of a class-8 truck but were more accurate as the truck weight increased.

  6. A wavelet-based technique to predict treatment outcome for Major Depressive Disorder

    PubMed Central

    Xia, Likun; Mohd Yasin, Mohd Azhar; Azhar Ali, Syed Saad

    2017-01-01

    Treatment management for Major Depressive Disorder (MDD) has been challenging. However, electroencephalogram (EEG)-based prediction of antidepressant treatment outcome may help during antidepressant selection and ultimately improve the quality of life for MDD patients. In this study, a machine learning (ML) method involving pretreatment EEG data was proposed to perform such predictions for Selective Serotonin Reuptake Inhibitors (SSRIs). For this purpose, the acquisition of experimental data involved 34 MDD patients and 30 healthy controls. A feature matrix was constructed involving time-frequency decomposition of EEG data based on wavelet transform (WT) analysis, termed the EEG data matrix. Because the resultant EEG data matrix had high dimensionality, dimension reduction was performed using a rank-based feature selection method according to a criterion, i.e., the receiver operating characteristic (ROC). As a result, the most significant features were identified and further utilized during the training and testing of a classification model, i.e., a logistic regression (LR) classifier. Finally, the LR model was validated with 100 iterations of 10-fold cross-validation (10-CV). The classification results were compared with short-time Fourier transform (STFT) analysis and empirical mode decomposition (EMD). The wavelet features extracted from frontal and temporal EEG data were found to be statistically significant. In comparison with other time-frequency approaches such as STFT and EMD, the WT analysis showed the highest classification performance, i.e., accuracy = 87.5%, sensitivity = 95%, and specificity = 80%. In conclusion, significant wavelet coefficients extracted from frontal and temporal pre-treatment EEG data involving the delta and theta frequency bands may predict antidepressant treatment outcome for MDD patients. PMID:28152063
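
    The rank-based selection step lends itself to a small sketch: score each candidate feature by its ROC area (AUC) for separating the two groups and keep the top-ranked ones. The feature names and values below are invented, and the Mann-Whitney formulation of AUC stands in for whatever ROC criterion the authors used:

    ```python
    # Hypothetical ROC-based feature ranking: AUC via the Mann-Whitney statistic.
    def auc(pos, neg):
        """Probability that a positive-group value exceeds a negative-group value."""
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # two invented wavelet features measured on the two groups
    features = {
        "theta_wavelet_power": ([2.1, 2.4, 2.2, 2.8], [1.0, 1.3, 1.1, 1.5]),
        "alpha_wavelet_power": ([1.0, 1.6, 1.2, 1.1], [1.1, 1.0, 1.4, 1.3]),
    }
    ranked = sorted(features, key=lambda f: auc(*features[f]), reverse=True)
    print("top feature:", ranked[0])
    ```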

  7. Neural network-based nonlinear model predictive control vs. linear quadratic gaussian control

    USGS Publications Warehouse

    Cho, C.; Vance, R.; Mardi, N.; Qian, Z.; Prisbrey, K.

    1997-01-01

    One problem with the application of neural networks to the multivariable control of mineral and extractive processes is determining whether and how to use them. The objective of this investigation was to compare neural network control to more conventional strategies and to determine if there are any advantages in using neural network control in terms of set-point tracking, rise time, settling time, disturbance rejection and other criteria. The procedure involved developing neural network controllers using both historical plant data and simulation models. Various control patterns were tried, including both inverse and direct neural network plant models. These were compared to state space controllers that are, by nature, linear. For grinding and leaching circuits, a nonlinear neural network-based model predictive control strategy was superior to a state space-based linear quadratic gaussian controller. The investigation pointed out the importance of incorporating state space into neural networks by making them recurrent, i.e., feeding certain output state variables into input nodes in the neural network. It was concluded that neural network controllers can have better disturbance rejection, set-point tracking, rise time, settling time and lower set-point overshoot, and it was also concluded that neural network controllers can be more reliable and easy to implement in complex, multivariable plants.

  8. Development of an improved system for contract time determination : phase III.

    DOT National Transportation Integrated Search

    2010-09-30

    This study developed Daily Work Report (DWR) based prediction models to determine reasonable production rates of controlling activities of highway projects. The study used available resources such as DWR, soil data, AADT and other existing projec...

  9. Validation of a mapping and prediction model for human fasciolosis transmission in Andean very high altitude endemic areas using remote sensing data.

    PubMed

    Fuentes, M V; Malone, J B; Mas-Coma, S

    2001-04-27

    The present paper aims to validate the usefulness of the Normalized Difference Vegetation Index (NDVI) obtained by satellite remote sensing for the development of local maps of risk and for prediction of human fasciolosis in the Northern Bolivian Altiplano. The endemic area, which is located at very high altitudes (3800-4100 m) between Lake Titicaca and the valley of the city of La Paz, presents the highest prevalences and intensities of fasciolosis known in humans. NDVI images of 1.1 km resolution from the Advanced Very High Resolution Radiometer (AVHRR) sensor on board the National Oceanic and Atmospheric Administration (NOAA) series of environmental satellites appear to provide adequate information for a study area such as that of the Northern Bolivian Altiplano. The predictive value of the remotely sensed map based on NDVI data appears to be better than that from forecast indices based only on climatic data. A close correspondence was observed between real ranges of human fasciolosis prevalence at 13 localities of known prevalence rates and the predicted ranges of fasciolosis prevalence using NDVI maps. However, results based on NDVI map data predicted zones as risk areas where, in fact, field studies have demonstrated the absence of lymnaeid populations during snail surveys, corroborated by the absence of the parasite in humans and livestock. NDVI data maps represent a useful data component in long-term efforts to develop a comprehensive geographical information system control program model that accurately fits real epidemiological and transmission situations of human fasciolosis in high altitude endemic areas in Andean countries.

  10. Life Extending Control. [mechanical fatigue in reusable rocket engines

    NASA Technical Reports Server (NTRS)

    Lorenzo, Carl F.; Merrill, Walter C.

    1991-01-01

    The concept of Life Extending Control is defined. Life is defined in terms of mechanical fatigue life. A brief description is given of the current approach to life prediction using a local, cyclic, stress-strain approach for a critical system component. An alternative approach to life prediction based on a continuous functional relationship to component performance is proposed. Based on cyclic life prediction, an approach to life extending control, called the Life Management Approach, is proposed. A second approach, also based on cyclic life prediction, called the implicit approach, is presented. Assuming the existence of the alternative functional life prediction approach, two additional concepts for Life Extending Control are presented.

  11. Life extending control: A concept paper

    NASA Technical Reports Server (NTRS)

    Lorenzo, Carl F.; Merrill, Walter C.

    1991-01-01

    The concept of Life Extending Control is defined. Life is defined in terms of mechanical fatigue life. A brief description is given of the current approach to life prediction using a local, cyclic, stress-strain approach for a critical system component. An alternative approach to life prediction based on a continuous functional relationship to component performance is proposed. Based on cyclic life prediction, an approach to Life Extending Control, called the Life Management Approach, is proposed. A second approach, also based on cyclic life prediction, called the Implicit Approach, is presented. Assuming the existence of the alternative functional life prediction approach, two additional concepts for Life Extending Control are presented.

  12. Offline modeling for product quality prediction of mineral processing using modeling error PDF shaping and entropy minimization.

    PubMed

    Ding, Jinliang; Chai, Tianyou; Wang, Hong

    2011-03-01

    This paper presents a novel offline modeling for product quality prediction of mineral processing which consists of a number of unit processes in series. The prediction of the product quality of the whole mineral process (i.e., the mixed concentrate grade) plays an important role and the establishment of its predictive model is a key issue for the plantwide optimization. For this purpose, a hybrid modeling approach of the mixed concentrate grade prediction is proposed, which consists of a linear model and a nonlinear model. The least-squares support vector machine is adopted to establish the nonlinear model. The inputs of the predictive model are the performance indices of each unit process, while the output is the mixed concentrate grade. In this paper, the model parameter selection is transformed into the shape control of the probability density function (PDF) of the modeling error. In this context, both the PDF-control-based and minimum-entropy-based model parameter selection approaches are proposed. Indeed, this is the first time that the PDF shape control idea is used to deal with system modeling, where the key idea is to tune model parameters so that either the modeling error PDF is controlled to follow a target PDF or the modeling error entropy is minimized. The experimental results using the real plant data and the comparison of the two approaches are discussed. The results show the effectiveness of the proposed approaches.
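
    The entropy-minimization idea can be sketched in a few lines (this is an assumed illustration, not the authors' algorithm): among candidate model parameters, prefer the one whose modeling-error histogram has the lowest entropy, i.e. the most concentrated residuals. Data and candidate values are invented:

    ```python
    # Pick the slope whose residual histogram has minimum Shannon entropy.
    import math, random

    random.seed(1)
    x = [random.uniform(0, 1) for _ in range(400)]
    y = [2.0 * v + random.gauss(0, 0.05) for v in x]   # true slope is 2.0

    def residual_entropy(slope, bins=20):
        """Histogram entropy of the modeling errors y - slope*x."""
        errs = [yi - slope * xi for xi, yi in zip(x, y)]
        lo, hi = min(errs), max(errs)
        counts = [0] * bins
        for e in errs:
            k = min(int((e - lo) / (hi - lo) * bins), bins - 1)
            counts[k] += 1
        n = len(errs)
        return -sum(c / n * math.log(c / n) for c in counts if c)

    candidates = [1.0, 1.5, 2.0, 2.5, 3.0]
    best = min(candidates, key=residual_entropy)
    print("entropy-minimizing slope:", best)
    ```

    A wrong slope leaves a structured, spread-out residual distribution (high entropy); the correct slope leaves only concentrated noise, which is why minimizing residual entropy recovers it.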

  13. Model-on-Demand Predictive Control for Nonlinear Hybrid Systems With Application to Adaptive Behavioral Interventions

    PubMed Central

    Nandola, Naresh N.; Rivera, Daniel E.

    2011-01-01

    This paper presents a data-centric modeling and predictive control approach for nonlinear hybrid systems. System identification of hybrid systems represents a challenging problem because model parameters depend on the mode or operating point of the system. The proposed algorithm applies Model-on-Demand (MoD) estimation to generate a local linear approximation of the nonlinear hybrid system at each time step, using a small subset of data selected by an adaptive bandwidth selector. The appeal of the MoD approach lies in the fact that model parameters are estimated based on a current operating point; hence estimation of locations or modes governed by autonomous discrete events is achieved automatically. The local MoD model is then converted into a mixed logical dynamical (MLD) system representation which can be used directly in a model predictive control (MPC) law for hybrid systems using multiple-degree-of-freedom tuning. The effectiveness of the proposed MoD predictive control algorithm for nonlinear hybrid systems is demonstrated on a hypothetical adaptive behavioral intervention problem inspired by Fast Track, a real-life preventive intervention for improving parental function and reducing conduct disorder in at-risk children. Simulation results demonstrate that the proposed algorithm can be useful for adaptive intervention problems exhibiting both nonlinear and hybrid character. PMID:21874087
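
    A minimal illustration of the Model-on-Demand step (one input dimension, a plain nearest-neighbor window instead of the paper's adaptive bandwidth selector, all data invented): at each query point, fit a linear model to only the k nearest stored samples, so the local model tracks the current operating point of a nonlinear system.

    ```python
    # Local linear fit on the k nearest samples to the query point.
    def local_linear_predict(data, x_query, k=10):
        """data: list of (x, y).  Fit y = a*x + b on the k nearest x's."""
        near = sorted(data, key=lambda p: abs(p[0] - x_query))[:k]
        n = len(near)
        sx = sum(p[0] for p in near); sy = sum(p[1] for p in near)
        sxx = sum(p[0] ** 2 for p in near); sxy = sum(p[0] * p[1] for p in near)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # least-squares slope
        b = (sy - a * sx) / n                           # least-squares intercept
        return a * x_query + b

    # nonlinear data: y = x**2 on a grid; local lines approximate it well
    data = [(i / 20, (i / 20) ** 2) for i in range(-40, 41)]
    pred = local_linear_predict(data, 1.0)
    print("local prediction at x=1.0:", round(pred, 3))
    ```

    The appeal is the same as in the abstract: no global model is ever fit, yet each query gets a linearization valid near the current operating point, ready for use in an MPC law.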

  14. Predictive control strategies for wind turbine system based on permanent magnet synchronous generator.

    PubMed

    Maaoui-Ben Hassine, Ikram; Naouar, Mohamed Wissem; Mrabet-Bellaaj, Najiba

    2016-05-01

    In this paper, Model Predictive Control and Dead-beat predictive control strategies are proposed for the control of a PMSG based wind energy system. The proposed MPC considers the model of the converter-based system to forecast the possible future behavior of the controlled variables. It allows selecting the voltage vector to be applied that leads to a minimum error by minimizing a predefined cost function. The main features of the MPC are low current THD and robustness against parameter variations. The Dead-beat predictive control is based on the system model to compute the optimum voltage vector that ensures zero steady-state error. The optimum voltage vector is then applied through the Space Vector Modulation (SVM) technique. The main advantages of the Dead-beat predictive control are low current THD and constant switching frequency. The proposed control techniques are presented and detailed for the control of the back-to-back converter in a wind turbine system based on a PMSG. Simulation results (under the Matlab-Simulink software environment) and experimental results (on a developed prototyping platform) are presented in order to show the performance of the considered control strategies.
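
    One step of the voltage-vector selection can be sketched as a finite-control-set MPC iteration. The numbers below (DC link, R, L, sample time) are illustrative, not the paper's PMSG model: predict the next current for each of the eight standard two-level inverter voltage vectors and apply the one minimizing the tracking-error cost.

    ```python
    # Finite-control-set MPC step in the alpha-beta plane (complex phasors).
    import cmath, math

    VDC, R, L, DT = 600.0, 0.5, 0.01, 1e-4     # illustrative constants
    # eight voltage vectors of a two-level converter (two of them are zero)
    vectors = [0j] + [2 / 3 * VDC * cmath.exp(1j * k * math.pi / 3)
                      for k in range(6)] + [0j]

    def predict(i_now, v):
        """One Euler step of L*di/dt = v - R*i."""
        return i_now + DT / L * (v - R * i_now)

    def best_vector(i_now, i_ref):
        """Cost function: distance of the predicted current from its reference."""
        return min(vectors, key=lambda v: abs(i_ref - predict(i_now, v)))

    v_sel = best_vector(i_now=0j, i_ref=10 + 0j)
    print("selected voltage vector:", v_sel)
    ```

    Because the cost is evaluated only over the discrete vector set, no modulator is needed; this is the feature that distinguishes the MPC from the dead-beat scheme, which computes a continuous vector and passes it to SVM.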

  15. Modeling and control of plasma rotation and βn for NSTX-U using Neoclassical Toroidal Viscosity and Neutral Beam Injection

    NASA Astrophysics Data System (ADS)

    Goumiri, Imene; Rowley, Clarence; Sabbagh, Steven; Gates, David; Gerhardt, Stefan; Boyer, Mark

    2015-11-01

    A model-based system is presented allowing control of the plasma rotation profile in a magnetically confined toroidal fusion device to maintain plasma stability for long pulse operation. The analysis, using NSTX data and NSTX-U TRANSP simulations, is aimed at controlling plasma rotation using momentum from six injected neutral beams and neoclassical toroidal viscosity generated by three-dimensional applied magnetic fields as actuators. Based on the momentum diffusion and torque balance model obtained, a feedback controller is designed and predictive simulations using TRANSP will be presented. Robustness of the model and the rotation controller will be discussed.

  16. SU-F-R-51: Radiomics in CT Perfusion Maps of Head and Neck Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nesteruk, M; Riesterer, O; Veit-Haibach, P

    2016-06-15

    Purpose: The aim of this study was to test the predictive value of radiomics features of CT perfusion (CTP) for tumor control, based on a preselection of radiomics features in a robustness study. Methods: 11 patients with head and neck cancer (HNC) and 11 patients with lung cancer were included in the robustness study to preselect stable radiomics parameters. Data from 36 HNC patients treated with definitive radiochemotherapy (median follow-up 30 months) was used to build a predictive model based on these parameters. All patients underwent pre-treatment CTP. 315 texture parameters were computed for three perfusion maps: blood volume, blood flow and mean transit time. The variability of texture parameters was tested with respect to non-standardizable perfusion computation factors (noise level and artery contouring) using intraclass correlation coefficients (ICC). The parameter with the highest ICC in the correlated group of parameters (inter-parameter Spearman correlations) was tested for its predictive value. The final model to predict tumor control was built using multivariate Cox regression analysis with backward selection of the variables. For comparison, a predictive model based on tumor volume was created. Results: Ten parameters were found to be stable in both HNC and lung cancer with regard to potentially non-standardizable factors after correction for inter-parameter correlations. In the multivariate backward selection of the variables, blood flow entropy showed a significant impact on tumor control (p=0.03) with a concordance index (CI) of 0.76. Blood flow entropy was lower in the patient group with controlled tumors at 18 months (p<0.1). The new model showed a higher concordance index than the tumor volume model (CI=0.68). Conclusion: The preselection of variables in the robustness study allowed building a predictive radiomics-based model of tumor control in HNC despite a small patient cohort. This model was found to be superior to the volume-based model. The project was supported by the KFSP Tumor Oxygenation of the University of Zurich, by a grant of the Center for Clinical Research, University and University Hospital Zurich and by a research grant from Merck (Schweiz) AG.

  17. Using Mobile Phone Data to Predict the Spatial Spread of Cholera

    PubMed Central

    Bengtsson, Linus; Gaudart, Jean; Lu, Xin; Moore, Sandra; Wetter, Erik; Sallah, Kankoe; Rebaudet, Stanislas; Piarroux, Renaud

    2015-01-01

    Effective response to infectious disease epidemics requires focused control measures in areas predicted to be at high risk of new outbreaks. We aimed to test whether mobile operator data could predict the early spatial evolution of the 2010 Haiti cholera epidemic. Daily case data were analysed for 78 study areas from October 16 to December 16, 2010. Movements of 2.9 million anonymous mobile phone SIM cards were used to create a national mobility network. Two gravity models of population mobility were implemented for comparison. Both were optimized based on the complete retrospective epidemic data, available only after the end of the epidemic spread. Risk of an area experiencing an outbreak within seven days showed strong dose-response relationship with the mobile phone-based infectious pressure estimates. The mobile phone-based model performed better (AUC 0.79) than the retrospectively optimized gravity models (AUC 0.66 and 0.74, respectively). Infectious pressure at outbreak onset was significantly correlated with reported cholera cases during the first ten days of the epidemic (p < 0.05). Mobile operator data is a highly promising data source for improving preparedness and response efforts during cholera outbreaks. Findings may be particularly important for containment efforts of emerging infectious diseases, including high-mortality influenza strains. PMID:25747871
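
    A schematic gravity model of the kind used for comparison above can be written down directly (populations, distances and case counts here are invented; the study fit its models to the retrospective epidemic data): mobility flow between two areas scales with the product of their populations over squared distance, and each area's "infectious pressure" sums incoming flow weighted by cases at the origin.

    ```python
    # Toy gravity model: rank areas by gravity-weighted infectious pressure.
    pop = {"A": 50000, "B": 20000, "C": 5000}
    dist = {("A", "B"): 10.0, ("A", "C"): 40.0, ("B", "C"): 35.0}
    cases = {"A": 120, "B": 0, "C": 0}   # outbreak currently only in area A

    def flow(i, j):
        """Symmetric gravity flow: pop_i * pop_j / distance^2."""
        d = dist.get((i, j)) or dist.get((j, i))
        return pop[i] * pop[j] / d ** 2

    def pressure(j):
        """Incoming flow into j, weighted by current cases at each origin."""
        return sum(cases[i] * flow(i, j) for i in pop if i != j)

    risk = sorted(pop, key=pressure, reverse=True)
    print("areas by predicted outbreak risk:", risk)
    ```

    The mobile-phone approach replaces the gravity `flow` term with observed SIM-card movements, which is what lifted the AUC from 0.66-0.74 to 0.79 in the study.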

  18. Using mobile phone data to predict the spatial spread of cholera.

    PubMed

    Bengtsson, Linus; Gaudart, Jean; Lu, Xin; Moore, Sandra; Wetter, Erik; Sallah, Kankoe; Rebaudet, Stanislas; Piarroux, Renaud

    2015-03-09

    Effective response to infectious disease epidemics requires focused control measures in areas predicted to be at high risk of new outbreaks. We aimed to test whether mobile operator data could predict the early spatial evolution of the 2010 Haiti cholera epidemic. Daily case data were analysed for 78 study areas from October 16 to December 16, 2010. Movements of 2.9 million anonymous mobile phone SIM cards were used to create a national mobility network. Two gravity models of population mobility were implemented for comparison. Both were optimized based on the complete retrospective epidemic data, available only after the end of the epidemic spread. Risk of an area experiencing an outbreak within seven days showed strong dose-response relationship with the mobile phone-based infectious pressure estimates. The mobile phone-based model performed better (AUC 0.79) than the retrospectively optimized gravity models (AUC 0.66 and 0.74, respectively). Infectious pressure at outbreak onset was significantly correlated with reported cholera cases during the first ten days of the epidemic (p < 0.05). Mobile operator data is a highly promising data source for improving preparedness and response efforts during cholera outbreaks. Findings may be particularly important for containment efforts of emerging infectious diseases, including high-mortality influenza strains.

  19. Determinants of responsibility for health, spiritual health and interpersonal relationship based on theory of planned behavior in high school girl students.

    PubMed

    Rezazadeh, Afsaneh; Solhi, Mahnaz; Azam, Kamal

    2015-01-01

    Adolescence is a sensitive period in which normal and abnormal habits are acquired for life. This study investigates determinants of responsibility for health, spiritual health and interpersonal relations, and their predictive factors based on the theory of planned behavior, in high school girl students in Tabriz. In this cross-sectional study, 340 students were selected through multi-stage sampling. An author-made questionnaire based on standard questionnaires of Health Promotion and Lifestyle II (HPLPII), spiritual health standards (Paloutzian & Ellison) and components of the theory of planned behavior (attitudes, subjective norms, perceived behavioral control, and behavioral intention) was used for data collection. The questionnaire was validated in a pilot study. Data were analyzed using SPSS v.15 and descriptive and analytical tests (chi-square test, Pearson correlation coefficient and linear regression with backward selection). Students' responsibility for health, spiritual health, interpersonal relationships, and concepts of the theory of planned behavior were moderate. We found a significant positive correlation (p<0.001) among all concepts of the theory of planned behavior. Attitude and perceived behavioral control predicted 35% of the intention of behavioral change (p<0.001). Attitude, subjective norms, and perceived behavioral control predicted 74% of behavioral change in responsibility for health (p<0.0001), 56% for behavioral change in spiritual health (p<0.0001) and 63% for behavioral change in interpersonal relationships (p<0.0001). The status of responsibility for health, spiritual health and interpersonal relationships of students was moderate. Hence, behavioral intention and its determinants, such as perceived behavioral control, should be noted in promoting intervention programs.

  20. Deep Learning-Based Data Forgery Detection in Automatic Generation Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Fengli; Li, Qinghua

    Automatic Generation Control (AGC) is a key control system in the power grid. It is used to calculate the Area Control Error (ACE) based on frequency and tie-line power flow between balancing areas, and then adjust power generation to maintain the power system frequency in an acceptable range. However, attackers might inject malicious frequency or tie-line power flow measurements to mislead AGC into false generation corrections that harm power grid operation. Such attacks are hard to detect since they do not violate physical power system models. In this work, we propose algorithms based on Neural Network and Fourier Transform to detect data forgery attacks in AGC. Different from the few previous works that rely on accurate load prediction to detect data forgery, our solution only uses the ACE data already available in existing AGC systems. In particular, our solution learns the normal patterns of ACE time series and detects abnormal patterns caused by artificial attacks. Evaluations on the real ACE dataset show that our methods have high detection accuracy.
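
    An illustrative detector in the spirit described above (not the authors' network): compare the high-frequency spectral energy of an ACE window against a threshold learned from normal data. The signals, band limits and threshold rule below are all invented for the sketch, and a naive DFT stands in for an FFT library:

    ```python
    # Flag forged ACE windows by their high-frequency spectral energy.
    import cmath, math, random

    random.seed(2)

    def dft_mag(x):
        """Naive discrete Fourier transform magnitudes (O(n^2), fine for a demo)."""
        n = len(x)
        return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) for k in range(n)]

    def hf_energy(x):
        """Energy in the upper quarter of the (one-sided) spectrum."""
        mags = dft_mag(x)
        return sum(m ** 2 for m in mags[len(x) // 4 : len(x) // 2])

    n = 64
    # normal ACE: a slow swing plus measurement noise
    normal = [math.sin(2 * math.pi * 2 * t / n) + random.gauss(0, 0.05)
              for t in range(n)]
    # forged ACE: same window with an injected fast oscillation
    forged = [v + 0.5 * math.sin(2 * math.pi * 20 * t / n)
              for t, v in enumerate(normal)]

    threshold = 2.0 * hf_energy(normal)   # "learned" from the normal window
    print("forged window flagged:", hf_energy(forged) > threshold)
    ```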

  1. Prediction and Informative Risk Factor Selection of Bone Diseases.

    PubMed

    Li, Hui; Li, Xiaoyi; Ramanathan, Murali; Zhang, Aidong

    2015-01-01

    With the booming of the healthcare industry and the overwhelming amount of electronic health records (EHRs) shared by healthcare institutions and practitioners, we take advantage of EHR data to develop an effective disease risk management model that not only models the progression of the disease, but also predicts the risk of the disease for early disease control or prevention. Existing models for answering these questions usually fall into two categories: expert-knowledge-based models or handcrafted-feature-set-based models. To fully utilize the whole EHR data, we build a framework to construct an integrated representation of features from all available risk factors in the EHR data and use these integrated features to effectively predict osteoporosis and bone fractures. We also develop a framework for informative risk factor selection of bone diseases. A pair of models for two contrast cohorts (e.g., diseased patients versus non-diseased patients) is established to discriminate their characteristics and find the most informative risk factors. Several empirical results on a real bone disease data set show that the proposed framework can successfully predict bone diseases and select informative risk factors that are beneficial and useful to guide clinical decisions.

  2. Arterial spin labeling-based Z-maps have high specificity and positive predictive value for neurodegenerative dementia compared to FDG-PET.

    PubMed

    Fällmar, David; Haller, Sven; Lilja, Johan; Danfors, Torsten; Kilander, Lena; Tolboom, Nelleke; Egger, Karl; Kellner, Elias; Croon, Philip M; Verfaillie, Sander C J; van Berckel, Bart N M; Ossenkoppele, Rik; Barkhof, Frederik; Larsson, Elna-Marie

    2017-10-01

    Cerebral perfusion analysis based on arterial spin labeling (ASL) MRI has been proposed as an alternative to FDG-PET in patients with neurodegenerative disease. Z-maps show normal distribution values relating an image to a database of controls. They are routinely used for FDG-PET to demonstrate disease-specific patterns of hypometabolism at the individual level. This study aimed to compare the performance of Z-maps based on ASL to FDG-PET. Data were combined from two separate sites, each cohort consisting of patients with Alzheimer's disease (n = 18 + 7), frontotemporal dementia (n = 12 + 8) and controls (n = 9 + 29). Subjects underwent pseudocontinuous ASL and FDG-PET. Z-maps were created for each subject and modality. Four experienced physicians visually assessed the 166 Z-maps in random order, blinded to modality and diagnosis. Discrimination of patients versus controls using ASL-based Z-maps yielded high specificity (84%) and positive predictive value (80%), but significantly lower sensitivity compared to FDG-PET-based Z-maps (53% vs. 96%, p < 0.001). Among true-positive cases, correct diagnoses were made in 76% (ASL) and 84% (FDG-PET) (p = 0.168). ASL-based Z-maps can be used for visual assessment of neurodegenerative dementia with high specificity and positive predictive value, but with inferior sensitivity compared to FDG-PET. • ASL-based Z-maps yielded high specificity and positive predictive value in neurodegenerative dementia. • ASL-based Z-maps had significantly lower sensitivity compared to FDG-PET-based Z-maps. • FDG-PET might be reserved for ASL-negative cases where clinical suspicion persists. • Findings were similar at two study sites.
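
    The Z-map construction itself is simple to state: each voxel's value is standardized against the control database. A voxel-wise sketch with invented perfusion values follows (the registration and smoothing pipeline around it is of course far larger):

    ```python
    # Voxel-wise Z-map against a database of control subjects.
    import math

    controls = [            # perfusion values per voxel for 4 control subjects
        [1.00, 0.95, 1.10],
        [1.05, 1.00, 0.90],
        [0.95, 1.05, 1.00],
        [1.00, 1.00, 1.00],
    ]
    patient = [0.55, 1.02, 0.98]  # voxel 0 is markedly hypoperfused

    def z_map(subject, controls):
        """z = (value - control mean) / control SD, per voxel."""
        zs = []
        for v in range(len(subject)):
            vals = [c[v] for c in controls]
            mean = sum(vals) / len(vals)
            sd = math.sqrt(sum((x - mean) ** 2 for x in vals) / (len(vals) - 1))
            zs.append((subject[v] - mean) / sd)
        return zs

    z = z_map(patient, controls)
    print("z-scores:", [round(s, 1) for s in z])
    ```

    Strongly negative voxels form the hypoperfusion (or, for FDG-PET, hypometabolism) pattern the raters assessed visually.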

  3. Satellite freeze forecast system: Executive summary

    NASA Technical Reports Server (NTRS)

    Martsolf, J. D. (Principal Investigator)

    1983-01-01

    A satellite-based temperature monitoring and prediction system consisting of a computer-controlled acquisition, processing, and display system and the ten automated weather stations called by that computer was developed and transferred to the National Weather Service. This satellite freeze forecast system (SFFS) acquires satellite data from either of two sources and surface data from 10 sites, displays the observed data in the form of color-coded thermal maps and in tables of automated weather station temperatures, computes predicted thermal maps when requested and displays such maps either automatically or manually, archives the data acquired, and makes comparisons with historical data. Except for the last function, SFFS handles these tasks in a highly automated fashion if the user so directs. The predicted thermal maps are the result of two models, one a physical energy budget of the soil and atmosphere interface and the other a statistical relationship between the sites at which the physical model predicts temperatures and each of the pixels of the satellite thermal map.

  4. Smart EV Energy Management System to Support Grid Services

    NASA Astrophysics Data System (ADS)

    Wang, Bin

    Under smart grid scenarios, advanced sensing and metering technologies have been applied to the legacy power grid to improve system observability and real-time situational awareness. Meanwhile, an increasing amount of distributed energy resources (DERs), such as renewable generation, electric vehicles (EVs) and battery energy storage systems (BESS), is being integrated into the power system. However, the integration of EVs, which can be modeled as controllable mobile energy devices, brings both challenges and opportunities to grid planning and energy management, due to the intermittency of renewable generation, uncertainties of EV driver behaviors, etc. This dissertation aims to solve the real-time EV energy management problem in order to improve overall grid efficiency, reliability and economics, using online and predictive optimization strategies. Most previous research on EV energy management strategies and algorithms is based on simplified models with the unrealistic assumption that EV charging behaviors are perfectly known or follow known distributions (e.g., arrival time, departure time and energy consumption values). These approaches fail to obtain optimal solutions in real time because of the system uncertainties. Moreover, there is a lack of data-driven strategies that perform online and predictive scheduling of EV charging behaviors under microgrid scenarios. Therefore, we develop an online predictive EV scheduling framework, considering uncertainties of renewable generation, building load and EV driver behaviors, based on real-world data. A kernel-based estimator is developed to predict charging session parameters in real time with improved estimation accuracy. The efficacy of various optimization strategies supported by this framework, including valley-filling, cost reduction and event-based control, has been demonstrated.
In addition, the existing simulation-based approaches do not consider a variety of practical concerns of implementing such a smart EV energy management system, including the driver preferences, communication protocols, data models, and customized integration of existing standards to provide grid services. Therefore, this dissertation also solves these issues by designing and implementing a scalable system architecture to capture the user preferences, enable multi-layer communication and control, and finally improve the system reliability and interoperability.
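
    A minimal Nadaraya-Watson kernel regression sketch of the kind of estimator described above (the real system conditions on richer driver and session features; the history below is invented): predict a session's energy demand from its arrival hour by Gaussian-weighted averaging of past sessions.

    ```python
    # Kernel (Nadaraya-Watson) prediction of charging energy from arrival hour.
    import math

    # invented history of (arrival hour, delivered energy in kWh)
    history = [(8.0, 7.2), (8.5, 6.8), (9.0, 7.5),
               (17.0, 12.1), (18.0, 11.4), (18.5, 12.8)]

    def kernel_predict(hour, bandwidth=1.0):
        """Gaussian-kernel weighted average of past session energies."""
        w = [math.exp(-((hour - h) / bandwidth) ** 2 / 2) for h, _ in history]
        return sum(wi * e for wi, (_, e) in zip(w, history)) / sum(w)

    print("predicted kWh for 8:30 arrival:", round(kernel_predict(8.5), 1))
    print("predicted kWh for 18:00 arrival:", round(kernel_predict(18.0), 1))
    ```

    The estimator needs no distributional assumption about driver behavior, which is the property the dissertation exploits for online scheduling.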

  5. Condensed-Matter Physics.

    ERIC Educational Resources Information Center

    Hirsch, Jorge E.; Scalapino, Douglas J.

    1983-01-01

    Discusses ways computers are being used in condensed-matter physics by experimenters and theorists. Experimenters use them to control experiments and to gather and analyze data. Theorists use them for detailed predictions based on realistic models and for studies on systems not realizable in practice. (JN)

  6. Computational Study of Laminar Flow Control on a Subsonic Swept Wing Using Discrete Roughness Elements

    NASA Technical Reports Server (NTRS)

    Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan; Streett, Craig L.; Carpenter, Mark H.

    2011-01-01

A combination of parabolized stability equations and secondary instability theory has been applied to a low-speed swept airfoil model with a chord Reynolds number of 7.15 million, with the goals of (i) evaluating this methodology in the context of transition prediction for a known configuration for which roughness-based crossflow transition control has been demonstrated under flight conditions and (ii) analyzing the mechanism of transition delay via the introduction of discrete roughness elements (DRE). Roughness-based transition control involves controlled seeding of suitable, subdominant crossflow modes, so as to weaken the growth of naturally occurring, linearly more unstable crossflow modes. Therefore, a synthesis of receptivity, linear and nonlinear growth of stationary crossflow disturbances, and the ensuing development of high-frequency secondary instabilities is desirable to understand the experimentally observed transition behavior. With further validation, such higher-fidelity prediction methodology could be utilized to assess the potential for crossflow transition control at even higher Reynolds numbers, where experimental data is currently unavailable.

  7. Evaluating simplistic methods to understand current distributions and forecast distribution changes under climate change scenarios: An example with coypu (Myocastor coypus)

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Young, Nicholas E; Sheffels, Trevor R.; Carter, Jacoby; Systma, Mark D.; Talbert, Colin

    2017-01-01

Invasive species provide a unique opportunity to evaluate factors controlling biogeographic distributions; we can consider introduction success as an experiment testing suitability of environmental conditions. Predicting potential distributions of spreading species is not easy, and forecasting potential distributions under changing climate is even more difficult. Using the globally invasive coypu (Myocastor coypus [Molina, 1782]), we evaluate and compare the utility of a simplistic ecophysiologically based model and a correlative model to predict current and future distribution. The ecophysiological model was based on winter temperature relationships with coypu (nutria) survival. We developed correlative statistical models using the Software for Assisted Habitat Modeling and biologically relevant climate data with a global extent. We applied the ecophysiologically based model to several global circulation model (GCM) predictions for mid-century. We used global coypu introduction data to evaluate these models and to explore a hypothesized physiological limitation, finding general agreement with the known coypu distribution locally and globally and support for an upper thermal tolerance threshold. GCM-based results showed variability in the predicted coypu distribution among GCMs, but general agreement on increasing suitable area in the USA. Our methods highlighted the dynamic nature of the edges of the coypu distribution due to climate non-equilibrium, and the uncertainty associated with forecasting future distributions. Areas deemed suitable habitat, especially those on the edge of the current known range, could be used for early detection of the spread of coypu populations for management purposes. Combining approaches can be beneficial for predicting potential distributions of invasive species now and in the future, and for exploring hypotheses about factors controlling distributions.

  8. Robust predictive cruise control for commercial vehicles

    NASA Astrophysics Data System (ADS)

    Junell, Jaime; Tumer, Kagan

    2013-10-01

In this paper we explore learning-based predictive cruise control and the impact of this technology on increasing fuel efficiency for commercial trucks. Traditional cruise control is wasteful when maintaining a constant velocity over rolling hills. Predictive cruise control (PCC) is able to look ahead at future road conditions and solve for a cost-effective course of action. Model-based controllers have been implemented in this field but cannot accommodate many complexities of a dynamic environment which includes changing road and vehicle conditions. In this work, we focus on incorporating a learner into an already successful model-based predictive cruise controller in order to improve its performance. We explore backpropagating neural networks to predict future errors, then take actions to prevent said errors from occurring. The results show that this approach improves the model-based PCC by up to 60% under certain conditions. In addition, we explore the benefits of classifier ensembles to further improve the gains due to intelligent cruise control.
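The lookahead idea underlying PCC can be sketched with a toy model-based planner (this shows only the baseline lookahead optimization, not the paper's neural-network learner; the fuel proxy, admissible speeds, grades, and cost weights are invented for illustration):

```python
from itertools import product

SET_SPEED = 25.0                        # driver's set speed, m/s
SPEEDS = [23.0, 25.0, 27.0]             # admissible speeds per segment
GRADES = [0.00, 0.04, -0.04, 0.00]      # lookahead road grades (rise/run)

def fuel_proxy(v, grade):
    # Crude power demand: cubic aerodynamic term plus a gravity term,
    # clipped at zero (no fuel burned while coasting downhill).
    return max(0.0, 0.001 * v ** 3 + 9.81 * grade * v)

def plan(grades):
    """Exhaustively search speed profiles, trading the fuel proxy against
    squared deviation from the set speed (arbitrary illustrative weights)."""
    best_cost, best_plan = float("inf"), None
    for speeds in product(SPEEDS, repeat=len(grades)):
        cost = sum(fuel_proxy(v, g) + (v - SET_SPEED) ** 2
                   for v, g in zip(speeds, grades))
        if cost < best_cost:
            best_cost, best_plan = cost, speeds
    return best_plan

print(plan(GRADES))  # the planner slows down for the uphill segment
```

Even this crude cost structure reproduces the qualitative PCC behavior: the optimal profile deviates from the set speed on the climb, which constant-velocity cruise control cannot do.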

  9. Model predictive control design for polytopic uncertain systems by synthesising multi-step prediction scenarios

    NASA Astrophysics Data System (ADS)

    Lu, Jianbo; Xi, Yugeng; Li, Dewei; Xu, Yuli; Gan, Zhongxue

    2018-01-01

    A common objective of model predictive control (MPC) design is the large initial feasible region, low online computational burden as well as satisfactory control performance of the resulting algorithm. It is well known that interpolation-based MPC can achieve a favourable trade-off among these different aspects. However, the existing results are usually based on fixed prediction scenarios, which inevitably limits the performance of the obtained algorithms. So by replacing the fixed prediction scenarios with the time-varying multi-step prediction scenarios, this paper provides a new insight into improvement of the existing MPC designs. The adopted control law is a combination of predetermined multi-step feedback control laws, based on which two MPC algorithms with guaranteed recursive feasibility and asymptotic stability are presented. The efficacy of the proposed algorithms is illustrated by a numerical example.

  10. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base which was used primarily for safety loads prediction, design of the closed loop compensator and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end-to-end operation of the motion base system providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and function generator setup and evaluation.

  11. Spatial pattern formation facilitates eradication of infectious diseases

    PubMed Central

    Eisinger, Dirk; Thulke, Hans-Hermann

    2008-01-01

Control of animal-borne diseases is a major challenge faced by applied ecologists and public health managers. To improve cost-effectiveness, the effort required to control such pathogens needs to be predicted as accurately as possible. In this context, we reviewed the anti-rabies vaccination schemes applied around the world during the past 25 years. We contrasted predictions from classic approaches based on theoretical population ecology (which governs rabies control to date) with a newly developed individual-based model. Our spatially explicit approach allowed for the reproduction of pattern formation emerging from a pathogen's spread through its host population. We suggest that a much lower management effort could eliminate the disease than that currently in operation. This is supported by empirical evidence from historic field data. Adapting control measures to the new prediction would save one-third of resources in future control programmes. The reason for the lower prediction is the spatial structure formed by spreading infections in spatially arranged host populations. It is not the result of technical differences between models. Synthesis and applications. For diseases predominantly transmitted by neighbourhood interaction, our findings suggest that the emergence of spatial structures facilitates eradication. This may have substantial implications for the cost-effectiveness of existing disease management schemes, and suggests that when planning management strategies consideration must be given to methods that reflect the spatial nature of the pathogen–host system. PMID:18784795

  12. Implementation of a polling protocol for predicting celiac disease in videocapsule analysis.

    PubMed

    Ciaccio, Edward J; Tennyson, Christina A; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2013-07-16

    To investigate the presence of small intestinal villous atrophy in celiac disease patients from quantitative analysis of videocapsule image sequences. Nine celiac patient data with biopsy-proven villous atrophy and seven control patient data lacking villous atrophy were used for analysis. Celiacs had biopsy-proven disease with scores of Marsh II-IIIC except in the case of one hemophiliac patient. At four small intestinal levels (duodenal bulb, distal duodenum, jejunum, and ileum), video clips of length 200 frames (100 s) were analyzed. Twenty-four measurements were used for image characterization. These measurements were determined by quantitatively processing the videocapsule images via techniques for texture analysis, motility estimation, volumetric reconstruction using shape-from-shading principles, and image transformation. Each automated measurement method, or automaton, was polled as to whether or not villous atrophy was present in the small intestine, indicating celiac disease. Each automaton's vote was determined based upon an optimized parameter threshold level, with the threshold levels being determined from prior data. A prediction of villous atrophy was made if it received the majority of votes (≥ 13), while no prediction was made for tie votes (12-12). Thus each set of images was classified as being from either a celiac disease patient or from a control patient. Separated by intestinal level, the overall sensitivity of automata polling for predicting villous atrophy and hence celiac disease was 83.9%, while the specificity was 92.9%, and the overall accuracy of automata-based polling was 88.1%. The method of image transformation yielded the highest sensitivity at 93.8%, while the method of texture analysis using subbands had the highest specificity at 76.0%. Similar results of prediction were observed at all four small intestinal locations, but there were more tie votes at location 4 (ileum). 
Incorrect prediction which reduced sensitivity occurred for two celiac patients with Marsh type II pattern, which is characterized by crypt hyperplasia, but normal villous architecture. Pooled from all levels, there was a mean of 14.31 ± 3.28 automaton votes for celiac vs 9.67 ± 3.31 automaton votes for control when celiac patient data was analyzed (P < 0.001). Pooled from all levels, there was a mean of 9.71 ± 2.8128 automaton votes for celiac vs 14.32 ± 2.7931 automaton votes for control when control patient data was analyzed (P < 0.001). Automata-based polling may be useful to indicate presence of mucosal atrophy, indicative of celiac disease, across the entire small bowel, though this must be confirmed in a larger patient set. Since the method is quantitative and automated, it can potentially eliminate observer bias and enable the detection of subtle abnormality in patients lacking a clear diagnosis. Our paradigm was found to be more efficacious at proximal small intestinal locations, which may suggest a greater presence and severity of villous atrophy at proximal as compared with distal locations.
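The polling rule itself is fully specified in the abstract: 24 automata each cast one vote, a majority (≥ 13) predicts celiac disease, and a 12-12 tie yields no prediction. A direct sketch follows, where the per-automaton threshold directions and toy measurement values are assumed details:

```python
def poll(measurements, thresholds, directions):
    """Each automaton votes positive when its measurement crosses its
    pre-optimized threshold in the given direction (+1 above, -1 below)."""
    votes = sum(1 for m, t, d in zip(measurements, thresholds, directions)
                if (m - t) * d > 0)
    if votes >= 13:        # majority of the 24 automata
        return "celiac"
    if votes == 12:        # 12-12 tie: no prediction is made
        return "no prediction"
    return "control"

# Toy case: 15 of 24 hypothetical automata vote positive.
measurements = [1.0] * 15 + [0.0] * 9
print(poll(measurements, [0.5] * 24, [1] * 24))  # prints "celiac"
```

Polling many weak, independently thresholded measurements is what lets the method tolerate individual automata being wrong on any given video clip.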

  13. Implementation of a polling protocol for predicting celiac disease in videocapsule analysis

    PubMed Central

    Ciaccio, Edward J; Tennyson, Christina A; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2013-01-01

    AIM: To investigate the presence of small intestinal villous atrophy in celiac disease patients from quantitative analysis of videocapsule image sequences. METHODS: Nine celiac patient data with biopsy-proven villous atrophy and seven control patient data lacking villous atrophy were used for analysis. Celiacs had biopsy-proven disease with scores of Marsh II-IIIC except in the case of one hemophiliac patient. At four small intestinal levels (duodenal bulb, distal duodenum, jejunum, and ileum), video clips of length 200 frames (100 s) were analyzed. Twenty-four measurements were used for image characterization. These measurements were determined by quantitatively processing the videocapsule images via techniques for texture analysis, motility estimation, volumetric reconstruction using shape-from-shading principles, and image transformation. Each automated measurement method, or automaton, was polled as to whether or not villous atrophy was present in the small intestine, indicating celiac disease. Each automaton’s vote was determined based upon an optimized parameter threshold level, with the threshold levels being determined from prior data. A prediction of villous atrophy was made if it received the majority of votes (≥ 13), while no prediction was made for tie votes (12-12). Thus each set of images was classified as being from either a celiac disease patient or from a control patient. RESULTS: Separated by intestinal level, the overall sensitivity of automata polling for predicting villous atrophy and hence celiac disease was 83.9%, while the specificity was 92.9%, and the overall accuracy of automata-based polling was 88.1%. The method of image transformation yielded the highest sensitivity at 93.8%, while the method of texture analysis using subbands had the highest specificity at 76.0%. Similar results of prediction were observed at all four small intestinal locations, but there were more tie votes at location 4 (ileum). 
Incorrect prediction which reduced sensitivity occurred for two celiac patients with Marsh type II pattern, which is characterized by crypt hyperplasia, but normal villous architecture. Pooled from all levels, there was a mean of 14.31 ± 3.28 automaton votes for celiac vs 9.67 ± 3.31 automaton votes for control when celiac patient data was analyzed (P < 0.001). Pooled from all levels, there was a mean of 9.71 ± 2.8128 automaton votes for celiac vs 14.32 ± 2.7931 automaton votes for control when control patient data was analyzed (P < 0.001). CONCLUSION: Automata-based polling may be useful to indicate presence of mucosal atrophy, indicative of celiac disease, across the entire small bowel, though this must be confirmed in a larger patient set. Since the method is quantitative and automated, it can potentially eliminate observer bias and enable the detection of subtle abnormality in patients lacking a clear diagnosis. Our paradigm was found to be more efficacious at proximal small intestinal locations, which may suggest a greater presence and severity of villous atrophy at proximal as compared with distal locations. PMID:23858375

  14. Predicting performance and safety based on driver fatigue.

    PubMed

    Mollicone, Daniel; Kan, Kevin; Mott, Chris; Bartels, Rachel; Bruneau, Steve; van Wollen, Matthew; Sparrow, Amy R; Van Dongen, Hans P A

    2018-04-02

    Fatigue causes decrements in vigilant attention and reaction time and is a major safety hazard in the trucking industry. There is a need to quantify the relationship between driver fatigue and safety in terms of operationally relevant measures. Hard-braking events are a suitable measure for this purpose as they are relatively easily observed and are correlated with collisions and near-crashes. We developed an analytic approach that predicts driver fatigue based on a biomathematical model and then estimates hard-braking events as a function of predicted fatigue, controlling for time of day to account for systematic variations in exposure (traffic density). The analysis used de-identified data from a previously published, naturalistic field study of 106 U.S. commercial motor vehicle (CMV) drivers. Data analyzed included drivers' official duty logs, sleep patterns measured around the clock using wrist actigraphy, and continuous recording of vehicle data to capture hard-braking events. The curve relating predicted fatigue to hard-braking events showed that the frequency of hard-braking events increased as predicted fatigue levels worsened. For each increment on the fatigue scale, the frequency of hard-braking events increased by 7.8%. The results provide proof of concept for a novel approach that predicts fatigue based on drivers' sleep patterns and estimates driving performance in terms of an operational metric related to safety. The approach can be translated to practice by CMV operators to achieve a fatigue risk profile specific to their own settings, in order to support data-driven decisions about fatigue countermeasures that cost-effectively deliver quantifiable operational benefits. Copyright © 2018 Elsevier Ltd. All rights reserved.
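The reported 7.8% increase per fatigue increment compounds multiplicatively across the fatigue scale. A one-line illustration (the baseline rate is hypothetical, not from the study):

```python
def hard_braking_rate(base_rate, fatigue_score):
    """Hard-braking frequency grows 7.8% per unit of predicted fatigue,
    i.e. multiplicatively, relative to a baseline rate."""
    return base_rate * 1.078 ** fatigue_score

# A hypothetical baseline of 1 event per 1000 miles at fatigue score 0:
print(round(hard_braking_rate(1.0, 5), 3))  # ≈ 1.456, i.e. ~46% more events
```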

  15. Data for Room Fire Model Comparisons

    PubMed Central

    Peacock, Richard D.; Davis, Sanford; Babrauskas, Vytenis

    1991-01-01

    With the development of models to predict fire growth and spread in buildings, there has been a concomitant evolution in the measurement and analysis of experimental data in real-scale fires. This report presents the types of analyses that can be used to examine large-scale room fire test data to prepare the data for comparison with zone-based fire models. Five sets of experimental data which can be used to test the limits of a typical two-zone fire model are detailed. A standard set of nomenclature describing the geometry of the building and the quantities measured in each experiment is presented. Availability of ancillary data (such as smaller-scale test results) is included. These descriptions, along with the data (available in computer-readable form) should allow comparisons between the experiment and model predictions. The base of experimental data ranges in complexity from one room tests with individual furniture items to a series of tests conducted in a multiple story hotel equipped with a zoned smoke control system. PMID:28184121

  16. Data for Room Fire Model Comparisons.

    PubMed

    Peacock, Richard D; Davis, Sanford; Babrauskas, Vytenis

    1991-01-01

    With the development of models to predict fire growth and spread in buildings, there has been a concomitant evolution in the measurement and analysis of experimental data in real-scale fires. This report presents the types of analyses that can be used to examine large-scale room fire test data to prepare the data for comparison with zone-based fire models. Five sets of experimental data which can be used to test the limits of a typical two-zone fire model are detailed. A standard set of nomenclature describing the geometry of the building and the quantities measured in each experiment is presented. Availability of ancillary data (such as smaller-scale test results) is included. These descriptions, along with the data (available in computer-readable form) should allow comparisons between the experiment and model predictions. The base of experimental data ranges in complexity from one room tests with individual furniture items to a series of tests conducted in a multiple story hotel equipped with a zoned smoke control system.

  17. Development and Evaluation of a Mobile Personalized Blood Glucose Prediction System for Patients With Gestational Diabetes Mellitus.

    PubMed

    Pustozerov, Evgenii; Popova, Polina; Tkachuk, Aleksandra; Bolotko, Yana; Yuldashev, Zafar; Grineva, Elena

    2018-01-09

Personalized blood glucose (BG) prediction for diabetes patients is an important goal that is pursued by many researchers worldwide. Despite many proposals, only a few projects are dedicated to the development of complete recommender system infrastructures that incorporate BG prediction algorithms for diabetes patients. The development and implementation of such a system aided by mobile technology is of particular interest to patients with gestational diabetes mellitus (GDM), especially considering the significant importance of quickly achieving adequate BG control for these patients in a short period (ie, during pregnancy) and a typically higher acceptance rate for mobile health (mHealth) solutions for short- to midterm usage. This study was conducted with the objective of developing infrastructure comprising data processing algorithms, BG prediction models, and an appropriate mobile app for patients' electronic record management to guide BG prediction-based personalized recommendations for patients with GDM. A mobile app for electronic diary management was developed along with data exchange and continuous BG signal processing software. Both components were coupled to obtain the necessary data for use in the personalized BG prediction system. Necessary data on meals, BG measurements, and other events were collected via the implemented mobile app and continuous glucose monitoring (CGM) system processing software. These data were used to tune and evaluate the BG prediction model, which included an algorithm for dynamic coefficients tuning. In the clinical study, 62 participants (GDM: n=49; control: n=13) took part in a 1-week monitoring trial during which they used the mobile app to track their meals and BG self-measurements, and the CGM system for continuous BG monitoring. 
The data on 909 food intakes and corresponding postprandial BG curves as well as the set of patients' characteristics (eg, glycated hemoglobin, body mass index [BMI], age, and lifestyle parameters) were selected as inputs for the BG prediction models. The prediction results by the models for BG levels 1 hour after food intake were root mean square error=0.87 mmol/L, mean absolute error=0.69 mmol/L, and mean absolute percentage error=12.8%, which correspond to an adequate prediction accuracy for BG control decisions. The mobile app for the collection and processing of relevant data, appropriate software for CGM system signals processing, and BG prediction models were developed for a recommender system. The developed system may help improve BG control in patients with GDM; this will be the subject of evaluation in a subsequent study. ©Evgenii Pustozerov, Polina Popova, Aleksandra Tkachuk, Yana Bolotko, Zafar Yuldashev, Elena Grineva. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 09.01.2018.
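The reported accuracy figures are standard error metrics. A sketch of how RMSE, MAE, and MAPE are computed (the BG values below are made up for illustration, not data from the study):

```python
import math

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mape(y_true, y_pred):
    return 100.0 * sum(abs(t - p) / t for t, p in zip(y_true, y_pred)) / len(y_true)

# Made-up postprandial BG values (mmol/L): measured vs. model-predicted
measured  = [6.0, 7.5, 5.5, 8.0]
predicted = [6.5, 7.0, 5.0, 8.8]

print(round(rmse(measured, predicted), 3))  # 0.589
print(round(mae(measured, predicted), 3))   # 0.575
print(round(mape(measured, predicted), 1))  # 8.5
```

RMSE penalizes large excursions more heavily than MAE, which is why both are usually reported together for BG prediction, where occasional large errors matter clinically.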

  18. Trust-Based Design of Human-Guided Algorithms

    DTIC Science & Technology

    2007-06-01

Management Interdepartmental Program in Operations Research, 17 May 2007. Approved by: Laura Major Forest, The Charles Stark Draper Laboratory...2. Information Analysis: predicting based on data, integrating and managing information, augmenting human operator perception and cognition. 3...allocation of automation by designers and managers. How an operator decides between manual and automatic control of a system is a necessary

  19. Strategies for control of sudden oak death in Humboldt County-informed guidance based on a parameterized epidemiological model

    Treesearch

    João A. N. Filipe; Richard C. Cobb; David M. Rizzo; Ross K. Meentemeyer; Christopher A. Gilligan

    2010-01-01

Landscape- to regional-scale models of plant epidemics are urgently needed to predict large-scale impacts of disease and assess practicable options for control. While landscape heterogeneity is recognized as a major driver of disease dynamics, epidemiological models are rarely applied to realistic landscape conditions due to computational and data limitations. Here we...

  20. Parametric Optimization Of Gas Metal Arc Welding Process By Using Grey Based Taguchi Method On Aisi 409 Ferritic Stainless Steel

    NASA Astrophysics Data System (ADS)

    Ghosh, Nabendu; Kumar, Pradip; Nandi, Goutam

    2016-10-01

Welding input process parameters play a very significant role in determining the quality of a welded joint. Only by properly controlling every element of the process can product quality be controlled. For better-quality MIG welding of ferritic stainless steel AISI 409, precise control of the process parameters, parametric optimization of those parameters, and prediction and control of the desired responses (quality indices) are needed, along with continued and elaborate experiments, analysis, and modeling. A knowledge base may thus be generated that practicing engineers and technicians can use to produce good-quality welds more precisely, reliably, and predictively. In the present work, an X-ray radiographic test has been conducted in order to detect surface and sub-surface defects of weld specimens made of ferritic stainless steel. The quality of the weld has been evaluated in terms of yield strength, ultimate tensile strength, and percentage elongation of the welded specimens. The observed data have been interpreted, discussed, and analyzed by considering ultimate tensile strength, yield strength, and percentage elongation combined, using the Grey-Taguchi methodology.

  1. Object-oriented model-driven control

    NASA Technical Reports Server (NTRS)

    Drysdale, A.; Mcroberts, M.; Sager, J.; Wheeler, R.

    1994-01-01

A monitoring and control subsystem architecture has been developed that capitalizes on the use of model-driven monitoring and predictive control, knowledge-based data representation, and artificial reasoning in an operator support mode. We have developed an object-oriented model of a Controlled Ecological Life Support System (CELSS). The model, based on the NASA Kennedy Space Center CELSS breadboard data, tracks carbon, hydrogen, oxygen, carbon dioxide, and water. It estimates and tracks resource-related parameters such as mass, energy, and manpower, and measurements such as the growing area required for balance. We are developing an interface with the breadboard systems that is compatible with artificial reasoning. Initial work is being done on the use of expert systems and user interface development. This paper presents an approach to defining universally applicable CELSS monitor and control issues, and to implementing appropriate monitor and control capability for a particular instance: the KSC CELSS Breadboard Facility.

  2. Adaptive control of the packet transmission period with solar energy harvesting prediction in wireless sensor networks.

    PubMed

    Kwon, Kideok; Yang, Jihoon; Yoo, Younghwan

    2015-04-24

A number of research works have studied packet scheduling policies in energy-scavenging wireless sensor networks, based on the predicted amount of harvested energy. Most of them aim to achieve energy neutrality, which means that an embedded system can operate perpetually while meeting application requirements. Unlike other renewable energy sources, solar energy has the feature of distinct periodicity in the amount of harvested energy over a day. Using this feature, this paper proposes a packet transmission control policy that can enhance network performance while keeping sensor nodes alive. Furthermore, this paper suggests a novel solar energy prediction method that exploits the relation between cloudiness and solar radiation. The experimental results and analyses show that the proposed packet transmission policy outperforms others in terms of deadline miss rate and data throughput. Furthermore, the proposed solar energy prediction method predicts more accurately than other methods, by 6.92%.
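The abstract does not give the paper's cloudiness-to-radiation formula; one plausible minimal form (the clear-sky profile, the linear scaling, and all numbers below are assumptions for illustration) scales a clear-sky harvesting profile by a per-slot cloudiness forecast:

```python
# Hypothetical clear-sky harvesting profile (mW per slot over daylight hours)
CLEAR_SKY = [0, 5, 20, 45, 60, 45, 20, 5]

def predict_harvest(cloudiness):
    """cloudiness[i] in [0, 1]; 0 = clear sky, 1 = fully overcast."""
    return [round(p * (1.0 - c), 1) for p, c in zip(CLEAR_SKY, cloudiness)]

# A morning that is clear, a cloudy midday, and an overcast evening:
print(predict_harvest([0.0, 0.0, 0.5, 0.5, 0.2, 0.0, 1.0, 1.0]))
```

A transmission scheduler could then budget packets per slot against these predicted values so that the node never drains below its energy-neutral operating point.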

  3. Predicting the stochastic guiding of kinesin-driven microtubules in microfabricated tracks: a statistical-mechanics-based modeling approach.

    PubMed

    Lin, Chih-Tin; Meyhofer, Edgar; Kurabayashi, Katsuo

    2010-01-01

    Directional control of microtubule shuttles via microfabricated tracks is key to the development of controlled nanoscale mass transport by kinesin motor molecules. Here we develop and test a model to quantitatively predict the stochastic behavior of microtubule guiding when they mechanically collide with the sidewalls of lithographically patterned tracks. By taking into account appropriate probability distributions of microscopic states of the microtubule system, the model allows us to theoretically analyze the roles of collision conditions and kinesin surface densities in determining how the motion of microtubule shuttles is controlled. In addition, we experimentally observe the statistics of microtubule collision events and compare our theoretical prediction with experimental data to validate our model. The model will direct the design of future hybrid nanotechnology devices that integrate nanoscale transport systems powered by kinesin-driven molecular shuttles.

  4. Robustness and cognition in stabilization problem of dynamical systems based on asymptotic methods

    NASA Astrophysics Data System (ADS)

    Dubovik, S. A.; Kabanov, A. A.

    2017-01-01

The problem of synthesis of stabilizing systems based on principles of cognitive (logical-dynamic) control for mobile objects used under uncertain conditions is considered. This direction in control theory is based on the principles of guaranteeing robust synthesis focused on worst-case scenarios of the controlled process. The guaranteeing approach can provide functioning of the system with the required quality and reliability only at sufficiently low disturbances and in the absence of large deviations from some regular features of the controlled process. The main tool for the analysis of large deviations and the prediction of critical states here is the action functional. Once the forecast is built, the choice of anti-crisis control becomes a supervisory control problem that optimizes the control system in normal mode and prevents escape of the controlled process into critical states. An essential aspect of the approach presented here is its two-level (logical-dynamic) control: the input data are used not only for generating the synthesized feedback (local robust synthesis) in advance (off-line), but also for making decisions about the current (on-line) quality of stabilization in the global sense. An example of using the presented approach to develop a ship tilting prediction system is considered.

  5. Development and evaluation of a data-adaptive alerting algorithm for univariate temporal biosurveillance data.

    PubMed

    Elbert, Yevgeniy; Burkom, Howard S

    2009-11-20

    This paper discusses further advances in making robust predictions with the Holt-Winters forecasts for a variety of syndromic time series behaviors and introduces a control-chart detection approach based on these forecasts. Using three collections of time series data, we compare biosurveillance alerting methods with quantified measures of forecast agreement, signal sensitivity, and time-to-detect. The study presents practical rules for initialization and parameterization of biosurveillance time series. Several outbreak scenarios are used for detection comparison. We derive an alerting algorithm from forecasts using Holt-Winters-generalized smoothing for prospective application to daily syndromic time series. The derived algorithm is compared with simple control-chart adaptations and to more computationally intensive regression modeling methods. The comparisons are conducted on background data from both authentic and simulated data streams. Both types of background data include time series that vary widely by both mean value and cyclic or seasonal behavior. Plausible, simulated signals are added to the background data for detection performance testing at signal strengths calculated to be neither too easy nor too hard to separate the compared methods. Results show that both the sensitivity and the timeliness of the Holt-Winters-based algorithm proved to be comparable or superior to that of the more traditional prediction methods used for syndromic surveillance.
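A minimal sketch of the general approach (not the paper's exact algorithm, initialization rules, or parameter choices): additive Holt-Winters one-step forecasts with a one-sided control-chart alert on the forecast residuals. The smoothing constants, weekly baseline, and spike magnitude below are illustrative assumptions:

```python
import statistics

def holt_winters_alerts(series, period=7, alpha=0.4, beta=0.1, gamma=0.3, k=3.0):
    """Additive Holt-Winters one-step forecasts; alert when a residual
    exceeds k standard deviations of the residual history (one-sided)."""
    level = sum(series[:period]) / period
    trend = 0.0
    season = [x - level for x in series[:period]]
    residuals, alerts = [], []
    for t in range(period, len(series)):
        forecast = level + trend + season[t % period]
        resid = series[t] - forecast
        if len(residuals) > 1:
            sigma = max(statistics.pstdev(residuals), 1e-6)  # floor for flat histories
            alerts.append(resid > k * sigma)
        else:
            alerts.append(False)  # warm-up: never alert
        residuals.append(resid)
        # standard additive Holt-Winters component updates
        new_level = alpha * (series[t] - season[t % period]) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
        season[t % period] = gamma * (series[t] - level) + (1 - gamma) * season[t % period]
    return alerts

week = [10, 12, 14, 16, 14, 12, 10]     # weekly syndromic baseline (counts/day)
series = week * 3
series[17] += 30                        # inject an outbreak-like spike on day 17
alerts = holt_winters_alerts(series)
print(alerts.index(True) + 7)           # first alerted day -> 17
```

Because the forecast absorbs the weekly cycle, the control chart fires on the injected spike but not on the ordinary day-of-week peaks, which is the core advantage over a fixed-threshold alarm on raw counts.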

  6. FN-DFE: Fuzzy-Neural Data Fusion Engine for Enhanced State-Awareness of Resilient Hybrid Energy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Dumidu Wijayasekara; Milos Manic

    Resiliency and improved state-awareness of modern critical infrastructures, such as energy production and industrial systems, is becoming increasingly important. As control systems become more complex, the number of inputs and outputs increases. Therefore, in order to maintain sufficient levels of state-awareness, robust system state monitoring must be implemented that correctly identifies system behavior even when one or more sensors are faulty. Furthermore, as intelligent cyber adversaries become more capable, incorrect values may be fed to the operators. To address these needs, this paper proposes a Fuzzy-Neural Data Fusion Engine (FN-DFE) for resilient state-awareness of control systems. The designed FN-DFE is composed of a three-layered system consisting of: 1) traditional threshold-based alarms, 2) an anomalous behavior detector using a self-organizing fuzzy logic system, and 3) artificial neural network based system modeling and prediction. The improved control system state-awareness is achieved by fusing input data from multiple sources and combining them into robust anomaly indicators. In addition, the neural network based signal predictions are used to augment the resiliency of the system and provide coherent state-awareness despite temporary unavailability of sensory data. The proposed system was integrated and tested with a model of the Idaho National Laboratory's (INL) hybrid energy system facility known as HYTEST. Experimental results demonstrate that the proposed FN-DFE provides timely plant performance monitoring and anomaly detection capabilities. It was shown that the system is capable of identifying intrusive behavior significantly earlier than conventional threshold-based alarm systems.

  7. Finite element based model predictive control for active vibration suppression of a one-link flexible manipulator.

    PubMed

    Dubay, Rickey; Hassan, Marwan; Li, Chunying; Charest, Meaghan

    2014-09-01

    This paper presents a unique approach for active vibration control of a one-link flexible manipulator. The method combines a finite element model of the manipulator and an advanced model predictive controller to suppress vibration at its tip. This hybrid methodology improves significantly over the standard application of a predictive controller for vibration control. The finite element model used in place of standard modelling in the control algorithm provides a more accurate prediction of dynamic behavior, resulting in enhanced control. Closed loop control experiments were performed using the flexible manipulator, instrumented with strain gauges and piezoelectric actuators. In all instances, experimental and simulation results demonstrate that the finite element based predictive controller provides improved active vibration suppression in comparison with using a standard predictive control strategy. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Set-membership fault detection under noisy environment with application to the detection of abnormal aircraft control surface positions

    NASA Astrophysics Data System (ADS)

    El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali

    2015-09-01

    The paper develops a set-membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained from data recorded in several flight scenarios of a highly representative aircraft benchmark.
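
As a toy illustration of the interval-prediction idea, the sketch below propagates a one-step prediction interval from a nominal model with a bounded-noise term and flags measurements that fall outside it. The first-order surface model, the gains, and the noise bound are invented for illustration and are not the paper's aircraft model.

```python
# Illustrative sketch of interval-based fault detection in the spirit of the
# abstract: a nominal model gives a one-step prediction interval that accounts
# for bounded noise, and a fault is flagged when a measurement leaves that
# interval. Model coefficients a, b and noise_bound are assumptions.

def detect_faults(measurements, commands, a=0.9, b=0.1, noise_bound=0.5):
    """Flag samples whose measurement falls outside the predicted interval."""
    faults = []
    x = measurements[0]
    for t in range(1, len(measurements)):
        nominal = a * x + b * commands[t - 1]      # one-step model prediction
        low, high = nominal - noise_bound, nominal + noise_bound
        if not (low <= measurements[t] <= high):
            faults.append(t)                        # measurement inconsistent with model
        x = measurements[t]                         # re-initialize on the measurement
    return faults
```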

  9. An improved predictive functional control method with application to PMSM systems

    NASA Astrophysics Data System (ADS)

    Li, Shihua; Liu, Huixian; Fu, Wenshu

    2017-01-01

    In the common design of prediction-model-based control methods, disturbances are usually considered in neither the prediction model nor the control design. For control systems subject to large-amplitude or strong disturbances, it is difficult to precisely predict the future outputs from the conventional prediction model, and the desired optimal closed-loop performance is thus degraded to some extent. To this end, an improved predictive functional control (PFC) method is developed in this paper by embedding disturbance information into the system model. A composite prediction model is obtained by embedding the estimated value of the disturbances, where a disturbance observer (DOB) is employed to estimate the lumped disturbances. The influence of disturbances on the system is thereby taken into account in the optimisation procedure. Finally, considering the speed control problem for a permanent magnet synchronous motor (PMSM) servo system, a control scheme based on the improved PFC method is designed to ensure optimal closed-loop performance even in the presence of disturbances. Simulation and experimental results based on a hardware platform confirm the effectiveness of the proposed algorithm.
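
The composite-prediction idea can be illustrated on a scalar first-order plant: a disturbance observer tracks the lumped disturbance, and its estimate is embedded in the one-step prediction used to compute the control. The plant, gains, and observer structure below are simplified stand-ins for the paper's PMSM formulation.

```python
# Minimal sketch of DOB-augmented predictive control. The first-order plant
# (a, b), disturbance d, and observer gain l are illustrative assumptions.

def simulate(a=0.8, b=0.5, d=2.0, ref=1.0, steps=40, l=0.5):
    """Closed loop with a one-step predictive law and a simple DOB."""
    y, d_hat, outputs = 0.0, 0.0, []
    for _ in range(steps):
        # Choose u so the composite prediction a*y + b*u + d_hat hits the reference.
        u = (ref - a * y - d_hat) / b
        y_next = a * y + b * u + d           # true plant with unknown disturbance d
        # DOB update: move the estimate toward the observed model mismatch.
        d_hat += l * (y_next - (a * y + b * u + d_hat))
        y = y_next
        outputs.append(y)
    return outputs
```

Running `simulate()` converges to the reference despite the constant disturbance, while `simulate(l=0.0)` (observer disabled) settles with a steady-state offset equal to the disturbance.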

  10. Measured vs. Predicted Pedestal Pressure During RMP ELM Control in DIII-D

    NASA Astrophysics Data System (ADS)

    Zywicki, Bailey; Fenstermacher, Max; Groebner, Richard; Meneghini, Orso

    2017-10-01

    From database analysis of DIII-D plasmas with Resonant Magnetic Perturbations (RMPs) for ELM control, we will compare the experimental pedestal pressure (p_ped) to EPED code predictions and present the dependence of any p_ped differences from EPED on RMP parameters not included in the EPED model (e.g., RMP field strength and toroidal and poloidal spectrum). The EPED code, based on Peeling-Ballooning and Kinetic Ballooning instability constraints, will also be used by ITER to predict the H-mode p_ped without RMPs. ITER plans to use RMPs as an effective ELM control method. The need to control ELMs in ITER is of the utmost priority, as it directly affects the lifetime of the plasma facing components. An accurate means of determining the impact of RMP ELM control on p_ped is needed, because the device fusion power is strongly dependent on p_ped. With this new collection of data, we aim to provide guidance to predictions of the ITER pedestal during RMP ELM control that can be incorporated in a future predictive code. Work supported in part by US DoE under the Science Undergraduate Laboratory Internship (SULI) program and under DE-FC02-04ER54698 and DE-AC52-07NA27344.

  11. Population-centered Risk- and Evidence-based Dental Interprofessional Care Team (PREDICT): study protocol for a randomized controlled trial.

    PubMed

    Cunha-Cruz, Joana; Milgrom, Peter; Shirtcliff, R Michael; Bailit, Howard L; Huebner, Colleen E; Conrad, Douglas; Ludwig, Sharity; Mitchell, Melissa; Dysert, Jeanne; Allen, Gary; Scott, JoAnna; Mancl, Lloyd

    2015-06-20

    To improve the oral health of low-income children, innovations in dental delivery systems are needed, including community-based care, the use of expanded duty auxiliary dental personnel, capitation payments, and global budgets. This paper describes the protocol for PREDICT (Population-centered Risk- and Evidence-based Dental Interprofessional Care Team), an evaluation project to test the effectiveness of new delivery and payment systems for improving dental care and oral health. This is a parallel-group cluster randomized controlled trial. Fourteen rural Oregon counties with a publicly insured (Medicaid) population of 82,000 children (0 to 21 years old) and pregnant women served by a managed dental care organization are randomized into test and control counties. In the test intervention (PREDICT), allied dental personnel provide screening and preventive services in community settings and case managers serve as patient navigators to arrange referrals of children who need dentist services. The delivery system intervention is paired with a compensation system for high performance (pay-for-performance) with efficient performance monitoring. PREDICT focuses on the following: 1) identifying eligible children and gaining caregiver consent for services in community settings (for example, schools); 2) providing risk-based preventive and caries stabilization services efficiently at these settings; 3) providing curative care in dental clinics; and 4) incentivizing local delivery teams to meet performance benchmarks. In the control intervention, care is delivered in dental offices without performance incentives. The primary outcome is the prevalence of untreated dental caries. Other outcomes are related to process, structure and cost. Data are collected through patient and staff surveys, clinical examinations, and the review of health and administrative records. If effective, PREDICT is expected to substantially reduce disparities in dental care and oral health. 
PREDICT can be disseminated to other care organizations as publicly insured clients are increasingly served by large practice organizations. Trial registration: ClinicalTrials.gov NCT02312921, registered 6 December 2014. The Robert Wood Johnson Foundation and Advantage Dental Services, LLC, are supporting the evaluation.

  12. The insertion of human dynamics models in the flight control loops of V/STOL research aircraft. Appendix 2: The optimal control model of a pilot in V/STOL aircraft control loops

    NASA Technical Reports Server (NTRS)

    Zipf, Mark E.

    1989-01-01

    An overview is presented of research work focused on the design and insertion of classical models of human pilot dynamics within the flight control loops of V/STOL aircraft. The pilot models were designed and configured for use in integrated control system research and design. The models of human behavior that were considered are the McRuer-Krendel model (a single-variable transfer function model) and the Optimal Control Model (a multi-variable approach based on optimal control and stochastic estimation theory). These models attempt to predict human control response characteristics when confronted with compensatory tracking and state regulation tasks. An overview, mathematical description, and discussion of the predictive limitations of the pilot models is presented. Design strategies and closed-loop insertion configurations are introduced and considered for various flight control scenarios. Models of aircraft dynamics (both transfer function and state space based) are developed and discussed for their use in pilot design and application. Pilot design and insertion are illustrated for various flight control objectives. Results of pilot insertion within the control loops of two V/STOL research aircraft (Sikorsky Black Hawk UH-60A, McDonnell Douglas Harrier II AV-8B) are presented and compared against actual pilot flight data. Conclusions are reached on the ability of the pilot models to adequately predict human behavior when confronted with similar control objectives.

  13. Time series analysis of malaria in Afghanistan: using ARIMA models to predict future trends in incidence.

    PubMed

    Anwar, Mohammad Y; Lewnard, Joseph A; Parikh, Sunil; Pitzer, Virginia E

    2016-11-22

    Malaria remains endemic in Afghanistan. National control and prevention strategies would be greatly enhanced through a better ability to forecast future trends in disease incidence. It is, therefore, of interest to develop a predictive tool for malaria patterns based on the current passive and affordable surveillance system in this resource-limited region. This study employs data from Ministry of Public Health monthly reports from January 2005 to September 2015. Malaria incidence in Afghanistan was forecasted using autoregressive integrated moving average (ARIMA) models in order to build a predictive tool for malaria surveillance. Environmental and climate data were incorporated to assess whether they improve predictive power of models. Two models were identified, each appropriate for different time horizons. For near-term forecasts, malaria incidence can be predicted based on the number of cases in the four previous months and 12 months prior (Model 1); for longer-term prediction, malaria incidence can be predicted using the rates 1 and 12 months prior (Model 2). Next, climate and environmental variables were incorporated to assess whether the predictive power of proposed models could be improved. Enhanced vegetation index was found to have increased the predictive accuracy of longer-term forecasts. Results indicate ARIMA models can be applied to forecast malaria patterns in Afghanistan, complementing current surveillance systems. The models provide a means to better understand malaria dynamics in a resource-limited context with minimal data input, yielding forecasts that can be used for public health planning at the national level.
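
The structure of the longer-term model (Model 2), which predicts incidence from the observations 1 and 12 months prior, can be sketched in pure Python. The seasonal-plus-AR form and the coefficient below are illustrative stand-ins for a fitted ARIMA model, not the paper's estimated parameters.

```python
# Hedged sketch of a forecast built from the rates 1 and 12 months prior:
# take last year's value for the same month and adjust by the most recent
# deviation from seasonality. The coefficient phi is a hypothetical value.

def forecast_next(series, phi=0.6):
    """One-step forecast from the observations 1 and 12 months back."""
    seasonal = series[-12]                 # same month last year
    recent = series[-1] - series[-13]      # last month's deviation from its seasonal value
    return seasonal + phi * recent

def forecast_horizon(series, steps):
    """Iterated multi-step forecasts, feeding predictions back in."""
    history = list(series)
    out = []
    for _ in range(steps):
        y_hat = forecast_next(history)
        out.append(y_hat)
        history.append(y_hat)
    return out
```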

  14. An Efficient Silent Data Corruption Detection Method with Error-Feedback Control and Even Sampling for HPC Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di, Sheng; Berrocal, Eduardo; Cappello, Franck

    The silent data corruption (SDC) problem is attracting more and more attention because it is expected to have a great impact on exascale HPC applications. SDC faults are hazardous in that they pass unnoticed by hardware and can lead to wrong computation results. In this work, we formulate SDC detection as a runtime one-step-ahead prediction method, leveraging multiple linear prediction methods in order to improve the detection results. The contributions are twofold: (1) we propose an error feedback control model that can reduce the prediction errors for different linear prediction methods, and (2) we propose a spatial-data-based even-sampling method to minimize the detection overheads (including memory and computation cost). We implement our algorithms in the fault tolerance interface, a fault tolerance library with multiple checkpoint levels, such that users can conveniently protect their HPC applications against both SDC errors and fail-stop errors. We evaluate our approach by using large-scale traces from well-known, large-scale HPC applications, as well as by running those HPC applications in a real cluster environment. Experiments show that our error feedback control model can improve detection sensitivity by 34-189% for bit-flip memory errors injected with bit positions in the range [20,30], without any degradation in detection accuracy. Furthermore, memory size can be reduced by 33% with our spatial-data even-sampling method, with only a slight and graceful degradation in detection sensitivity.
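
The detection idea can be sketched as follows: predict each value one step ahead by linear extrapolation, correct the prediction with a feedback term on the previous error, and flag a value as a suspected SDC when it falls far outside the prediction. The feedback gain, the threshold, and the in-place repair of flagged values are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch of one-step-ahead SDC detection with error feedback. The extrapolation
# predictor, gain, and threshold are illustrative, not the paper's settings.

def detect_sdc(values, threshold, feedback=0.5):
    values = list(values)
    suspects = []
    last_err = 0.0
    for t in range(2, len(values)):
        predicted = 2 * values[t - 1] - values[t - 2]  # linear extrapolation
        predicted += feedback * last_err               # error-feedback correction
        err = values[t] - predicted
        if abs(err) > threshold:
            suspects.append(t)                          # likely bit-flip / SDC
            values[t] = predicted                       # repair so later predictions stay clean
        else:
            last_err = err                              # update feedback only on clean data
    return suspects
```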

  15. Smartphone dependence classification using tensor factorization.

    PubMed

    Choi, Jingyun; Rho, Mi Jung; Kim, Yejin; Yook, In Hye; Yu, Hwanjo; Kim, Dai-Jin; Choi, In Young

    2017-01-01

    Excessive smartphone use causes personal and social problems. To address this issue, we sought to derive usage patterns that were directly correlated with smartphone dependence based on usage data. This study attempted to classify smartphone dependence using a data-driven prediction algorithm. We developed a mobile application to collect smartphone usage data. A total of 41,683 logs of 48 smartphone users were collected from March 8, 2015, to January 8, 2016. The participants were classified into the control group (SUC) or the addiction group (SUD) using the Korean Smartphone Addiction Proneness Scale for Adults (S-Scale) and a face-to-face offline interview by a psychiatrist and a clinical psychologist (SUC = 23 and SUD = 25). We derived usage patterns using tensor factorization and found the following six optimal usage patterns: 1) social networking services (SNS) during daytime, 2) web surfing, 3) SNS at night, 4) mobile shopping, 5) entertainment, and 6) gaming at night. The membership vectors of the six patterns obtained a significantly better prediction performance than the raw data. For all patterns, the usage times of the SUD were much longer than those of the SUC. From our findings, we concluded that usage patterns and membership vectors were effective tools to assess and predict smartphone dependence and could provide an intervention guideline to predict and treat smartphone dependence based on usage data.

  16. The Evidential Basis of Decision Making in Plant Disease Management.

    PubMed

    Hughes, Gareth

    2017-08-04

    The evidential basis for disease management decision making is provided by data relating to risk factors. The decision process involves an assessment of the evidence leading to taking (or refraining from) action on the basis of a prediction. The primary objective of the decision process is to identify, at the time the decision is made, the control action that provides the best predicted end-of-season outcome, calculated in terms of revenue or another appropriate metric. Data relating to disease risk factors may take a variety of forms (e.g., continuous, discrete, categorical) on measurement scales in a variety of units. Log10-likelihood ratios provide a principled basis for the accumulation of evidence based on such data and allow predictions to be made via Bayesian updating of prior probabilities.
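
Numerically, the updating works as follows: the prior probability is converted to log10-odds, the log10-likelihood ratios of the observed risk factors are added, and the sum is converted back to a posterior probability. Only the mechanism is taken from the abstract; any prior and likelihood-ratio values would come from risk-factor data.

```python
# Bayesian updating via log10-likelihood ratios: evidence from independent
# risk factors accumulates additively on the log-odds scale.
import math

def posterior_probability(prior, log10_lrs):
    """Update a prior disease probability with a list of log10 likelihood ratios."""
    prior_log_odds = math.log10(prior / (1 - prior))
    posterior_log_odds = prior_log_odds + sum(log10_lrs)  # evidence adds on this scale
    odds = 10 ** posterior_log_odds
    return odds / (1 + odds)
```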

  17. A Pathway Based Classification Method for Analyzing Gene Expression for Alzheimer's Disease Diagnosis.

    PubMed

    Voyle, Nicola; Keohane, Aoife; Newhouse, Stephen; Lunnon, Katie; Johnston, Caroline; Soininen, Hilkka; Kloszewska, Iwona; Mecocci, Patrizia; Tsolaki, Magda; Vellas, Bruno; Lovestone, Simon; Hodges, Angela; Kiddle, Steven; Dobson, Richard Jb

    2016-01-01

    Recent studies indicate that gene expression levels in blood may be able to differentiate subjects with Alzheimer's disease (AD) from normal elderly controls and mild cognitively impaired (MCI) subjects. However, there is limited replicability at the single marker level. A pathway-based interpretation of gene expression may prove more robust. This study aimed to investigate whether a case/control classification model built on pathway level data was more robust than a gene level model and might consequently perform better in test data. The study used two batches of gene expression data from the AddNeuroMed (ANM) and Dementia Case Registry (DCR) cohorts. Our study used Illumina Human HT-12 Expression BeadChips to collect gene expression from blood samples. Random forest modeling with recursive feature elimination was used to predict case/control status. Age and APOE ɛ4 status were used as covariates for all analyses. Gene and pathway level models performed similarly to each other and to a model based on demographic information only. Any potential increase in concordance from the novel pathway level approach used here has not led to greater predictive ability in these datasets. However, we have only tested one method for creating pathway level scores. Further, we have been able to benchmark pathways against genes in datasets that had been extensively harmonized. Further work should focus on the use of alternative methods for creating pathway level scores, in particular those that incorporate pathway topology, and the use of an endophenotype-based approach.

  18. Finite element thermal analysis of multispectral coatings for the ABL

    NASA Astrophysics Data System (ADS)

    Shah, Rashmi S.; Bettis, Jerry R.; Stewart, Alan F.; Bonsall, Lynn; Copland, James; Hughes, William; Echeverry, Juan C.

    1999-04-01

    The thermal response of a coated optical surface is an important consideration in the design of any high average power system. Finite element temperature distributions were calculated for both coating witness samples and calorimetry wafers and were compared to actual measured data under tightly controlled conditions. Coatings for ABL were deposited on various substrates including fused silica, ULE, Zerodur, and silicon. The witness samples were irradiated at high power levels at 1.315 micrometers to evaluate laser damage thresholds and study absorption levels. Excellent agreement was obtained between temperature predictions and measured thermal response curves. When measured absorption values were not available, the code was used to predict coating absorption based on the measured temperature rise on the back surface. Using the finite element model, the damaging temperature rise can be predicted for a coating with known absorption based on run time, flux, and substrate material.

  19. Normalized coffin-manson plot in terms of a new life function based on stress relaxation under creep-fatigue conditions

    NASA Astrophysics Data System (ADS)

    Jeong, Chang Yeol; Nam, Soo Woo; Lim, Jong Dae

    2003-04-01

    A new life prediction function based on a model formulated in terms of stress relaxation during hold time under creep-fatigue conditions is proposed. From the idea that the reduction in fatigue life with hold time is due to the creep effect of stress relaxation, which results in additional energy dissipation in the hysteresis loop, it is suggested that the relaxed stress range may serve as a creep-fatigue damage function. Creep-fatigue data from the present and other investigators are used to check the validity of the proposed life prediction equation. It is shown that the data satisfy the applicability of the life relation model. Accordingly, using this life prediction model, one may realize that all the Coffin-Manson plots at various levels of hold time in strain-controlled creep-fatigue tests can be normalized to a single straight line.
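
For concreteness, a generic Coffin-Manson life calculation is sketched below. The coefficient and exponent are hypothetical, not fitted values from the paper; the paper's contribution is precisely to replace the bare plastic strain range with a relaxed-stress-based damage function so that curves at different hold times collapse onto one line.

```python
# Generic Coffin-Manson relation: plastic_strain_range * N_f**alpha = c.
# The values of c and alpha below are hypothetical placeholders.

def coffin_manson_life(plastic_strain_range, c=0.5, alpha=0.6):
    """Cycles to failure N_f from the Coffin-Manson relation."""
    return (c / plastic_strain_range) ** (1 / alpha)
```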

  20. A Nasal Brush-based Classifier of Asthma Identified by Machine Learning Analysis of Nasal RNA Sequence Data.

    PubMed

    Pandey, Gaurav; Pandey, Om P; Rogers, Angela J; Ahsen, Mehmet E; Hoffman, Gabriel E; Raby, Benjamin A; Weiss, Scott T; Schadt, Eric E; Bunyavanich, Supinda

    2018-06-11

    Asthma is a common, under-diagnosed disease affecting all ages. We sought to identify a nasal brush-based classifier of mild/moderate asthma. 190 subjects with mild/moderate asthma and controls underwent nasal brushing and RNA sequencing of nasal samples. A machine learning-based pipeline identified an asthma classifier consisting of 90 genes interpreted via an L2-regularized logistic regression classification model. This classifier performed with strong predictive value and sensitivity across eight test sets, including (1) a test set of independent asthmatic and control subjects profiled by RNA sequencing (positive and negative predictive values of 1.00 and 0.96, respectively; AUC of 0.994), (2) two independent case-control cohorts of asthma profiled by microarray, and (3) five cohorts with other respiratory conditions (allergic rhinitis, upper respiratory infection, cystic fibrosis, smoking), where the classifier had a low to zero misclassification rate. Following validation in large, prospective cohorts, this classifier could be developed into a nasal biomarker of asthma.

  1. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Error-Prone Model Derived from 1978-1979 Quality Control Study. Data Report. [Task 3.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; Kuchak, JoAnn

    An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…

  2. Analytical design and evaluation of an active control system for helicopter vibration reduction and gust response alleviation

    NASA Technical Reports Server (NTRS)

    Taylor, R. B.; Zwicke, P. E.; Gold, P.; Miao, W.

    1980-01-01

    An analytical study was conducted to define the basic configuration of an active control system for helicopter vibration and gust response alleviation. The study culminated in a control system design with two separate loops: a narrow band loop for vibration reduction and a wider band loop for gust response alleviation. The narrow band vibration loop utilizes the standard swashplate control configuration for its control inputs. The controller for the vibration loop is based on adaptive optimal control theory and is designed to adapt to any flight condition, including maneuvers and transients. The prime characteristic of the vibration control system is its real-time capability. The gust alleviation control system studied consists of optimal sampled data feedback gains together with an optimal one-step-ahead prediction. The prediction permits estimation of the gust disturbance, which can then be used to minimize the gust effects on the helicopter.

  3. Use of Linear Prediction Uncertainty Analysis to Guide Conditioning of Models Simulating Surface-Water/Groundwater Interactions

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; White, J.; Doherty, J.

    2011-12-01

    Linear prediction uncertainty analysis in a Bayesian framework was applied to guide the conditioning of an integrated surface water/groundwater model that will be used to predict the effects of groundwater withdrawals on surface-water and groundwater flows. Linear prediction uncertainty analysis is an effective approach for identifying (1) raw and processed data most effective for model conditioning prior to inversion, (2) specific observations and periods of time critically sensitive to specific predictions, and (3) additional observation data that would reduce model uncertainty relative to specific predictions. We present results for a two-dimensional groundwater model of a 2,186 km2 area of the Biscayne aquifer in south Florida implicitly coupled to a surface-water routing model of the actively managed canal system. The model domain includes 5 municipal well fields withdrawing more than 1 Mm3/day and 17 operable surface-water control structures that control freshwater releases from the Everglades and freshwater discharges to Biscayne Bay. More than 10 years of daily observation data from 35 groundwater wells and 24 surface water gages are available to condition model parameters. A dense parameterization was used to fully characterize the contribution of the inversion null space to predictive uncertainty and included bias-correction parameters. This approach allows better resolution of the boundary between the inversion null space and solution space. Bias-correction parameters (e.g., rainfall, potential evapotranspiration, and structure flow multipliers) absorb information that is present in structural noise that may otherwise contaminate the estimation of more physically-based model parameters. This allows greater precision in predictions that are entirely solution-space dependent, and reduces the propensity for bias in predictions that are not. 
Results show that application of this analysis is an effective means of identifying those surface-water and groundwater data, both raw and processed, that minimize predictive uncertainty, while simultaneously identifying the maximum solution-space dimensionality of the inverse problem supported by the data.

  4. Aircraft noise prediction program user's manual

    NASA Technical Reports Server (NTRS)

    Gillian, R. E.

    1982-01-01

    The Aircraft Noise Prediction Program (ANOPP) predicts aircraft noise with the best methods available. This manual is designed to give the user an understanding of the capabilities of ANOPP and to show how to formulate problems and obtain solutions by using these capabilities. Sections within the manual document basic ANOPP concepts, ANOPP usage, ANOPP functional modules, the ANOPP control statement procedure library, and the ANOPP permanent data base. Appendixes to the manual include information on preparing job decks for the operating systems in use, error diagnostics and recovery techniques, and a glossary of ANOPP terms.

  5. ANOPP programmer's reference manual for the executive System. [aircraft noise prediction program

    NASA Technical Reports Server (NTRS)

    Gillian, R. E.; Brown, C. G.; Bartlett, R. W.; Baucom, P. H.

    1977-01-01

    Documentation for the Aircraft Noise Prediction Program as of release level 01/00/00 is presented in a manual designed for programmers who need to understand the internal design and logical concepts of the executive system software. Emphasis is placed on providing sufficient information to modify the system for enhancements or error correction. The ANOPP executive system includes software related to operating system interface, executive control, and data base management for the Aircraft Noise Prediction Program. It is written in Fortran IV for use on the CDC Cyber series of computers.

  6. Analysis of Air Traffic Track Data with the AutoBayes Synthesis System

    NASA Technical Reports Server (NTRS)

    Schumann, Johann Martin Philip; Cate, Karen; Lee, Alan G.

    2010-01-01

    The Next Generation Air Traffic System (NGATS) is aiming to provide substantial computer support for air traffic controllers. Algorithms for the accurate prediction of aircraft movements are of central importance for such software systems, but trajectory prediction has to work reliably in the presence of unknown parameters and uncertainties. We are using the AutoBayes program synthesis system to generate customized data analysis algorithms that process large sets of aircraft radar track data in order to estimate parameters and uncertainties. In this paper, we present how the tasks of finding structure in track data, estimating important parameters in climb trajectories, and detecting continuous descent approaches can be accomplished with compact task-specific AutoBayes specifications. We present an overview of the AutoBayes architecture and describe how its schema-based approach generates customized analysis algorithms, documented C/C++ code, and detailed mathematical derivations. Results of experiments with actual air traffic control data are discussed.

  7. Trajectory-Based Complexity (TBX): A Modified Aircraft Count to Predict Sector Complexity During Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Lee, Paul U.

    2011-01-01

    In this paper we introduce a new complexity metric to predict, in real time, sector complexity for trajectory-based operations (TBO). TBO will be implemented in the Next Generation Air Transportation System (NextGen). Trajectory-Based Complexity (TBX) is a modified aircraft count that can easily be computed and communicated in a TBO environment based upon predictions of aircraft and weather trajectories. TBX is scaled to aircraft count and represents an alternate and additional means to manage air traffic demand and capacity with more consideration of dynamic factors, such as weather, aircraft equipage, or predicted separation violations, as well as static factors such as sector size. We have developed and evaluated TBX in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center during human-in-the-loop studies of trajectory-based concepts since 2009. In this paper we describe the TBX computation in detail and present the underlying algorithm. Next, we describe the specific TBX used in an experiment at NASA's AOL. We evaluate the performance of this metric using data collected during a controller-in-the-loop study on trajectory-based operations at different equipage levels. In this study controllers were prompted at regular intervals to rate their current workload on a numeric scale. Comparing these real-time workload ratings to the TBX values predicted for the same periods, we demonstrate that TBX is a better predictor of workload than aircraft count. Furthermore, we demonstrate that TBX is well suited for complexity management in TBO and can easily be adjusted to future operational concepts.
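
To make the notion of a "modified aircraft count" concrete, here is a deliberately simplified sketch: start from the raw count, add weighted contributions for dynamic factors, and scale for static ones. The factor names and weights are invented for illustration; the actual TBX algorithm is the one described in the paper.

```python
# Toy TBX-style metric: aircraft count plus weighted dynamic factors, scaled
# for sector size. All weights and factor names here are hypothetical.

def tbx(n_aircraft, n_unequipped, n_conflicts, weather_cells, sector_scale=1.0):
    raw = (n_aircraft
           + 0.5 * n_unequipped      # extra workload for voice-only aircraft
           + 1.0 * n_conflicts       # predicted separation violations
           + 0.3 * weather_cells)    # convective weather in the sector
    return raw * sector_scale         # normalize for sector size
```

Because the result stays on the scale of an aircraft count, it can be communicated and reasoned about the same way as today's demand metrics.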

  8. Predictive control and estimation algorithms for the NASA/JPL 70-meter antennas

    NASA Technical Reports Server (NTRS)

    Gawronski, W.

    1991-01-01

    A modified output prediction procedure and a new controller design based on the predictive control law are presented. In addition, a new predictive estimator is developed to complement the controller and to enhance system performance. The predictive controller is designed and applied to the tracking control of the Deep Space Network 70-m antennas. Simulation results show significant improvement in tracking performance over the linear quadratic controller and estimator presently in use.
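
    The core idea of a predictive control law can be shown with a minimal one-step receding-horizon sketch for a scalar first-order model; this is a generic illustration, not the antenna controller itself.

```python
def predictive_step(y, r, a, b, lam=0.1):
    """One-step receding-horizon control for the model y[k+1] = a*y[k] + b*u[k]:
    minimize (y[k+1] - r)^2 + lam*u^2 over u, which has the closed-form
    minimizer u = b*(r - a*y) / (b^2 + lam)."""
    return b * (r - a * y) / (b * b + lam)

# Track a unit reference with an illustrative first-order plant
a, b, r = 0.9, 0.5, 1.0
y = 0.0
for _ in range(50):
    u = predictive_step(y, r, a, b)
    y = a * y + b * u
# Steady state: y -> b^2 * r / (b^2 + lam * (1 - a))
```

    The control penalty `lam` trades tracking accuracy against actuator effort; with `lam > 0` the closed loop settles slightly below the reference, which is the usual price of penalizing control energy.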

  9. Sex differences in theory-based predictors of leisure time physical activity in a population-based sample of adults with spinal cord injury.

    PubMed

    Stapleton, Jessie N; Martin Ginis, Kathleen A

    2014-09-01

    To examine sex differences in theory-based predictors of leisure time physical activity (LTPA) among men and women with spinal cord injury and, secondarily, to identify factors that might explain any sex differences in social cognitions. Design: a secondary analysis of Study of Health and Activity in People with Spinal Cord Injury survey data. Setting: community. Participants: community-dwelling men (n=536) and women (n=164) recruited from 4 rehabilitation and research centers. Interventions: not applicable. Main outcome measures: subjective norms, attitudes, barrier self-efficacy, perceived controllability (PC), and intentions. Results: men had stronger PC and barrier self-efficacy than women. Hierarchical regression analyses revealed that social support significantly predicted PC for both sexes, and that health, pain, and physical independence also significantly predicted PC for men. Social support, health, and pain significantly predicted barrier self-efficacy for men; social support was the only significant predictor of barrier self-efficacy for women. Conclusions: women felt significantly less control over their physical activity behavior and had lower confidence to overcome barriers to physical activity than did men. Although social support predicted PC and barrier self-efficacy in both men and women, men seemed to take additional factors into consideration when formulating their control beliefs for LTPA. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  10. Simplification of a light-based model for estimating final internode length in greenhouse cucumber canopies.

    PubMed

    Kahlen, Katrin; Stützel, Hartmut

    2011-10-01

    Light quantity and quality affect internode lengths in cucumber (Cucumis sativus), whereby leaf area and the optical properties of the leaves mainly control light quality within a cucumber plant community. This modelling study aimed at providing a simple, non-destructive method to predict final internode lengths (FILs) using light quantity and leaf area data. Several simplifications of a light quantity and quality sensitive model for estimating FILs in cucumber have been tested. The direct simplifications substitute the term for the red : far-red (R : FR) ratios by a term for (a) the leaf area index (LAI, m^2 m^-2) or (b) partial LAI, the cumulative leaf area per m^2 ground, where leaf area per m^2 ground is accumulated from the top of each plant until a number, n, of leaves per plant is reached. The indirect simplifications estimate the input R : FR ratio based on partial leaf area and plant density. In all models, simulated FILs were in line with the measured FILs over various canopy architectures and light conditions, but the prediction quality varied. The indirect simplification based on the leaf area of ten leaves revealed the best fit with measured data. Its prediction quality was even higher than that of the original model. This study showed that for vertically trained cucumber plants, leaf area data can substitute for local light quality data when estimating FILs. In unstressed canopies, leaf area over the upper ten ranks seems to represent the feedback of the growing architecture on internode elongation with respect to light quality. This highlights the role of this domain of leaves as the primary source for the specific R : FR signal controlling the final length of an internode and could therefore guide future research on up-scaling local processes to the crop level.

  11. The Interaction between Vector Life History and Short Vector Life in Vector-Borne Disease Transmission and Control.

    PubMed

    Brand, Samuel P C; Rock, Kat S; Keeling, Matt J

    2016-04-01

    Epidemiological modelling has a vital role to play in policy planning and prediction for the control of vectors, and hence the subsequent control of vector-borne diseases. To decide between competing policies requires models that can generate accurate predictions, which in turn requires accurate knowledge of vector natural histories. Here we highlight the importance of the distribution of times between life-history events, using short-lived midge species as an example. In particular we focus on the distribution of the extrinsic incubation period (EIP) which determines the time between infection and becoming infectious, and the distribution of the length of the gonotrophic cycle which determines the time between successful bites. We show how different assumptions for these periods can radically change the basic reproductive ratio (R0) of an infection and additionally the impact of vector control on the infection. These findings highlight the need for detailed entomological data, based on laboratory experiments and field data, to correctly construct the next-generation of policy-informing models.
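
    The sensitivity to the EIP distribution can be made concrete: only vectors that survive the EIP ever transmit, so R0 scales with that survival fraction, and the fraction differs sharply between a fixed-length EIP and an exponentially distributed EIP with the same mean. The mortality rate and EIP below are illustrative numbers, not parameters from the paper.

```python
import math

def p_infectious_fixed(mu, eip):
    """Fraction of vectors surviving a fixed-length EIP
    under a constant per-day mortality rate mu."""
    return math.exp(-mu * eip)

def p_infectious_exponential(mu, eip_mean):
    """Same quantity when the EIP is exponentially distributed with
    the same mean (competing exponentials): sigma / (sigma + mu),
    where sigma = 1 / mean EIP."""
    sigma = 1.0 / eip_mean
    return sigma / (sigma + mu)

mu, eip = 0.2, 10.0   # illustrative midge mortality (per day) and mean EIP (days)
fixed = p_infectious_fixed(mu, eip)        # exp(-2) ~ 0.135
expo = p_infectious_exponential(mu, eip)   # 1/3 ~ 0.333
# Same mean EIP, same mortality: the distributional assumption alone
# changes the surviving-to-infectious fraction (and hence R0) ~2.5-fold.
```

    For short-lived vectors such as midges, mu * EIP is large, which is exactly the regime where the two assumptions diverge most.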

  12. Drug disposition and modelling before and after gastric bypass: immediate and controlled-release metoprolol formulations.

    PubMed

    Gesquiere, Ina; Darwich, Adam S; Van der Schueren, Bart; de Hoon, Jan; Lannoo, Matthias; Matthys, Christophe; Rostami, Amin; Foulon, Veerle; Augustijns, Patrick

    2015-11-01

    The aim of the present study was to evaluate the disposition of metoprolol after oral administration of an immediate and controlled-release formulation before and after Roux-en-Y gastric bypass (RYGB) surgery in the same individuals and to validate a physiologically based pharmacokinetic (PBPK) model for predicting oral bioavailability following RYGB. A single-dose pharmacokinetic study of metoprolol tartrate 200 mg immediate release and controlled release was performed in 14 volunteers before and 6-8 months after RYGB. The observed data were compared with predicted results from the PBPK modelling and simulation of metoprolol tartrate immediate and controlled-release formulation before and after RYGB. After administration of metoprolol immediate and controlled release, no statistically significant difference in the observed area under the curve (AUC(0-24 h)) was shown, although a tendency towards an increased oral exposure could be observed as the AUC(0-24 h) was 32.4% [95% confidence interval (CI) 1.36, 63.5] and 55.9% (95% CI 5.73, 106) higher following RYGB for the immediate and controlled-release formulation, respectively. This could be explained by surgery-related weight loss and a reduced presystemic biotransformation in the proximal gastrointestinal tract. The PBPK values predicted by modelling and simulation were similar to the observed data, confirming its validity. The disposition of metoprolol from an immediate-release and a controlled-release formulation was not significantly altered after RYGB; there was a tendency to an increase, which was also predicted by PBPK modelling and simulation. © 2015 The British Pharmacological Society.

  14. Core Engine Noise Control Program. Volume III. Prediction Methods

    DTIC Science & Technology

    1974-08-01

    turbofan engines, and Method (C) is based on an analytical description of viscous wake interaction between adjoining blade rows. Turbine Tone/Jet ... levels for turbojet, turboshaft and turbofan engines. The turbojet data correlate highest and the turbofan data correlate lowest. Turbine Noise ... different engines were examined for combustor, jet and fan noise. Three turbojet, two turboshaft and two turbofan

  15. The moderating effect of gender on the relationship between coping and suicide risk in a Portuguese community sample of adults.

    PubMed

    Campos, Rui C; Holden, Ronald R; Costa, Fátima; Oliveira, Ana Rita; Abreu, Marta; Fresca, Natália

    2017-02-01

    Background and aim(s): The study evaluated the contribution of coping strategies, based on the Toulousiane conceptualization of coping, to the prediction of suicide risk and tested the moderating effect of gender, controlling for depressive symptoms. A two-time data collection design was used. A community sample of 195 adults (91 men and 104 women) ranging in age from 19 to 65 years and living in several Portuguese regions, mostly in Alentejo, participated in this research. Gender, depressive symptoms, control, and withdrawal and conversion significantly predicted suicide risk, and gender interacted with control, withdrawal and conversion, and social distraction in the prediction of suicide risk. Coping predicted suicide risk only for women. Results have important implications for assessment and intervention with suicide at-risk individuals. In particular, the evaluation and development of coping skills is indicated as a goal for therapists having suicide at-risk women as clients.

  16. Multivariate Radiological-Based Models for the Prediction of Future Knee Pain: Data from the OAI

    PubMed Central

    Galván-Tejada, Jorge I.; Celaya-Padilla, José M.; Treviño, Victor; Tamez-Peña, José G.

    2015-01-01

    In this work, the potential of X-ray based multivariate prognostic models to predict the onset of chronic knee pain is presented. Using X-ray quantitative image assessments of joint-space-width (JSW) and paired semiquantitative central X-ray scores from the Osteoarthritis Initiative (OAI), a case-control study is presented. The pain assessments of the right knee at the baseline and the 60-month visits were used to screen for case/control subjects. Scores were analyzed at the time of pain incidence (T-0), the year prior to incidence (T-1), and two years before pain incidence (T-2). Multivariate models were created by a cross-validated, elastic-net-regularized generalized linear model feature-selection tool. Univariate differences between cases and controls were reported by AUC, C-statistics, and odds ratios. Univariate analysis indicated that the medial osteophytes were significantly more prevalent in cases than controls: C-stat 0.62, 0.62, and 0.61, at T-0, T-1, and T-2, respectively. The multivariate JSW models significantly predicted pain: AUC = 0.695, 0.623, and 0.620, at T-0, T-1, and T-2, respectively. Semiquantitative multivariate models predicted pain with C-stat = 0.671, 0.648, and 0.645 at T-0, T-1, and T-2, respectively. Multivariate models derived from plain X-ray radiography assessments may be used to predict subjects that are at risk of developing knee pain. PMID:26504490
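
    The C-statistic reported for the univariate comparisons has a simple rank interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch, using toy scores rather than OAI data:

```python
def c_statistic(case_scores, control_scores):
    """Concordance (C-statistic / AUC): fraction of case-control pairs in
    which the case scores higher than the control, counting ties as 0.5."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Toy osteophyte-grade-like scores: cases shifted slightly above controls
cases = [2, 3, 3, 4, 5]
controls = [1, 2, 2, 3, 4]
auc = c_statistic(cases, controls)
```

    A value of 0.5 means no discrimination and 1.0 means perfect separation, which puts the study's C-stats of roughly 0.61-0.70 in context: modest but consistent predictive signal.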

  17. Empirical predictions of hypervelocity impact damage to the space station

    NASA Technical Reports Server (NTRS)

    Rule, W. K.; Hayashida, K. B.

    1991-01-01

    A family of user-friendly, DOS PC-based Microsoft BASIC programs written to provide spacecraft designers with empirical predictions of space debris damage to orbiting spacecraft is described. The spacecraft wall configuration is assumed to consist of multilayer insulation (MLI) placed between a Whipple-style bumper and the pressure wall. Predictions are based on data sets of experimental results obtained by simulating debris impacts on spacecraft using light gas guns on Earth. A module of the program facilitates the creation of the database of experimental results that is used by the damage prediction modules of the code. The user has the choice of three different prediction modules to predict damage to the bumper, the MLI, and the pressure wall. One prediction module is based on fitting low-order polynomials through subsets of the experimental data. Another prediction module fits functions based on nondimensional parameters through the data. The last prediction technique is a unique approach that is based on weighting the experimental data according to the distance from the design point.
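
    The last technique, weighting experimental data by distance from the design point, can be sketched generically as inverse-distance weighting. This is an assumption-laden illustration of the idea, not the specific weighting scheme used in the BASIC programs, and the shot data are invented.

```python
import math

def idw_predict(design_point, experiments, power=2.0):
    """Inverse-distance-weighted prediction: each experimental shot votes
    on the damage at the design point, weighted by its proximity in the
    space of test parameters."""
    num = den = 0.0
    for point, damage in experiments:
        d = math.dist(design_point, point)
        if d == 0.0:
            return damage          # exact replicate of an experiment
        w = 1.0 / d ** power
        num += w * damage
        den += w
    return num / den

# Hypothetical shots: (impact velocity km/s, projectile diameter cm) -> hole diameter cm
shots = [((6.0, 0.4), 1.1), ((7.0, 0.5), 1.6), ((5.0, 0.3), 0.8)]
pred = idw_predict((6.5, 0.45), shots)
```

    Unlike a global polynomial fit, this kind of local weighting lets nearby experiments dominate the prediction and degrades gracefully when the design point sits between tested conditions.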

  18. Development and validation of a melanoma risk score based on pooled data from 16 case-control studies

    PubMed Central

    Davies, John R; Chang, Yu-mei; Bishop, D Timothy; Armstrong, Bruce K; Bataille, Veronique; Bergman, Wilma; Berwick, Marianne; Bracci, Paige M; Elwood, J Mark; Ernstoff, Marc S; Green, Adele; Gruis, Nelleke A; Holly, Elizabeth A; Ingvar, Christian; Kanetsky, Peter A; Karagas, Margaret R; Lee, Tim K; Le Marchand, Loïc; Mackie, Rona M; Olsson, Håkan; Østerlind, Anne; Rebbeck, Timothy R; Reich, Kristian; Sasieni, Peter; Siskind, Victor; Swerdlow, Anthony J; Titus, Linda; Zens, Michael S; Ziegler, Andreas; Gallagher, Richard P.; Barrett, Jennifer H; Newton-Bishop, Julia

    2015-01-01

    Background We report the development of a cutaneous melanoma risk algorithm based upon seven factors: hair colour, skin type, family history, freckling, nevus count, number of large nevi and history of sunburn, intended to form the basis of a self-assessment webtool for the general public. Methods Predicted odds of melanoma were estimated by analysing a pooled dataset from 16 case-control studies using logistic random coefficients models. Risk categories were defined based on the distribution of the predicted odds in the controls from these studies. Imputation was used to estimate missing data in the pooled datasets. The 30th, 60th and 90th centiles were used to distribute individuals into four risk groups for their age, sex and geographic location. Cross-validation was used to test the robustness of the thresholds for each group by leaving out each study one by one. Performance of the model was assessed in an independent UK case-control study dataset. Results Cross-validation confirmed the robustness of the threshold estimates. Cases and controls were well discriminated in the independent dataset (area under the curve 0.75, 95% CI 0.73-0.78). 29% of cases were in the highest risk group compared with 7% of controls, and 43% of controls were in the lowest risk group compared with 13% of cases. Conclusion We have identified a composite score representing an estimate of relative risk and successfully validated this score in an independent dataset. Impact This score may be a useful tool to inform members of the public about their melanoma risk. PMID:25713022
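
    The centile-based grouping step is mechanical once the control distribution is in hand: cut at the 30th, 60th and 90th centiles and count how many cuts the predicted odds exceed. The sketch below mirrors that description with a toy control distribution, not study data.

```python
import math

def percentile(vals, p):
    """Nearest-rank percentile (0 < p <= 100) of a list of values."""
    ordered = sorted(vals)
    idx = max(0, math.ceil(p / 100.0 * len(ordered)) - 1)
    return ordered[idx]

def risk_group(predicted_odds, control_odds):
    """Assign one of four risk groups using the 30th, 60th and 90th
    centiles of the control distribution, as described for the score."""
    cuts = [percentile(control_odds, p) for p in (30, 60, 90)]
    return 1 + sum(1 for cut in cuts if predicted_odds > cut)

controls = [i / 100.0 for i in range(1, 101)]   # toy predicted odds, 0.01..1.00
```

    In the published algorithm the thresholds are specific to age, sex and geographic location, so a real implementation would hold one set of cuts per stratum rather than a single control distribution.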

  19. Nonprincipal plane scattering of flat plates and pattern control of horn antennas

    NASA Technical Reports Server (NTRS)

    Balanis, Constantine A.; Polka, Lesley A.; Liu, Kefeng

    1989-01-01

    Using the geometrical theory of diffraction, the traditional method of high frequency scattering analysis, the prediction of the radar cross section of a perfectly conducting, flat, rectangular plate is limited to principal planes. Part A of this report predicts the radar cross section in nonprincipal planes using the method of equivalent currents. This technique is based on an asymptotic end-point reduction of the surface radiation integrals for an infinite wedge and enables nonprincipal plane prediction. The predicted radar cross sections for both horizontal and vertical polarizations are compared to moment method results and experimental data from Arizona State University's anechoic chamber. In part B, a variational calculus approach to the pattern control of the horn antenna is outlined. The approach starts with the optimization of the aperture field distribution so that the control of the radiation pattern in a range of directions can be realized. A control functional is thus formulated. Next, a spectral analysis method is introduced to solve for the eigenfunctions from the extremal condition of the formulated functional. Solutions to the optimized aperture field distribution are then obtained.

  20. Machine learning of structural magnetic resonance imaging predicts psychopathic traits in adolescent offenders.

    PubMed

    Steele, Vaughn R; Rao, Vikram; Calhoun, Vince D; Kiehl, Kent A

    2017-01-15

    Classification models are becoming useful tools for finding patterns in neuroimaging data sets that are not observable to the naked eye. Many of these models are applied to discriminating clinical groups such as schizophrenic patients from healthy controls or from patients with bipolar disorder. A more nuanced model might be to discriminate between levels of personality traits. Here, as a proof of concept, we take an initial step toward developing prediction models to differentiate individuals based on a personality disorder: psychopathy. We included three groups of adolescent participants: incarcerated youth with elevated psychopathic traits (i.e., callous and unemotional traits and conduct disordered traits; n=71), incarcerated youth with low psychopathic traits (n=72), and non-incarcerated youth as healthy controls (n=21). Support vector machine (SVM) learning models were developed to separate these groups using an out-of-sample cross-validation method on voxel-based morphometry (VBM) data. Regions of interest from the paralimbic system, identified in an independent forensic sample, were successful in differentiating youth groups. Models seeking to classify incarcerated individuals to have high or low psychopathic traits achieved 69.23% overall accuracy. As expected, accuracy increased in models differentiating healthy controls from individuals with high psychopathic traits (82.61%) and low psychopathic traits (80.65%). Here we have laid the foundation for using neural correlates of personality traits to identify group membership within and beyond psychopathy. This is only the first step, of many, toward prediction models using neural measures as a proxy for personality traits. As these methods are improved, prediction models with neural measures of personality traits could have far-reaching impact on diagnosis, treatment, and prediction of future behavior. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. An Autonomous Flight Safety System

    DTIC Science & Technology

    2008-11-01

    are taken. AFSS can take vehicle navigation data from redundant onboard sensors and make flight termination decisions using software-based rules ... implemented on redundant flight processors. By basing these decisions on actual Instantaneous Impact Predictions and by providing for an arbitrary ... number of mission rules, it is the contention of the AFSS development team that the decision making process used by Missile Flight Control Officers

  3. A plasma rotation control scheme for NSTX and NSTX-U

    NASA Astrophysics Data System (ADS)

    Goumiri, Imene

    2016-10-01

    Plasma rotation has been proven to play a key role in stabilizing large scale instabilities and improving plasma confinement by suppressing micro-turbulence. A model-based feedback system that controls the plasma rotation profile on the National Spherical Torus Experiment (NSTX) and its upgrade (NSTX-U) is presented. The first part of this work uses experimental measurements from NSTX as a starting point and models the control of plasma rotation using two different types of actuation: momentum from injected neutral beams and neoclassical toroidal viscosity generated by three-dimensional applied magnetic fields. Whether based on the data-driven model for NSTX or purely predictive modeling for NSTX-U, a reduced-order model-based feedback controller was designed. Predictive simulations using the TRANSP plasma transport code with the actuator input determined by the controller (controller-in-the-loop) show that the controller drives the plasma's rotation to the desired profiles in less than 100 ms given practical constraints on the actuators and the available real-time rotation measurements. This is the first time that TRANSP has been used as a plasma simulator in a closed-loop feedback test. Another approach, controlling the toroidal rotation profile and βN simultaneously, is then shown for NSTX-U. For this case, the neutral beams (actuators) have been augmented in the modeling to match the upgrade, which spreads the injection throughout the edge of the plasma. Control robustness in stability and performance has then been tested and used to predict the limits of the resulting controllers when the energy confinement time (τE) and the momentum diffusivity coefficient (χϕ) vary.

  4. Real-time Upstream Monitoring System: Using ACE Data to Predict the Arrival of Interplanetary Shocks

    NASA Astrophysics Data System (ADS)

    Donegan, M. M.; Wagstaff, K. L.; Ho, G. C.; Vandegriff, J.

    2003-12-01

    We have developed an algorithm to predict Earth arrival times for interplanetary (IP) shock events originating at the Sun. Our predictions are generated from real-time data collected by the Electron, Proton, and Alpha Monitor (EPAM) instrument on NASA's Advanced Composition Explorer (ACE) spacecraft. The high intensities of energetic ions that occur prior to and during an IP shock pose a radiation hazard to astronauts as well as to electronics in Earth orbit. The potential to predict such events is based on characteristic signatures in the Energetic Storm Particle (ESP) event ion intensities, which are often associated with IP shocks. We have previously reported on the development and implementation of an algorithm to forecast the arrival of ESP events. Historical ion data from ACE/EPAM was used to train an artificial neural network which uses the signature of an approaching event to predict the time remaining until the shock arrives. Tests on the trained network have been encouraging, with an average error of 9.4 hours for predictions made 24 hours in advance, and a reduced average error of 4.9 hours when the shock is 12 hours away. The prediction engine has been integrated into a web-based system that uses real-time ACE/EPAM data provided by the NOAA Space Environment Center (http://sd-www.jhuapl.edu/UPOS/RISP/index.html). This system continually processes the latest ACE data, reports whether or not there is an impending shock, and predicts the time remaining until the shock arrival. Our predictions are updated every five minutes and provide significant lead-time, thereby supplying critical information that can be used by mission planners, satellite operations controllers, and scientists. We have continued to refine the prediction capabilities of this system; in addition to forecasting arrival times for shocks, we now provide confidence estimates for those predictions.

  5. Characterization and Design of Digital Pointing Subsystem for Optical Communication Demonstrator

    NASA Technical Reports Server (NTRS)

    Racho, C.; Portillo, A.

    1998-01-01

    The Optical Communications Demonstrator (OCD) is a laboratory-based lasercom demonstration terminal designed to validate several key technologies, including beacon acquisition, high-bandwidth tracking, precision beam pointing, and point-ahead compensation functions. It has been under active development over the past few years. The instrument uses a CCD array detector for both spatial acquisition and high-bandwidth tracking, and a fiber-coupled laser transmitter. The array detector tracking concept provides wide field-of-view acquisition and permits effective platform jitter compensation and point-ahead control using only one steering mirror. This paper describes the detailed design and characterization of the digital control loop system, which includes the Fast Steering Mirror (FSM), the CCD image tracker, and the associated electronics. The objective is to improve the overall system performance using laboratory-measured data. The design of the digital control loop is based on a linear time-invariant open-loop model. The closed-loop performance is predicted using the theoretical model. With the digital filter programmed into the OCD control software, data is collected to verify the predictions. This paper presents the results of the system modeling and performance analysis. It has been shown that measurement data closely match theoretical predictions. An important part of the laser communication experiment is the ability of the FSM to track the laser beacon within the required tolerances. The pointing must be maintained to an accuracy that is much smaller than the transmit signal beamwidth. For an Earth-orbit distance, the system must be able to track the receiving station to within a few microradians. Failure to do so will result in severely degraded system performance.

  6. The Impact of Trajectory Prediction Uncertainty on Air Traffic Controller Performance and Acceptability

    NASA Technical Reports Server (NTRS)

    Mercer, Joey S.; Bienert, Nancy; Gomez, Ashley; Hunt, Sarah; Kraut, Joshua; Martin, Lynne; Morey, Susan; Green, Steven M.; Prevot, Thomas; Wu, Minghong G.

    2013-01-01

    A Human-In-The-Loop air traffic control simulation investigated the impact of uncertainties in trajectory predictions on NextGen Trajectory-Based Operations concepts, seeking to understand when the automation would become unacceptable to controllers or when performance targets could no longer be met. Retired air traffic controllers staffed two en route transition sectors, delivering arrival traffic to the northwest corner-post of Atlanta approach control under time-based metering operations. Using trajectory-based decision-support tools, the participants worked the traffic under varying levels of wind forecast error and aircraft performance model error, impacting the ground automation's ability to make accurate predictions. Results suggest that the controllers were able to maintain high levels of performance, despite even the highest levels of trajectory prediction errors.

  7. A Low Cost Mobile Robot Based on Proportional Integral Derivative (PID) Control System and Odometer for Education

    NASA Astrophysics Data System (ADS)

    Haq, R.; Prayitno, H.; Dzulkiflih; Sucahyo, I.; Rahmawati, E.

    2018-03-01

    In this article, the development of a low-cost mobile robot based on a PID controller and odometer for education is presented. The PID controller and odometer are applied to control the mobile robot's position. Two-dimensional position vectors in a Cartesian coordinate system are supplied to the robot controller as the initial and final positions. The mobile robot is based on a differential drive and a magnetic rotary encoder sensor, which measures robot position from the number of wheel rotations. The odometry method uses data from actuator movements to predict the change of position over time. The mobile robot is examined to reach the final position with three different heading angles, 30°, 45° and 60°, by applying various values of the KP, KD and KI constants.
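
    The two building blocks named in the abstract, differential-drive odometry and a discrete PID loop, can be sketched as follows. The gains, wheel base and target are illustrative values, not the parameters tuned for the robot in the paper.

```python
import math

def odometry_update(x, y, theta, d_left, d_right, wheel_base):
    """Differential-drive odometry: update pose (x, y, theta) from the
    wheel travel distances reported by the rotary encoders
    (standard small-motion midpoint approximation)."""
    d = (d_left + d_right) / 2.0
    dtheta = (d_right - d_left) / wheel_base
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive straight toward x = 1.0 m using the distance error as PID input
pid = PID(kp=2.0, ki=0.1, kd=0.05, dt=0.05)
x, y, theta = 0.0, 0.0, 0.0
for _ in range(200):
    v = pid.step(1.0 - x)        # commanded wheel speed (m/s)
    d = v * pid.dt               # equal wheel travel: straight-line motion
    x, y, theta = odometry_update(x, y, theta, d, d, wheel_base=0.15)
```

    In the real robot the odometry estimate closes the loop in place of a ground-truth position sensor, so encoder resolution and wheel slip directly bound the final positioning accuracy.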

  8. Hinge Moment Coefficient Prediction Tool and Control Force Analysis of Extra-300 Aerobatic Aircraft

    NASA Astrophysics Data System (ADS)

    Nurohman, Chandra; Arifianto, Ony; Barecasco, Agra

    2018-04-01

    This paper presents the development of a tool to predict the hinge moment coefficients of subsonic aircraft based on Roskam's method, including its validation and its application to an Extra-300. The hinge moment coefficients are used to predict the stick forces of the aircraft during several aerobatic maneuvers, i.e. inside loop, half Cuban 8, split-S, and aileron roll. The maximum longitudinal stick force of 566.97 N occurs in the inside loop, while the maximum lateral stick force of 340.82 N occurs in the aileron roll. Furthermore, validation of the hinge moment prediction method is performed using Cessna 172 data.
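
    The step from a hinge moment coefficient to a stick force follows the standard relation found in aircraft design texts such as Roskam's: the hinge moment is the coefficient times dynamic pressure times control-surface area and chord, and the stick force is that moment times the control-system gearing. The numbers below are illustrative, not Extra-300 data.

```python
def stick_force(gearing, c_h, q, surface_area, chord):
    """Control force from a hinge moment coefficient:
    hinge moment HM = c_h * q * S_e * c_e  [N*m]
    stick force  F  = G * HM               [N], G = gearing (rad/m)."""
    hinge_moment = c_h * q * surface_area * chord
    return gearing * hinge_moment

# Illustrative elevator case at sea level, 70 m/s
rho, v = 1.225, 70.0
q = 0.5 * rho * v ** 2   # dynamic pressure, Pa
f = stick_force(gearing=1.2, c_h=-0.01, q=q, surface_area=1.1, chord=0.35)
```

    Because the force scales with q = 0.5*rho*V^2, the same maneuver flown faster multiplies the stick force, which is why the aerobatic entries (inside loop, aileron roll) produce the peak values quoted in the abstract.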

  9. Motor prediction in Brain-Computer Interfaces for controlling mobile robots.

    PubMed

    Geng, Tao; Gan, John Q

    2008-01-01

    An EEG-based Brain-Computer Interface (BCI) can be regarded as a new channel for motor control, one that does not involve muscles. Normal neuromuscular motor control has two fundamental components: (1) controlling the body, and (2) predicting the consequences of the control command, which is called motor prediction. In this study, after training with a specially designed BCI paradigm based on motor imagery, two subjects learnt to predict the time course of certain features of the EEG signals. It is shown that, with this newly obtained motor prediction skill, subjects can use motor imagery of the feet to directly control a mobile robot to avoid obstacles and reach a small target in a time-critical scenario.

  10. Parsing Heterogeneity in the Brain Connectivity of Depressed and Healthy Adults During Positive Mood.

    PubMed

    Price, Rebecca B; Lane, Stephanie; Gates, Kathleen; Kraynak, Thomas E; Horner, Michelle S; Thase, Michael E; Siegle, Greg J

    2017-02-15

    There is well-known heterogeneity in affective mechanisms in depression that may extend to positive affect. We used data-driven parsing of neural connectivity to reveal subgroups present across depressed and healthy individuals during positive processing, informing targets for mechanistic intervention. Ninety-two individuals (68 depressed patients, 24 never-depressed control subjects) completed a sustained positive mood induction during functional magnetic resonance imaging. Directed functional connectivity paths within a depression-relevant network were characterized using Group Iterative Multiple Model Estimation (GIMME), a method shown to accurately recover the direction and presence of connectivity paths in individual participants. During model selection, individuals were clustered using community detection on neural connectivity estimates. Subgroups were externally tested across multiple levels of analysis. Two connectivity-based subgroups emerged: subgroup A, characterized by weaker connectivity overall, and subgroup B, exhibiting hyperconnectivity (relative to subgroup A), particularly among ventral affective regions. Subgroup membership predicted diagnostic status (subgroup B contained 81% of patients and 50% of control subjects; χ² = 8.6, p = .003) and default mode network connectivity during a separate resting-state task. Among patients, subgroup B members had higher self-reported symptoms, lower sustained positive mood during the induction, and higher negative bias on a reaction-time task. Symptom-based depression subgroups did not predict these external variables. Neural connectivity-based categorization travels with diagnostic category and is clinically predictive, but not clinically deterministic. Both patients and control subjects showed heterogeneous, and overlapping, profiles. The larger and more severely affected patient subgroup was characterized by ventrally driven hyperconnectivity during positive processing. Data-driven parsing suggests heterogeneous substrates of depression and possible resilience in control subjects in spite of biological overlap. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  11. The impact of HIV infection and antiretroviral therapy on the predicted risk of Down syndrome.

    PubMed

    Charlton, Thomas G; Franklin, Jamie M; Douglas, Melanie; Short, Charlotte E; Mills, Ian; Smith, Rachel; Clarke, Amanda; Smith, John; Tookey, Pat A; Cortina-Borja, Mario; Taylor, Graham P

    2014-02-01

    The aim of this study was to assess predicted Down syndrome risk, based on three serum analytes (triple test), in relation to HIV infection status and antiretroviral therapy regimen. Screening results in 72 HIV-positive women were compared with results from age-matched and race-matched HIV-negative controls. Mean concentrations of each analyte were compared by serostatus and antiretroviral therapy. Observed Down syndrome incidence in the offspring of HIV-positive women was calculated from national HIV surveillance data. Overall, women with HIV had a significantly higher probability of receiving a 'high-risk' result than uninfected controls (p = 0.002). Compared with matched uninfected controls, women with HIV infection had significantly higher human chorionic gonadotrophin, lower unconjugated estriol, and higher overall predicted risk of their infant having Down syndrome (1/6250 vs. 1/50 000, p < 0.001). National surveillance data show no evidence of higher than expected incidence of Down syndrome in the offspring of HIV-positive women. HIV infection affects the serum analytes used to screen for Down syndrome risk, resulting in a high rate of 'high-risk' results. However, there is no population-based association between maternal HIV infection and Down syndrome. Care should be taken when interpreting high-risk serum screening results in HIV-positive women to avoid unnecessary invasive diagnostic procedures. © 2013 John Wiley & Sons, Ltd.

  12. Missile Guidance Law Based on Robust Model Predictive Control Using Neural-Network Optimization.

    PubMed

    Li, Zhijun; Xia, Yuanqing; Su, Chun-Yi; Deng, Jun; Fu, Jun; He, Wei

    2015-08-01

    In this brief, the use of robust model-based predictive control is investigated for the problem of missile interception. Treating the target acceleration as a bounded disturbance, a novel guidance law using model predictive control is developed that incorporates the missile's internal constraints. The combined model predictive approach can be transformed into a constrained quadratic programming (QP) problem, which may be solved using a linear variational inequality-based primal-dual neural network over a finite receding horizon. Online solutions to multiple parametric QP problems are used so that constrained optimal control decisions can be made in real time. Simulation studies are conducted to illustrate the effectiveness and performance of the proposed guidance control law for missile interception.
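    The brief casts receding-horizon guidance as a constrained QP solved by a primal-dual neural network. As a minimal stand-in for that solver, the sketch below solves the same kind of QP, a scalar system x[k+1] = a*x[k] + b*u[k] with a bounded input, by projected gradient descent. All constants are illustrative assumptions, not values from the paper.

```python
# Receding-horizon MPC for a scalar linear system, minimizing
# J = sum_k q*x[k+1]^2 + r*u[k]^2 subject to |u[k]| <= u_max.
# The constrained QP is solved here by projected gradient descent.

def mpc_step(x0, a=1.0, b=0.5, q=1.0, r=0.1, N=10, u_max=1.0,
             iters=400, lr=0.05):
    """Return the first input of the (approximately) minimizing sequence."""
    u = [0.0] * N
    for _ in range(iters):
        # Forward-simulate the state trajectory under the current inputs.
        xs = [x0]
        for k in range(N):
            xs.append(a * xs[-1] + b * u[k])
        # Backward (adjoint) pass: dJ/du[k] = 2*r*u[k] + b*p[k+1],
        # where p[k] = 2*q*x[k] + a*p[k+1] and p beyond the horizon is 0.
        p = 0.0
        grad = [0.0] * N
        for k in reversed(range(N)):
            p = 2.0 * q * xs[k + 1] + a * p
            grad[k] = 2.0 * r * u[k] + b * p
        # Gradient step, then project back onto the box |u| <= u_max.
        u = [min(u_max, max(-u_max, uk - lr * gk))
             for uk, gk in zip(u, grad)]
    return u[0]  # receding horizon: apply only the first input
```

    A primal-dual neural network, as in the brief, would replace the inner loop with a dynamical system whose equilibrium is the QP solution; the projected-gradient loop is simply the plainest substitute.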

  13. A fuzzy logic approach to control anaerobic digestion.

    PubMed

    Domnanovich, A M; Strik, D P; Zani, L; Pfeiffer, B; Karlovits, M; Braun, R; Holubar, P

    2003-01-01

    One of the goals of the EU project AMONCO (Advanced Prediction, Monitoring and Controlling of Anaerobic Digestion Process Behaviour towards Biogas Usage in Fuel Cells) is to create a control tool for the anaerobic digestion process that predicts the volumetric organic loading rate (Bv) for the next day, to obtain high biogas quality and production. The biogas should contain a high methane concentration (over 50%) and a low concentration of components toxic for fuel cells, e.g. hydrogen sulphide, siloxanes, ammonia and mercaptans. To produce data for testing the control tool, four 20 l anaerobic Continuously Stirred Tank Reactors (CSTR) are operated. For control, two systems were investigated: a pure fuzzy logic system and a hybrid system containing a fuzzy-based reactor condition calculation and a hierarchical neural net in a cascade of optimisation algorithms.

  14. Determination of Irreducible Water Saturation from nuclear magnetic resonance based on fractal theory — a case study of sandstone with complex pore structure

    NASA Astrophysics Data System (ADS)

    Peng, L.; Pan, H.; Ma, H.; Zhao, P.; Qin, R.; Deng, C.

    2017-12-01

    The irreducible water saturation (Swir) is a vital parameter for permeability prediction and original oil and gas estimation. However, the complex pore structure of rocks makes this parameter difficult to calculate from both laboratory and conventional well-logging methods. In this study, an effective statistical method to predict Swir is derived directly from nuclear magnetic resonance (NMR) data based on fractal theory. The spectrum of transversal relaxation time (T2) is normally considered an indicator of pore size distribution, and the fractal dimensions of micro- and meso-pores are calculated over two specific ranges of the T2 spectrum. Based on the analysis of the fractal characteristics of 22 core samples, drilled from four boreholes of the tight lithologic oil reservoirs of the Ordos Basin in China, a positive correlation between Swir and porosity is derived. A predictive model for Swir based on linear regressions of the fractal dimensions is then proposed. It reveals that Swir is controlled by the pore size and the roughness of the pore. The reliability of this model is tested, and good consistency between predicted results and experimental data is found. This model is a reliable supplement for predicting irreducible water saturation in cases where the T2 cutoff value cannot be accurately determined.
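    In fractal models of NMR data, the cumulative saturation Sv below a relaxation time T2 commonly follows log(Sv) = (3 - D)·log(T2) + c, so the fractal dimension D can be read off the slope of a log-log regression. A minimal sketch of that fit, with synthetic inputs (none of the values come from the study):

```python
import math

def fractal_dimension(t2, cum_sat):
    """Estimate fractal dimension D from a T2 spectrum segment, assuming
    the standard relation log(Sv) = (3 - D)*log(T2) + c, where Sv is the
    cumulative saturation below each T2 value."""
    xs = [math.log(t) for t in t2]
    ys = [math.log(s) for s in cum_sat]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Ordinary least-squares slope of the log-log points.
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 3.0 - slope
```

    In practice the regression would be run separately over the micro- and meso-pore ranges of the T2 spectrum, yielding one fractal dimension per range.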

  15. Consistent prediction of GO protein localization.

    PubMed

    Spetale, Flavio E; Arce, Debora; Krsticevic, Flavia; Bulacio, Pilar; Tapia, Elizabeth

    2018-05-17

    The GO-Cellular Component (GO-CC) ontology provides a controlled vocabulary for the consistent description of the subcellular compartments or macromolecular complexes where proteins may act. Current machine learning-based methods used for the automated GO-CC annotation of proteins suffer from the inconsistency of individual GO-CC term predictions. Here, we present FGGA-CC+, a class of hierarchical graph-based classifiers for the consistent GO-CC annotation of protein coding genes at the subcellular compartment or macromolecular complex levels. Aiming to boost the accuracy of GO-CC predictions, we make use of the protein localization knowledge in the GO-Biological Process (GO-BP) annotations. As a result, FGGA-CC+ classifiers are built from annotation data in both the GO-CC and GO-BP ontologies. Due to their graph-based design, FGGA-CC+ classifiers are fully interpretable and their predictions are amenable to expert analysis. Promising results on protein annotation data from five model organisms were obtained. Additionally, successful validation results were accomplished in the annotation of a challenging subset of tandem duplicated genes in tomato, a non-model organism. Overall, these results suggest that FGGA-CC+ classifiers can indeed be useful for meeting the huge demand for GO-CC annotation arising from ubiquitous high-throughput sequencing and proteomic projects.

  16. [The safety of biologics : a risk-benefit assessment of treating rheumatoid arthritis with biologics based on registry data on mortality].

    PubMed

    Sander, O

    2010-11-01

    The aim of this study is a risk-benefit assessment of treating rheumatoid arthritis with biologics, based on registry data on mortality. The UK, Sweden and Spain have published evaluable data on mortality. A parallel control group was used in the UK; Sweden and Spain used historical cohorts for comparison. Central registries supported the British and Swedish research by providing details on all deaths. The variety of possible confounders prevents direct comparisons between the registers and reliable predictions for individual patients. The death rate in TNF-inhibitor-treated patients is higher than in the general population but lower than in the control groups with RA. Because comorbidities are not balanced, weighting the mortality rate reduced the difference between exposed patients and controls. When TNF inhibitors are given for the usual indication, mortality is reduced compared to conventional therapy.

  17. Accurate multimodal probabilistic prediction of conversion to Alzheimer's disease in patients with mild cognitive impairment.

    PubMed

    Young, Jonathan; Modat, Marc; Cardoso, Manuel J; Mendelson, Alex; Cash, Dave; Ourselin, Sebastien

    2013-01-01

    Accurately identifying patients with mild cognitive impairment (MCI) who will go on to develop Alzheimer's disease (AD) will become essential as new treatments require identification of AD patients at earlier stages of the disease process. Most previous work in this area has centred on the same automated techniques used to diagnose AD patients from healthy controls, coupling high-dimensional brain image data or other relevant biomarker data to modern machine learning techniques. Such studies can now distinguish between AD patients and controls as accurately as an experienced clinician. Models trained on patients with AD and control subjects can also distinguish between MCI patients that will convert to AD within a given timeframe (MCI-c) and those that remain stable (MCI-s), although differences between these groups are smaller and thus the corresponding accuracy is lower. The most common type of classifier used in these studies is the support vector machine, which gives categorical class decisions. In this paper, we introduce Gaussian process (GP) classification to the problem. This fully Bayesian method produces naturally probabilistic predictions, which we show correlate well with the actual chances of converting to AD within 3 years in a population of 96 MCI-s and 47 MCI-c subjects. Furthermore, we show that GPs can integrate multimodal data (in this study, volumetric MRI, FDG-PET, cerebrospinal fluid, and APOE genotype) with the classification process through the use of a mixed kernel. The GP approach aids combination of different data sources by learning parameters automatically from training data via type-II maximum likelihood, which we compare to a more conventional method based on cross validation and an SVM classifier. When the resulting probabilities from the GP are dichotomised to produce a binary classification, the results for predicting MCI conversion based on the combination of all three types of data show a balanced accuracy of 74%. This is a substantially higher accuracy than could be obtained using any individual modality or using a multikernel SVM, and is competitive with the highest accuracy yet achieved for predicting conversion within 3 years on the widely used ADNI dataset.

  18. NAPR: a Cloud-Based Framework for Neuroanatomical Age Prediction.

    PubMed

    Pardoe, Heath R; Kuzniecky, Ruben

    2018-01-01

    The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machines and (ii) Gaussian processes machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.

  19. The probability of being identified as an outlier with commonly used funnel plot control limits for the standardised mortality ratio.

    PubMed

    Seaton, Sarah E; Manktelow, Bradley N

    2012-07-16

    Emphasis is increasingly being placed on the monitoring of clinical outcomes for health care providers. Funnel plots have become an increasingly popular graphical methodology used to identify potential outliers. It is assumed that a provider only displaying expected random variation (i.e. 'in-control') will fall outside a control limit with a known probability. In reality, the discrete count nature of these data, and the differing methods, can lead to true probabilities quite different from the nominal value. This paper investigates the true probability of an 'in control' provider falling outside control limits for the Standardised Mortality Ratio (SMR). The true probabilities of an 'in control' provider falling outside control limits for the SMR were calculated and compared for three commonly used limits: Wald confidence interval; 'exact' confidence interval; probability-based prediction interval. The probability of falling above the upper limit, or below the lower limit, often varied greatly from the nominal value. This was particularly apparent when there were a small number of expected events: for expected events ≤ 50 the median probability of an 'in-control' provider falling above the upper 95% limit was 0.0301 (Wald), 0.0121 ('exact'), 0.0201 (prediction). It is important to understand the properties and probability of being identified as an outlier by each of these different methods to aid the correct identification of poorly performing health care providers. The limits obtained using probability-based prediction limits have the most intuitive interpretation and their properties can be defined a priori. Funnel plot control limits for the SMR should not be based on confidence intervals.
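    Under the Poisson assumption for observed deaths O with expectation E, the true probability of an 'in-control' provider crossing an upper funnel-plot limit can be computed exactly by summing the Poisson tail above the limit. A minimal sketch, assuming the usual Wald construction for the upper SMR limit, 1 + z·sqrt(1/E) (values are illustrative, not the paper's):

```python
import math

def poisson_tail(e, k):
    """P(X >= k) for X ~ Poisson(e), via the complement of the lower CDF."""
    pmf = math.exp(-e)  # P(X = 0)
    cdf = 0.0
    for i in range(k):
        cdf += pmf
        pmf *= e / (i + 1)  # recursive pmf update: P(X = i+1)
    return 1.0 - cdf

def prob_above_wald_upper(e, z=1.96):
    """True probability that an in-control provider exceeds the Wald upper
    limit SMR > 1 + z*sqrt(1/E), i.e. O > E + z*sqrt(E)."""
    threshold = e + z * math.sqrt(e)
    k = math.floor(threshold) + 1  # smallest integer count above the limit
    return poisson_tail(e, k)
```

    Because counts are discrete, this exceedance probability differs from the nominal 0.025 and jumps as E varies, which is the behaviour the paper quantifies for the Wald, 'exact' and prediction-interval limits.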

  20. CisMapper: predicting regulatory interactions from transcription factor ChIP-seq data

    PubMed Central

    O'Connor, Timothy; Bodén, Mikael

    2017-01-01

    Identifying the genomic regions and regulatory factors that control the transcription of genes is an important, unsolved problem. The current method of choice predicts transcription factor (TF) binding sites using chromatin immunoprecipitation followed by sequencing (ChIP-seq), and then links the binding sites to putative target genes solely on the basis of the genomic distance between them. Evidence from chromatin conformation capture experiments shows that this approach is inadequate due to long-distance regulation via chromatin looping. We present CisMapper, which predicts the regulatory targets of a TF using the correlation between a histone mark at the TF's bound sites and the expression of each gene across a panel of tissues. Using both chromatin conformation capture and differential expression data, we show that CisMapper is more accurate at predicting the target genes of a TF than the distance-based approaches currently used, and is particularly advantageous for predicting the long-range regulatory interactions typical of tissue-specific gene expression. CisMapper also predicts which TF binding sites regulate a given gene more accurately than using genomic distance. Unlike distance-based methods, CisMapper can predict which transcription start site of a gene is regulated by a particular binding site of the TF. PMID:28204599

  1. Automated detection of external ventricular and lumbar drain-related meningitis using laboratory and microbiology results and medication data.

    PubMed

    van Mourik, Maaike S M; Groenwold, Rolf H H; Berkelbach van der Sprenkel, Jan Willem; van Solinge, Wouter W; Troelstra, Annet; Bonten, Marc J M

    2011-01-01

    Monitoring of healthcare-associated infection rates is important for infection control and hospital benchmarking. However, manual surveillance is time-consuming and susceptible to error. The aim was, therefore, to develop a prediction model to retrospectively detect drain-related meningitis (DRM), a frequently occurring nosocomial infection, using routinely collected data from a clinical data warehouse. As part of the hospital infection control program, all patients receiving an external ventricular (EVD) or lumbar drain (ELD) (2004 to 2009; n = 742) had been evaluated for the development of DRM through chart review and standardized diagnostic criteria by infection control staff; this was the reference standard. Children, patients dying <24 hours after drain insertion or with <1 day follow-up, and patients with infection at the time of insertion or multiple simultaneous drains were excluded. Logistic regression was used to develop a model predicting the occurrence of DRM. Missing data were imputed using multiple imputation. Bootstrapping was applied to increase generalizability. 537 patients remained after application of exclusion criteria, of whom 82 developed DRM (13.5/1000 days at risk). The automated model to detect DRM included the number of drains placed, drain type, blood leukocyte count, C-reactive protein, cerebrospinal fluid leukocyte count and culture result, number of antibiotics started during admission, and empiric antibiotic therapy. Discriminatory power of this model was excellent (area under the ROC curve 0.97). The model achieved 98.8% sensitivity (95% CI 88.0% to 99.9%) and specificity of 87.9% (84.6% to 90.8%). Positive and negative predictive values were 56.9% (50.8% to 67.9%) and 99.9% (98.6% to 99.9%), respectively. Predicted yearly infection rates concurred with observed infection rates. A prediction model based on multi-source data stored in a clinical data warehouse could accurately quantify rates of DRM. Automated detection using this statistical approach is feasible and could be applied to other nosocomial infections.

  2. Development and status of data quality assurance program at NASA Langley research center: Toward national standards

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.

    1996-01-01

    As part of a continuing effort to re-engineer the wind tunnel testing process, a comprehensive data quality assurance program is being established at NASA Langley Research Center (LaRC). The ultimate goal of the program is routine provision of tunnel-to-tunnel reproducibility with total uncertainty levels acceptable for test and evaluation of civilian transports. The operational elements for reaching such levels of reproducibility are: (1) statistical control, which provides long term measurement uncertainty predictability and a base for continuous improvement, (2) measurement uncertainty prediction, which provides test designs that can meet data quality expectations with the system's predictable variation, and (3) national standards, which provide a means for resolving tunnel-to-tunnel differences. The paper presents the LaRC design for the program and discusses the process of implementation.

  3. Effective Information Extraction Framework for Heterogeneous Clinical Reports Using Online Machine Learning and Controlled Vocabularies

    PubMed Central

    Zheng, Shuai; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A

    2017-01-01

    Background Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to incorporate user feedback for improving the extraction algorithm in real time. Objective Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. Methods A clinical information extraction system, IDEAL-X, has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Results Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports, each combining a history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. Conclusions IDEAL-X adopts a unique online machine learning-based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, and is thus highly adaptable. PMID:28487265

  4. Robust model predictive control for satellite formation keeping with eccentricity/inclination vector separation

    NASA Astrophysics Data System (ADS)

    Lim, Yeerang; Jung, Youeyun; Bang, Hyochoong

    2018-05-01

    This study presents model predictive formation control based on an eccentricity/inclination vector separation strategy. Collision avoidance can be accomplished by using eccentricity/inclination vectors and adding a simple goal-function term to the optimization process. Real-time control is also achievable with a model predictive controller based on a convex formulation. A constraint-tightening approach is addressed as well to improve the robustness of the controller, and simulation results are presented to verify the performance enhancement of the proposed approach.

  5. Predicting Epidemic Risk from Past Temporal Contact Data

    PubMed Central

    Valdano, Eugenio; Poletto, Chiara; Giovannini, Armando; Palma, Diana; Savini, Lara; Colizza, Vittoria

    2015-01-01

    Understanding how epidemics spread in a system is a crucial step to prevent and control outbreaks, with broad implications on the system's functioning, health, and associated costs. This can be achieved by identifying the elements at higher risk of infection and implementing targeted surveillance and control measures. One important ingredient to consider is the pattern of disease-transmission contacts among the elements; however, lack of data or delays in providing updated records may hinder its use, especially for time-varying patterns. Here we explore to what extent it is possible to use past temporal data of a system's pattern of contacts to predict the risk of infection of its elements during an emerging outbreak, in the absence of updated data. We focus on two real-world temporal systems: a livestock displacements trade network among animal holdings, and a network of sexual encounters in high-end prostitution. We define the node's loyalty as a local measure of its tendency to maintain contacts with the same elements over time, and uncover important non-trivial correlations with the node's epidemic risk. We show that a risk assessment analysis incorporating this knowledge and based on past structural and temporal pattern properties provides accurate predictions for both systems. Its generalizability is tested by introducing a theoretical model for generating synthetic temporal networks. High accuracy of our predictions is recovered across different settings, while the amount of possible predictions is system-specific. The proposed method can provide crucial information for the setup of targeted intervention strategies. PMID:25763816
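    The loyalty measure described above can be sketched as the overlap between a node's contact sets in consecutive time slices; here it is implemented as a Jaccard index, one natural concrete choice (the paper's exact definition may differ in detail):

```python
def loyalty(neighbors_t1, neighbors_t2):
    """Jaccard overlap between a node's contact sets in two time slices.
    Returns 1.0 when contacts are fully retained, 0.0 when all change."""
    a, b = set(neighbors_t1), set(neighbors_t2)
    if not a and not b:
        return 1.0  # convention: a node with no contacts in either slice
    return len(a & b) / len(a | b)
```

    A node that keeps trading with, or meeting, the same partners over time scores near 1 and tends to have a more predictable epidemic risk from past data than a node whose contacts turn over completely.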

  6. A Microstructure-Based Time-Dependent Crack Growth Model for Life and Reliability Prediction of Turbopropulsion Systems

    NASA Astrophysics Data System (ADS)

    Chan, Kwai S.; Enright, Michael P.; Moody, Jonathan; Fitch, Simeon H. K.

    2014-01-01

    The objective of this investigation was to develop an innovative methodology for life and reliability prediction of hot-section components in advanced turbopropulsion systems. A set of generic microstructure-based time-dependent crack growth (TDCG) models was developed and used to assess the sources of material variability due to microstructure and material parameters such as grain size, activation energy, and crack growth threshold for TDCG. A comparison of model predictions and experimental data obtained in air and in vacuum suggests that oxidation is responsible for higher crack growth rates at high temperatures, low frequencies, and long dwell times, but oxidation can also induce higher crack growth thresholds (ΔKth or Kth) under certain conditions. Using the enhanced risk analysis tool and material constants calibrated to IN 718 data, the effect of TDCG on the risk of fracture in turboengine components was demonstrated for a generic rotor design and a realistic mission profile using the DARWIN® probabilistic life-prediction code. The results of this investigation confirmed that TDCG and cycle-dependent crack growth in IN 718 can be treated by a simple summation of the crack increments over a mission. For the temperatures considered, TDCG in IN 718 can be considered as a K-controlled or a diffusion-controlled oxidation-induced degradation process. This methodology provides a pathway for evaluating microstructural effects on multiple damage modes in hot-section components.

  7. Does parental consent for birth control affect underage pregnancy rates? The case of Texas.

    PubMed

    Girma, Sourafel; Paton, David

    2013-12-01

    Previous work based on conjectural responses of minors predicted that the 2003 Texas requirement for parental consent for state-funded birth control to minors would lead to a large increase in underage pregnancies. We use state- and county-level data to test this prediction. The latter allow us to compare the impact of parental consent in counties with and without state-funded family planning clinics. We control for characteristics systematically correlated with the presence of state-funded clinics by combining difference-in-difference estimation with propensity score-weighted regressions. The evidence suggests that the parental consent mandate led to a large decrease in attendance at family planning clinics among teens but did not lead to an increase in underage pregnancies.

  8. GenoMatrix: A Software Package for Pedigree-Based and Genomic Prediction Analyses on Complex Traits.

    PubMed

    Nazarian, Alireza; Gezan, Salvador Alejandro

    2016-07-01

    Genomic and pedigree-based best linear unbiased prediction methodologies (G-BLUP and P-BLUP) have proven efficient for partitioning the phenotypic variance of complex traits into its components, estimating individuals' genetic merits, and predicting unobserved (or yet-to-be-observed) phenotypes in many species and fields of study. The GenoMatrix software, presented here, is a user-friendly package that facilitates the use of genome-wide marker data and parentage information for G-BLUP and P-BLUP analyses of complex traits. It provides users with a collection of applications that help with a set of tasks, from performing quality control on data to constructing and manipulating the genomic and pedigree-based relationship matrices and obtaining their inverses. Such matrices will then be used in downstream analyses by other statistical packages. The package also enables users to obtain predicted values for unobserved individuals based on the genetic values of observed related individuals. GenoMatrix is available to the research community as a 64-bit Windows executable and can be downloaded free of charge at: http://compbio.ufl.edu/software/genomatrix/. © The American Genetic Association. 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
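    The genomic relationship matrix at the heart of G-BLUP is commonly built with VanRaden's first method; a minimal sketch under that assumption (the abstract does not specify how GenoMatrix constructs its matrices):

```python
# VanRaden (method 1) genomic relationship matrix:
# G = ZZ' / (2 * sum_j p_j * (1 - p_j)), where Z centers the 0/1/2
# genotype codes by twice the allele frequency at each marker.

def genomic_relationship(genotypes):
    """genotypes: list of individuals, each a list of SNP codes in {0,1,2}.
    Returns the n x n genomic relationship matrix G as nested lists."""
    n = len(genotypes)
    m = len(genotypes[0])
    # Allele frequency per marker, estimated from the sample itself.
    p = [sum(ind[j] for ind in genotypes) / (2.0 * n) for j in range(m)]
    denom = 2.0 * sum(pj * (1.0 - pj) for pj in p)
    # Center genotype codes: z_ij = x_ij - 2*p_j.
    z = [[ind[j] - 2.0 * p[j] for j in range(m)] for ind in genotypes]
    return [[sum(z[i][k] * z[j][k] for k in range(m)) / denom
             for j in range(n)] for i in range(n)]
```

    In a real analysis the resulting G (or its inverse) would be passed to the mixed-model equations of a downstream statistical package, as the abstract describes.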

  9. Indicators of asthma control among students in a rural, school-based asthma management program

    PubMed Central

    Rasberry, Catherine N.; Cheung, Karen; Buckley, Rebekah; Dunville, Richard; Daniels, Brandy; Cook, Deborah; Robin, Leah; Dean, Blair

    2015-01-01

    Objective The evaluation sought to determine if a comprehensive, school-based asthma management program in a small, rural school district helped students improve asthma control. Methods To determine if students in the asthma program demonstrated better asthma control than students in a comparison school district, the evaluation team used a quasi-experimental, cross-sectional design and administered questionnaires assessing asthma control (which included FEV1 measurement) to 456 students with asthma in the intervention and comparison districts. Data were analyzed for differences in asthma control between students in the two districts. To determine if students in the intervention experienced increased asthma control between baseline and follow-up, the evaluation team used a one-group retrospective design. Program records for 323 students were analyzed for differences in percent of predicted forced expiratory volume in one second (FEV1) between baseline and follow-up. Results Students with asthma in the intervention district exhibited significantly better asthma control than students with asthma in the comparison district. Percent of predicted FEV1 did not change significantly between baseline and follow-up for the intervention participants; however, post hoc analyses revealed students with poorly-controlled asthma at baseline had significantly higher FEV1 scores at follow-up, and students with well-controlled asthma at baseline had significantly lower FEV1 scores at follow-up. Conclusions Findings suggest the comprehensive school-based program led to improvements in asthma control for students with poorly controlled asthma at baseline, and school-based programs need mechanisms for tracking students with initially well-controlled asthma in order to ensure they maintain control. PMID:24730771

  10. Thrust Vectoring on the NASA F-18 High Alpha Research Vehicle

    NASA Technical Reports Server (NTRS)

    Bowers, Albion H.; Pahle, Joseph W.

    1996-01-01

    Investigations into a multiaxis thrust-vectoring system have been conducted on an F-18 configuration. These investigations include ground-based scale-model tests, ground-based full-scale testing, and flight testing. This thrust-vectoring system has been tested on the NASA F-18 High Alpha Research Vehicle (HARV). The system provides thrust vectoring in pitch and yaw axes. Ground-based subscale test data have been gathered as background to the flight phase of the program. Tests investigated aerodynamic interaction and vane control effectiveness. The ground-based full-scale data were gathered from static engine runs with image analysis to determine relative thrust-vectoring effectiveness. Flight tests have been conducted at the NASA Dryden Flight Research Center. Parameter identification input techniques have been developed. Individual vanes were not directly controlled because of a mixer-predictor function built into the flight control laws. Combined effects of the vanes have been measured in flight and compared to combined effects of the vanes as predicted by the cold-jet test data. Very good agreement has been found in the linearized effectiveness derivatives.

  11. Simulation analysis of adaptive cruise prediction control

    NASA Astrophysics Data System (ADS)

    Zhang, Li; Cui, Sheng Min

    2017-09-01

    Predictive control is well suited to the control of multi-variable, multi-constraint systems. To examine the effect of predictive control on vehicle longitudinal motion, this paper establishes an expected-spacing model by combining a variable (speed-dependent) spacing policy with a safety-distance strategy. Model predictive control theory and an optimization method based on quadratic programming are used to obtain and quickly track the best expected acceleration trajectory. Simulation models are established for both predictive control and adaptive fuzzy control. Simulation results show that predictive control realizes the basic function of the system while ensuring safety, and application of both algorithms under cruise conditions indicates that the predictive control performs better.
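As a rough illustration of the speed-dependent expected-spacing idea (not the paper's actual model), a constant-time-gap policy adds a speed-proportional term to a standstill gap; the parameter values d0 and tau below are illustrative assumptions:

```python
import numpy as np

def desired_spacing(v, d0=5.0, tau=1.5):
    """Constant-time-gap policy: standstill gap d0 (m) plus a
    speed-dependent term tau (s) times ego speed v (m/s)."""
    return d0 + tau * np.asarray(v, dtype=float)

def spacing_error(gap, v):
    """Positive when the actual gap exceeds the desired gap."""
    return gap - desired_spacing(v)

print(desired_spacing(20.0))       # 5 + 1.5*20 = 35.0 m at 20 m/s
print(spacing_error(40.0, 20.0))   # 5.0 m of surplus gap
```

A predictive controller would then penalize this spacing error, along with relative speed and acceleration effort, over the prediction horizon.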

  12. Measurement and Control of the Variability of Scanning Pressure Transducer Measurements

    NASA Technical Reports Server (NTRS)

    Kuhl, David D.; Everhart, Joel L.; Hallissy, James B.

    2003-01-01

    This paper describes the new wall pressure measurement system and data-quality monitoring software installed in the 14- by 22-Foot Subsonic Tunnel at the NASA Langley Research Center. The monitoring software was developed to enable measurement and control of the variability of the reference pressures and approximately 400 tunnel wall pressure measurements. Variability of the system, based upon data acquired over a year of wind tunnel tests and calibrations, is presented. The level of variation of the wall pressure measurements is shown to be predictable.

  13. Combination of acoustical radiosity and the image source method.

    PubMed

    Koutsouris, Georgios I; Brunskog, Jonas; Jeong, Cheol-Ho; Jacobsen, Finn

    2013-06-01

    A combined model for room acoustic predictions is developed, aiming to treat both diffuse and specular reflections in a unified way. Two established methods are incorporated: acoustical radiosity, accounting for the diffuse part, and the image source method, accounting for the specular part. The model is based on conservation of acoustical energy. Losses are taken into account by the energy absorption coefficient, and the diffuse reflections are controlled via the scattering coefficient, which defines the portion of energy that has been diffusely reflected. The way the model is formulated allows for a dynamic control of the image source production, so that no fixed maximum reflection order is required. The model is optimized for energy impulse response predictions in arbitrary polyhedral rooms. The predictions are validated by comparison with published measured data for a real music studio hall. The proposed model turns out to be promising for acoustic predictions providing a high level of detail and accuracy.

  14. Analysis of Turbine Blade Relative Cooling Flow Factor Used in the Subroutine Coolit Based on Film Cooling Correlations

    NASA Technical Reports Server (NTRS)

    Schneider, Steven J.

    2015-01-01

    Heat transfer correlations of data on flat plates are used to explore the parameters in the Coolit program used for calculating the quantity of cooling air for controlling turbine blade temperature. Correlations for both convection and film cooling are explored for their relevance to predicting blade temperature as a function of a total cooling flow which is split between external film and internal convection flows. Trends similar to those in Coolit are predicted as a function of the percent of the total cooling flow that is in the film. The exceptions are that the all-convection case (no film cooling) is predicted to be unable to control blade temperature, while leaving less than 25 percent of the cooling flow in the convection path approaches a limit on convection cooling, as predicted by a thermal effectiveness parameter not presently used in Coolit.

  15. Flight-Test-Determined Aerodynamic Force and Moment Characteristics of the X-43A at Mach 7.0

    NASA Technical Reports Server (NTRS)

    Davis, Mark C.; White, J. Terry

    2006-01-01

    The second flight of the Hyper-X program afforded a unique opportunity to determine the aerodynamic force and moment characteristics of an airframe-integrated scramjet-powered aircraft in hypersonic flight. These data were gathered via a repeated series of pitch, yaw, and roll doublets; frequency sweeps; and pushover-pullup maneuvers performed throughout the X-43A cowl-closed descent. Maneuvers were conducted at Mach numbers of 6.80 to 0.95 and altitudes from 92,000 ft msl to sea level. The dynamic pressure varied from 1300 psf to 400 psf with the angle of attack ranging from 0 deg to 14 deg. The flight-extracted aerodynamics were compared with preflight predictions based on wind-tunnel-test data. The X-43A flight-derived axial force was found to be 10 percent to 15 percent higher than prediction. Under-predictions of similar magnitude were observed for the normal force. For Mach numbers above 4.0, the flight-derived stability and control characteristics resulted in larger-than-predicted static margins, with the largest discrepancy approximately 5 in. forward along the x-axis center of gravity at Mach 6.0. This condition would result in less static margin in pitch. The predicted lateral-directional stability and control characteristics matched well with flight data when allowance was made for the high uncertainty in angle of sideslip.

  16. Online Recorded Data-Based Composite Neural Control of Strict-Feedback Systems With Application to Hypersonic Flight Dynamics.

    PubMed

    Xu, Bin; Yang, Daipeng; Shi, Zhongke; Pan, Yongping; Chen, Badong; Sun, Fuchun

    2017-09-25

    This paper investigates the online recorded data-based composite neural control of uncertain strict-feedback systems using the backstepping framework. In each step of the virtual control design, a neural network (NN) is employed for uncertainty approximation. Most previous designs aim directly at system stability, ignoring how the NN actually works as an approximator. In this paper, to enhance the learning ability, a novel prediction error signal is constructed to provide additional correction information for the NN weight update using online recorded data. In this way, the neural approximation precision is greatly improved and convergence is faster. Furthermore, a sliding mode differentiator is employed to approximate the derivative of the virtual control signal, so the complex analysis of the backstepping design can be avoided. The closed-loop stability is rigorously established, and the boundedness of the tracking error is guaranteed. In simulation of hypersonic flight dynamics, the proposed approach exhibits better tracking performance.

  17. Forecasting hotspots using predictive visual analytics approach

    DOEpatents

    Maciejewski, Ross; Hafen, Ryan; Rudolph, Stephen; Cleveland, William; Ebert, David

    2014-12-30

    A method for forecasting hotspots is provided. The method may include the steps of receiving input data at an input of the computational device, generating a temporal prediction based on the input data, generating a geospatial prediction based on the input data, and generating output data based on the temporal and geospatial predictions. The output data may be configured to display at least one user interface at an output of the computational device.

  18. Controlling for Frailty in Pharmacoepidemiologic Studies of Older Adults: Validation of an Existing Medicare Claims-based Algorithm.

    PubMed

    Cuthbertson, Carmen C; Kucharska-Newton, Anna; Faurot, Keturah R; Stürmer, Til; Jonsson Funk, Michele; Palta, Priya; Windham, B Gwen; Thai, Sydney; Lund, Jennifer L

    2018-07-01

    Frailty is a geriatric syndrome characterized by weakness and weight loss and is associated with adverse health outcomes. It is often an unmeasured confounder in pharmacoepidemiologic and comparative effectiveness studies using administrative claims data. Among the Atherosclerosis Risk in Communities (ARIC) Study Visit 5 participants (2011-2013; n = 3,146), we conducted a validation study to compare a Medicare claims-based algorithm of dependency in activities of daily living (or dependency) developed as a proxy for frailty with a reference standard measure of phenotypic frailty. We applied the algorithm to the ARIC participants' claims data to generate a predicted probability of dependency. Using the claims-based algorithm, we estimated the C-statistic for predicting phenotypic frailty. We further categorized participants by their predicted probability of dependency (<5%, 5% to <20%, and ≥20%) and estimated associations with difficulties in physical abilities, falls, and mortality. The claims-based algorithm showed good discrimination of phenotypic frailty (C-statistic = 0.71; 95% confidence interval [CI] = 0.67, 0.74). Participants classified with a high predicted probability of dependency (≥20%) had higher prevalence of falls and difficulty in physical ability, and a greater risk of 1-year all-cause mortality (hazard ratio = 5.7 [95% CI = 2.5, 13]) than participants classified with a low predicted probability (<5%). Sensitivity and specificity varied across predicted probability of dependency thresholds. The Medicare claims-based algorithm showed good discrimination of phenotypic frailty and high predictive ability with adverse health outcomes. This algorithm can be used in future Medicare claims analyses to reduce confounding by frailty and improve study validity.
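The C-statistic reported above measures how often a randomly chosen case receives a higher predicted probability than a randomly chosen non-case. A minimal sketch of that pairwise computation (the labels and scores below are illustrative, not study data):

```python
import numpy as np

def c_statistic(y_true, y_score):
    """Concordance (C) statistic: probability that a randomly chosen
    case (y=1) receives a higher predicted probability than a randomly
    chosen non-case (y=0); ties count as 1/2."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Perfectly separated scores give C = 1.0; C = 0.5 is chance level.
print(c_statistic([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # 1.0
```

The same quantity is the area under the ROC curve, which is why values near 0.71, as in the study, indicate good discrimination.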

  19. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

    Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality control and prediction for the GMAW process. Existing approaches to on-line welding quality control and prediction have several disadvantages, such as high cost, low efficiency, complexity, and strong sensitivity to the environment. An enhanced, efficient technique for evaluating welding faults based on Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as Euclidean distance because the covariance matrix used for calculating MD takes into account correlations in the data and scaling. The values of MD obtained from welding current and arc voltage are assumed to follow a normal distribution with two parameters: the mean µ and standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD located in the range from zero to µ + 3σ are regarded as "good". Two experiments, which involve changing the flow of shielding gas and smearing paint on the surface of the substrate, were conducted to verify the sensitivity of the proposed evaluation technique and the feasibility of using the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality control and prediction, which is of great importance in designing novel equipment for weld quality detection.
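A minimal sketch of the MD-based screening described above, using synthetic current/voltage samples in place of real weld records (all numbers illustrative); the µ + 3σ cutoff is applied to the empirical MD distribution:

```python
import numpy as np

def mahalanobis_distances(samples, reference):
    """MD of each sample (rows: [current, voltage]) from the centre of a
    reference data set; the covariance matrix accounts for correlation
    between the channels, unlike a plain Euclidean distance."""
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
    d = samples - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))

def good_weld_mask(md_values):
    """Values in [0, mu + 3*sigma] of the MD distribution count as 'good'."""
    return md_values <= md_values.mean() + 3.0 * md_values.std()

# Illustrative stand-in for logged welding current (A) and arc voltage (V).
rng = np.random.default_rng(0)
ref = rng.normal([200.0, 25.0], [5.0, 1.0], size=(500, 2))
md = mahalanobis_distances(ref, ref)
print(round(float(good_weld_mask(md).mean()), 2))  # fraction judged 'good'
```

In practice the reference set would come from welds already verified as sound, and new measurements would be screened against that fixed µ + 3σ threshold.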

  20. Systematic ionospheric electron density tilts (SITs) at mid-latitudes and their associated HF bearing errors

    NASA Astrophysics Data System (ADS)

    Tedd, B. L.; Strangeways, H. J.; Jones, T. B.

    1985-11-01

    Systematic ionospheric tilts (SITs) at midlatitudes and the diurnal variation of bearing error for different transmission paths are examined. An explanation of the diurnal variations of bearing error based on the dependence of ionospheric tilt on solar zenith angle and plasma transport processes is presented. The effect of vertical ion drift and the momentum transfer of neutral winds is investigated. During the daytime, transmissions reflect at low heights and photochemical processes control SITs; at night, transmissions reflect at greater heights, and spatial and temporal variations of plasma transport processes influence SITs. An HF ray tracing technique which uses a three-dimensional ionospheric model based on predictions to simulate SIT-induced bearing errors is described; poor correlation with experimental data is observed and the causes for this are studied. A second model based on measured vertical-sounder data is proposed. Model two is applicable for predicting bearing error for a range of transmission paths and correlates well with experimental data.

  1. A Statistical Approach to Thermal Management of Data Centers Under Steady State and System Perturbations

    PubMed Central

    Haaland, Ben; Min, Wanli; Qian, Peter Z. G.; Amemiya, Yasuo

    2011-01-01

    Temperature control for a large data center is both important and expensive. On the one hand, many of the components produce a great deal of heat, and on the other hand, many of the components require temperatures below a fairly low threshold for reliable operation. A statistical framework is proposed within which the behavior of a large cooling system can be modeled and forecast under both steady state and perturbations. This framework is based upon an extension of multivariate Gaussian autoregressive hidden Markov models (HMMs). The estimated parameters of the fitted model provide useful summaries of the overall behavior of and relationships within the cooling system. Predictions under system perturbations are useful for assessing potential changes and improvements to be made to the system. Many data centers have far more cooling capacity than necessary under sensible circumstances, thus resulting in energy inefficiencies. Using this model, predictions for system behavior after a particular component of the cooling system is shut down or reduced in cooling power can be generated. Steady-state predictions are also useful for facility monitors. System traces outside control boundaries flag a change in behavior to examine. The proposed model is fit to data from a group of air conditioners within an enterprise data center from the IT industry. The fitted model is examined, and a particular unit is found to be underutilized. Predictions generated for the system under the removal of that unit appear very reasonable. Steady-state system behavior also is predicted well. PMID:22076026

  2. Improving the Forecast Accuracy of an Ocean Observation and Prediction System by Adaptive Control of the Sensor Network

    NASA Astrophysics Data System (ADS)

    Talukder, A.; Panangadan, A. V.; Blumberg, A. F.; Herrington, T.; Georgas, N.

    2008-12-01

    The New York Harbor Observation and Prediction System (NYHOPS) is a real-time, estuarine and coastal ocean observing and modeling system for the New York Harbor and surrounding waters. Real-time measurements from in-situ mobile and stationary sensors in the NYHOPS networks are assimilated into marine forecasts in order to reduce the discrepancy with ground truth. The forecasts are obtained from the ECOMSED hydrodynamic model, a shallow water derivative of the Princeton Ocean Model. Currently, all sensors in the NYHOPS system are operated in a fixed mode with uniform sampling rates. This technology infusion effort demonstrates the use of Model Predictive Control (MPC) to autonomously adapt the operation of both mobile and stationary sensors in response to changing events that are automatically detected from the ECOMSED forecasts. The controller focuses sensing resources on those regions that are expected to be impacted by the detected events. The MPC approach involves formulating the problem of calculating the optimal sensor parameters as a constrained multi-objective optimization problem. We have developed an objective function that takes into account the spatiotemporal relationship of the in-situ sensor locations and the locations of events detected by the model. Experiments in simulation were carried out using data collected during a freshwater flooding event. The location of the resulting freshwater plume was calculated from the corresponding model forecasts and was used by the MPC controller to derive control parameters for the sensing assets. The operational parameters that are controlled include the sampling rates of stationary sensors, paths of unmanned underwater vehicles (UUVs), and data transfer routes between sensors and the central modeling computer. The simulation experiments show that MPC-based sensor control reduces the RMS error in the forecast by a factor of 380% as compared to uniform sampling.
The paths of multiple UUVs were simultaneously calculated such that measurements from on-board sensors would lead to maximal reduction in the forecast error after data assimilation. The MPC controller also reduces the consumption of system resources such as energy expended in sampling and wireless communication. The MPC-based control approach can be generalized to accept data from remote sensing satellites. This will enable in-situ sensors to be regulated using forecasts generated by assimilating local high resolution in-situ measurements with wide-area observations from remote sensing satellites.

  3. Prediction of fruit and vegetable intake from biomarkers using individual participant data of diet-controlled intervention studies.

    PubMed

    Souverein, Olga W; de Vries, Jeanne H M; Freese, Riitta; Watzl, Bernhard; Bub, Achim; Miller, Edgar R; Castenmiller, Jacqueline J M; Pasman, Wilrike J; van Het Hof, Karin; Chopra, Mridula; Karlsen, Anette; Dragsted, Lars O; Winkels, Renate; Itsiopoulos, Catherine; Brazionis, Laima; O'Dea, Kerin; van Loo-Bouwman, Carolien A; Naber, Ton H J; van der Voet, Hilko; Boshuizen, Hendriek C

    2015-05-14

    Fruit and vegetable consumption produces changes in several biomarkers in blood. The present study aimed to examine the dose-response curve between fruit and vegetable consumption and carotenoid (α-carotene, β-carotene, β-cryptoxanthin, lycopene, lutein and zeaxanthin), folate and vitamin C concentrations. Furthermore, a prediction model of fruit and vegetable intake based on these biomarkers and subject characteristics (i.e. age, sex, BMI and smoking status) was established. Data from twelve diet-controlled intervention studies were obtained to develop a prediction model for fruit and vegetable intake (including and excluding fruit and vegetable juices). The study population in the present individual participant data meta-analysis consisted of 526 men and women. Carotenoid, folate and vitamin C concentrations showed a positive relationship with fruit and vegetable intake. Measures of performance for the prediction model were calculated using cross-validation. For the prediction model of fruit, vegetable and juice intake, the root mean squared error (RMSE) was 258.0 g, the correlation between observed and predicted intake was 0.78 and the mean difference between observed and predicted intake was -1.7 g (limits of agreement: -466.3, 462.8 g). For the prediction of fruit and vegetable intake (excluding juices), the RMSE was 201.1 g, the correlation was 0.65 and the mean bias was 2.4 g (limits of agreement: -368.2, 373.0 g). The prediction models which include the biomarkers and subject characteristics may be used to estimate average intake at the group level and to investigate the ranking of individuals with regard to their intake of fruit and vegetables when validating questionnaires that measure intake.
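The agreement measures reported above (RMSE, correlation, mean bias, and Bland-Altman limits of agreement) can be reproduced from paired observed/predicted intakes; the toy numbers below are illustrative, not study data:

```python
import numpy as np

def agreement_stats(observed, predicted):
    """RMSE, Pearson correlation, mean bias, and Bland-Altman 95%
    limits of agreement (bias +/- 1.96 * SD of the differences)."""
    observed = np.asarray(observed, float)
    predicted = np.asarray(predicted, float)
    diff = observed - predicted
    rmse = np.sqrt(np.mean(diff ** 2))
    corr = np.corrcoef(observed, predicted)[0, 1]
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    loa = (bias - half_width, bias + half_width)
    return rmse, corr, bias, loa

# Hypothetical observed vs. predicted intakes in grams.
obs = np.array([300.0, 450.0, 520.0, 610.0])
pred = np.array([320.0, 430.0, 540.0, 600.0])
rmse, corr, bias, loa = agreement_stats(obs, pred)
print(round(rmse, 1), round(bias, 1))  # 18.0 -2.5
```

In the study, the wide limits of agreement relative to the small mean bias are what motivate restricting the model to group-level rather than individual-level intake estimates.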

  4. Data analytics and optimization of an ice-based energy storage system for commercial buildings

    DOE PAGES

    Luo, Na; Hong, Tianzhen; Li, Hui; ...

    2017-07-25

    Ice-based thermal energy storage (TES) systems can shift peak cooling demand and reduce operational energy costs (with time-of-use rates) in commercial buildings. The accurate prediction of the cooling load, and the optimal control strategy for managing the charging and discharging of a TES system, are two critical elements to improving system performance and achieving energy cost savings. This study utilizes data-driven analytics and modeling to holistically understand the operation of an ice-based TES system in a shopping mall, calculating the system’s performance using actual measured data from installed meters and sensors. Results show that there is significant savings potential when the current operating strategy is improved by appropriately scheduling the operation of each piece of equipment of the TES system, as well as by determining the amount of charging and discharging for each day. A novel optimal control strategy, determined by an optimization algorithm of Sequential Quadratic Programming, was developed to minimize the TES system’s operating costs. Three heuristic strategies were also investigated for comparison with our proposed strategy, and the results demonstrate the superiority of our method to the heuristic strategies in terms of total energy cost savings. Specifically, the optimal strategy yields energy cost savings of up to 11.3% per day and 9.3% per month compared with current operational strategies. A one-day-ahead hourly load prediction was also developed using machine learning algorithms, which facilitates the adoption of the developed data analytics and optimization of the control strategy in a real TES system operation.
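A heavily simplified sketch of the charge/discharge scheduling idea, using SciPy's SLSQP solver (a Sequential Quadratic Programming implementation) on a toy time-of-use price profile; all prices, loads, and capacities are invented for illustration, not taken from the study:

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: choose hourly ice charge (+) / discharge (-) u[t] so that
# stored cooling shifts chiller work away from expensive hours.
hours = 24
t = np.arange(hours)
price = np.where((t >= 12) & (t < 18), 0.30, 0.10)  # $/kWh, peak 12:00-18:00
load = np.full(hours, 50.0)                          # kWh cooling demand/hour
capacity = 400.0                                     # kWh ice storage

def cost(u):
    chiller = load + u        # bound u >= -50 keeps chiller output >= 0
    return float(price @ chiller)

def soc(u):
    return np.cumsum(u)       # state of charge relative to the start

cons = [
    {'type': 'ineq', 'fun': soc},                          # SOC >= 0
    {'type': 'ineq', 'fun': lambda u: capacity - soc(u)},  # SOC <= capacity
    {'type': 'eq',   'fun': lambda u: u.sum()},            # end where we began
]
res = minimize(cost, np.zeros(hours), method='SLSQP',
               bounds=[(-50.0, 100.0)] * hours, constraints=cons)
print(res.success, round(res.fun, 2))  # cheaper than the do-nothing schedule
```

The do-nothing schedule (u = 0) costs 180.0 in this toy setup; the optimizer charges off-peak and discharges during the peak window to cut that cost.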

  5. Data analytics and optimization of an ice-based energy storage system for commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Na; Hong, Tianzhen; Li, Hui

    Ice-based thermal energy storage (TES) systems can shift peak cooling demand and reduce operational energy costs (with time-of-use rates) in commercial buildings. The accurate prediction of the cooling load, and the optimal control strategy for managing the charging and discharging of a TES system, are two critical elements to improving system performance and achieving energy cost savings. This study utilizes data-driven analytics and modeling to holistically understand the operation of an ice-based TES system in a shopping mall, calculating the system’s performance using actual measured data from installed meters and sensors. Results show that there is significant savings potential when the current operating strategy is improved by appropriately scheduling the operation of each piece of equipment of the TES system, as well as by determining the amount of charging and discharging for each day. A novel optimal control strategy, determined by an optimization algorithm of Sequential Quadratic Programming, was developed to minimize the TES system’s operating costs. Three heuristic strategies were also investigated for comparison with our proposed strategy, and the results demonstrate the superiority of our method to the heuristic strategies in terms of total energy cost savings. Specifically, the optimal strategy yields energy cost savings of up to 11.3% per day and 9.3% per month compared with current operational strategies. A one-day-ahead hourly load prediction was also developed using machine learning algorithms, which facilitates the adoption of the developed data analytics and optimization of the control strategy in a real TES system operation.

  6. Air-Traffic Controllers Evaluate The Descent Advisor

    NASA Technical Reports Server (NTRS)

    Tobias, Leonard; Volckers, Uwe; Erzberger, Heinz

    1992-01-01

    Report describes a study of the Descent Advisor algorithm: a software automation aid intended to assist air-traffic controllers in spacing traffic and meeting specified times of arrival. Based partly on mathematical models of weather conditions and performances of aircraft, it generates suggested clearances, including top-of-descent points and speed-profile data to attain objectives. The study focused on operational characteristics, with specific attention to how the algorithm can be used for prediction, spacing, and metering.

  7. Integrating Predictive Modeling with Control System Design for Managed Aquifer Recharge and Recovery Applications

    NASA Astrophysics Data System (ADS)

    Drumheller, Z. W.; Regnery, J.; Lee, J. H.; Illangasekare, T. H.; Kitanidis, P. K.; Smits, K. M.

    2014-12-01

    Aquifers around the world show troubling signs of irreversible depletion and seawater intrusion as climate change, population growth, and urbanization lead to reduced natural recharge rates and overuse. Scientists and engineers have begun to re-investigate the technology of managed aquifer recharge and recovery (MAR) as a means to increase the reliability of the diminishing and increasingly variable groundwater supply. MAR systems offer the possibility of naturally increasing groundwater storage while improving the quality of impaired water used for recharge. Unfortunately, MAR systems remain fraught with operational challenges related to the quality and quantity of recharged and recovered water, stemming from a lack of data-driven, real-time control. Our project seeks to ease the operational challenges of MAR facilities through the implementation of active sensor networks, adaptively calibrated flow and transport models, and simulation-based meta-heuristic control optimization methods. The developed system works by continually collecting hydraulic and water quality data from a sensor network embedded within the aquifer. The data are fed into an inversion algorithm, which calibrates the parameters and initial conditions of a predictive flow and transport model. The calibrated model is passed to a meta-heuristic control optimization algorithm (e.g. a genetic algorithm), which executes the simulations and determines the best course of action, i.e., the optimal pumping policy for current aquifer conditions. The optimal pumping policy is manually or autonomously applied. During operation, sensor data are used to assess the accuracy of the optimal prediction and augment the pumping strategy as needed. At laboratory scale, a small (18"H x 46"L) and an intermediate (6'H x 16'L) two-dimensional synthetic aquifer were constructed and outfitted with sensor networks.
Data collection and model inversion components were developed and sensor data were validated by analytical measurements.

  8. Vestibulospinal adaptation to microgravity

    NASA Technical Reports Server (NTRS)

    Paloski, W. H.

    1998-01-01

    Human balance control is known to be transiently disrupted after spaceflight; however, the mechanisms responsible for postflight postural ataxia are still under investigation. In this report, we propose a conceptual model of vestibulospinal adaptation based on theoretical adaptive control concepts and supported by the results from a comprehensive study of balance control recovery after spaceflight. The conceptual model predicts that immediately after spaceflight the balance control system of a returning astronaut does not expect to receive gravity-induced afferent inputs and that descending vestibulospinal control of balance is disrupted until the central nervous system is able to cope with the newly available vestibular otolith information. Predictions of the model are tested using data from a study of the neurosensory control of balance in astronauts immediately after landing. In that study, the mechanisms of sensorimotor balance control were assessed under normal, reduced, and/or altered (sway-referenced) visual and somatosensory input conditions. We conclude that the adaptive control model accurately describes the neurobehavioral responses to spaceflight and that similar models of altered sensory, motor, or environmental constraints are needed clinically to predict responses that patients with sensorimotor pathologies may have to various visual-vestibular or changing stimulus environments.

  9. EEG-based emergency braking intention prediction for brain-controlled driving considering one electrode falling-off.

    PubMed

    Huikang Wang; Luzheng Bi; Teng Teng

    2017-07-01

    This paper proposes a novel electroencephalography (EEG)-based method for predicting a driver's emergency braking intention in brain-controlled driving when one electrode falls off. First, whether an electrode has fallen off is discriminated based on EEG potentials. Then, the missing signals are estimated from the signals collected on the other channels using multivariate linear regression. Finally, a linear decoder is applied to classify driver intentions. Experimental results show that the falling-off discrimination accuracy is 99.63% on average, and the correlation coefficient and root mean squared error (RMSE) between the estimated and experimental data are 0.90 and 11.43 μV, respectively, on average. Given that one electrode falls off, the accuracy of the proposed intention prediction method is significantly higher than that of the original method (95.12% vs. 79.11%) and is close to that (95.95%) of the original system under normal conditions (i.e., no electrode falling off).
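The missing-channel estimation step can be sketched with ordinary least squares, substituting synthetic correlated signals for real EEG (channel count, sample count, and noise level are all invented here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for EEG: 6 correlated channels driven by 3 latent
# sources plus a little sensor noise (1000 samples).
latent = rng.normal(size=(1000, 3))
mixing = rng.normal(size=(3, 6))
eeg = latent @ mixing + 0.1 * rng.normal(size=(1000, 6))

lost = 2  # index of the electrode assumed to have fallen off

# Multivariate linear regression: predict the lost channel from the
# remaining channels (with an intercept), fitted on calibration data.
others = np.delete(eeg, lost, axis=1)
X = np.column_stack([np.ones(len(others)), others])
coef, *_ = np.linalg.lstsq(X, eeg[:, lost], rcond=None)

estimate = X @ coef
rmse = np.sqrt(np.mean((estimate - eeg[:, lost]) ** 2))
corr = np.corrcoef(estimate, eeg[:, lost])[0, 1]
print(round(float(corr), 3))  # near 1 when channels are highly correlated
```

In a deployed system the regression coefficients would be fitted while all electrodes are attached, then applied online whenever the falling-off detector flags a channel.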

  10. Adaptive adjustment of interval predictive control based on combined model and application in shell brand petroleum distillation tower

    NASA Astrophysics Data System (ADS)

    Sun, Chao; Zhang, Chunran; Gu, Xinfeng; Liu, Bin

    2017-10-01

    Constraints on the optimization objective often cannot be met when predictive control is applied to an industrial production process; the online predictive controller then fails to find a feasible, or globally optimal, solution. To solve this problem, based on a Back Propagation-Auto Regressive with exogenous inputs (BP-ARX) combined control model, nonlinear programming is used to analyze the feasibility of constrained predictive control, a feasibility decision theorem for the optimization objective is proposed, and a solution method for the soft-constraint slack variables is given for the case where the optimization objective is infeasible. On this basis, for interval control requirements on the controlled variables, the solved slack variables are introduced and an adaptive weighted interval predictive control algorithm is proposed, achieving adaptive regulation of the optimization objective and automatic adjustment of the infeasible interval range, expanding the feasible region, and ensuring the feasibility of the interval optimization objective. Finally, the feasibility and effectiveness of the algorithm are validated through comparative simulation experiments.
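The role of a slack variable in softening an otherwise infeasible interval constraint can be illustrated on a toy one-step problem (all numbers invented, and SciPy's SLSQP stands in for the paper's solver); a large penalty weight keeps the slack as small as feasibility allows:

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: output y = g*u should lie in [y_lo, y_hi], but the
# actuator bound u <= u_max makes that impossible, so a slack s >= 0
# relaxes the interval just enough to restore feasibility.
g, y_lo, y_hi, u_max, rho = 2.0, 9.0, 10.0, 4.0, 1000.0

def objective(z):
    u, s = z
    return u ** 2 + rho * s ** 2   # large rho keeps the slack minimal

cons = [
    {'type': 'ineq', 'fun': lambda z: g * z[0] - (y_lo - z[1])},  # y >= y_lo - s
    {'type': 'ineq', 'fun': lambda z: (y_hi + z[1]) - g * z[0]},  # y <= y_hi + s
]
res = minimize(objective, [0.0, 0.0], method='SLSQP',
               bounds=[(-u_max, u_max), (0.0, None)], constraints=cons)
u_opt, s_opt = res.x
print(round(u_opt, 2), round(s_opt, 2))  # actuator saturates; slack covers the rest
```

Here the hard interval would require u between 4.5 and 5, outside the bound of 4, so the optimizer saturates the actuator and opens the interval by the minimum slack (about 1) needed for feasibility.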

  11. Silica exposure during construction activities: statistical modeling of task-based measurements from the literature.

    PubMed

    Sauvé, Jean-François; Beaudry, Charles; Bégin, Denis; Dion, Chantal; Gérin, Michel; Lavoué, Jérôme

    2013-05-01

    Many construction activities can put workers at risk of breathing silica-containing dusts, and there is an important body of literature documenting exposure levels using a task-based strategy. In this study, statistical modeling was used to analyze a data set containing 1466 task-based, personal respirable crystalline silica (RCS) measurements gathered from 46 sources to estimate exposure levels during construction tasks and the effects of determinants of exposure. Monte Carlo simulation was used to recreate individual exposures from summary parameters, and the statistical modeling involved multimodel inference with Tobit models containing combinations of the following exposure variables: sampling year, sampling duration, construction sector, project type, workspace, ventilation, and controls. Exposure levels by task were predicted based on the median reported duration by activity, the year 1998, absence of source control methods, and an equal distribution of the other determinants of exposure. The model containing all the variables explained 60% of the variability and was identified as the best approximating model. Of the 27 tasks contained in the data set, abrasive blasting, masonry chipping, scabbling concrete, tuck pointing, and tunnel boring had estimated geometric means above 0.1 mg m(-3) based on the exposure scenario developed. Water-fed tools and local exhaust ventilation were associated with reductions of 71% and 69%, respectively, in exposure levels compared with no controls. The predictive model developed can be used to estimate RCS concentrations for many construction activities in a wide range of circumstances.
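    The Monte Carlo step above, recreating individual exposures from summary parameters, can be sketched by drawing from a lognormal distribution parameterized by a reported geometric mean (GM) and geometric standard deviation (GSD). The GM and GSD values below are illustrative, not taken from the study.

```python
# Sketch: recreate individual exposure values from a summary GM and GSD,
# assuming a lognormal exposure distribution (illustrative parameters).
import math
import random

def simulate_exposures(gm, gsd, n, seed=0):
    """Draw n respirable-silica concentrations (mg/m^3) from lognormal(GM, GSD)."""
    rng = random.Random(seed)
    mu, sigma = math.log(gm), math.log(gsd)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

samples = simulate_exposures(gm=0.1, gsd=2.5, n=10_000)
# Sanity check: the geometric mean of the simulated values recovers the input GM.
recovered_gm = math.exp(sum(math.log(x) for x in samples) / len(samples))
```

    The recreated samples can then be pooled across sources and fed to the regression model in place of the unavailable raw measurements.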

  12. Smartphone dependence classification using tensor factorization

    PubMed Central

    Kim, Yejin; Yook, In Hye; Yu, Hwanjo; Kim, Dai-Jin

    2017-01-01

    Excessive smartphone use causes personal and social problems. To address this issue, we sought to derive usage patterns directly correlated with smartphone dependence based on usage data. This study attempted to classify smartphone dependence using a data-driven prediction algorithm. We developed a mobile application to collect smartphone usage data. A total of 41,683 logs of 48 smartphone users were collected from March 8, 2015, to January 8, 2016. The participants were classified into the control group (SUC) or the addiction group (SUD) using the Korean Smartphone Addiction Proneness Scale for Adults (S-Scale) and a face-to-face offline interview by a psychiatrist and a clinical psychologist (SUC = 23 and SUD = 25). We derived usage patterns using tensor factorization and found the following six optimal usage patterns: 1) social networking services (SNS) during the daytime, 2) web surfing, 3) SNS at night, 4) mobile shopping, 5) entertainment, and 6) gaming at night. The membership vectors of the six patterns yielded significantly better prediction performance than the raw data. For all patterns, the usage times of the SUD group were much longer than those of the SUC group. From our findings, we concluded that usage patterns and membership vectors are effective tools for assessing and predicting smartphone dependence, and could provide an intervention guideline for predicting and treating smartphone dependence based on usage data. PMID:28636614

  13. An assessment of the impact of ATMS and CrIS data assimilation on precipitation prediction over the Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Xue, Tong; Xu, Jianjun; Guan, Zhaoyong; Chen, Han-Ching; Chiu, Long S.; Shao, Min

    2017-07-01

    Using the National Oceanic and Atmospheric Administration's Gridpoint Statistical Interpolation data assimilation system and the National Center for Atmospheric Research's Advanced Research Weather Research and Forecasting (WRF-ARW) regional model, the impact of assimilating Advanced Technology Microwave Sounder (ATMS) and Cross-track Infrared Sounder (CrIS) satellite data on precipitation prediction over the Tibetan Plateau in July 2015 was evaluated. Four experiments were designed: a control experiment and three data assimilation experiments with different data sets injected: conventional data only, a combination of conventional and ATMS satellite data, and a combination of conventional and CrIS satellite data. The results showed that the monthly mean precipitation was shifted northward in the simulations, with an orographic bias consisting of overestimation upwind of the mountains and underestimation south of the rain belt. The rain shadow mainly influenced the predicted quantity of precipitation, although the main rainfall pattern was well simulated. For the first and last 24 h of accumulated daily precipitation, the model generally overestimated precipitation, but underestimated it during the heavy-rainfall periods of 3-5, 13-16, and 22-25 July. The observed water vapor conveyance from the southeastern Tibetan Plateau was larger than in the model simulations, which induced inaccuracies in the forecast of heavy rain on 3-5 July. The data assimilation experiments, particularly the ATMS assimilation, were closer to the observations for the heavy-rainfall process than the control. Overall, based on the experiments in July 2015, satellite data assimilation improved the prediction of the precipitation pattern over the Tibetan Plateau to some extent, although the regional shift of the rain belt was present even in the simulation without data assimilation.

  14. Prediction-Oriented Marker Selection (PROMISE): With Application to High-Dimensional Regression.

    PubMed

    Kim, Soyeon; Baladandayuthapani, Veerabhadran; Lee, J Jack

    2017-06-01

    In personalized medicine, biomarkers are used to select therapies with the highest likelihood of success based on an individual patient's biomarker/genomic profile. Two goals are to choose important biomarkers that accurately predict treatment outcomes and to cull unimportant biomarkers to reduce the cost of biological and clinical verifications. These goals are challenging due to the high dimensionality of genomic data. Variable selection methods based on penalized regression (e.g., the lasso and elastic net) have yielded promising results. However, selecting the right amount of penalization is critical to simultaneously achieving these two goals. Standard approaches based on cross-validation (CV) typically provide high prediction accuracy with high true positive rates but at the cost of too many false positives. Alternatively, stability selection (SS) controls the number of false positives, but at the cost of yielding too few true positives. To circumvent these issues, we propose prediction-oriented marker selection (PROMISE), which combines SS with CV to obtain the advantages of both methods. Our application of PROMISE with the lasso and elastic net in data analysis shows that, compared to CV, PROMISE produces sparse solutions, few false positives, and small type I + type II error, and maintains good prediction accuracy, with a marginal decrease in the true positive rates. Compared to SS, PROMISE offers better prediction accuracy and true positive rates. In summary, PROMISE can be applied in many fields to select regularization parameters when the goals are to minimize false positives and maximize prediction accuracy.
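    The stability-selection idea that PROMISE builds on, refitting a selector on many random subsamples and keeping only features chosen in a high fraction of fits, can be sketched compactly. For brevity this sketch uses a simple correlation screen in place of the lasso, and all data and thresholds are illustrative, not from the paper.

```python
# Schematic stability selection: count how often each feature is selected
# across random half-subsamples; keep features above a selection-frequency
# threshold. A correlation screen stands in for the lasso here.
import random

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def stability_select(X, y, n_boot=100, frac=0.5, r_thresh=0.5, pi_thresh=0.8, seed=1):
    rng = random.Random(seed)
    n, p = len(X), len(X[0])
    counts = [0] * p
    for _ in range(n_boot):
        idx = rng.sample(range(n), int(frac * n))     # random subsample
        for j in range(p):
            xj = [X[i][j] for i in idx]
            yj = [y[i] for i in idx]
            if abs(corr(xj, yj)) >= r_thresh:         # feature "selected"
                counts[j] += 1
    return [j for j in range(p) if counts[j] / n_boot >= pi_thresh]

# Toy data: feature 0 drives the outcome, feature 1 is pure noise.
rng = random.Random(0)
X = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(200)]
y = [row[0] + 0.1 * rng.gauss(0, 1) for row in X]
selected = stability_select(X, y)   # expected to keep only feature 0
```

    PROMISE's contribution is to pick the regularization strength by CV while retaining this frequency-based control of false positives.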

  15. User Controllability in a Hybrid Recommender System

    ERIC Educational Resources Information Center

    Parra Santander, Denis Alejandro

    2013-01-01

    Since the introduction of Tapestry in 1990, research on recommender systems has traditionally focused on the development of algorithms whose goal is to increase the accuracy of predicting users' taste based on historical data. In the last decade, this research has diversified, with "human factors" being one area that has received…

  16. Impact of rain gauge quality control and interpolation on streamflow simulation: an application to the Warwick catchment, Australia

    NASA Astrophysics Data System (ADS)

    Liu, Shulun; Li, Yuan; Pauwels, Valentijn R. N.; Walker, Jeffrey P.

    2017-12-01

    Rain gauges are widely used to obtain temporally continuous point rainfall records, which are then interpolated into spatially continuous data to force hydrological models. However, rainfall measurements and the interpolation procedure are subject to various uncertainties, which can be reduced by applying quality control and selecting appropriate spatial interpolation approaches. Consequently, the integrated impact of rainfall quality control and interpolation on streamflow simulation has attracted increased attention but has not been fully addressed. This study applies a quality control procedure to the hourly rainfall measurements obtained in the Warwick catchment in eastern Australia. The grid-based daily precipitation from the Australian Water Availability Project was used as a reference. The Pearson correlation coefficient between the daily accumulation of gauged rainfall and the reference data was used to eliminate gauges with significant quality issues. Unrealistic outliers were censored based on a comparison between gauged rainfall and the reference. Four interpolation methods, including inverse distance weighting (IDW), nearest neighbors (NN), linear spline (LN), and ordinary Kriging (OK), were implemented. The four methods were first assessed through cross-validation using the quality-controlled rainfall data. The impacts of quality control and interpolation on streamflow simulation were then evaluated through a semi-distributed hydrological model. The results showed that the Nash–Sutcliffe model efficiency coefficient (NSE) and bias of the streamflow simulations were significantly improved after quality control. In the cross-validation, the IDW and OK methods produced good rainfall interpolation, while NN produced the worst results. In terms of the impact on hydrological prediction, IDW led to the streamflow predictions most consistent with the observations, according to the validation at five streamflow-gauged locations. The OK method performed second best according to streamflow predictions at the five gauges in the calibration period (01/01/2007–31/12/2011) and four gauges during the validation period (01/01/2012–30/06/2014). However, NN produced the worst prediction at the outlet of the catchment in the validation period, indicating low robustness. While IDW exhibited the best performance in the study catchment in terms of accuracy, robustness, and efficiency, more general recommendations on the selection of rainfall interpolation methods need to be further explored.
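    Of the four interpolation methods compared, IDW is the simplest to state: each grid point receives a weighted average of the gauge readings, with weights decaying as an inverse power of distance. A minimal sketch follows; the gauge coordinates and rainfall values are made up for illustration.

```python
# Minimal inverse-distance-weighting (IDW) gauge-to-point interpolation.
def idw(gauges, point, power=2):
    """Interpolate rainfall at `point` from (x, y, value) gauges via IDW."""
    num = den = 0.0
    for gx, gy, value in gauges:
        d2 = (gx - point[0]) ** 2 + (gy - point[1]) ** 2
        if d2 == 0:
            return value          # exactly at a gauge: return its reading
        w = 1.0 / d2 ** (power / 2)   # weight = 1 / distance**power
        num += w * value
        den += w
    return num / den

# Three toy gauges, all equidistant from the query point, so the estimate
# is their plain average.
gauges = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
estimate = idw(gauges, (0.5, 0.5))
```

    NN and OK differ only in the weighting scheme: NN gives all weight to the closest gauge, while OK derives weights from a fitted variogram.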

  17. Impact of rain gauge quality control and interpolation on streamflow simulation: an application to the Warwick catchment, Australia

    NASA Astrophysics Data System (ADS)

    Liu, Shulun; Li, Yuan; Pauwels, Valentijn R. N.; Walker, Jeffrey P.

    2018-01-01

    Rain gauges are widely used to obtain temporally continuous point rainfall records, which are then interpolated into spatially continuous data to force hydrological models. However, rainfall measurements and the interpolation procedure are subject to various uncertainties, which can be reduced by applying quality control and selecting appropriate spatial interpolation approaches. Consequently, the integrated impact of rainfall quality control and interpolation on streamflow simulation has attracted increased attention but has not been fully addressed. This study applies a quality control procedure to the hourly rainfall measurements obtained in the Warwick catchment in eastern Australia. The grid-based daily precipitation from the Australian Water Availability Project was used as a reference. The Pearson correlation coefficient between the daily accumulation of gauged rainfall and the reference data was used to eliminate gauges with significant quality issues. Unrealistic outliers were censored based on a comparison between gauged rainfall and the reference. Four interpolation methods, including inverse distance weighting (IDW), nearest neighbors (NN), linear spline (LN), and ordinary Kriging (OK), were implemented. The four methods were first assessed through cross-validation using the quality-controlled rainfall data. The impacts of quality control and interpolation on streamflow simulation were then evaluated through a semi-distributed hydrological model. The results showed that the Nash–Sutcliffe model efficiency coefficient (NSE) and bias of the streamflow simulations were significantly improved after quality control. In the cross-validation, the IDW and OK methods produced good rainfall interpolation, while NN produced the worst results. In terms of the impact on hydrological prediction, IDW led to the streamflow predictions most consistent with the observations, according to the validation at five streamflow-gauged locations. The OK method performed second best according to streamflow predictions at the five gauges in the calibration period (01/01/2007–31/12/2011) and four gauges during the validation period (01/01/2012–30/06/2014). However, NN produced the worst prediction at the outlet of the catchment in the validation period, indicating low robustness. While IDW exhibited the best performance in the study catchment in terms of accuracy, robustness, and efficiency, more general recommendations on the selection of rainfall interpolation methods need to be further explored.

  18. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach

    PubMed Central

    Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

    2016-01-01

    The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing. PMID:27462287

  19. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach.

    PubMed

    Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

    2016-01-01

    The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing.

  20. Brain Network Theory Can Predict Whether Neuropsychological Outcomes Will Differ from Clinical Expectations.

    PubMed

    Warren, David E; Denburg, Natalie L; Power, Jonathan D; Bruss, Joel; Waldron, Eric J; Sun, Haoxin; Petersen, Steve E; Tranel, Daniel

    2017-02-01

    Theories of brain-network organization based on neuroimaging data have burgeoned in recent years, but the predictive power of such theories for cognition and behavior has only rarely been examined. Here, predictions from clinical neuropsychologists about the cognitive profiles of patients with focal brain lesions were used to evaluate a brain-network theory (Warren et al., 2014). Neuropsychologists made predictions regarding the neuropsychological profiles of a neurological patient sample (N = 30) based on lesion location. The neuropsychologists then rated the congruence of their predictions with observed neuropsychological outcomes, in regard to the "severity" of neuropsychological deficits and the "focality" of neuropsychological deficits. Based on the network theory, two types of lesion locations were identified: "target" locations (putative hubs in a brain-wide network) and "control" locations (hypothesized to play limited roles in network function). We found that patients with lesions of target locations (N = 19) had deficits of greater than expected severity that were more widespread than expected, whereas patients with lesions of control locations (N = 11) showed milder, circumscribed deficits that were more congruent with expectations. The findings for the target brain locations suggest that prevailing views of brain-behavior relationships may be sharpened and refined by integrating recently proposed network-oriented perspectives. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Controller evaluations of the descent advisor automation aid

    NASA Technical Reports Server (NTRS)

    Tobias, Leonard; Volckers, Uwe; Erzberger, Heinz

    1989-01-01

    An automation aid to assist air traffic controllers in efficiently spacing traffic and meeting arrival times at a fix has been developed at NASA Ames Research Center. The automation aid, referred to as the descent advisor (DA), is based on accurate models of aircraft performance and weather conditions. The DA generates suggested clearances, including both top-of-descent point and speed profile data, for one or more aircraft in order to achieve specific time or distance separation objectives. The DA algorithm is interfaced with a mouse-based, menu-driven controller display that allows the air traffic controller to interactively use its accurate predictive capability to resolve conflicts and issue advisories to arrival aircraft. This paper focuses on operational issues concerning the utilization of the DA, specifically, how the DA can be used for prediction, in-trail spacing, and metering. In order to evaluate the DA, a real-time simulation was conducted using both current and retired controller subjects. Controllers operated in teams of two, as they do in the present environment; issues of training and team interaction are discussed. Evaluations by controllers indicated considerable enthusiasm for the DA aid and provided specific recommendations for using the tool effectively.

  2. Model-based redesign of global transcription regulation

    PubMed Central

    Carrera, Javier; Rodrigo, Guillermo; Jaramillo, Alfonso

    2009-01-01

    Synthetic biology aims at the design or redesign of biological systems. In particular, one possible goal is the rewiring of the transcription regulation network by exchanging the endogenous promoters. To achieve this objective, we have adapted current methods to infer a model based on ordinary differential equations that is able to predict the network response after a major change in its topology. Our procedure uses microarray data for training. We have experimentally validated our inferred global regulatory model in Escherichia coli by predicting transcriptomic profiles under new perturbations. We have also tested our methodology in silico by providing accurate predictions of the underlying networks from expression data generated with artificial genomes. In addition, we have shown the predictive power of our methodology by obtaining the gene profile in experimental redesigns of the E. coli genome, in which the transcriptional network was rewired by knocking out master regulators or by upregulating transcription factors controlled by different promoters. Our approach is compatible with most network inference methods, allowing computational exploration of future genome-wide redesign experiments in synthetic biology. PMID:19188257

  3. Longitudinal Study-Based Dementia Prediction for Public Health

    PubMed Central

    Kim, HeeChel; Chun, Hong-Woo; Kim, Seonho; Coh, Byoung-Youl; Kwon, Oh-Jin; Moon, Yeong-Ho

    2017-01-01

    The issue of public health in Korea has attracted significant attention given the aging of the country’s population, which has created many types of social problems. The approach proposed in this article aims to address dementia, one of the most significant symptoms of aging and a public health care issue in Korea. The Korean National Health Insurance Service Senior Cohort Database contains personal medical data of every citizen in Korea, and medical history patterns differ considerably between individuals with dementia and normal controls. The approach used in this study examined personal medical history features drawn from personal disease history, sociodemographic data, and personal health examinations to develop a prediction model. The prediction model used a support-vector machine learning technique and was evaluated with a 10-fold cross-validation analysis. The experimental results demonstrated promising performance (80.9% F-measure). The results also supported the significant influence of personal medical history features within an optimal observation period. It is anticipated that a biomedical “big data”-based disease prediction model may assist in diagnosing diseases more accurately. PMID:28867810
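    The evaluation protocol here, 10-fold cross-validation of a binary classifier, follows a standard pattern worth making concrete. In the sketch below a nearest-centroid classifier stands in for the paper's support-vector machine (to keep the example dependency-free), and the two-feature toy data are invented.

```python
# Sketch of 10-fold cross-validation; a nearest-centroid classifier stands in
# for the SVM, and the data are two well-separated toy classes.
import random

def nearest_centroid_fit(X, y):
    """Compute one centroid per class label."""
    cents = {}
    for label in set(y):
        rows = [x for x, t in zip(X, y) if t == label]
        cents[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return cents

def predict(cents, x):
    def d2(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(cents, key=lambda label: d2(cents[label]))

def cross_val_accuracy(X, y, k=10, seed=0):
    """Mean held-out accuracy over k folds."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accs = []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        cents = nearest_centroid_fit([X[i] for i in train], [y[i] for i in train])
        hits = sum(predict(cents, X[i]) == y[i] for i in fold)
        accs.append(hits / len(fold))
    return sum(accs) / k

rng = random.Random(1)
X = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(50)] + \
    [[rng.gauss(5, 1), rng.gauss(5, 1)] for _ in range(50)]
y = [0] * 50 + [1] * 50
acc = cross_val_accuracy(X, y)   # near-perfect on separable classes
```

    The paper's 80.9% F-measure would be computed from precision and recall on the held-out folds rather than raw accuracy.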

  4. Collaborative development of predictive toxicology applications

    PubMed Central

    2010-01-01

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. 
Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436

  5. Collaborative development of predictive toxicology applications.

    PubMed

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. 
The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.

  6. Collaboratory for the Study of Earthquake Predictability

    NASA Astrophysics Data System (ADS)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure---the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

  7. Gene-expression programming for flip-bucket spillway scour.

    PubMed

    Guven, Aytac; Azamathulla, H Md

    2012-01-01

    During the last two decades, researchers have found that soft computing techniques, used as an alternative to conventional statistical methods based on controlled laboratory or field data, give significantly better results. Gene-expression programming (GEP), an extension of genetic programming (GP), has recently attracted the attention of researchers for the prediction of hydraulic data. This study presents GEP as an alternative tool for predicting scour downstream of a flip-bucket spillway. Actual field measurements were used to develop the GEP models. The proposed GEP models are compared with the earlier conventional GP results of others (Azamathulla et al. 2008b; RMSE = 2.347, δ = 0.377, R = 0.842) and with commonly used regression-based formulae. The predictions of the GEP models were in close agreement with the measured values, and considerably better than those of conventional GP and the regression-based formulae. The results are tabulated in terms of statistical error measures (GEP1; RMSE = 1.596, δ = 0.109, R = 0.917) and illustrated via scatter plots.
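    The error measures used to rank the models, RMSE and the correlation coefficient R, are straightforward to compute; the observed/predicted values below are illustrative, not the study's scour data.

```python
# The two comparison metrics quoted in the abstract: RMSE and Pearson's R.
import math

def rmse(obs, pred):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def pearson_r(obs, pred):
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)

observed  = [2.0, 4.0, 6.0, 8.0]
predicted = [2.5, 3.5, 6.5, 7.5]
error = rmse(observed, predicted)
r = pearson_r(observed, predicted)
```

    A lower RMSE together with a higher R, as reported for GEP1 versus the GP baseline, indicates both smaller errors and stronger linear agreement with the measurements.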

  8. Modeling and control of plasma rotation for NSTX using neoclassical toroidal viscosity and neutral beam injection

    NASA Astrophysics Data System (ADS)

    Goumiri, I. R.; Rowley, C. W.; Sabbagh, S. A.; Gates, D. A.; Gerhardt, S. P.; Boyer, M. D.; Andre, R.; Kolemen, E.; Taira, K.

    2016-03-01

    A model-based feedback system is presented to control plasma rotation in a magnetically confined toroidal fusion device, to maintain plasma stability for long-pulse operation. This research uses experimental measurements from the National Spherical Torus Experiment (NSTX) and is aimed at controlling plasma rotation using two different types of actuation: momentum from injected neutral beams and neoclassical toroidal viscosity generated by three-dimensional applied magnetic fields. Based on the data-driven model obtained, a feedback controller is designed, and predictive simulations using the TRANSP plasma transport code show that the controller is able to attain desired plasma rotation profiles given practical constraints on the actuators and the available measurements of rotation.

  9. Modeling and control of plasma rotation for NSTX using neoclassical toroidal viscosity and neutral beam injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goumiri, I. R.; Rowley, C. W.; Sabbagh, S. A.

    2016-02-19

    A model-based feedback system is presented to control plasma rotation in a magnetically confined toroidal fusion device, to maintain plasma stability for long-pulse operation. This research uses experimental measurements from the National Spherical Torus Experiment (NSTX) and is aimed at controlling plasma rotation using two different types of actuation: momentum from injected neutral beams and neoclassical toroidal viscosity generated by three-dimensional applied magnetic fields. Based on the data-driven model obtained, a feedback controller is designed, and predictive simulations using the TRANSP plasma transport code show that the controller is able to attain desired plasma rotation profiles given practical constraints onmore » the actuators and the available measurements of rotation.« less

  10. Predictability of depression severity based on posterior alpha oscillations.

    PubMed

    Jiang, H; Popov, T; Jylänki, P; Bi, K; Yao, Z; Lu, Q; Jensen, O; van Gerven, M A J

    2016-04-01

    We aimed to integrate neural data and an advanced machine learning technique to predict depression severity in individual major depressive disorder (MDD) patients. MEG data were acquired from 22 MDD patients and 22 healthy controls (HC) resting awake with eyes closed. Individual power spectra were calculated by a Fourier transform. Sources were reconstructed via a beamforming technique. Bayesian linear regression was applied to predict depression severity based on the spatial distribution of oscillatory power. In MDD patients, decreased theta (4-8 Hz) and alpha (8-14 Hz) power was observed in fronto-central and posterior areas respectively, whereas increased beta (14-30 Hz) power was observed in fronto-central regions. In particular, posterior alpha power was negatively related to depression severity. The Bayesian linear regression model showed significant depression severity prediction performance based on the spatial distribution of alpha (r=0.68, p=0.0005) and beta (r=0.56, p=0.007) power. Our findings point to a specific alteration of oscillatory brain activity in MDD patients during rest, as characterized from MEG data in terms of spectral and spatial distribution. The proposed model yielded a quantitative and objective estimate of depression severity, which in turn has potential for diagnosis and monitoring of the recovery process. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
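
    The regression step can be sketched with a ridge-regularized (MAP) linear fit, the simplest form of Bayesian linear regression, mapping one predictor (posterior alpha power) to a severity score. The data and the prior precision `lam` are illustrative assumptions, not the study's values:

```python
# One-predictor Bayesian (ridge-regularized) linear regression sketch.
# With a Gaussian prior of precision lam on the weights, the MAP estimate
# solves the ridge normal equations for a design matrix [1, x].

def bayes_fit(x, y, lam=1.0):
    """Return MAP (intercept, slope) under a Gaussian weight prior."""
    n = len(x)
    s1, sx, sxx = n + lam, sum(x), sum(v * v for v in x) + lam
    sy, sxy = sum(y), sum(a * b for a, b in zip(x, y))
    det = s1 * sxx - sx * sx
    b0 = (sxx * sy - sx * sxy) / det
    b1 = (s1 * sxy - sx * sy) / det
    return b0, b1

# Illustrative pairs: higher posterior alpha power, lower severity score.
alpha_power = [1.0, 2.0, 3.0, 4.0, 5.0]
severity = [9.0, 7.1, 5.0, 2.9, 1.0]
b0, b1 = bayes_fit(alpha_power, severity)
```

    As `lam` shrinks toward zero the fit approaches ordinary least squares; a nonzero prior shrinks the weights, which is what stabilizes voxelwise regression over many correlated power features.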

  11. Towards an agent based traffic regulation and recommendation system for the on-road air quality control.

    PubMed

    Sadiq, Abderrahmane; El Fazziki, Abdelaziz; Ouarzazi, Jamal; Sadgal, Mohamed

    2016-01-01

    This paper presents an integrated and adaptive problem-solving approach to controlling on-road air quality by modeling the road infrastructure, managing traffic based on pollution level, and generating recommendations for road users. The aim is to reduce vehicle emissions on the most polluted road segments and to optimize pollution levels. To this end, we propose using historical and real-time pollution records and contextual data to calculate an air quality index over the road network and to generate recommendations for reassigning traffic flow so as to improve on-road air quality. The resulting air quality indexes are used in the system's traffic network, whose cartography is represented by a weighted graph; the weights evolve according to the pollution indexes and path properties, so the graph is dynamic. Furthermore, the system uses the available pollution data and meteorological records to predict on-road pollutant levels with an artificial neural network based prediction model. The proposed approach combines the benefits of multi-agent systems, Big Data technology, machine learning tools, and the available data sources. For shortest-path search in the road network, we use Dijkstra's algorithm over the Hadoop MapReduce framework. Using the Hadoop framework in the data retrieval and analysis process significantly improved the performance of the proposed system. The agent technology also yielded a suitable solution in terms of robustness and agility.
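
    The routing step can be sketched as Dijkstra's algorithm over a graph whose edge weights mix segment length with an air-quality index, so the "shortest" path is the least-polluted feasible one. The tiny graph and the weighting formula below are illustrative assumptions, not the paper's model:

```python
import heapq

def dijkstra(graph, start, goal):
    """Return (cost, path) of the minimum-weight path in an adjacency dict."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, length, aqi in graph.get(node, []):
            w = length * (1.0 + aqi / 100.0)   # assumed pollution penalty on length
            heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

road_net = {   # node: [(neighbor, length_km, air_quality_index), ...]
    "A": [("B", 2.0, 150), ("C", 3.0, 20)],
    "B": [("D", 2.0, 30)],
    "C": [("D", 2.5, 25)],
}
cost, path = dijkstra(road_net, "A", "D")
```

    Here the geometrically longer route through C wins because the short route through B crosses a heavily polluted segment; as the pollution indexes change, the weights and hence the recommended routes change with them.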

  12. Validity, accuracy, and predictive value of urinary tract infection signs and symptoms in individuals with spinal cord injury on intermittent catheterization.

    PubMed

    Massa, Luiz M; Hoffman, Jeanne M; Cardenas, Diana D

    2009-01-01

    To determine the validity, accuracy, and predictive value of the signs and symptoms of urinary tract infection (UTI) for individuals with spinal cord injury (SCI) using intermittent catheterization (IC) and the accuracy of individuals with SCI on IC at predicting their own UTI. Prospective cohort based on data from the first 3 months of a 1-year randomized controlled trial to evaluate UTI prevention effectiveness of hydrophilic and standard catheters. Fifty-six community-based individuals on IC. Presence of UTI as defined as bacteriuria with a colony count of at least 10^5 colony-forming units/mL and at least 1 sign or symptom of UTI. Analysis of monthly urine culture and urinalysis data combined with analysis of monthly data collected using a questionnaire that asked subjects to self-report on UTI signs and symptoms and whether or not they felt they had a UTI. Overall, "cloudy urine" had the highest accuracy (83.1%), and "leukocytes in the urine" had the highest sensitivity (82.8%). The highest specificity was for "fever" (99.0%); however, it had a very low sensitivity (6.9%). Subjects were able to predict their own UTI with an accuracy of 66.2%, and the negative predictive value (82.8%) was substantially higher than the positive predictive value (32.6%). The UTI signs and symptoms can predict a UTI more accurately than individual subjects can by using subjective impressions of their own signs and symptoms. Subjects were better at predicting when they did not have a UTI than when they did have a UTI.
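
    The diagnostic measures reported above all derive from a 2x2 confusion table; a minimal sketch with illustrative counts, not the study's data:

```python
# Accuracy, sensitivity, specificity, PPV, and NPV from confusion counts.

def diagnostics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Illustrative counts for one sign/symptom against culture-confirmed UTI.
m = diagnostics(tp=15, fp=10, fn=5, tn=70)
```

    Note how NPV can far exceed PPV when the condition is uncommon, the same asymmetry the study observed in subjects' self-predictions.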

  13. Development of a noise prediction model based on advanced fuzzy approaches in typical industrial workrooms.

    PubMed

    Aliabadi, Mohsen; Golmohammadi, Rostam; Khotanlou, Hassan; Mansoorizadeh, Muharram; Salarpour, Amir

    2014-01-01

    Noise prediction is considered to be the best method for evaluating cost-preventative noise controls in industrial workrooms. One of the most important issues is the development of accurate models for analyzing the complex relationships among the acoustic features affecting noise level in workrooms. In this study, advanced fuzzy approaches were employed to develop relatively accurate models for predicting noise in noisy industrial workrooms. The data were collected from 60 industrial embroidery workrooms in the Khorasan Province, in the east of Iran. The main acoustic and embroidery-process features that influence the noise were used to develop prediction models using MATLAB software. A multiple regression technique was also employed and its results were compared with those of the fuzzy approaches. The prediction errors of all models based on fuzzy approaches were within the acceptable level (lower than one dB). The neuro-fuzzy model (RMSE = 0.53 dB and R² = 0.88), however, slightly improved the accuracy of noise prediction compared with the generated fuzzy model. Moreover, the fuzzy approaches provided more accurate predictions than did the regression technique. The developed models based on fuzzy approaches are useful prediction tools that give professionals the opportunity to make an optimal decision about the effectiveness of acoustic treatment scenarios in embroidery workrooms.

  14. Fuzzy Integration of Support Vector Regression Models for Anticipatory Control of Complex Energy Systems

    DOE PAGES

    Alamaniotis, Miltiadis; Agarwal, Vivek

    2014-04-01

    Anticipatory control systems are a class of systems whose decisions are based on predictions of the future state of the system under monitoring. Anticipation denotes intelligence and is an inherent property of humans, who make decisions by projecting into the future. Likewise, artificially intelligent systems equipped with predictive functions may be utilized to anticipate future states of complex systems and therefore facilitate automated control decisions. Anticipatory control of complex energy systems is paramount to their normal and safe operation. In this paper a new intelligent methodology integrating fuzzy inference with support vector regression is introduced. The proposed methodology implements an anticipatory system aiming at controlling energy systems in a robust way. Initially, a set of support vector regressors is adopted for making predictions over critical system parameters. The predicted values are then fed into a two-stage fuzzy inference system that makes decisions regarding the state of the energy system. The inference system integrates the individual predictions into a single one at its first stage, and outputs a decision together with a certainty factor computed at its second stage. The certainty factor is an index of the significance of the decision. The proposed anticipatory control system is tested on a real-world data set obtained from a complex energy system, describing the degradation of a turbine. Results exhibit the robustness of the proposed system in controlling complex energy systems.
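
    The two-stage scheme can be sketched as follows: stage one fuses several regressor outputs into one estimate, and stage two issues a decision plus a certainty factor. The weights, alarm threshold, and certainty formula below are illustrative assumptions, not the paper's fuzzy rules:

```python
# Sketch of two-stage fusion of regressor predictions with a certainty factor.

def fuse(predictions, weights):
    """Stage one: weighted average of the individual regressor outputs."""
    return sum(p * w for p, w in zip(predictions, weights)) / sum(weights)

def decide(predictions, weights, threshold):
    """Stage two: decision plus a certainty factor from predictor agreement."""
    estimate = fuse(predictions, weights)
    spread = max(predictions) - min(predictions)
    certainty = 1.0 / (1.0 + spread)        # 1.0 when all regressors agree
    state = "degraded" if estimate > threshold else "normal"
    return state, certainty

# Three regressors forecasting a normalized degradation parameter.
state, cf = decide([0.82, 0.78, 0.85], [1.0, 2.0, 1.0], threshold=0.7)
```

    A tight spread among the regressors yields a certainty factor near one, signalling that the decision rests on consistent evidence rather than a single outlier prediction.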

  15. From data to the decision: A software architecture to integrate predictive modelling in clinical settings.

    PubMed

    Martinez-Millana, A; Fernandez-Llatas, C; Sacchi, L; Segagni, D; Guillen, S; Bellazzi, R; Traver, V

    2015-08-01

    The application of statistics and mathematics over large amounts of data is providing healthcare systems with new tools for screening and managing multiple diseases. Nonetheless, these tools have many technical and clinical limitations as they are based on datasets with concrete characteristics. This proposition paper describes a novel architecture focused on providing a validation framework for discrimination and prediction models in the screening of Type 2 diabetes. For that, the architecture has been designed to gather different data sources under a common data structure and, furthermore, to be controlled by a centralized component (Orchestrator) in charge of directing the interaction flows among data sources, models and graphical user interfaces. This innovative approach aims to overcome the data-dependency of the models by providing a validation framework for the models as they are used within clinical settings.

  16. Application of GIS to predict malaria hotspots based on Anopheles arabiensis habitat suitability in Southern Africa

    NASA Astrophysics Data System (ADS)

    Gwitira, Isaiah; Murwira, Amon; Zengeya, Fadzai M.; Shekede, Munyaradzi Davis

    2018-02-01

    Malaria remains a major public health problem and a principal cause of morbidity and mortality in most developing countries. Although malaria still presents health problems, significant successes have been recorded in reducing deaths resulting from the disease. As malaria transmission continues to decline, control interventions will increasingly depend on the ability to define high-risk areas known as malaria hotspots. There is therefore an urgent need to use geospatial tools such as geographic information systems to detect spatial patterns of malaria and delineate disease hotspots for better planning and management. Accurate mapping and prediction of the seasonality of malaria hotspots is thus an important step towards developing strategies for effective malaria control. In this study, we modelled seasonal malaria hotspots as a function of the habitat suitability of Anopheles arabiensis (A. arabiensis) as a first step towards predicting likely seasonal malaria hotspots that could guide targeted malaria control. We used geographic information system (GIS) and spatial statistics methods to identify seasonal hotspots of malaria cases at the country level. To achieve this, we first determined the spatial distribution of seasonal malaria hotspots using the Getis-Ord Gi* statistic based on confirmed positive malaria cases recorded at health facilities in Zimbabwe over four years (1996-1999). We then used the MaxEnt technique to model the habitat suitability of A. arabiensis from presence data collected from 1990 to 2002 based on bioclimatic variables and altitude. Finally, we used autologistic regression to test the extent to which malaria hotspots can be predicted from A. arabiensis habitat suitability. Our results show that A. arabiensis habitat suitability consistently and significantly (p < 0.05) predicts malaria hotspots from 1996 to 1999. Overall, our results show that malaria hotspots can be predicted using A. arabiensis habitat suitability, suggesting the possibility of developing models for malaria early warning based on vector habitat suitability.

  17. Development of a Model to Predict the Primary Infection Date of Bacterial Spot (Xanthomonas campestris pv. vesicatoria) on Hot Pepper.

    PubMed

    Kim, Ji-Hoon; Kang, Wee-Soo; Yun, Sung-Chul

    2014-06-01

    A population model of bacterial spot caused by Xanthomonas campestris pv. vesicatoria on hot pepper was developed to predict the primary disease infection date. The model estimated the pathogen population on the surface and within the leaf of the host based on the wetness period and temperature. For successful infection, at least 5,000 cells/ml of the bacterial population were required. Also, wind and rain were necessary according to regression analyses of the monitored data. Bacterial spot on the model is initiated when the pathogen population exceeds 10^15 cells/g within the leaf. The developed model was validated using 94 assessed samples from 2000 to 2007 obtained from monitored fields. Based on the validation study, the predicted initial infection dates varied based on the year rather than the location. Differences in initial infection dates between the model predictions and the monitored data in the field were minimal. For example, predicted infection dates for 7 locations were within the same month as the actual infection dates, 11 locations were within 1 month of the actual infection, and only 3 locations were more than 2 months apart from the actual infection. The predicted infection dates were mapped from 2009 to 2012; 2011 was the most severe year. Although the model was not sensitive enough to predict disease severity of less than 0.1% in the field, our model predicted bacterial spot severity of 1% or more. Therefore, this model can be applied in the field to determine when bacterial spot control is required.
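
    The model's threshold logic can be sketched as a daily loop: the pathogen population grows at a temperature- and wetness-dependent rate, and infection is declared once the within-leaf population crosses the threshold. Only the 5,000 cells/ml and 10^15 cells/g thresholds come from the abstract; the growth-rate function, initial inoculum, and weather series are illustrative assumptions:

```python
# Sketch of a threshold-based infection-date model.

def daily_rate(temp_c, wet_hours):
    """Assumed multiplicative growth factor for one day."""
    if wet_hours < 6 or not (15 <= temp_c <= 35):
        return 1.0                      # no growth without warmth and wetness
    return 1.0 + 0.9 * (wet_hours / 24.0)

def infection_day(weather, surface_threshold=5e3, leaf_threshold=1e15):
    """Return the first day index at which the within-leaf threshold is crossed."""
    pop = 10.0                          # assumed initial inoculum, cells/ml
    infected_leaf = 0.0
    for day, (temp, wet) in enumerate(weather):
        pop *= daily_rate(temp, wet)
        if pop >= surface_threshold:
            infected_leaf = pop ** 2    # crude within-leaf amplification stand-in
        if infected_leaf >= leaf_threshold:
            return day
    return None

# 200 warm, wet days in a row (illustrative weather).
day = infection_day([(25, 12)] * 200)
```

    Cold or dry series never reach the threshold, which mirrors the model's finding that wetness period and temperature jointly gate the predicted infection date.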

  18. Development of a Model to Predict the Primary Infection Date of Bacterial Spot (Xanthomonas campestris pv. vesicatoria) on Hot Pepper

    PubMed Central

    Kim, Ji-Hoon; Kang, Wee-Soo; Yun, Sung-Chul

    2014-01-01

    A population model of bacterial spot caused by Xanthomonas campestris pv. vesicatoria on hot pepper was developed to predict the primary disease infection date. The model estimated the pathogen population on the surface and within the leaf of the host based on the wetness period and temperature. For successful infection, at least 5,000 cells/ml of the bacterial population were required. Also, wind and rain were necessary according to regression analyses of the monitored data. Bacterial spot on the model is initiated when the pathogen population exceeds 10^15 cells/g within the leaf. The developed model was validated using 94 assessed samples from 2000 to 2007 obtained from monitored fields. Based on the validation study, the predicted initial infection dates varied based on the year rather than the location. Differences in initial infection dates between the model predictions and the monitored data in the field were minimal. For example, predicted infection dates for 7 locations were within the same month as the actual infection dates, 11 locations were within 1 month of the actual infection, and only 3 locations were more than 2 months apart from the actual infection. The predicted infection dates were mapped from 2009 to 2012; 2011 was the most severe year. Although the model was not sensitive enough to predict disease severity of less than 0.1% in the field, our model predicted bacterial spot severity of 1% or more. Therefore, this model can be applied in the field to determine when bacterial spot control is required. PMID:25288995

  19. Iterative learning-based decentralized adaptive tracker for large-scale systems: a digital redesign approach.

    PubMed

    Tsai, Jason Sheng-Hong; Du, Yan-Yi; Huang, Pei-Hsiang; Guo, Shu-Mei; Shieh, Leang-San; Chen, Yuhua

    2011-07-01

    In this paper, a digital redesign methodology for an iterative learning-based decentralized adaptive tracker is proposed to improve the dynamic performance of sampled-data linear large-scale control systems consisting of N interconnected multi-input multi-output subsystems, so that the system output can follow an arbitrary trajectory, one that may not initially be representable by the analytic reference model. To overcome the interference among subsystems and simplify the controller design, the proposed model reference decentralized adaptive control scheme first constructs a decoupled, well-designed reference model. Then, according to this reference model, the paper develops a digital decentralized adaptive tracker based on optimal analog control and a prediction-based digital redesign technique for the sampled-data large-scale coupled system. To enhance the tracking performance of the digital tracker at the specified sampling instants, we apply iterative learning control (ILC) to train the control input via continual learning. As a result, the proposed iterative learning-based decentralized adaptive tracker not only has a robust closed-loop decoupled property but also possesses good tracking performance at both transient and steady state. In addition, evolutionary programming is applied to search for a good learning gain to speed up the learning process of ILC. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Guidance and control 1992; Proceedings of the 15th Annual AAS Rocky Mountain Conference, Keystone, CO, Feb. 8-12, 1992

    NASA Astrophysics Data System (ADS)

    Culp, Robert D.; Zietz, Richard P.

    The present volume on guidance and control discusses advances in guidance, navigation, and control, guidance and control storyboard displays, space robotic control, spacecraft control and flexible body interaction, and the Mission to Planet Earth. Attention is given to applications of Newton's method to attitude determination, a new family of low-cost momentum/reaction wheels, stellar attitude data handling, and satellite life prediction using propellant quantity measurements. Topics addressed include robust manipulator controller specification and design, implementations and applications of a manipulator control testbed, optimizing transparency in teleoperator architectures, and MIMO system identification using frequency response data. Also discussed are instrument configurations for the restructured Earth Observing System, the HIRIS instrument, clouds and the earth's radiant energy system, and large space-based systems for dealing with global change.

  1. Web-based Traffic Noise Control Support System for Sustainable Transportation

    NASA Astrophysics Data System (ADS)

    Fan, Lisa; Dai, Liming; Li, Anson

    Traffic noise is considered one of the major forms of pollution that will affect our communities in the future. This paper presents a framework for a web-based traffic noise control support system (WTNCSS) for sustainable transportation. WTNCSS provides decision makers, engineers, and the public with a platform to efficiently access information and effectively make decisions related to traffic control. The system is based on a Service Oriented Architecture (SOA), which combines the convenience of the World Wide Web with the XML data format. The whole system is divided into modules such as the prediction module, the ontology-based expert module, and the dynamic online survey module. Each module provides a distinct information service to the decision support center through the HTTP protocol.

  2. IFCPT S-Duct Grid-Adapted FUN3D Computations for the Third Propulsion Aerodynamics Workshop

    NASA Technical Reports Server (NTRS)

    Davis, Zach S.; Park, M. A.

    2017-01-01

    Contributions of the unstructured Reynolds-averaged Navier-Stokes code, FUN3D, to the 3rd AIAA Propulsion Aerodynamics Workshop are described for the diffusing IFCPT S-Duct. Using workshop-supplied grids, results for the baseline S-Duct, baseline S-Duct with Aerodynamic Interface Plane (AIP) rake hardware, and baseline S-Duct with flow control devices are compared with experimental data and results computed with output-based, off-body grid adaptation in FUN3D. Due to the absence of influential geometry components, total pressure recovery is overpredicted on the baseline S-Duct and S-Duct with flow control vanes when compared to experimental values. An estimate for the exact value of total pressure recovery is derived for these cases given an infinitely refined mesh. When results from output-based mesh adaptation are compared with those computed on workshop-supplied grids, a considerable improvement in predicting total pressure recovery is observed. By including more representative geometry, output-based mesh adaptation compares very favorably with experimental data in terms of predicting the total pressure recovery cost-function; whereas, results computed using the workshop-supplied grids are underpredicted.

  3. Selecting the minimum prediction base of historical data to perform 5-year predictions of the cancer burden: The GoF-optimal method.

    PubMed

    Valls, Joan; Castellà, Gerard; Dyba, Tadeusz; Clèries, Ramon

    2015-06-01

    Predicting the future burden of cancer is a key issue for health services planning, where a method for selecting the predictive model and the prediction base is a challenge. A method, named here Goodness-of-Fit optimal (GoF-optimal), is presented to determine the minimum prediction base of historical data needed to perform 5-year predictions of the number of new cancer cases or deaths. An empirical ex-post evaluation exercise for cancer mortality data in Spain and cancer incidence in Finland using simple linear and log-linear Poisson models was performed. Prediction bases were considered within the time periods 1951-2006 in Spain and 1975-2007 in Finland, and then predictions were made for 37 and 33 single years in these periods, respectively. The performance of three fixed prediction bases (the last 5, 10, and 20 years of historical data) was compared to that of the prediction base determined by the GoF-optimal method. The coverage (COV) of the 95% prediction interval and the discrepancy ratio (DR) were calculated to assess the success of the prediction. The results showed that (i) models using the prediction base selected by the GoF-optimal method reached the highest COV and the lowest DR, and (ii) the best alternative to GoF-optimal was the strategy using a 5-year prediction base. The GoF-optimal approach can be used as a selection criterion for finding an adequate base of prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.
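
    The idea behind choosing a prediction base by goodness of fit can be sketched as follows: among candidate bases (the last k years), pick the one whose simple trend model fits the historical counts best, then project it forward. The scoring rule (mean squared residual of a linear trend) and the toy series are illustrative assumptions, not the paper's exact criterion:

```python
# Sketch: select the prediction base length by goodness of fit of a linear trend.

def linear_fit(ys):
    """Least-squares intercept/slope for y over x = 0..n-1."""
    n = len(ys)
    mx, my = (n - 1) / 2.0, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in range(n))
    sxy = sum((x - mx) * (y - my) for x, y in zip(range(n), ys))
    b = sxy / sxx
    return my - b * mx, b

def gof_optimal_base(series, candidates=(5, 10, 20)):
    """Return the base length whose trend fit has the smallest mean squared residual."""
    best_k, best_score = None, float("inf")
    for k in candidates:
        ys = series[-k:]
        a, b = linear_fit(ys)
        mse = sum((y - (a + b * x)) ** 2 for x, y in enumerate(ys)) / k
        if mse < best_score:
            best_k, best_score = k, mse
    return best_k

# Toy annual counts: a noisy old regime, then a clean linear rise in the last 5 years.
series = [100, 130, 90, 120, 95, 140, 85, 125, 100, 135,
          110, 90, 130, 105, 125, 200, 210, 220, 230, 240]
base = gof_optimal_base(series)
```

    Longer bases drag in the older, structurally different regime and fit worse, so the short base wins here, mirroring the paper's finding that a 5-year base was the best fixed alternative.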

  4. Integrating Statistical Machine Learning in a Semantic Sensor Web for Proactive Monitoring and Control.

    PubMed

    Adeleke, Jude Adekunle; Moodley, Deshendran; Rens, Gavin; Adewumi, Aderemi Oluyinka

    2017-04-09

    Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 is achieved over half hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of Semantic Sensor Web.
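
    The sliding-window forecasting step can be sketched with a simple linear extrapolation over the last w readings standing in for the study's Multilayer Perceptron; the synthetic PM2.5 series and episode threshold are illustrative assumptions:

```python
# Sliding-window one-step-ahead forecasting sketch (linear-trend stand-in
# for the MLP used in the study).

def forecast(window):
    """Extrapolate the window's linear trend one step ahead."""
    n = len(window)
    mx, my = (n - 1) / 2.0, sum(window) / n
    sxx = sum((x - mx) ** 2 for x in range(n))
    sxy = sum((x - mx) * (y - my) for x, y in zip(range(n), window))
    slope = sxy / sxx
    return (my - slope * mx) + slope * n    # value at the next time index

def predict_episodes(series, w=4, threshold=35.0):
    """Flag an upcoming pollution episode whenever the forecast exceeds threshold."""
    flags = []
    for t in range(w, len(series)):
        flags.append(forecast(series[t - w:t]) > threshold)
    return flags

# Illustrative PM2.5 readings (ug/m3) with one pollution episode.
pm25 = [10, 12, 14, 16, 30, 38, 44, 40, 20, 12]
flags = predict_episodes(pm25)
```

    The window slides forward one reading at a time, so warnings can be raised before the threshold is actually crossed, which is the proactive behavior the framework is after.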

  5. Integrating Statistical Machine Learning in a Semantic Sensor Web for Proactive Monitoring and Control

    PubMed Central

    Adeleke, Jude Adekunle; Moodley, Deshendran; Rens, Gavin; Adewumi, Aderemi Oluyinka

    2017-01-01

    Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 is achieved over half hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of Semantic Sensor Web. PMID:28397776

  6. Online prediction of organoleptic data for snack food using color images

    NASA Astrophysics Data System (ADS)

    Yu, Honglu; MacGregor, John F.

    2004-11-01

    In this paper, a study of the real-time prediction of organoleptic properties of snack food using RGB color images is presented. The so-called organoleptic properties, which are based on texture, taste, and sight, are generally measured either by human sensory response or by mechanical devices. Neither of these two methods can be used for on-line feedback control in high-speed production. In this situation, a vision-based soft sensor is very attractive: by taking images of the products, the samples remain untouched and the product properties can be predicted in real time from image data. Four organoleptic properties are considered in this study: blister level, toast points, taste, and peak break force. Wavelet transforms are applied to the color images, and the averaged absolute value of each filtered image is used as a texture feature variable. To handle the high correlation among the feature variables, Partial Least Squares (PLS) is used to regress the extracted feature variables against the four response variables.
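
    The PLS step can be sketched with a one-component PLS1 fit, the simplest latent-variable regression that tolerates perfectly correlated predictors. The tiny feature matrix is illustrative (a real application would have many wavelet-texture columns, mean-centered before fitting):

```python
# One-component PLS1 sketch: project X onto a single latent direction w,
# then regress y on the resulting scores t = Xw.

def pls1_one_component(X, y):
    """Fit a single PLS component; return the regression coefficient vector."""
    n, m = len(X), len(X[0])
    # Weight vector w proportional to X'y, normalized (data assumed centered
    # or passing through the origin, as in this toy example).
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(m)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # Scores t = Xw and the regression of y on t.
    t = [sum(X[i][j] * w[j] for j in range(m)) for i in range(n)]
    q = sum(ti * yi for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return [wj * q for wj in w]          # prediction: y_hat = X @ (w*q)

# Two perfectly correlated texture features driving one response: ordinary
# least squares would be singular here, but PLS handles it.
X = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]
y = [5.0, 10.0, 15.0]
coef = pls1_one_component(X, y)
pred = [sum(x * c for x, c in zip(row, coef)) for row in X]
```

    This collinearity tolerance is exactly why PLS suits wavelet texture features, which are highly correlated by construction.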

  7. Alzheimer Disease and Behavioral Variant Frontotemporal Dementia: Automatic Classification Based on Cortical Atrophy for Single-Subject Diagnosis.

    PubMed

    Möller, Christiane; Pijnenburg, Yolande A L; van der Flier, Wiesje M; Versteeg, Adriaan; Tijms, Betty; de Munck, Jan C; Hafkemeijer, Anne; Rombouts, Serge A R B; van der Grond, Jeroen; van Swieten, John; Dopper, Elise; Scheltens, Philip; Barkhof, Frederik; Vrenken, Hugo; Wink, Alle Meije

    2016-06-01

    Purpose To investigate the diagnostic accuracy of an image-based classifier to distinguish between Alzheimer disease (AD) and behavioral variant frontotemporal dementia (bvFTD) in individual patients by using gray matter (GM) density maps computed from standard T1-weighted structural images obtained with multiple imagers and with independent training and prediction data. Materials and Methods The local institutional review board approved the study. Eighty-four patients with AD, 51 patients with bvFTD, and 94 control subjects were divided into independent training (n = 115) and prediction (n = 114) sets with identical diagnosis and imager type distributions. Training of a support vector machine (SVM) classifier used diagnostic status and GM density maps and produced voxelwise discrimination maps. Discriminant function analysis was used to estimate suitability of the extracted weights for single-subject classification in the prediction set. Receiver operating characteristic (ROC) curves and area under the ROC curve (AUC) were calculated for image-based classifiers and neuropsychological z scores. Results Training accuracy of the SVM was 85% for patients with AD versus control subjects, 72% for patients with bvFTD versus control subjects, and 79% for patients with AD versus patients with bvFTD (P ≤ .029). Single-subject diagnosis in the prediction set when using the discrimination maps yielded accuracies of 88% for patients with AD versus control subjects, 85% for patients with bvFTD versus control subjects, and 82% for patients with AD versus patients with bvFTD, with a good to excellent AUC (range, 0.81-0.95; P ≤ .001). Machine learning-based categorization of AD versus bvFTD based on GM density maps outperforms classification based on neuropsychological test results. Conclusion The SVM can be used in single-subject discrimination and can help the clinician arrive at a diagnosis. 
The SVM can be used to distinguish disease-specific GM patterns in patients with AD and those with bvFTD as compared with normal aging by using common T1-weighted structural MR imaging. (©) RSNA, 2015.
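
    The reported AUC values can be computed without plotting the ROC curve at all: the Mann-Whitney formulation gives the probability that a randomly chosen patient scores higher than a randomly chosen control. The scores below are illustrative, not the study's classifier outputs:

```python
# Rank-based (Mann-Whitney) estimate of the area under the ROC curve.

def auc(pos_scores, neg_scores):
    """Fraction of (positive, negative) pairs ranked correctly; ties count half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Illustrative discrimination-map scores for patients (pos) and controls (neg).
patients = [0.9, 0.8, 0.75, 0.6, 0.55]
controls = [0.7, 0.5, 0.4, 0.3, 0.2]
area = auc(patients, controls)
```

    An AUC of 0.5 means chance-level ranking and 1.0 means perfect separation, which is why the study's 0.81-0.95 range is read as good to excellent.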

  8. Development of an expert system for analysis of Shuttle atmospheric revitalization and pressure control subsystem anomalies

    NASA Technical Reports Server (NTRS)

    Lafuse, Sharon A.

    1991-01-01

    The paper describes the Shuttle Leak Management Expert System (SLMES), a preprototype expert system developed to enable the ECLSS subsystem manager to analyze subsystem anomalies and to formulate flight procedures based on flight data. The SLMES combines the rule-based expert system technology with the traditional FORTRAN-based software into an integrated system. SLMES analyzes the data using rules, and, when it detects a problem that requires simulation, it sets up the input for the FORTRAN-based simulation program ARPCS2AT2, which predicts the cabin total pressure and composition as a function of time. The program simulates the pressure control system, the crew oxygen masks, the airlock repress/depress valves, and the leakage. When the simulation has completed, other SLMES rules are triggered to examine the results of simulation contrary to flight data and to suggest methods for correcting the problem. Results are then presented in form of graphs and tables.

  9. Robust current control-based generalized predictive control with sliding mode disturbance compensation for PMSM drives.

    PubMed

    Liu, Xudong; Zhang, Chenghui; Li, Ke; Zhang, Qi

    2017-11-01

This paper addresses the current control of permanent magnet synchronous motors (PMSM) for electric drives with model uncertainties and disturbances. A generalized predictive current control method combined with sliding mode disturbance compensation is proposed to satisfy the requirement of fast response and strong robustness. Firstly, according to generalized predictive control (GPC) theory based on the continuous-time model, a predictive current control method is presented without considering the disturbance; this method is convenient to realize in a digital controller. In practice, it is difficult to derive the exact motor model and parameters. Thus, a sliding mode disturbance compensation controller is studied to improve the adaptiveness and robustness of the control system. The designed controller attempts to combine the merits of both predictive control and sliding mode control; meanwhile, the controller parameters are easy to adjust. Lastly, the proposed controller is tested on an interior PMSM by simulation and experiment, and the results indicate that it has good performance in both current tracking and disturbance rejection. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
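The disturbance-compensation idea above can be illustrated with a minimal simulation sketch. This is not the paper's controller: the plant is reduced to a first-order current loop, and the numbers, gains, and disturbance are invented for illustration.

```python
# Minimal sketch (not the paper's controller): a first-order current loop
#   L * di/dt = u - R*i + d
# with an unknown constant disturbance d. The control law is a nominal
# predictive term plus a saturated sliding-mode compensation term.
# All plant parameters and gains below are invented for illustration.
R, L_ind = 1.0, 0.01           # resistance [ohm] and inductance [H]
d = -2.0                       # unknown disturbance voltage [V]
i_ref, tau = 1.0, 1e-3         # current setpoint [A], target time constant [s]
k, phi = 3.0, 0.05             # sliding gain (> |d|) and boundary-layer width
dt, steps = 1e-4, 3000

def run(with_smc):
    i = 0.0
    for _ in range(steps):
        s = i_ref - i                                   # tracking error
        u = R * i + L_ind * s / tau                     # nominal predictive law
        if with_smc:
            u += k * max(-1.0, min(1.0, s / phi))       # saturated sign(s)
        i += dt * (u - R * i + d) / L_ind               # Euler step of the plant
    return abs(i_ref - i)

err_plain = run(False)   # disturbance leaves a steady offset (about 0.2 A here)
err_smc = run(True)      # the sliding-mode term largely cancels the disturbance
```

The saturation (boundary layer) stands in for the discontinuous sign function to limit chattering; the residual error with compensation shrinks roughly by the ratio of the extra sliding gain to the nominal loop gain.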

  10. Deep phenotyping to predict live birth outcomes in in vitro fertilization

    PubMed Central

    Banerjee, Prajna; Choi, Bokyung; Shahine, Lora K.; Jun, Sunny H.; O’Leary, Kathleen; Lathi, Ruth B.; Westphal, Lynn M.; Wong, Wing H.; Yao, Mylene W. M.

    2010-01-01

Nearly 75% of in vitro fertilization (IVF) treatments do not result in live births, and patients are largely guided by a generalized age-based prognostic stratification. We sought to provide personalized and validated prognosis by using available clinical and embryo data from prior, failed treatments to predict live birth probabilities in the subsequent treatment. We generated a boosted tree model, IVFBT, by training it with IVF outcomes data from 1,676 first cycles (C1s) from 2003–2006, followed by external validation with 634 cycles from 2007–2008. We tested whether this model could predict the probability of having a live birth in the subsequent treatment (C2). By using nondeterministic methods to identify prognostic factors and their relative nonredundant contribution, we generated a prediction model, IVFBT, that was superior to the age-based control, providing over 1,000-fold improvement in fit to new data (p < 0.05) and increased discrimination by receiver operating characteristic analysis (area under the curve, 0.80 vs. 0.68 for C1, 0.68 vs. 0.58 for C2). IVFBT provided predictions that were more accurate for ∼83% of C1 and ∼60% of C2 cycles that were out of the range predicted by age. Over half of those patients were reclassified to have higher live birth probabilities. We showed that data from a prior cycle could be used effectively to provide personalized and validated live birth probabilities in a subsequent cycle. Our approach may be replicated and further validated in other IVF clinics. PMID:20643955
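The discrimination comparison above is reported as area under the ROC curve. As a reminder of what that number measures, a minimal sketch (toy scores, invented for illustration):

```python
def auc(pos_scores, neg_scores):
    # Area under the ROC curve computed as the Mann-Whitney U statistic:
    # the probability that a randomly chosen positive case is scored above
    # a randomly chosen negative case (ties count one half).
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Toy predicted probabilities (live birth vs. no live birth), invented:
print(auc([0.9, 0.8], [0.7, 0.1]))   # perfect ranking -> 1.0
print(auc([0.8, 0.4], [0.6, 0.2]))   # one inversion   -> 0.75
```

An AUC of 0.80 vs. 0.68 thus means the model orders outcome pairs correctly considerably more often than the age-based baseline.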

  11. Multi-Source Learning for Joint Analysis of Incomplete Multi-Modality Neuroimaging Data

    PubMed Central

    Yuan, Lei; Wang, Yalin; Thompson, Paul M.; Narayan, Vaibhav A.; Ye, Jieping

    2013-01-01

Incomplete data present serious problems when integrating large-scale brain imaging data sets from different imaging modalities. In the Alzheimer’s Disease Neuroimaging Initiative (ADNI), for example, over half of the subjects lack cerebrospinal fluid (CSF) measurements; an independent half of the subjects do not have fluorodeoxyglucose positron emission tomography (FDG-PET) scans; many lack proteomics measurements. Traditionally, subjects with missing measures are discarded, resulting in a severe loss of available information. We address this problem by proposing two novel learning methods where all the samples (with at least one available data source) can be used. In the first method, we divide our samples according to the availability of data sources, and we learn shared sets of features with state-of-the-art sparse learning methods. Our second method learns a base classifier for each data source independently, based on which we represent each source using a single column of prediction scores; we then estimate the missing prediction scores, which, combined with the existing prediction scores, are used to build a multi-source fusion model. To illustrate the proposed approaches, we classify patients from the ADNI study into groups with Alzheimer’s disease (AD), mild cognitive impairment (MCI) and normal controls, based on the multi-modality data. At baseline, ADNI’s 780 participants (172 AD, 397 MCI, 211 Normal) have at least one of four data types: magnetic resonance imaging (MRI), FDG-PET, CSF and proteomics. These data are used to test our algorithms. Comprehensive experiments show that our proposed methods yield stable and promising results. PMID:24014189

  12. Chapter 16 - Predictive Analytics for Comprehensive Energy Systems State Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yingchen; Yang, Rui; Hodge, Brian S

Energy sustainability is a subject of concern to many nations in the modern world. It is critical for electric power systems to diversify energy supply to include systems with different physical characteristics, such as wind energy, solar energy, electrochemical energy storage, thermal storage, bio-energy systems, geothermal, and ocean energy. Each system has its own range of control variables and targets. To be able to operate such a complex energy system, big-data analytics become critical to achieve the goal of predicting energy supplies and consumption patterns, assessing system operation conditions, and estimating system states - all providing situational awareness to power system operators. This chapter presents data analytics and machine learning-based approaches to enable predictive situational awareness of the power systems.

  13. Predicting Reading Growth with Event-Related Potentials: Thinking Differently about Indexing “Responsiveness”

    PubMed Central

    Lemons, Christopher J.; Key, Alexandra P.F.; Fuchs, Douglas; Yoder, Paul J.; Fuchs, Lynn S.; Compton, Donald L.; Williams, Susan M.; Bouton, Bobette

    2009-01-01

    The purpose of this study was to determine if event-related potential (ERP) data collected during three reading-related tasks (Letter Sound Matching, Nonword Rhyming, and Nonword Reading) could be used to predict short-term reading growth on a curriculum-based measure of word identification fluency over 19 weeks in a sample of 29 first-grade children. Results indicate that ERP responses to the Letter Sound Matching task were predictive of reading change and remained so after controlling for two previously validated behavioral predictors of reading, Rapid Letter Naming and Segmenting. ERP data for the other tasks were not correlated with reading change. The potential for cognitive neuroscience to enhance current methods of indexing responsiveness in a response-to-intervention (RTI) model is discussed. PMID:20514353

  14. Model predictive control of non-linear systems over networks with data quantization and packet loss.

    PubMed

    Yu, Jimin; Nan, Liangsheng; Tang, Xiaoming; Wang, Ping

    2015-11-01

This paper studies the approach of model predictive control (MPC) for non-linear systems in a networked environment where both data quantization and packet loss may occur. The non-linear controlled plant in the networked control system (NCS) is represented by a Takagi-Sugeno (T-S) model. The sensed data and control signal are quantized in both links and described as sector-bounded uncertainties by applying the sector bound approach. Then, the quantized data are transmitted over the communication networks and may suffer from the effect of packet losses, which are modeled as a Bernoulli process. A fuzzy predictive controller which guarantees the stability of the closed-loop system is obtained by solving a set of linear matrix inequalities (LMIs). A numerical example is given to illustrate the effectiveness of the proposed method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
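The two channel effects named above can be sketched concretely. This is an illustrative model only, not the paper's construction: a logarithmic quantizer whose error satisfies a sector bound (which is what permits treating quantization as a sector-bounded uncertainty), followed by Bernoulli packet drops.

```python
import math, random

def log_quantize(v, rho=0.8):
    # Logarithmic quantizer with levels rho**n. Its error satisfies the
    # sector bound |q(v) - v| <= delta * |v| with delta = (1-rho)/(1+rho),
    # so q(v) can be absorbed into the analysis as a sector-bounded
    # multiplicative uncertainty.
    if v == 0.0:
        return 0.0
    w = 2.0 * abs(v) / (1.0 + rho)
    n = math.floor(math.log(w) / math.log(rho)) + 1
    return math.copysign(rho ** n, v)

def transmit(values, loss_prob=0.3, seed=0):
    # Bernoulli packet-loss channel: each quantized sample is dropped
    # independently with probability loss_prob (None marks a lost packet).
    rng = random.Random(seed)
    return [None if rng.random() < loss_prob else log_quantize(v)
            for v in values]

samples = [1.0, 0.85, 0.95, -0.3, 2.7, 0.01]
received = transmit(samples)
```

The controller synthesis in the paper then has to guarantee stability for every quantization error inside the sector and every realization of the Bernoulli drop process.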

  15. Improved fuzzy PID controller design using predictive functional control structure.

    PubMed

    Wang, Yuzhong; Jin, Qibing; Zhang, Ridong

    2017-11-01

    In conventional PID scheme, the ensemble control performance may be unsatisfactory due to limited degrees of freedom under various kinds of uncertainty. To overcome this disadvantage, a novel PID control method that inherits the advantages of fuzzy PID control and the predictive functional control (PFC) is presented and further verified on the temperature model of a coke furnace. Based on the framework of PFC, the prediction of the future process behavior is first obtained using the current process input signal. Then, the fuzzy PID control based on the multi-step prediction is introduced to acquire the optimal control law. Finally, the case study on a temperature model of a coke furnace shows the effectiveness of the fuzzy PID control scheme when compared with conventional PID control and fuzzy self-adaptive PID control. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Speech recognition in advanced rotorcraft - Using speech controls to reduce manual control overload

    NASA Technical Reports Server (NTRS)

    Vidulich, Michael A.; Bortolussi, Michael R.

    1988-01-01

    An experiment has been conducted to ascertain the usefulness of helicopter pilot speech controls and their effect on time-sharing performance, under the impetus of multiple-resource theories of attention which predict that time-sharing should be more efficient with mixed manual and speech controls than with all-manual ones. The test simulation involved an advanced, single-pilot scout/attack helicopter. Performance and subjective workload levels obtained supported the claimed utility of speech recognition-based controls; specifically, time-sharing performance was improved while preparing a data-burst transmission of information during helicopter hover.

  17. Constant-Time Pattern Matching For Real-Time Production Systems

    NASA Astrophysics Data System (ADS)

    Parson, Dale E.; Blank, Glenn D.

    1989-03-01

Many intelligent systems must respond to sensory data or critical environmental conditions in fixed, predictable time. Rule-based systems, including those based on the efficient Rete matching algorithm, cannot guarantee this result. Improvement in execution-time efficiency is not all that is needed here; it is important to ensure constant, O(1) time limits for portions of the matching process. Our approach is inspired by two observations about human performance. First, cognitive psychologists distinguish between automatic and controlled processing. Analogously, we partition the matching process across two networks. The first is the automatic partition; it is characterized by predictable O(1) time and space complexity, lacks persistent memory, and is reactive in nature. The second is the controlled partition; it includes the search-based, goal-driven and data-driven processing typical of most production system programming. The former is responsible for recognition of and response to critical environmental conditions. The latter is responsible for the more flexible problem-solving behaviors consistent with the notion of intelligence. Support for learning and refining the automatic partition can be placed in the controlled partition. Our second observation is that people are able to attend selectively to the more critical stimuli or requirements. Our match algorithm uses priorities to focus matching. It compares the priority of information during matching, rather than deferring this comparison until conflict resolution. Messages from the automatic partition are able to interrupt the controlled partition, enhancing system responsiveness. Our algorithm has numerous applications for systems that must exhibit time-constrained behavior.
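The two-partition idea can be caricatured in a few lines. This is not the authors' Rete variant, only a sketch of the contrast it draws: the automatic partition reacts to critical conditions through a hash lookup (constant time per message, regardless of rule count), while everything else falls through to a slower, search-based controlled partition. All condition and action names are invented.

```python
# Automatic partition: a fixed table of critical conditions -> reflex actions.
AUTOMATIC = {
    "over_temperature": "shutdown_heater",
    "pressure_spike":   "open_relief_valve",
}

def match(condition, controlled_rules):
    action = AUTOMATIC.get(condition)          # O(1) reflex path
    if action is not None:
        return action
    for pattern, act in controlled_rules:      # deliberative path: O(rules)
        if pattern in condition:
            return act
    return "no_match"
```

The point of the sketch is the complexity split, not the dispatch mechanism itself: responses on the reflex path are bounded in time no matter how large the deliberative rule base grows.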

  18. The potential distribution of Phlebotomus papatasi (Diptera: Psychodidae) in Libya based on ecological niche model.

    PubMed

    Abdel-Dayem, M S; Annajar, B B; Hanafi, H A; Obenauer, P J

    2012-05-01

The increased number of cases of cutaneous leishmaniasis vectored by Phlebotomus papatasi (Scopoli) in Libya has driven considerable effort to develop a predictive model for the potential geographical distribution of this disease. We collected adult P. papatasi from 17 sites in the Musrata and Yefern regions of Libya using four different attraction traps. Our trap results and literature records describing the distribution of P. papatasi were incorporated into a MaxEnt algorithm prediction model that used 22 environmental variables. The model showed high performance (AUC = 0.992 and 0.990 for training and test data, respectively). High suitability for P. papatasi was predicted to be largely confined to the coast at altitudes <600 m. Regions south of 30° N latitude were calculated to be unsuitable for this species. Jackknife analysis identified precipitation as having the most significant predictive power, while temperature and elevation variables were less influential. The National Leishmaniasis Control Program in Libya may find this information useful in its efforts to control zoonotic cutaneous leishmaniasis. Existing records are strongly biased toward a few geographical regions; therefore, further sand fly collections are warranted that should include documentation of such factors as soil texture and humidity, land cover, and normalized difference vegetation index (NDVI) data to increase the model's predictive power.

  19. Predicting Successful Treatment Outcome of Web-Based Self-help for Problem Drinkers: Secondary Analysis From a Randomized Controlled Trial

    PubMed Central

    Kramer, Jeannet; Keuken, Max; Smit, Filip; Schippers, Gerard; Cuijpers, Pim

    2008-01-01

    Background Web-based self-help interventions for problem drinking are coming of age. They have shown promising results in terms of cost-effectiveness, and they offer opportunities to reach out on a broad scale to problem drinkers. The question now is whether certain groups of problem drinkers benefit more from such Web-based interventions than others. Objective We sought to identify baseline, client-related predictors of the effectiveness of Drinking Less, a 24/7, free-access, interactive, Web-based self-help intervention without therapist guidance for problem drinkers who want to reduce their alcohol consumption. The intervention is based on cognitive-behavioral and self-control principles. Methods We conducted secondary analysis of data from a pragmatic randomized trial with follow-up at 6 and 12 months. Participants (N = 261) were adult problem drinkers in the Dutch general population with a weekly alcohol consumption above 210 g of ethanol for men or 140 g for women, or consumption of at least 60 g (men) or 40 g (women) one or more days a week over the past 3 months. Six baseline participant characteristics were designated as putative predictors of treatment response: (1) gender, (2) education, (3) Internet use competence (sociodemographics), (4) mean weekly alcohol consumption, (5) prior professional help for alcohol problems (level of problem drinking), and (6) participants’ expectancies of Web-based interventions for problem drinking. Intention-to-treat (ITT) analyses, using last-observation-carried-forward (LOCF) data, and regression imputation (RI) were performed to deal with loss to follow-up. Statistical tests for interaction terms were conducted and linear regression analysis was performed to investigate whether the participants’ characteristics as measured at baseline predicted positive treatment responses at 6- and 12-month follow-ups. 
Results At 6 months, prior help for alcohol problems predicted a small, marginally significant positive treatment outcome in the RI model only (beta = .18, P = .05, R² = .11). At 12 months, females displayed modest predictive power in both imputation models (LOCF: beta = .22, P = .045, R² = .02; regression: beta = .27, P = .01, R² = .03). Those with higher levels of education exhibited modest predictive power in the LOCF model only (beta = .33, P = .01, R² = .03). Conclusions Although female and more highly educated users appeared slightly more likely to derive benefit from the Drinking Less intervention, none of the baseline characteristics we studied persuasively predicted a favorable treatment outcome. The Web-based intervention therefore seems well suited for a heterogeneous group of problem drinkers and could hence be offered as a first-step treatment in a stepped-care approach directed at problem drinkers in the general population. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 47285230; http://www.controlled-trials.com/isrctn47285230 (Archived by WebCite at http://www.webcitation.org/5cSR2sMkp). PMID:19033150
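The last-observation-carried-forward (LOCF) imputation used in the analysis above is mechanically simple; a minimal sketch (the data are invented, and real LOCF analyses operate on per-participant follow-up series):

```python
def locf(series):
    # Last-observation-carried-forward: each missing follow-up value (None)
    # is replaced by the most recent observed value; leading gaps stay missing.
    out, last = [], None
    for value in series:
        if value is not None:
            last = value
        out.append(last)
    return out

# e.g. weekly alcohol consumption (g ethanol) with missed follow-ups:
print(locf([None, 210, None, None, 180, None]))
# -> [None, 210, 210, 210, 180, 180]
```

LOCF tends to be conservative for improving participants, which is one reason the study reports a regression-imputation model alongside it.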

  20. Predicting successful treatment outcome of web-based self-help for problem drinkers: secondary analysis from a randomized controlled trial.

    PubMed

    Riper, Heleen; Kramer, Jeannet; Keuken, Max; Smit, Filip; Schippers, Gerard; Cuijpers, Pim

    2008-11-22

    Web-based self-help interventions for problem drinking are coming of age. They have shown promising results in terms of cost-effectiveness, and they offer opportunities to reach out on a broad scale to problem drinkers. The question now is whether certain groups of problem drinkers benefit more from such Web-based interventions than others. We sought to identify baseline, client-related predictors of the effectiveness of Drinking Less, a 24/7, free-access, interactive, Web-based self-help intervention without therapist guidance for problem drinkers who want to reduce their alcohol consumption. The intervention is based on cognitive-behavioral and self-control principles. We conducted secondary analysis of data from a pragmatic randomized trial with follow-up at 6 and 12 months. Participants (N = 261) were adult problem drinkers in the Dutch general population with a weekly alcohol consumption above 210 g of ethanol for men or 140 g for women, or consumption of at least 60 g (men) or 40 g (women) one or more days a week over the past 3 months. Six baseline participant characteristics were designated as putative predictors of treatment response: (1) gender, (2) education, (3) Internet use competence (sociodemographics), (4) mean weekly alcohol consumption, (5) prior professional help for alcohol problems (level of problem drinking), and (6) participants' expectancies of Web-based interventions for problem drinking. Intention-to-treat (ITT) analyses, using last-observation-carried-forward (LOCF) data, and regression imputation (RI) were performed to deal with loss to follow-up. Statistical tests for interaction terms were conducted and linear regression analysis was performed to investigate whether the participants' characteristics as measured at baseline predicted positive treatment responses at 6- and 12-month follow-ups. 
At 6 months, prior help for alcohol problems predicted a small, marginally significant positive treatment outcome in the RI model only (beta = .18, P = .05, R² = .11). At 12 months, females displayed modest predictive power in both imputation models (LOCF: beta = .22, P = .045, R² = .02; regression: beta = .27, P = .01, R² = .03). Those with higher levels of education exhibited modest predictive power in the LOCF model only (beta = .33, P = .01, R² = .03). Although female and more highly educated users appeared slightly more likely to derive benefit from the Drinking Less intervention, none of the baseline characteristics we studied persuasively predicted a favorable treatment outcome. The Web-based intervention therefore seems well suited for a heterogeneous group of problem drinkers and could hence be offered as a first-step treatment in a stepped-care approach directed at problem drinkers in the general population. International Standard Randomized Controlled Trial Number (ISRCTN): 47285230; http://www.controlled-trials.com/isrctn47285230 (Archived by WebCite at http://www.webcitation.org/5cSR2sMkp).

  1. Real-time Adaptive Control Scheme for Superior Plasma Confinement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexander Trunov, Ph.D.

    2001-06-01

    During this Phase I project, IOS, in collaboration with our subcontractors at General Atomics, Inc., acquired and analyzed measurement data on various plasma equilibrium modes. We developed a Matlab-based toolbox consisting of linear and neural network approximators that are capable of learning and predicting, with accuracy, the behavior of plasma parameters. We also began development of the control algorithm capable of using the model of the plasma obtained by the neural network approximator.

  2. Mariner Mars 1971 battery design, test, and flight performance

    NASA Technical Reports Server (NTRS)

    Bogner, R. S.

    1973-01-01

    The design, integration, fabrication, test results, and flight performance of the battery system for the Mariner Mars spacecraft launched in May 1971 are presented. The battery consists of 26 20-Ah hermetically sealed nickel-cadmium cells housed in a machined magnesium chassis. The battery package weighs 29.5 kg and is unique in that the chassis also serves as part of the spacecraft structure. Active thermal control is accomplished by louvers mounted to the battery baseplate. Battery charge is accomplished by C/10 and C/30 constant current chargers. The switch from the high-rate to low-rate charge is automatic, based on terminal voltage. Additional control is possible by ground command or onboard computer. The performance data from the flight battery is compared to the data from various battery tests in the laboratory. Flight battery data was predictable based on ground test data.

  3. Model predictive control based on reduced order models applied to belt conveyor system.

    PubMed

    Chen, Wei; Li, Xin

    2016-11-01

In the paper, a model predictive controller based on a reduced order model is proposed to control a belt conveyor system, which is an electro-mechanical complex system with a long visco-elastic body. Firstly, in order to design a low-order controller, the balanced truncation method is used for belt conveyor model reduction. Secondly, an MPC algorithm based on the reduced order model for the belt conveyor system is presented. Because of the error bound between the full-order model and the reduced order model, two Kalman state estimators are applied in the control scheme to achieve better system performance. Finally, simulation experiments show that the balanced truncation method can significantly reduce the model order with high accuracy and that model predictive control based on the reduced model performs well in controlling the belt conveyor system. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
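Balanced truncation, the reduction step named above, can be sketched for a small stable state-space model. This is an assumed textbook implementation, not the paper's code: the Lyapunov solver uses Kronecker vectorization and so only scales to small systems, and the 3-state example is invented for illustration.

```python
import numpy as np

def lyap(A, Q):
    # Solve A X + X A^T + Q = 0 by row-major Kronecker vectorization
    # (small systems only).
    n = A.shape[0]
    K = np.kron(A, np.eye(n)) + np.kron(np.eye(n), A)
    return np.linalg.solve(K, -Q.reshape(-1)).reshape(n, n)

def balanced_truncation(A, B, C, r):
    # Balance the controllability and observability Gramians, then keep
    # the r states with the largest Hankel singular values.
    Wc = lyap(A, B @ B.T)                 # controllability Gramian
    Wo = lyap(A.T, C.T @ C)               # observability Gramian
    Lc = np.linalg.cholesky(Wc)           # Wc = Lc @ Lc.T
    w, U = np.linalg.eigh(Lc.T @ Wo @ Lc)
    w, U = w[::-1], U[:, ::-1]            # sort descending
    hsv = np.sqrt(w)                      # Hankel singular values
    T = Lc @ U / np.sqrt(hsv)             # balancing transformation
    Ti = np.linalg.inv(T)
    Ab, Bb, Cb = Ti @ A @ T, Ti @ B, C @ T
    return Ab[:r, :r], Bb[:r], Cb[:, :r], hsv

# Invented 3-state stable example, reduced to 2 states:
A = np.array([[-1.0, 0.1, 0.0],
              [0.0, -2.0, 0.2],
              [0.0, 0.0, -5.0]])
B = np.array([[1.0], [1.0], [1.0]])
C = np.array([[1.0, 1.0, 1.0]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
dc_full = (-C @ np.linalg.solve(A, B)).item()    # G(0) = -C A^{-1} B
dc_red = (-Cr @ np.linalg.solve(Ar, Br)).item()
```

The "error bound" the abstract mentions is the classical one: the H-infinity error of the truncated model is at most twice the sum of the discarded Hankel singular values, which is why the Kalman estimators are added to absorb the residual mismatch.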

  4. Analysis of functional redundancies within the Arabidopsis TCP transcription factor family.

    PubMed

    Danisman, Selahattin; van Dijk, Aalt D J; Bimbo, Andrea; van der Wal, Froukje; Hennig, Lars; de Folter, Stefan; Angenent, Gerco C; Immink, Richard G H

    2013-12-01

Analyses of the functions of TEOSINTE-LIKE1, CYCLOIDEA, and PROLIFERATING CELL FACTOR1 (TCP) transcription factors have been hampered by functional redundancy among their individual members. In general, putative functionally redundant genes are predicted based on sequence similarity and confirmed by genetic analysis. In the TCP family, however, identification is impeded by relatively low overall sequence similarity. In a search for functionally redundant TCP pairs that control Arabidopsis leaf development, this work performed an integrative bioinformatics analysis, combining protein sequence similarities, gene expression data, and results of pair-wise protein-protein interaction studies for the 24 members of the Arabidopsis TCP transcription factor family. For this, the work completed any lacking gene expression and protein-protein interaction data experimentally and then performed a comprehensive prediction of potentially functionally redundant TCP pairs. Subsequently, redundant functions could be confirmed for selected predicted TCP pairs by genetic and molecular analyses. It is demonstrated that the previously uncharacterized class I TCP19 gene plays a role in the control of leaf senescence in a redundant fashion with TCP20. Altogether, this work shows the power of combining classical genetic and molecular approaches with bioinformatics predictions to unravel functional redundancies in the TCP transcription factor family.

  5. Analysis of functional redundancies within the Arabidopsis TCP transcription factor family

    PubMed Central

    Danisman, Selahattin; de Folter, Stefan; Immink, Richard G. H.

    2013-01-01

Analyses of the functions of TEOSINTE-LIKE1, CYCLOIDEA, and PROLIFERATING CELL FACTOR1 (TCP) transcription factors have been hampered by functional redundancy among their individual members. In general, putative functionally redundant genes are predicted based on sequence similarity and confirmed by genetic analysis. In the TCP family, however, identification is impeded by relatively low overall sequence similarity. In a search for functionally redundant TCP pairs that control Arabidopsis leaf development, this work performed an integrative bioinformatics analysis, combining protein sequence similarities, gene expression data, and results of pair-wise protein–protein interaction studies for the 24 members of the Arabidopsis TCP transcription factor family. For this, the work completed any lacking gene expression and protein–protein interaction data experimentally and then performed a comprehensive prediction of potentially functionally redundant TCP pairs. Subsequently, redundant functions could be confirmed for selected predicted TCP pairs by genetic and molecular analyses. It is demonstrated that the previously uncharacterized class I TCP19 gene plays a role in the control of leaf senescence in a redundant fashion with TCP20. Altogether, this work shows the power of combining classical genetic and molecular approaches with bioinformatics predictions to unravel functional redundancies in the TCP transcription factor family. PMID:24129704

  6. Text Mining Improves Prediction of Protein Functional Sites

    PubMed Central

    Cohn, Judith D.; Ravikumar, Komandur E.

    2012-01-01

    We present an approach that integrates protein structure analysis and text mining for protein functional site prediction, called LEAP-FS (Literature Enhanced Automated Prediction of Functional Sites). The structure analysis was carried out using Dynamics Perturbation Analysis (DPA), which predicts functional sites at control points where interactions greatly perturb protein vibrations. The text mining extracts mentions of residues in the literature, and predicts that residues mentioned are functionally important. We assessed the significance of each of these methods by analyzing their performance in finding known functional sites (specifically, small-molecule binding sites and catalytic sites) in about 100,000 publicly available protein structures. The DPA predictions recapitulated many of the functional site annotations and preferentially recovered binding sites annotated as biologically relevant vs. those annotated as potentially spurious. The text-based predictions were also substantially supported by the functional site annotations: compared to other residues, residues mentioned in text were roughly six times more likely to be found in a functional site. The overlap of predictions with annotations improved when the text-based and structure-based methods agreed. Our analysis also yielded new high-quality predictions of many functional site residues that were not catalogued in the curated data sources we inspected. We conclude that both DPA and text mining independently provide valuable high-throughput protein functional site predictions, and that integrating the two methods using LEAP-FS further improves the quality of these predictions. PMID:22393388

  7. Applying network analysis and Nebula (neighbor-edges based and unbiased leverage algorithm) to ToxCast data.

    PubMed

    Ye, Hao; Luo, Heng; Ng, Hui Wen; Meehan, Joe; Ge, Weigong; Tong, Weida; Hong, Huixiao

    2016-01-01

ToxCast data have been used to develop models for predicting in vivo toxicity. To predict the in vivo toxicity of a new chemical using a ToxCast data-based model, its ToxCast bioactivity data are needed but not normally available. The capability of predicting ToxCast bioactivity data is necessary to fully utilize ToxCast data in the risk assessment of chemicals. We aimed to understand and elucidate the relationships between the chemicals and the bioactivity data of the assays in ToxCast and to develop a network analysis-based method for predicting ToxCast bioactivity data. We conducted modularity analysis on a quantitative network constructed from ToxCast data to explore the relationships between the assays and chemicals. We further developed Nebula (neighbor-edges based and unbiased leverage algorithm) for predicting ToxCast bioactivity data. Modularity analysis on the network constructed from ToxCast data yielded seven modules. Assays and chemicals in the seven modules were distinct. Leave-one-out cross-validation yielded a Q² of 0.5416, indicating that ToxCast bioactivity data can be predicted by Nebula. Prediction domain analysis showed that some types of ToxCast assay data could be more reliably predicted by Nebula than others. Network analysis is a promising approach to understanding ToxCast data. Nebula is an effective algorithm for predicting ToxCast bioactivity data, helping fully utilize ToxCast data in the risk assessment of chemicals. Published by Elsevier Ltd.
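The Q² figure quoted above is the cross-validated analogue of R²; a minimal sketch of how it is computed from leave-one-out predictions (the toy numbers are invented):

```python
def q_squared(observed, loo_predicted):
    # Cross-validated Q^2 = 1 - PRESS / SS_tot, where PRESS accumulates the
    # leave-one-out prediction errors. Q^2 near 1 means held-out values are
    # predicted well; Q^2 <= 0 means no better than predicting the mean.
    mean = sum(observed) / len(observed)
    press = sum((o - p) ** 2 for o, p in zip(observed, loo_predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - press / ss_tot

# Toy bioactivity values and their leave-one-out predictions (invented):
print(q_squared([1.0, 2.0, 3.0, 4.0], [1.2, 1.8, 3.3, 3.9]))
```

A Q² of 0.5416 therefore says Nebula's held-out predictions explain roughly half of the variance that a mean-only predictor leaves unexplained.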

  8. Incremental Validity of Biographical Data in the Prediction of En Route Air Traffic Control Specialist Technical Skills

    DOT National Transportation Integrated Search

    2012-07-01

    Previous research demonstrated that an empirically-keyed, response-option scored biographical data (biodata) : scale predicted supervisory ratings of air traffic control specialist (ATCS) job performance (Dean & Broach, : 2011). This research f...

  9. Evaluation of calibration efficacy under different levels of uncertainty

    DOE PAGES

    Heo, Yeonsook; Graziano, Diane J.; Guzowski, Leah; ...

    2014-06-10

This study examines how calibration performs under different levels of uncertainty in model input data. It specifically assesses the efficacy of Bayesian calibration to enhance the reliability of EnergyPlus model predictions. A Bayesian approach can be used to update uncertain values of parameters, given measured energy-use data, and to quantify the associated uncertainty. We assess the efficacy of Bayesian calibration under a controlled virtual-reality setup, which enables rigorous validation of the accuracy of calibration results in terms of both calibrated parameter values and model predictions. Case studies demonstrate the performance of Bayesian calibration of base models developed from audit data with differing levels of detail in building design, usage, and operation.
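The Bayesian-updating idea can be sketched on a grid for a single uncertain parameter. This is not the EnergyPlus study itself: the linear "energy model", noise level, and measurements below are all invented for illustration.

```python
import math

def energy_model(theta):
    # Hypothetical stand-in for a building energy model: predicted
    # energy use [kWh] as a function of one uncertain parameter theta.
    return 10.0 * theta + 50.0

measurements = [55.8, 56.3, 55.9]       # synthetic metered data [kWh]
sigma = 0.5                             # assumed measurement noise std dev

# Uniform prior over theta in [0, 1]; posterior on a grid via Bayes' rule.
grid = [i / 100.0 for i in range(101)]
posterior = []
for theta in grid:
    loglik = sum(-(y - energy_model(theta)) ** 2 / (2 * sigma ** 2)
                 for y in measurements)
    posterior.append(math.exp(loglik))
z = sum(posterior)
posterior = [p / z for p in posterior]

theta_map = grid[posterior.index(max(posterior))]   # posterior mode
```

The virtual-reality setup in the study plays the role of the known `energy_model` and true parameter here: because the truth is known, the calibrated posterior can be checked against it directly.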

  10. Comparison of spring measures of length, weight, and condition factor for predicting metamorphosis in two populations of sea lamprey (Petromyzon marinus) larvae

    USGS Publications Warehouse

    Henson, Mary P.; Bergstedt, Roger A.; Adams, Jean V.

    2003-01-01

    The ability to predict when sea lampreys (Petromyzon marinus) will metamorphose from the larval phase to the parasitic phase is essential to the operation of the sea lamprey control program. During the spring of 1994, two populations of sea lamprey larvae from two rivers were captured, measured, weighed, implanted with coded wire tags, and returned to the same sites in the streams from which they were taken. Sea lampreys were recovered in the fall, after metamorphosis would have occurred, and checked for the presence of a tag. When the spring data were compared to the fall data, it was found that the minimum requirements (length ≥ 120 mm, weight ≥ 3 g, and condition factor ≥ 1.50) suggested for metamorphosis did define a pool of larvae capable of metamorphosing. However, logistic regressions that relate the probability of metamorphosis to size are necessary to predict metamorphosis in a population. The data indicated, based on cross-validation, that weight measurements alone predicted metamorphosis with greater precision than length or condition factor in both the Marengo and Amnicon rivers. Based on the Akaike Information Criterion, weight alone was a better predictor in the Amnicon River, but length and condition factor combined predicted metamorphosis better in the Marengo River. There would be no additional cost if weight alone were used instead of length. However, if length and weight were measured, the gain in predictive power would not be enough to justify the additional cost.
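    A logistic regression of metamorphosis on spring weight, as the record recommends, can be sketched with plain gradient ascent. The weights and outcomes below are invented for illustration, not the Marengo or Amnicon data:

```python
import math

# Logistic regression: P(metamorphosis) as a function of spring weight (g).
# Data are synthetic; the fit uses plain gradient ascent on the log-likelihood.

weights = [2.0, 2.5, 3.0, 3.2, 3.5, 4.0, 4.5, 5.0]   # spring weight, g
meta    = [0,   0,   0,   1,   0,   1,   1,   1  ]   # 1 = metamorphosed in fall

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

b0, b1, lr = 0.0, 0.0, 0.1
for _ in range(20000):
    g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in zip(weights, meta))
    g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in zip(weights, meta))
    b0 += lr * g0 / len(weights)
    b1 += lr * g1 / len(weights)

p_4g = sigmoid(b0 + b1 * 4.0)   # predicted metamorphosis probability at 4 g
print(b1 > 0, round(p_4g, 2))
```

    The fitted curve replaces the hard ≥ 3 g cutoff with a smooth probability, which is what makes population-level prediction possible.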

  11. Model-Based Geostatistical Mapping of the Prevalence of Onchocerca volvulus in West Africa

    PubMed Central

    O’Hanlon, Simon J.; Slater, Hannah C.; Cheke, Robert A.; Boatin, Boakye A.; Coffeng, Luc E.; Pion, Sébastien D. S.; Boussinesq, Michel; Zouré, Honorat G. M.; Stolk, Wilma A.; Basáñez, María-Gloria

    2016-01-01

    Background The initial endemicity (pre-control prevalence) of onchocerciasis has been shown to be an important determinant of the feasibility of elimination by mass ivermectin distribution. We present the first geostatistical map of microfilarial prevalence in the former Onchocerciasis Control Programme in West Africa (OCP) before commencement of antivectorial and antiparasitic interventions. Methods and Findings Pre-control microfilarial prevalence data from 737 villages across the 11 constituent countries in the OCP epidemiological database were used as ground-truth data. These 737 data points, plus a set of statistically selected environmental covariates, were used in a Bayesian model-based geostatistical (B-MBG) approach to generate a continuous surface (at pixel resolution of 5 km x 5 km) of microfilarial prevalence in West Africa prior to the commencement of the OCP. Uncertainty in model predictions was measured using a suite of validation statistics, performed on bootstrap samples of held-out validation data. The mean Pearson’s correlation between observed and estimated prevalence at validation locations was 0.693; the mean prediction error (average difference between observed and estimated values) was 0.77%, and the mean absolute prediction error (average magnitude of difference between observed and estimated values) was 12.2%. Within OCP boundaries, 17.8 million people were deemed to have been at risk, 7.55 million to have been infected, and mean microfilarial prevalence to have been 45% (range: 2–90%) in 1975. Conclusions and Significance This is the first map of initial onchocerciasis prevalence in West Africa using B-MBG. Important environmental predictors of infection prevalence were identified and used in a model out-performing those without spatial random effects or environmental covariates. Results may be compared with recent epidemiological mapping efforts to find areas of persisting transmission. 
These methods may be extended to areas where data are sparse, and may be used to help inform the feasibility of elimination with current and novel tools. PMID:26771545
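    The three validation statistics reported above (Pearson's correlation, mean prediction error, mean absolute prediction error) are computed from paired observed/estimated prevalences as follows; the numbers here are made up:

```python
# Validation statistics for paired observed vs. model-estimated prevalence (%).

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def mean_error(obs, est):
    """Bias: average signed difference between observed and estimated."""
    return sum(o - e for o, e in zip(obs, est)) / len(obs)

def mean_abs_error(obs, est):
    """Average magnitude of the difference between observed and estimated."""
    return sum(abs(o - e) for o, e in zip(obs, est)) / len(obs)

observed  = [10.0, 25.0, 40.0, 55.0, 70.0]   # invented validation-site values
estimated = [12.0, 22.0, 43.0, 50.0, 74.0]

print(round(pearson_r(observed, estimated), 3),
      mean_error(observed, estimated),
      mean_abs_error(observed, estimated))
```

    Note that a small mean (signed) error with a larger mean absolute error, as in the record above (0.77% vs. 12.2%), indicates low overall bias but non-trivial site-by-site scatter.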

  12. TA [B] Predicting Microstructure-Creep Resistance Correlation in High Temperature Alloys over Multiple Time Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomar, Vikas

    2017-03-06

    DoE-NETL partnered with Purdue University to predict the creep and associated microstructure evolution of tungsten-based refractory alloys. Researchers use grain boundary (GB) diagrams, a new concept, to establish the time-dependent creep resistance and associated microstructure evolution of grain-boundary/intergranular-film (GB/IGF) controlled creep as a function of load, environment, and temperature. The goal was to conduct a systematic study that includes the development of a theoretical framework, multiscale modeling, and experimental validation using W-based body-centered-cubic alloys, doped/alloyed with one or two of the following elements: nickel, palladium, cobalt, iron, and copper (typical refractory alloys). Prior work had already established and validated a basic theory for W-based binary and ternary alloys; the study conducted under this project extended this proven work. Based on the interface diagrams, phase-field models were developed to predict long-term microstructural evolution. To validate the models, nanoindentation creep data were used to elucidate the role played by interface properties in predicting long-term creep strength and microstructure evolution.

  13. A Unified Statistical Rain-Attenuation Model for Communication Link Fade Predictions and Optimal Stochastic Fade Control Design Using a Location-Dependent Rain-Statistic Database

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1990-01-01

    A static and dynamic rain-attenuation model is presented which describes the statistics of attenuation on an arbitrarily specified satellite link for any location for which there are long-term rainfall statistics. The model may be used in the design of optimal stochastic control algorithms to mitigate the effects of attenuation and maintain link reliability. A rain-statistics database is compiled, which makes it possible to apply the model to any location in the continental U.S. with a resolution of 0.5 degrees in latitude and longitude. The model predictions are compared with experimental observations, showing good agreement.

  14. Emotion Awareness Predicts Body Mass Index Percentile Trajectories in Youth.

    PubMed

    Whalen, Diana J; Belden, Andy C; Barch, Deanna; Luby, Joan

    2015-10-01

    To examine the rate of change in body mass index (BMI) percentile across 3 years in relation to emotion identification ability and brain-based reactivity in emotional processing regions. A longitudinal sample of 202 youths completed 3 functional magnetic resonance imaging-based facial processing tasks and behavioral emotion differentiation tasks. We examined the rate of change in the youth's BMI percentile as a function of reactivity in emotional processing brain regions and behavioral emotion identification tasks using multilevel modeling. Lower correct identification of both happiness and sadness measured behaviorally predicted increases in BMI percentile across development, whereas higher correct identification of both happiness and sadness predicted decreases in BMI percentile, while controlling for children's pubertal status, sex, ethnicity, IQ score, exposure to antipsychotic medication, family income-to-needs ratio, and externalizing, internalizing, and depressive symptoms. Greater neural activation in emotional reactivity regions to sad faces also predicted increases in BMI percentile during development, also controlling for the aforementioned covariates. Our findings provide longitudinal developmental data demonstrating links between both emotion identification ability and greater neural reactivity in emotional processing regions with trajectories of BMI percentiles across childhood. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Computational prediction of human salivary proteins from blood circulation and application to diagnostic biomarker identification.

    PubMed

    Wang, Jiaxin; Liang, Yanchun; Wang, Yan; Cui, Juan; Liu, Ming; Du, Wei; Xu, Ying

    2013-01-01

    Proteins can move from blood circulation into salivary glands through active transportation, passive diffusion or ultrafiltration, some of which are then released into saliva and hence can potentially serve as biomarkers for diseases if accurately identified. We present a novel computational method for predicting salivary proteins that come from circulation. The basis for the prediction is a set of physiochemical and sequence features we found to be discerning between human proteins known to be movable from circulation to saliva and proteins deemed to be not in saliva. A classifier was trained based on these features using a support-vector machine to predict protein secretion into saliva. The classifier achieved 88.56% average recall and 90.76% average precision in 10-fold cross-validation on the training data, indicating that the selected features are informative. Considering the possibility that our negative training data may not be highly reliable (i.e., proteins predicted to be not in saliva), we have also trained a ranking method, aiming to rank the known salivary proteins from circulation as the highest among the proteins in the general background, based on the same features. This prediction capability can be used to predict potential biomarker proteins for specific human diseases when coupled with the information of differentially expressed proteins in diseased versus healthy control tissues and a prediction capability for blood-secretory proteins. Using such integrated information, we predicted 31 candidate biomarker proteins in saliva for breast cancer.
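    The average recall and precision quoted above reduce, within each cross-validation fold, to the two confusion-matrix formulas below; the labels and predictions here are synthetic:

```python
# Precision and recall for a binary classifier (1 = protein is in saliva).

def precision_recall(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp)   # of predicted positives, fraction correct
    recall = tp / (tp + fn)      # of true positives, fraction recovered
    return precision, recall

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]   # invented fold labels
y_pred = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
p, r = precision_recall(y_true, y_pred)
print(p, r)
```

    In 10-fold cross-validation these per-fold values are averaged, which is what the 88.56% recall / 90.76% precision figures above report.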

  16. Computational Prediction of Human Salivary Proteins from Blood Circulation and Application to Diagnostic Biomarker Identification

    PubMed Central

    Wang, Jiaxin; Liang, Yanchun; Wang, Yan; Cui, Juan; Liu, Ming; Du, Wei; Xu, Ying

    2013-01-01

    Proteins can move from blood circulation into salivary glands through active transportation, passive diffusion or ultrafiltration, some of which are then released into saliva and hence can potentially serve as biomarkers for diseases if accurately identified. We present a novel computational method for predicting salivary proteins that come from circulation. The basis for the prediction is a set of physiochemical and sequence features we found to be discerning between human proteins known to be movable from circulation to saliva and proteins deemed to be not in saliva. A classifier was trained based on these features using a support-vector machine to predict protein secretion into saliva. The classifier achieved 88.56% average recall and 90.76% average precision in 10-fold cross-validation on the training data, indicating that the selected features are informative. Considering the possibility that our negative training data may not be highly reliable (i.e., proteins predicted to be not in saliva), we have also trained a ranking method, aiming to rank the known salivary proteins from circulation as the highest among the proteins in the general background, based on the same features. This prediction capability can be used to predict potential biomarker proteins for specific human diseases when coupled with the information of differentially expressed proteins in diseased versus healthy control tissues and a prediction capability for blood-secretory proteins. Using such integrated information, we predicted 31 candidate biomarker proteins in saliva for breast cancer. PMID:24324552

  17. An alternative approach based on artificial neural networks to study controlled drug release.

    PubMed

    Reis, Marcus A A; Sinisterra, Rubén D; Belchior, Jadson C

    2004-02-01

    An alternative methodology based on artificial neural networks is proposed to be a complementary tool to other conventional methods to study controlled drug release. Two systems are used to test the approach; namely, hydrocortisone in a biodegradable matrix and rhodium (II) butyrate complexes in a bioceramic matrix. Two well-established mathematical models are used to simulate different release profiles as a function of fundamental properties; namely, diffusion coefficient (D), saturation solubility (C(s)), drug loading (A), and the height of the device (h). The models were tested, and the results show that these fundamental properties can be predicted after learning the experimental or model data for controlled drug release systems. The neural network results obtained after the learning stage can be considered to quantitatively predict ideal experimental conditions. Overall, the proposed methodology was shown to be efficient for ideal experiments, with a relative average error of <1% in both tests. This approach can be useful for the experimental analysis to simulate and design efficient controlled drug-release systems. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association
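    The abstract does not name its two release models; the classical Higuchi equation for a planar matrix is one standard model that relates cumulative release to exactly the properties listed (D, C(s), loading A, device height h), and could be used to generate training profiles of the kind described. All parameter values below are illustrative only:

```python
import math

# Higuchi release from a planar matrix: cumulative amount released per unit
# area Q(t) = sqrt(D * Cs * (2A - Cs) * t), valid for early-time release.
# Parameter values are invented; fraction released is Q(t) / (A * h).

def higuchi_release(t, D, Cs, A):
    """Cumulative amount released per unit area (mass/area) at time t (s)."""
    return math.sqrt(D * Cs * (2.0 * A - Cs) * t)

D  = 2.0e-7   # diffusion coefficient, cm^2/s (assumed)
Cs = 1.0e-3   # saturation solubility, g/cm^3 (assumed)
A  = 1.0e-2   # drug loading, g/cm^3 (assumed)
h  = 0.1      # device height/thickness, cm (assumed)

# Fraction-released profile over six hours, sampled every ten minutes.
profile = [higuchi_release(t, D, Cs, A) / (A * h) for t in range(0, 21601, 600)]
print(round(profile[-1], 3))
```

    A network trained on many such (profile → D, Cs, A, h) pairs learns the inverse map, which is the prediction task the record describes.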

  18. Feedforward hysteresis compensation in trajectory control of piezoelectrically-driven nanostagers

    NASA Astrophysics Data System (ADS)

    Bashash, Saeid; Jalili, Nader

    2006-03-01

    Complex structural nonlinearities of piezoelectric materials drastically degrade their performance in a variety of micro- and nano-positioning applications. From the precision positioning and control perspective, the multi-path, time-history-dependent hysteresis phenomenon is the nonlinearity of greatest concern in piezoelectric actuators. To realize the underlying physics of this phenomenon and to develop an efficient compensation strategy, the intelligent properties of hysteresis with the effects of non-local memories are discussed. Through a set of experiments on a piezoelectrically-driven nanostager with a high-resolution capacitive position sensor, it is shown that precise prediction of the hysteresis path requires certain memory units to store the previous hysteresis trajectory data. Based on the experimental observations, a constitutive memory-based mathematical modeling framework is developed and trained for precise prediction of the hysteresis path for arbitrarily assigned input profiles. Using the inverse hysteresis model, a feedforward control strategy is then developed and implemented on the nanostager to compensate for the system's ever-present nonlinearity. Experimental results demonstrate that the controller remarkably eliminates the nonlinear effect, provided sufficient memory units are chosen for the inverse model.
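    The memory-unit picture above maps naturally onto "play" (backlash) operators, whose internal states are exactly such memory units. The sketch below is a generic Prandtl-Ishlinskii construction with invented radii and weights, not the paper's specific constitutive model:

```python
# Rate-independent hysteresis as a weighted sum of play (backlash) operators.
# Each operator's state is a memory unit recording past input history.

def play(u, state, r):
    """Backlash operator of radius r: output follows u with a dead-band 2r."""
    return max(u - r, min(u + r, state))

radii   = [0.0, 0.5, 1.0, 1.5]     # thresholds (memory units), invented
weights = [0.4, 0.3, 0.2, 0.1]     # invented weights
states  = [0.0] * len(radii)

def hysteresis(u):
    for i, r in enumerate(radii):
        states[i] = play(u, states[i], r)
    return sum(w * s for w, s in zip(weights, states))

# Drive the input up from 0 to 2 and back down: the branches differ.
ups   = [hysteresis(u / 10) for u in range(0, 21)]
downs = [hysteresis(u / 10) for u in range(20, -1, -1)]
print(ups[10], downs[10])   # same input u = 1.0 on each branch
```

    Because the ascending and descending branches disagree at the same input, any memoryless inverse fails; a feedforward compensator must invert the operator states, as the record's inverse-model controller does.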

  19. Inside the black box: starting to uncover the underlying decision rules used in one-by-one expert assessment of occupational exposure in case-control studies

    PubMed Central

    Wheeler, David C.; Burstyn, Igor; Vermeulen, Roel; Yu, Kai; Shortreed, Susan M.; Pronk, Anjoeka; Stewart, Patricia A.; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Schwenn, Molly; Johnson, Alison; Silverman, Debra T.; Friesen, Melissa C.

    2014-01-01

    Objectives Evaluating occupational exposures in population-based case-control studies often requires exposure assessors to review each study participant's reported occupational information job by job to derive exposure estimates. Although such assessments likely have underlying decision rules, they usually lack transparency, are time-consuming, and have uncertain reliability and validity. We aimed to identify the underlying rules to enable documentation, review, and future use of these expert-based exposure decisions. Methods Classification and regression trees (CART, predictions from a single tree) and random forests (predictions from many trees) were used to identify the underlying rules from the questionnaire responses and an expert's exposure assignments for occupational diesel exhaust exposure for several metrics: binary exposure probability and ordinal exposure probability, intensity, and frequency. Data were split into training (n=10,488 jobs), testing (n=2,247), and validation (n=2,248) data sets. Results The CART and random forest models' predictions agreed with 92–94% of the expert's binary probability assignments. For ordinal probability, intensity, and frequency metrics, the two models extracted decision rules more successfully for unexposed and highly exposed jobs (86–90% and 57–85%, respectively) than for low or medium exposed jobs (7–71%). Conclusions CART and random forest models extracted decision rules and accurately predicted an expert's exposure decisions for the majority of jobs and identified questionnaire response patterns that would require further expert review if the rules were applied to other jobs in the same or different study. This approach makes the exposure assessment process in case-control studies more transparent and creates a mechanism to efficiently replicate exposure decisions in future studies. PMID:23155187

  20. Effective Information Extraction Framework for Heterogeneous Clinical Reports Using Online Machine Learning and Controlled Vocabularies.

    PubMed

    Zheng, Shuai; Lu, James J; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A; Wang, Fusheng

    2017-05-09

    Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to incorporate user feedback to improve the extraction algorithm in real time. Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enable a dynamic interaction between a human and a machine that produces highly accurate results. A clinical information extraction system, IDEAL-X, has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports, each combining a history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by three methods: online machine learning, controlled vocabularies, and a combination of the two. The system delivers results with F1 scores greater than 95%. IDEAL-X adopts a unique online machine learning-based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, and is thus highly adaptable. ©Shuai Zheng, James J Lu, Nima Ghasemzadeh, Salim S Hayek, Arshed A Quyyumi, Fusheng Wang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 09.05.2017.

  1. Predicting the effectiveness of depth-based technologies to prevent salmon lice infection using a dispersal model.

    PubMed

    Samsing, Francisca; Johnsen, Ingrid; Stien, Lars Helge; Oppedal, Frode; Albretsen, Jon; Asplin, Lars; Dempster, Tim

    2016-07-01

    Salmon lice are one of the major parasitic problems affecting wild and farmed salmonid species. The planktonic larval stages of these marine parasites can survive for extended periods without a host and are transported long distances by water masses. Salmon lice larvae have limited swimming capacity, but can influence their horizontal transport by vertical positioning. Here, we adapted a coupled biological-physical model to calculate the distribution of farm-produced salmon lice (Lepeophtheirus salmonis) during winter on the southwest coast of Norway. We tested four model simulations to see which best represented empirical data from two sources: (1) observed lice infection levels reported by farms; and (2) experimental data from a vertical exposure experiment where fish were forced to swim at different depths with a lice-barrier technology. The simulations varied the development time to the infective stage (35 or 50 degree-days) and the presence or absence of temperature-controlled vertical behaviour of the early planktonic (naupliar) stages. The best model fit occurred with a 35 degree-day development time to the infective stage and temperature-controlled vertical behaviour. We applied this model to predict the effectiveness of depth-based preventive lice-barrier technologies. Both simulated and experimental data revealed that hindering fish from swimming close to the surface efficiently reduced lice infection. Moreover, while our model simulation predicted that this preventive technology is widely applicable, its effectiveness will depend on environmental conditions. Low-salinity surface waters reduce the effectiveness of this technology because salmon lice avoid these conditions and can encounter the fish as they sink deeper in the water column. Correctly parameterized and validated salmon lice dispersal models can predict the impact of preventive approaches to control this parasite and become an essential tool in lice management strategies. 
Copyright © 2016 Elsevier B.V. All rights reserved.
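    The two development settings compared above differ only in a degree-day threshold, which amounts to simple bookkeeping. The temperature series below is invented, and the real model tracks individual larvae in a hydrodynamic field:

```python
# Degree-day accumulation to the infective copepodid stage (35 vs. 50
# degree-day thresholds, as in the model runs). Temperatures are invented.

def days_to_infective(daily_temps_c, threshold_degree_days):
    """Return the day on which accumulated degree-days cross the threshold."""
    total = 0.0
    for day, t in enumerate(daily_temps_c, start=1):
        total += max(t, 0.0)          # accumulate positive degrees only
        if total >= threshold_degree_days:
            return day
    return None                        # development never completes

temps = [8.0, 7.5, 7.0, 7.0, 6.5, 6.0, 6.0, 5.5, 5.5, 5.0] * 3  # winter series
print(days_to_infective(temps, 35), days_to_infective(temps, 50))
```

    The longer threshold delays infectivity by days in cold water, which changes how far larvae drift before they can attach and hence where predicted infection pressure lands.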

  2. Metabolic Model-Based Integration of Microbiome Taxonomic and Metabolomic Profiles Elucidates Mechanistic Links between Ecological and Metabolic Variation.

    PubMed

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M; Young, Vincent B; Jansson, Janet K; Fredricks, David N; Borenstein, Elhanan

    2016-01-01

    Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites' abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. 
Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.

  3. Metabolic Model-Based Integration of Microbiome Taxonomic and Metabolomic Profiles Elucidates Mechanistic Links between Ecological and Metabolic Variation

    PubMed Central

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M.; Young, Vincent B.; Jansson, Janet K.; Fredricks, David N.

    2016-01-01

    ABSTRACT Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites’ abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. 
IMPORTANCE Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism. PMID:27239563

  4. Incremental Validity of Biographical Data in the Prediction of En Route Air Traffic Control Specialist Technical Skills

    DTIC Science & Technology

    2012-07-01

    Incremental Validity of Biographical Data in the Prediction of En Route Air Traffic Control Specialist Technical Skills. Dana Broach, Civil Aerospace Medical Institute, Federal Aviation Administration, Oklahoma City, OK 73125. Final Report DOT/FAA/AM-12/8, Office of Aerospace Medicine, July 2012.

  5. Using a hybrid model to predict solute transfer from initially saturated soil into surface runoff with controlled drainage water.

    PubMed

    Tong, Juxiu; Hu, Bill X; Yang, Jinzhong; Zhu, Yan

    2016-06-01

    The mixing layer theory is not suitable for predicting solute transfer from initially saturated soil to surface runoff water under controlled drainage conditions. By coupling the mixing layer theory model with the numerical model Hydrus-1D, a hybrid solute transfer model is proposed to predict solute transfer from an initially saturated soil into surface water under controlled drainage conditions. The model can also account for increasing ponded water on the soil surface before surface runoff begins. Solute concentration data for surface runoff and drainage water from a sand experiment are used as the reference experiment, and the parameters of the water flow and solute transfer models and the mixing layer depth under controlled drainage are identified. Based on these identified parameters, the model is applied to another initially saturated sand experiment, with constant and time-increasing mixing layer depths after surface runoff, under controlled drainage with a lower drainage height at the bottom. The simulation results agree well with the observed data, suggesting that the hybrid model can accurately simulate solute transfer from initially saturated soil into surface runoff under controlled drainage conditions. Prediction with an increasing mixing layer depth was found to outperform prediction with a constant depth in the lower-drainage experiment: the lower drainage height and deeper ponded water delay the start of runoff, so more of the solute source in the mixing layer is needed to supply the surface water, and the larger rate of change results in the increasing mixing layer depth.
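    The mixing-layer half of the hybrid model can be reduced to its simplest textbook form: solute in a surface layer of depth h_mix is flushed by runoff and leached by infiltration, so the runoff concentration decays exponentially. All parameter values below are invented, and the Hydrus-1D coupling and time-varying mixing depth are not reproduced:

```python
import math

# Simplest complete-mixing approximation for runoff concentration:
# solute mass balance in a layer of depth h_mix and water content theta,
# depleted by runoff (R) and infiltration (I). Parameters are invented.

theta = 0.4        # saturated volumetric water content (-)
h_mix = 1.0        # mixing-layer depth, cm
R, I  = 0.5, 0.1   # runoff and infiltration rates, cm/h
c0    = 100.0      # initial pore-water concentration, mg/L

def runoff_concentration(t_hours):
    """Exponential depletion of the mixing layer after runoff starts."""
    return c0 * math.exp(-(R + I) * t_hours / (h_mix * theta))

series = [round(runoff_concentration(t), 1) for t in (0.0, 0.5, 1.0, 2.0)]
print(series)
```

    Letting h_mix grow with time slows this decay, which is one way to read the finding above that an increasing mixing-layer depth fits the lower-drainage experiment better than a constant one.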

  6. NASA IVHM Technology Experiment for X-vehicles (NITEX)

    NASA Technical Reports Server (NTRS)

    Hayden, Sandra; Bajwa, Anupa

    2001-01-01

    The purpose of the NASA IVHM Technology Experiment for X-vehicles (NITEX) is to advance the development of selected IVHM technologies in a flight environment and to demonstrate the potential for reusable launch vehicle ground processing savings. The technologies to be developed and demonstrated include system-level and detailed diagnostics for real-time fault detection and isolation, prognostics for fault prediction, automated maintenance planning based on diagnostic and prognostic results, and a microelectronics hardware platform. Complete flight IVHM consists of advanced sensors, distributed data acquisition, data processing that includes model-based diagnostics, prognostics and vehicle autonomy for control or suggested action, and advanced data storage. Complete ground IVHM consists of evolved control room architectures and advanced applications, including automated maintenance planning and automated ground support equipment. This experiment will advance the development of a subset of complete IVHM.

  7. Data driven CAN node reliability assessment for manufacturing system

    NASA Astrophysics Data System (ADS)

    Zhang, Leiming; Yuan, Yong; Lei, Yong

    2017-01-01

    The reliability of the Controller Area Network (CAN) is critical to the performance and safety of the system. However, direct bus-off time assessment tools are lacking in practice due to inaccessibility of the node information and the complexity of the node interactions upon errors. In order to measure the mean time to bus-off (MTTB) of all the nodes, a novel data-driven node bus-off time assessment method for CAN networks is proposed that directly uses network error information. First, the corresponding network error event sequence for each node is constructed using multiple-layer network error information. Then, a generalized zero-inflated Poisson process (GZIP) model is established for each node based on the error event sequence. Finally, the stochastic model is constructed to predict the MTTB of the node. Accelerated case studies with different error injection rates are conducted on a laboratory network to demonstrate the proposed method, where the network errors are generated by a computer-controlled error injection system. Experimental results show that the MTTB of nodes predicted by the proposed method agrees well with observations in the case studies. The proposed data-driven node time-to-bus-off assessment method for CAN networks can successfully predict the MTTB of nodes directly from network error event data.
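    As a rough illustration of how an error process maps to time-to-bus-off, the sketch below drives the standard CAN fault-confinement counter (transmit error counter +8 per error, -1 per error-free frame, bus-off above 255) with a zero-inflated Poisson error process and estimates the mean time to bus-off by Monte-Carlo simulation. The error-model parameters are illustrative placeholders, and this is a simplified stand-in for the paper's GZIP-based stochastic model, not a reproduction of it.

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's Poisson sampler; adequate for small lambda."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def mean_time_to_busoff(p_burst, lam, trials=200, seed=1):
    """Monte-Carlo estimate of the mean number of frames until bus-off.

    Each frame is error-free with probability 1 - p_burst; otherwise it
    sees Poisson(lam) errors (a zero-inflated Poisson process). The
    transmit error counter follows CAN fault confinement: +8 per error,
    -1 per error-free frame, bus-off once the counter exceeds 255.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        tec, frames = 0, 0
        while tec <= 255:
            frames += 1
            errs = poisson_sample(rng, lam) if rng.random() < p_burst else 0
            tec = tec + 8 * errs if errs else max(tec - 1, 0)
        total += frames
    return total / trials
```

    Raising the error-injection rate shortens the estimated MTTB, mirroring the trend one would expect from the paper's accelerated case studies.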

  8. Improved disturbance rejection for predictor-based control of MIMO linear systems with input delay

    NASA Astrophysics Data System (ADS)

    Shi, Shang; Liu, Wenhui; Lu, Junwei; Chu, Yuming

    2018-02-01

    In this paper, we are concerned with predictor-based control of multi-input multi-output (MIMO) linear systems with input delay and disturbances. By taking the future values of disturbances into consideration, a new improved predictive scheme is proposed. Compared with existing predictive schemes, the proposed scheme achieves finite-time exact state prediction for some smooth disturbances, including constant disturbances, and better disturbance attenuation for a large class of other time-varying disturbances. The attenuation of mismatched disturbances for second-order linear systems with input delay is also investigated using the proposed predictor-based controller.
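    For the disturbance-free case, the classical predictor underlying such schemes is easy to state: with x[k+1] = A x[k] + B u[k-d], the state d steps ahead is determined exactly by the current state and the d inputs already issued but not yet applied. A minimal numpy sketch follows; the paper's contribution, folding predicted disturbance values into this prediction, is not reproduced here.

```python
import numpy as np

def predict_state(A, B, x, u_buffer):
    """Exact d-step-ahead prediction for x[k+1] = A x[k] + B u[k-d].

    u_buffer holds the d buffered inputs [u[k-d], ..., u[k-1]] that are
    "in flight" because of the input delay; rolling the dynamics forward
    through them yields x[k+d].
    """
    xp = np.asarray(x, dtype=float)
    for u in u_buffer:
        xp = A @ xp + B @ u
    return xp
```

    Feeding the predicted state to a delay-free state-feedback law u[k] = K x̂[k+d] compensates the delay exactly in this nominal, disturbance-free setting.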

  9. Computer-based test-bed for clinical assessment of hand/wrist feed-forward neuroprosthetic controllers using artificial neural networks.

    PubMed

    Luján, J L; Crago, P E

    2004-11-01

    Neuroprosthetic systems can be used to restore hand grasp and wrist control in individuals with C5/C6 spinal cord injury. A computer-based system was developed for the implementation, tuning and clinical assessment of neuroprosthetic controllers, using off-the-shelf hardware and software. The computer system turned a Pentium III PC running Windows NT into a non-dedicated, real-time system for the control of neuroprostheses. Software execution (written using the high-level programming languages LabVIEW and MATLAB) was divided into two phases: training and real-time control. During the training phase, the computer system collected input/output data by stimulating the muscles and measuring the muscle outputs in real-time, analysed the recorded data, generated a set of training data and trained an artificial neural network (ANN)-based controller. During real-time control, the computer system stimulated the muscles using stimulus pulsewidths predicted by the ANN controller in response to a sampled input from an external command source, to provide independent control of hand grasp and wrist posture. System timing was stable, reliable and capable of providing muscle stimulation at frequencies up to 24 Hz. To demonstrate the application of the test-bed, an ANN-based controller was implemented with three inputs and two independent channels of stimulation. The ANN controller's ability to control hand grasp and wrist angle independently was assessed by quantitative comparison of the outputs of the stimulated muscles with a set of desired grasp or wrist postures determined by the command signal. Controller performance results were mixed, but the platform provided the tools to implement and assess future controller designs.

  10. Nonlinear adaptive control system design with asymptotically stable parameter estimation error

    NASA Astrophysics Data System (ADS)

    Mishkov, Rumen; Darmonski, Stanislav

    2018-01-01

    The paper presents a new general method for nonlinear adaptive system design with asymptotic stability of the parameter estimation error. The advantages of the approach include asymptotic estimation of unknown parameters without persistent excitation and the capability to directly control the transient response time of the estimates. The proposed method modifies the basic parameter estimation dynamics designed via a known nonlinear adaptive control approach. The modification is based on the generalised prediction error, a priori constraints with a hierarchical parameter projection algorithm, and the stable data accumulation concept. The data accumulation principle is the main tool for achieving asymptotic unknown parameter estimation; it relies on the parametric identifiability property introduced for the system. Necessary and sufficient conditions for exponential stability of the data accumulation dynamics are derived. The approach is applied to nonlinear adaptive speed-tracking vector control of a three-phase induction motor.
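    The baseline prediction-error dynamics that such methods modify can be sketched as a normalized-gradient estimator for a linearly parameterized model y = φᵀθ. The sketch below converges only under persistent excitation; removing that requirement via data accumulation is precisely the paper's contribution and is not reproduced here.

```python
import numpy as np

def gradient_estimator(phis, ys, gamma=1.0):
    """Baseline normalized-gradient estimator for y = phi^T theta.

    Each step updates the estimate along the regressor direction,
    scaled by the prediction error e and normalized by 1 + |phi|^2.
    """
    theta = np.zeros(phis.shape[1])
    for phi, y in zip(phis, ys):
        e = y - phi @ theta          # prediction error
        theta = theta + gamma * e * phi / (1.0 + phi @ phi)
    return theta
```

    With persistently exciting regressors the parameter error contracts geometrically; without excitation the estimate can stall, which motivates the paper's data-accumulation modification.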

  11. Practical Use of Operation Data in the Process Industry

    NASA Astrophysics Data System (ADS)

    Kano, Manabu

    This paper aims to reveal real problems in the process industry and to introduce recent developments that address them from the viewpoint of effective use of operation data. Two topics are discussed: virtual sensors and process control. First, in order to clarify the present state and its problems, part of our recent questionnaire survey on process control is quoted. It is emphasized that maintenance is a key issue not only for soft-sensors but also for controllers. Then, new techniques are explained. The first is correlation-based just-in-time modeling (CoJIT), which can realize higher prediction performance than conventional methods and simplify model maintenance. The second is extended fictitious reference iterative tuning (E-FRIT), which can realize data-driven PID control parameter tuning without process modeling. The great usefulness of these techniques is demonstrated through their industrial applications.

  12. Automated System Checkout to Support Predictive Maintenance for the Reusable Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, Ann; Deb, Somnath; Kulkarni, Deepak; Wang, Yao; Lau, Sonie (Technical Monitor)

    1998-01-01

    The Propulsion Checkout and Control System (PCCS) is a predictive maintenance software system. The real-time checkout procedures and diagnostics are designed to detect components that need maintenance based on their condition, rather than using more conventional approaches such as scheduled or reliability-centered maintenance. Predictive maintenance can reduce turn-around time and cost and increase safety as compared to conventional maintenance approaches. Real-time sensor validation, limit checking, statistical anomaly detection, and failure prediction based on simulation models are employed. Multi-signal models, useful for testability analysis during system design, are used during the operational phase to detect and isolate degraded or failed components. The TEAMS-RT real-time diagnostic engine was developed by Qualtech Systems, Inc. to utilize the multi-signal models. The capability to predict maintenance condition was successfully demonstrated with a variety of data, from simulation to actual operation on the Integrated Propulsion Technology Demonstrator (IPTD) at Marshall Space Flight Center (MSFC). Playback of IPTD valve actuations for feature-recognition updates identified an otherwise undetectable Main Propulsion System 12-inch prevalve degradation. The algorithms were loaded into the Propulsion Checkout and Control System for further development and are the first known application of predictive Integrated Vehicle Health Management to an operational cryogenic testbed. The software performed successfully in real time, meeting the required performance goal of a 1-second cycle time.
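    The first two checks in that chain are simple to illustrate. The sketch below shows limit checking and a z-score style statistical anomaly test; the thresholds and function names are illustrative placeholders, not the actual PCCS algorithms, which the abstract does not detail.

```python
import statistics

def validate_sensor(value, lo, hi):
    """Limit check: flag readings outside the sensor's valid range."""
    return lo <= value <= hi

def is_anomalous(history, value, k=3.0):
    """Statistical anomaly detection: flag a reading more than k
    sample standard deviations away from the recent mean."""
    mu = statistics.fmean(history)
    sd = statistics.stdev(history)
    return abs(value - mu) > k * sd
```

    In a checkout loop, readings failing the limit check would be rejected outright, while statistically anomalous but in-range readings would be passed to the diagnostic engine for fault isolation.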

  13. Flight investigation of cabin noise control treatments for a light turboprop aircraft

    NASA Technical Reports Server (NTRS)

    Wilby, J. F.; Oneal, R. L.; Mixson, J. S.

    1985-01-01

    The in-flight evaluation of noise control treatments for a light, twin-engined turboprop aircraft presents several problems associated with data analysis and interpretation. These problems include data repeatability, propeller synchronization, spatial distributions of the exterior pressure field and acoustic treatment, and the presence of flanking paths. They are discussed here with regard to a specific aeroplane configuration. Measurements were made in an untreated cabin and in a cabin fitted with an experimental sidewall treatment. Results are presented in terms of the insertion loss provided by the treatment, and comparisons are made with predictions based on laboratory measurements.

  14. Predicting Energy Consumption for Potential Effective Use in Hybrid Vehicle Powertrain Management Using Driver Prediction

    NASA Astrophysics Data System (ADS)

    Magnuson, Brian

    A proof-of-concept software-in-the-loop study is performed to assess the accuracy of predicted net and charge-gaining energy consumption for potential effective use in optimizing powertrain management of hybrid vehicles. With promising results of improving the fuel efficiency of a thermostatic control strategy for a series, plug-in, hybrid-electric vehicle by 8.24%, the route and speed prediction machine learning algorithms are redesigned and implemented for real-world testing in a stand-alone C++ code-base to ingest map data, learn and predict driver habits, and store driver data for fast startup and shutdown of the controller or computer used to execute the compiled algorithm. Speed prediction is performed using a multi-layer, multi-input, multi-output neural network using feed-forward prediction and gradient descent through back-propagation training. Route prediction utilizes a Hidden Markov Model with a recurrent forward algorithm for prediction and multi-dimensional hash maps to store state and state distribution constraining associations between atomic road segments and end destinations. Predicted energy is calculated using the predicted time-series speed and elevation profile over the predicted route and the road-load equation. Testing of the code-base is performed over a known road network spanning 24x35 blocks on the south hill of Spokane, Washington. A large set of training routes is traversed once to add randomness to the route prediction algorithm, and a subset of the training routes, the testing routes, is traversed to assess the accuracy of the net and charge-gaining predicted energy consumption. Each test route is traveled a random number of times with varying speed conditions from traffic and pedestrians to add randomness to speed prediction. Prediction data is stored and analyzed in a post-process Matlab script. The aggregated results and analysis of all traversals of all test routes reflect the performance of the Driver Prediction algorithm. 
The error of average energy gained through charge-gaining events is 31.3% and the error of average net energy consumed is 27.3%. The average delta and average standard deviation of the delta of predicted energy gained through charge-gaining events is 0.639 and 0.601 Wh respectively for individual time-series calculations. Similarly, the average delta and average standard deviation of the delta of the predicted net energy consumed is 0.567 and 0.580 Wh respectively for individual time-series calculations. The average delta and standard deviation of the delta of the predicted speed is 1.60 and 1.15 respectively also for the individual time-series measurements. The percentage of accuracy of route prediction is 91%. Overall, test routes are traversed 151 times for a total test distance of 276.4 km.

  15. A Discrete-Time Average Model Based Predictive Control for Quasi-Z-Source Inverter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yushan; Abu-Rub, Haitham; Xue, Yaosuo

    A discrete-time average model-based predictive control (DTA-MPC) is proposed for a quasi-Z-source inverter (qZSI). As a single-stage inverter topology, the qZSI regulates the dc-link voltage and the ac output voltage through the shoot-through (ST) duty cycle and the modulation index. Several feedback strategies have been dedicated to produce these two control variables, among which the most popular are the proportional–integral (PI)-based control and the conventional model-predictive control (MPC). However, in the former, there are tradeoffs between fast response and stability; the latter is robust, but at the cost of high calculation burden and variable switching frequency. Moreover, they require an elaborated design or fine tuning of controller parameters. The proposed DTA-MPC predicts future behaviors of the ST duty cycle and modulation signals, based on the established discrete-time average model of the quasi-Z-source (qZS) inductor current, the qZS capacitor voltage, and load currents. The prediction actions are applied to the qZSI modulator in the next sampling instant, without the need of other controller parameters’ design. A constant switching frequency and significantly reduced computations are achieved with high performance. Transient responses and steady-state accuracy of the qZSI system under the proposed DTA-MPC are investigated and compared with the PI-based control and the conventional MPC. Simulation and experimental results verify the effectiveness of the proposed approach for the qZSI.

  16. A Discrete-Time Average Model Based Predictive Control for Quasi-Z-Source Inverter

    DOE PAGES

    Liu, Yushan; Abu-Rub, Haitham; Xue, Yaosuo; ...

    2017-12-25

    A discrete-time average model-based predictive control (DTA-MPC) is proposed for a quasi-Z-source inverter (qZSI). As a single-stage inverter topology, the qZSI regulates the dc-link voltage and the ac output voltage through the shoot-through (ST) duty cycle and the modulation index. Several feedback strategies have been dedicated to produce these two control variables, among which the most popular are the proportional–integral (PI)-based control and the conventional model-predictive control (MPC). However, in the former, there are tradeoffs between fast response and stability; the latter is robust, but at the cost of high calculation burden and variable switching frequency. Moreover, they require an elaborated design or fine tuning of controller parameters. The proposed DTA-MPC predicts future behaviors of the ST duty cycle and modulation signals, based on the established discrete-time average model of the quasi-Z-source (qZS) inductor current, the qZS capacitor voltage, and load currents. The prediction actions are applied to the qZSI modulator in the next sampling instant, without the need of other controller parameters’ design. A constant switching frequency and significantly reduced computations are achieved with high performance. Transient responses and steady-state accuracy of the qZSI system under the proposed DTA-MPC are investigated and compared with the PI-based control and the conventional MPC. Simulation and experimental results verify the effectiveness of the proposed approach for the qZSI.

  17. Operating Comfort Prediction Model of Human-Machine Interface Layout for Cabin Based on GEP.

    PubMed

    Deng, Li; Wang, Guohua; Chen, Bo

    2015-01-01

    In view of the evaluation and decision-making problem of human-machine interface layout design for cabins, an operating comfort prediction model is proposed based on GEP (Gene Expression Programming), using operating comfort to evaluate layout schemes. Joint angles are used to describe the operating posture of the upper limb and are taken as independent variables to establish the comfort model of operating posture. Factor analysis is adopted to reduce the variable dimension: the model's input variables are reduced from 16 joint angles to 4 comfort impact factors, and the output variable is the operating comfort score. A Chinese virtual human body model is built in CATIA software and used to simulate and evaluate the operators' operating comfort. With 22 groups of evaluation data as training and validation samples, the GEP algorithm is used to obtain the best-fitting function between the joint angles and operating comfort, so that operating comfort can be predicted quantitatively. The operating comfort prediction results for the human-machine interface layout of a driller control room show that the GEP-based prediction model is fast and efficient, has good predictive performance, and can improve design efficiency.

  18. Operating Comfort Prediction Model of Human-Machine Interface Layout for Cabin Based on GEP

    PubMed Central

    Wang, Guohua; Chen, Bo

    2015-01-01

    In view of the evaluation and decision-making problem of human-machine interface layout design for cabins, an operating comfort prediction model is proposed based on GEP (Gene Expression Programming), using operating comfort to evaluate layout schemes. Joint angles are used to describe the operating posture of the upper limb and are taken as independent variables to establish the comfort model of operating posture. Factor analysis is adopted to reduce the variable dimension: the model's input variables are reduced from 16 joint angles to 4 comfort impact factors, and the output variable is the operating comfort score. A Chinese virtual human body model is built in CATIA software and used to simulate and evaluate the operators' operating comfort. With 22 groups of evaluation data as training and validation samples, the GEP algorithm is used to obtain the best-fitting function between the joint angles and operating comfort, so that operating comfort can be predicted quantitatively. The operating comfort prediction results for the human-machine interface layout of a driller control room show that the GEP-based prediction model is fast and efficient, has good predictive performance, and can improve design efficiency. PMID:26448740

  19. BRCA-Monet: a breast cancer specific drug treatment mode-of-action network for treatment effective prediction using large scale microarray database

    PubMed Central

    2013-01-01

    Background Connectivity map (cMap) is a recently developed dataset and algorithm for uncovering and understanding the treatment effect of small molecules on different cancer cell lines. It is widely used, but challenges remain in making accurate predictions. Method Here, we propose BRCA-MoNet, a network of drug mode of action (MoA) specific to breast cancer, which is constructed based on the cMap dataset. A drug signature selection algorithm fitted to the characteristics of cMap data, a quality control scheme, as well as a novel query algorithm based on BRCA-MoNet are developed for more effective prediction of drug effects. Result BRCA-MoNet was applied to three independent data sets obtained from the GEO database: an estradiol-treated MCF7 cell line, a BMS-754807-treated MCF7 cell line, and a breast cancer patient microarray dataset. In the first case, BRCA-MoNet could identify drug MoAs likely to share the same and reverse treatment effects. In the second case, the result demonstrated the potential of BRCA-MoNet to reposition drugs and predict treatment effects for drugs not in the cMap data. In the third case, a possible procedure for personalized drug selection is showcased. Conclusions The results clearly demonstrate that the proposed BRCA-MoNet approach can provide increased prediction power to cMap and thus will be useful for identification of new therapeutic candidates. Website: The web-based application can be accessed through the following link: http://compgenomics.utsa.edu/BRCAMoNet/ PMID:24564956

  20. Multi-centre diagnostic classification of individual structural neuroimaging scans from patients with major depressive disorder.

    PubMed

    Mwangi, Benson; Ebmeier, Klaus P; Matthews, Keith; Steele, J Douglas

    2012-05-01

    Quantitative abnormalities of brain structure in patients with major depressive disorder have been reported at a group level for decades. However, these structural differences appear subtle in comparison with conventional radiologically defined abnormalities, with considerable inter-subject variability. Consequently, it has not been possible to readily identify scans from patients with major depressive disorder at an individual level. Recently, machine learning techniques such as relevance vector machines and support vector machines have been applied to predictive classification of individual scans with variable success. Here we describe a novel hybrid method, which combines machine learning with feature selection and characterization, with the latter aimed at maximizing the accuracy of machine learning prediction. The method was tested using a multi-centre dataset of T(1)-weighted 'structural' scans. A total of 62 patients with major depressive disorder and matched controls were recruited from referred secondary care clinical populations in Aberdeen and Edinburgh, UK. The generalization ability and predictive accuracy of the classifiers was tested using data left out of the training process. High prediction accuracy was achieved (~90%). While feature selection was important for maximizing high predictive accuracy with machine learning, feature characterization contributed only a modest improvement to relevance vector machine-based prediction (~5%). Notably, while the only information provided for training the classifiers was T(1)-weighted scans plus a categorical label (major depressive disorder versus controls), both relevance vector machine and support vector machine 'weighting factors' (used for making predictions) correlated strongly with subjective ratings of illness severity. 
These results indicate that machine learning techniques have the potential to inform clinical practice and research, as they can make accurate predictions about brain scan data from individual subjects. Furthermore, machine learning weighting factors may reflect an objective biomarker of major depressive disorder illness severity, based on abnormalities of brain structure.

  1. Improved model predictive control of resistive wall modes by error field estimator in EXTRAP T2R

    NASA Astrophysics Data System (ADS)

    Setiadi, A. C.; Brunsell, P. R.; Frassinetti, L.

    2016-12-01

    Many implementations of model-based approaches for toroidal plasmas have shown better control performance than conventional feedback controllers. One prerequisite of model-based control is the availability of a control-oriented model. Such a model can be obtained empirically through a systematic procedure called system identification, and is used in this work to design a model predictive controller to stabilize multiple resistive wall modes in the EXTRAP T2R reversed-field pinch. Model predictive control is an advanced control method that optimizes the future behaviour of a system. Furthermore, this paper discusses an additional use of the empirical model: estimating the error field in EXTRAP T2R. Two potential methods for estimating the error field are discussed. The error field estimator is then combined with the model predictive controller and yields better radial magnetic field suppression.
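    The abstract does not state which model structure the system identification produced. A common control-oriented choice is an ARX model fitted by least squares, sketched below for single-input single-output data purely as an illustration of the identification step.

```python
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Least-squares fit of an ARX model
    y[k] = a1*y[k-1] + ... + a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb].

    Returns the stacked coefficient vector [a1..a_na, b1..b_nb].
    """
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(y)):
        row = [y[k - i] for i in range(1, na + 1)] + \
              [u[k - i] for i in range(1, nb + 1)]
        rows.append(row)
        targets.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta
```

    A model of this form (in practice multivariable, to cover the multiple resistive wall modes) can then serve as the prediction model inside the model predictive controller.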

  2. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215

  3. A comparative study: classification vs. user-based collaborative filtering for clinical prediction.

    PubMed

    Hao, Fang; Blair, Rachael Hageman

    2016-12-08

    Recommender systems have shown tremendous value for the prediction of personalized item recommendations for individuals in a variety of settings (e.g., marketing, e-commerce, etc.). User-based collaborative filtering is a popular recommender system, which leverages an individual's prior satisfaction with items, as well as the satisfaction of individuals that are "similar". Recently, there have been applications of collaborative filtering based recommender systems for clinical risk prediction. In these applications, individuals represent patients, and items represent clinical data, which includes an outcome. Application of recommender systems to a problem of this type requires recasting a supervised learning problem as an unsupervised one. The rationale is that patients with similar clinical features carry a similar disease risk. As the "Big Data" era progresses, it is likely that approaches of this type will increasingly be reached for as biomedical data continue to grow in both size and complexity (e.g., electronic health records). In the present study, we set out to understand and assess the performance of recommender systems in a controlled yet realistic setting. User-based collaborative filtering recommender systems are compared to logistic regression and random forests with different types of imputation and varying amounts of missingness on four different publicly available medical data sets: National Health and Nutrition Examination Survey (NHANES, 2011-2012 on Obesity), Study to Understand Prognoses Preferences Outcomes and Risks of Treatment (SUPPORT), chronic kidney disease, and dermatology data. We also examined performance using simulated data with observations that are Missing At Random (MAR) or Missing Completely At Random (MCAR) under various degrees of missingness and levels of class imbalance in the response variable. 
    Our results demonstrate that user-based collaborative filtering is consistently inferior to logistic regression and random forests with different imputations on real and simulated data. These results warrant caution in using collaborative filtering for clinical risk prediction when traditional classification is feasible and practical; CF may not be desirable in datasets where classification is an acceptable alternative. We describe some natural applications related to "Big Data" where CF would be preferred, and conclude with some insights as to why caution may be warranted in this context.
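    For reference, the core of a user-based collaborative filtering predictor of the kind evaluated here is only a few lines: the target patient's unknown outcome is estimated as the similarity-weighted average of the outcomes of similar patients. A minimal sketch with cosine similarity follows; the study's exact weighting, normalization, and neighborhood choices may differ.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def predict_risk(target_features, neighbors):
    """User-based CF prediction for one patient.

    neighbors: list of (features, outcome) pairs for patients whose
    outcome is known. Returns the similarity-weighted mean outcome,
    or None if no neighbor is informative.
    """
    weights = [cosine(target_features, f) for f, _ in neighbors]
    wsum = sum(abs(w) for w in weights)
    if wsum == 0:
        return None
    return sum(w * o for w, (_, o) in zip(weights, neighbors)) / wsum
```

    The recasting described above is visible here: the outcome is treated as just another "item" to fill in from similar users, rather than as a supervised label.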

  4. Calculation of Shuttle Base Heating Environments and Comparison with Flight Data

    NASA Technical Reports Server (NTRS)

    Greenwood, T. F.; Lee, Y. C.; Bender, R. L.; Carter, R. E.

    1983-01-01

    The techniques, analytical tools, and experimental programs used initially to generate and later to improve and validate the Shuttle base heating design environments are discussed. In general, the measured base heating environments for STS-1 through STS-5 were in good agreement with the preflight predictions. However, some changes were made in the methodology after reviewing the flight data. The flight data is described, preflight predictions are compared with the flight data, and improvements in the prediction methodology based on the data are discussed.

  5. How physiological and physical processes contribute to the phenology of cyanobacterial blooms in large shallow lakes: A new Euler-Lagrangian coupled model.

    PubMed

    Feng, Tao; Wang, Chao; Wang, Peifang; Qian, Jin; Wang, Xun

    2018-09-01

    Cyanobacterial blooms have emerged as one of the most severe ecological problems affecting large and shallow freshwater lakes. To improve our understanding of the factors that influence, and could be used to predict, surface blooms, this study developed a novel Euler-Lagrangian coupled approach combining the Eulerian model with agent-based modelling (ABM). The approach was subsequently verified based on monitoring datasets and MODIS data in a large shallow lake (Lake Taihu, China). The Eulerian model solves the Eulerian variables and physiological parameters, whereas ABM generates the complete life cycle and transport processes of cyanobacterial colonies. This model ensemble performed well in fitting historical data and predicting the dynamics of cyanobacterial biomass, bloom distribution, and area. Based on the calculated physical and physiological characteristics of surface blooms, principal component analysis (PCA) captured the major processes influencing surface bloom formation at different stages (two bloom clusters). Early bloom outbreaks were influenced by physical processes (horizontal transport and vertical turbulence-induced mixing), whereas buoyancy-controlling strategies were essential for mature bloom outbreaks. Canonical correlation analysis (CCA) revealed the combined actions of multiple environment variables on different bloom clusters. The effects of buoyancy-controlling strategies (ISP), vertical turbulence-induced mixing velocity of colony (VMT) and horizontal drift velocity of colony (HDT) were quantitatively compared using scenario simulations in the coupled model. VMT accounted for 52.9% of bloom formations and maintained blooms over long periods, thus demonstrating the importance of wind-induced turbulence in shallow lakes. In comparison, HDT and buoyancy controlling strategies influenced blooms at different stages. 
    In conclusion, the approach developed here presents a promising tool for understanding the processes of onshore/offshore algal bloom formation and for subsequent prediction. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Active control strategy for the running attitude of high-speed train under strong crosswind condition

    NASA Astrophysics Data System (ADS)

    Li, Decang; Meng, Jianjun; Bai, Huan; Xu, Ruxun

    2018-07-01

    This paper focuses on the safety of high-speed trains under strong crosswind conditions. A new active control strategy is proposed based on adaptive predictive control theory. The strategy adjusts the attitude of a train by controlling a new type of intelligent giant magnetostrictive actuator (GMA). It combines adaptive control with dynamic matrix control: the parameters of the predictive controller are adjusted in real time through online identification to enhance the robustness of the control algorithm. On this basis, a correction control algorithm is also designed to regulate the parameters of the predictive controller based on the step response of the controlled objective. Finally, simulation results show that the proposed control strategy can adjust the running attitudes of high-speed trains under strong crosswind conditions; they also indicate that the new active control strategy is effective and applicable in improving the safety performance of a train, based on a host-target computer environment provided by Matlab/Simulink.
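
    The dynamic matrix control (DMC) half of the scheme described above admits a compact sketch. The following is a minimal, generic DMC gain computation from a unit-step-response model; it illustrates the standard textbook algorithm, not the paper's adaptive implementation, and the first-order plant and all parameter values are assumed for the example.

```python
import numpy as np

def dmc_gain(step_coeffs, P, M, weight=0.0):
    """Build the DMC feedback gain from unit-step-response coefficients.

    step_coeffs : first coefficients of the plant's unit step response
    P : prediction horizon, M : control horizon, weight : move suppression.
    Returns the first row of the least-squares controller gain, which maps
    the predicted error trajectory to the current control increment.
    """
    A = np.zeros((P, M))
    for i in range(P):
        for j in range(M):
            if i - j >= 0:
                A[i, j] = step_coeffs[i - j]
    K = np.linalg.solve(A.T @ A + weight * np.eye(M), A.T)
    return K[0]  # only the first move is applied (receding horizon)

# Assumed first-order plant: step response s_k = 1 - 0.8**k (unit gain)
s = [1 - 0.8 ** k for k in range(1, 21)]
k0 = dmc_gain(s, P=10, M=3, weight=0.1)
# du = k0 @ error_trajectory gives the current control increment
```

    An adaptive variant, as in the abstract, would re-estimate `step_coeffs` online and rebuild the gain as the plant changes.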

  7. Parameterized data-driven fuzzy model based optimal control of a semi-batch reactor.

    PubMed

    Kamesh, Reddi; Rani, K Yamuna

    2016-09-01

    A parameterized data-driven fuzzy (PDDF) model structure is proposed for semi-batch processes, and its application for optimal control is illustrated. The orthonormally parameterized input trajectories, initial states and process parameters are the inputs to the model, which predicts the output trajectories in terms of Fourier coefficients. Fuzzy rules are formulated based on the signs of a linear data-driven model, while the defuzzification step incorporates a linear regression model to shift the domain from input to output domain. The fuzzy model is employed to formulate an optimal control problem for single rate as well as multi-rate systems. Simulation study on a multivariable semi-batch reactor system reveals that the proposed PDDF modeling approach is capable of capturing the nonlinear and time-varying behavior inherent in the semi-batch system fairly accurately, and the results of operating trajectory optimization using the proposed model are found to be comparable to the results obtained using the exact first principles model, and are also found to be comparable to or better than parameterized data-driven artificial neural network model based optimization results. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Assess and Predict Automatic Generation Control Performances for Thermal Power Generation Units Based on Modeling Techniques

    NASA Astrophysics Data System (ADS)

    Zhao, Yan; Yang, Zijiang; Gao, Song; Liu, Jinbiao

    2018-02-01

    Automatic generation control (AGC) is a key technology for maintaining the real-time balance between power generation and load, and for ensuring the quality of power supply. Power grids require each power generation unit to have satisfactory AGC performance, as specified in two detailed rules. These rules provide a set of indices to measure the AGC performance of a power generation unit. However, the commonly used method of calculating these indices is based on particular data samples from AGC responses and can lead to incorrect results in practice. This paper proposes a new method to estimate the AGC performance indices via system identification techniques. In addition, a nonlinear regression model relating the performance indices to the load command is built in order to predict the AGC performance indices. The effectiveness of the proposed method is validated through industrial case studies.
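
    The identification step the abstract relies on can be illustrated with a minimal sketch: fitting a discrete first-order model y[k+1] = a·y[k] + b·u[k] to a recorded response by least squares, from which indices such as steady-state gain can be derived. The model structure and all numbers below are assumptions for illustration, not the paper's actual identification procedure.

```python
import numpy as np

def fit_first_order(y, u):
    """Least-squares fit of y[k+1] = a*y[k] + b*u[k] to a recorded response."""
    X = np.column_stack([y[:-1], u[:-1]])
    a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return a, b

# Synthetic AGC-style step response of a generation unit (assumed dynamics)
a_true, b_true = 0.9, 0.5
u = np.ones(50)
y = np.zeros(50)
for k in range(49):
    y[k + 1] = a_true * y[k] + b_true * u[k]

a_hat, b_hat = fit_first_order(y, u)
# From (a, b) one can derive indices such as the steady-state gain b/(1-a)
```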

  9. ARGES: an Expert System for Fault Diagnosis Within Space-Based ECLS Systems

    NASA Technical Reports Server (NTRS)

    Pachura, David W.; Suleiman, Salem A.; Mendler, Andrew P.

    1988-01-01

    ARGES (Atmospheric Revitalization Group Expert System) is a demonstration prototype expert system for fault management for the Solid Amine, Water Desorbed (SAWD) CO2 removal assembly, associated with the Environmental Control and Life Support (ECLS) System. ARGES monitors and reduces data in real time from either the SAWD controller or a simulation of the SAWD assembly. It can detect gradual degradations or predict failures. This allows graceful shutdown and scheduled maintenance, which reduces crew maintenance overhead. Status and fault information is presented in a user interface that simulates what would be seen by a crewperson. The user interface employs animated color graphics and an object-oriented approach to provide detailed status information, fault identification, and explanation of reasoning in a rapidly assimilated manner. In addition, ARGES recommends possible courses of action for predicted and actual faults. ARGES is seen as a forerunner of AI-based fault management systems for manned space systems.

  10. Fourier transform wavefront control with adaptive prediction of the atmosphere.

    PubMed

    Poyneer, Lisa A; Macintosh, Bruce A; Véran, Jean-Pierre

    2007-09-01

    Predictive Fourier control is a temporal power spectral density-based adaptive method for adaptive optics that predicts the atmosphere under the assumption of frozen flow. The predictive controller is based on Kalman filtering and a Fourier decomposition of atmospheric turbulence using the Fourier transform reconstructor. It provides a stable way to compensate for arbitrary numbers of atmospheric layers. For each Fourier mode, efficient and accurate algorithms estimate the necessary atmospheric parameters from closed-loop telemetry and determine the predictive filter, adjusting as conditions change. This prediction improves atmospheric rejection, leading to significant improvements in system performance. For a 48×48 actuator system operating at 2 kHz, five-layer prediction for all modes is achievable in under 2×10^9 floating-point operations per second.
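
    The per-mode prediction idea can be sketched as a scalar complex Kalman filter: under frozen flow, each Fourier mode is a complex phasor rotating at a fixed temporal frequency, so the filter's time update is a known phase rotation. This is a simplified single-layer illustration with assumed noise levels, not the paper's actual multi-layer predictor.

```python
import numpy as np

def kalman_predict_mode(meas, omega, q=1e-4, r=1e-2):
    """One-step-ahead Kalman prediction of a single Fourier mode.

    Under frozen flow the mode evolves as x[k+1] = exp(1j*omega)*x[k] + noise.
    meas : noisy complex measurements of the mode from closed-loop telemetry.
    Returns the predicted mode value for the next time step.
    """
    a = np.exp(1j * omega)
    x, p = 0.0 + 0.0j, 1.0
    for z in meas:
        # measurement update
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        # time update: propagate through the frozen-flow phase rotation
        x = a * x
        p = p + q
    return x

rng = np.random.default_rng(0)
omega = 0.2  # assumed temporal frequency of this mode
true = np.exp(1j * omega * np.arange(200))
meas = true + 0.1 * (rng.standard_normal(200) + 1j * rng.standard_normal(200))
pred = kalman_predict_mode(meas, omega)  # close to the true value at step 200
```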

  11. Using Time Series Analysis to Predict Cardiac Arrest in a PICU.

    PubMed

    Kennedy, Curtis E; Aoki, Noriaki; Mariscalco, Michele; Turley, James P

    2015-11-01

    To build and test cardiac arrest prediction models in a PICU, using time series analysis as input, and to measure changes in prediction accuracy attributable to different classes of time series data. Retrospective cohort study. Thirty-one-bed academic PICU that provides care for medical and general surgical (not congenital heart surgery) patients. Patients experiencing a cardiac arrest in the PICU and requiring external cardiac massage for at least 2 minutes. None. One hundred three cases of cardiac arrest and 109 control cases were used to prepare a baseline dataset that consisted of 1,025 variables in four data classes: multivariate, raw time series, clinical calculations, and time series trend analysis. We trained 20 arrest prediction models using a matrix of five feature sets (combinations of data classes) with four modeling algorithms: linear regression, decision tree, neural network, and support vector machine. The reference model (multivariate data with regression algorithm) had an accuracy of 78% and 87% area under the receiver operating characteristic curve. The best model (multivariate + trend analysis data with support vector machine algorithm) had an accuracy of 94% and 98% area under the receiver operating characteristic curve. Cardiac arrest predictions based on a traditional model built with multivariate data and a regression algorithm misclassified cases 3.7 times more frequently than predictions that included time series trend analysis and were built with a support vector machine algorithm. Although the final model lacks the specificity necessary for clinical application, we have demonstrated how information from time series data can be used to increase the accuracy of clinical prediction models.
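
    The "time series trend analysis" data class can be illustrated with a minimal feature extractor: summarizing a vital-sign window by its last value, least-squares slope, and residual variability. The feature set and the example series are assumptions for illustration, not the study's actual 1,025-variable dataset.

```python
import numpy as np

def trend_features(series):
    """Summarize a vital-sign time series as (last value, slope, residual spread).

    The slope comes from an ordinary least-squares line fit; the residual
    standard deviation captures short-term variability around the trend.
    """
    t = np.arange(len(series), dtype=float)
    slope, intercept = np.polyfit(t, series, 1)
    resid = series - (slope * t + intercept)
    return np.array([series[-1], slope, resid.std()])

hr = np.array([110, 112, 115, 119, 124, 130], dtype=float)  # rising heart rate
feats = trend_features(hr)
# feats[1] > 0 flags the upward trend that a single snapshot value would miss
```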

  12. Bayesian Geostatistical Model-Based Estimates of Soil-Transmitted Helminth Infection in Nigeria, Including Annual Deworming Requirements

    PubMed Central

    Oluwole, Akinola S.; Ekpo, Uwem F.; Karagiannis-Voules, Dimitrios-Alexios; Abe, Eniola M.; Olamiju, Francisca O.; Isiyaku, Sunday; Okoronkwo, Chukwu; Saka, Yisa; Nebe, Obiageli J.; Braide, Eka I.; Mafiana, Chiedu F.; Utzinger, Jürg; Vounatsou, Penelope

    2015-01-01

    Background The acceleration of the control of soil-transmitted helminth (STH) infections in Nigeria, emphasizing preventive chemotherapy, has become imperative in light of the global fight against neglected tropical diseases. Predictive risk maps are an important tool to guide and support control activities. Methodology STH infection prevalence data were obtained from surveys carried out in 2011 using standard protocols. Data were geo-referenced and collated in a nationwide, geographic information system database. Bayesian geostatistical models with remotely sensed environmental covariates and variable selection procedures were utilized to predict the spatial distribution of STH infections in Nigeria. Principal Findings We found that hookworm, Ascaris lumbricoides, and Trichuris trichiura infections are endemic in 482 (86.8%), 305 (55.0%), and 55 (9.9%) locations, respectively. Hookworm and A. lumbricoides infection co-exist in 16 states, while the three species are co-endemic in 12 states. Overall, STHs are endemic in 20 of the 36 states of Nigeria, including the Federal Capital Territory of Abuja. The observed prevalence at endemic locations ranged from 1.7% to 51.7% for hookworm, from 1.6% to 77.8% for A. lumbricoides, and from 1.0% to 25.5% for T. trichiura. Model-based predictions ranged from 0.7% to 51.0% for hookworm, from 0.1% to 82.6% for A. lumbricoides, and from 0.0% to 18.5% for T. trichiura. Our models suggest that day land surface temperature and dense vegetation are important predictors of the spatial distribution of STH infection in Nigeria. In 2011, a total of 5.7 million (13.8%) school-aged children were predicted to be infected with STHs in Nigeria. Mass treatment requirements at the local government area level for annual or bi-annual treatment of the school-aged population in Nigeria in 2011, based on World Health Organization prevalence thresholds, were estimated at 10.2 million tablets. 
Conclusions/Significance The predictive risk maps and estimated deworming needs presented here will be helpful for escalating the control and spatial targeting of interventions against STH infections in Nigeria. PMID:25909633

  13. Bayesian geostatistical model-based estimates of soil-transmitted helminth infection in Nigeria, including annual deworming requirements.

    PubMed

    Oluwole, Akinola S; Ekpo, Uwem F; Karagiannis-Voules, Dimitrios-Alexios; Abe, Eniola M; Olamiju, Francisca O; Isiyaku, Sunday; Okoronkwo, Chukwu; Saka, Yisa; Nebe, Obiageli J; Braide, Eka I; Mafiana, Chiedu F; Utzinger, Jürg; Vounatsou, Penelope

    2015-04-01

    The acceleration of the control of soil-transmitted helminth (STH) infections in Nigeria, emphasizing preventive chemotherapy, has become imperative in light of the global fight against neglected tropical diseases. Predictive risk maps are an important tool to guide and support control activities. STH infection prevalence data were obtained from surveys carried out in 2011 using standard protocols. Data were geo-referenced and collated in a nationwide, geographic information system database. Bayesian geostatistical models with remotely sensed environmental covariates and variable selection procedures were utilized to predict the spatial distribution of STH infections in Nigeria. We found that hookworm, Ascaris lumbricoides, and Trichuris trichiura infections are endemic in 482 (86.8%), 305 (55.0%), and 55 (9.9%) locations, respectively. Hookworm and A. lumbricoides infection co-exist in 16 states, while the three species are co-endemic in 12 states. Overall, STHs are endemic in 20 of the 36 states of Nigeria, including the Federal Capital Territory of Abuja. The observed prevalence at endemic locations ranged from 1.7% to 51.7% for hookworm, from 1.6% to 77.8% for A. lumbricoides, and from 1.0% to 25.5% for T. trichiura. Model-based predictions ranged from 0.7% to 51.0% for hookworm, from 0.1% to 82.6% for A. lumbricoides, and from 0.0% to 18.5% for T. trichiura. Our models suggest that day land surface temperature and dense vegetation are important predictors of the spatial distribution of STH infection in Nigeria. In 2011, a total of 5.7 million (13.8%) school-aged children were predicted to be infected with STHs in Nigeria. Mass treatment requirements at the local government area level for annual or bi-annual treatment of the school-aged population in Nigeria in 2011, based on World Health Organization prevalence thresholds, were estimated at 10.2 million tablets. 
The predictive risk maps and estimated deworming needs presented here will be helpful for escalating the control and spatial targeting of interventions against STH infections in Nigeria.

  14. The advantages of the surface Laplacian in brain-computer interface research.

    PubMed

    McFarland, Dennis J

    2015-09-01

    Brain-computer interface (BCI) systems frequently use signal processing methods, such as spatial filtering, to enhance performance. The surface Laplacian can reduce spatial noise and aid in identification of sources. In BCI research, these two functions of the surface Laplacian correspond to prediction accuracy and signal orthogonality. In the present study, an off-line analysis of data from a sensorimotor rhythm-based BCI task dissociated these functions of the surface Laplacian by comparing nearest-neighbor and next-nearest neighbor Laplacian algorithms. The nearest-neighbor Laplacian produced signals that were more orthogonal while the next-nearest Laplacian produced signals that resulted in better accuracy. Both prediction and signal identification are important for BCI research. Better prediction of user's intent produces increased speed and accuracy of communication and control. Signal identification is important for ruling out the possibility of control by artifacts. Identifying the nature of the control signal is relevant both to understanding exactly what is being studied and in terms of usability for individuals with limited motor control. Copyright © 2014 Elsevier B.V. All rights reserved.
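
    A nearest-neighbor surface Laplacian of the kind compared in this study has a simple form: each channel minus the mean of its four immediate neighbors. Below is a minimal sketch on a rectangular electrode grid; it is illustrative only (real montages require electrode geometry and edge handling) and the example values are assumed.

```python
import numpy as np

def nn_laplacian(grid):
    """Nearest-neighbor surface Laplacian on a 2-D electrode grid.

    Each interior channel is re-referenced to the mean of its four
    immediate neighbors, attenuating spatially broad activity.
    """
    g = np.asarray(grid, dtype=float)
    out = np.zeros_like(g)
    for i in range(1, g.shape[0] - 1):
        for j in range(1, g.shape[1] - 1):
            out[i, j] = g[i, j] - (g[i-1, j] + g[i+1, j] + g[i, j-1] + g[i, j+1]) / 4.0
    return out

# A spatially uniform field is removed entirely; a local peak is preserved
flat = np.full((5, 5), 3.0)
peak = flat.copy()
peak[2, 2] += 1.0
```

    A next-nearest-neighbor variant would use a wider neighbor ring, trading spatial sharpness for the smoother signals the abstract associates with better prediction accuracy.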

  15. Predicting adsorption isotherms for aqueous organic micropollutants from activated carbon and pollutant properties.

    PubMed

    Li, Lei; Quinlivan, Patricia A; Knappe, Detlef R U

    2005-05-01

    A method based on the Polanyi-Dubinin-Manes (PDM) model is presented to predict adsorption isotherms of aqueous organic contaminants on activated carbons. It was assumed that trace organic compound adsorption from aqueous solution is primarily controlled by nonspecific dispersive interactions while water adsorption is controlled by specific interactions with oxygen-containing functional groups on the activated carbon surface. Coefficients describing the affinity of water for the activated carbon surface were derived from aqueous-phase methyl tertiary-butyl ether (MTBE) and trichloroethene (TCE) adsorption isotherm data that were collected with 12 well-characterized activated carbons. Over the range of oxygen contents covered by the adsorbents (approximately 0.8-10 mmol O/g dry, ash-free activated carbon), a linear relationship between water affinity coefficients and adsorbent oxygen content was obtained. Incorporating water affinity coefficients calculated from the developed relationship into the PDM model yielded isotherm predictions that agreed well with experimental data for three adsorbents and two adsorbates [tetrachloroethene (PCE), cis-1,2-dichloroethene (DCE)] that were not used to calibrate the model.
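
    The PDM framework rests on the Polanyi adsorption potential, ε = RT·ln(Cs/Ce), and a characteristic curve relating the normalized potential to adsorbed volume. A hedged sketch follows; the Dubinin-Astakhov-style curve form and all parameter values are illustrative assumptions, not the coefficients derived in the paper.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def polanyi_potential(T, c_sat, c_eq):
    """Adsorption potential epsilon = R*T*ln(Cs/Ce) in J/mol."""
    return R * T * math.log(c_sat / c_eq)

def pdm_volume(eps_norm, V0, a, b):
    """Characteristic curve (illustrative parameters): adsorbed volume
    = V0 * 10**(-a * eps_norm**b), with eps_norm the adsorption potential
    normalized by the adsorbate molar volume."""
    return V0 * 10.0 ** (-a * eps_norm ** b)

# Example: a solubility-to-equilibrium ratio of ~1100 at 25 C (assumed values)
eps = polanyi_potential(298.0, c_sat=1100.0, c_eq=1.0)  # ~17 kJ/mol
```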

  16. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    PubMed

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
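
    The cohort-rebalancing step mentioned in (i) can be illustrated in its simplest form, random oversampling of the minority class; the study's actual rebalancing method may differ, and the toy data below are assumed.

```python
import numpy as np

def oversample_minority(X, y, seed=0):
    """Rebalance a binary cohort by resampling the minority class with
    replacement until both classes have equal counts."""
    rng = np.random.default_rng(seed)
    labels, counts = np.unique(y, return_counts=True)
    minority = labels[np.argmin(counts)]
    need = counts.max() - counts.min()
    idx = np.flatnonzero(y == minority)
    extra = rng.choice(idx, size=need, replace=True)
    keep = np.concatenate([np.arange(len(y)), extra])
    return X[keep], y[keep]

# Toy cohort: 8 controls vs 2 cases, rebalanced to 8 vs 8
X = np.arange(10).reshape(10, 1).astype(float)
y = np.array([0] * 8 + [1] * 2)
Xb, yb = oversample_minority(X, y)
```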

  17. Profile control simulations and experiments on TCV: a controller test environment and results using a model-based predictive controller

    NASA Astrophysics Data System (ADS)

    Maljaars, E.; Felici, F.; Blanken, T. C.; Galperti, C.; Sauter, O.; de Baar, M. R.; Carpanese, F.; Goodman, T. P.; Kim, D.; Kim, S. H.; Kong, M.; Mavkov, B.; Merle, A.; Moret, J. M.; Nouailletas, R.; Scheffer, M.; Teplukhina, A. A.; Vu, N. M. T.; The EUROfusion MST1-team; The TCV-team

    2017-12-01

    The successful performance of a model predictive profile controller is demonstrated in simulations and experiments on the TCV tokamak, employing a profile controller test environment. Stable high-performance tokamak operation in hybrid and advanced plasma scenarios requires control over the safety factor profile (q-profile) and kinetic plasma parameters such as the plasma beta. This demands the establishment of reliable profile control routines in presently operational tokamaks. We present a model predictive profile controller that controls the q-profile and plasma beta using power requests to two clusters of gyrotrons and the plasma current request. The performance of the controller is analyzed in both simulation and TCV L-mode discharges where successful tracking of the estimated inverse q-profile as well as plasma beta is demonstrated under uncertain plasma conditions and the presence of disturbances. The controller exploits the knowledge of the time-varying actuator limits in the actuator input calculation itself such that fast transitions between targets are achieved without overshoot. A software environment is employed to prepare and test this and three other profile controllers in parallel in simulations and experiments on TCV. This set of tools includes the rapid plasma transport simulator RAPTOR and various algorithms to reconstruct the plasma equilibrium and plasma profiles by merging the available measurements with model-based predictions. In this work the estimated q-profile is based solely on RAPTOR model predictions due to the absence of internal current density measurements in TCV. These results encourage further exploitation of model predictive profile control in experiments on TCV and other (future) tokamaks.

  18. A closed-loop hybrid physiological model relating to subjects under physical stress.

    PubMed

    El-Samahy, Emad; Mahfouf, Mahdi; Linkens, Derek A

    2006-11-01

    The objective of this research study is to derive a comprehensive physiological model relating to subjects under physical stress conditions. The model should describe the behaviour of the cardiovascular system, respiratory system, thermoregulation and brain activity in response to physical workload. An experimental testing rig was built which consists of a recumbent high-performance bicycle for inducing the physical load and a data acquisition system comprising monitors and PCs. The signals acquired and used within this study are the blood pressure, heart rate, respiration, body temperature, and EEG signals. The proposed model is based on a grey-box modelling approach, which was used because of the sufficient level of detail it provides. Cardiovascular and EEG data relating to 16 healthy subject volunteers (data from 12 subjects were used for training/validation and the data from 4 subjects were used for model testing) were collected using the Finapres and the ProComp+ monitors. For model validation, residual analysis via the computing of the confidence intervals as well as related histograms was performed. Closed-loop simulations for different subjects showed that the model can provide reliable predictions for heart rate, blood pressure, body temperature, respiration, and the EEG signals. These findings were also reinforced by the residual analyses data obtained, which suggested that the residuals were within the 90% confidence bands and that the corresponding histograms were of a normal distribution. A higher intelligence level was added to the model, based on neural networks, to extend the capabilities of the model to predict over a wide range of subject dynamics. The elicited physiological model describing the effect of physiological stress on several physiological variables can be used to predict performance breakdown of operators in critical environments. 
Such a model architecture lends itself naturally to exploitation via feedback control in a 'reverse-engineering' fashion to control stress via the specification of a safe operating range for the psycho-physiological variables.

  19. A combined-slip predictive control of vehicle stability with experimental verification

    NASA Astrophysics Data System (ADS)

    Jalali, Milad; Hashemi, Ehsan; Khajepour, Amir; Chen, Shih-ken; Litkouhi, Bakhtiar

    2018-02-01

    In this paper, a model predictive vehicle stability controller is designed based on a combined-slip LuGre tyre model. Variations in the lateral tyre forces due to changes in tyre slip ratios are considered in the prediction model of the controller. It is observed that the proposed combined-slip controller takes advantage of the more accurate tyre model and can adjust tyre slip ratios based on lateral forces of the front axle. This results in an interesting closed-loop response that challenges the notion of braking only the wheels on one side of the vehicle in differential braking. The performance of the proposed controller is evaluated in software simulations and is compared to a similar pure-slip controller. Furthermore, experimental tests are conducted on a rear-wheel drive electric Chevrolet Equinox equipped with differential brakes to evaluate the closed-loop response of the model predictive controller.
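
    The receding-horizon principle underlying such a controller can be sketched in its unconstrained linear form: predict over a horizon, minimize a quadratic tracking-plus-effort cost, and apply only the first move. The double-integrator dynamics below are an assumed stand-in for illustration, not the combined-slip LuGre tyre model used in the paper.

```python
import numpy as np

def mpc_move(A, B, x0, ref, N, lam=0.1):
    """First control move of an unconstrained linear MPC (condensed form).

    Predicts x over horizon N with x+ = A x + B u, minimizes tracking error
    to `ref` plus lam*||u||^2, and returns only the first move
    (receding-horizon principle).
    """
    n, m = B.shape
    F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
    G = np.zeros((N * n, N * m))
    for i in range(N):
        for j in range(i + 1):
            G[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    e = np.tile(ref, N) - F @ x0           # predicted tracking error
    U = np.linalg.solve(G.T @ G + lam * np.eye(N * m), G.T @ e)
    return U[:m]                           # apply only the first move

# Double-integrator-like lateral dynamics (assumed, dt = 0.1 s)
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
u0 = mpc_move(A, B, x0=np.array([1.0, 0.0]), ref=np.array([0.0, 0.0]), N=10)
```

    In a stability controller, constraints (e.g. actuator and slip-ratio limits) would turn this least-squares step into a small quadratic program solved at each sample.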

  20. The role of effort in moderating the anxiety-performance relationship: Testing the prediction of processing efficiency theory in simulated rally driving.

    PubMed

    Wilson, Mark; Smith, Nickolas C; Chattington, Mark; Ford, Mike; Marple-Horvat, Dilwyn E

    2006-11-01

    We tested some of the key predictions of processing efficiency theory using a simulated rally driving task. Two groups of participants were classified as either dispositionally high or low anxious based on trait anxiety scores and trained on a simulated driving task. Participants then raced individually on two similar courses under counterbalanced experimental conditions designed to manipulate the level of anxiety experienced. The effort exerted on the driving tasks was assessed through self-report (RSME), psychophysiological measures (pupil dilation) and visual gaze data. Efficiency was measured in terms of efficiency of visual processing (search rate) and driving control (variability of wheel and accelerator pedal) indices. Driving performance was measured as the time taken to complete the course. As predicted, increased anxiety had a negative effect on processing efficiency as indexed by the self-report, pupillary response and variability of gaze data. Predicted differences due to dispositional levels of anxiety were also found in the driving control and effort data. Although both groups of drivers performed worse under the threatening condition, the performance of the high trait anxious individuals was affected to a greater extent by the anxiety manipulation than the performance of the low trait anxious drivers. The findings suggest that processing efficiency theory holds promise as a theoretical framework for examining the relationship between anxiety and performance in sport.

  1. X-43A Flight-Test-Determined Aerodynamic Force and Moment Characteristics at Mach 7.0

    NASA Technical Reports Server (NTRS)

    Davis, Mark C.; White, J. Terry

    2008-01-01

    The second flight of the Hyper-X program afforded a unique opportunity to determine the aerodynamic force and moment characteristics of an airframe-integrated scramjet-powered aircraft in hypersonic flight. These data were gathered via a repeated series of pitch, yaw, and roll doublets, frequency sweeps, and pushover-pullup maneuvers performed throughout the X-43A cowl-closed descent. Maneuvers were conducted at Mach numbers of 6.80-0.95 and at altitudes from 92,000 ft mean sea level to sea level. The dynamic pressure varied from 1300 to 400 psf with the angle of attack ranging from 0 to 14 deg. The flight-extracted aerodynamics were compared with preflight predictions based on wind-tunnel test data. The X-43A flight-derived axial force was found to be 10-15% higher than prediction. Underpredictions of similar magnitude were observed for the normal force. For Mach numbers above 4.0, the flight-derived stability and control characteristics resulted in larger-than-predicted static margins, with the largest discrepancy approximately 5 in. forward along the x-axis center of gravity at Mach 6.0. This condition would result in less static margin in pitch. The predicted lateral-directional stability and control characteristics matched well with flight data when allowance was made for the high uncertainty in angle of sideslip.

  2. Fatigue behavior and life prediction of a SiC/Ti-24Al-11Nb composite under isothermal conditions. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Bartolotta, Paul A.

    1991-01-01

    Metal Matrix Composites (MMC) and Intermetallic Matrix Composites (IMC) were identified as potential material candidates for advanced aerospace applications. They are especially attractive for high temperature applications which require a low density material that maintains its structural integrity at elevated temperatures. High temperature fatigue resistance plays an important role in determining the structural integrity of the material. This study attempts to examine the relevance of test techniques, failure criterion, and life prediction as they pertain to an IMC material, specifically, unidirectional SiC fiber reinforced titanium aluminide. A series of strain- and load-controlled fatigue tests were conducted on unidirectional SiC/Ti-24Al-11Nb composite at 425 and 815 C. Several damage mechanism regimes were identified by using a strain-based representation of the data, Talreja's fatigue life diagram concept. Results of these tests were then used to address issues of test control modes, definition of failure, and testing techniques. Finally, a strain-based life prediction method was proposed for an IMC under tensile cyclic loadings at elevated temperatures.

  3. Application of a Novel Grey Self-Memory Coupling Model to Forecast the Incidence Rates of Two Notifiable Diseases in China: Dysentery and Gonorrhea

    PubMed Central

    Guo, Xiaojun; Liu, Sifeng; Wu, Lifeng; Tang, Lingling

    2014-01-01

    Objective In this study, a novel grey self-memory coupling model was developed to forecast the incidence rates of two notifiable infectious diseases (dysentery and gonorrhea); the effectiveness and applicability of this model was assessed based on its ability to predict the epidemiological trend of infectious diseases in China. Methods The linear model, the conventional GM(1,1) model and the GM(1,1) model with self-memory principle (SMGM(1,1) model) were used to predict the incidence rates of the two notifiable infectious diseases based on statistical incidence data. Both simulation accuracy and prediction accuracy were assessed to compare the predictive performances of the three models. The best-fit model was applied to predict future incidence rates. Results Simulation results show that the SMGM(1,1) model can take full advantage of the systematic multi-time historical data and possesses superior predictive performance compared with the linear model and the conventional GM(1,1) model. By applying the novel SMGM(1,1) model, we obtained the possible incidence rates of the two representative notifiable infectious diseases in China. Conclusion The disadvantages of the conventional grey prediction model, such as sensitivity to initial value, can be overcome by the self-memory principle. The novel grey self-memory coupling model can predict the incidence rates of infectious diseases more accurately than the conventional model, and may provide useful references for making decisions involving infectious disease prevention and control. PMID:25546054

  4. Application of a novel grey self-memory coupling model to forecast the incidence rates of two notifiable diseases in China: dysentery and gonorrhea.

    PubMed

    Guo, Xiaojun; Liu, Sifeng; Wu, Lifeng; Tang, Lingling

    2014-01-01

    In this study, a novel grey self-memory coupling model was developed to forecast the incidence rates of two notifiable infectious diseases (dysentery and gonorrhea); the effectiveness and applicability of this model were assessed based on its ability to predict the epidemiological trend of infectious diseases in China. The linear model, the conventional GM(1,1) model and the GM(1,1) model with self-memory principle (SMGM(1,1) model) were used to predict the incidence rates of the two notifiable infectious diseases based on statistical incidence data. Both simulation accuracy and prediction accuracy were assessed to compare the predictive performances of the three models. The best-fit model was applied to predict future incidence rates. Simulation results show that the SMGM(1,1) model can take full advantage of the systematic multi-time historical data and possesses superior predictive performance compared with the linear model and the conventional GM(1,1) model. By applying the novel SMGM(1,1) model, we obtained the possible incidence rates of the two representative notifiable infectious diseases in China. The disadvantages of the conventional grey prediction model, such as sensitivity to initial value, can be overcome by the self-memory principle. The novel grey self-memory coupling model can predict the incidence rates of infectious diseases more accurately than the conventional model, and may provide useful references for making decisions involving infectious disease prevention and control.
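    The conventional GM(1,1) model that both records use as a baseline can be sketched as follows. This is an illustrative reconstruction of the standard grey model only, not the authors' SMGM(1,1) code; the function name and interface are assumptions.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Fit a conventional GM(1,1) grey model to a short positive time
    series x0 and forecast `steps` values ahead.  The SMGM(1,1) model
    in the abstract adds a self-memory principle on top of this base."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                    # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])         # background values
    # Least-squares estimate of the development coefficient a and grey input b
    # from the grey differential equation x0(k) + a*z1(k) = b.
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    n = len(x0)
    k = np.arange(n + steps)
    # Time-response function of the whitened equation, then inverse AGO.
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=0.0)
    return x0_hat[-steps:]
```

For a roughly exponential series the forecast tracks the growth rate, which is why grey models suit short, smooth incidence-rate series.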

  5. Ensemble-based prediction of RNA secondary structures.

    PubMed

    Aghaeepour, Nima; Hoos, Holger H

    2013-04-24

    Accurate structure prediction methods play an important role for the understanding of RNA function. Energy-based, pseudoknot-free secondary structure prediction is one of the most widely used and versatile approaches, and improved methods for this task have received much attention over the past five years. Despite the impressive progress that has been achieved in this area, existing evaluations of the prediction accuracy achieved by various algorithms do not provide a comprehensive, statistically sound assessment. Furthermore, while there is increasing evidence that no prediction algorithm consistently outperforms all others, no work has been done to exploit the complementary strengths of multiple approaches. In this work, we present two contributions to the area of RNA secondary structure prediction. Firstly, we use state-of-the-art, resampling-based statistical methods together with a previously published and increasingly widely used dataset of high-quality RNA structures to conduct a comprehensive evaluation of existing RNA secondary structure prediction procedures. The results from this evaluation clarify the performance relationship between ten well-known existing energy-based pseudoknot-free RNA secondary structure prediction methods and clearly demonstrate the progress that has been achieved in recent years. Secondly, we introduce AveRNA, a generic and powerful method for combining a set of existing secondary structure prediction procedures into an ensemble-based method that achieves significantly higher prediction accuracies than obtained from any of its component procedures. Our new, ensemble-based method, AveRNA, improves the state of the art for energy-based, pseudoknot-free RNA secondary structure prediction by exploiting the complementary strengths of multiple existing prediction procedures, as demonstrated using a state-of-the-art statistical resampling approach.
In addition, AveRNA allows an intuitive and effective control of the trade-off between false negative and false positive base pair predictions. Finally, AveRNA can make use of arbitrary sets of secondary structure prediction procedures and can therefore be used to leverage improvements in prediction accuracy offered by algorithms and energy models developed in the future. Our data, MATLAB software and a web-based version of AveRNA are publicly available at http://www.cs.ubc.ca/labs/beta/Software/AveRNA.
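    A generic consensus scheme of the kind an ensemble predictor builds on can be sketched as follows. This is an assumed majority-vote combination, not AveRNA's actual weighting; the vote threshold illustrates the false-positive/false-negative trade-off described above.

```python
from collections import Counter

def ensemble_base_pairs(predictions, threshold=0.5):
    """Combine secondary-structure predictions from several methods.

    Each prediction is a set of (i, j) base pairs.  A pair is kept if at
    least `threshold` of the component methods predict it; raising the
    threshold reduces false positives at the cost of more false negatives.
    This is a generic consensus sketch, not AveRNA's published scheme."""
    votes = Counter(p for pred in predictions for p in pred)
    n = len(predictions)
    return {pair for pair, c in votes.items() if c / n >= threshold}
```

With three component methods, a threshold of 0.5 keeps pairs predicted by at least two of them, while a threshold of 1.0 keeps only unanimous pairs.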

  6. Further Investigation of Receding Horizon-Based Controllers and Neural Network-Based Systems

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul G.; Haley, Pamela J. (Technical Monitor)

    2000-01-01

    This report provides a comprehensive summary of the research work performed over the entire duration of the co-operative research agreement between NASA Langley Research Center and Kansas State University. This summary briefly lists the findings and also suggests possible future directions for the continuation of the subject research in the area of Generalized Predictive Control (GPC) and Neural Network Based Generalized Predictive Control (NGPC).

  7. Analyzing the uncertainty of suspended sediment load prediction using sequential data assimilation

    NASA Astrophysics Data System (ADS)

    Leisenring, Marc; Moradkhani, Hamid

    2012-10-01

    A first step in understanding the impacts of sediment and controlling the sources of sediment is to quantify the mass loading. Since mass loading is the product of flow and concentration, the quantification of loads first requires the quantification of runoff volume. Using the National Weather Service's SNOW-17 and the Sacramento Soil Moisture Accounting (SAC-SMA) models, this study employed particle filter based Bayesian data assimilation methods to predict seasonal snow water equivalent (SWE) and runoff within a small watershed in the Lake Tahoe Basin located in California, USA. A procedure was developed to scale the variance multipliers (a.k.a. hyperparameters) for model parameters and predictions based on the accuracy of the mean predictions relative to the ensemble spread. In addition, an online bias correction algorithm based on the lagged average bias was implemented to detect and correct for systematic bias in model forecasts prior to updating with the particle filter. Both of these methods significantly improved the performance of the particle filter without requiring excessively wide prediction bounds. The flow ensemble was linked to a non-linear regression model that was used to predict suspended sediment concentrations (SSCs) based on runoff rate and time of year. Runoff volumes and SSC were then combined to produce an ensemble of suspended sediment load estimates. Annual suspended sediment loads for the 5 years of simulation were finally computed along with 95% prediction intervals that account for uncertainty in both the SSC regression model and flow rate estimates. Understanding the uncertainty associated with annual suspended sediment load predictions is critical for making sound watershed management decisions aimed at maintaining the exceptional clarity of Lake Tahoe.
The computational methods developed and applied in this research could assist with similar studies where it is important to quantify the predictive uncertainty of pollutant load estimates.
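    The lagged-average bias correction described above can be sketched as a small online corrector. The window length and the error sign convention here are assumptions for illustration, not details taken from the paper.

```python
from collections import deque

class LaggedBiasCorrector:
    """Online bias correction in the spirit of the abstract's
    lagged-average approach: the bias estimate is the mean of the last
    `lag` forecast errors, and is subtracted from each new forecast
    before the particle-filter update step."""

    def __init__(self, lag=10):
        self.errors = deque(maxlen=lag)   # rolling window of forecast errors

    def correct(self, forecast):
        """Return the bias-corrected forecast."""
        bias = sum(self.errors) / len(self.errors) if self.errors else 0.0
        return forecast - bias

    def update(self, forecast, observation):
        """Record the raw forecast error once the observation arrives."""
        self.errors.append(forecast - observation)
```

If a model systematically over-forecasts flow by a constant amount, the corrector converges to that offset and removes it before each update.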

  8. Predicting plant biomass accumulation from image-derived parameters

    PubMed Central

    Chen, Dijun; Shi, Rongli; Pape, Jean-Michel; Neumann, Kerstin; Graner, Andreas; Chen, Ming; Klukas, Christian

    2018-01-01

    Background Image-based high-throughput phenotyping technologies have been rapidly developed in plant science recently, and they offer great potential to gain more valuable information than traditional destructive methods. Predicting plant biomass is regarded as a key goal for plant breeders and ecologists. However, it is a great challenge to find a biomass prediction model that holds across experiments. Results In the present study, we constructed 4 predictive models to examine the quantitative relationship between image-based features and plant biomass accumulation. Our methodology has been applied to 3 consecutive barley (Hordeum vulgare) experiments with control and stress treatments. The results proved that plant biomass can be accurately predicted from image-based parameters using a random forest model. The high prediction accuracy based on this model will contribute to relieving the phenotyping bottleneck in biomass measurement in breeding applications. The prediction performance remains relatively high across experiments under similar conditions. The relative contribution of individual features for predicting biomass was further quantified, revealing new insights into the phenotypic determinants of the plant biomass outcome. Furthermore, the methods could also be used to determine the most important image-based features related to plant biomass accumulation, which would be promising for subsequent genetic mapping to uncover the genetic basis of biomass. Conclusions We have developed quantitative models to accurately predict plant biomass accumulation from image data. We anticipate that the analysis results will be useful to advance our views of the phenotypic determinants of plant biomass outcome, and the statistical methods can be broadly used for other plant species. PMID:29346559

  9. Risk of fetal mortality after exposure to Listeria monocytogenes based on dose-response data from pregnant guinea pigs and primates.

    PubMed

    Williams, Denita; Castleman, Jennifer; Lee, Chi-Ching; Mote, Beth; Smith, Mary Alice

    2009-11-01

    One-third of the annual cases of listeriosis in the United States occur during pregnancy and can lead to miscarriage or stillbirth, premature delivery, or infection of the newborn. Previous risk assessments completed by the Food and Drug Administration/the Food Safety Inspection Service of the U.S. Department of Agriculture/the Centers for Disease Control and Prevention (FDA/USDA/CDC) and the Food and Agricultural Organization/the World Health Organization (FAO/WHO) were based on dose-response data from mice. Recent animal studies using nonhuman primates and guinea pigs have both estimated LD50s of approximately 10^7 Listeria monocytogenes colony forming units (cfu). The FAO/WHO estimated a human LD50 of 1.9 x 10^6 cfu based on data from a pregnant woman consuming contaminated soft cheese. We reevaluated risk based on dose-response curves from pregnant rhesus monkeys and guinea pigs. Using standard risk assessment methodology including hazard identification, exposure assessment, hazard characterization, and risk characterization, risk was calculated based on the new dose-response information. To compare models, we looked at mortality rate per serving at predicted doses ranging from 10^-4 to 10^12 L. monocytogenes cfu. Based on a serving of 10^6 L. monocytogenes cfu, the primate model predicts a death rate of 5.9 x 10^-1 compared to the FDA/USDA/CDC (fig. IV-12) predicted rate of 1.3 x 10^-7. Based on the guinea pig and primate models, the mortality rate calculated by the FDA/USDA/CDC is underestimated for this susceptible population.
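    A generic log-logistic dose-response curve parameterized by the LD50 illustrates the kind of model being compared above. The functional form and the slope parameter are illustrative assumptions, not the fitted models from these risk assessments.

```python
def log_logistic_mortality(dose, ld50, slope=1.0):
    """Probability of mortality at a given dose (cfu) under a generic
    log-logistic dose-response curve anchored at the LD50: by
    construction the probability is exactly 0.5 at dose == ld50, and it
    increases monotonically with dose for positive slope."""
    return 1.0 / (1.0 + (ld50 / dose) ** slope)
```

With an LD50 of 10^7 cfu, a 10^6-cfu serving sits below the midpoint of the curve, while a 10^9-cfu serving sits well above it; steeper slopes sharpen that transition.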

  10. Prediction of muscle activation for an eye movement with finite element modeling.

    PubMed

    Karami, Abbas; Eghtesad, Mohammad; Haghpanah, Seyyed Arash

    2017-10-01

    In this paper, 3D finite element (FE) modeling is employed to predict extraocular muscle activation and investigate force coordination in various motions of the eye orbit. A continuum hyperelastic constitutive model is employed for material description in dynamic modeling of the extraocular muscles (EOMs). Two significant features of this model are accurate mass modeling with the FE method and stimulation of the EOMs for motion through a muscle activation parameter. To validate the eye model, a forward dynamics simulation of eye motion is carried out by varying the muscle activation. Furthermore, to predict muscle activation in various eye motions, two different tracking-based inverse controllers are proposed. The performance of these two inverse controllers is assessed by the resulting muscle force magnitudes and muscle force coordination. The simulation results are compared with the available experimental data and well-known existing neurological laws; the comparison supports both the validation and the prediction results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Reconstructing genome-wide regulatory network of E. coli using transcriptome data and predicted transcription factor activities

    PubMed Central

    2011-01-01

    Background Gene regulatory networks play essential roles in living organisms to control growth, keep internal metabolism running and respond to external environmental changes. Understanding the connections and the activity levels of regulators is important for the research of gene regulatory networks. While relevance score based algorithms that reconstruct gene regulatory networks from transcriptome data can infer genome-wide gene regulatory networks, they are unfortunately prone to false positive results. Transcription factor activities (TFAs) quantitatively reflect the ability of the transcription factor to regulate target genes. However, classic relevance score based gene regulatory network reconstruction algorithms use models that do not include the TFA layer, thus missing a key regulatory element. Results This work integrates TFA prediction algorithms with relevance score based network reconstruction algorithms to reconstruct gene regulatory networks with improved accuracy over classic relevance score based algorithms. This method is called Gene expression and Transcription factor activity based Relevance Network (GTRNetwork). Different combinations of TFA prediction algorithms and relevance score functions have been applied to find the most efficient combination. When the integrated GTRNetwork method was applied to E. coli data, the reconstructed genome-wide gene regulatory network predicted 381 new regulatory links. This reconstructed gene regulatory network including the predicted new regulatory links shows promising biological significance. Many of the new links are verified by known TF binding site information, and many other links can be verified from the literature and databases such as EcoCyc. The reconstructed gene regulatory network is applied to a recent transcriptome analysis of E. coli during isobutanol stress.
In addition to the 16 significantly changed TFAs detected in the original paper, another 7 significantly changed TFAs have been detected by using our reconstructed network. Conclusions The GTRNetwork algorithm introduces the hidden layer TFA into classic relevance score-based gene regulatory network reconstruction processes. Integrating the TFA biological information with regulatory network reconstruction algorithms significantly improves detection of new links and reduces the rate of false positives. The application of GTRNetwork on E. coli gene transcriptome data gives a set of potential regulatory links with promising biological significance for isobutanol stress and other conditions. PMID:21668997
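    A minimal relevance-score step in the spirit of GTRNetwork can be sketched as follows: score each TF-gene pair by the similarity between the TF's predicted activity profile and the gene's expression profile, and keep high-scoring pairs as candidate links. Absolute Pearson correlation is used here as one illustrative score function; GTRNetwork evaluates several, and the names and threshold are assumptions.

```python
import numpy as np

def relevance_links(tfa, expression, tf_names, gene_names, threshold=0.8):
    """Generic relevance-score network reconstruction sketch.

    tfa:        rows of (predicted) TF activity profiles over conditions
    expression: rows of gene expression profiles over the same conditions
    Returns (tf, gene, score) triples whose absolute Pearson correlation
    meets the threshold.  Correlation is one of several possible
    relevance score functions, not necessarily the one GTRNetwork uses."""
    links = []
    for i, tf in enumerate(tf_names):
        for j, gene in enumerate(gene_names):
            r = np.corrcoef(tfa[i], expression[j])[0, 1]
            if abs(r) >= threshold:
                links.append((tf, gene, r))
    return links
```

The TFA layer matters because the correlation is computed against inferred activities rather than the TF's own transcript level, which often tracks regulation poorly.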

  12. Optimal interpolation analysis of leaf area index using MODIS data

    USGS Publications Warehouse

    Gu, Yingxin; Belair, Stephane; Mahfouf, Jean-Francois; Deblonde, Godelieve

    2006-01-01

    A simple data analysis technique for vegetation leaf area index (LAI) using Moderate Resolution Imaging Spectroradiometer (MODIS) data is presented. The objective is to generate LAI data that is appropriate for numerical weather prediction. A series of techniques and procedures which includes data quality control, time-series data smoothing, and simple data analysis is applied. The LAI analysis is an optimal combination of the MODIS observations and derived climatology, depending on their associated errors σo and σc. The “best estimate” LAI is derived from a simple three-point smoothing technique combined with a selection of maximum LAI (after data quality control) values to ensure a higher quality. The LAI climatology is a time smoothed mean value of the “best estimate” LAI during the years of 2002–2004. The observation error is obtained by comparing the MODIS observed LAI with the “best estimate” of the LAI, and the climatological error is obtained by comparing the “best estimate” of LAI with the climatological LAI value. The LAI analysis is the result of a weighting between these two errors. Demonstration of the method described in this paper is presented for the 15-km grid of Meteorological Service of Canada (MSC)'s regional version of the numerical weather prediction model. The final LAI analyses have a relatively smooth temporal evolution, which makes them more appropriate for environmental prediction than the original MODIS LAI observation data. They are also more realistic than the LAI data currently used operationally at the MSC which is based on land-cover databases.
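    The error-weighted combination of observation and climatology described above has the standard two-source optimal-interpolation form, sketched here under the assumption of independent errors; the paper's exact operational formulation may differ.

```python
def oi_combine(lai_obs, lai_clim, sigma_o, sigma_c):
    """Optimally combine an LAI observation with its climatology,
    weighting each source by the inverse of its error variance
    (standard two-source optimal-interpolation formula).  When
    sigma_o << sigma_c the analysis follows the observation, and
    vice versa."""
    wo = 1.0 / sigma_o ** 2
    wc = 1.0 / sigma_c ** 2
    return (wo * lai_obs + wc * lai_clim) / (wo + wc)
```

With equal error estimates the analysis is a simple average of the two sources; a much smaller observation error pulls the analysis toward the MODIS value.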

  13. Geochemical Modeling of Reactions and Partitioning of Trace Metals and Radionuclides during Titration of Contaminated Acidic Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Fan; Parker, Jack C.; Luo, Wensui

    2008-01-01

    Many geochemical reactions that control aqueous metal concentrations are directly affected by solution pH. However, changes in solution pH are strongly buffered by various aqueous phase and solid phase precipitation/dissolution and adsorption/desorption reactions. The ability to predict acid-base behavior of the soil-solution system is thus critical to predict metal transport under variable pH conditions. This study was undertaken to develop a practical generic geochemical modeling approach to predict aqueous and solid phase concentrations of metals and anions during conditions of acid or base additions. The method of Spalding and Spalding was utilized to model soil buffer capacity and pH-dependent cation exchange capacity by treating aquifer solids as a polyprotic acid. To simulate the dynamic and pH-dependent anion exchange capacity, the aquifer solids were simultaneously treated as a polyprotic base controlled by mineral precipitation/dissolution reactions. An equilibrium reaction model that describes aqueous complexation, precipitation, sorption and soil buffering with pH-dependent ion exchange was developed using HydroGeoChem v5.0 (HGC5). Comparison of model results with experimental titration data of pH, Al, Ca, Mg, Sr, Mn, Ni, Co, and SO4^2- for contaminated sediments indicated close agreement, suggesting that the model could potentially be used to predict the acid-base behavior of the sediment-solution system under variable pH conditions.

  14. Using remote sensing satellite data and artificial neural network for prediction of potato yield in Bangladesh

    NASA Astrophysics Data System (ADS)

    Akhand, Kawsar; Nizamuddin, Mohammad; Roytman, Leonid; Kogan, Felix

    2016-09-01

    Potato is one of the staple foods and cash crops in Bangladesh. It is widely cultivated in all districts and ranks second after rice in production. Bangladesh is the fourth largest potato producer in Asia and is among the world's top 15 potato producing countries. Weather conditions during the sowing, growing and harvesting period are favorable for potato cultivation. It is a winter crop, cultivated from November to March. Bangladesh is mainly an agriculture-based country in terms of agriculture's contribution to GDP, employment and consumption. Potato is a prominent crop in terms of production, internal demand and economic value. Bangladesh has substantial economic activity related to potato cultivation and marketing, especially the economic relations among farmers, traders, stockers and cold storage owners. Predicting potato yield before harvest is therefore important to the Government and to stakeholders in managing and controlling the potato market. Vegetation health indices derived from Advanced Very High Resolution Radiometer (AVHRR) satellite data, the vegetation condition index (VCI) and the temperature condition index (TCI), are used as predictors for early prediction. An artificial neural network (ANN) is used to develop the prediction model. The simulated results from this model are encouraging, with a prediction error of less than 10%.

  15. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.

  16. Toward Hypertension Prediction Based on PPG-Derived HRV Signals: a Feasibility Study.

    PubMed

    Lan, Kun-Chan; Raknim, Paweeya; Kao, Wei-Fong; Huang, Jyh-How

    2018-04-21

    Heart rate variability (HRV) is often used to assess the risk of cardiovascular disease, and the necessary data can be obtained via electrocardiography (ECG). However, collecting heart rate data via photoplethysmography (PPG) is now much easier. We investigate the feasibility of using the PPG-based heart rate to estimate HRV and predict diseases. We obtain three months of PPG-based heart rate data from subjects with and without hypertension, and calculate the HRV based on various forms of time and frequency domain analysis. We then apply a data mining technique to this estimated HRV data, to see if it is possible to correctly identify patients with hypertension. We use six HRV parameters to predict hypertension, and find that SDNN has the best predictive power. We show that early disease prediction is possible by collecting one's PPG-based heart rate information.
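    The record's best-performing HRV parameter, SDNN, is the standard deviation of beat-to-beat (NN) intervals. A rough estimate from heart-rate readings can be sketched as follows; converting each rate to an interval via 60000/bpm is a simplification of true beat-by-beat analysis, not the authors' exact pipeline.

```python
import statistics

def sdnn_from_heart_rate(bpm_samples):
    """Estimate SDNN (standard deviation of inter-beat intervals, in
    milliseconds) from a series of heart-rate readings such as those a
    PPG sensor reports.  Each rate in beats/min maps to an approximate
    interval of 60000/bpm ms; a clinical SDNN would instead use the
    individual inter-beat intervals themselves."""
    rr_ms = [60000.0 / bpm for bpm in bpm_samples]
    return statistics.stdev(rr_ms)
```

A perfectly steady heart rate yields an SDNN of zero, while readings of 60 and 75 bpm correspond to 1000 ms and 800 ms intervals and hence an SDNN of about 141 ms.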

  17. Statistical and engineering methods for model enhancement

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Jung

Models that describe the performance of physical processes are essential for quality prediction, experimental planning, process control and optimization. Engineering models based on the underlying physics/mechanics of the process, such as analytic models or finite element models, are widely used to capture its deterministic trend. However, stochastic randomness in the system may introduce discrepancies between physics-based model predictions and real observations. Alternatively, statistical models can be developed to obtain predictions purely from the data generated by the process; however, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based and statistical models to mitigate their individual drawbacks and provide better accuracy by combining the strengths of both. The proposed model enhancement methodologies comprise two streams: (1) a data-driven enhancement approach and (2) an engineering-driven enhancement approach. Through these efforts, more adequate models are obtained, leading to better performance in system forecasting, process monitoring and decision optimization. Among data-driven enhancement approaches, the Gaussian Process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we proposed a novel enhancement procedure, named “Minimal Adjustment”, which brings the physical model closer to the data by making minimal changes to it.
This is achieved by approximating the GP model by a linear regression model and then applying a simultaneous variable selection of the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Rather than enhancing the model from a data-driven perspective, an alternative approach is to adjust the model by incorporating additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are carried out through two applications to demonstrate the proposed methodologies. In the first application, which focuses on polymer composite quality, nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantification of its characteristics. In Chapter 3, we developed an engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize the nanoparticle dispersion status of nanocomposite polymer, which quantitatively represents the nanomaterial quality presented through image data. The model parameters are estimated through the Bayesian MCMC technique to overcome the challenge of the limited amount of accessible data due to time-consuming sampling schemes. The second application is to statistically calibrate the engineering-driven force models of the laser-assisted micro milling (LAMM) process, which facilitates a systematic understanding and optimization of targeted processes. In Chapter 4, the force prediction interval has been derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval.
To conclude, this dissertation is the research drawing attention to model enhancement, which has considerable impacts on modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and further applied to various applications. These research activities developed engineering compliant models for adequate system predictions based on observational data with complex variable relationships and uncertainty, which facilitate process planning, monitoring, and real-time control.

  18. Does tooth wear status predict ongoing sleep bruxism in 30-year-old Japanese subjects?

    PubMed

    Baba, Kazuyoshi; Haketa, Tadasu; Clark, Glenn T; Ohyama, Takashi

    2004-01-01

    This study investigated whether tooth wear status can predict bruxism level. Sixteen Japanese subjects (eight bruxers and eight age- and gender-matched controls; mean age 30 years) participated in this study. From dental casts of these subjects, tooth wear was scored using Murphy's method. Bruxism level in these subjects was also recorded for 5 consecutive nights in the subject's home environment using a force-based bruxism detection system. The relationship between the tooth wear scores and bruxism data was evaluated statistically. Correlation analysis between Murphy's scores for the maxillary and mandibular dental arches and the bruxism event duration score revealed no significant relationship between tooth wear and current bruxism. Tooth wear status is not predictive of ongoing bruxism level as measured by the force-based bruxism detection system in 30-year-old Japanese subjects.

  19. Synergies in the space of control variables within the equilibrium-point hypothesis.

    PubMed

    Ambike, S; Mattos, D; Zatsiorsky, V M; Latash, M L

    2016-02-19

    We use an approach rooted in the recent theory of synergies to analyze possible co-variation between two hypothetical control variables involved in finger force production based on the equilibrium-point (EP) hypothesis. These control variables are the referent coordinate (R) and apparent stiffness (C) of the finger. We tested a hypothesis that inter-trial co-variation in the {R; C} space during repeated, accurate force production trials stabilizes the fingertip force. This was expected to correspond to a relatively low amount of inter-trial variability affecting force and a high amount of variability keeping the force unchanged. We used the "inverse piano" apparatus to apply small and smooth positional perturbations to fingers during force production tasks. Across trials, R and C showed strong co-variation with the data points lying close to a hyperbolic curve. Hyperbolic regressions accounted for over 99% of the variance in the {R; C} space. Another analysis was conducted by randomizing the original {R; C} data sets and creating surrogate data sets that were then used to compute predicted force values. The surrogate sets always showed much higher force variance compared to the actual data, thus reinforcing the conclusion that finger force control was organized in the {R; C} space, as predicted by the EP hypothesis, and involved co-variation in that space stabilizing total force. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  20. Synergies in the space of control variables within the equilibrium-point hypothesis

    PubMed Central

    Ambike, Satyajit; Mattos, Daniela; Zatsiorsky, Vladimir M.; Latash, Mark L.

    2015-01-01

    We use an approach rooted in the recent theory of synergies to analyze possible co-variation between two hypothetical control variables involved in finger force production based on the equilibrium-point hypothesis. These control variables are the referent coordinate (R) and apparent stiffness (C) of the finger. We tested a hypothesis that inter-trial co-variation in the {R; C} space during repeated, accurate force production trials stabilizes the fingertip force. This was expected to correspond to a relatively low amount of inter-trial variability affecting force and a high amount of variability keeping the force unchanged. We used the “inverse piano” apparatus to apply small and smooth positional perturbations to fingers during force production tasks. Across trials, R and C showed strong co-variation with the data points lying close to a hyperbolic curve. Hyperbolic regressions accounted for over 99% of the variance in the {R; C} space. Another analysis was conducted by randomizing the original {R; C} data sets and creating surrogate data sets that were then used to compute predicted force values. The surrogate sets always showed much higher force variance compared to the actual data, thus reinforcing the conclusion that finger force control was organized in the {R; C} space, as predicted by the equilibrium-point hypothesis, and involved co-variation in that space stabilizing total force. PMID:26701299
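    The surrogate-data analysis in these two records can be sketched numerically: if {R; C} pairs co-vary along a hyperbola that holds force constant, shuffling the pairing across trials should inflate the force variance. The linear force law F = C(x - R) used below is an assumed simplification of the finger model, not the papers' full formulation.

```python
import numpy as np

def force_from_rc(R, C, x=0.0):
    # EP-style fingertip force: apparent stiffness times the deviation
    # of the actual coordinate x from the referent coordinate R.
    return C * (x - R)

def surrogate_force_variance(R, C, x=0.0, seed=0):
    """Compare force variance for the original {R; C} pairing against a
    surrogate set in which R is randomly re-paired with C across trials,
    mirroring the randomization test described in the abstract."""
    rng = np.random.default_rng(seed)
    original = np.var(force_from_rc(R, C, x))
    shuffled = np.var(force_from_rc(rng.permutation(R), C, x))
    return original, shuffled
```

For pairs lying exactly on the hyperbola C = F/(x - R), the original variance is near zero while the surrogate variance is large, which is the signature of a force-stabilizing synergy.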

  1. TSAFE Interface Control Document v 2.0

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Bach, Ralph E.

    2013-01-01

    This document specifies the data interface for TSAFE, the Tactical Separation-Assured Flight Environment. TSAFE is a research prototype of a software application program for alerting air traffic controllers to imminent conflicts in enroute airspace. It is intended for Air Route Traffic Control Centers ("Centers") in the U.S. National Airspace System. It predicts trajectories for approximately 3 minutes into the future, searches for conflicts, and sends data about predicted conflicts to the client, which uses the data to alert an air traffic controller of conflicts. TSAFE itself does not provide a graphical user interface.
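    The kind of short-horizon conflict probe TSAFE performs can be illustrated with a minimal dead-reckoning sketch. The straight-line trajectory model, the 10-second probe step, and the 5 nmi horizontal separation value are illustrative assumptions, not TSAFE's actual algorithms or separation standards.

```python
def predict_conflicts(aircraft, horizon_s=180, sep_nm=5.0, step_s=10):
    """Flag aircraft pairs predicted to lose horizontal separation
    within the look-ahead horizon (~3 minutes, as in the TSAFE
    description).  `aircraft` maps an ID to (x, y, vx, vy) with
    positions in nmi and velocities in nmi/s; each aircraft is
    extrapolated along a straight line (dead reckoning)."""
    conflicts = set()
    for t in range(0, horizon_s + 1, step_s):
        pos = {ident: (x + vx * t, y + vy * t)
               for ident, (x, y, vx, vy) in aircraft.items()}
        ids = sorted(pos)
        for i, a in enumerate(ids):
            for b in ids[i + 1:]:
                dx = pos[a][0] - pos[b][0]
                dy = pos[a][1] - pos[b][1]
                if (dx * dx + dy * dy) ** 0.5 < sep_nm:
                    conflicts.add((a, b))
    return conflicts
```

Two head-on aircraft 30 nmi apart at 360 knots each close that gap in about 150 s, so a 3-minute probe flags the pair, while distant traffic is left alone.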

  2. Discrete return lidar-based prediction of leaf area index in two conifer forests

    Treesearch

    Jennifer L. R. Jensen; Karen S. Humes; Lee A. Vierling; Andrew T. Hudak

    2008-01-01

    Leaf area index (LAI) is a key forest structural characteristic that serves as a primary control for exchanges of mass and energy within a vegetated ecosystem. Most previous attempts to estimate LAI from remotely sensed data have relied on empirical relationships between field-measured observations and various spectral vegetation indices (SVIs) derived from optical...

  3. Parental Participation and Retention in an Alcohol Preventive Family-Focused Programme

    ERIC Educational Resources Information Center

    Skarstrand, Eva; Branstrom, Richard; Sundell, Knut; Kallmen, Hakan; Andreassen, Sven

    2009-01-01

    Purpose: The purpose of this paper is to examine factors predicting parental participation and retention in a Swedish version of the Strengthening Families Programme (SFP). Design/methodology/approach: This study is based on data from a randomised controlled trial to evaluate the effects of the Swedish version of the SFP. The sample involves 441…

  4. Modeling urban coastal flood severity from crowd-sourced flood reports using Poisson regression and Random Forest

    NASA Astrophysics Data System (ADS)

    Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.

    2018-04-01

    Sea level rise has already caused more frequent and severe coastal flooding, and this trend will likely continue. Flood prediction is an essential part of a coastal city's capacity to adapt to and mitigate this growing problem. Complex coastal urban hydrological systems, however, do not always lend themselves easily to physically-based flood prediction approaches. This paper presents a method for using a data-driven approach to estimate flood severity in an urban coastal setting using crowd-sourced data, a non-traditional but growing data source, along with environmental observation data. Two data-driven models, Poisson regression and Random Forest regression, are trained to predict the number of flood reports per storm event as a proxy for flood severity, given extensive environmental data (i.e., rainfall, tide, groundwater table level, and wind conditions) as input. The method is demonstrated using data from Norfolk, Virginia USA from September 2010 to October 2016. Quality-controlled, crowd-sourced street flooding reports ranging from 1 to 159 per storm event for 45 storm events are used to train and evaluate the models. Random Forest performed better than Poisson regression at predicting the number of flood reports and had a lower false negative rate. From the Random Forest model, total cumulative rainfall was by far the most dominant input variable in predicting flood severity, followed by low tide and lower low tide. These methods serve as a first step toward using data-driven methods for spatially and temporally detailed coastal urban flood prediction.
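    A minimal scikit-learn sketch of the Random Forest step, using synthetic stand-ins for the storm-event drivers (rainfall, tide, groundwater, wind) and flood-report counts rather than the Norfolk data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# One row per storm event; column 0 plays the role of cumulative rainfall and
# deliberately dominates the synthetic report counts, as in the paper's finding.
X = rng.random((45, 4))
y = np.rint(150 * X[:, 0] ** 2 + 5 * X[:, 1] + rng.normal(0, 2, 45)).clip(0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
# Feature importances reveal which environmental driver dominates severity.
importances = model.feature_importances_
```

With real data, the same `feature_importances_` inspection is what surfaces rainfall as the dominant predictor.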

  5. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    PubMed

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models by applying system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, allowing the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647-1661, 2017.
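    The step-change identification idea can be sketched as fitting a first-order ARX model to input-output data by least squares; the input schedule, true parameters, and noise level below are invented for illustration, not the galactose/%galactosylation process:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate a plant responding to serialized step changes in the input,
# then identify y[k] = a*y[k-1] + b*u[k-1] from the recorded data.
a_true, b_true = 0.9, 0.5
u = np.repeat([0.0, 1.0, 0.5, 1.5], 50)          # serialized step-change input
y = np.zeros(u.size)
for k in range(1, u.size):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + rng.normal(0, 0.01)

# Stack regressors [y[k-1], u[k-1]] and solve for (a, b) by least squares.
X = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
```

The identified `(a_hat, b_hat)` pair is exactly the kind of low-order model a model predictive controller can then use for set-point tracking.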

  6. Predictors of treatment seeking intention among people with cough in East Wollega, Ethiopia based on the theory of planned behavior: a community based cross-sectional study.

    PubMed

    Addisu, Yohannes; Birhanu, Zewdie; Tilahun, Dejen; Assefa, Tsion

    2014-04-01

    Early treatment seeking for cough is crucial in the prevention and control of tuberculosis. This study was intended to assess the treatment seeking intention of people with cough of more than two weeks, and to identify its predictors. A community based cross-sectional study was conducted among 763 individuals with cough of more than two weeks in East Wollega Zone from March 10 to April 16, 2011. Study participants were selected from eighteen villages by a cluster sampling method. Data collection instruments were developed according to the standard guideline of the theory of planned behavior. The data were analyzed with SPSS 16.0. Multiple linear regression was used to identify predictors. The mean score of intention was found to be 12.6 (SD=2.8) (range of possible scores=3-15). Knowledge (β=0.14, 95%CI: 0.07-0.2), direct attitude (β=0.31, 95%CI: 0.25-0.35), belief-based attitude (β=0.03, 95%CI: 0.02-0.06) and perceived subjective norm (β=0.22, 95%CI: 0.13-0.31) positively predicted treatment seeking intention. However, perceived behavioral control and control belief were not significantly associated with treatment seeking intention (p>0.05). Being a smoker (β=-0.97, 95%CI: -1.65-(-0.37)) and higher family income (β=-0.06, 95%CI: -0.07-(-0.01)) were significantly associated with lower treatment seeking intention. The TPB significantly predicted treatment seeking intention among the study participants. Attitude and salient beliefs held by the respondents play an important role and should be given emphasis in the prevention and control of tuberculosis.
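    A hedged sketch of the multiple linear regression step, with synthetic TPB-style predictors whose true coefficients echo the reported betas purely for illustration (this is not the study's data):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)

# Regress intention on three stand-in predictors: knowledge, direct attitude,
# and perceived subjective norm. Coefficients are illustrative.
n = 763
X = rng.normal(size=(n, 3))
y = 12.6 + 0.14 * X[:, 0] + 0.31 * X[:, 1] + 0.22 * X[:, 2] + rng.normal(0, 1.0, n)

fit = LinearRegression().fit(X, y)
coefs = fit.coef_            # estimated betas for the three predictors
```

With a sample this size the estimated betas land close to the generating values, mirroring how the study's regression isolates each construct's contribution.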

  7. Comparisons of Predictions of the XB-70-1 Longitudinal Stability and Control Derivatives with Flight Results for Six Flight Conditions

    NASA Technical Reports Server (NTRS)

    Wolowicz, C. H.; Yancey, R. B.

    1973-01-01

    Preliminary correlations of flight-determined and predicted stability and control characteristics of the XB-70-1 reported in NASA TN D-4578 were subject to uncertainties in several areas which necessitated a review of prediction techniques particularly for the longitudinal characteristics. Reevaluation and updating of the original predictions, including aeroelastic corrections, for six specific flight-test conditions resulted in improved correlations of static pitch stability with flight data. The original predictions for the pitch-damping derivative, on the other hand, showed better correlation with flight data than the updated predictions. It appears that additional study is required in the application of aeroelastic corrections to rigid model wind-tunnel data and the theoretical determination of dynamic derivatives for this class of aircraft.

  8. Noise transmission and reduction in turboprop aircraft

    NASA Astrophysics Data System (ADS)

    MacMartin, Douglas G.; Basso, Gordon L.; Leigh, Barry

    1994-09-01

    There is considerable interest in reducing the cabin noise environment in turboprop aircraft. Various approaches have been considered at deHaviland Inc., including passive tuned-vibration absorbers, speaker-based noise cancellation, and structural vibration control of the fuselage. These approaches will be discussed briefly. In addition to controlling the noise, a method of predicting the internal noise is required both to evaluate potential noise reduction approaches, and to validate analytical design models. Instead of costly flight tests, or carrying out a ground simulation of the propeller pressure field, a much simpler reciprocal technique can be used. A capacitive scanner is used to measure the fuselage vibration response on a deHaviland Dash-8 fuselage, due to an internal noise source. The approach is validated by comparing this reciprocal noise transmission measurement with the direct measurement. The fuselage noise transmission information is then combined with computer predictions of the propeller pressure field data to predict the internal noise at two points.

  9. Development and Implementation of a Hardware In-the-Loop Test Bed for Unmanned Aerial Vehicle Control Algorithms

    NASA Technical Reports Server (NTRS)

    Nyangweso, Emmanuel; Bole, Brian

    2014-01-01

    Successful prediction and management of battery life using prognostic algorithms through ground and flight tests is important for performance evaluation of electrical systems. This paper details the design of test beds suitable for replicating loading profiles that would be encountered in deployed electrical systems. The test bed data will be used to develop and validate prognostic algorithms for predicting battery discharge time and battery failure time. Online battery prognostic algorithms will enable health management strategies. The platform used for algorithm demonstration is the EDGE 540T electric unmanned aerial vehicle (UAV). The fully designed test beds developed and detailed in this paper can be used to conduct battery life tests by controlling current and recording voltage and temperature to develop a model that makes a prediction of end-of-charge and end-of-life of the system based on rapid state of health (SOH) assessment.

  10. Multi-model predictive control based on LMI: from the adaptation of the state-space model to the analytic description of the control law

    NASA Astrophysics Data System (ADS)

    Falugi, P.; Olaru, S.; Dumur, D.

    2010-08-01

    This article proposes an explicit robust predictive control solution based on linear matrix inequalities (LMIs). The considered predictive control strategy uses different local descriptions of the system dynamics and uncertainties and thus allows the handling of less conservative input constraints. The computed control law guarantees constraint satisfaction and asymptotic stability. The technique is effective for a class of nonlinear systems embedded into polytopic models. A detailed discussion of the procedures which adapt the partition of the state space is presented. For the practical implementation, the construction of suitable (explicit) descriptions of the control law is described through concrete algorithms.

  11. Extended active disturbance rejection controller

    NASA Technical Reports Server (NTRS)

    Tian, Gang (Inventor); Gao, Zhiqiang (Inventor)

    2012-01-01

    Multiple designs, systems, methods and processes for controlling a system or plant using an extended active disturbance rejection control (ADRC) based controller are presented. The extended ADRC controller accepts sensor information from the plant. The sensor information is used in conjunction with an extended state observer in combination with a predictor that estimates and predicts the current state of the plant and a co-joined estimate of the system disturbances and system dynamics. The extended state observer estimates and predictions are used in conjunction with a control law that generates an input to the system based in part on the extended state observer estimates and predictions as well as a desired trajectory for the plant to follow.

  12. Extended Active Disturbance Rejection Controller

    NASA Technical Reports Server (NTRS)

    Gao, Zhiqiang (Inventor); Tian, Gang (Inventor)

    2016-01-01

    Multiple designs, systems, methods and processes for controlling a system or plant using an extended active disturbance rejection control (ADRC) based controller are presented. The extended ADRC controller accepts sensor information from the plant. The sensor information is used in conjunction with an extended state observer in combination with a predictor that estimates and predicts the current state of the plant and a co-joined estimate of the system disturbances and system dynamics. The extended state observer estimates and predictions are used in conjunction with a control law that generates an input to the system based in part on the extended state observer estimates and predictions as well as a desired trajectory for the plant to follow.

  13. Extended Active Disturbance Rejection Controller

    NASA Technical Reports Server (NTRS)

    Tian, Gang (Inventor); Gao, Zhiqiang (Inventor)

    2014-01-01

    Multiple designs, systems, methods and processes for controlling a system or plant using an extended active disturbance rejection control (ADRC) based controller are presented. The extended ADRC controller accepts sensor information from the plant. The sensor information is used in conjunction with an extended state observer in combination with a predictor that estimates and predicts the current state of the plant and a co-joined estimate of the system disturbances and system dynamics. The extended state observer estimates and predictions are used in conjunction with a control law that generates an input to the system based in part on the extended state observer estimates and predictions as well as a desired trajectory for the plant to follow.

  14. Model Predictive Control Based Motion Drive Algorithm for a Driving Simulator

    NASA Astrophysics Data System (ADS)

    Rehmatullah, Faizan

    In this research, we develop a model predictive control based motion drive algorithm for the driving simulator at Toronto Rehabilitation Institute. Motion drive algorithms exploit the limitations of the human vestibular system to formulate a perception of motion within the constrained workspace of a simulator. In the absence of visual cues, the human perception system is unable to distinguish between acceleration and the force of gravity. The motion drive algorithm determines control inputs to displace the simulator platform and, by using the resulting inertial forces and angular rates, creates the perception of motion. By using model predictive control, we can optimize the use of the simulator workspace for every maneuver while reproducing the perceived motion of the vehicle. Because model predictive control can handle nonlinear constraints, it also allows workspace limitations to be incorporated directly.

  15. Predicted and flight test results of the performance, stability and control of the space shuttle from reentry to landing

    NASA Technical Reports Server (NTRS)

    Kirsten, P. W.; Richardson, D. F.; Wilson, C. M.

    1983-01-01

    Aerodynamic performance, stability and control data obtained from the first five reentries of the Space Shuttle orbiter are given. Flight results are compared to predicted data from Mach 26.4 to Mach 0.4. Differences between flight and predicted data, as well as probable causes for the discrepancies, are given.

  16. Communication Dynamics in Finite Capacity Social Networks

    NASA Astrophysics Data System (ADS)

    Haerter, Jan O.; Jamtveit, Bjørn; Mathiesen, Joachim

    2012-10-01

    In communication networks, structure and dynamics are tightly coupled. The structure controls the flow of information and is itself shaped by the dynamical process of information exchanged between nodes. In order to reconcile structure and dynamics, a generic model, based on the local interaction between nodes, is considered for the communication in large social networks. In agreement with data from a large human organization, we show that the flow is non-Markovian and controlled by the temporal limitations of individuals. We confirm the versatility of our model by predicting simultaneously the degree-dependent node activity, the balance between information input and output of nodes, and the degree distribution. Finally, we quantify the limitations to network analysis when it is based on data sampled over a finite period of time.

  17. Remaining useful life assessment of lithium-ion batteries in implantable medical devices

    NASA Astrophysics Data System (ADS)

    Hu, Chao; Ye, Hui; Jain, Gaurav; Schmidt, Craig

    2018-01-01

    This paper presents a prognostic study on lithium-ion batteries in implantable medical devices, in which a hybrid data-driven/model-based method is employed for remaining useful life assessment. The method is developed on and evaluated against data from two sets of lithium-ion prismatic cells used in implantable applications exhibiting distinct fade performance: 1) eight cells from Medtronic, PLC whose rates of capacity fade appear to be stable and gradually decrease over a 10-year test duration; and 2) eight cells from Manufacturer X whose rates appear to be greater and show sharp increase after some period over a 1.8-year test duration. The hybrid method enables online prediction of remaining useful life for predictive maintenance/control. It consists of two modules: 1) a sparse Bayesian learning module (data-driven) for inferring capacity from charge-related features; and 2) a recursive Bayesian filtering module (model-based) for updating empirical capacity fade models and predicting remaining useful life. A generic particle filter is adopted to implement recursive Bayesian filtering for the cells from the first set, whose capacity fade behavior can be represented by a single fade model; a multiple model particle filter with fixed-lag smoothing is proposed for the cells from the second data set, whose capacity fade behavior switches between multiple fade models.
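    A generic bootstrap particle filter of the kind mentioned can be sketched for tracking capacity fade; the multiplicative fade model and all noise levels are illustrative assumptions, not the paper's empirical models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative fade model: each cycle removes a fixed fraction of capacity.
def fade(cap, k=0.02):
    return cap * (1.0 - k)

T, n = 50, 500
true_cap, caps = 1.0, []
for _ in range(T):
    true_cap = fade(true_cap)
    caps.append(true_cap + rng.normal(0, 0.005))      # noisy capacity readings

particles = rng.uniform(0.9, 1.1, n)                  # initial capacity hypotheses
for z in caps:
    particles = fade(particles) + rng.normal(0, 0.002, n)   # propagate the model
    w = np.exp(-0.5 * ((z - particles) / 0.005) ** 2)       # measurement likelihood
    w /= w.sum()
    particles = rng.choice(particles, n, p=w)               # resample by weight

estimate = particles.mean()                           # filtered capacity estimate
```

Remaining-useful-life prediction would then extrapolate the particle cloud forward under the fade model until it crosses an end-of-life capacity threshold.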

  18. Combining Knowledge and Data Driven Insights for Identifying Risk Factors using Electronic Health Records

    PubMed Central

    Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.

    2012-01-01

    Background: The ability to identify the risk factors related to an adverse condition, e.g., a heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge and data driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not among the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting onset of HF using the performance metric, the Area Under the ROC Curve (AUC). The combined risk factors from knowledge and data significantly outperform knowledge-based risk factors alone. Furthermore, the additional risk factors are confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge and data driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
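    The core idea, penalizing candidate factors while leaving known factors effectively unpenalized, can be sketched with an L1 (Lasso) regression; the column-scaling trick and all data below are illustrative assumptions, not the paper's regularization scheme or EHR data:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)

# Known (guideline) factors get a tiny effective L1 penalty via column scaling;
# candidate data-driven factors carry the full penalty, so only a sparse,
# complementary subset survives selection.
n, n_known, n_cand = 200, 2, 20
Xk = rng.normal(size=(n, n_known))        # known risk factors
Xc = rng.normal(size=(n, n_cand))         # candidate risk factors
y = 1.5 * Xk[:, 0] + 1.0 * Xk[:, 1] + 2.0 * Xc[:, 3] + rng.normal(0, 0.1, n)

scale = 100.0                             # big columns -> small effective penalty
X = np.hstack([Xk * scale, Xc])
fit = Lasso(alpha=0.05, max_iter=10000).fit(X, y)
selected = np.nonzero(fit.coef_[n_known:])[0]   # candidate factors retained
```

Here only the one truly informative candidate survives, the "complementary risk factor" behaviour the abstract describes.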

  19. Application of the Refined Integral Method in the mathematical modeling of drug delivery from one-layer torus-shaped devices.

    PubMed

    Helbling, Ignacio M; Ibarra, Juan C D; Luna, Julio A

    2012-02-28

    A mathematical model of the controlled release of drug from one-layer torus-shaped devices is presented. Analytical solutions based on the Refined Integral Method (RIM) are derived. The validity and utility of the model are ascertained by comparing the simulation results with matrix-type vaginal ring experimental release data reported in the literature. For the comparisons, the pair-wise procedure is used to quantitatively measure the fit of the theoretical predictions to the experimental data. A good agreement between the model predictions and the experimental data is observed. A comparison with a previously reported model is also presented. More accurate results are achieved for small A/C(s) ratios. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. A preliminary correlation of the orbiter stability and control aerodynamics from the first two Space Shuttle flights /STS-1 & 2/ with preflight predictions

    NASA Technical Reports Server (NTRS)

    Underwood, J. M.; Cooke, D. R.

    1982-01-01

    A correlation of the stability and control derivatives from flight (STS-1 & 2) with preflight predictions is presented across the Mach range from 0.9 to 25. Flight data obtained from specially designed flight test maneuvers as well as from conventional bank maneuvers generally indicate good agreement with predicted data. However, the vehicle appears to be lateral-directionally more stable than predicted in the transonic regime. Aerodynamic 'reasonableness tests' are employed to test for validity of flight data. The importance of testing multiple models in multiple wind tunnels at the same test conditions is demonstrated.

  1. Coordinated path-following and direct yaw-moment control of autonomous electric vehicles with sideslip angle estimation

    NASA Astrophysics Data System (ADS)

    Guo, Jinghua; Luo, Yugong; Li, Keqiang; Dai, Yifan

    2018-05-01

    This paper presents a novel coordinated path following system (PFS) and direct yaw-moment control (DYC) scheme for autonomous electric vehicles via a hierarchical control technique. In the high-level control law design, a new fuzzy factor is introduced based on the magnitude of the longitudinal velocity of the vehicle, and a linear time varying (LTV)-based model predictive controller (MPC) is proposed to acquire the wheel steering angle and external yaw moment. Then, a pseudo-inverse (PI) low-level control allocation law is designed to realize tracking of the desired external yaw moment and management of the redundant tire actuators. Furthermore, the vehicle sideslip angle is estimated by the data fusion of low-cost GPS and INS, obtained by integrating modified INS signals with GPS signals as the initial value. Finally, the effectiveness of the proposed control system is validated by simulation and experimental tests.
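    The pseudo-inverse allocation step admits a compact sketch: distribute a desired yaw moment across four redundant wheel-torque actuators via the minimum-norm solution. The effectiveness vector and half-track geometry below are assumed for illustration, not taken from the paper:

```python
import numpy as np

# Yaw-moment effectiveness of each wheel torque (FL, FR, RL, RR), assuming a
# symmetric half-track; left wheels contribute negative moment, right positive.
half_track = 0.8                                   # m (assumed)
B = np.array([[-half_track, half_track, -half_track, half_track]])

M_des = np.array([120.0])                          # desired yaw moment, N*m
torques = np.linalg.pinv(B) @ M_des                # minimum-norm torque split
achieved = B @ torques                             # allocation reproduces M_des
```

The pseudo-inverse spreads the effort evenly over the redundant actuators, which is why it is a common low-level companion to a high-level MPC.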

  2. Datamining approaches for modeling tumor control probability.

    PubMed

    Naqa, Issam El; Deasy, Joseph O; Mu, Yi; Huang, Ellen; Hope, Andrew J; Lindsay, Patricia E; Apte, Aditya; Alaly, James; Bradley, Jeffrey D

    2010-11-01

    Tumor control probability (TCP) to radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Several datamining approaches are discussed that include dose-volume metrics, equivalent uniform dose, mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs=0.68 on leave-one-out testing compared to logistic regression (rs=0.4), Poisson-based TCP (rs=0.33), and the cell kill equivalent uniform dose model (rs=0.17). The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
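    A sketch of the leave-one-out evaluation scored with a Spearman rank correlation, using two synthetic features in place of GTV volume and V75 and simulated outcomes rather than the NSCLC dataset:

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(3)

# 56 "patients" with two dosimetric stand-in features and a binary outcome.
X = rng.normal(size=(56, 2))
y = ((X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 56)) > 0).astype(int)

# Leave-one-out: refit the RBF-kernel SVM 56 times, score the held-out case.
preds = np.empty(56)
for train, test in LeaveOneOut().split(X):
    clf = SVC(kernel="rbf").fit(X[train], y[train])
    preds[test] = clf.decision_function(X[test])

rs, _ = spearmanr(preds, y)        # rank correlation on held-out scores
```

Comparing `rs` across candidate models on the same held-out scores is the model-selection pattern the abstract describes.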

  3. A review of propeller noise prediction methodology: 1919-1994

    NASA Technical Reports Server (NTRS)

    Metzger, F. Bruce

    1995-01-01

    This report summarizes a review of the literature regarding propeller noise prediction methods. The review is divided into six sections: (1) early methods; (2) more recent methods based on earlier theory; (3) more recent methods based on the Acoustic Analogy; (4) more recent methods based on Computational Acoustics; (5) empirical methods; and (6) broadband methods. The report concludes that there are a large number of noise prediction procedures available which vary markedly in complexity. Deficiencies in accuracy of methods in many cases may be related, not to the methods themselves, but the accuracy and detail of the aerodynamic inputs used to calculate noise. The steps recommended in the report to provide accurate and easy to use prediction methods are: (1) identify reliable test data; (2) define and conduct test programs to fill gaps in the existing data base; (3) identify the most promising prediction methods; (4) evaluate promising prediction methods relative to the data base; (5) identify and correct the weaknesses in the prediction methods, including lack of user friendliness, and include features now available only in research codes; (6) confirm the accuracy of improved prediction methods to the data base; and (7) make the methods widely available and provide training in their use.

  4. Predicting the particle size distribution of eroded sediment using artificial neural networks.

    PubMed

    Lagos-Avid, María Paz; Bonilla, Carlos A

    2017-03-01

    Water erosion causes soil degradation and nonpoint pollution. Pollutants are primarily transported on the surfaces of fine soil and sediment particles. Several physically-based models and empirical equations have been developed to estimate the size distribution of the sediment leaving the field. Usually, physically-based models require a large amount of data, sometimes exceeding the amount of available data in the modeled area. Conversely, empirical equations do not always predict the sediment composition associated with individual events and may require data that are not always available. Therefore, the objective of this study was to develop a model to predict the particle size distribution (PSD) of eroded soil. A total of 41 erosion events from 21 soils were used. These data were compiled from previous studies. Correlation and multiple regression analyses were used to identify the main variables controlling sediment PSD. These variables were the particle size distribution in the soil matrix, the antecedent soil moisture condition, soil erodibility, and hillslope geometry. With these variables, an artificial neural network was calibrated using data from 29 events (r2 = 0.98, 0.97, and 0.86 for sand, silt, and clay in the sediment, respectively) and then validated and tested on 12 events (r2 = 0.74, 0.85, and 0.75 for sand, silt, and clay in the sediment, respectively). The artificial neural network was compared with three empirical models. The network presented better performance in predicting sediment PSD and differentiating rain-runoff events in the same soil. In addition to the quality of the particle distribution estimates, this model requires a small number of easily obtained variables, providing a convenient routine for predicting PSD in eroded sediment in other pollutant transport models. Copyright © 2017 Elsevier B.V. All rights reserved.
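    A hedged sketch of the ANN step: a small multi-output network mapping the four named inputs (soil-matrix PSD, antecedent moisture, erodibility, hillslope geometry) to sand/silt/clay fractions. The data and the relations between inputs and fractions are synthetic, purely for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

# Synthetic events: four predictors in [0, 1], targets normalised to fractions.
n = 300
X = rng.random((n, 4))
raw = np.stack([0.5 + 0.4 * X[:, 0],       # sand, tied to soil-matrix PSD
                0.3 + 0.2 * X[:, 1],       # silt, tied to antecedent moisture
                0.2 + 0.1 * X[:, 2]],      # clay, tied to erodibility
               axis=1)
Y = raw / raw.sum(axis=1, keepdims=True)   # sand + silt + clay = 1

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                   random_state=0).fit(X, Y)
pred = net.predict(X[:5])                  # predicted [sand, silt, clay] rows
```

The appeal noted in the abstract is that such a network needs only a handful of easily obtained inputs rather than a full physically-based parameterization.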

  5. Exposure–response model for sibutramine and placebo: suggestion for application to long-term weight-control drug development

    PubMed Central

    Han, Seunghoon; Jeon, Sangil; Hong, Taegon; Lee, Jongtae; Bae, Soo Hyeon; Park, Wan-su; Park, Gab-jin; Youn, Sunil; Jang, Doo Yeon; Kim, Kyung-Soo; Yim, Dong-Seok

    2015-01-01

    No wholly successful weight-control drugs have been developed to date, despite the tremendous demand. We present an exposure–response model of sibutramine mesylate that can be applied during clinical development of other weight-control drugs. Additionally, we provide a model-based evaluation of sibutramine efficacy. Data from a double-blind, randomized, placebo-controlled, multicenter study were used (N=120). Subjects in the treatment arm were initially given 8.37 mg sibutramine base daily, and those who lost <2 kg after 4 weeks’ treatment were escalated to 12.55 mg. The duration of treatment was 24 weeks. Drug concentration and body weight were measured predose and at 4 weeks, 8 weeks, and 24 weeks after treatment initiation. Exposure and response to sibutramine, including the placebo effect, were modeled using NONMEM 7.2. An asymptotic model approaching the final body weight was chosen to describe the time course of weight loss. Extent of weight loss was described successfully using a sigmoidal exposure–response relationship of the drug with a constant placebo effect in each individual. The placebo effect was influenced by subjects’ sex and baseline body mass index. Maximal weight loss was predicted to occur around 1 year after treatment initiation. The difference in mean weight loss between the sibutramine (daily 12.55 mg) and placebo groups was predicted to be 4.5% in a simulation of 1 year of treatment, with considerable overlap of prediction intervals. Our exposure–response model, which included the placebo effect, is the first example of a quantitative model that can be used to predict the efficacy of weight-control drugs. Similar approaches can help decision-making during clinical development of novel weight-loss drugs. PMID:26392753

  6. Exposure-response model for sibutramine and placebo: suggestion for application to long-term weight-control drug development.

    PubMed

    Han, Seunghoon; Jeon, Sangil; Hong, Taegon; Lee, Jongtae; Bae, Soo Hyeon; Park, Wan-su; Park, Gab-jin; Youn, Sunil; Jang, Doo Yeon; Kim, Kyung-Soo; Yim, Dong-Seok

    2015-01-01

    No wholly successful weight-control drugs have been developed to date, despite the tremendous demand. We present an exposure-response model of sibutramine mesylate that can be applied during clinical development of other weight-control drugs. Additionally, we provide a model-based evaluation of sibutramine efficacy. Data from a double-blind, randomized, placebo-controlled, multicenter study were used (N=120). Subjects in the treatment arm were initially given 8.37 mg sibutramine base daily, and those who lost <2 kg after 4 weeks' treatment were escalated to 12.55 mg. The duration of treatment was 24 weeks. Drug concentration and body weight were measured predose and at 4 weeks, 8 weeks, and 24 weeks after treatment initiation. Exposure and response to sibutramine, including the placebo effect, were modeled using NONMEM 7.2. An asymptotic model approaching the final body weight was chosen to describe the time course of weight loss. Extent of weight loss was described successfully using a sigmoidal exposure-response relationship of the drug with a constant placebo effect in each individual. The placebo effect was influenced by subjects' sex and baseline body mass index. Maximal weight loss was predicted to occur around 1 year after treatment initiation. The difference in mean weight loss between the sibutramine (daily 12.55 mg) and placebo groups was predicted to be 4.5% in a simulation of 1 year of treatment, with considerable overlap of prediction intervals. Our exposure-response model, which included the placebo effect, is the first example of a quantitative model that can be used to predict the efficacy of weight-control drugs. Similar approaches can help decision-making during clinical development of novel weight-loss drugs.
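    The sigmoidal exposure-response relationship on top of a constant placebo/baseline effect can be sketched as a four-parameter Emax-type fit; the functional form, units, and simulated data are illustrative assumptions, not the study's NONMEM model:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

# Baseline effect e0 plus a sigmoidal drug effect with maximum emax,
# half-maximal exposure ec50, and Hill coefficient.
def response(conc, e0, emax, ec50, hill):
    return e0 + emax * conc**hill / (ec50**hill + conc**hill)

conc = np.linspace(0.1, 20.0, 40)                      # exposure, arbitrary units
obs = response(conc, 1.0, 4.0, 5.0, 2.0) + rng.normal(0, 0.1, conc.size)

popt, _ = curve_fit(response, conc, obs, p0=[0.5, 3.0, 4.0, 1.0],
                    bounds=(0.0, [5.0, 10.0, 20.0, 5.0]))
e0, emax, ec50, hill = popt
```

Once fitted, simulating the curve at a candidate dose's exposure is what supports the kind of one-year efficacy prediction described above.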

  7. Long-Term Sustainability of Evidence-Based Prevention Interventions and Community Coalitions Survival: a Five and One-Half Year Follow-up Study.

    PubMed

    Johnson, Knowlton; Collins, David; Shamblen, Steve; Kenworthy, Tara; Wandersman, Abraham

    2017-07-01

    This study examines (1) coalition survival, (2) prevalence of evidence-based prevention interventions (EBPIs) to reduce substance abuse implemented as part of the Tennessee Strategic Prevention Framework (SPF) State Incentive Grant (SIG), (3) EBPI sustainability, and (4) factors that predict EBPI sustainability. Secondary data were collected on 27 SPF SIG-funded coalitions and 88 EBPI and non-EBPI implementations. Primary data were collected by a telephone interview/web survey five and one-half years after the SPF SIG ended. Results from secondary data show that 25 of the 27 coalitions survived beyond the SPF SIG for one to five and one-half years; 19 coalitions (70%) were still active five and one-half years later. Further, 88 EBPIs and non-EBPIs were implemented by 27 county SPF SIG coalitions. Twenty-one of the 27 coalitions (78%) implemented one to three EBPIs, totaling 37 EBPI implementations. Based on primary survey data on 29 of the 37 EBPI implementations, 28 EBPIs (97%) were sustained between two and five and one-half years, while 22 EBPI implementations (76%) were sustained for five and one-half years. When controlling for variability among coalitions (nesting of EBPIs in coalitions), an increase in data resources (availability of five types of prevention data) was a strong predictor of the length of EBPI sustainability. Positive change in extramural funding resources and level of expertise during SPF SIG implementation, as well as the level of coalition formalization at the end of the SPF SIG, also predicted the length of EBPI sustainability. One intervention attribute (trialability) likewise predicted length of sustainability. Implications are discussed.

  8. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The project includes work by NASA research engineers on CFD validation and flow physics experiments, which are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.

  9. Low weight gain at the start of a family-based intervention for adolescent girls with restrictive eating disorders predicted emergency hospital admission.

    PubMed

    Swenne, Ingemar; Ros, Helena Salonen

    2017-10-01

    This study examined predictors of emergency hospitalisation of adolescent girls with restrictive eating disorders and weight loss treated by a family-based intervention programme. We studied 339 girls aged 10-17 years treated in a specialist unit at Uppsala University Children's Hospital, Sweden, from August 2010 to December 2015. Historical weight data were obtained from school health services, and other weight data were determined at presentation. Weight controlling behaviour was recorded, and patients were evaluated using the Eating Disorder Examination Questionnaire. A family-based intervention started after assessment and the early weight gain after one week, one month and three months was assessed. There were 17 emergency admissions of 15 patients for refusing food, progressive weight loss and medical instability. Logistic regression analysis showed that emergency admissions were predicted by a low body mass index standard deviation score at presentation (odds ratio 2.57), a high rate of weight loss before presentation (odds ratio 4.38) and a low rate of weight gain at the start of treatment (odds ratio 4.59). Poor weight gain at the start of a family-based intervention for adolescent girls with restrictive eating disorders predicted emergency hospital admission. ©2017 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
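
    The odds ratios reported above come from a logistic regression: a one-unit increase in a predictor multiplies the odds of admission by exp(coefficient). A small sketch of that mapping, using made-up coefficients rather than the study's fitted values:

```python
import math

def odds_ratio(beta):
    """Odds ratio for a one-unit predictor increase = exp(coefficient)."""
    return math.exp(beta)

def admission_probability(intercept, betas, xs):
    """Logistic model: P(admission) = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    logit = intercept + sum(b * x for b, x in zip(betas, xs))
    return 1.0 / (1.0 + math.exp(-logit))

# A coefficient of log(4.38) reproduces the reported odds ratio of 4.38
# for the pre-presentation rate of weight loss (illustrative use only;
# the intercept below is hypothetical).
beta = math.log(4.38)
print(round(odds_ratio(beta), 2))  # 4.38
p = admission_probability(-3.0, [beta], [1.0])
print(0.0 < p < 1.0)
```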

  10. The Effects of Longitudinal Control-System Dynamics on Pilot Opinion and Response Characteristics as Determined from Flight Tests and from Ground Simulator Studies

    NASA Technical Reports Server (NTRS)

    Sadoff, Melvin

    1958-01-01

    The results of a fixed-base simulator study of the effects of variable longitudinal control-system dynamics on pilot opinion are presented and compared with flight-test data. The control-system variables considered in this investigation included stick force per g, time constant, and dead-band, or stabilizer breakout force. In general, the fairly good correlation between flight and simulator results for two pilots demonstrates the validity of fixed-base simulator studies which are designed to complement and supplement flight studies and serve as a guide in control-system preliminary design. However, in the investigation of certain problem areas (e.g., sensitive control-system configurations associated with pilot-induced oscillations in flight), fixed-base simulator results did not predict the occurrence of an instability, although the pilots noted the system was extremely sensitive and unsatisfactory. If it is desired to predict pilot-induced-oscillation tendencies, tests in moving-base simulators may be required. It was found possible to represent the human pilot by a linear pilot analog for the tracking task assumed in the present study. The criterion used to adjust the pilot analog was the root-mean-square tracking error of one of the human pilots on the fixed-base simulator. Matching the tracking error of the pilot analog to that of the human pilot gave an approximation to the variation of human-pilot behavior over a range of control-system dynamics. Results of the pilot-analog study indicated that both for optimized control-system dynamics (for poor airplane dynamics) and for a region of good airplane dynamics, the pilot response characteristics are approximately the same.

  11. Nonlinear Multiobjective MPC-Based Optimal Operation of a High Consistency Refining System in Papermaking

    DOE PAGES

    Li, Mingjie; Zhou, Ping; Wang, Hong; ...

    2017-09-19

    As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, energy saving, and emissions reduction in its operation processes. In this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at a set-point tracking objective for pulp quality, an economic objective, and a specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times is employed to construct the subprocess model of the state process model for the HC refining system, and then the Wiener-type model is obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined based on the Akaike information criterion. Second, a multiobjective optimization strategy that simultaneously optimizes both the set-point tracking objective for pulp quality and SE consumption is proposed, which uses the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking, economic, and SE consumption objectives, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. The simulation results demonstrate that the proposed methods give the HC refining system better set-point tracking of pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic and SE consumption objectives, they significantly reduce energy consumption.
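
    At its core, the NSGA-II step described above extracts the non-dominated (Pareto) set of candidate operating points across competing objectives. A minimal sketch for two minimized objectives, pulp-quality tracking error and SE consumption, using made-up candidate points rather than data from the refining system:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (both objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of candidate (obj1, obj2) points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Candidates: (pulp-quality tracking error, specific energy consumption)
candidates = [(0.10, 5.0), (0.20, 3.0), (0.15, 6.0), (0.30, 2.5), (0.12, 4.0)]
print(sorted(pareto_front(candidates)))
```

    NSGA-II adds fast non-dominated sorting and crowding-distance selection on top of this dominance test, but the dominance relation is the piece that defines the Pareto optimal set.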

  13. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies, it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality; therefore, the largest possible editing window is required. One such parameter is, for example, the movement of the laser beam across the component in laser keyhole welding. It is thus necessary to keep the formation of welding seams within specified limits. Currently, the quality of laser welding processes is ensured by using post-process methods, such as ultrasonic inspection, or special in-process methods. These in-process systems achieve only a simple evaluation which shows whether the weld seam is acceptable or not. Furthermore, in-process systems use no feedback for changing control variables such as the speed of the laser or the adjustment of laser power. In this paper the research group presents current results from the research fields of online monitoring, online control, and model predictive control in laser welding processes to increase product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state-space model is ascertained which includes all the control variables of the system. Using simulation tools, the model predictive controller (MPC) is designed for this model and integrated into an NI real-time system.
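
    The control scheme described, an MPC acting on an identified state-space model, can be sketched in its simplest unconstrained form: stack the horizon predictions, minimize a quadratic tracking-plus-effort cost by least squares, and apply only the first input. The one-state "process output" model below is a toy stand-in, not the paper's identified welding model:

```python
import numpy as np

def mpc_control(A, B, x0, ref, horizon, q=1.0, r=0.01):
    """One receding-horizon step for x[k+1] = A x[k] + B u[k]:
    build prediction matrices X = F x0 + G U, solve the unconstrained
    quadratic tracking problem q*||X - ref||^2 + r*||U||^2 by least
    squares, and return only the first control move."""
    n, m = B.shape
    F = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(horizon)])
    G = np.zeros((horizon * n, horizon * m))
    for i in range(horizon):
        for j in range(i + 1):
            G[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    H = np.vstack([np.sqrt(q) * G, np.sqrt(r) * np.eye(horizon * m)])
    y = np.concatenate([np.sqrt(q) * (ref - F @ x0), np.zeros(horizon * m)])
    U = np.linalg.lstsq(H, y, rcond=None)[0]
    return U[:m]  # receding horizon: apply the first input only

# Toy 1-state process driven toward a setpoint of 1.0
A = np.array([[0.9]]); B = np.array([[0.1]])
x = np.array([0.0])
for _ in range(50):
    u = mpc_control(A, B, x, ref=np.ones(10), horizon=10)
    x = A @ x + B @ u
print(round(float(x[0]), 2))
```

    A production MPC would add input and state constraints (turning the least-squares step into a QP), which is what dedicated solvers on the real-time target handle.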

  14. Physiologically-Based Pharmacokinetic Modelling to Inform Development of Intramuscular Long Acting Nanoformulations for HIV

    PubMed Central

    Rajoli, Rajith KR; Back, David J; Rannard, Steve; Meyers, Caren Freel; Flexner, Charles; Owen, Andrew; Siccardi, Marco

    2014-01-01

    Background and Objectives: Antiretrovirals (ARVs) are currently used for the treatment and prevention of HIV infection. Poor adherence and low tolerability of some existing oral formulations can hinder their efficacy. Long-acting (LA) injectable nanoformulations could help address these complications by simplifying ARV administration. The aim of this study is to inform the optimisation of intramuscular LA formulations for eight ARVs through physiologically-based pharmacokinetic (PBPK) modelling. Methods: A whole-body PBPK model was constructed using mathematical descriptions of molecular, physiological and anatomical processes defining pharmacokinetics. These models were validated against available clinical data and subsequently used to predict the pharmacokinetics of injectable LA formulations. Results: The predictions suggest that monthly intramuscular injections are possible for dolutegravir, efavirenz, emtricitabine, raltegravir, rilpivirine and tenofovir, provided that technological challenges to control the release rate can be addressed. Conclusions: These data may help inform the target product profiles for LA ARV reformulation strategies. PMID:25523214
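
    This is not the paper's whole-body PBPK model, only the "flip-flop" idea behind LA depots: when first-order release from the intramuscular depot is much slower than systemic elimination, release governs the terminal decline and stretches exposure across the month. A one-compartment sketch with made-up parameters:

```python
import math

def depot_concentration(dose, f, vd, ka, ke, t):
    """One-compartment model with first-order release from an IM depot:
    C(t) = F*D*ka / (Vd*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)).
    When ka << ke ("flip-flop" kinetics), the slow release rate, not
    elimination, sets the terminal half-life."""
    return f * dose * ka / (vd * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Illustrative parameters only (not fitted values for any ARV):
dose, f, vd = 400.0, 0.9, 500.0      # mg, bioavailable fraction, litres
ka_fast, ka_slow = 2.0, 0.01         # 1/day: oral-like vs LA nanoformulation
ke = 0.7                             # 1/day elimination rate constant
c28_fast = depot_concentration(dose, f, vd, ka_fast, ke, 28.0)
c28_slow = depot_concentration(dose, f, vd, ka_slow, ke, 28.0)
print(c28_slow > c28_fast)  # slow release sustains day-28 concentrations
```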

  15. Aerothermodynamic Environments Definition for the Mars Science Laboratory Entry Capsule

    NASA Technical Reports Server (NTRS)

    Edquist, Karl T.; Dyakonov, Artem A.; Wright, Michael J.; Tang, Chun Y.

    2007-01-01

    An overview of the aerothermodynamic environments definition status is presented for the Mars Science Laboratory entry vehicle. The environments are based on Navier-Stokes flowfield simulations on a candidate aeroshell geometry and worst-case entry heating trajectories. Uncertainties for the flowfield predictions are based primarily on available ground data since Mars flight data are scarce. The forebody aerothermodynamics analysis focuses on boundary layer transition and turbulent heating augmentation. Turbulent transition is expected prior to peak heating, a first for Mars entry, resulting in augmented heat flux and shear stress at the same heatshield location. Afterbody computations are also shown with and without interference effects of reaction control system thruster plumes. Including uncertainties, analysis predicts that the heatshield may experience peaks of 225 W/sq cm for turbulent heat flux, 0.32 atm for stagnation pressure, and 400 Pa for turbulent shear stress. The afterbody heat flux without thruster plume interference is predicted to be 7 W/sq cm on the backshell and 10 W/sq cm on the parachute cover. If the reaction control jets are fired near peak dynamic pressure, the heat flux at localized areas could reach as high as 76 W/sq cm on the backshell and 38 W/sq cm on the parachute cover, including uncertainties. The final flight environments used for hardware design will be updated for any changes in the aeroshell configuration, heating design trajectories, or uncertainties.

  16. Genome-wide association data classification and SNPs selection using two-stage quality-based Random Forests.

    PubMed

    Nguyen, Thanh-Tung; Huang, Joshua; Wu, Qingyao; Nguyen, Thuy; Li, Mark

    2015-01-01

    Single-nucleotide polymorphism (SNP) selection and identification are the most important tasks in genome-wide association data analysis. The problem is difficult because genome-wide association data are very high dimensional and a large portion of the SNPs in the data are irrelevant to the disease. Advanced machine learning methods have been used successfully in genome-wide association studies (GWAS) for identification of genetic variants that have relatively big effects in some common, complex diseases. Among them, the most successful one is Random Forests (RF). Despite performing well in terms of prediction accuracy on some data sets of moderate size, RF still struggles in GWAS with selecting informative SNPs and building accurate prediction models. In this paper, we propose a new two-stage quality-based sampling method for random forests, named ts-RF, for SNP subspace selection in GWAS. The method first applies a p-value assessment to find a cut-off point that separates the SNPs into informative and irrelevant groups. The informative SNP group is further divided into two sub-groups: highly informative and weakly informative SNPs. When sampling the SNP subspace for building trees for the forest, only SNPs from these two sub-groups are taken into account. The feature subspaces always contain highly informative SNPs when used to split a node of a tree. This approach enables one to generate more accurate trees with a lower prediction error, while possibly avoiding overfitting. It allows one to detect interactions of multiple SNPs with the diseases, and to reduce the dimensionality and the amount of genome-wide association data needed for learning the RF model.
    Extensive experiments on two genome-wide SNP data sets (Parkinson case-control data comprising 408,803 SNPs and Alzheimer case-control data comprising 380,157 SNPs) and 10 gene data sets demonstrated that the proposed model significantly reduced prediction errors and outperformed most existing state-of-the-art random forests. The proposed model identified the top 25 SNPs in the Parkinson data set, including four interesting genes associated with neurological disorders. The presented approach has been shown to be effective in selecting informative sub-groups of SNPs potentially associated with diseases that traditional statistical approaches might fail to detect. The new RF works well for data where the number of case-control objects is much smaller than the number of SNPs, a typical problem in gene data and GWAS. Experimental results demonstrated the effectiveness of the proposed RF model, which outperformed state-of-the-art RFs, including Breiman's RF, GRRF and wsRF methods.
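
    The two-stage sampling can be sketched as: a p-value cut-off separates informative from irrelevant SNPs, the informative group is split into highly and weakly informative sub-groups, and node-splitting subspaces are drawn only from those sub-groups. Thresholds and p-values below are illustrative, not the paper's values:

```python
import random

def partition_snps(p_values, cutoff=0.01, strong=0.001):
    """Stage 1: split SNPs by association p-value into highly informative
    (p <= strong) and weakly informative (strong < p <= cutoff) groups;
    SNPs above the cutoff are treated as irrelevant and dropped."""
    high = [s for s, p in p_values.items() if p <= strong]
    weak = [s for s, p in p_values.items() if strong < p <= cutoff]
    return high, weak

def sample_subspace(high, weak, size, frac_high=0.5, rng=random):
    """Stage 2: draw a node-splitting subspace from the informative groups
    only, so every subspace contains some highly informative SNPs."""
    n_high = min(len(high), max(1, int(size * frac_high)))
    picks = rng.sample(high, n_high)
    picks += rng.sample(weak, min(len(weak), size - n_high))
    return picks

pvals = {f"snp{i}": p for i, p in enumerate(
    [1e-5, 5e-4, 0.004, 0.008, 0.2, 0.6, 0.9, 0.03])}
high, weak = partition_snps(pvals)
subspace = sample_subspace(high, weak, size=4)
print(all(pvals[s] <= 0.01 for s in subspace))
```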

  17. BRCA-Monet: a breast cancer specific drug treatment mode-of-action network for treatment effective prediction using large scale microarray database.

    PubMed

    Ma, Chifeng; Chen, Hung-I; Flores, Mario; Huang, Yufei; Chen, Yidong

    2013-01-01

    Connectivity map (cMap) is a recently developed dataset and algorithm for uncovering and understanding the treatment effect of small molecules on different cancer cell lines. It is widely used, but challenges to accurate prediction remain. Here, we propose BRCA-MoNet, a network of drug mode of action (MoA) specific to breast cancer, which is constructed based on the cMap dataset. A drug signature selection algorithm fitting the characteristics of cMap data, a quality control scheme, and a novel query algorithm based on BRCA-MoNet are developed for more effective prediction of drug effects. BRCA-MoNet was applied to three independent data sets obtained from the GEO database: an estradiol-treated MCF7 cell line, a BMS-754807-treated MCF7 cell line, and a breast cancer patient microarray dataset. In the first case, BRCA-MoNet could identify drug MoAs likely to share the same and reverse treatment effects. In the second case, the results demonstrated the potential of BRCA-MoNet to reposition drugs and predict treatment effects for drugs not in the cMap data. In the third case, a possible procedure for personalized drug selection is showcased. The results clearly demonstrate that the proposed BRCA-MoNet approach can add prediction power to cMap and thus will be useful for the identification of new therapeutic candidates.

  18. Evaluation of plasma proteomic data for Alzheimer disease state classification and for the prediction of progression from mild cognitive impairment to Alzheimer disease.

    PubMed

    Llano, Daniel A; Devanarayan, Viswanath; Simon, Adam J

    2013-01-01

    Previous studies that have examined the potential for plasma markers to serve as biomarkers for Alzheimer disease (AD) have studied single analytes and focused on the amyloid-β and τ isoforms and have failed to yield conclusive results. In this study, we performed a multivariate analysis of 146 plasma analytes (the Human DiscoveryMAP v 1.0 from Rules-Based Medicine) in 527 subjects with AD, mild cognitive impairment (MCI), or cognitively normal elderly subjects from the Alzheimer's Disease Neuroimaging Initiative database. We identified 4 different proteomic signatures, each using 5 to 14 analytes, that differentiate AD from control patients with sensitivity and specificity ranging from 74% to 85%. Five analytes were common to all 4 signatures: apolipoprotein A-II, apolipoprotein E, serum glutamic oxaloacetic transaminase, α-1-microglobulin, and brain natriuretic peptide. None of the signatures adequately predicted progression from MCI to AD over a 12- and 24-month period. A new panel of analytes, optimized to predict MCI to AD conversion, was able to provide 55% to 60% predictive accuracy. These data suggest that a simple panel of plasma analytes may provide an adjunctive tool to differentiate AD from controls, may provide mechanistic insights to the etiology of AD, but cannot adequately predict MCI to AD conversion.

  19. Daily Associations Among Self-control, Heavy Episodic Drinking, and Relationship Functioning: An Examination of Actor and Partner Effects

    PubMed Central

    Crane, Cory A.; Testa, Maria; Derrick, Jaye L.; Leonard, Kenneth E.

    2014-01-01

    An emerging literature suggests that temporary deficits in the ability to inhibit impulsive urges may be proximally associated with intimate partner aggression. The current study examined the experience of alcohol use and the depletion of self-control in the prediction of relationship functioning. Daily diary data collected from 118 heterosexual couples were analyzed using parallel multi-level Actor Partner Interdependence Models to assess the effects of heavy episodic drinking and depletion of self-control across partners on outcomes of participant-reported daily arguing with and anger toward an intimate partner. Heavy episodic drinking among actors predicted greater arguing but failed to interact with either actor or partner depletion. We also found that greater arguing was reported on days of high congruent actor and partner depletion. Both actor and partner depletion, as well as their interaction, predicted greater partner-specific anger. Greater partner-specific anger was generally reported on days of congruent actor and partner depletion, particularly on days of high partner depletion. The current results highlight the importance of independently assessing partner effects (i.e., depletion of self-control), which interact dynamically with disinhibiting actor effects, in the prediction of daily adverse relationship functioning. Results offer further support for the development of prospective individualized and couples-based interventions for partner conflict. PMID:24700558

  20. Assessment of the Uniqueness of Wind Tunnel Strain-Gage Balance Load Predictions

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.

    2016-01-01

    A new test was developed to assess the uniqueness of wind tunnel strain-gage balance load predictions that are obtained from regression models of calibration data. The test helps balance users to gain confidence in load predictions of non-traditional balance designs. It also makes it possible to better evaluate load predictions of traditional balances that are not used as originally intended. The test works for both the Iterative and Non-Iterative Methods that are used in the aerospace testing community for the prediction of balance loads. It is based on the hypothesis that the total number of independently applied balance load components must always match the total number of independently measured bridge outputs or bridge output combinations. This hypothesis is supported by a control volume analysis of the inputs and outputs of a strain-gage balance. It is concluded from the control volume analysis that the loads and bridge outputs of a balance calibration data set must separately be tested for linear independence because it cannot always be guaranteed that a linearly independent load component set will result in linearly independent bridge output measurements. Simple linear math models for the loads and bridge outputs in combination with the variance inflation factor are used to test for linear independence. A unique and reversible mapping between the applied load component set and the measured bridge output set is guaranteed to exist if the maximum variance inflation factor of both sets is less than the literature-recommended threshold of five. Data from the calibration of a six-component force balance is used to illustrate the application of the new test to real-world data.
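
    The linear-independence check described above can be sketched as computing each component's variance inflation factor and comparing the maximum against the threshold of five. Random data below stands in for calibration loads or bridge outputs; it is not the six-component balance data:

```python
import numpy as np

def max_vif(X):
    """Variance inflation factor: regress each column on the remaining
    columns; VIF_j = 1 / (1 - R_j^2). A maximum VIF below the threshold
    of five indicates an acceptably independent set."""
    vifs = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.column_stack([np.ones(X.shape[0]), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        ss_res = np.sum((y - Z @ beta) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        vifs.append(1.0 / (1.0 - r2))
    return max(vifs)

rng = np.random.default_rng(0)
loads = rng.standard_normal((100, 3))        # independent load components
dependent = loads[:, 0] + 0.01 * rng.standard_normal(100)
print(max_vif(loads) < 5.0)                                # passes the test
print(max_vif(np.column_stack([loads, dependent])) > 5.0)  # fails the test
```

    Running the same check separately on the load set and on the bridge-output set is exactly the dual test the abstract argues for.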

  1. Patient Similarity in Prediction Models Based on Health Data: A Scoping Review

    PubMed Central

    Sharafoddini, Anis; Dubin, Joel A

    2017-01-01

    Background: Physicians and health policy makers are required to make predictions during their decision making in various medical problems. Many advances have been made in predictive modeling toward outcome prediction, but these innovations target an average patient and are insufficiently adjustable for individual patients. One developing idea in this field is individualized predictive analytics based on patient similarity. The goal of this approach is to identify patients who are similar to an index patient and derive insights from the records of similar patients to provide personalized predictions. Objective: The aim is to summarize and review published studies describing computer-based approaches for predicting patients' future health status based on health data and patient similarity, identify gaps, and provide a starting point for related future research. Methods: The method involved (1) conducting the review by performing automated searches in Scopus, PubMed, and ISI Web of Science, selecting relevant studies by first screening titles and abstracts then analyzing full texts, and (2) documenting by extracting publication details and information on context, predictors, missing data, modeling algorithm, outcome, and evaluation methods into a matrix table, synthesizing data, and reporting results. Results: After duplicate removal, 1339 articles were screened by title and abstract and 67 were selected for full-text review. In total, 22 articles met the inclusion criteria. Within the included articles, hospitals were the main source of data (n=10). Cardiovascular disease (n=7) and diabetes (n=4) were the dominant patient diseases. Most studies (n=18) used neighborhood-based approaches in devising prediction models. Two studies showed that patient similarity-based modeling outperformed population-based predictive methods. Conclusions: Interest in patient similarity-based predictive modeling for diagnosis and prognosis has been growing.
In addition to raw/coded health data, wavelet transform and term frequency-inverse document frequency methods were employed to extract predictors. Selecting predictors with potential to highlight special cases and defining new patient similarity metrics were among the gaps identified in the existing literature that provide starting points for future work. Patient status prediction models based on patient similarity and health data offer exciting potential for personalizing and ultimately improving health care, leading to better patient outcomes. PMID:28258046
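
    Most of the included studies used neighborhood-based approaches; a minimal sketch of that idea, predicting an index patient's outcome from the similarity-weighted outcomes of the k most similar records (the feature vectors and outcome below are made up):

```python
import math

def similarity(a, b):
    """Inverse-distance similarity between two patient feature vectors."""
    d = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return 1.0 / (1.0 + d)

def predict_outcome(index_patient, records, k=3):
    """Neighborhood-based prediction: average the outcomes of the k most
    similar patients, weighted by their similarity to the index patient."""
    ranked = sorted(records, key=lambda r: similarity(index_patient, r["x"]),
                    reverse=True)[:k]
    total = sum(similarity(index_patient, r["x"]) for r in ranked)
    return sum(similarity(index_patient, r["x"]) * r["y"] for r in ranked) / total

# Toy records: x = (age/100, HbA1c/10), y = 1 if readmitted within 30 days
records = [
    {"x": (0.62, 0.71), "y": 1}, {"x": (0.60, 0.69), "y": 1},
    {"x": (0.35, 0.50), "y": 0}, {"x": (0.40, 0.52), "y": 0},
    {"x": (0.64, 0.70), "y": 1},
]
risk = predict_outcome((0.61, 0.70), records)
print(risk > 0.5)
```

    The reviewed studies differ mainly in how `similarity` is defined (e.g., learned metrics over coded health data) and in what model is fitted within the neighborhood.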

  2. Shear wave prediction using committee fuzzy model constrained by lithofacies, Zagros basin, SW Iran

    NASA Astrophysics Data System (ADS)

    Shiroodi, Sadjad Kazem; Ghafoori, Mohammad; Ansari, Hamid Reza; Lashkaripour, Golamreza; Ghanadian, Mostafa

    2017-02-01

    The main purpose of this study is to introduce the geological controlling factors in improving an intelligence-based model to estimate shear wave velocity from seismic attributes. The proposed method includes three main steps in the framework of geological events in a complex sedimentary succession located in the Persian Gulf. First, the best attributes were selected from extracted seismic data. Second, these attributes were transformed into shear wave velocity using fuzzy inference systems (FIS) such as Sugeno's fuzzy inference (SFIS), adaptive neuro-fuzzy inference (ANFIS) and optimized fuzzy inference (OFIS). Finally, a committee fuzzy machine (CFM) based on bat-inspired algorithm (BA) optimization was applied to combine the previous predictions into an enhanced solution. In order to show the geological effect on improving the prediction, the main classes of predominant lithofacies in the reservoir of interest, including shale, sand, and carbonate, were selected, and the proposed algorithm was then performed with and without the lithofacies constraint. The results showed a good agreement between real and predicted shear wave velocity in the lithofacies-based model compared to the model without lithofacies, especially in sand and carbonate.

  3. Synchronizing movements with the metronome: nonlinear error correction and unstable periodic orbits.

    PubMed

    Engbert, Ralf; Krampe, Ralf Th; Kurths, Jürgen; Kliegl, Reinhold

    2002-02-01

    The control of human hand movements is investigated in a simple synchronization task. We propose and analyze a stochastic model based on nonlinear error correction, a mechanism which implies the existence of unstable periodic orbits. This prediction is tested in an experiment with human subjects. We find that our experimental data are in good agreement with numerical simulations of our theoretical model. These results suggest that feedback control of the human motor system shows nonlinear behavior. Copyright 2001 Elsevier Science (USA).
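
    A minimal sketch of synchronization as stochastic nonlinear error correction: the next asynchrony is the current one minus a nonlinear correction plus timing noise. The saturating tanh form and all parameter values are illustrative assumptions, not the paper's fitted model:

```python
import math
import random

def next_asynchrony(e, alpha=1.2, scale=0.05, sigma=0.005, rng=random):
    """Stochastic error-correction map: e[n+1] = e[n] - g(e[n]) + noise,
    where g is nonlinear (here a saturating tanh, illustrative only), so
    small and large asynchronies are corrected at different rates."""
    correction = alpha * scale * math.tanh(e / scale)
    return e - correction + rng.gauss(0.0, sigma)

random.seed(1)
e = 0.08  # start 80 ms ahead of the metronome beat
for _ in range(30):
    e = next_asynchrony(e)
print(abs(e) < 0.08)  # the asynchrony stays bounded near zero
```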

  4. Group-regularized individual prediction: theory and application to pain.

    PubMed

    Lindquist, Martin A; Krishnan, Anjali; López-Solà, Marina; Jepma, Marieke; Woo, Choong-Wan; Koban, Leonie; Roy, Mathieu; Atlas, Lauren Y; Schmidt, Liane; Chang, Luke J; Reynolds Losin, Elizabeth A; Eisenbarth, Hedwig; Ashar, Yoni K; Delk, Elizabeth; Wager, Tor D

    2017-01-15

    Multivariate pattern analysis (MVPA) has become an important tool for identifying brain representations of psychological processes and clinical outcomes using fMRI and related methods. Such methods can be used to predict or 'decode' psychological states in individual subjects. Single-subject MVPA approaches, however, are limited by the amount and quality of individual-subject data. In spite of higher spatial resolution, predictive accuracy from single-subject data often does not exceed what can be accomplished using coarser, group-level maps, because single-subject patterns are trained on limited amounts of often-noisy data. Here, we present a method that combines population-level priors, in the form of biomarker patterns developed on prior samples, with single-subject MVPA maps to improve single-subject prediction. Theoretical results and simulations motivate a weighting based on the relative variances of biomarker-based prediction (using population-level predictive maps from prior groups) and of individual-subject, cross-validated prediction. Empirical results predicting pain using brain activity on a trial-by-trial basis (single-trial prediction) across 6 studies (N=180 participants) confirm the theoretical predictions. Regularization based on a population-level biomarker (in this case, the Neurologic Pain Signature, NPS) improved single-subject prediction accuracy compared with idiographic maps based on the individuals' data alone. The regularization scheme that we propose, which we term group-regularized individual prediction (GRIP), can be applied broadly to within-person MVPA-based prediction. We also show how GRIP can be used to evaluate data quality and provide benchmarks for the appropriateness of population-level maps like the NPS for a given individual or study. Copyright © 2015 Elsevier Inc. All rights reserved.
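
    The variance-based weighting motivating GRIP can be sketched as precision (inverse-variance) weighting of the group biomarker prediction and the individual cross-validated prediction. The numbers below are illustrative, not NPS values:

```python
def grip_combine(pred_group, var_group, pred_indiv, var_indiv):
    """Weight each prediction by its inverse variance (precision): the
    noisier source contributes less to the combined estimate."""
    w_group = 1.0 / var_group
    w_indiv = 1.0 / var_indiv
    return (w_group * pred_group + w_indiv * pred_indiv) / (w_group + w_indiv)

# Illustrative: the group biomarker is stable (low variance), while the
# individual's map was trained on few noisy trials (high variance).
combined = grip_combine(pred_group=4.0, var_group=0.5,
                        pred_indiv=6.0, var_indiv=2.0)
print(combined)  # falls between the two, closer to the group prediction
```

    As an individual contributes more clean data, `var_indiv` shrinks and the combined estimate shifts toward the idiographic map, which is the adaptive behavior the abstract describes.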

  5. Automated Clinical Assessment from Smart home-based Behavior Data

    PubMed Central

    Dawadi, Prafulla Nath; Cook, Diane Joyce; Schmitter-Edgecombe, Maureen

    2016-01-01

Smart home technologies offer potential benefits for assisting clinicians by automating health monitoring and well-being assessment. In this paper, we examine the actual benefits of smart home-based analysis by monitoring daily behavior in the home and predicting standard clinical assessment scores of the residents. To accomplish this goal, we propose a Clinical Assessment using Activity Behavior (CAAB) approach to model a smart home resident's daily behavior and predict the corresponding standard clinical assessment scores. CAAB uses statistical features that describe characteristics of a resident's daily activity performance to train machine learning algorithms that predict the clinical assessment scores. We evaluate the performance of CAAB utilizing smart home sensor data collected from 18 smart homes over two years using prediction and classification-based experiments. In the prediction-based experiments, we obtain a statistically significant correlation (r = 0.72) between CAAB-predicted and clinician-provided cognitive assessment scores and a statistically significant correlation (r = 0.45) between CAAB-predicted and clinician-provided mobility scores. Similarly, for the classification-based experiments, we find CAAB has a classification accuracy of 72% while classifying cognitive assessment scores and 76% while classifying mobility scores. These prediction and classification results suggest that it is feasible to predict standard clinical scores using smart home sensor data and learning-based data analysis. PMID:26292348

  6. A text-based data mining and toxicity prediction modeling system for a clinical decision support in radiation oncology: A preliminary study

    NASA Astrophysics Data System (ADS)

    Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Chang, Kyung Hwan; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie

    2017-08-01

This preliminary study presents an integrated text-based data mining and toxicity prediction modeling system for a big-data clinical decision support system in radiation oncology. The structured data were prepared from treatment plans, and the unstructured data were extracted by image pattern recognition of dose-volume data for prostate cancer from research articles crawled from the internet. We modeled an artificial neural network to build a predictor system for toxicity of organs at risk, using a text-based data mining approach to construct the network model for bladder and rectum complication predictions. The pattern recognition method mined the unstructured dose-volume toxicity data with a detection accuracy of 97.9%. The confusion matrix and training model of the neural network were validated with 50 modeled plans (n = 50). The toxicity level was analyzed, and the risk factors for 25% bladder, 50% bladder, 20% rectum, and 50% rectum were calculated by the artificial neural network algorithm. As a result, 32 of the 50 modeled plans were predicted to cause complications and 18 were predicted to be complication-free. We integrated data mining and a toxicity modeling method for toxicity prediction using prostate cancer cases. The results show that a preprocessing analysis using text-based data mining and prediction modeling can be extended to personalized, big-data-based patient treatment decision support.

  7. Gender differences in condom use prediction with Theory of Reasoned Action and Planned Behaviour: the role of self-efficacy and control.

    PubMed

    Muñoz-Silva, A; Sánchez-García, M; Nunes, C; Martins, A

    2007-10-01

There is much evidence that programs and interventions based on the Theory of Reasoned Action (TRA) and the Theory of Planned Behaviour (TPB) have been effective in the prevention of the sexual transmission of HIV. The objective of this work is to compare the effectiveness of both models in the prediction of condom use, distinguishing two components within the TPB variable Perceived Behavioural Control: self-efficacy and control. The perspective of gender differences is also added. The study was carried out in a sample of 601 Portuguese and Spanish university students. The results show that females score higher than males on all the TPB variables except the frequency of condom use: females request the use of condoms less frequently than males. For both females and males, the TPB model predicts condom-use intention better than the TRA. However, there are no differences between the two models in relation to the prediction of condom-use behaviour. For prediction of intention, the most outstanding variable among females is attitude, while among males the most outstanding variables are subjective norm and self-efficacy. Finally, we analyze the implications of these data from a theoretical and practical point of view.

  8. Real time simulation of nonlinear generalized predictive control for wind energy conversion system with nonlinear observer.

    PubMed

    Ouari, Kamel; Rekioua, Toufik; Ouhrouche, Mohand

    2014-01-01

To make wind power generation truly cost-effective and reliable, advanced control techniques must be used. In this paper, we develop a new control strategy, using a nonlinear generalized predictive control (NGPC) approach, for a DFIG-based wind turbine. The proposed control law is based on two components: an NGPC-based torque-current control loop generating the rotor reference voltage and an NGPC-based speed control loop that provides the torque reference. To enhance the robustness of the controller, a disturbance observer is designed to estimate the aerodynamic torque, which is treated as an unknown perturbation. Finally, a real-time simulation is carried out to illustrate the performance of the proposed controller. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Perspectives for geographically oriented management of fusarium mycotoxins in the cereal supply chain.

    PubMed

    van der Fels-Klerx, H J; Booij, C J H

    2010-06-01

This article provides an overview of available systems for management of Fusarium mycotoxins in the cereal grain supply chain, with an emphasis on the use of predictive mathematical modeling. From the state of the art, it proposes future developments in modeling and management and their challenges. Mycotoxin contamination in cereal grain-based feed and food products is currently managed and controlled by good agricultural practices, good manufacturing practices, hazard analysis critical control points, and by checking, and more recently by notification systems and predictive mathematical models. Most of the predictive models for Fusarium mycotoxins in cereal grains focus on deoxynivalenol in wheat and aim to help growers make decisions about the application of fungicides during cultivation. Future developments in managing Fusarium mycotoxins should include the linkage between predictive mathematical models and geographical information systems, resulting in region-specific predictions of mycotoxin occurrence. The envisioned geographically oriented decision support system may incorporate various underlying models for specific users' demands and regions, and various related databases to feed the particular models with (geographically oriented) input data. Depending on the user requirements, the system selects the best-fitting model and available input information. Future research areas include organizing data management in the cereal grain supply chain; developing predictive models for other stakeholders (taking into account the period up to harvest), for other Fusarium mycotoxins, and for other cereal grain types; and understanding the underlying effects of the regional component in the models.

  10. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2016-01-01

Accurate taxi time prediction is required for enabling efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently, NASA and American Airlines are jointly developing a decision-support tool called Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers in making gate pushback decisions and improving the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated with actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance measurement metrics. Based on the taxi time prediction results, we also discuss how the prediction accuracy can be affected by the operational complexity at this airport and how we can improve the fast-time simulation model before implementing it with an airport scheduling algorithm in a real-time environment.

  11. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    PubMed Central

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson’s disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Conclusions Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson’s disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer’s, Huntington’s, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications. PMID:27494614

  12. Bridge Structure Deformation Prediction Based on GNSS Data Using Kalman-ARIMA-GARCH Model

    PubMed Central

    Li, Xiaoqing; Wang, Yu

    2018-01-01

Bridges are an essential part of the ground transportation system. Health monitoring is fundamentally important for the safety and service life of bridges. A large amount of structural information is obtained from various sensors using sensing technology, and the data processing has become a challenging issue. To improve the prediction accuracy of bridge structure deformation based on data mining and to accurately evaluate the time-varying characteristics of bridge structure performance evolution, this paper proposes a new method for bridge structure deformation prediction, which integrates the Kalman filter, autoregressive integrated moving average model (ARIMA), and generalized autoregressive conditional heteroskedasticity (GARCH). Firstly, the raw deformation data is directly pre-processed using the Kalman filter to reduce the noise. After that, the linear recursive ARIMA model is established to analyze and predict the structure deformation. Finally, the nonlinear recursive GARCH model is introduced to further improve the accuracy of the prediction. Simulation results based on measured sensor data from the Global Navigation Satellite System (GNSS) deformation monitoring system demonstrated that: (1) the Kalman filter is capable of denoising the bridge deformation monitoring data; (2) the prediction accuracy of the proposed Kalman-ARIMA-GARCH model is satisfactory, where the mean absolute error increases only from 3.402 mm to 5.847 mm with the increment of the prediction step; and (3) in comparison to the Kalman-ARIMA model, the Kalman-ARIMA-GARCH model results in superior prediction accuracy as it includes partial nonlinear characteristics (heteroscedasticity); the mean absolute error of five-step prediction using the proposed model is improved by 10.12%. 
This paper provides a new way for structural behavior prediction based on data processing, which can lay a foundation for the early warning of bridge health monitoring system based on sensor data using sensing technology. PMID:29351254
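The denoise-then-forecast structure of the Kalman-ARIMA-GARCH pipeline can be conveyed with a deliberately simplified sketch: a scalar Kalman filter with a random-walk state model for the denoising stage, and a least-squares AR(1) fit standing in for the full ARIMA stage. The GARCH variance stage is omitted, and all function names and noise parameters here are illustrative, not from the paper.

```python
def kalman_denoise(measurements, q=1e-3, r=0.5):
    """Scalar Kalman filter assuming a random-walk state model.
    q is the process-noise variance, r the measurement-noise variance."""
    x, p = measurements[0], 1.0
    smoothed = []
    for z in measurements:
        p += q                  # predict step: state uncertainty grows
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # update step: blend prediction and measurement
        p *= (1.0 - k)
        smoothed.append(x)
    return smoothed

def ar1_forecast(series, steps=1):
    """Least-squares AR(1) fit (x[t+1] = c + phi * x[t]) used as a
    stand-in for the ARIMA forecasting stage."""
    x, y = series[:-1], series[1:]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    c = my - phi * mx
    preds, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        preds.append(last)
    return preds
```

In the paper's scheme the denoised series feeds the linear forecaster, and a GARCH model of the residual variance then corrects for heteroscedasticity; the sketch stops after the linear stage.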

  13. Bridge Structure Deformation Prediction Based on GNSS Data Using Kalman-ARIMA-GARCH Model.

    PubMed

    Xin, Jingzhou; Zhou, Jianting; Yang, Simon X; Li, Xiaoqing; Wang, Yu

    2018-01-19

Bridges are an essential part of the ground transportation system. Health monitoring is fundamentally important for the safety and service life of bridges. A large amount of structural information is obtained from various sensors using sensing technology, and the data processing has become a challenging issue. To improve the prediction accuracy of bridge structure deformation based on data mining and to accurately evaluate the time-varying characteristics of bridge structure performance evolution, this paper proposes a new method for bridge structure deformation prediction, which integrates the Kalman filter, autoregressive integrated moving average model (ARIMA), and generalized autoregressive conditional heteroskedasticity (GARCH). Firstly, the raw deformation data is directly pre-processed using the Kalman filter to reduce the noise. After that, the linear recursive ARIMA model is established to analyze and predict the structure deformation. Finally, the nonlinear recursive GARCH model is introduced to further improve the accuracy of the prediction. Simulation results based on measured sensor data from the Global Navigation Satellite System (GNSS) deformation monitoring system demonstrated that: (1) the Kalman filter is capable of denoising the bridge deformation monitoring data; (2) the prediction accuracy of the proposed Kalman-ARIMA-GARCH model is satisfactory, where the mean absolute error increases only from 3.402 mm to 5.847 mm with the increment of the prediction step; and (3) in comparison to the Kalman-ARIMA model, the Kalman-ARIMA-GARCH model results in superior prediction accuracy as it includes partial nonlinear characteristics (heteroscedasticity); the mean absolute error of five-step prediction using the proposed model is improved by 10.12%. 
This paper provides a new way for structural behavior prediction based on data processing, which can lay a foundation for the early warning of bridge health monitoring system based on sensor data using sensing technology.

  14. Design of optimal hyperthermia protocols for prostate cancer by controlling HSP expression through computer modeling (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Rylander, Marissa N.; Feng, Yusheng; Diller, Kenneth; Bass, J.

    2005-04-01

Heat shock proteins (HSP) are critical components of a complex defense mechanism essential for preserving cell survival under adverse environmental conditions. Hyperthermia inevitably enhances tumor tissue viability, due to HSP expression in regions where temperatures are insufficient to coagulate proteins, and thereby likely increases the probability of cancer recurrence. Although hyperthermia therapy is commonly used in conjunction with radiotherapy, chemotherapy, and gene therapy to increase therapeutic effectiveness, the efficacy of these therapies can be substantially hindered by HSP expression when hyperthermia is applied prior to these procedures. Therefore, in planning hyperthermia protocols, prediction of the HSP response of the tumor must be incorporated into the treatment plan to optimize the thermal dose delivery and permit prediction of overall tissue response. In this paper, we present a highly accurate, adaptive, finite element tumor model capable of predicting the HSP expression distribution and tissue damage region based on measured cellular data when hyperthermia protocols are specified. Cubic spline representations of HSP27 and HSP70 expression, and Arrhenius damage models, were integrated into the finite element model to enable prediction of the HSP expression and damage distribution in the tissue following laser heating. Application of the model can enable optimized treatment planning by controlling the tissue response to therapy based on accurate prediction of the HSP expression and cell damage distribution.

  15. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.

    PubMed

    Lee, Leng-Feng; Umberger, Brian R

    2016-01-01

Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1-degree-of-freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraint Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
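The direct collocation transcription mentioned above can be illustrated with the defect constraints for the simplest possible dynamics. This is a generic sketch, not OpenSim or MATLAB code from the paper: trapezoidal collocation of a double integrator, with illustrative variable names.

```python
def collocation_defects(x, v, u, h):
    """Trapezoidal collocation defects for a double integrator
    (x' = v, v' = u) on a uniform grid with step h.

    In direct collocation, these defects become equality constraints
    that an NLP solver (e.g., IPOPT) drives to zero.  Each defect
    involves only a few neighboring grid variables, which is the
    constraint-Jacobian sparsity the abstract refers to."""
    defects = []
    for k in range(len(x) - 1):
        # state defect: x[k+1] must equal x[k] plus the trapezoidal
        # integral of v over the interval
        defects.append(x[k + 1] - x[k] - 0.5 * h * (v[k] + v[k + 1]))
        # velocity defect: same rule applied to the control u
        defects.append(v[k + 1] - v[k] - 0.5 * h * (u[k] + u[k + 1]))
    return defects
```

A trajectory generated by the same trapezoidal rule satisfies every defect exactly; the optimizer's job is to find states and controls that do so while minimizing the cost function.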

  16. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB

    PubMed Central

    Lee, Leng-Feng

    2016-01-01

Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1-degree-of-freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraint Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility. PMID:26835184

  17. Near infrared spectroscopy based monitoring of extraction processes of raw material with the help of dynamic predictive modeling

    NASA Astrophysics Data System (ADS)

    Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng

    2018-03-01

The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw material of large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is first estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs such as raw materials. Secondly, the quality property of the end product is predicted at mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that with the help of dynamic predictive modeling, NIRS can offer both past and future information about the process, which enables more accurate monitoring and control of process performance and product quality.

  18. Prognostics of Power Mosfets Under Thermal Stress Accelerated Aging Using Data-Driven and Model-Based Methodologies

    NASA Technical Reports Server (NTRS)

    Celaya, Jose; Saxena, Abhinav; Saha, Sankalita; Goebel, Kai F.

    2011-01-01

An approach for predicting the remaining useful life of power MOSFET (metal-oxide-semiconductor field-effect transistor) devices has been developed. Power MOSFETs are semiconductor switching devices that are instrumental in electronics equipment such as those used in the operation and control of modern aircraft and spacecraft. The MOSFETs examined here were aged under thermal overstress in a controlled experiment, and continuous performance degradation data were collected from the accelerated aging experiment. Die-attach degradation was determined to be the primary failure mode. The collected run-to-failure data were analyzed, revealing that ON-state resistance increased as the die-attach degraded under high thermal stresses. Results from finite element simulation analysis support the observations from the experimental data. Data-driven and model-based prognostics algorithms were investigated, with ON-state resistance used as the primary precursor-of-failure feature. A Gaussian process regression algorithm was explored as an example of a data-driven technique, and an extended Kalman filter and a particle filter were used as examples of model-based techniques. Both methods were able to provide valid results. Prognostic performance metrics were employed to evaluate and compare the algorithms.
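The core idea of precursor-based remaining-useful-life (RUL) estimation can be shown with a deliberately minimal sketch: fit a trend to the ON-resistance feature and extrapolate to a failure threshold. The paper's actual algorithms (Gaussian process regression, extended Kalman filter, particle filter) are far more sophisticated and handle uncertainty; the linear fit below only conveys the extrapolation idea, and all names and values are illustrative.

```python
def estimate_rul(times, r_on, threshold):
    """Estimate remaining useful life by least-squares linear fit of the
    ON-resistance degradation feature, extrapolated to the failure
    threshold.  Returns time remaining after the last observation."""
    n = len(times)
    mt = sum(times) / n
    mr = sum(r_on) / n
    slope = sum((t - mt) * (r - mr) for t, r in zip(times, r_on)) / \
            sum((t - mt) ** 2 for t in times)
    intercept = mr - slope * mt
    t_fail = (threshold - intercept) / slope   # trend crosses threshold here
    return t_fail - times[-1]
```

For example, a resistance rising linearly from 1.0 at a rate of 0.1 per unit time reaches a threshold of 2.0 at t = 10; observed through t = 3, the estimated RUL is 7.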

  19. Prediction of wastewater treatment plants performance based on artificial fish school neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Ruicheng; Li, Chong

    2011-10-01

A reliable model of a wastewater treatment plant is essential as a tool for predicting its performance and as a basis for controlling the operation of the process, minimizing operation costs and helping assess the stability of the environmental balance. Given the multi-variable, uncertain, non-linear characteristics of the wastewater treatment system, an artificial fish school neural network prediction model is established based on actual operation data from the wastewater treatment system. The model overcomes several disadvantages of the conventional BP neural network. Model calculations show that the predicted values match the measured values well, that the model performs well in simulation and prediction, and that it can be used to optimize the operating status. The prediction model provides a simple and practical way to support operation and management in wastewater treatment plants, and has good research and practical engineering value.

  20. A Predictive Model to Identify Patients With Fecal Incontinence Based on High-Definition Anorectal Manometry.

    PubMed

    Zifan, Ali; Ledgerwood-Lee, Melissa; Mittal, Ravinder K

    2016-12-01

Three-dimensional high-definition anorectal manometry (3D-HDAM) is used to assess anal sphincter function; it determines profiles of regional pressure distribution along the length and circumference of the anal canal. There is no consensus, however, on the best way to analyze data from 3D-HDAM to distinguish healthy individuals from persons with sphincter dysfunction. We developed a computer analysis system to analyze 3D-HDAM data and to aid in the diagnosis and assessment of patients with fecal incontinence (FI). In a prospective study, we performed 3D-HDAM analysis of 24 asymptomatic healthy subjects (control subjects; all women; mean age, 39 ± 10 years) and 24 patients with symptoms of FI (all women; mean age, 58 ± 13 years). Patients completed a standardized questionnaire (FI severity index) to score the severity of FI symptoms. We developed and evaluated a robust prediction model to distinguish patients with FI from control subjects using linear discriminant, quadratic discriminant, and logistic regression analyses. In addition to collecting pressure information from the HDAM data, we assessed regional features based on shape characteristics and the anal sphincter pressure symmetry index. The combination of pressure values, anal sphincter area, and reflective symmetry values distinguished patients with FI from control subjects with an area under the curve value of 1.0. In logistic regression analyses using different predictors, the model identified patients with FI with an area under the curve value of 0.96 (interquartile range, 0.22). In discriminant analysis, results were classified with a minimum error of 0.02, calculated using 10-fold cross-validation; different combinations of predictors produced median classification errors of 0.16 in linear discriminant analysis (interquartile range, 0.25) and 0.08 in quadratic discriminant analysis (interquartile range, 0.25). We developed and validated a novel prediction model to analyze 3D-HDAM data. 
This system can accurately distinguish patients with FI from control subjects. Copyright © 2016 AGA Institute. Published by Elsevier Inc. All rights reserved.
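The area-under-the-curve (AUC) values reported above can be computed directly from classifier scores via the Mann-Whitney formulation. This generic sketch is not the authors' analysis pipeline; it simply shows what an AUC of 1.0 (perfect separation) means.

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

When every patient score exceeds every control score the AUC is 1.0, as in the study's combined-feature model; indistinguishable score distributions give 0.5, chance level.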

  1. The value of reasons for encounter in early detection of colorectal cancer.

    PubMed

    van Boxtel-Wilms, Susan J M; van Boven, Kees; Bor, J H Hans; Bakx, J Carel; Lucassen, Peter; Oskam, Sibo; van Weel, Chris

    2016-06-01

Symptoms with a high predictive power for colorectal cancer (CRC) do not exist. To explore the predictive value of patients' reasons for encounter (RFE) in the two years prior to the diagnosis of CRC, we conducted a retrospective nested case-control study using prospectively collected data from electronic records in general practice over 20 years. Matching was done based on age (within two years), gender and practice. The positive likelihood ratios (LR+) and odds ratios (OR) were calculated for RFE between cases and controls in the two years before the index date. We identified 184 CRC cases and matched 366 controls. Six RFEs had significant LR+ and ORs for CRC, which may have high predictive power. These RFEs fall within four chapters of the International Classification of Primary Care (ICPC) and include tiredness (significant at 3-6 months prior to the diagnosis, LR+ 2.6 and OR 3.07, and from 0 to 3 months prior to the diagnosis, LR+ 2.0 and OR 2.36), anaemia (significant at three months before diagnosis, LR+ 9.8 and OR 16.54), abdominal pain, rectal bleeding and constipation (significant at 3-6 months before diagnosis, LR+ 3.0 and OR 3.33; 3 months prior to the diagnosis, LR+ 8.0 and OR 18.10) and weight loss (significant at three months before diagnosis, LR+ 14.9 and OR 14.53). Data capture and organization in the ICPC permits study of the predictive value of RFE for CRC in primary care.
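For a single RFE treated as a binary "test", the LR+ and OR follow directly from the 2x2 case-control table. The sketch below uses illustrative counts, not data from the study:

```python
def lr_plus_and_or(exposed_cases, total_cases, exposed_controls, total_controls):
    """Positive likelihood ratio and odds ratio from a 2x2 case-control table.

    LR+ = sensitivity / (1 - specificity)
        = P(RFE | case) / P(RFE | control)
    OR  = cross-product ratio of the table."""
    sens = exposed_cases / total_cases          # P(RFE recorded | case)
    fpr = exposed_controls / total_controls     # P(RFE recorded | control)
    lr_plus = sens / fpr
    odds_ratio = (exposed_cases * (total_controls - exposed_controls)) / \
                 ((total_cases - exposed_cases) * exposed_controls)
    return lr_plus, odds_ratio
```

With, say, 30 of 100 cases and 10 of 100 controls presenting the RFE, LR+ is 3.0: the symptom is three times as likely to be recorded for a future CRC case as for a matched control.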

  2. Human-centric predictive model of task difficulty for human-in-the-loop control tasks

    PubMed Central

    Majewicz Fey, Ann

    2018-01-01

    Quantitatively measuring the difficulty of a manipulation task in human-in-the-loop control systems is ill-defined. Currently, systems are typically evaluated through task-specific performance measures and post-experiment user surveys; however, these methods do not capture the real-time experience of human users. In this study, we propose to analyze and predict the difficulty of a bivariate pointing task, with a haptic device interface, using human-centric measurement data in terms of cognition, physical effort, and motion kinematics. Noninvasive sensors were used to record the multimodal responses of 14 human subjects performing the task. A data-driven approach for predicting task difficulty was implemented based on several task-independent metrics. We compare four possible models for predicting task difficulty to evaluate the roles of the various types of metrics, including: (I) a movement time model, (II) a fusion model using both physiological and kinematic metrics, (III) a model only with kinematic metrics, and (IV) a model only with physiological metrics. The results show significant correlation between task difficulty and the user sensorimotor response. The fusion model, integrating user physiology and motion kinematics, provided the best estimate of task difficulty (R2 = 0.927), followed by a model using only kinematic metrics (R2 = 0.921). Both models were better predictors of task difficulty than the movement time model (R2 = 0.847), derived from Fitts’ law, a well-studied difficulty model for human psychomotor control. PMID:29621301
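The movement time model referenced above, derived from Fitts' law, predicts MT = a + b·log2(2D/W), where D is target distance and W target width. A minimal sketch of fitting the two coefficients by least squares; the data below are synthetic assumptions for illustration:

```python
import math

def index_of_difficulty(distance, width):
    """Fitts' index of difficulty (bits): log2(2D / W)."""
    return math.log2(2 * distance / width)

def fit_fitts(distances, widths, times):
    """Closed-form least-squares fit of MT = a + b * ID."""
    ids = [index_of_difficulty(d, w) for d, w in zip(distances, widths)]
    n = len(times)
    mean_id = sum(ids) / n
    mean_mt = sum(times) / n
    b = (sum((i - mean_id) * (t - mean_mt) for i, t in zip(ids, times))
         / sum((i - mean_id) ** 2 for i in ids))
    a = mean_mt - b * mean_id
    return a, b
```

With noiseless synthetic data generated as MT = 0.1 + 0.2·ID, the fit recovers a = 0.1 s and b = 0.2 s/bit exactly.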

  3. Underconnectivity of the superior temporal sulcus predicts emotion recognition deficits in autism

    PubMed Central

    Woolley, Daniel G.; Steyaert, Jean; Di Martino, Adriana; Swinnen, Stephan P.; Wenderoth, Nicole

    2014-01-01

    Neurodevelopmental disconnections have been assumed to cause behavioral alterations in autism spectrum disorders (ASDs). Here, we combined measurements of intrinsic functional connectivity (iFC) from resting-state functional magnetic resonance imaging (fMRI) with task-based fMRI to explore whether altered activity and/or iFC of the right posterior superior temporal sulcus (pSTS) mediates deficits in emotion recognition in ASD. Fifteen adults with ASD and 15 matched-controls underwent resting-state and task-based fMRI, during which participants discriminated emotional states from point light displays (PLDs). Intrinsic FC of the right pSTS was further examined using resting-state data from 584 individuals (278 with ASD, 306 controls) in the Autism Brain Imaging Data Exchange (ABIDE). Participants with ASD were less accurate than controls in recognizing emotional states from PLDs. Analyses revealed pronounced ASD-related reductions both in task-based activity and resting-state iFC of the right pSTS with fronto-parietal areas typically encompassing the action observation network (AON). Notably, pSTS-hypo-activity was related to pSTS-hypo-connectivity, and both measures were predictive of emotion recognition performance with each measure explaining a unique part of the variance. Analyses with the large independent ABIDE dataset replicated reductions in pSTS-iFC to fronto-parietal regions. These findings provide novel evidence that pSTS hypo-activity and hypo-connectivity with the fronto-parietal AON are linked to the social deficits characteristic of ASD. PMID:24078018

  4. A Risk Prediction Model for Sporadic CRC Based on Routine Lab Results.

    PubMed

    Boursi, Ben; Mamtani, Ronac; Hwang, Wei-Ting; Haynes, Kevin; Yang, Yu-Xiao

    2016-07-01

    Current risk scores for colorectal cancer (CRC) are based on demographic and behavioral factors and have limited predictive values. To develop a novel risk prediction model for sporadic CRC using clinical and laboratory data in electronic medical records. We conducted a nested case-control study in a UK primary care database. Cases included those with a diagnostic code of CRC, aged 50-85. Each case was matched with four controls using incidence density sampling. CRC predictors were examined using univariate conditional logistic regression. Variables with p value <0.25 in the univariate analysis were further evaluated in multivariate models using backward elimination. Discrimination was assessed using the receiver operating characteristic (ROC) curve. Calibration was evaluated using McFadden's R2. Net reclassification index (NRI) associated with incorporation of laboratory results was calculated. Results were internally validated. A model similar to existing CRC prediction models which included age, sex, height, obesity, ever smoking, alcohol dependence, and previous screening colonoscopy had an AUC of 0.58 (0.57-0.59) with poor goodness of fit. A laboratory-based model including hematocrit, MCV, lymphocytes, and neutrophil-lymphocyte ratio (NLR) had an AUC of 0.76 (0.76-0.77) and a McFadden's R2 of 0.21 with an NRI of 47.6%. A combined model including sex, hemoglobin, MCV, white blood cells, platelets, NLR, and oral hypoglycemic use had an AUC of 0.80 (0.79-0.81) with a McFadden's R2 of 0.27 and an NRI of 60.7%. Similar results were shown in an internal validation set. A laboratory-based risk model had good predictive power for sporadic CRC risk.
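The net reclassification index used above can be computed in its category-free form by counting how risk estimates move between two models among events and non-events. A sketch with assumed inputs (not the study's data):

```python
def net_reclassification_index(old_risk, new_risk, outcome):
    """Category-free NRI comparing two risk scores.

    Credits upward risk movement for events (outcome == 1) and
    downward movement for non-events (outcome == 0).
    """
    up_e = down_e = up_n = down_n = 0
    n_events = sum(outcome)
    n_nonevents = len(outcome) - n_events
    for old, new, y in zip(old_risk, new_risk, outcome):
        if y:
            up_e += new > old
            down_e += new < old
        else:
            up_n += new > old
            down_n += new < old
    return (up_e - down_e) / n_events + (down_n - up_n) / n_nonevents
```

An NRI of 2.0 is the maximum (every event moved up, every non-event moved down); 0 means the new model reclassifies no better than chance.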

  5. Muscle Synergies Facilitate Computational Prediction of Subject-Specific Walking Motions

    PubMed Central

    Meyer, Andrew J.; Eskinazi, Ilan; Jackson, Jennifer N.; Rao, Anil V.; Patten, Carolynn; Fregly, Benjamin J.

    2016-01-01

    Researchers have explored a variety of neurorehabilitation approaches to restore normal walking function following a stroke. However, there is currently no objective means for prescribing and implementing treatments that are likely to maximize recovery of walking function for any particular patient. As a first step toward optimizing neurorehabilitation effectiveness, this study develops and evaluates a patient-specific synergy-controlled neuromusculoskeletal simulation framework that can predict walking motions for an individual post-stroke. The main question we addressed was whether driving a subject-specific neuromusculoskeletal model with muscle synergy controls (5 per leg) facilitates generation of accurate walking predictions compared to a model driven by muscle activation controls (35 per leg) or joint torque controls (5 per leg). To explore this question, we developed a subject-specific neuromusculoskeletal model of a single high-functioning hemiparetic subject using instrumented treadmill walking data collected at the subject’s self-selected speed of 0.5 m/s. The model included subject-specific representations of lower-body kinematic structure, foot–ground contact behavior, electromyography-driven muscle force generation, and neural control limitations and remaining capabilities. Using direct collocation optimal control and the subject-specific model, we evaluated the ability of the three control approaches to predict the subject’s walking kinematics and kinetics at two speeds (0.5 and 0.8 m/s) for which experimental data were available from the subject. We also evaluated whether synergy controls could predict a physically realistic gait period at one speed (1.1 m/s) for which no experimental data were available. All three control approaches predicted the subject’s walking kinematics and kinetics (including ground reaction forces) well for the model calibration speed of 0.5 m/s. 
However, only activation and synergy controls could predict the subject’s walking kinematics and kinetics well for the faster non-calibration speed of 0.8 m/s, with synergy controls predicting the new gait period the most accurately. When used to predict how the subject would walk at 1.1 m/s, synergy controls predicted a gait period close to that estimated from the linear relationship between gait speed and stride length. These findings suggest that our neuromusculoskeletal simulation framework may be able to bridge the gap between patient-specific muscle synergy information and resulting functional capabilities and limitations. PMID:27790612

  6. The dynamic-response characteristics of a 35 degree swept-wing airplane as determined from flight measurements

    NASA Technical Reports Server (NTRS)

    Triplett, William C; Brown, Stuart C; Smith, G Allan

    1955-01-01

    The longitudinal and lateral-directional dynamic-response characteristics of a 35 degree swept-wing fighter-type airplane determined from flight measurements are presented and compared with predictions based on theoretical studies and wind-tunnel data. Flights were made at an altitude of 35,000 feet covering the Mach number range of 0.50 to 1.04. A limited amount of lateral-directional data were also obtained at 10,000 feet. The flights consisted essentially of recording transient responses to pilot-applied pulsed motions of each of the three primary control surfaces. These transient data were converted into frequency-response form by means of the Fourier transformation and compared with predicted responses calculated from the basic equations. Experimentally determined transfer functions were used for the evaluation of the stability derivatives that have the greatest effect on the dynamic response of the airplane. The values of these derivatives, in most cases, agreed favorably with predictions over the Mach number range of the test.
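Converting transient pulse responses to frequency-response form via the Fourier transformation, as described above, amounts to dividing the output spectrum by the input spectrum at each frequency. A minimal numerical sketch; the discrete, sampled setting is an assumption (the 1955 report worked with analog flight records):

```python
import cmath

def dft(x):
    """Plain discrete Fourier transform (O(n^2), fine for short records)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def frequency_response(control_input, output, k):
    """Transfer function estimate at DFT bin k: Y(k) / U(k)."""
    return dft(output)[k] / dft(control_input)[k]
```

For a pure gain-of-2 system, an impulse input and its doubled response give a frequency response of 2 at every bin; a real airframe record would instead show gain and phase varying with frequency.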

  7. Using social media as a tool to predict syphilis.

    PubMed

    Young, Sean D; Mercer, Neil; Weiss, Robert E; Torrone, Elizabeth A; Aral, Sevgi O

    2018-04-01

    Syphilis rates have been rapidly rising in the United States. New technologies, such as social media, might be used to anticipate and prevent the spread of disease. Because social media data collection is easy and inexpensive, integration of social media data into syphilis surveillance may be a cost-effective surveillance strategy, especially in low-resource regions. People are increasingly using social media to discuss health-related issues, such as sexual risk behaviors, allowing social media to be a potential tool for public health and medical research. This study mined Twitter data to assess whether social media could be used to predict syphilis cases in 2013 based on 2012 data. We collected 2012 and 2013 county-level primary and secondary (P&S) and early latent syphilis cases reported to the Centers for Disease Control and Prevention, along with >8500 geolocated tweets in the United States that were filtered to include sexual risk-related keywords, including colloquial terms for intercourse. We assessed the relationship between syphilis-related tweets and actual case reports by county, controlling for socioeconomic indicators and prior year syphilis cases. We found a significant positive relationship between tweets and cases of P&S and early latent syphilis. This study shows that social media may be an additional tool to enhance syphilis prediction and surveillance. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Methionine Uptake and Required Radiation Dose to Control Glioblastoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iuchi, Toshihiko, E-mail: tiuchi@chiba-cc.jp; Hatano, Kazuo; Uchino, Yoshio

    Purpose: The purpose of this study was to retrospectively assess the feasibility of radiation therapy planning for glioblastoma multiforme (GBM) based on the use of methionine (MET) positron emission tomography (PET), and the correlation among MET uptake, radiation dose, and tumor control. Methods and Materials: Twenty-two patients with GBM who underwent MET-PET prior to radiation therapy were enrolled. MET uptake in 30 regions of interest (ROIs) from 22 GBMs, biologically effective doses (BEDs) for the ROIs and their ratios (MET uptake:BED) were compared in terms of whether the ROIs were controlled for >12 months. Results: MET uptake was significantly correlated with tumor control (odds ratio [OR], 10.0; P=.005); however, there was a higher level of correlation between MET uptake:BED ratio and tumor control (OR, 40.0; P<.0001). These data indicated that the required BEDs for controlling the ROIs could be predicted in terms of MET uptake; BED could be calculated as [34.0 × MET uptake] Gy from the optimal threshold of the MET uptake:BED ratio for tumor control. Conclusions: Target delineation based on MET-PET was demonstrated to be feasible for radiation therapy treatment planning. MET-PET could not only provide precise visualization of infiltrating tumor cells but also predict the required radiation doses to control target regions.
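The reported threshold implies that the required BED for a region is simply 34.0 × MET uptake. A tiny sketch of that decision rule; the function names are assumptions for illustration:

```python
def required_bed(met_uptake):
    """Required biologically effective dose (Gy) implied by the
    reported optimal MET uptake:BED threshold of 1/34.0."""
    return 34.0 * met_uptake

def predicted_control(met_uptake, delivered_bed):
    """Region predicted controlled if the delivered BED meets the
    uptake-derived requirement."""
    return delivered_bed >= required_bed(met_uptake)
```

For example, a region with a MET uptake of 2.0 would require a BED of 68 Gy, so a 70 Gy BED plan would be predicted to control it while a 60 Gy plan would not.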

  9. The Impact of Prenatal Parental Locus of Control on Children's Psychological Outcomes in Infancy and Early Childhood: A Prospective 5 Year Study

    PubMed Central

    Nowicki, Stephen; Iles-Caven, Yasmin; Gregory, Steven; Ellis, Genette; Golding, Jean

    2017-01-01

    Locus of control is one of the most widely studied concepts in the history of personality psychology. In spite of its popularity and its associations with numerous relevant outcomes, the ability of locus of control to predict future behaviors involving parenting effectiveness has been under-researched. The few studies relating parent locus of control to child outcomes are characterized by cross-sectional methodologies that focus on mothers. The present study uses a prospective methodology to compare data on mothers' and fathers' locus of control with their child's behavior outcomes from a large-scale research project, the Avon Longitudinal Study of Parents and Children (ALSPAC). Based on Rotter's Social Learning Theory published in 1954 and past empirical research, it was predicted and found that parent internality was associated with more positive child outcomes than parent externality. More specifically, when both parents were internal, their children had more positive outcomes in sleeping, eating, and tantrum behavior as compared to any other parent locus of control combination. However, external parents had a less restrictive attitude, which appeared to have a more beneficial effect on picky eating. Results confirmed how important parent locus of control is in the lives of children. Based on the findings, researchers are urged to develop interventions to change advice to parents and promote more internal locus of control among parents. PMID:28446887

  10. [Risk Prediction Using Routine Data: Development and Validation of Multivariable Models Predicting 30- and 90-day Mortality after Surgical Treatment of Colorectal Cancer].

    PubMed

    Crispin, Alexander; Strahwald, Brigitte; Cheney, Catherine; Mansmann, Ulrich

    2018-06-04

    Quality control, benchmarking, and pay for performance (P4P) require valid indicators and statistical models allowing adjustment for differences in risk profiles of the patient populations of the respective institutions. Using hospital remuneration data for measuring quality and modelling patient risks has been criticized by clinicians. Here we explore the potential of prediction models for 30- and 90-day mortality after colorectal cancer surgery based on routine data. Full census of a major statutory health insurer. Surgical departments throughout the Federal Republic of Germany. 4283 and 4124 insurants with major surgery for treatment of colorectal cancer during 2013 and 2014, respectively. Age, sex, primary and secondary diagnoses as well as tumor locations as recorded in the hospital remuneration data according to §301 SGB V. 30- and 90-day mortality. Elixhauser comorbidities, Charlson conditions, and Charlson scores were generated from the ICD-10 diagnoses. Multivariable prediction models were developed using a penalized logistic regression approach (logistic ridge regression) in a derivation set (patients treated in 2013). Calibration and discrimination of the models were assessed in an internal validation sample (patients treated in 2014) using calibration curves, Brier scores, receiver operating characteristic curves (ROC curves) and the areas under the ROC curves (AUC). 30- and 90-day mortality rates in the derivation set were 5.7% and 8.4%, respectively. The corresponding values in the validation sample were 5.9% and 8.4%. Models based on Elixhauser comorbidities exhibited the highest discriminatory power with AUC values of 0.804 (95% CI: 0.776-0.832) and 0.805 (95% CI: 0.782-0.828) for 30- and 90-day mortality. The Brier scores for these models were 0.050 (95% CI: 0.044-0.056) and 0.067 (95% CI: 0.060-0.074) and similar to the models based on Charlson conditions. 
Regardless of the model, low predicted probabilities were well calibrated, while higher predicted values tended to be overestimates. The reasonable results regarding discrimination and calibration notwithstanding, models based on hospital remuneration data may not be helpful for P4P. Routine data do not capture the wide range of quality indicators that would be more informative than mortality. As an alternative, models based on clinical registries may allow a wider, more valid perspective. © Georg Thieme Verlag KG Stuttgart · New York.
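Ridge-penalized logistic regression of the kind used for these mortality models can be sketched with plain gradient descent, and the Brier score then summarizes probabilistic accuracy. Everything below (data, learning rate, penalty strength) is an illustrative assumption, not the paper's setup:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_ridge_logistic(X, y, lam=0.1, lr=0.5, epochs=2000):
    """Gradient descent on L2-penalized logistic log-loss
    (the intercept is left unpenalized, as is conventional)."""
    n, p = len(X), len(X[0])
    w, b = [0.0] * p, 0.0
    for _ in range(epochs):
        gw, gb = [lam * wj for wj in w], 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi))) - yi
            for j in range(p):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def brier_score(probs, y):
    """Mean squared difference between predicted probability and outcome."""
    return sum((p - yi) ** 2 for p, yi in zip(probs, y)) / len(y)
```

The ridge penalty shrinks coefficients toward zero, which stabilizes estimates when many correlated comorbidity indicators enter the model; the Brier score rewards both discrimination and calibration.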

  11. Identifying cooperative transcriptional regulations using protein–protein interactions

    PubMed Central

    Nagamine, Nobuyoshi; Kawada, Yuji; Sakakibara, Yasubumi

    2005-01-01

    Cooperative transcriptional activations among multiple transcription factors (TFs) are important to understand the mechanisms of complex transcriptional regulations in eukaryotes. Previous studies have attempted to find cooperative TFs based on gene expression data with gene expression profiles as a measure of similarity of gene regulations. In this paper, we use protein–protein interaction data to infer synergistic binding of cooperative TFs. Our fundamental idea is based on the assumption that genes contributing to a similar biological process are regulated under the same control mechanism. First, the protein–protein interaction networks are used to calculate the similarity of biological processes among genes. Second, we integrate this similarity and the chromatin immunoprecipitation data to identify cooperative TFs. Our computational experiments in yeast show that predictions made by our method have successfully identified eight pairs of cooperative TFs that have literature evidence but could not be identified by the previous method. Further, 12 new possible pairs have been inferred, and we have examined their biological relevance. However, since protein–protein interaction data typically contain many false positives, we propose a method combining various biological data to increase the prediction accuracy. PMID:16126847

  12. Model-based and Model-free Machine Learning Techniques for Diagnostic Prediction and Classification of Clinical Outcomes in Parkinson's Disease.

    PubMed

    Gao, Chao; Sun, Hanbo; Wang, Tuo; Tang, Ming; Bohnen, Nicolaas I; Müller, Martijn L T M; Herman, Talia; Giladi, Nir; Kalinin, Alexandr; Spino, Cathie; Dauer, William; Hausdorff, Jeffrey M; Dinov, Ivo D

    2018-05-08

    In this study, we apply a multidisciplinary approach to investigate falls in Parkinson's disease (PD) patients using clinical, demographic and neuroimaging data from two independent initiatives (University of Michigan and Tel Aviv Sourasky Medical Center). Using machine learning techniques, we construct predictive models to discriminate fallers and non-fallers. Through controlled feature selection, we identified the most salient predictors of patient falls including gait speed, Hoehn and Yahr stage, postural instability and gait difficulty-related measurements. The model-based and model-free analytical methods we employed included logistic regression, random forests, support vector machines, and XGBoost. The reliability of the forecasts was assessed by internal statistical (5-fold) cross validation as well as by external out-of-bag validation. Four specific challenges were addressed in the study: Challenge 1, develop a protocol for harmonizing and aggregating complex, multisource, and multi-site Parkinson's disease data; Challenge 2, identify salient predictive features associated with specific clinical traits, e.g., patient falls; Challenge 3, forecast patient falls and evaluate the classification performance; and Challenge 4, predict tremor dominance (TD) vs. posture instability and gait difficulty (PIGD). Our findings suggest that, compared to other approaches, model-free machine-learning techniques provide a more reliable clinical outcome forecasting of falls in Parkinson's patients, for example, with a classification accuracy of about 70-80%.

  13. A proof-of-principle simulation for closed-loop control based on preexisting experimental thalamic DBS-enhanced instrumental learning.

    PubMed

    Wang, Ching-Fu; Yang, Shih-Hung; Lin, Sheng-Huang; Chen, Po-Chuan; Lo, Yu-Chun; Pan, Han-Chi; Lai, Hsin-Yi; Liao, Lun-De; Lin, Hui-Ching; Chen, Hsu-Yan; Huang, Wei-Chen; Huang, Wun-Jhu; Chen, You-Yin

    Deep brain stimulation (DBS) has been applied as an effective therapy for treating Parkinson's disease or essential tremor. Several open-loop DBS control strategies have been developed for clinical experiments, but they are limited by short battery life and inefficient therapy. Therefore, many closed-loop DBS control systems have been designed to tackle these problems by automatically adjusting the stimulation parameters via feedback from neural signals, which has been reported to reduce the power consumption. However, when the association between the biomarkers of the model and stimulation is unclear, it is difficult to develop an optimal control scheme for other DBS applications, such as DBS-enhanced instrumental learning. Furthermore, few studies have investigated the effect of closed-loop DBS control on cognitive function, such as instrumental skill learning, or implemented such control in simulation environments. In this paper, we proposed a proof-of-principle design for a closed-loop DBS system, cognitive-enhancing DBS (ceDBS), which enhanced skill learning based on in vivo experimental data. The ceDBS acquired local field potential (LFP) signals from the thalamic central lateral (CL) nuclei of animals through a neural signal processing system. A strong coupling of the theta oscillation (4-7 Hz) and the learning period was found in the water reward-related lever-pressing learning task. Therefore, the theta-band power ratio, the ratio of averaged theta-band power to averaged total-band (1-55 Hz) power, could be used as a physiological marker for enhancement of instrumental skill learning. The on-line extraction of the theta-band power ratio was implemented on a field-programmable gate array (FPGA). An autoregressive with exogenous inputs (ARX)-based predictor was designed to construct a CL-thalamic DBS model and forecast the future physiological marker according to the past physiological marker and applied DBS. 
The prediction could further assist the design of a closed-loop DBS controller. A DBS controller based on a fuzzy expert system was devised to automatically control DBS according to the predicted physiological marker via a set of rules. The simulated experimental results demonstrate that the ceDBS based on the closed-loop control architecture not only reduced power consumption using the predictive physiological marker, but also achieved a desired level of physiological marker through the DBS controller. Copyright © 2017 Elsevier Inc. All rights reserved.
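An ARX-based one-step-ahead predictor of the kind described above can be sketched as an ordinary least-squares fit of a first-order model y[t] = a·y[t-1] + b·u[t-1], where y is the physiological marker and u the applied DBS. The model order and all names are assumptions for illustration, not the paper's implementation:

```python
def fit_arx(y, u):
    """Least-squares fit of y[t] = a*y[t-1] + b*u[t-1] (first-order ARX),
    solving the 2x2 normal equations in closed form."""
    s_yy = s_yu = s_uu = s_y1y = s_u1y = 0.0
    for t in range(1, len(y)):
        y1, u1 = y[t - 1], u[t - 1]
        s_yy += y1 * y1
        s_yu += y1 * u1
        s_uu += u1 * u1
        s_y1y += y1 * y[t]
        s_u1y += u1 * y[t]
    det = s_yy * s_uu - s_yu * s_yu
    a = (s_y1y * s_uu - s_u1y * s_yu) / det
    b = (s_u1y * s_yy - s_y1y * s_yu) / det
    return a, b

def predict_next(a, b, y_now, u_now):
    """One-step-ahead prediction of the physiological marker."""
    return a * y_now + b * u_now
```

In a closed loop, the controller would feed a candidate stimulation u into `predict_next` and pick the setting whose predicted marker best matches the desired level.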

  14. Intelligence rules of hysteresis in the feedforward trajectory control of piezoelectrically-driven nanostagers

    NASA Astrophysics Data System (ADS)

    Bashash, Saeid; Jalili, Nader

    2007-02-01

    Piezoelectrically-driven nanostagers have limited performance in a variety of feedforward and feedback positioning applications because of their nonlinear hysteretic response to input voltage. The hysteresis phenomenon is well known for its complex and multi-path behavior. To realize the underlying physics of this phenomenon and to develop an efficient compensation strategy, the intelligence properties of hysteresis with the effects of non-local memories are discussed here. Through performing a set of experiments on a piezoelectrically-driven nanostager with a high resolution capacitive position sensor, it is shown that for the precise prediction of the hysteresis path, certain memory units are required to store the previous hysteresis trajectory data. Based on the experimental observations, a constitutive memory-based mathematical modeling framework is developed and trained for the precise prediction of the hysteresis path for arbitrarily assigned input profiles. Using the inverse hysteresis model, a feedforward control strategy is then developed and implemented on the nanostager to compensate for the ever-present nonlinearity. Experimental results demonstrate that the controller remarkably eliminates the nonlinear effect, provided sufficient memory units are chosen for the inverse model.

  15. Predicting subscriber dissatisfaction and improving retention in the wireless telecommunications industry.

    PubMed

    Mozer, M C; Wolniewicz, R; Grimes, D B; Johnson, E; Kaushansky, H

    2000-01-01

    Competition in the wireless telecommunications industry is fierce. To maintain profitability, wireless carriers must control churn, which is the loss of subscribers who switch from one carrier to another. We explore techniques from statistical machine learning to predict churn and, based on these predictions, to determine what incentives should be offered to subscribers to improve retention and maximize profitability to the carrier. The techniques include logit regression, decision trees, neural networks, and boosting. Our experiments are based on a database of nearly 47,000 U.S. domestic subscribers that includes information about their usage, billing, credit, application, and complaint history. Our experiments show that under a wide variety of assumptions concerning the cost of intervention and the retention rate resulting from intervention, using predictive techniques to identify potential churners and offering incentives can yield significant savings to a carrier. We also show the importance of a data representation crafted by domain experts. Finally, we report on a real-world test of the techniques that validates our simulation experiments.
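The cost-benefit logic of offering incentives to predicted churners can be sketched as a simple expected-value calculation. The thresholding rule and all parameter names below are illustrative assumptions, not the paper's model:

```python
def expected_savings(churn_probs, incentive_cost, retention_rate,
                     subscriber_value, threshold=0.5):
    """Net expected savings from offering incentives to predicted churners.

    For each subscriber whose predicted churn probability exceeds the
    threshold, the carrier pays the incentive cost up front; with
    probability churn_prob * retention_rate the incentive retains a
    subscriber who would otherwise have churned, saving their value.
    """
    savings = 0.0
    for p in churn_probs:
        if p > threshold:
            savings += p * retention_rate * subscriber_value - incentive_cost
    return savings
```

For example, targeting a subscriber with a 0.9 churn probability, a $10 incentive, a 50% retention rate, and a $100 subscriber value yields an expected net saving of $35; a 0.1-probability subscriber is left alone. Sweeping the threshold traces out the profitability curve under different intervention assumptions.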

  16. LMI-Based Generation of Feedback Laws for a Robust Model Predictive Control Algorithm

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Carson, John M., III

    2007-01-01

    This technical note provides a mathematical proof of Corollary 1 from the paper 'A Nonlinear Model Predictive Control Algorithm with Proven Robustness and Resolvability' that appeared in the 2006 Proceedings of the American Control Conference. The proof was omitted for brevity in the publication. The paper was based on algorithms developed for the FY2005 R&TD (Research and Technology Development) project for Small-body Guidance, Navigation, and Control [2]. The framework established by the Corollary is for a robustly stabilizing MPC (model predictive control) algorithm for uncertain nonlinear systems that guarantees the resolvability of the associated finite-horizon optimal control problem in a receding-horizon implementation. Additional details of the framework are available in the publication.

  17. Medium-range predictability of early summer sea ice thickness distribution in the East Siberian Sea based on the TOPAZ4 ice-ocean data assimilation system

    NASA Astrophysics Data System (ADS)

    Nakanowatari, Takuya; Inoue, Jun; Sato, Kazutoshi; Bertino, Laurent; Xie, Jiping; Matsueda, Mio; Yamagami, Akio; Sugimura, Takeshi; Yabuki, Hironori; Otsuka, Natsuhiko

    2018-06-01

    Accelerated retreat of Arctic Ocean summertime sea ice has focused attention on the potential use of the Northern Sea Route (NSR), for which sea ice thickness (SIT) information is crucial for safe maritime navigation. This study evaluated the medium-range (lead time below 10 days) forecast of SIT distribution in the East Siberian Sea (ESS) in early summer (June-July) based on the TOPAZ4 ice-ocean data assimilation system. A comparison of the operational model SIT data with reliable SIT estimates (hindcast, satellite and in situ data) showed that the TOPAZ4 reanalysis qualitatively reproduces the tongue-like distribution of SIT in ESS in early summer and the seasonal variations. Pattern correlation analysis of the SIT forecast data over 3 years (2014-2016) reveals that the early summer SIT distribution is accurately predicted for a lead time of up to 3 days, but that the prediction accuracy drops abruptly after the fourth day, which is related to a dynamical process controlled by synoptic-scale atmospheric fluctuations. For longer lead times (> 4 days), the thermodynamic melting process takes over, which contributes to most of the remaining prediction accuracy. In July 2014, during which an ice-blocking incident occurred, relatively thick SIT (~150 cm) was simulated over the ESS, which is consistent with the reduction in vessel speed. These results suggest that TOPAZ4 sea ice information has great potential for practical applications in summertime maritime navigation via the NSR.

  18. Predicting hydrofacies and hydraulic conductivity from direct-push data using a data-driven relevance vector machine approach: Motivations, algorithms, and application

    NASA Astrophysics Data System (ADS)

    Paradis, Daniel; Lefebvre, René; Gloaguen, Erwan; Rivera, Alfonso

    2015-01-01

    The spatial heterogeneity of hydraulic conductivity (K) exerts a major control on groundwater flow and solute transport. The heterogeneous spatial distribution of K can be imaged using indirect geophysical data as long as reliable relations exist to link geophysical data to K. This paper presents a nonparametric learning machine approach to predict aquifer K from cone penetrometer tests (CPT) coupled with a soil moisture and resistivity probe (SMR) using relevance vector machines (RVMs). The learning machine approach is demonstrated with an application to a heterogeneous unconsolidated littoral aquifer in a 12 km2 subwatershed, where relations between K and multiparameter CPT/SMR soundings appear complex. Our approach involved fuzzy clustering to define hydrofacies (HF) on the basis of CPT/SMR and K data prior to the training of RVMs for HF recognition and K prediction on the basis of CPT/SMR data alone. The learning machine was built from a colocated training data set representative of the study area that includes K data from slug tests and CPT/SMR data upscaled to a common vertical resolution of 15 cm to match the K data. After training, the predictive capabilities of the learning machine were assessed through cross validation with data withheld from the training data set and with K data from flowmeter tests not used during the training process. Results show that HF and K predictions from the learning machine are consistent with hydraulic tests. The combined use of CPT/SMR data and RVM-based learning machine proved to be powerful and efficient for the characterization of high-resolution K heterogeneity for unconsolidated aquifers.

  19. Using short-term evidence to predict six-month outcomes in clinical trials of signs and symptoms in rheumatoid arthritis.

    PubMed

    Nixon, Richard M; Bansback, Nick; Stevens, John W; Brennan, Alan; Madan, Jason

    2009-01-01

    A model is presented to generate a distribution for the probability of an ACR response at six months for a new treatment for rheumatoid arthritis given evidence from a one- or three-month clinical trial. The model is based on published evidence from 11 randomized controlled trials on existing treatments. A hierarchical logistic regression model is used to find the relationship between the proportion of patients achieving ACR20 and ACR50 at one and three months and the proportion at six months. The model is assessed by Bayesian predictive P-values that demonstrate that the model fits the data well. The model can be used to predict the number of patients with an ACR response for proposed six-month clinical trials given data from clinical trials of one or three months duration. Copyright 2008 John Wiley & Sons, Ltd.

  20. Assessing risk factors in the organic control system: evidence from inspection data in Italy.

    PubMed

    Zanoli, Raffaele; Gambelli, Danilo; Solfanelli, Francesco

    2014-12-01

    Certification is an essential feature in organic farming, and it is based on inspections to verify compliance with European Council Regulation (EC) No 834/2007. A risk-based approach to noncompliance, used by the control bodies when planning inspections, would contribute to a more efficient and cost-effective certification system. An analysis of factors that can affect the probability of noncompliance in organic farming has thus been developed. This article examines the application of zero-inflated count data models to farm-level panel data from inspection results and sanctions obtained from the Ethical and Environmental Certification Institute, one of the main control bodies in Italy. We tested many a priori hypotheses related to the risk of noncompliance. We find evidence of an important role for past noncompliant behavior in predicting future noncompliance, while farm size and the presence of livestock also have roles in an increased probability of noncompliance. We conclude by proposing that an efficient risk-based inspection system should be designed by weighting the known probability of occurrence of a given noncompliance according to the severity of its impact. © 2014 Society for Risk Analysis.
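    A zero-inflated count model of the kind applied here mixes a point mass at zero (farms that are structurally compliant) with an ordinary count process. A minimal sketch of the zero-inflated Poisson probability mass function, with hypothetical parameter values, shows why observed zeros exceed what a plain Poisson would predict:

```python
import math

def zip_pmf(k, lam, pi):
    """P(K = k) under a zero-inflated Poisson: with probability pi the
    farm is a 'structural zero' (never noncompliant); otherwise the
    noncompliance count follows Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi + (1.0 - pi) * poisson
    return (1.0 - pi) * poisson

# Hypothetical parameters: 40% structural zeros, mean count 1.2 otherwise
lam, pi = 1.2, 0.4
p_zero = zip_pmf(0, lam, pi)          # inflated zero probability
mean = (1.0 - pi) * lam               # E[K] = (1 - pi) * lam
```

Covariates such as past noncompliance or farm size would enter in practice through regressions on both lam and pi.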

  1. XplOit: An Ontology-Based Data Integration Platform Supporting the Development of Predictive Models for Personalized Medicine.

    PubMed

    Weiler, Gabriele; Schwarz, Ulf; Rauch, Jochen; Rohm, Kerstin; Lehr, Thorsten; Theobald, Stefan; Kiefer, Stephan; Götz, Katharina; Och, Katharina; Pfeifer, Nico; Handl, Lisa; Smola, Sigrun; Ihle, Matthias; Turki, Amin T; Beelen, Dietrich W; Rissland, Jürgen; Bittenbring, Jörg; Graf, Norbert

    2018-01-01

    Predictive models can support physicians in tailoring interventions and treatments to individual patients based on their predicted response and risk of disease, and in this way help put personalized medicine into practice. In allogeneic stem cell transplantation, risk assessment needs to be enhanced in order to respond to emerging viral infections and transplantation reactions. However, to develop predictive models it is necessary to harmonize and integrate large amounts of heterogeneous medical data stored in different health information systems. Driven by the demand for predictive instruments in allogeneic stem cell transplantation, we present in this paper an ontology-based platform that supports data owners and model developers in sharing and harmonizing their data for model development while respecting data privacy.

  2. Validating Ultrasound-based HIFU Lesion-size Monitoring Technique with MR Thermometry and Histology

    NASA Astrophysics Data System (ADS)

    Zhou, Shiwei; Petruzzello, John; Anand, Ajay; Sethuraman, Shriram; Azevedo, Jose

    2010-03-01

    In order to control and monitor HIFU lesions accurately and cost-effectively in real-time, we have developed an ultrasound-based therapy monitoring technique that uses acoustic radiation force to track changes in tissue mechanical properties. We validate our method with concurrent MR thermometry and histology. Comparison studies were completed on in-vitro bovine liver samples. A single-element 1.1 MHz focused transducer was used to deliver HIFU and produce acoustic radiation force (ARF). A 5 MHz single-element transducer was placed co-axially with the HIFU transducer to acquire the RF data and track the tissue displacement induced by ARF. During therapy, the monitoring procedure was interleaved with HIFU. MR thermometry (Philips Panorama 1T system) and ultrasound monitoring were performed simultaneously. The tissue temperature and thermal dose (CEM43 = 240 min) were computed from the MR thermometry data. The tissue displacement induced by the acoustic radiation force was calculated from the ultrasound RF data in real-time using a cross-correlation based method. A normalized displacement difference (NDD) parameter was developed and calibrated to estimate the lesion size. The lesion size estimated by the NDD was compared with both the MR thermometry prediction and the histology analysis. For lesions smaller than 8 mm, the NDD-based lesion monitoring technique showed performance very similar to MR thermometry. The standard deviation of the lesion size error is 0.66 mm, which is comparable to the MR thermal dose contour prediction (0.42 mm). A phased array is needed for tracking displacement in 2D and monitoring lesions larger than 8 mm. The study demonstrates the potential of our ultrasound-based technique to achieve precise HIFU lesion control through real-time monitoring. The results compare well with histology and an established technique like MR thermometry. This approach provides feedback control in real-time to terminate therapy when a pre-determined lesion size is achieved, and can be extended to 2D and implemented on commercial ultrasound scanner systems.
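    The displacement tracking step rests on finding the time shift that maximizes the cross-correlation between successive RF windows. A minimal sketch, using a synthetic Gaussian echo instead of real RF data and a brute-force integer-lag search rather than the sub-sample estimators used in practice:

```python
import math

def cross_correlation_lag(ref, post, max_lag):
    """Return the integer lag (in samples) that maximizes the
    cross-correlation between a reference RF window and a later one."""
    best_lag, best_score = 0, float("-inf")
    n = len(ref)
    for lag in range(-max_lag, max_lag + 1):
        score = sum(ref[i] * post[i + lag]
                    for i in range(n)
                    if 0 <= i + lag < len(post))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic RF line: an echo that moves 3 samples between acquisitions,
# standing in for ARF-induced tissue displacement
ref = [math.exp(-((i - 20) ** 2) / 8.0) for i in range(64)]
post = [math.exp(-((i - 23) ** 2) / 8.0) for i in range(64)]
lag = cross_correlation_lag(ref, post, max_lag=10)
```

Converting the lag to micrometres of displacement then only requires the RF sampling rate and the speed of sound in tissue.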

  3. Design of Accelerator Online Simulator Server Using Structured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Guobao; /Brookhaven; Chu, Chungming

    2012-07-06

    Model based control plays an important role for a modern accelerator during beam commissioning, beam study, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An on-line simulator server is accessed via network accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.

  4. Preliminary Exploration of Adaptive State Predictor Based Human Operator Modeling

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Gregory, Irene M.

    2012-01-01

    Control-theoretic modeling of the human operator dynamic behavior in manual control tasks has a long and rich history. In the last two decades, there has been a renewed interest in modeling the human operator. There has also been significant work on techniques used to identify the pilot model of a given structure. The purpose of this research is to attempt to go beyond pilot identification based on collected experimental data and to develop a predictor of pilot behavior. An experiment was conducted to quantify the effects of changing aircraft dynamics on an operator's ability to track a signal in order to eventually model a pilot adapting to changing aircraft dynamics. A gradient descent estimator and a least squares estimator with exponential forgetting used these data to predict pilot stick input. The results indicate that individual pilot characteristics and vehicle dynamics did not affect the accuracy of either estimator method to estimate pilot stick input. These methods also were able to predict pilot stick input during changing aircraft dynamics and they may have the capability to detect a change in a subject due to workload, engagement, etc., or the effects of changes in vehicle dynamics on the pilot.
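    A least squares estimator with exponential forgetting, one of the two estimators compared above, can be sketched as a standard recursive least squares (RLS) update in which older samples are discounted by a factor lam < 1, letting the estimate track a changing pilot. The two-parameter pilot model and the data below are hypothetical, not the experiment's:

```python
def rls_forgetting(xs, ys, n_params, lam=0.98, delta=100.0):
    """Recursive least squares with exponential forgetting.
    xs: list of regressor vectors, ys: scalar targets. Returns weights."""
    # P is the (scaled) inverse covariance, initialised large so early
    # data dominates; delta controls the initial uncertainty
    P = [[delta if i == j else 0.0 for j in range(n_params)]
         for i in range(n_params)]
    w = [0.0] * n_params
    for x, y in zip(xs, ys):
        Px = [sum(P[i][j] * x[j] for j in range(n_params)) for i in range(n_params)]
        denom = lam + sum(x[i] * Px[i] for i in range(n_params))
        k = [p / denom for p in Px]                       # gain vector
        err = y - sum(w[i] * x[i] for i in range(n_params))
        w = [w[i] + k[i] * err for i in range(n_params)]
        for i in range(n_params):
            for j in range(n_params):
                P[i][j] = (P[i][j] - k[i] * Px[j]) / lam  # discount old data
    return w

# Hypothetical pilot model: stick = 2.0*error - 0.5*error_rate
xs = [[1.0, 0.0], [0.5, 1.0], [-0.3, 0.4], [0.8, -0.2], [0.1, 0.9], [-0.6, 0.3]]
ys = [2.0 * e - 0.5 * de for e, de in xs]
w = rls_forgetting(xs, ys, n_params=2)
```

On noiseless data the weights recover the assumed gains; with lam < 1 the same loop would re-converge after a step change in the vehicle dynamics.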

  5. Modelling Hotspots for Invasive Alien Plants in India.

    PubMed

    Adhikari, Dibyendu; Tiwary, Raghuvar; Barik, Saroj Kanta

    2015-01-01

    Identification of invasion hotspots that support multiple invasive alien species (IAS) is a pre-requisite for the control and management of invasion. However, until recently it remained a methodological challenge to precisely determine such invasion hotspots. We identified the hotspots of alien species invasion in India through Ecological Niche Modelling (ENM) using species occurrence data from the Global Biodiversity Information Facility (GBIF). The predicted areas of invasion for the selected species were classified into four categories based on the number of model agreements for a region, i.e. high, medium, low and very low. About 49% of the total geographical area of India was predicted to be prone to invasion at moderate to high levels of climatic suitability. The intersection of anthropogenic biomes and ecoregions with the regions of 'high' climatic suitability was classified as a hotspot of alien plant invasion. Nineteen of the 47 ecoregions of India harboured such hotspots. Most ecologically sensitive regions of India, including the 'biodiversity hotspots' and coastal regions, coincide with invasion hotspots, indicating their vulnerability to alien plant invasion. Besides demonstrating the usefulness of ENM and open source data for IAS management, the present study provides a knowledge base for guiding the formulation of an effective policy and management strategy for controlling invasive alien species.
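    The four-category classification by model agreement can be sketched as a simple vote count over an ensemble of ENMs. The agreement thresholds below are illustrative assumptions; the abstract does not give the exact cutoffs used:

```python
def classify_agreement(votes, n_models):
    """Map the number of ENMs that predict a grid cell as climatically
    suitable onto the four agreement classes used in the study.
    Thresholds (75/50/25% of models) are hypothetical."""
    frac = votes / n_models
    if frac >= 0.75:
        return "high"
    if frac >= 0.50:
        return "medium"
    if frac >= 0.25:
        return "low"
    return "very low"

# Hypothetical grid: per-cell count of agreeing models out of 8
grid_votes = [8, 6, 3, 1, 4, 7]
classes = [classify_agreement(v, 8) for v in grid_votes]

# Candidate hotspot cells are those with 'high' agreement; the study then
# intersects these with anthropogenic biomes and ecoregions
hotspot_cells = [i for i, c in enumerate(classes) if c == "high"]
```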

  6. Estimation of CO2 reduction by parallel hard-type power hybridization for gasoline and diesel vehicles.

    PubMed

    Oh, Yunjung; Park, Junhong; Lee, Jong Tae; Seo, Jigu; Park, Sungwook

    2017-10-01

    The purpose of this study is to investigate possible improvements in ICEVs by implementing fuzzy logic-based parallel hard-type power hybrid systems. Two types of conventional ICEVs (gasoline and diesel) and two types of HEVs (gasoline-electric, diesel-electric) were generated using vehicle and powertrain simulation tools and a Matlab-Simulink application programming interface. For the gasoline and gasoline-electric HEV vehicles, the prediction accuracy of four types of LDV models was validated by comparative analysis with chassis dynamometer and OBD test data. The predicted results show strong correlation with the test data. The operating points of the internal combustion engines and electric motors are well controlled in the high-efficiency region, and battery SOC was well controlled within ±1.6%. For the diesel vehicles, however, we generated a virtual diesel-electric HEV because no vehicle was available with engine and vehicle specifications similar to the ICE vehicle. Using a fuzzy logic-based parallel hybrid system in conventional ICEVs demonstrated that HEVs showed superior performance in terms of fuel consumption and CO2 emission in most driving modes. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Geochemical modeling of reactions and partitioning of trace metals and radionuclides during titration of contaminated acidic sediments.

    PubMed

    Zhang, Fan; Luo, Wensui; Parker, Jack C; Spalding, Brian P; Brooks, Scott C; Watson, David B; Jardine, Philip M; Gu, Baohua

    2008-11-01

    Many geochemical reactions that control aqueous metal concentrations are directly affected by solution pH. However, changes in solution pH are strongly buffered by various aqueous phase and solid phase precipitation/dissolution and adsorption/desorption reactions. The ability to predict acid-base behavior of the soil-solution system is thus critical to predict metal transport under variable pH conditions. This study was undertaken to develop a practical generic geochemical modeling approach to predict aqueous and solid phase concentrations of metals and anions during conditions of acid or base additions. The method of Spalding and Spalding was utilized to model soil buffer capacity and pH-dependent cation exchange capacity by treating aquifer solids as a polyprotic acid. To simulate the dynamic and pH-dependent anion exchange capacity, the aquifer solids were simultaneously treated as a polyprotic base controlled by mineral precipitation/dissolution reactions. An equilibrium reaction model that describes aqueous complexation, precipitation, sorption and soil buffering with pH-dependent ion exchange was developed using HydroGeoChem v5.0 (HGC5). Comparison of model results with experimental titration data of pH, Al, Ca, Mg, Sr, Mn, Ni, Co, and SO4(2-) for contaminated sediments indicated close agreement, suggesting that the model could potentially be used to predict the acid-base behavior of the sediment-solution system under variable pH conditions.

  8. Predictive sufficiency and the use of stored internal state

    NASA Technical Reports Server (NTRS)

    Musliner, David J.; Durfee, Edmund H.; Shin, Kang G.

    1994-01-01

    In all embedded computing systems, some delay exists between sensing and acting. By choosing an action based on sensed data, a system is essentially predicting that there will be no significant changes in the world during this delay. However, the dynamic and uncertain nature of the real world can make these predictions incorrect, and thus, a system may execute inappropriate actions. Making systems more reactive by decreasing the gap between sensing and action leaves less time for predictions to err, but still provides no principled assurance that they will be correct. Using the concept of predictive sufficiency described in this paper, a system can prove that its predictions are valid, and that it will never execute inappropriate actions. In the context of our CIRCA system, we also show how predictive sufficiency allows a system to guarantee worst-case response times to changes in its environment. Using predictive sufficiency, CIRCA is able to build real-time reactive control plans which provide a sound basis for performance guarantees that are unavailable with other reactive systems.

  9. Epidemiology of measles in Southwest Nigeria: an analysis of measles case-based surveillance data from 2007 to 2012.

    PubMed

    Fatiregun, Akinola A; Adebowale, Ayodeji S; Fagbamigbe, Adeniyi F

    2014-03-01

    In Nigeria, a system of measles case-based surveillance with laboratory confirmation of suspected cases was introduced in 2005 as one of the strategies for the control of measles morbidity and mortality. In this report, we provide an epidemiological distribution of confirmed cases of measles reported from the southwest of the country between 2007 and 2012, and predict the expected number of cases for the ensuing years. A descriptive analysis of persons and place and time of confirmed measles cases (laboratory and epidemiological link) reported in the case-based surveillance data was carried out. Using an additive time series model, we predicted the expected number of cases to the year 2015, assuming that current interventional efforts were sustained. From the 10 187 suspected cases investigated during the time period, 1631 (16.0%) cases of measles were confirmed. The annual incidence rose from <1 case per million in 2007 to 23 cases per million in 2011. Cases were confirmed from all six states within the zone and most (97.4%) were in individuals aged less than 20 years. Seasonal variation existed with peaks of infection in the first and second quarters of the year. There was an increasing trend in the number of expected cases based on projections. Case-based surveillance provided an insight into understanding the epidemiology of measles infection in Southwest Nigeria. There is a need to work out alternate strategies for control of measles and to strengthen the surveillance system.
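    An additive time series model of the kind used for the projection can be sketched as seasonal offsets plus a linear trend fitted to the deseasonalized series. The quarterly counts below are hypothetical, chosen only to mimic the reported first/second-quarter peaks and rising trend:

```python
def additive_forecast(counts, period, horizon):
    """Simple additive model: seasonal offsets (per-season mean minus the
    overall mean), then a least-squares linear trend on the
    deseasonalized series; forecast = trend + seasonal offset."""
    n = len(counts)
    overall = sum(counts) / n
    seasonal = [sum(counts[s::period]) / len(counts[s::period]) - overall
                for s in range(period)]
    deseason = [y - seasonal[t % period] for t, y in enumerate(counts)]
    mx = (n - 1) / 2.0
    b = (sum((t - mx) * y for t, y in enumerate(deseason))
         / sum((t - mx) ** 2 for t in range(n)))
    a = overall - b * mx
    return [a + b * t + seasonal[t % period] for t in range(n, n + horizon)]

# Hypothetical quarterly case counts over three years, peaking in Q1/Q2
counts = [40, 35, 10, 8, 48, 42, 14, 11, 55, 50, 18, 15]
next_year = additive_forecast(counts, period=4, horizon=4)
```

With a rising trend and a fixed seasonal pattern, the forecast reproduces both the increasing totals and the first-half-of-year peaks described in the abstract.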

  10. Predicting Positive Education Outcomes for Emerging Adults in Mental Health Systems of Care.

    PubMed

    Brennan, Eileen M; Nygren, Peggy; Stephens, Robert L; Croskey, Adrienne

    2016-10-01

    Emerging adults who receive services based on positive youth development models have shown an ability to shape their own life course to achieve positive goals. This paper reports a secondary data analysis from the Longitudinal Child and Family Outcome Study including 248 culturally diverse youth ages 17 through 22 receiving mental health services in systems of care. After 12 months of services, school performance was positively related to youth ratings of school functioning and to service participation and satisfaction. Regression analysis revealed that young people's ratings of their school functioning and their experience in services added significantly to the prediction of satisfactory school performance, even controlling for sex and attendance. Finally, in addition to expected predictors, participation in planning their own services significantly predicted enrollment in higher education for those who finished high school. Findings suggest that programs and practices based on positive youth development approaches can improve educational outcomes for emerging adults.

  11. A survey-based study of factors that motivate nurses to protect the privacy of electronic medical records.

    PubMed

    Ma, Chen-Chung; Kuo, Kuang-Ming; Alexander, Judith W

    2016-02-02

    The purpose of this study is to investigate factors that motivate nurses to protect the privacy of electronic medical records, based on the Decomposed Theory of Planned Behavior. This cross-sectional study used questionnaires to collect data from nurses in a large tertiary care military hospital in Taiwan. The 302 valid questionnaires returned represented a response rate of 63.7%. Structural equation modeling identified that the attitude, subjective norm, and perceived behavioral control of the nurses significantly predicted their intention to protect the privacy of electronic medical records. Further, perceived usefulness and compatibility, peer and superior influence, and self-efficacy and facilitating conditions, respectively, predicted these three factors. The results of our study may provide valuable information for education and practice in predicting nurses' intention to protect the privacy of electronic medical records.

  12. Prediction of quantitative intrathoracic fluid volume to diagnose pulmonary oedema using LabVIEW.

    PubMed

    Urooj, Shabana; Khan, M; Ansari, A Q; Lay-Ekuakille, Aimé; Salhan, Ashok K

    2012-01-01

    Pulmonary oedema is a life-threatening disease that requires special attention in the area of research and clinical diagnosis. Computer-based techniques are rarely used to quantify the intrathoracic fluid volume (IFV) for diagnostic purposes. This paper discusses a software program developed to detect and diagnose pulmonary oedema using LabVIEW. The software runs on anthropometric dimensions and physiological parameters, mainly transthoracic electrical impedance (TEI). This technique is accurate and faster than existing manual techniques. The LabVIEW software was used to compute the parameters required to quantify IFV. An equation relating per cent control and IFV was obtained. The results of predicted TEI and measured TEI were compared with previously reported data to validate the developed program. It was found that the predicted values of TEI obtained from the computer-based technique were much closer to the measured values of TEI. Six new subjects were enrolled to measure and predict transthoracic impedance and hence to quantify IFV. A similar difference was also observed in the measured and predicted values of TEI for the new subjects.

  13. EMG-Torque correction on Human Upper extremity using Evolutionary Computation

    NASA Astrophysics Data System (ADS)

    JL, Veronica; Parasuraman, S.; Khan, M. K. A. Ahamed; Jeba DSingh, Kingsly

    2016-09-01

    There have been many studies indicating that the control system of a rehabilitative robot plays an important role in determining the outcome of the therapy process. Existing works have predicted the feedback signal in the controller based on the kinematics parameters and EMG readings of the upper limb's skeletal system. A kinematics- and kinetics-based control signal system is developed by reading the output of sensors such as a position sensor, an orientation sensor and an F/T (Force/Torque) sensor, and their readings are compared with the preceding measurement to decide on the amount of assistive force. There are also other works that incorporated the kinematics parameters to calculate the kinetics parameters via formulation and pre-defined assumptions. Nevertheless, these types of control signals analyze the movement of the upper limb only based on the movement of the upper joints; they do not anticipate the possibility of muscle plasticity. The focus of this paper is to make use of the kinematics parameters and EMG readings of the skeletal system to predict the individual torques of the upper extremity's joints. The surface EMG signals are fed into different mathematical models so that these data can be trained through a Genetic Algorithm (GA) to find the best correlation between EMG signals and the torques acting on the upper limb's joints. The estimated torque attained from the mathematical models is called the simulated output. The simulated output is then compared with the actual individual joint torque, which is calculated based on the real-time kinematics parameters of the upper movement of the skeleton when the muscle cells are activated. The findings from this contribution are extended into the development of an active control signal based controller for a rehabilitation robot.
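    The GA training loop described above can be sketched as follows. The quadratic EMG-to-torque model, the averaging crossover, and the Gaussian mutation are illustrative assumptions; the paper's actual mathematical models and GA operators are not specified in the abstract:

```python
import random

def fitness(params, emg, torque):
    """Negative squared error of a hypothetical quadratic EMG-to-torque
    model: tau = a*emg + b*emg**2 (higher is better)."""
    a, b = params
    return -sum((t - (a * e + b * e * e)) ** 2 for e, t in zip(emg, torque))

def evolve(emg, torque, pop_size=40, gens=60, seed=1):
    """Elitist GA: keep the best half, refill with averaged, mutated children."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5), rng.uniform(-5, 5)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: fitness(p, emg, torque), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            # averaging crossover plus Gaussian mutation
            children.append([(x + y) / 2 + rng.gauss(0, 0.1)
                             for x, y in zip(p1, p2)])
        pop = parents + children
    return max(pop, key=lambda p: fitness(p, emg, torque))

# Synthetic training data from a known model: tau = 2*emg + 0.5*emg^2
emg = [0.1 * i for i in range(11)]
torque = [2.0 * e + 0.5 * e * e for e in emg]
best = evolve(emg, torque)
```

The evolved parameters reproduce the synthetic EMG-torque relation; real data would substitute measured surface EMG features and joint torques computed from the kinematics.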

  14. Analysis of the irradiation data for A302B and A533B correlation monitor materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J.A.

    1996-04-01

    The results of Charpy V-notch impact tests for A302B and A533B-1 Correlation Monitor Materials (CMM) listed in the surveillance power reactor data base (PR-EDB) and material test reactor data base (TR-EDB) are analyzed. The shift of the transition temperature at 30 ft-lb (T{sub 30}) is considered as the primary measure of radiation embrittlement in this report. The hyperbolic tangent fitting model and the uncertainty of the fitting parameters for Charpy impact tests are presented in this report. For the surveillance CMM data, the transition temperature shifts at 30 ft-lb ({Delta}T{sub 30}) generally follow the predictions provided by Revision 2 of Regulatory Guide 1.99 (R.G. 1.99). Differences in capsule temperatures are a likely explanation for large deviations from R.G. 1.99 predictions. Deviations from the R.G. 1.99 predictions are correlated to similar deviations for the accompanying materials in the same capsules, but large random fluctuations prevent precise quantitative determination. Significant scatter is noted in the surveillance data, some of which may be attributed to variations from one specimen set to another, or inherent in Charpy V-notch testing. The major contributions to the uncertainty of the R.G. 1.99 prediction model and the overall data scatter are from mechanical test results, chemical analysis, irradiation environments, fluence evaluation, and inhomogeneous material properties. Thus, in order to improve the prediction model, control of the above-mentioned error sources needs to be improved. In general the embrittlement behavior of both the A302B and A533B-1 plate materials is similar. There is evidence for a fluence-rate effect in the CMM data irradiated in test reactors; thus its implication for power reactor surveillance programs deserves special attention.
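    The hyperbolic tangent fitting model and the T{sub 30} index can be sketched directly: the Charpy energy curve is E(T) = A + B*tanh((T - T0)/C), and T30 is the temperature at which the fitted curve crosses 30 ft-lb. The fit parameters below are hypothetical; the shift {Delta}T30 falls out as the difference between the irradiated and unirradiated crossings:

```python
import math

def charpy_tanh(T, A, B, T0, C):
    """Hyperbolic tangent model for Charpy energy vs. temperature:
    E(T) = A + B*tanh((T - T0)/C)."""
    return A + B * math.tanh((T - T0) / C)

def t30(A, B, T0, C, lo=-200.0, hi=400.0):
    """Temperature at which the fitted curve crosses 30 ft-lb (bisection;
    the curve is monotone in T, so the crossing is unique)."""
    f = lambda T: charpy_tanh(T, A, B, T0, C) - 30.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

# Hypothetical unirradiated (T0 = 20) and irradiated (T0 = 80) fits;
# the embrittlement measure is the shift of the 30 ft-lb crossing
shift = t30(60.0, 50.0, 80.0, 60.0) - t30(60.0, 50.0, 20.0, 60.0)
```

With identical A, B and C, the shift equals the difference in T0; in practice each capsule's data set gets its own fit and its own parameter uncertainty.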

  15. Recent tests of the equilibrium-point hypothesis (lambda model).

    PubMed

    Feldman, A G; Ostry, D J; Levin, M F; Gribble, P L; Mitnitski, A B

    1998-07-01

    The lambda model of the equilibrium-point hypothesis (Feldman & Levin, 1995) is an approach to motor control which, like physics, is based on a logical system coordinating empirical data. The model has gone through an interesting period. On one hand, several nontrivial predictions of the model have been successfully verified in recent studies. In addition, the explanatory and predictive capacity of the model has been enhanced by its extension to multimuscle and multijoint systems. On the other hand, claims have recently appeared suggesting that the model should be abandoned. The present paper focuses on these claims and concludes that they are unfounded. Much of the experimental data that have been used to reject the model are actually consistent with it.

  16. Predicting caregiver burden in general veterinary clients: Contribution of companion animal clinical signs and problem behaviors.

    PubMed

    Spitznagel, M B; Jacobson, D M; Cox, M D; Carlson, M D

    2018-06-01

    Caregiver burden, found in many clients with a chronically or terminally ill companion animal, has been linked to poorer psychosocial function in the client and greater utilization of non-billable veterinary services. To reduce client caregiver burden, its determinants must first be identified. This study examined if companion animal clinical signs and problem behaviors predict veterinary client burden within broader client- and patient-based risk factor models. Data were collected in two phases. Phase 1 included 238 companion animal owners, including those with a sick companion animal (n=119) and matched healthy controls (n=119) recruited online. Phase 2 was comprised of 602 small animal general veterinary hospital clients (n=95 with a sick dog or cat). Participants completed cross-sectional online assessments of caregiver burden, psychosocial resources (social support, active coping, self-mastery), and an item pool of companion animal clinical signs and problem behaviors. Several signs/behaviors correlated with burden, most prominently: weakness, appearing sad/depressed or anxious, appearing to have pain/discomfort, change in personality, frequent urination, and excessive sleeping/lethargy. Within patient-based risk factors, caregiver burden was predicted by frequency of the companion animal's signs/behaviors (P<.01). Within client-based factors, potentially modifiable factors of client reaction to the animal's signs/behaviors (P=.01), and client sense of control (P<.04) predicted burden. Understanding burden may enhance veterinarian-client communication, and is important due to potential downstream effects of client burden, such as higher workload for the veterinarian. Supporting the client's sense of control may help alleviate burden when amelioration of the companion animal's presentation is not feasible. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Prediction of sonic boom from experimental near-field overpressure data. Volume 2: Data base construction

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Reiners, S. J.; Hague, D. S.

    1975-01-01

    A computerized method for storing, updating and augmenting experimentally determined overpressure signatures has been developed. A data base of pressure signatures for a shuttle type vehicle has been stored. The data base has been used for the prediction of sonic boom with the program described in Volume I.

  18. A web-based information system for management and analysis of patient data after refractive eye surgery.

    PubMed

    Zuberbuhler, Bruno; Galloway, Peter; Reddy, Aravind; Saldana, Manuel; Gale, Richard

    2007-12-01

    The aim was to develop a software tool for refractive surgeons using a standard, user-friendly web-based interface, providing the user with a secure environment to protect large volumes of patient data. The software application was named "Internet-based refractive analysis" (IBRA) and was programmed in PHP, HTML and JavaScript, attached to the open-source MySQL database. IBRA facilitated internationally accepted presentation methods, including the stability chart, the predictability chart and the safety chart; it was able to perform vector analysis for the course of a single patient or for group data. With the integrated nomogram calculation, treatment could be customised to reduce the postoperative refractive error. Multicenter functions permitted quality-control comparisons between different surgeons and laser units.

  19. Motivators and hygiene factors among physicians responding to explicit incentives to improve the value of care.

    PubMed

    Waddimba, Anthony C; Burgess, James F; Young, Gary J; Beckman, Howard B; Meterko, Mark

    2013-01-01

    Physician dissatisfaction is reported to be increasing, especially in primary care. The transition from fee-for-service to outcome-based reimbursement may make matters worse. To investigate the influence of provider attitudes and practice settings on job satisfaction/dissatisfaction during the transition to quality-based payment models, we assessed self-reported satisfaction/dissatisfaction with practice in a Rochester (New York)-area physician practice association in the process of implementing pay-for-performance. We linked cross-sectional data for 215 survey respondents on satisfaction ratings and behavioral attitudes with medical record data on their clinical behavior and practices, and census data on their catchment population. Factors associated with the odds of being satisfied or dissatisfied were determined via predictive multivariable logistic regression modeling. Dissatisfied physicians were more likely to have larger-than-average patient panels, lower autonomy and/or control, and beliefs that quality incentives were hindering patient care. Satisfied physicians were more likely to have a higher sense of autonomy and control, smaller patient volumes, and a less complex patient mix. Efforts to maintain or improve satisfaction among physicians should focus on encouraging professional autonomy during transitions from volume-based to quality/outcomes-based payment systems. An optimum balance between accountability and autonomy/control might maximize both health care quality and job satisfaction.

  20. Basic numerical competences in large-scale assessment data: Structure and long-term relevance.

    PubMed

    Hirsch, Stefa; Lambert, Katharina; Coppens, Karien; Moeller, Korbinian

    2018-03-01

    Basic numerical competences are seen as building blocks for later numerical and mathematical achievement. The current study aimed at investigating the structure of early numeracy reflected by different basic numerical competences in kindergarten and its predictive value for mathematical achievement 6 years later using data from large-scale assessment. This allowed analyses based on considerably large sample sizes (N > 1700). A confirmatory factor analysis indicated that a model differentiating five basic numerical competences at the end of kindergarten fitted the data better than a one-factor model of early numeracy representing a comprehensive number sense. In addition, these basic numerical competences were observed to reliably predict performance in a curricular mathematics test in Grade 6 even after controlling for influences of general cognitive ability. Thus, our results indicated a differentiated view on early numeracy considering basic numerical competences in kindergarten reflected in large-scale assessment data. Consideration of different basic numerical competences allows for evaluating their specific predictive value for later mathematical achievement but also mathematical learning difficulties. Copyright © 2017 Elsevier Inc. All rights reserved.
