Integrated Wind Power Planning Tool
NASA Astrophysics Data System (ADS)
Rosgaard, Martin; Giebel, Gregor; Skov Nielsen, Torben; Hahmann, Andrea; Sørensen, Poul; Madsen, Henrik
2013-04-01
This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the title "Integrated Wind Power Planning Tool". The goal is to integrate a mesoscale numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long term power system planning for future wind farms as well as short term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. Using the state-of-the-art mesoscale Weather Research & Forecasting (WRF) model, we seek to quantify the forecast error as a function of the time scale involved. This task constitutes a preparative study for the later implementation of features accounting for NWP forecast errors in the Corwind code maintained by DTU Wind Energy - a long term wind power planning tool. Within the framework of PSO 10464, research related to operational short term wind power prediction will be carried out, including a comparison of forecast quality at different mesoscale NWP model resolutions and the development of a statistical wind power prediction tool taking input from WRF. The short term prediction part of the project is carried out in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. The integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies and for the prevailing weather conditions defined by the WRF output. The output from the integrated short term prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision making problem to be solved. The high resolution of the WRF results loaded into the integrated prediction model will ensure that a high accuracy data basis is available for use in the decision making process of the Danish transmission system operator. The need for high accuracy predictions will only increase over the next decade as Denmark approaches the goal of 50% wind power based electricity in 2025, from the current 20%.
Lee, Ciaran M; Davis, Timothy H; Bao, Gang
2018-04-01
What is the topic of this review? In this review, we analyse the performance of recently described tools for CRISPR/Cas9 guide RNA design, in particular, design tools that predict CRISPR/Cas9 activity. What advances does it highlight? Recently, many tools designed to predict CRISPR/Cas9 activity have been reported. However, the majority of these tools lack experimental validation. Our analyses indicate that these tools have poor predictive power. Our preliminary results suggest that target site accessibility should be considered in order to develop better guide RNA design tools with improved predictive power. The recent adaptation of the clustered regulatory interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9) system for targeted genome engineering has led to its widespread application in many fields worldwide. In order to gain a better understanding of the design rules of CRISPR/Cas9 systems, several groups have carried out large library-based screens leading to some insight into sequence preferences among highly active target sites. To facilitate CRISPR/Cas9 design, these studies have spawned a plethora of guide RNA (gRNA) design tools with algorithms based solely on direct or indirect sequence features. Here, we demonstrate that the predictive power of these tools is poor, suggesting that sequence features alone cannot accurately inform the cutting efficiency of a particular CRISPR/Cas9 gRNA design. Furthermore, we demonstrate that DNA target site accessibility influences the activity of CRISPR/Cas9. With further optimization, we hypothesize that it will be possible to increase the predictive power of gRNA design tools by including both sequence and target site accessibility metrics. © 2017 The Authors. Experimental Physiology © 2017 The Physiological Society.
Integrated Wind Power Planning Tool
NASA Astrophysics Data System (ADS)
Rosgaard, M. H.; Giebel, G.; Nielsen, T. S.; Hahmann, A.; Sørensen, P.; Madsen, H.
2012-04-01
This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the working title "Integrated Wind Power Planning Tool". The project commenced October 1, 2011, and the goal is to integrate a numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long term power system planning for future wind farms as well as short term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. With regard to the latter, one such simulation tool has been developed at the Wind Energy Division, Risø DTU, intended for long term power system planning. As part of the PSO project, the inferior NWP model used at present will be replaced by the state-of-the-art Weather Research & Forecasting (WRF) model. Furthermore, the integrated simulation tool will be improved so that it can simultaneously handle 10-50 times more turbines than the present ~300, and additional atmospheric parameters will be included in the model. The WRF data will also serve as input for a statistical short term prediction model to be developed in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. This integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies and for the prevailing weather conditions defined by the WRF output. The output from the integrated prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision making problem to be solved. The high resolution of the WRF results loaded into the integrated prediction model will ensure that a high accuracy data basis is available for use in the decision making process of the Danish transmission system operator, and the need for high accuracy predictions will only increase over the next decade as Denmark approaches the goal of 50% wind power based electricity in 2020, from the current 20%.
Power hand tool kinetics associated with upper limb injuries in an automobile assembly plant.
Ku, Chia-Hua; Radwin, Robert G; Karsh, Ben-Tzion
2007-06-01
This study investigated the relationship between pneumatic nutrunner handle reactions, workstation characteristics, and the prevalence of upper limb injuries in an automobile assembly plant. Tool properties (geometry, inertial properties, and motor characteristics), fastener properties, orientation relative to the fastener, and the position of the tool operator (horizontal and vertical distances) were measured for 69 workstations using 15 different pneumatic nutrunners. Handle reaction response was predicted using a deterministic mechanical model of the human operator and tool, previously developed in our laboratory, specific to the measured tool, workstation, and job factors. Handle force was a function of target torque, tool geometry and inertial properties, motor speed, work orientation, and joint hardness. The study found that tool target torque was not well correlated with predicted handle reaction force (r=0.495) or displacement (r=0.285). The individual tool, tool shape, and threaded fastener joint hardness all affected predicted forces and displacements (p<0.05). The average peak handle force and displacement for right-angle tools were twice as great as for pistol grip tools. Soft-threaded fastener joints had the greatest average handle forces and displacements. Upper limb injury cases were identified using the plant OSHA 200 log and personnel records. Predicted handle forces for jobs where injuries were reported were significantly greater than for jobs free of injuries (p<0.05), whereas target torque and predicted handle displacement did not show statistically significant differences. The study concluded that quantification of handle reaction force, rather than target torque alone, is necessary for identifying stressful power hand tool operations and for controlling exposure to forces in manufacturing jobs involving power nutrunners. Therefore, a combination of tool, workstation, and task requirements should be considered.
Signal Detection Theory as a Tool for Successful Student Selection
ERIC Educational Resources Information Center
van Ooijen-van der Linden, Linda; van der Smagt, Maarten J.; Woertman, Liesbeth; te Pas, Susan F.
2017-01-01
Prediction accuracy of academic achievement for admission purposes requires adequate "sensitivity" and "specificity" of admission tools, yet the available information on the validity and predictive power of admission tools is largely based on studies using correlational and regression statistics. The goal of this study was to…
Forces associated with pneumatic power screwdriver operation: statics and dynamics.
Lin, Jia-Hua; Radwin, Robert G; Fronczak, Frank J; Richard, Terry G
2003-10-10
The statics and dynamics of pneumatic power screwdriver operation were investigated in the context of predicting forces acting against the human operator. A static force model is described in the paper, based on tool geometry, mass, orientation in space, feed force, torque build-up, and stall torque. Three common power hand tool shapes are considered, including pistol grip, right angle, and in-line. The static model estimates the handle force needed to support a power nutrunner when it acts against the tightened fastener with a constant torque. A system of equations for the static force and moment equilibrium conditions is established, and the resultant handle force (resolved in orthogonal directions) is calculated in matrix form. A dynamic model is formulated to describe pneumatic motor torque build-up characteristics dependent on threaded fastener joint hardness. Six pneumatic tools were tested to validate the deterministic model. The average torque prediction error was 6.6% (SD = 5.4%) and the average handle force prediction error was 6.7% (SD = 6.4%) for a medium-soft threaded fastener joint. The average torque prediction error was 5.2% (SD = 5.3%) and the average handle force prediction error was 3.6% (SD = 3.2%) for a hard threaded fastener joint. Use of these equations for estimating handle forces based on passive mechanical elements representing the human operator is also described. These models together should be useful for considering tool handle force in the selection and design of power screwdrivers, particularly for minimizing handle forces in the prevention of injuries and work-related musculoskeletal disorders.
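The static solution step lends itself to a compact numerical illustration. Below is a minimal planar sketch, not the authors' validated model: it solves the force and moment equilibrium for a pistol-grip tool in matrix form with NumPy, and every mass, offset, and load value is an illustrative assumption.

```python
# Minimal planar statics sketch (illustrative values, not the paper's model):
# solve for the handle reaction force and moment that hold a pistol-grip
# nutrunner in equilibrium under gravity, feed force, and reaction torque.
import numpy as np

def cross2(a, b):
    """z-component of the 2-D cross product."""
    return a[0] * b[1] - a[1] * b[0]

m = 1.8                            # tool mass, kg (assumed)
g = 9.81
T_react = 25.0                     # fastener reaction torque, N*m (assumed)
F_feed = 40.0                      # feed force along the spindle axis, N (assumed)
r_cg = np.array([0.05, 0.10])      # handle-to-centre-of-mass offset, m (assumed)
r_sp = np.array([0.00, 0.20])      # handle-to-spindle offset, m (assumed)

W = np.array([0.0, -m * g])        # weight vector
F_ext = np.array([-F_feed, 0.0])   # reaction to the feed force at the spindle

# Unknowns x = [Fhx, Fhy, Mh]; equilibrium gives A x = b.
A = np.eye(3)
b = -np.array([
    W[0] + F_ext[0],                                      # sum Fx = 0
    W[1] + F_ext[1],                                      # sum Fy = 0
    cross2(r_cg, W) + cross2(r_sp, F_ext) + T_react,      # sum M (handle) = 0
])
Fhx, Fhy, Mh = np.linalg.solve(A, b)
print(f"handle force = ({Fhx:.1f}, {Fhy:.1f}) N, handle moment = {Mh:.2f} N*m")
```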
Introducing AC Inductive Reactance with a Power Tool
ERIC Educational Resources Information Center
Bryant, Wesley; Baker, Blane
2016-01-01
The concept of reactance in AC electrical circuits is often non-intuitive and difficult for students to grasp. In order to address this lack of conceptual understanding, classroom exercises compare the predicted resistance of a power tool, based on electrical specifications, to measured resistance. Once students discover that measured resistance…
Thermomechanical modelling of laser surface glazing for H13 tool steel
NASA Astrophysics Data System (ADS)
Kabir, I. R.; Yin, D.; Tamanna, N.; Naher, S.
2018-03-01
A two-dimensional thermomechanical finite element (FE) model of laser surface glazing (LSG) has been developed for H13 tool steel. The direct coupling technique of ANSYS 17.2 (APDL) has been utilised to solve the transient thermomechanical process. A cylindrical cross-section of H13 tool steel has been modelled for laser powers of 200 W and 300 W at a constant 0.2 mm beam width and 0.15 ms residence time. The model can predict the temperature distribution and the stress-strain increments in the elastic and plastic regions over time and space. The tendency towards crack formation can also be assessed by analysing the von Mises stress in the heat-concentrated zone. Isotropic and kinematic hardening models have been applied separately to predict the after-yield behaviour. At 200 W laser power, the peak surface temperature reached is 1520 K, which is below the melting point (1727 K) of H13 tool steel. For 300 W laser power, the peak surface temperature is 2523 K. Tensile residual stresses have been found on the surface after cooling, in agreement with the literature. The isotropic model shows a higher residual stress that increases with laser power. Conversely, the kinematic model gives a lower residual stress that decreases with laser power. Therefore, both plasticity models could work in LSG for H13 tool steel.
Introducing AC inductive reactance with a power tool
NASA Astrophysics Data System (ADS)
Bryant, Wesley; Baker, Blane
2016-09-01
The concept of reactance in AC electrical circuits is often non-intuitive and difficult for students to grasp. In order to address this lack of conceptual understanding, classroom exercises compare the predicted resistance of a power tool, based on electrical specifications, to measured resistance. Once students discover that measured resistance is smaller than expected, they are asked to explain these observations using previously studied principles of magnetic induction. Exercises also introduce the notion of inductive reactance and impedance in AC circuits and, ultimately, determine self-inductance of the motor windings within the power tool.
SAVANT: Solar Array Verification and Analysis Tool Demonstrated
NASA Technical Reports Server (NTRS)
Chock, Ricaurte
2000-01-01
The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force manufacturers to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.
Nemes, Szilard; Rolfson, Ola; Garellick, Göran
2018-02-01
Clinicians considering improvements in health-related quality of life (HRQoL) after total hip replacement (THR) must account for multiple pieces of information. Evidence-based decisions are important to best assess the effect of THR on HRQoL. This work aims at constructing a shared decision-making tool that helps clinicians assess the future benefits of THR by offering predictions of the 1-year postoperative HRQoL of THR patients. We used data from the Swedish Hip Arthroplasty Register. Data from 2008 were used as the training set and data from 2009 to 2012 as the validation set. We adopted two approaches. First, we assumed a continuous distribution for the EQ-5D index and modelled the postoperative EQ-5D index with regression models. Second, we modelled the five dimensions of the EQ-5D and weighted the predictions together using the UK Time Trade-Off value set. As predictors, we used the preoperative EQ-5D dimensions and EQ-5D index, EQ visual analogue scale, visual analogue scale pain, Charnley classification, age, gender, body mass index, American Society of Anesthesiologists class, surgical approach and prosthesis type. Additionally, the tested algorithms were combined into a single predictive tool by stacking. The best predictive power was obtained by multivariate adaptive regression splines (R² = 0.158). However, this was not significantly better than the predictive power of linear regressions (R² = 0.157). The stacked model had a predictive power of 17%. Successful implementation of a shared decision-making tool that can aid clinicians and patients in understanding expected improvement in HRQoL following THR would require higher predictive power than we achieved. For a shared decision-making tool to succeed, further variables, such as socioeconomics, need to be considered. © 2016 John Wiley & Sons, Ltd.
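The stacking step can be made concrete with a small sketch, not the registry study's code: several base regressors are combined by a meta-learner, in the same spirit as the stacked model above. The feature set and data are synthetic stand-ins.

```python
# Hedged sketch of stacking several predictors of a postoperative outcome.
# Features are random stand-ins for preop EQ-5D, VAS pain, age, BMI, etc.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.ensemble import StackingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 5))
y = 0.3 * X[:, 0] - 0.1 * X[:, 1] + 0.05 * rng.normal(size=n) + 0.6

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
stack = StackingRegressor(
    estimators=[("ols", LinearRegression()),
                ("rf", RandomForestRegressor(n_estimators=200, random_state=0))],
    final_estimator=Ridge(),       # meta-learner weights the base predictions
)
stack.fit(X_tr, y_tr)
print(f"held-out R^2 = {stack.score(X_te, y_te):.3f}")
```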
Online Analysis of Wind and Solar Part I: Ramping Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.
2012-01-31
To facilitate wider penetration of renewable resources without compromising system reliability, a concern arising from the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.
Univariate Time Series Prediction of Solar Power Using a Hybrid Wavelet-ARMA-NARX Prediction Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nazaripouya, Hamidreza; Wang, Yubo; Chu, Chi-Cheng
This paper proposes a new hybrid method for super short-term solar power prediction. Solar output power usually has complex, nonstationary, and nonlinear characteristics due to the intermittent and time-varying behavior of solar radiance. In addition, solar power dynamics are fast and essentially inertia-free. Accurate super short-term prediction is required to compensate for the fluctuations and reduce the impact of solar power penetration on the power system. The objective is to predict one-step-ahead solar power generation based only on historical solar power time series data. The proposed method incorporates discrete wavelet transform (DWT), Auto-Regressive Moving Average (ARMA) models, and Recurrent Neural Networks (RNN), where the RNN architecture is based on Nonlinear Auto-Regressive models with eXogenous inputs (NARX). The wavelet transform is utilized to decompose the solar power time series into a set of better-behaved constituent series for prediction. The ARMA model is employed as a linear predictor, while NARX is used as a nonlinear pattern recognition tool to estimate and compensate for the error of the wavelet-ARMA prediction. The proposed method is applied to data captured from UCLA solar PV panels, and the results are compared with some of the most common and most recent solar power prediction methods. The results validate the effectiveness of the proposed approach and show a considerable improvement in prediction precision.
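To make the decomposition-plus-correction idea concrete, here is a simplified sketch, not the paper's implementation: it assumes a db4 wavelet, a 2-level decomposition, an ARMA(2,1) model on the smoothed component, and a small feedforward network standing in for the NARX corrector. The synthetic series and all hyperparameters are illustrative.

```python
# Simplified wavelet + ARMA + neural-net residual forecaster (a sketch of the
# hybrid idea above, not the paper's method). All settings are assumptions.
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
t = np.arange(512)
power = np.clip(np.sin(2 * np.pi * t / 64), 0, None) + 0.1 * rng.normal(size=t.size)

# 1) DWT split into a smooth (approximation) part and a residual part.
coeffs = pywt.wavedec(power, "db4", level=2)
smooth_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
smooth = pywt.waverec(smooth_coeffs, "db4")[: power.size]
resid = power - smooth

# 2) Linear one-step forecast of the smooth part with ARMA(2,1).
arma = ARIMA(smooth, order=(2, 0, 1)).fit()
smooth_next = arma.forecast(1)[0]

# 3) NARX-like nonlinear one-step forecast of the residual from its lags.
L = 4
Xr = np.column_stack([resid[i: resid.size - L + i] for i in range(L)])
yr = resid[L:]
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(Xr, yr)
resid_next = mlp.predict(resid[-L:].reshape(1, -1))[0]

print(f"one-step-ahead forecast = {smooth_next + resid_next:.3f}")
```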
PPSP: prediction of PK-specific phosphorylation site with Bayesian decision theory.
Xue, Yu; Li, Ao; Wang, Lirong; Feng, Huanqing; Yao, Xuebiao
2006-03-20
As a reversible and dynamic post-translational modification (PTM) of proteins, phosphorylation plays essential regulatory roles in a broad spectrum of biological processes. Although many studies have addressed the molecular mechanisms of phosphorylation dynamics, the intrinsic features of substrate specificity remain elusive and to be delineated. In this work, we present a novel, versatile and comprehensive program, PPSP (Prediction of PK-specific Phosphorylation site), built on the approach of Bayesian decision theory (BDT). PPSP can accurately predict potential phosphorylation sites for approximately 70 protein kinase (PK) groups. Compared with four existing tools (Scansite, NetPhosK, KinasePhos and GPS), PPSP is more accurate and powerful. Moreover, PPSP also provides predictions for many novel PKs, such as TRK, mTOR, SyK and MET/RON; the accuracy for these novel PKs is also satisfactory. Taken together, we propose that PPSP could be a powerful tool for experimentalists focusing on the identification of PK-specific phosphorylation sites in their substrates. Moreover, the BDT strategy could also serve as a general approach for other PTMs, such as sumoylation and ubiquitination.
Wireless Network Simulation in Aircraft Cabins
NASA Technical Reports Server (NTRS)
Beggs, John H.; Youssef, Mennatoallah; Vahala, Linda
2004-01-01
An electromagnetic propagation prediction tool was used to predict electromagnetic field strength inside airplane cabins. A commercial software package, Wireless Insite, was used to predict power levels inside aircraft cabins and the data was compared with previously collected experimental data. It was concluded that the software could qualitatively predict electromagnetic propagation inside the aircraft cabin environment.
How to test gravitation theories by means of gravitational-wave measurements
NASA Technical Reports Server (NTRS)
Thorne, K. S.
1974-01-01
Gravitational-wave experiments are a potentially powerful tool for testing gravitation theories. Most theories in the literature predict rather different polarization properties for gravitational waves than are predicted by general relativity; and many theories predict anomalies in the propagation speeds of gravitational waves.
NASA Astrophysics Data System (ADS)
Kardhana, Hadi; Arya, Doni Khaira; Hadihardaja, Iwan K.; Widyaningtyas; Riawan, Edi; Lubis, Atika
2017-11-01
Small-scale hydropower (SHP) has been an important electric energy source in Indonesia. Indonesia is a vast country, consisting of more than 17,000 islands. It has a large freshwater resource, with about 3 m of rainfall and 2 m of runoff annually. Much of its topography is mountainous and remote, but abundant in potential energy. Millions of people do not have sufficient access to electricity, and some live in remote places. Recently, SHP development has been encouraged to supply energy to these places. The development of global hydrology data provides an opportunity to predict the distribution of hydropower potential. In this paper, we demonstrate a run-of-river SHP spot prediction tool using SWAT and a river diversion algorithm. The Soil and Water Assessment Tool (SWAT), with Climate Forecast System Reanalysis (CFSR) input covering a 10-year period, is used to predict the spatially distributed flow cumulative distribution function (CDF). A simple algorithm that maximizes the potential head of a location by a river diversion, representing the head race and penstock, is then applied. The firm flow and power of the SHP are estimated from the CDF and the algorithm, as sketched below. The tool was applied to the Upper Citarum River Basin, and three out of four existing hydropower locations were well predicted. The result implies that this tool is able to support the acceleration of SHP development in its earlier phases.
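A minimal sketch of that last step, under assumed values: take a flow series (standing in for SWAT output), read the firm flow off its exceedance curve (here the flow exceeded 95% of the time), and convert it to firm power with the standard hydropower relation P = ρgQHη. The head and efficiency below are illustrative assumptions.

```python
# Firm flow from a flow-duration curve, then firm power (illustrative values).
import numpy as np

rng = np.random.default_rng(2)
flow = rng.lognormal(mean=1.0, sigma=0.6, size=3650)   # m^3/s, stand-in series

q_firm = np.quantile(flow, 0.05)   # flow exceeded 95% of the time
rho, g = 1000.0, 9.81
head = 25.0                        # net head from the diversion, m (assumed)
eta = 0.8                          # turbine + generator efficiency (assumed)
p_firm_kw = rho * g * q_firm * head * eta / 1000.0
print(f"firm flow = {q_firm:.2f} m^3/s, firm power = {p_firm_kw:.0f} kW")
```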
Self-learning computers for surgical planning and prediction of postoperative alignment.
Lafage, Renaud; Pesenti, Sébastien; Lafage, Virginie; Schwab, Frank J
2018-02-01
In past decades, the role of sagittal alignment has been widely demonstrated in the setting of spinal conditions. As several parameters can be affected, identifying the driver of the deformity is the cornerstone of a successful treatment approach. Despite the importance of restoring sagittal alignment for optimizing outcome, this task remains challenging. Self-learning computers and optimized algorithms are of great interest in spine surgery in that they facilitate better planning and prediction of postoperative alignment. Nowadays, computer-assisted tools are part of surgeons' daily practice; however, the use of such tools remains time-consuming. NARRATIVE REVIEW AND RESULTS: Computer-assisted methods for the prediction of postoperative alignment consist of a three-step analysis: identification of anatomical landmarks, definition of alignment objectives, and simulation of surgery. Recently, complex rules for the prediction of alignment have been proposed. Even though this kind of work leads to more personalized objectives, the number of parameters involved makes it difficult for clinical use, stressing the importance of developing computer-assisted tools. The evolution of current technology, including machine learning and other types of advanced algorithms, will provide powerful tools that could be useful in improving surgical outcomes and alignment prediction. These tools can combine different types of advanced technologies, such as image recognition and shape modeling, and using this technique, computer-assisted methods are able to predict spinal shape. The development of powerful computer-assisted methods involves the integration of several sources of information, such as radiographic parameters (X-rays, MRI, CT scan, etc.), demographic information, and unusual non-osseous parameters (muscle quality, proprioception, gait analysis data). By using a larger set of data, these methods will aim to mimic what is actually done by spine surgeons, leading to truly tailor-made solutions. Integrating newer technology can change the current way of planning and simulating surgery. The use of powerful computer-assisted tools that are able to integrate several parameters and learn from experience can change the traditional way of selecting treatment pathways and counseling patients. However, there is still much work to be done to reach the desired level, as has been noted in other orthopedic fields, such as hip surgery. Many of these tools already exist in non-medical fields, and their adaptation to spine surgery is of considerable interest.
The DSM-5 Self-Rated Level 1 Cross-Cutting Symptom Measure as a Screening Tool.
Bastiaens, Leo; Galus, James
2018-03-01
The DSM-5 Self-Rated Level 1 Cross-Cutting Symptom Measure was developed to aid clinicians with a dimensional assessment of psychopathology; however, this measure resembles a screening tool for several symptomatic domains. The objective of the current study was to examine the basic parameters of sensitivity, specificity, and positive and negative predictive power of the measure as a screening tool. One hundred and fifty patients in a correctional community center filled out the measure prior to a psychiatric evaluation that included the Mini International Neuropsychiatric Interview screen. The above parameters were calculated for the domains of depression, mania, anxiety, and psychosis. The results showed that the sensitivity and positive predictive power of the studied domains were poor because of a high rate of false positive answers on the measure. However, when the lowest threshold on the Cross-Cutting Symptom Measure was used, the sensitivity of the anxiety and psychosis domains and the negative predictive values for mania, anxiety and psychosis were good. In conclusion, while it is foreseeable that some clinicians may use the DSM-5 Self-Rated Level 1 Cross-Cutting Symptom Measure as a screening tool, it should not be relied on to identify positive findings. It functioned well in the negative prediction of mania, anxiety and psychosis symptoms.
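As a reminder of how the four reported parameters relate to raw screening counts, here is a minimal self-contained sketch; the counts in the example are invented for illustration and are not the study's data.

```python
# The four screening parameters above, from confusion-matrix counts.
def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # P(screen positive | disorder)
        "specificity": tn / (tn + fp),   # P(screen negative | no disorder)
        "ppv": tp / (tp + fp),           # P(disorder | screen positive)
        "npv": tn / (tn + fn),           # P(no disorder | screen negative)
    }

# Example (made-up counts): many false positives depress PPV even when
# sensitivity looks fair.
print(screening_metrics(tp=20, fp=45, fn=10, tn=75))
```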
ERIC Educational Resources Information Center
Chavez-Gibson, Sarah
2013-01-01
The purpose of this study is to examine in depth the Comprehensive, Powerful, Academic Database (CPAD), a data decision-making tool that determines and identifies students at risk of dropping out of school, and how the CPAD assists administrators and teachers at an elementary campus to monitor progress, curriculum, and performance to improve student…
In-silico wear prediction for knee replacements--methodology and corroboration.
Strickland, M A; Taylor, M
2009-07-22
The capability to predict in-vivo wear of knee replacements is a valuable pre-clinical analysis tool for implant designers. Traditionally, time-consuming experimental tests provided the principal means of investigating wear. Today, computational models offer an alternative. However, the validity of these models has not been demonstrated across a range of designs and test conditions, and several different formulas are in contention for estimating wear rates, limiting confidence in the predictive power of these in-silico models. This study collates and retrospectively simulates a wide range of experimental wear tests using fast rigid-body computational models with extant wear prediction algorithms, to assess the performance of current in-silico wear prediction tools. The number of tests corroborated gives a broader, more general assessment of the performance of these wear-prediction tools, and provides better estimates of the wear 'constants' used in computational models. High-speed rigid-body modelling allows a range of alternative algorithms to be evaluated. Whilst most cross-shear (CS)-based models perform comparably, the 'A/A+B' wear model appears to offer the best predictive power amongst existing wear algorithms. However, the range and variability of experimental data leaves considerable uncertainty in the results. More experimental data with reduced variability and more detailed reporting of studies will be necessary to corroborate these models with greater confidence. With simulation times reduced to only a few minutes, these models are ideally suited to large-volume 'design of experiment' or probabilistic studies (which are essential if pre-clinical assessment tools are to begin addressing the degree of variation observed clinically and in explanted components).
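The wear algorithms under comparison share a common shape: wear scales with contact pressure and sliding distance through a wear factor that depends on the cross-shear ratio. The sketch below is schematic only; the functional form of k(CS), the constants, and the loading values are all assumptions for illustration, not fitted values from the study.

```python
# Schematic cross-shear wear estimate (illustrative constants and form).
# CS = A / (A + B), with A the frictional work transverse to the principal
# molecular orientation of the polyethylene and B the parallel component.
import numpy as np

def wear_depth_per_cycle(pressure_mpa, slide_mm, cs, c=7.5e-8, m=0.3):
    """Wear depth (mm) ~ k(CS) * p * |s|, with k(CS) = c * CS**m (assumed)."""
    k = c * np.power(cs, m)          # wear factor grows with cross-shear
    return k * pressure_mpa * slide_mm

A, B = 0.8, 3.2                      # transverse / parallel work (assumed)
cs = A / (A + B)
depth = wear_depth_per_cycle(pressure_mpa=10.0, slide_mm=20.0, cs=cs)
print(f"CS = {cs:.2f}, wear depth per gait cycle ~ {depth:.2e} mm")
```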
GAPIT version 2: an enhanced integrated tool for genomic association and prediction
USDA-ARS's Scientific Manuscript database
Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...
An Integrated Software Package to Enable Predictive Simulation Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang
The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that couples HPC applications with a web-based visualization tool through a middleware framework. This framework can support data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in live mode. Test results validate the effectiveness and usability of the integrated software package.
Comparison of ISS Power System Telemetry with Analytically Derived Data for Shadowed Cases
NASA Technical Reports Server (NTRS)
Fincannon, H. James
2002-01-01
Accurate International Space Station (ISS) power prediction requires the quantification of solar array shadowing. Prior papers have discussed the NASA Glenn Research Center (GRC) ISS power system tool SPACE (System Power Analysis for Capability Evaluation) and its integrated shadowing algorithms. On-orbit telemetry has become available that permits the correlation of theoretical shadowing predictions with actual data. This paper documents the comparison of a shadowing metric (total solar array current) as derived from SPACE predictions and on-orbit flight telemetry data for representative significant shadowing cases. Images from flight video recordings and the SPACE computer program graphical output are used to illustrate the comparison. The accuracy of the SPACE shadowing capability is demonstrated for the cases examined.
NASA Astrophysics Data System (ADS)
Xie, Yan; Li, Mu; Zhou, Jin; Zheng, Chang-zheng
2009-07-01
Agricultural machinery total power is an important index for reflecting and evaluating the level of agricultural mechanization. It is the power source of agricultural production, and one of the main factors in enhancing comprehensive agricultural production capacity, expanding production scale and increasing farmers' incomes. Its demand is affected by natural, economic, technological, social and other "grey" factors. Therefore, grey system theory can be used to analyze the development of agricultural machinery total power. A method based on a genetic algorithm optimizing the grey modeling process is introduced in this paper. This method makes full use of the advantages of the grey prediction model and of the genetic algorithm's ability to find a global optimum, so the prediction model is more accurate. Using data from one province, a GM(1, 1) model for predicting agricultural machinery total power is given, based on grey system theory and the genetic algorithm. The result indicates that the model can be used as an effective tool for predicting agricultural machinery total power.
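For concreteness, here is a minimal GM(1,1) forecaster of the kind the abstract describes, fitted by plain least squares; the genetic-algorithm tuning of the grey model is omitted, and the input series is invented for illustration.

```python
# Minimal GM(1,1) grey forecaster (least-squares fit; GA tuning omitted).
import numpy as np

def gm11_forecast(x0: np.ndarray, steps: int = 1) -> np.ndarray:
    """Fit GM(1,1) to the positive series x0 and forecast `steps` ahead."""
    x1 = np.cumsum(x0)                          # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(1, x0.size + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(np.concatenate([[x0[0]], x1_hat]))  # de-accumulate
    return x0_hat[-steps:]

# Made-up yearly totals (e.g. in 10^7 kW), for illustration only.
total_power = np.array([4.2, 4.6, 5.1, 5.5, 6.0, 6.6])
print(gm11_forecast(total_power, steps=2))
```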
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel; Makarov, Yuri; Subbarao, Kris
RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty including wind, solar and load forecast errors. The tool evaluates required generation for a worst case scenario, with a user-specified confidence level.
Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diakov, Victor; Cole, Wesley; Sullivan, Patrick
2015-11-01
Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data, by minimizing production costs while following reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of capacity expansion predictions. Further, production cost model simulations of a system that is based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirements and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity-expansion-to-production-cost-model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally defined ReEDS scenarios.
GAPIT: genome association and prediction integrated tool.
Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu
2012-09-15
Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
A novel method for predicting the power outputs of wave energy converters
NASA Astrophysics Data System (ADS)
Wang, Yingguang
2018-03-01
This paper focuses on realistically predicting the power outputs of wave energy converters operating in shallow water nonlinear waves. A heaving two-body point absorber is utilized as a specific calculation example, and the generated power of the point absorber has been predicted by using a novel method (a nonlinear simulation method) that incorporates a second order random wave model into a nonlinear dynamic filter. It is demonstrated that the second order random wave model in this article can be utilized to generate irregular waves with realistic crest-trough asymmetries, and consequently, more accurate generated power can be predicted by subsequently solving the nonlinear dynamic filter equation with the nonlinearly simulated second order waves as inputs. The research findings demonstrate that the novel nonlinear simulation method in this article can be utilized as a robust tool for ocean engineers in their design, analysis and optimization of wave energy converters.
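To illustrate the wave-input side of this approach, the sketch below generates a random sea as a sum of cosines and adds deep-water Stokes-type second-order self-interaction harmonics, which sharpen crests and flatten troughs. The spectrum, amplitudes and frequencies are invented for illustration, and the sum- and difference-frequency cross terms of a full second-order model are omitted.

```python
# Random sea with second-order (Stokes-type) self-interaction corrections,
# deep water only. Illustrative sea state, not the paper's input.
import numpy as np

rng = np.random.default_rng(3)
g = 9.81
omega = np.linspace(0.4, 1.6, 40)                 # component frequencies, rad/s
a = 0.08 * np.exp(-((omega - 0.8) / 0.3) ** 2)    # crude spectral amplitudes, m
phi = rng.uniform(0, 2 * np.pi, omega.size)       # random phases
k = omega**2 / g                                  # deep-water dispersion

t = np.linspace(0, 300, 3000)
theta = omega[:, None] * t[None, :] + phi[:, None]
eta1 = np.sum(a[:, None] * np.cos(theta), axis=0)             # first order
eta2 = np.sum(0.5 * (k * a**2)[:, None] * np.cos(2 * theta), axis=0)
eta = eta1 + eta2                                 # crest-trough asymmetric sea

print(f"crest/|trough| ratio: {eta.max() / abs(eta.min()):.2f} (>1 => asymmetry)")
```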
Family-Based Benchmarking of Copy Number Variation Detection Software.
Nutsua, Marcel Elie; Fischer, Annegret; Nebel, Almut; Hofmann, Sylvia; Schreiber, Stefan; Krawczak, Michael; Nothnagel, Michael
2015-01-01
The analysis of structural variants, in particular of copy-number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pair-wise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using Hidden Markov models (HMM) showed a trend to predict fewer CNVs than segmentation-based algorithms albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.
Jensen's Inequality Predicts Effects of Environmental Variation
Jonathan J. Ruel; Matthew P. Ayres
1999-01-01
Many biologists now recognize that environmental variance can exert important effects on patterns and processes in nature that are independent of average conditions. Jensen's inequality is a mathematical proof that is seldom mentioned in the ecological literature but which provides a powerful tool for predicting some direct effects of environmental variance in…
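For reference, the inequality itself, stated for a random environmental driver X:

```latex
% Jensen's inequality: for a convex function f and random variable X, the
% mean of f(X) exceeds f of the mean, so variance in X alone shifts the
% expected response even when E[X] is unchanged.
\[
  f\bigl(\mathbb{E}[X]\bigr) \le \mathbb{E}\bigl[f(X)\bigr] \quad \text{($f$ convex)},
  \qquad
  f\bigl(\mathbb{E}[X]\bigr) \ge \mathbb{E}\bigl[f(X)\bigr] \quad \text{($f$ concave)}.
\]
```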
Metabolic pathways for the whole community.
Hanson, Niels W; Konwar, Kishori M; Hawley, Alyse K; Altman, Tomer; Karp, Peter D; Hallam, Steven J
2014-07-22
A convergence of high-throughput sequencing and computational power is transforming biology into information science. Despite these technological advances, converting bits and bytes of sequence information into meaningful insights remains a challenging enterprise. Biological systems operate on multiple hierarchical levels from genomes to biomes. Holistic understanding of biological systems requires agile software tools that permit comparative analyses across multiple information levels (DNA, RNA, protein, and metabolites) to identify emergent properties, diagnose system states, or predict responses to environmental change. Here we adopt the MetaPathways annotation and analysis pipeline and Pathway Tools to construct environmental pathway/genome databases (ePGDBs) that describe microbial community metabolism using MetaCyc, a highly curated database of metabolic pathways and components covering all domains of life. We evaluate Pathway Tools' performance on three datasets with different complexity and coding potential, including simulated metagenomes, a symbiotic system, and the Hawaii Ocean Time-series. We define accuracy and sensitivity relationships between read length, coverage and pathway recovery and evaluate the impact of taxonomic pruning on ePGDB construction and interpretation. Resulting ePGDBs provide interactive metabolic maps, predict emergent metabolic pathways associated with biosynthesis and energy production and differentiate between genomic potential and phenotypic expression across defined environmental gradients. This multi-tiered analysis provides the user community with specific operating guidelines, performance metrics and prediction hazards for more reliable ePGDB construction and interpretation. Moreover, it demonstrates the power of Pathway Tools in predicting metabolic interactions in natural and engineered ecosystems.
A Modular Approach for Teaching Partial Discharge Phenomenon through Experiment
ERIC Educational Resources Information Center
Chatterjee, B.; Dey, D.; Chakravorti, S.
2011-01-01
Partial discharge (PD) monitoring is an effective predictive maintenance tool for electrical power equipment. As a result, an understanding of the theory related to PD and the associated measurement techniques is now necessary knowledge for power engineers in their professional life. This paper presents a modular course on PD phenomenon in which…
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.
Lee, Leng-Feng; Umberger, Brian R
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
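As an illustration of the direct-collocation machinery described above, here is a minimal sketch for a toy double integrator rather than a musculoskeletal model, with SciPy's SLSQP solver standing in for IPOPT or fmincon; the task, horizon and node count are arbitrary assumptions.

```python
# Trapezoidal direct collocation on a toy problem: drive x from 0 to 1 in
# 1 s with zero end velocity while minimizing integrated u^2. A sketch of
# the technique, not the OpenSim/MATLAB framework itself.
import numpy as np
from scipy.optimize import minimize

N, T = 21, 1.0
h = T / (N - 1)

def unpack(z):
    return z[:N], z[N:2 * N], z[2 * N:]          # states x, v and control u

def objective(z):
    _, _, u = unpack(z)
    return h * np.sum(0.5 * (u[:-1] ** 2 + u[1:] ** 2))   # trapezoidal effort

def defects(z):
    x, v, u = unpack(z)
    dx = x[1:] - x[:-1] - 0.5 * h * (v[:-1] + v[1:])      # x' = v
    dv = v[1:] - v[:-1] - 0.5 * h * (u[:-1] + u[1:])      # v' = u
    bc = [x[0], v[0], x[-1] - 1.0, v[-1]]                 # boundary conditions
    return np.concatenate([dx, dv, bc])

sol = minimize(objective, np.zeros(3 * N), method="SLSQP",
               constraints={"type": "eq", "fun": defects},
               options={"maxiter": 500})
x, v, u = unpack(sol.x)
print(f"converged: {sol.success}; u(0) ~ {u[0]:.2f} (analytic optimum: 6.00)")
```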
Wide-Area Traffic Management for Cloud Services
2012-04-01
performance prediction tools [11], which are usually load-oblivious. Therefore, without information about link loads and capacities, a CDN may direct... powerful tool. DONAR allows its customers to dictate a replica's (i) split weight, w_i, the desired proportion of requests that a particular replica i... Diagnostic Tool (NDT) [100], which is used for the Federal Communication Commission's Consumer Broadband Test, and NPAD [101]—are more closely integrated with
Flight Awareness Collaboration Tool Development
NASA Technical Reports Server (NTRS)
Mogford, Richard
2016-01-01
This is a PowerPoint presentation covering airline operations center (AOC) research. It reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. FACT should prove to be useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations.
Piazza, Matthew; Sharma, Nikhil; Osiemo, Benjamin; McClintock, Scott; Missimer, Emily; Gardiner, Diana; Maloney, Eileen; Callahan, Danielle; Smith, J Lachlan; Welch, William; Schuster, James; Grady, M Sean; Malhotra, Neil R
2018-05-21
Bundled care payments are increasingly being explored for neurosurgical interventions. In this setting, skilled nursing facility (SNF) is less desirable from a cost perspective than discharge to home, underscoring the need for better preoperative prediction of postoperative disposition. To assess the capability of the Risk Assessment and Prediction Tool (RAPT) and other preoperative variables to determine expected disposition prior to surgery in a heterogeneous neurosurgical cohort, through observational study. Patients aged 50 yr or more undergoing elective neurosurgery were enrolled from June 2016 to February 2017 (n = 623). Logistic regression was used to identify preoperative characteristics predictive of discharge disposition. Results from multivariate analysis were used to create novel grading scales for the prediction of discharge disposition that were subsequently compared to the RAPT Score using Receiver Operating Characteristic analysis. Higher RAPT Score significantly predicted home disposition (P < .001). Age 65 and greater, dichotomized RAPT walk score, and spinal surgery below L2 were independent predictors of SNF discharge in multivariate analysis. A grading scale utilizing these variables had superior discriminatory power between SNF and home/rehab discharge when compared with RAPT score alone (P = .004). Our analysis identified age, lower lumbar/lumbosacral surgery, and RAPT walk score as independent predictors of discharge to SNF, and demonstrated superior predictive power compared with the total RAPT Score when combined in a novel grading scale. These tools may identify patients who may benefit from expedited discharge to subacute care facilities and decrease inpatient hospital resource utilization following surgery.
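A compact sketch of the kind of comparison reported above, on synthetic data rather than the study cohort: fit a logistic model on the three identified predictors, then compare ROC AUCs for the combined scale versus the walk score alone. All codings and coefficients are assumptions.

```python
# Hedged sketch: compare discriminatory power (ROC AUC) of a combined
# grading scale versus a single dichotomous predictor, on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 623
walk = rng.integers(0, 2, n)         # dichotomized RAPT walk score (assumed)
age65 = rng.integers(0, 2, n)        # age >= 65
low_lumbar = rng.integers(0, 2, n)   # surgery below L2
logit = -1.5 + 1.2 * walk + 0.9 * age65 + 0.7 * low_lumbar
snf = rng.random(n) < 1 / (1 + np.exp(-logit))   # discharge to SNF

X = np.column_stack([walk, age65, low_lumbar])
X_tr, X_te, y_tr, y_te = train_test_split(X, snf, random_state=0)

scale = LogisticRegression().fit(X_tr, y_tr)
auc_scale = roc_auc_score(y_te, scale.predict_proba(X_te)[:, 1])
auc_walk = roc_auc_score(y_te, X_te[:, 0])       # walk score alone
print(f"AUC combined scale = {auc_scale:.2f}, AUC walk alone = {auc_walk:.2f}")
```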
Computing organic stereoselectivity - from concepts to quantitative calculations and predictions.
Peng, Qian; Duarte, Fernanda; Paton, Robert S
2016-11-07
Advances in theory and processing power have established computation as a valuable interpretative and predictive tool in the discovery of new asymmetric catalysts. This tutorial review outlines the theory and practice of modeling stereoselective reactions. Recent examples illustrate how an understanding of the fundamental principles and the application of state-of-the-art computational methods may be used to gain mechanistic insight into organic and organometallic reactions. We highlight the emerging potential of this computational tool-box in providing meaningful predictions for the rational design of asymmetric catalysts. We present an accessible account of the field to encourage future synergy between computation and experiment.
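The quantitative bridge from computed energies to predicted selectivity is worth making explicit. Under transition-state theory, the free-energy difference between competing diastereomeric transition states sets the enantiomer ratio through a Boltzmann factor; the sketch below evaluates the standard relation with illustrative values.

```python
# Enantiomeric excess from a computed ddG‡ between competing transition
# states (standard Boltzmann relation; the input values are illustrative).
import math

def ee_from_ddg(ddg_kcal: float, temp_k: float = 298.15) -> float:
    """Enantiomeric excess (%) from ddG‡ (kcal/mol)."""
    R = 1.987204e-3                          # kcal/(mol*K)
    er = math.exp(ddg_kcal / (R * temp_k))   # enantiomer ratio (major/minor)
    return 100.0 * (er - 1.0) / (er + 1.0)

for ddg in (0.5, 1.0, 2.0):
    print(f"ddG‡ = {ddg:.1f} kcal/mol -> ee ~ {ee_from_ddg(ddg):.0f}%")
```

At 298 K, a gap of 1 kcal/mol already corresponds to roughly 69% ee, which is why sub-kcal/mol accuracy is demanded of the computational methods the review discusses.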
Modelling for Prediction vs. Modelling for Understanding: Commentary on Musso et al. (2013)
ERIC Educational Resources Information Center
Edelsbrunner, Peter; Schneider, Michael
2013-01-01
Musso et al. (2013) predict students' academic achievement with high accuracy one year in advance from cognitive and demographic variables, using artificial neural networks (ANNs). They conclude that ANNs have high potential for theoretical and practical improvements in learning sciences. ANNs are powerful statistical modelling tools but they can…
A Decision Support Prototype Tool for Predicting Student Performance in an ODL Environment
ERIC Educational Resources Information Center
Kotsiantis, S. B.; Pintelas, P. E.
2004-01-01
Machine Learning algorithms fed with data sets which include information such as attendance data, test scores and other student information can provide tutors with powerful tools for decision-making. Until now, much of the research has been limited to the relation between single variables and student performance. Combining multiple variables as…
Online Analysis of Wind and Solar Part II: Transmission Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian
2012-01-31
To facilitate wider penetration of renewable resources without compromising system reliability, a concern arising from the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.
NASA Technical Reports Server (NTRS)
Koch, L. Danielle
2012-01-01
A combined quadrupole-dipole model of fan inflow distortion tone noise has been extended to calculate tone sound power levels generated by obstructions arranged in circumferentially asymmetric locations upstream of a rotor. Trends in calculated sound power level agreed well with measurements from tests conducted in 2007 in the NASA Glenn Advanced Noise Control Fan. Calculated values of sound power levels radiated upstream were demonstrated to be sensitive to the accuracy of the modeled wakes from the cylindrical rods that were placed upstream of the fan to distort the inflow. Results indicate a continued need to obtain accurate aerodynamic predictions and measurements at the fan inlet plane as engineers work towards developing fan inflow distortion tone noise prediction tools.
Tools and techniques for estimating high intensity RF effects
NASA Astrophysics Data System (ADS)
Zacharias, Richard L.; Pennock, Steve T.; Poggio, Andrew J.; Ray, Scott L.
1992-01-01
Tools and techniques for estimating and measuring coupling and component disturbance for avionics and electronic controls are described. A finite-difference time-domain (FD-TD) modeling code, TSAR, used to predict coupling, is described. This code can quickly generate a mesh model to represent the test object. Some recent applications, as well as the advantages and limitations of using such a code, are described. Facilities and techniques for making low-power coupling measurements and for making direct-injection test measurements of device disturbance are also described. Some scaling laws for coupling and device effects are presented, along with a method for extrapolating these low-power test results to high-power full-system effects.
Electrical Systems Analysis at NASA Glenn Research Center: Status and Prospects
NASA Technical Reports Server (NTRS)
Freeh, Joshua E.; Liang, Anita D.; Berton, Jeffrey J.; Wickenheiser, Timothy J.
2003-01-01
An analysis of an electrical power and propulsion system for a 2-place general aviation aircraft is presented to provide a status of such modeling at NASA Glenn Research Center. The thermodynamic/electrical model and mass prediction tools are described and the resulting system power and mass are shown. Three technology levels are used to predict the effect of advancements in component technology. Methods of fuel storage are compared by mass and volume. Prospects for future model development and validation at NASA as well as possible applications are also summarized.
A New Analysis Tool Assessment for Rotordynamic Modeling of Gas Foil Bearings
NASA Technical Reports Server (NTRS)
Howard, Samuel A.; SanAndres, Luis
2010-01-01
Gas foil bearings offer several advantages over traditional bearing types that make them attractive for use in high-speed turbomachinery. They can operate at very high temperatures, require no lubrication supply (oil pumps, seals, etc.), exhibit very long life with no maintenance, and once operating airborne, have very low power loss. The use of gas foil bearings in high-speed turbomachinery has been accelerating in recent years, although the pace has been slow. One of the contributing factors to the slow growth has been a lack of analysis tools, benchmarked to measurements, to predict gas foil bearing behavior in rotating machinery. To address this shortcoming, NASA Glenn Research Center (GRC) has supported the development of analytical tools to predict gas foil bearing performance. One of the codes has the capability to predict rotordynamic coefficients, power loss, film thickness, structural deformation, and more. The current paper presents an assessment of the predictive capability of the code, named XLGFBTH (Texas A&M University). A test rig at GRC is used as a simulated case study to compare rotordynamic analysis using output from the code to actual rotor response as measured in the test rig. The test rig rotor is supported on two gas foil journal bearings manufactured at GRC, with all pertinent geometry disclosed. The resulting comparison shows that the rotordynamic coefficients calculated using XLGFBTH represent the dynamics of the system reasonably well, especially as they pertain to predicting critical speeds.
Rosetta Structure Prediction as a Tool for Solving Difficult Molecular Replacement Problems.
DiMaio, Frank
2017-01-01
Molecular replacement (MR), a method for solving the crystallographic phase problem using phases derived from a model of the target structure, has proven extremely valuable, accounting for the vast majority of structures solved by X-ray crystallography. However, when the resolution of data is low, or the starting model is very dissimilar to the target protein, solving structures via molecular replacement may be very challenging. In recent years, protein structure prediction methodology has emerged as a powerful tool in model building and model refinement for difficult molecular replacement problems. This chapter describes some of the tools available in Rosetta for model building and model refinement specifically geared toward difficult molecular replacement cases.
Hamaker, Marije E; Mitrovic, M; Stauder, R
2014-06-01
The G8 screening tool was developed to separate fit older cancer patients who are able to receive standard treatment from those who should undergo a geriatric assessment to guide tailoring of therapy. We set out to determine the discriminative power and prognostic value of the G8 in older patients with a haematological malignancy. Between September 2009 and May 2013, a multi-dimensional geriatric assessment was performed in consecutive patients aged ≥67 years diagnosed with blood cancer at the Innsbruck University Hospital. The assessment included (instrumental) activities of daily living, cognition, mood, nutritional status, mobility, polypharmacy and social support. In parallel, the G8 was also administered (cut-off ≤ 14). Using a cut-off of ≥2 impaired domains, 70% of the 108 included patients were considered as having an impaired geriatric assessment, while 61% had an impaired G8. The G8 lacked discriminative power for impairments on full geriatric assessment: sensitivity 69%, specificity 79%, positive predictive value 89% and negative predictive value 50%. However, the G8 was an independent predictor of mortality within the first year after inclusion (hazard ratio 3.93; 95% confidence interval 1.67-9.22, p < 0.001). Remarkably, patients with an impaired G8 fared poorly, irrespective of treatment choices (p < 0.001). This is the first report on the clinical and prognostic relevance of the G8 in elderly patients with haematological malignancies. Although the G8 lacked discriminative power for the outcome of multi-dimensional geriatric assessment, this score appears to be a powerful prognosticator and could potentially represent a useful tool in treatment decisions. This novel finding certainly deserves further exploration.
Muhlestein, Whitney E; Akagi, Dallin S; Kallos, Justiss A; Morone, Peter J; Weaver, Kyle D; Thompson, Reid C; Chambless, Lola B
2018-04-01
Objective: Machine learning (ML) algorithms are powerful tools for predicting patient outcomes. This study pilots a novel approach to algorithm selection and model creation, using prediction of discharge disposition following meningioma resection as a proof of concept. Materials and Methods: A diverse set of ML algorithms was trained on a single-institution database of meningioma patients to predict discharge disposition. Algorithms were ranked by predictive power, and top performers were combined to create an ensemble model. The final ensemble was internally validated on never-before-seen data to demonstrate generalizability. The predictive power of the ensemble was compared with a logistic regression. Further analyses were performed to identify how important variables impact the ensemble. Results: Our ensemble model predicted disposition significantly better than a logistic regression (area under the curve of 0.78 and 0.71, respectively, p = 0.01). Tumor size, presentation at the emergency department, body mass index, convexity location, and preoperative motor deficit most strongly influence the model, though the independent impact of individual variables is nuanced. Conclusion: Using a novel ML technique, we built a guided ML ensemble model that predicts discharge destination following meningioma resection with greater predictive power than a logistic regression, and that provides greater clinical insight than a univariate analysis. These techniques can be extended to predict many other patient outcomes of interest.
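A minimal sketch of the guided-ensemble idea described above, assuming scikit-learn and synthetic data in place of the single-institution meningioma database: candidate algorithms are ranked by cross-validated AUC, the top performers are combined by soft voting, and the ensemble is compared against a logistic regression baseline.

```python
# Hedged sketch: rank several classifiers by cross-validated AUC, ensemble the
# top performers, and compare against a plain logistic regression baseline.
# Data, features, and model choices are illustrative, not the study's.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

candidates = {
    "rf": RandomForestClassifier(n_estimators=200, random_state=0),
    "gbm": GradientBoostingClassifier(random_state=0),
    "lr_l2": LogisticRegression(max_iter=1000),
}
# Rank candidate algorithms by cross-validated AUC on the training set.
ranked = sorted(candidates.items(),
                key=lambda kv: cross_val_score(kv[1], X_tr, y_tr, cv=5,
                                               scoring="roc_auc").mean(),
                reverse=True)
# Combine the two best into a soft-voting ensemble.
ensemble = VotingClassifier(ranked[:2], voting="soft").fit(X_tr, y_tr)
baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("ensemble AUC:", roc_auc_score(y_te, ensemble.predict_proba(X_te)[:, 1]))
print("baseline AUC:", roc_auc_score(y_te, baseline.predict_proba(X_te)[:, 1]))
```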
Using Data Mining for Predicting Relationships between Online Question Theme and Final Grade
ERIC Educational Resources Information Center
Abdous, M'hammed; He, Wu; Yen, Cherng-Jyh
2012-01-01
As higher education diversifies its delivery modes, our ability to use the predictive and analytical power of educational data mining (EDM) to understand students' learning experiences is a critical step forward. The adoption of EDM by higher education as an analytical and decision making tool is offering new opportunities to exploit the untapped…
Towards a National Space Weather Predictive Capability
NASA Astrophysics Data System (ADS)
Fox, N. J.; Lindstrom, K. L.; Ryschkewitsch, M. G.; Anderson, B. J.; Gjerloev, J. W.; Merkin, V. G.; Kelly, M. A.; Miller, E. S.; Sitnov, M. I.; Ukhorskiy, A. Y.; Erlandson, R. E.; Barnes, R. J.; Paxton, L. J.; Sotirelis, T.; Stephens, G.; Comberiate, J.
2014-12-01
National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review datasets, tools and models that have resulted from research by scientists at JHU/APL, and examine how they could be applied to support space weather applications in coordination with other community assets and capabilities.
The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct
ERIC Educational Resources Information Center
Knezek, Gerald; Christensen, Rhonda
2015-01-01
An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…
ERIC Educational Resources Information Center
Knezek, Gerald; Christensen, Rhonda
2016-01-01
An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…
GPS-ARM: Computational Analysis of the APC/C Recognition Motif by Predicting D-Boxes and KEN-Boxes
Ren, Jian; Cao, Jun; Zhou, Yanhong; Yang, Qing; Xue, Yu
2012-01-01
Anaphase-promoting complex/cyclosome (APC/C), an E3 ubiquitin ligase incorporating Cdh1 and/or Cdc20, recognizes and interacts with specific substrates, and faithfully orchestrates the proper cell cycle events by targeting proteins for proteasomal degradation. Experimental identification of APC/C substrates is largely dependent on the discovery of APC/C recognition motifs, e.g., the D-box and KEN-box. Although a number of either stringent or loosely defined motifs have been proposed, these motif patterns are of limited use due to their insufficient predictive power. We report the development of a novel GPS-ARM software package for the prediction of D-boxes and KEN-boxes in proteins. Using experimentally identified D-boxes and KEN-boxes as the training data sets, a previously developed GPS (Group-based Prediction System) algorithm was adopted. By extensive evaluation and comparison, the GPS-ARM performance was found to be much better than that of simple motif matching. With this powerful tool, we predicted 4,841 potential D-boxes in 3,832 proteins and 1,632 potential KEN-boxes in 1,403 proteins from H. sapiens, while further statistical analysis suggested that both the D-box and KEN-box proteins are involved in a broad spectrum of biological processes beyond the cell cycle. In addition, with co-localization information, we predicted hundreds of mitosis-specific APC/C substrates with high confidence. As the first computational tool for the prediction of APC/C-mediated degradation, GPS-ARM is a useful resource for generating information for further experimental investigation. GPS-ARM is freely accessible for academic researchers at: http://arm.biocuckoo.org. PMID:22479614
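For contrast with the trained scoring approach, here is a minimal sketch of the simple-motif baseline that the abstract reports as insufficiently predictive, assuming the common minimal consensus patterns (RxxL for the D-box, KEN for the KEN-box); GPS-ARM itself uses a group-based scoring scheme rather than bare regular expressions, and the example sequence is illustrative only.

```python
# Hedged sketch of the simple-motif baseline: scan a protein sequence for
# minimal D-box (RxxL) and KEN-box (KEN) patterns with regular expressions.
import re

D_BOX = re.compile(r"R..L")   # minimal destruction-box consensus
KEN_BOX = re.compile(r"KEN")  # KEN-box consensus

def scan_motifs(seq: str):
    """Return (1-based position, matched substring) pairs per motif class."""
    hits = {"D-box": [], "KEN-box": []}
    for name, pat in (("D-box", D_BOX), ("KEN-box", KEN_BOX)):
        for m in pat.finditer(seq):
            hits[name].append((m.start() + 1, m.group()))
    return hits

# Illustrative fragment containing both motif classes.
print(scan_motifs("MATLIYVDKENGEPGTRVVAKDGLKLGSGPSIKALDGRSQVSTPRFGKTFDAP"))
```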
A High Resolution Tropical Cyclone Power Outage Forecasting Model for the Continental United States
NASA Astrophysics Data System (ADS)
Pino, J. V.; Quiring, S. M.; Guikema, S.; Shashaani, S.; Linger, S.; Backhaus, S.
2017-12-01
Tropical cyclones cause extensive damage to the power infrastructure system throughout the United States. This damage can leave millions without power for extended periods of time, as most recently seen with Hurricane Matthew (2016). Accurate and timely predictions of power outages are essential for utility companies, emergency management agencies, and governmental organizations. Here we present a high-resolution (250 m x 250 m) hurricane power outage model for the United States. The model uses only publicly-available data to make predictions. It uses forecasts of storm variables such as maximum 3-second wind gust, duration of strong winds > 20 m s-1, soil moisture, and precipitation. It also incorporates static environmental variables such as elevation characteristics, land cover type, population density, tree species data, and root zone depth. A web tool was established for use by the Department of Energy (DOE) so that the model can be used for real-time outage forecasting or for synthetic tropical cyclones as an exercise in emergency management. This web tool provides DOE decision-makers with high-impact analytic results and products that can be disseminated to federal, local, and state agencies. The results then aid utility companies in their pre- and post-storm activities, thus decreasing restoration times and lowering costs.
Department of Defense Space Science and Technology Strategy 2015
2015-01-01
...solar cells at 34% efficiency enabling higher-power spacecraft capability. These solar cells were developed by the Air Force Research Laboratory (AFRL)... Reduce size, weight, power, cost, and improve thermal management for SATCOM terminals... Support intelligence, surveillance and reconnaissance (ISR)... Improve understanding and awareness of the Earth-to-Sun environment... Improve space environment forecast capabilities and tools to predict operational...
Anwar, Mohammad Y; Lewnard, Joseph A; Parikh, Sunil; Pitzer, Virginia E
2016-11-22
Malaria remains endemic in Afghanistan. National control and prevention strategies would be greatly enhanced by a better ability to forecast future trends in disease incidence. It is, therefore, of interest to develop a predictive tool for malaria patterns based on the current passive and affordable surveillance system in this resource-limited region. This study employs data from the Ministry of Public Health's monthly reports from January 2005 to September 2015. Malaria incidence in Afghanistan was forecasted using autoregressive integrated moving average (ARIMA) models in order to build a predictive tool for malaria surveillance. Environmental and climate data were incorporated to assess whether they improve the predictive power of the models. Two models were identified, each appropriate for a different time horizon. For near-term forecasts, malaria incidence can be predicted based on the number of cases in the four previous months and 12 months prior (Model 1); for longer-term prediction, malaria incidence can be predicted using the rates 1 and 12 months prior (Model 2). Next, climate and environmental variables were incorporated to assess whether the predictive power of the proposed models could be improved. The enhanced vegetation index was found to increase the predictive accuracy of longer-term forecasts. Results indicate ARIMA models can be applied to forecast malaria patterns in Afghanistan, complementing current surveillance systems. The models provide a means to better understand malaria dynamics in a resource-limited context with minimal data input, yielding forecasts that can be used for public health planning at the national level.
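A minimal sketch of the two reported model forms, assuming Python's statsmodels and a synthetic monthly series standing in for the Ministry of Public Health data: Model 1 uses the four previous months plus a seasonal lag of 12; Model 2 uses lags of 1 and 12 months.

```python
# Hedged sketch of the two ARIMA specifications described in the abstract.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
idx = pd.date_range("2005-01", periods=129, freq="MS")  # Jan 2005 - Sep 2015
cases = pd.Series(100 + 30*np.sin(2*np.pi*idx.month/12) + rng.normal(0, 5, 129),
                  index=idx)  # synthetic seasonal incidence

# Model 1: AR lags 1-4 plus a seasonal AR lag at 12 months (near-term).
model1 = SARIMAX(cases, order=(4, 0, 0), seasonal_order=(1, 0, 0, 12)).fit(disp=False)
# Model 2: AR lag 1 plus a seasonal AR lag at 12 months (longer-term).
model2 = SARIMAX(cases, order=(1, 0, 0), seasonal_order=(1, 0, 0, 12)).fit(disp=False)

print(model1.forecast(steps=3))   # near-term forecast
print(model2.forecast(steps=12))  # longer-horizon forecast
```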
Novel fiber-MOPA-based high power blue laser
NASA Astrophysics Data System (ADS)
Engin, Doruk; Fouron, Jean-Luc; Chen, Youming; Huffman, Andromeda; Fitzpatrick, Fran; Burnham, Ralph; Gupta, Shantanu
2012-06-01
5 W of peak power at 911 nm is demonstrated with a pulsed neodymium (Nd) doped fiber master oscillator power amplifier (MOPA). This result is the first reported high-gain (16 dB) fiber amplifier operation at 911 nm. Pulse repetition frequency (PRF) and duty-cycle dependence of the all-fiber system is characterized. Negligible performance degradation is observed down to 1% duty cycle and 10 kHz PRF, where 2.5 μJ of pulse energy is achieved. Continuous wave (CW) MOPA experiments achieved 55 mW average power and 9 dB gain with 15% optical-to-optical (o-o) efficiency. Excellent agreement is established between the dynamic fiber MOPA simulation tool and experimental results in predicting output amplified spontaneous emission (ASE) and signal pulse shapes. Using the simulation tool, robust stimulated Brillouin scattering (SBS)-free operation is predicted for a two-stage all-fiber system that generates over 10 W of peak power with 500 MHz line-width. An all-fiber 911 nm pulsed laser source with >10 W of peak power is expected to increase reliability and reduce complexity of a high-energy 455 nm laser system based on optical parametric amplification for underwater applications. The views expressed are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. Government.
A signature inferred from Drosophila mitotic genes predicts survival of breast cancer patients.
Damasco, Christian; Lembo, Antonio; Somma, Maria Patrizia; Gatti, Maurizio; Di Cunto, Ferdinando; Provero, Paolo
2011-02-28
The classification of breast cancer patients into risk groups provides a powerful tool for the identification of patients who will benefit from aggressive systemic therapy. The analysis of microarray data has generated several gene expression signatures that improve diagnosis and allow risk assessment. There is also evidence that cell proliferation-related genes have a high predictive power within these signatures. We thus constructed a gene expression signature (the DM signature) using the human orthologues of 108 Drosophila melanogaster genes required for either the maintenance of chromosome integrity (36 genes) or mitotic division (72 genes). The DM signature has minimal overlap with the extant signatures and is highly predictive of survival in 5 large breast cancer datasets. In addition, we show that the DM signature outperforms many widely used breast cancer signatures in predictive power, and performs comparably to other proliferation-based signatures. For most genes of the DM signature, an increased expression is negatively correlated with patient survival. The genes that provide the highest contribution to the predictive power of the DM signature are those involved in cytokinesis. This finding highlights cytokinesis as an important marker in breast cancer prognosis and as a possible target for antimitotic therapies.
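A hedged sketch of the general workflow behind signatures of this kind, assuming the lifelines package and synthetic data: expression of the signature genes is averaged into a per-patient score, and the score's association with survival is tested with a Cox proportional hazards model. The gene names, effect sizes, and censoring scheme are illustrative, not the DM signature itself.

```python
# Hedged sketch: derive a signature score from gene expression and test its
# association with survival via a Cox model. All data here are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
expr = pd.DataFrame(rng.normal(size=(n, 5)),
                    columns=[f"gene{i}" for i in range(5)])  # toy gene panel
score = expr.apply(lambda c: (c - c.mean()) / c.std()).mean(axis=1)

# Toy survival times: higher score -> higher hazard, mimicking the reported
# negative correlation between expression and survival.
hazard = np.exp(0.8 * score)
T = rng.exponential(10 / hazard)
E = (rng.uniform(size=n) < 0.7).astype(int)  # ~70% observed events

df = pd.DataFrame({"score": score, "T": T, "E": E})
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
cph.print_summary()  # hazard ratio for the signature score
```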
GWFASTA: server for FASTA search in eukaryotic and microbial genomes.
Issac, Biju; Raghava, G P S
2002-09-01
Similarity searches are a powerful method for solving important biological problems such as database scanning, evolutionary studies, gene prediction, and protein structure prediction. FASTA is a widely used sequence comparison tool for rapid database scanning. Here we describe the GWFASTA server, which was developed to assist the FASTA user in similarity searches against partially and/or completely sequenced genomes. GWFASTA consists of more than 60 microbial genomes, eight eukaryote genomes, and the proteomes of annotated genomes. In fact, it provides the maximum number of databases for similarity searching from a single platform. GWFASTA allows the submission of more than one sequence as a single query for a FASTA search. It also provides integrated post-processing of FASTA output, including compositional analysis of proteins, multiple sequence alignment, and phylogenetic analysis. Furthermore, it summarizes the search results organism-wise for prokaryotes and chromosome-wise for eukaryotes. Thus, the integration of different tools for sequence analysis makes GWFASTA a powerful tool for biologists.
Modeling forest biomass and growth: Coupling long-term inventory and LiDAR data
Chad Babcock; Andrew O. Finley; Bruce D. Cook; Aaron Weiskittel; Christopher W. Woodall
2016-01-01
Combining spatially-explicit long-term forest inventory and remotely sensed information from Light Detection and Ranging (LiDAR) datasets through statistical models can be a powerful tool for predicting and mapping above-ground biomass (AGB) at a range of geographic scales. We present and examine a novel modeling approach to improve prediction of AGB and estimate AGB...
Lim, Chun Shen; Brown, Chris M
2017-01-01
Structured RNA elements may control virus replication, transcription and translation, and their distinct features are being exploited by novel antiviral strategies. Viral RNA elements continue to be discovered using combinations of experimental and computational analyses. However, the wealth of sequence data, notably from deep viral RNA sequencing, viromes, and metagenomes, necessitates computational approaches being used as an essential discovery tool. In this review, we describe practical approaches being used to discover functional RNA elements in viral genomes. In addition to success stories in new and emerging viruses, these approaches have revealed some surprising new features of well-studied viruses e.g., human immunodeficiency virus, hepatitis C virus, influenza, and dengue viruses. Some notable discoveries were facilitated by new comparative analyses of diverse viral genome alignments. Importantly, comparative approaches for finding RNA elements embedded in coding and non-coding regions differ. With the exponential growth of computer power we have progressed from stem-loop prediction on single sequences to cutting edge 3D prediction, and from command line to user friendly web interfaces. Despite these advances, many powerful, user friendly prediction tools and resources are underutilized by the virology community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gebraad, Pieter; Thomas, Jared J.; Ning, Andrew
This paper presents a wind plant modeling and optimization tool that enables the maximization of wind plant annual energy production (AEP) using yaw-based wake steering control and layout changes. The tool is an extension of a wake engineering model describing the steady-state effects of yaw on wake velocity profiles and the power production of wind turbines in a wind plant. To make predictions of a wind plant's AEP, necessary extensions of the original wake model include coupling it with a detailed rotor model and a control policy for turbine blade pitch and rotor speed. This enables the prediction of power production with wake effects throughout a range of wind speeds. We use the tool to perform an example optimization study on a wind plant based on the Princess Amalia Wind Park. In this case study, combined optimization of layout and wake steering control increases AEP by 5%. The power gains from wake steering control are highest for region 1.5 inflow wind speeds, and they continue to be present to some extent for above-rated inflow wind speeds. The results show that layout optimization and wake steering are complementary, because significant AEP improvements can be achieved with wake steering in a wind plant layout that is already optimized to reduce wake losses.
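To make the coupling between yaw, wake deflection, and total production concrete, here is a deliberately toy two-turbine stand-in, assuming a Jensen-style wake deficit, a crude linear deflection with yaw, and a cos^pP power loss on the yawed rotor; every constant is an illustrative assumption, not the paper's calibrated engineering model.

```python
# Toy wake-steering demo: yawing the upstream turbine sacrifices some of its
# power but deflects its wake, raising downstream (and total) production.
import numpy as np

D, k, Ct, pP = 100.0, 0.05, 0.8, 1.88   # rotor diam., wake decay, thrust, yaw exponent
x = 7 * D                                # downstream spacing (toy layout)

def two_turbine_power(yaw_deg, u0=8.0):
    yaw = np.radians(yaw_deg)
    p1 = u0**3 * np.cos(yaw)**pP                 # yawed upstream rotor
    deficit = (1 - np.sqrt(1 - Ct)) / (1 + 2*k*x/D)**2   # Jensen-style deficit
    offset = 0.3 * yaw * x                       # crude linear deflection model
    overlap = max(0.0, 1 - abs(offset) / D)      # fraction of rotor still waked
    u2 = u0 * (1 - deficit * overlap)
    return p1 + u2**3                            # proportional to total power

best = max(np.arange(0, 31), key=two_turbine_power)
print("best upstream yaw:", best, "deg; relative gain:",
      round(two_turbine_power(best) / two_turbine_power(0) - 1, 3))
```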
NASA Research to Support the Airlines
NASA Technical Reports Server (NTRS)
Mogford, Richard
2016-01-01
This PowerPoint presentation reviewed NASA projects that support airline operations. It covered NASA tasks that have provided new tools to the airline operations center and flight deck, including the Flight Awareness Collaboration Tool, Dynamic Weather Routes, Traffic Aware Strategic Aircrew Requests, and Airplane State Awareness and Prediction Technologies. This material is very similar to other previously approved presentations with the same title.
The PREM score: a graphical tool for predicting survival in very preterm births.
Cole, T J; Hey, E; Richmond, S
2010-01-01
To develop a tool for predicting survival to term in babies born more than 8 weeks early, using only information available at or before birth. 1456 non-malformed very preterm babies of 22-31 weeks' gestation born in 2000-3 in the north of England and 3382 births of 23-31 weeks born in 2000-4 in Trent. Survival to term, predicted from information available at birth and at the onset of labour or delivery. Development of a logistic regression model (the prematurity risk evaluation measure or PREM score) based on gestation, birth weight for gestation and base deficit from umbilical cord blood. Gestation was by far the most powerful predictor of survival to term, and as few as 5 extra days can double the chance of survival. Weight for gestation also had a powerful but non-linear effect on survival, with weight between the median and 85th centile predicting the highest survival. Using this information, survival can be predicted almost as accurately before birth as after, although base deficit further improves the prediction. A simple graph is described that shows how the two main variables, gestation and weight for gestation, interact to predict the chance of survival. The PREM score can be used to predict the chance of survival at or before birth almost as accurately as existing measures influenced by post-delivery condition, to balance risk at entry into a controlled trial, and to adjust for differences in "case mix" when assessing the quality of perinatal care.
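A minimal sketch, assuming synthetic data and a linearised weight-for-gestation effect (the paper reports a non-linear one), of the kind of logistic model underlying a PREM-style score; the coefficients are chosen so that roughly 5 extra days of gestation double the odds of survival, echoing the abstract.

```python
# Hedged sketch: logistic survival model on gestation, weight-for-gestation
# z-score, and cord base deficit. All data and coefficients are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1000
gestation = rng.uniform(22, 31, n)      # weeks
weight_z = rng.normal(0, 1, n)          # weight-for-gestation z-score
base_deficit = rng.normal(6, 3, n)      # mmol/L, umbilical cord blood

# 1.0 per week ~= odds ratio exp(5/7) ~= 2 per 5 extra days.
logit = -24 + 1.0*gestation + 0.5*weight_z - 0.1*base_deficit
survive = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([gestation, weight_z, base_deficit])
model = LogisticRegression(max_iter=1000).fit(X, survive)
print(model.coef_, model.intercept_)
print("P(survive | 28 wk, median weight, BD 6):",
      model.predict_proba([[28, 0.0, 6.0]])[0, 1].round(3))
```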
NASA Astrophysics Data System (ADS)
Goudarzi, Nasser
2016-04-01
In this work, two new and powerful chemometrics methods are applied to the modeling and prediction of the 19F chemical shift values of some fluorinated organic compounds. Radial basis function-partial least squares (RBF-PLS) and random forest (RF) are employed to construct models to predict the 19F chemical shifts. No separate variable selection method was used, since RF can serve as both a variable selection and a modeling technique. Effects of the important parameters affecting the predictive ability of RF, such as the number of trees (nt) and the number of randomly selected variables to split each node (m), were investigated. The root-mean-square errors of prediction (RMSEP) for the training set and the prediction set for the RBF-PLS and RF models were 44.70, 23.86, 29.77, and 23.69, respectively. Also, the correlation coefficients of the prediction set for the RBF-PLS and RF models were 0.8684 and 0.9313, respectively. The results obtained reveal that the RF model can be used as a powerful chemometrics tool for quantitative structure-property relationship (QSPR) studies.
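A hedged sketch mapping the stated RF tuning parameters onto scikit-learn (nt maps to n_estimators, m to max_features), with synthetic descriptors standing in for the 19F QSPR data; the RMSEP and correlation computations mirror the metrics reported above.

```python
# Hedged sketch: random forest regression with the two parameters the study
# tunes, evaluated by RMSEP and prediction-set correlation. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(150, 30))                       # 30 molecular descriptors
y = X[:, :5] @ np.array([8, -5, 3, 2, -1]) + rng.normal(0, 5, 150)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=3)
rf = RandomForestRegressor(n_estimators=500,  # nt
                           max_features=10,   # m, variables tried per split
                           random_state=3).fit(X_tr, y_tr)

pred = rf.predict(X_te)
rmsep = np.sqrt(np.mean((pred - y_te)**2))
r = np.corrcoef(pred, y_te)[0, 1]
print(f"RMSEP={rmsep:.2f}, r={r:.4f}")
# Built-in importances give the 'variable selection' view the authors mention.
print("top descriptors:", np.argsort(rf.feature_importances_)[::-1][:5])
```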
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.
1991-01-01
The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.
Wang, Duolin; Zeng, Shuai; Xu, Chunhui; Qiu, Wangren; Liang, Yanchun; Joshi, Trupti; Xu, Dong
2017-12-15
Computational methods for phosphorylation site prediction play important roles in protein function studies and experimental design. Most existing methods are based on feature extraction, which may result in incomplete or biased features. Deep learning as the cutting-edge machine learning method has the ability to automatically discover complex representations of phosphorylation patterns from the raw sequences, and hence it provides a powerful tool for improvement of phosphorylation site prediction. We present MusiteDeep, the first deep-learning framework for predicting general and kinase-specific phosphorylation sites. MusiteDeep takes raw sequence data as input and uses convolutional neural networks with a novel two-dimensional attention mechanism. It achieves over a 50% relative improvement in the area under the precision-recall curve in general phosphorylation site prediction and obtains competitive results in kinase-specific prediction compared to other well-known tools on the benchmark data. MusiteDeep is provided as an open-source tool available at https://github.com/duolinwang/MusiteDeep. xudong@missouri.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
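A minimal sketch (not MusiteDeep itself) of a convolutional network over one-hot-encoded protein windows for site prediction, assuming PyTorch; the window length, layer sizes, and the absence of the two-dimensional attention mechanism are all simplifications of the published architecture.

```python
# Hedged sketch: 1-D CNN scoring whether the centre residue of a 33-mer
# window is a phosphosite. Untrained; for illustrating the data flow only.
import torch
import torch.nn as nn

AA = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(window: str) -> torch.Tensor:
    t = torch.zeros(len(AA), len(window))
    for i, aa in enumerate(window):
        if aa in AA:
            t[AA.index(aa), i] = 1.0
    return t

class SiteCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(20, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveMaxPool1d(1), nn.Flatten(),
            nn.Linear(32, 1),  # logit: centre residue phosphorylated?
        )
    def forward(self, x):
        return self.net(x)

model = SiteCNN()
batch = torch.stack([one_hot("A" * 16 + "S" + "G" * 16)])  # one 33-mer window
print(torch.sigmoid(model(batch)))  # untrained probability, illustrative only
```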
Cohen-Stavi, Chandra; Leventer-Roberts, Maya; Balicer, Ran D
2017-01-01
Objective: To directly compare the performance and externally validate the three most studied prediction tools for osteoporotic fractures—QFracture, FRAX, and Garvan—using data from electronic health records. Design: Retrospective cohort study. Setting: Payer provider healthcare organisation in Israel. Participants: 1 054 815 members aged 50 to 90 years for comparison between tools, and cohorts of different age ranges, corresponding to those in each tool's development study, for tool-specific external validation. Main outcome measure: First diagnosis of a major osteoporotic fracture (for QFracture and FRAX tools) and hip fractures (for all three tools) recorded in electronic health records from 2010 to 2014. Observed fracture rates were compared to probabilities predicted retrospectively as of 2010. Results: The observed five year hip fracture rate was 2.7% and the rate for major osteoporotic fractures was 7.7%. The areas under the receiver operating characteristic curve (AUC) for hip fracture prediction were 82.7% for QFracture, 81.5% for FRAX, and 77.8% for Garvan. For major osteoporotic fractures, AUCs were 71.2% for QFracture and 71.4% for FRAX. All the tools underestimated the fracture risk, but the average observed to predicted ratios and the calibration slopes of FRAX were closest to 1. Tool-specific validation analyses yielded hip fracture prediction AUCs of 88.0% for QFracture (among those aged 30-100 years), 81.5% for FRAX (50-90 years), and 71.2% for Garvan (60-95 years). Conclusions: Both QFracture and FRAX had high discriminatory power for hip fracture prediction, with QFracture performing slightly better. This performance gap was more pronounced in previous studies, likely because of broader age inclusion criteria for QFracture validations. The simpler FRAX performed almost as well as QFracture for hip fracture prediction, and may have advantages if some of the input data required for QFracture are not available. However, both tools require calibration before implementation. PMID:28104610
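A short sketch of the three validation quantities reported (discrimination via AUC, mean observed:predicted ratio, and calibration slope), computed on synthetic risks; the calibration slope here is taken as the coefficient from regressing outcomes on the logit of predicted risk, one common convention.

```python
# Hedged sketch of external-validation metrics for a risk prediction tool.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
true_risk = rng.beta(1, 20, 5000)                 # low 5-year fracture risks
outcome = rng.uniform(size=5000) < true_risk
predicted = true_risk * 0.7                       # a tool that under-predicts

auc = roc_auc_score(outcome, predicted)
obs_pred = outcome.mean() / predicted.mean()      # >1 means under-prediction
logit = np.log(predicted / (1 - predicted)).reshape(-1, 1)
slope = LogisticRegression(max_iter=1000).fit(logit, outcome).coef_[0, 0]
print(f"AUC={auc:.3f}  observed:predicted={obs_pred:.2f}  slope={slope:.2f}")
```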
Conser, Christiana; Seebacher, Lizbeth; Fujino, David W; Reichard, Sarah; DiTomaso, Joseph M
2015-01-01
Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing them as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when "needs further evaluation" classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When "needs further evaluation" classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program.
NASA Technical Reports Server (NTRS)
2010-01-01
Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.
2007-03-01
HIGH RESOLUTION MESOSCALE WEATHER DATA IMPROVEMENT TO SPATIAL EFFECTS FOR DOSE-RATE CONTOUR PLOT PREDICTIONS. Thesis, Christopher P... ...time. This is a very powerful tool in determining fine spatial resolution, as boundary conditions are not only updated at every timestep, but the ...
DR2DI: a powerful computational tool for predicting novel drug-disease associations
NASA Astrophysics Data System (ADS)
Lu, Lu; Yu, Hua
2018-05-01
Finding new candidate diseases for known drugs provides an effective route to fast, low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the need for in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we presented a novel and powerful computational tool, DR2DI, for accurately uncovering the potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI inferred the unknown relationships between drugs and diseases using a Regularized Kernel Classifier. Importantly, DR2DI employed a semi-supervised and global learning algorithm which can be applied to uncover the diseases (drugs) associated with known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI could be validated by existing database records and literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.
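A schematic stand-in for the similarity-kernel idea, assuming scikit-learn and toy pair features: two precomputed similarity kernels are combined into a unified kernel and fed to a regularized kernel classifier. The real DR2DI kernel construction and its semi-supervised, global learning algorithm are richer than this sketch.

```python
# Hedged sketch: combine two similarity kernels (one 'drug-side', one
# 'disease-side') and train a regularized classifier on the precomputed kernel.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = rng.normal(size=(120, 40))                 # toy drug-disease pair features
y = (X[:, 0] + X[:, 1] > 0).astype(int)        # toy association labels

def rbf(A, B, gamma=0.05):
    d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
    return np.exp(-gamma * d2)

K1 = rbf(X[:, :20], X[:, :20])                 # 'drug-side' similarity kernel
K2 = rbf(X[:, 20:], X[:, 20:])                 # 'disease-side' similarity kernel
K = 0.5 * K1 + 0.5 * K2                        # unified kernel (assumed weights)

clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
print("training accuracy:", clf.score(K, y))   # illustration only
```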
A probabilistic neural network based approach for predicting the output power of wind turbines
NASA Astrophysics Data System (ADS)
Tabatabaei, Sajad
2017-03-01
Reliable tools for quantifying the uncertainty of wind speed forecasts are increasingly needed as wind power penetration grows, and traditional models that produce only point forecasts are no longer sufficient. The present paper therefore utilises the concept of prediction intervals (PIs) to assess the uncertainty of wind power generation in power systems. It uses a recently introduced non-parametric approach called lower upper bound estimation (LUBE) to build the PIs, since the forecasting errors cannot be modelled properly by standard probability distribution functions. In the proposed LUBE method, a PI combination-based fuzzy framework is used to overcome the performance instability of the neural networks (NNs) used in LUBE. In comparison to other methods, this formulation better satisfies the PI coverage probability and PI normalised average width (PINAW) criteria. Since this non-linear problem is highly complex, a new heuristic-based optimisation algorithm incorporating a novel modification is introduced to solve it. The feasibility and performance of the suggested method are demonstrated on data sets taken from a wind farm in Australia.
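A hedged sketch of the LUBE idea, assuming PyTorch: the network emits interval bounds directly and is trained on a coverage-versus-width trade-off rather than squared error. The soft-coverage loss below is one simple differentiable surrogate, not the paper's fuzzy PI-combination framework or its heuristic optimiser.

```python
# Hedged sketch: a network outputs [lower bound, width]; the loss rewards
# narrow intervals while penalising coverage below a 90% target.
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.linspace(0, 1, 256).unsqueeze(1)
y = torch.sin(6 * x) + 0.1 * (1 + x) * torch.randn_like(x)  # heteroscedastic toy data

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 2))
opt = torch.optim.Adam(net.parameters(), lr=0.01)
target_cov, lam = 0.90, 10.0

for _ in range(2000):
    out = net(x)
    lo = out[:, :1]
    hi = lo + torch.nn.functional.softplus(out[:, 1:])   # enforce hi >= lo
    inside = torch.sigmoid(50*(y - lo)) * torch.sigmoid(50*(hi - y))
    coverage = inside.mean()                             # soft PI coverage
    width = (hi - lo).mean()                             # PINAW-like term
    loss = width + lam * torch.relu(target_cov - coverage)**2
    opt.zero_grad(); loss.backward(); opt.step()

print(f"coverage~{coverage.item():.2f}, mean width~{width.item():.2f}")
```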
Dose-response patterns for vibration-induced white finger
Griffin, M; Bovenzi, M; Nelson, C
2003-01-01
Aims: To investigate alternative relations between cumulative exposures to hand-transmitted vibration (taking account of vibration magnitude, lifetime exposure duration, and frequency of vibration) and the development of white finger (Raynaud's phenomenon). Methods: Three previous studies were combined to provide a group of 1557 users of powered vibratory tools in seven occupational subgroups: stone grinders, stone carvers, quarry drillers, dockyard caulkers, dockyard boilermakers, dockyard painters, and forest workers. The estimated total operating duration in hours was thus obtained for each subject, for each tool, and for all tools combined. From the vibration magnitudes and exposure durations, seven alternative measures of cumulative exposure were calculated for each subject, using expressions of the form dose = Σᵢ aᵢᵐ tᵢ, where aᵢ is the acceleration magnitude on tool i, tᵢ is the lifetime exposure duration for tool i, and m = 0, 1, 2, or 4. Results: For all seven alternative dose measures, an increase in dose was associated with a significant increase in the occurrence of vibration-induced white finger, after adjustment for age and smoking. However, dose measures with high powers of acceleration (m > 1) fared less well than measures in which the weighted or unweighted acceleration and the lifetime exposure duration were given equal weight (m = 1). Dose determined solely by the lifetime exposure duration (without consideration of the vibration magnitude) gave better predictions than measures with m greater than unity. All measures of dose calculated from the unweighted acceleration gave better predictions than the equivalent dose measures using acceleration frequency-weighted according to current standards. Conclusions: Since the total duration of exposure does not discriminate between exposures accumulated over the day and those accumulated over years, a linear relation between vibration magnitude and exposure duration seems appropriate for predicting the occurrence of vibration-induced white finger. Poorer predictions were obtained when the currently recommended frequency weighting was employed than when accelerations at all frequencies were given equal weight. Findings suggest that improvements are possible to both the frequency weighting and the time dependency used to predict the development of vibration-induced white finger in current standards. PMID:12499452
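A worked example of the competing dose metrics for a hypothetical worker who used two tools, showing how m = 0 reduces the formula to total exposure duration (magnitude ignored) while larger m weight the acceleration more heavily.

```python
# Worked example of dose = sum_i a_i^m * t_i for m = 0, 1, 2, 4.
# The acceleration and duration values below are invented for illustration.
import numpy as np

a = np.array([4.0, 9.0])       # r.m.s. acceleration per tool, m/s^2
t = np.array([3000.0, 500.0])  # lifetime operating duration per tool, hours

for m in (0, 1, 2, 4):
    dose = np.sum(a**m * t)
    print(f"m={m}: dose = {dose:,.0f}")  # m=0 gives 3500 h, the total duration
```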
The Applications of NASA Mission Technologies to the Greening of Human Impact
NASA Technical Reports Server (NTRS)
Sims, Michael H.
2009-01-01
I will give an overview talk about flight software systems, robotics technologies, and modeling for energy minimization as applied to vehicles and building infrastructures. A dominant issue in both the design and operations of robotic spacecraft is the minimization of energy use. In the design and building of spacecraft, increased power is acquired only at the cost of additional mass and volume, and ultimately cost. Consequently, interplanetary spacecraft are designed to have the minimum essential power, and those designs often incorporate careful timing of all power use. Operationally, the availability of power is the most influential constraint for the use of planetary surface robots, such as the Mars Exploration Rovers. The amount of driving done, the amount of science accomplished, and indeed the survivability of the spacecraft itself are determined by the power available for use. For the Mars Exploration Rovers there are four tools which are used: (1) models of the rover and its thermal and power use; (2) predictive environmental models of power input and thermal environment; (3) fine-grained manipulation of power use; (4) optimization modeling and planning tools. In this talk I will discuss possible applications of this methodology to minimizing power use on Earth, especially in buildings.
Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Although long-term integrated exposure measurements are a critical component of exposure assessment, the ability to include these measurements into epidemiologic...
An Alternative Procedure for Estimating Unit Learning Curves,
1985-09-01
the model accurately describes the real-life situation, i.e., when the model is properly applied to the data, it can be a powerful tool for... predicting unit production costs. There are, however, some unique estimation problems inherent in the model. The usual method of generating predicted unit... production costs attempts to extend properties of least squares estimators to non-linear functions of these estimators. The result is biased estimates of
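A minimal sketch of the standard log-linear unit learning curve the report discusses, c(x) = T1 · x^b with b = log2(learning slope), fitted by ordinary least squares in log-log space on synthetic data; the back-transformed prediction is the step the report critiques as biased.

```python
# Hedged sketch: fit a unit learning curve c(x) = T1 * x^b in log-log space.
import numpy as np

rng = np.random.default_rng(6)
units = np.arange(1, 51)
T1, slope = 1000.0, 0.85                   # assumed 85% learning curve
b = np.log(slope) / np.log(2)              # b = log2(slope)
cost = T1 * units**b * np.exp(rng.normal(0, 0.05, units.size))

# OLS in log space: log c = log T1 + b * log x
b_hat, logT1_hat = np.polyfit(np.log(units), np.log(cost), 1)
print(f"estimated slope = {2**b_hat:.3f}, estimated T1 = {np.exp(logT1_hat):.0f}")
# Back-transformed point prediction (the potentially biased step):
print("predicted cost of unit 100:", np.exp(logT1_hat) * 100**b_hat)
```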
Dental History Predictors of Caries Related Dental Emergencies.
1981-11-01
...10+) 50% of those with U-lesions would be selected and only 4% of those without disease would be selected. The accuracy of such a system as well as... sufficient sensitivity, specificity, and diagnostic power to be useful as predictive tools. Dental health classification systems are typically only... predicted with some reliability given the intimacy of the relationship and the relatively long duration of the pre-emergency state. The incidence of
Sugimoto, Masahiro; Takada, Masahiro; Toi, Masakazu
2014-12-09
Nomograms are a standard computational tool for predicting the likelihood of an outcome using multiple available patient features. We have developed a more powerful data mining methodology to predict axillary lymph node (AxLN) metastasis and response to neoadjuvant chemotherapy (NAC) in primary breast cancer patients, and we have developed websites for using these tools. The tools calculate the probability of AxLN metastasis (AxLN model) and of pathological complete response to NAC (NAC model). As the calculation algorithm, we employed a decision tree-based prediction model known as the alternating decision tree (ADTree), a generalization of if-then-type decision trees. An ensemble technique was used to combine multiple ADTree predictions, resulting in higher generalization ability and robustness against missing values. The AxLN model was developed with training datasets (n=148) and test datasets (n=143), and validated using an independent cohort (n=174), yielding an area under the receiver operating characteristic curve (AUC) of 0.768. The NAC model was developed and validated with n=150 and n=173 datasets from a randomized controlled trial, yielding an AUC of 0.787. The AxLN and NAC models require users to input up to 17 and 16 variables, respectively. These include pathological features, such as human epidermal growth factor receptor 2 (HER2) status, and imaging findings. Each input variable has an option of "unknown," to facilitate prediction for cases with missing values.
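A rough analogue, not the authors' ADTree ensemble: a boosted decision-tree classifier that, like the published tool, accepts records with missing ("unknown") inputs, assuming scikit-learn's histogram gradient boosting, which handles NaN natively. Features and labels are synthetic stand-ins for the 16-17 clinical variables.

```python
# Hedged sketch: tree-ensemble prediction that tolerates missing values.
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
X = rng.normal(size=(400, 16))                   # 16 toy clinical variables
X[rng.uniform(size=X.shape) < 0.1] = np.nan      # ~10% 'unknown' entries
y = (np.nan_to_num(X[:, 0]) + np.nan_to_num(X[:, 1]) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=7)
clf = HistGradientBoostingClassifier(max_iter=200).fit(X_tr, y_tr)  # NaN-aware
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]).round(3))
```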
Analysis of LH Launcher Arrays (Like the ITER One) Using the TOPLHA Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maggiora, R.; Milanesio, D.; Vecchi, G.
2009-11-26
TOPLHA (Torino Polytechnic Lower Hybrid Antenna) code is an innovative tool for the 3D/1D simulation of Lower Hybrid (LH) antennas, i.e. accounting for realistic 3D waveguide geometry and for accurate 1D plasma models, without restrictions on waveguide shape, including curvature. This tool provides detailed performance predictions for any LH launcher, by computing the antenna scattering parameters, the current distribution, electric field maps, and power spectra for any user-specified waveguide excitation. In addition, a fully parallelized and multi-cavity version of TOPLHA permits the analysis of large and complex waveguide arrays in a reasonable simulation time. A detailed analysis of the performance of the proposed ITER LH antenna geometry has been carried out, underlining the strong dependence of the antenna input parameters on plasma conditions. A preliminary optimization of the antenna dimensions has also been accomplished. The electric current distribution on conductors, the electric field distribution at the interface with plasma, and power spectra have been calculated as well. The analysis shows the strong capabilities of the TOPLHA code as a predictive tool and its usefulness for the detailed design of LH launcher arrays.
Althouse, Linda A; McGuinness, Gail A
2008-09-01
This study investigates the predictive validity of the In-Training Examination (ITE). Although studies have confirmed the predictive validity of ITEs in other medical specialties, no study has been done for general pediatrics. Each year, residents in accredited pediatric training programs take the ITE as a self-assessment instrument. The ITE is similar to the American Board of Pediatrics General Pediatrics Certifying Examination. First-time takers of the certifying examination over a 5-year period who took at least 1 ITE examination were included in the sample. Regression models analyzed the predictive value of the ITE. The predictive power of the ITE in the first training year is minimal. However, the predictive power of the ITE increases each year, providing the greatest power in the third year of training. Even though ITE scores provide information regarding the likelihood of passing the certification examination, the data should be used with caution, particularly in the first training year. Other factors also must be considered when predicting performance on the certification examination. This study continues to support the ITE as an assessment tool for program directors, as well as a means of providing residents with feedback regarding their acquisition of pediatric knowledge.
Price dynamics in political prediction markets
Majumder, Saikat Ray; Diermeier, Daniel; Rietz, Thomas A.; Amaral, Luís A. Nunes
2009-01-01
Prediction markets, in which contract prices are used to forecast future events, are increasingly applied to various domains ranging from political contests to scientific breakthroughs. However, the dynamics of such markets are not well understood. Here, we study the return dynamics of the oldest, most data-rich prediction markets, the Iowa Electronic Presidential Election “winner-takes-all” markets. As with other financial markets, we find uncorrelated returns, power-law decaying volatility correlations, and, usually, power-law decaying distributions of returns. However, unlike other financial markets, we find conditional diverging volatilities as the contract settlement date approaches. We propose a dynamic binary option model that captures all features of the empirical data and can potentially provide a tool with which one may extract true information events from a price time series. PMID:19155442
GIS and crop simulation modelling applications in climate change research
USDA-ARS?s Scientific Manuscript database
The challenges that climate change presents humanity require an unprecedented ability to predict the responses of crops to environment and management. Geographic information systems (GIS) and crop simulation models are two powerful and highly complementary tools that are increasingly used for such p...
Application of TREECS (trademark) to Strontium 90 for Borschi Watershed near Chernobyl, Ukraine
2012-08-01
near Chernobyl, Ukraine, by Mark S. Dortch. PURPOSE: The Training Range Environmental Evaluation and Characterization System (TREECS™) (http... Chernobyl Nuclear Power Plant, Ukraine. At this site, TREECS™ was used as a modeling tool to predict the fate of radionuclides. This application also... Web site noted above. Borschi watershed is located 3 km south of the Chernobyl Nuclear Power Plant (Figure 1). Radiostrontium-90 (90Sr), which is a
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Hurysz, B.; Luo, Z.; Denny, Hugh W.; Millard, David P.; Herkert, R.; Wang, R.
1992-01-01
The goal of this research project was to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) developing analytical tools (models and computer programs); (2) conducting parameterization (what if?) studies; (3) predicting the global space station EMI environment; and (4) providing a basis for modification of EMI standards.
Towards a National Space Weather Predictive Capability
NASA Astrophysics Data System (ADS)
Fox, N. J.; Ryschkewitsch, M. G.; Merkin, V. G.; Stephens, G. K.; Gjerloev, J. W.; Barnes, R. J.; Anderson, B. J.; Paxton, L. J.; Ukhorskiy, A. Y.; Kelly, M. A.; Berger, T. E.; Bonadonna, L. C. M. F.; Hesse, M.; Sharma, S.
2015-12-01
National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review the space weather system developed for the Van Allen Probes mission, together with other datasets, tools and models that have resulted from research by scientists at JHU/APL. We will look at how these, and results from future missions such as Solar Probe Plus, could be applied to support space weather applications in coordination with other community assets and capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite moderated gas cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that consistency in the prediction of power densities as well as uranium and plutonium isotopics was shared among the methods within the CRPE tool that predicted the critical position consistently well. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
On the universality of power laws for tokamak plasma predictions
NASA Astrophysics Data System (ADS)
Garcia, J.; Cambon, D.; Contributors, JET
2018-02-01
Significant deviations from well-established power laws for the thermal energy confinement time, such as the IPB98(y,2) scaling obtained from extensive database analyses, have recently been reported in dedicated power scans. In order to examine the adequacy, validity and universality of power laws as tools for predicting plasma performance, a simplified analysis has been carried out in the framework of a minimal model for heat transport which is, however, able to account for the interplay between turbulence and effects collinear with the input power that are known to play a role in experiments with significant deviations from such power laws. Whereas at low powers the usual scaling laws are recovered with little influence of other plasma parameters, resulting in a robust power-law exponent, at high power the exponents obtained are shown to be extremely sensitive to the heating deposition, the q-profile, or even the sampling or the number of points considered, owing to the highly non-linear behavior of the heat transport. In particular circumstances, even a minimum of the thermal energy confinement time with respect to the input power can be obtained, which means that representing the energy confinement time as a power law might be intrinsically invalid. Plasma predictions based on a power-law approximation with a constant exponent, obtained from a regression over a broad range of powers and other plasma parameters that can non-linearly affect and suppress heat transport, can therefore be misleading; this approach should be applied cautiously and its results continuously compared with modeling that can properly capture the underlying physics, such as gyrokinetic simulations.
Two dimensional finite element thermal model of laser surface glazing for H13 tool steel
NASA Astrophysics Data System (ADS)
Kabir, I. R.; Yin, D.; Naher, S.
2016-10-01
A two-dimensional (2D) transient thermal model with a line heat source was developed by the Finite Element Method (FEM) for laser surface glazing of H13 tool steel using the commercial software ANSYS 15. The geometry of the model was taken as a transverse circular cross-section of a cylindrical specimen. Two different power levels (300W, 200W) were used with a 0.2mm laser beam width and 0.15ms exposure time. Temperature distribution, heating and cooling rates, and the dimensions of the modified surface were analysed. The maximum temperatures achieved were 2532K (2259°C) and 1592K (1319°C) for laser powers of 300W and 200W respectively. The maximum cooling rates were 4.2×10⁷ K/s for 300W and 2×10⁷ K/s for 200W. The depth of the modified zone increased with increasing laser power. From this analysis, it can be predicted that for a 0.2mm beam width and 0.15ms exposure time, the melting temperature of H13 tool steel is reached within the 200-300W laser power range in laser surface glazing.
NASA Technical Reports Server (NTRS)
Thresher, R. W. (Editor)
1981-01-01
Recent progress in the analysis and prediction of the dynamic behavior of wind turbine generators is discussed. The following areas were addressed: (1) the adequacy of state-of-the-art analysis tools for designing the next generation of wind power systems; (2) the use of state-of-the-art analysis tools by designers; and (3) verification of theory that might be lacking or inadequate. Summaries of these informative discussions as well as the questions and answers which followed each paper are documented in the proceedings.
Advanced Neutronics Tools for BWR Design Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santamarina, A.; Hfaiedh, N.; Letellier, R.
2006-07-01
This paper summarizes the developments implemented in the new APOLLO2.8 neutronics tool to meet the required target accuracy in LWR applications, particularly void effects and pin-by-pin power maps in BWRs. The Method of Characteristics was developed to allow efficient LWR assembly calculations in 2D-exact heterogeneous geometry; resonant reaction calculation was improved by the optimized SHEM-281 group mesh, which avoids resonance self-shielding approximation below 23 eV, and by the new space-dependent method for resonant mixtures that accounts for resonance overlapping. Furthermore, a new library, CEA2005, processed from JEFF3.1 evaluations involving feedback from critical experiments and LWR P.I.E., is used. The specific '2005-2007 BWR Plan' established to demonstrate the validation/qualification of this neutronics tool is described. Some results from the validation process are presented: the comparison of APOLLO2.8 results to reference Monte Carlo TRIPOLI4 results on specific BWR benchmarks emphasizes the ability of the deterministic tool to calculate the BWR assembly multiplication factor within 200 pcm accuracy for void fractions varying from 0 to 100%. The qualification process against the BASALA mock-up experiment stresses APOLLO2.8/CEA2005 performance: pin-by-pin power is always predicted within 2% accuracy, and the reactivity worth of B4C or Hf cruciform control blades, as well as Gd pins, is predicted within 1.2% accuracy. (authors)
In vitro models for the prediction of in vivo performance of oral dosage forms.
Kostewicz, Edmund S; Abrahamsson, Bertil; Brewster, Marcus; Brouwers, Joachim; Butler, James; Carlert, Sara; Dickinson, Paul A; Dressman, Jennifer; Holm, René; Klein, Sandra; Mann, James; McAllister, Mark; Minekus, Mans; Muenster, Uwe; Müllertz, Anette; Verwei, Miriam; Vertzoni, Maria; Weitschies, Werner; Augustijns, Patrick
2014-06-16
Accurate prediction of the in vivo biopharmaceutical performance of oral drug formulations is critical to efficient drug development. Traditionally, in vitro evaluation of oral drug formulations has focused on disintegration and dissolution testing for quality control (QC) purposes. The connection with in vivo biopharmaceutical performance has often been ignored. More recently, the switch to assessing drug products in a more biorelevant and mechanistic manner has advanced the understanding of drug formulation behavior. Notwithstanding this evolution, predicting the in vivo biopharmaceutical performance of formulations that rely on complex intraluminal processes (e.g. solubilization, supersaturation, precipitation…) remains extremely challenging. Concomitantly, the increasing demand for complex formulations to overcome low drug solubility or to control drug release rates urges the development of new in vitro tools. Developing and optimizing innovative, predictive Oral Biopharmaceutical Tools is the main target of the OrBiTo project within the Innovative Medicines Initiative (IMI) framework. A combination of physico-chemical measurements, in vitro tests, in vivo methods, and physiology-based pharmacokinetic modeling is expected to create a unique knowledge platform, enabling the bottlenecks in drug development to be removed and the whole process of drug development to become more efficient. As part of the basis for the OrBiTo project, this review summarizes the current status of predictive in vitro assessment tools for formulation behavior. Both pharmacopoeia-listed apparatus and more advanced tools are discussed. Special attention is paid to major issues limiting the predictive power of traditional tools, including the simulation of dynamic changes in gastrointestinal conditions, the adequate reproduction of gastrointestinal motility, the simulation of supersaturation and precipitation, and the implementation of the solubility-permeability interplay. It is anticipated that the innovative in vitro biopharmaceutical tools arising from the OrBiTo project will lead to improved predictions for in vivo behavior of drug formulations in the GI tract. Copyright © 2013 Elsevier B.V. All rights reserved.
Parameter Estimation for a Turbulent Buoyant Jet Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Christopher, Jason D.; Wimer, Nicholas T.; Hayden, Torrey R. S.; Lapointe, Caelan; Grooms, Ian; Rieker, Gregory B.; Hamlington, Peter E.
2016-11-01
Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other "truth" data to be used for the prediction of unknown model parameters in numerical simulations of real-world engineering systems. In this presentation, we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a simulation with known boundary conditions and problem parameters. Using spatially-sparse temperature statistics from the 2D buoyant jet truth simulation, we show that the ABC method provides accurate predictions of the true jet inflow temperature. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for engineering fluid dynamics research.
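The core of the ABC rejection approach can be sketched in a few lines. The toy example below is ours, not the authors': the linear forward model, prior range, and tolerance are illustrative stand-ins for the 2D buoyant jet simulation, but the loop is the essential technique: sample a candidate parameter from the prior, simulate, and accept the candidate when its sparse summary statistics land within a tolerance of the truth data.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=20):
    # Stand-in forward model: a noisy linear response. In the paper's setting
    # this would be a simulation of the 2D turbulent buoyant jet.
    return theta * np.linspace(0.0, 1.0, n) + rng.normal(0.0, 0.1, n)

def summary(x):
    # Sparse summary statistics, analogous to spatially-sparse temperature data.
    return np.array([x.mean(), x.std()])

theta_true = 3.0
truth = summary(simulate(theta_true))

# ABC rejection: keep prior draws whose simulated summaries land near the truth.
prior_draws = rng.uniform(0.0, 10.0, 10000)
tol = 0.1
accepted = [th for th in prior_draws
            if np.linalg.norm(summary(simulate(th)) - truth) < tol]

print(f"accepted {len(accepted)} draws, posterior mean ~ {np.mean(accepted):.2f} "
      f"(true inflow parameter {theta_true})")
```

The accepted draws approximate the posterior over the unknown parameter; tightening the tolerance trades acceptance rate for posterior accuracy.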
Laser Powered Launch Vehicle Performance Analyses
NASA Technical Reports Server (NTRS)
Chen, Yen-Sen; Liu, Jiwen; Wang, Ten-See (Technical Monitor)
2001-01-01
The purpose of this study is to establish the technical ground for modeling the physics of the laser powered pulse detonation phenomenon. Laser powered propulsion systems involve complex fluid dynamics, thermodynamics and radiative transfer processes. Successful prediction of the performance of laser powered launch vehicle concepts depends on sophisticated models that reflect the underlying flow physics, including laser ray tracing and focusing, inverse Bremsstrahlung (IB) effects, finite-rate air chemistry, thermal non-equilibrium, plasma radiation and detonation wave propagation, etc. The proposed work will extend the base-line numerical model to an efficient design analysis tool. The proposed model is suitable for 3-D analysis using parallel computing methods.
Evaluating the Power Consumption of Wireless Sensor Network Applications Using Models
Dâmaso, Antônio; Freitas, Davi; Rosa, Nelson; Silva, Bruno; Maciel, Paulo
2013-01-01
Power consumption is the main concern in developing Wireless Sensor Network (WSN) applications. Consequently, several strategies have been proposed for investigating the power consumption of this kind of application. These strategies can help to predict the WSN lifetime, provide recommendations to application developers and may optimize the energy consumed by the WSN applications. While measurement is a known and precise strategy for power consumption evaluation, it is very costly, tedious and may be unfeasible considering the (usually) large number of WSN nodes. Furthermore, due to the inherent dynamism of WSNs, the instrumentation required by measurement techniques makes their use difficult in several different scenarios. In this context, this paper presents an approach for evaluating the power consumption of WSN applications by using simulation models, along with a set of tools to automate the proposed approach. Starting from a programming language code, we automatically generate consumption models used to predict the power consumption of WSN applications. In order to evaluate the proposed approach, we compare the results obtained by using the generated models against ones obtained by measurement. PMID:23486217
Evaluating the power consumption of wireless sensor network applications using models.
Dâmaso, Antônio; Freitas, Davi; Rosa, Nelson; Silva, Bruno; Maciel, Paulo
2013-03-13
Power consumption is the main concern in developing Wireless Sensor Network (WSN) applications. Consequently, several strategies have been proposed for investigating the power consumption of this kind of application. These strategies can help to predict the WSN lifetime, provide recommendations to application developers and may optimize the energy consumed by the WSN applications. While measurement is a known and precise strategy for power consumption evaluation, it is very costly, tedious and may be unfeasible considering the (usually) large number of WSN nodes. Furthermore, due to the inherent dynamism of WSNs, the instrumentation required by measurement techniques makes their use difficult in several different scenarios. In this context, this paper presents an approach for evaluating the power consumption of WSN applications by using simulation models, along with a set of tools to automate the proposed approach. Starting from a programming language code, we automatically generate consumption models used to predict the power consumption of WSN applications. In order to evaluate the proposed approach, we compare the results obtained by using the generated models against ones obtained by measurement.
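As a rough illustration of what such a generated consumption model computes, the sketch below (our own simplification, not the paper's tooling; the per-state powers, duty cycle, and battery figures are invented) sums per-state energy over a duty cycle and converts battery capacity into a lifetime prediction.

```python
# Minimal sketch of a state-based power consumption model for a WSN node.
# Per-state power draw (mW) and time per duty cycle (s) are illustrative.
STATE_POWER_MW = {"sleep": 0.015, "sense": 3.0, "cpu": 6.0, "tx": 52.0, "rx": 56.0}
CYCLE_SCHEDULE_S = {"sleep": 58.0, "sense": 0.5, "cpu": 0.8, "tx": 0.4, "rx": 0.3}

def energy_per_cycle_mj() -> float:
    """Energy per duty cycle in millijoules: sum of P * t over node states."""
    return sum(STATE_POWER_MW[s] * CYCLE_SCHEDULE_S[s] for s in STATE_POWER_MW)

def predicted_lifetime_days(battery_mah: float = 2500.0, voltage: float = 3.0) -> float:
    """Predict node lifetime from battery capacity and per-cycle energy."""
    battery_j = battery_mah * 3.6 * voltage        # 1 mAh = 3.6 C; E = Q * V
    cycle_energy_j = energy_per_cycle_mj() / 1000.0
    cycle_s = sum(CYCLE_SCHEDULE_S.values())
    return battery_j / cycle_energy_j * cycle_s / 86400.0

print(f"energy/cycle: {energy_per_cycle_mj():.1f} mJ, "
      f"predicted lifetime: {predicted_lifetime_days():.0f} days")
```

A model of this shape makes the lifetime sensitivity to each state (e.g. radio time versus sleep time) explicit, which is the kind of recommendation the paper's approach aims to give developers.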
Unified Performance and Power Modeling of Scientific Workloads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Shuaiwen; Barker, Kevin J.; Kerbyson, Darren J.
2013-11-17
It is expected that scientific applications executing on future large-scale HPC systems must be optimized not only in terms of performance, but also in terms of power consumption. As power and energy become increasingly constrained resources, researchers and developers must have access to tools that will allow for accurate prediction of both performance and power consumption. Reasoning about performance and power consumption in concert will be critical for achieving maximum utilization of limited resources on future HPC systems. To this end, we present a unified performance and power model for the Nek-Bone mini-application developed as part of the DOE's CESAR Exascale Co-Design Center. Our models consider the impact of computation, point-to-point communication, and collective communication.
Determinants of the Pace of Global Innovation in Energy Technologies
2013-10-14
quality (see Figures S1 and S2 in File S1), a comprehensive patent database is a powerful tool for investigating the determinants of innovative...model in order to avoid overfitting the data and to maximize predictive power. We develop a model that explains the observed trends in energy...patents. (A.) World map of cumulative patents in photovoltaics (solar). Japan is the leading nation in terms of patent numbers, followed by the US and China
Predicted and Measured Modal Sound Power Levels for a Fan Ingesting Distorted Inflow
NASA Technical Reports Server (NTRS)
Koch, L. Danielle
2010-01-01
Refinements have been made to a method for estimating the modal sound power levels of a ducted fan ingesting distorted inflow. By assuming that each propagating circumferential mode consists only of a single radial mode (the one with the highest cut-off ratio), circumferential mode sound power levels can be computed for a variety of inflow distortion patterns and operating speeds. Predictions from the refined theory have been compared to data from an experiment conducted in the Advanced Noise Control Fan at NASA Glenn Research Center. The inflow to the fan was distorted by inserting cylindrical rods radially into the inlet duct. The rods were placed at an axial location one rotor chord length upstream of the fan and arranged in both regular and irregular circumferential patterns. The fan was operated at 2000, 1800, and 1400 rpm. Acoustic pressure levels were measured in the fan inlet and exhaust ducts using the Rotating Rake fan mode measurement system. Far field sound pressure levels were also measured. It is shown that predicted trends in circumferential mode sound power levels closely match the experimental data for all operating speeds and distortion configurations tested. Insight gained through this work is being used to develop more advanced tools for predicting fan inflow distortion tone noise levels.
USDA-ARS?s Scientific Manuscript database
To address the multiple challenges to food security posed by global climate change, population growth and rising incomes, plant breeders are developing new crop varieties that can enhance both agricultural productivity and environmental sustainability. Current breeding practices, however, are unable...
Jessica Wright
2014-01-01
Combining data from provenance test studies with our current understanding of predicted climate change can be a powerful tool for informing reforestation efforts. However, the limitations of both sources of data need to be understood to develop an approach to ecological restoration that reduces risk and promotes the highest chance of successful reforestation.
FOUR Score Predicts Early Outcome in Patients After Traumatic Brain Injury.
Nyam, Tee-Tau Eric; Ao, Kam-Hou; Hung, Shu-Yu; Shen, Mei-Li; Yu, Tzu-Chieh; Kuo, Jinn-Rung
2017-04-01
The aim of the study was to determine whether the Full Outline of UnResponsiveness (FOUR) score, which includes eye opening (E), motor function (M), brainstem reflexes (B), and respiratory pattern (R), can be used as an alternative to the Glasgow Coma Scale (GCS) in predicting intensive care unit (ICU) mortality in traumatic brain injury (TBI) patients. From January 2015 to June 2015, patients with isolated TBI admitted to the ICU were enrolled. Three advanced practice nurses administered the FOUR score, GCS, Acute Physiology and Chronic Health Evaluation II (APACHE II), and Therapeutic Intervention Scoring System (TISS) concurrently from ICU admission. The endpoint of observation was mortality when the patients left the ICU. Data are presented as frequency with percentages, mean with standard deviation, or median with interquartile range. The area under the receiver operating characteristic curve was used to compare the predictive power of the four tools. In addition, the difference between survival and death was estimated using the Wilcoxon rank sum test. Of the 55 TBI patients, males (72.73%) outnumbered females, the mean age was 63.1 ± 17.9, and 19 of 55 observations (35%) had a maximum FOUR score of 16. The overall mortality rate was 14.6%. The area under the receiver operating characteristic curve was 74.47% for the FOUR score, 74.73% for the GCS, 81.78% for the APACHE II, and 53.32% for the TISS. The FOUR score has predictive power for mortality similar to that of the GCS and APACHE II. Each of the parameters (E, M, B, and R) of the FOUR score showed a significant difference between the mortality and survival groups, while the verbal and eye-opening components of the GCS did not. Having predictive power for mortality similar to the GCS and APACHE II, the FOUR score can be used as an alternative for the prediction of early mortality in TBI patients in the ICU.
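For readers unfamiliar with the comparison metric, the snippet below shows how an AUC-based comparison of scoring tools is typically computed with scikit-learn; the cohort is a synthetic stand-in (invented score-outcome relationships), not the study's patients.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)

# Synthetic stand-in cohort: 55 patients, 8 ICU deaths (~14.5% mortality).
died = np.zeros(55, dtype=bool)
died[rng.choice(55, size=8, replace=False)] = True

# Invented scores loosely tied to outcome: FOUR drops and APACHE II rises
# with death, mirroring each scale's direction.
four = np.clip(rng.normal(13, 3, 55) - 4 * died, 0, 16)
apache = np.clip(rng.normal(15, 6, 55) + 8 * died, 0, 71)

# roc_auc_score expects higher values to mean higher predicted risk,
# so the FOUR score is negated before computing its AUC.
for name, risk in [("FOUR", -four), ("APACHE II", apache)]:
    print(f"{name}: AUC = {roc_auc_score(died, risk):.3f}")
```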
Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikkel, D. J.; McCabe, J.
This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems and the industrial partner, the National Center for Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for the metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench-hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.
Jiang, Ai-Gui; Chen, Hong-Lin; Lu, Hui-Yu
2015-03-01
Previous studies have shown that the Glasgow prognostic score (GPS) and the prognostic index (PI) are powerful prognostic tools for patients with advanced non-small cell lung cancer (NSCLC). The aim of this study was to compare the prognostic value of GPS and PI. We enrolled consecutive patients with advanced NSCLC in this prospective cohort. GPS and PI were calculated before the onset of chemotherapy. The prognosis outcomes included 1-, 3-, and 5-year progression-free survival and overall survival (OS). The performance of the two scores in predicting prognosis was analyzed with regard to discrimination and calibration. 138 patients were included in the study. The area under the receiver operating characteristic curve for GPS predicting 1-year DFS was 0.62 (95% confidence interval (CI) 0.56-0.68, P < 0.05), and the area under the curve for PI predicting 1-year DFS was 0.57 (95% CI 0.52-0.63). DeLong's test showed that GPS was more accurate than PI in predicting 1-year DFS (P < 0.05). Similar results for discriminatory power were found for predicting 3-year DFS, 1-year OS, and 3-year OS. The predicted 1-year DFS by GPS 0, GPS 1, and GPS 2 was 62.5, 42.1, and 23.1%, respectively, while the actual 1-year DFS by GPS 0, GPS 1, and GPS 2 was 61.1, 43.8, and 27.2%, respectively. Calibration by the Hosmer-Lemeshow statistic showed a good fit of the predicted 1-year DFS to the actual 1-year DFS by GPS (χ² = 4.326, P = 0.462), while no fit was found between the predicted and actual 1-year DFS by PI (χ² = 15.234, P = 0.091). Similar results for calibration power were found for predicting 3-year DFS, 5-year DFS, 1-year OS, 3-year OS, and 5-year OS by GPS and PI. GPS is more accurate than PI in predicting prognosis for patients with advanced NSCLC. GPS can be used as a useful and simple tool for predicting prognosis in patients with NSCLC. However, GPS should only be used for preliminary assessment because of its low predictive accuracy.
NASA Airline Operations Research Center
NASA Technical Reports Server (NTRS)
Mogford, Richard H.
2016-01-01
This is a PowerPoint presentation on NASA airline operations center (AOC) research. It includes information on using IBM Watson in the AOC. It also reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. It should prove to be useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations with the same title.
Power fluctuation reduction methodology for the grid-connected renewable power systems
NASA Astrophysics Data System (ADS)
Aula, Fadhil T.; Lee, Samuel C.
2013-04-01
This paper presents a new methodology for eliminating the influence of the power fluctuations of renewable power systems. Renewable energy, which is considered an uncertain and uncontrollable resource, can only provide irregular electrical power to the power grid. This irregularity creates fluctuations in the power generated by renewable power systems. These fluctuations cause instability in the power system and influence the operation of conventional power plants. Overall, the power system is vulnerable to collapse if necessary actions are not taken to reduce the impact of these fluctuations. This methodology aims at reducing these fluctuations and making the generated power capable of covering the power consumption. This requires a prediction tool for estimating the generated power in advance, to provide the range and the time of occurrence of the fluctuations. Since most renewable energies are weather based, a weather forecast technique is used for predicting the generated power. The reduction of the fluctuations also requires stabilizing facilities to maintain the output power at a desired level. In this study, a wind farm and a photovoltaic array are used as the renewable power systems, and a pumped-storage plant and batteries as the stabilizing facilities, since they are best suited to compensating the fluctuations of these types of power suppliers. As an illustrative example, a model of wind and photovoltaic power systems with battery energy and pumped hydro storage facilities for power fluctuation reduction is included, and its power fluctuation reduction is verified through simulation.
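A minimal sketch of the smoothing idea follows (ours, with invented numbers; the paper's method additionally couples pumped hydro and forecast-based targets): a battery absorbs or supplies the difference between the raw renewable output and a smoothed target profile, within power and state-of-charge limits.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative fluctuating renewable output (MW) over one day at 10-min steps.
t = np.arange(144)
wind_mw = 20 + 8 * np.sin(2 * np.pi * t / 144) + rng.normal(0, 3, t.size)

# Target profile: a smoothed (forecast-like) version of the raw output.
kernel = np.ones(12) / 12.0
target_mw = np.convolve(wind_mw, kernel, mode="same")

# Battery dispatch: absorb/supply the deviation, within power and SoC limits.
cap_mwh, p_max_mw, dt_h = 10.0, 5.0, 10 / 60
soc = cap_mwh / 2
delivered = np.empty_like(wind_mw)
for i, (gen, tgt) in enumerate(zip(wind_mw, target_mw)):
    p = np.clip(tgt - gen, -p_max_mw, p_max_mw)          # + discharge, - charge
    p = np.clip(p, -(cap_mwh - soc) / dt_h, soc / dt_h)  # respect state of charge
    soc -= p * dt_h
    delivered[i] = gen + p

print(f"std raw: {wind_mw.std():.2f} MW, std smoothed: {delivered.std():.2f} MW")
```

The reduction in standard deviation of the delivered power is one simple measure of how much fluctuation the stabilizing facility removes.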
NASA Technical Reports Server (NTRS)
Kontos, Karen B.; Kraft, Robert E.; Gliebe, Philip R.
1996-01-01
The Aircraft Noise Prediction Program (ANOPP) is an industry-wide tool used to predict turbofan engine flyover noise in system noise optimization studies. Its goal is to provide the best currently available methods for source noise prediction. As part of a program to improve the Heidmann fan noise model, models for fan inlet and fan exhaust noise suppression estimation have been developed that are based on simple engine and acoustic geometry inputs. The models can be used to predict sound power level suppression and sound pressure level suppression at a position specified relative to the engine inlet.
Sulovari, Arvis; Li, Dawei
2014-07-19
Genome-wide association studies (GWAS) have successfully identified genes associated with complex human diseases. Although much of the heritability remains unexplained, combining single nucleotide polymorphism (SNP) genotypes from multiple studies for meta-analysis will increase the statistical power to identify new disease-associated variants. Meta-analysis requires the same allele definition (nomenclature) and genome build among individual studies. Similarly, imputation, commonly used prior to meta-analysis, requires the same consistency. However, the genotypes from various GWAS are generated using different genotyping platforms, arrays or SNP-calling approaches, resulting in the use of different genome builds and allele definitions. Incorrect assumptions of identical allele definitions among combined GWAS lead to a large portion of discarded genotypes or incorrect association findings. There is no published tool that predicts and converts among all major allele definitions. In this study, we have developed a tool, GACT, which stands for Genome build and Allele definition Conversion Tool, that predicts and inter-converts between any of the common SNP allele definitions and between the major genome builds. In addition, we assessed several factors that may affect imputation quality, and our results indicated that inclusion of singletons in the reference had detrimental effects while ambiguous SNPs had no measurable effect. Unexpectedly, exclusion of genotypes with missing rate > 0.001 (40% of study SNPs) showed no significant decrease in imputation quality (it was even significantly higher when compared to imputation with singletons in the reference), especially for rare SNPs. GACT is a new, powerful, and user-friendly tool with both command-line and interactive online versions that can accurately predict and convert between any of the common allele definitions and between genome builds for genome-wide meta-analysis and imputation of genotypes from SNP-arrays or deep-sequencing, particularly for data from the dbGaP and other public databases. http://www.uvm.edu/genomics/software/gact.
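To make the allele-definition problem concrete, the sketch below (illustrative only; this is not GACT code and the function names are ours) shows the core strand-flip conversion and why A/T and C/G SNPs are strand-ambiguous.

```python
# Minimal sketch of converting SNP alleles between strand definitions,
# in the spirit of (but not reproducing) GACT.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def flip_strand(a1: str, a2: str) -> tuple[str, str]:
    """Convert an allele pair to the opposite-strand definition."""
    return COMPLEMENT[a1], COMPLEMENT[a2]

def is_ambiguous(a1: str, a2: str) -> bool:
    """A/T and C/G SNPs look identical on both strands, so strand cannot
    be inferred from the alleles alone."""
    return {a1, a2} in ({"A", "T"}, {"C", "G"})

def harmonize(study: tuple[str, str], reference: tuple[str, str]):
    """Match a study SNP to the reference allele definition, flipping
    strand if needed; return None when it cannot be reconciled."""
    if set(study) == set(reference):
        return study
    flipped = flip_strand(*study)
    if set(flipped) == set(reference):
        return flipped
    return None

print(harmonize(("A", "G"), ("T", "C")))   # ('T', 'C') after strand flip
print(is_ambiguous("A", "T"))              # True: strand-ambiguous SNP
```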
Parameter Estimation for a Pulsating Turbulent Buoyant Jet Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Christopher, Jason; Wimer, Nicholas; Lapointe, Caelan; Hayden, Torrey; Grooms, Ian; Rieker, Greg; Hamlington, Peter
2017-11-01
Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other "truth" data to be used for the prediction of unknown parameters, such as flow properties and boundary conditions, in numerical simulations of real-world engineering systems. Here we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a direct numerical simulation (DNS) with known boundary conditions and problem parameters, while the ABC procedure utilizes lower fidelity large eddy simulations. Using spatially-sparse statistics from the 2D buoyant jet DNS, we show that the ABC method provides accurate predictions of true jet inflow parameters. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for predicting flow information, such as boundary conditions, that can be difficult to determine experimentally.
NASA Astrophysics Data System (ADS)
Afan, Haitham Abdulmohsin; El-shafie, Ahmed; Mohtar, Wan Hanna Melini Wan; Yaseen, Zaher Mundher
2016-10-01
An accurate model for sediment prediction is a priority for all hydrological researchers. Many conventional methods have shown an inability to achieve an accurate prediction of suspended sediment. These methods are unable to capture the behaviour of sediment transport in rivers due to the complexity, noise, non-stationarity, and dynamism of the sediment pattern. In the past two decades, Artificial Intelligence (AI) and computational approaches have become a remarkable tool for developing accurate models. These approaches are considered a powerful tool for solving any non-linear model, as they can deal easily with large amounts of data and sophisticated models. This paper is a review of all AI approaches that have been applied in sediment modelling. The current research focuses on the development of AI applications in sediment transport. In addition, the review identifies major challenges and opportunities for prospective research. Throughout the literature, complementary models have proved superior to classical modelling.
Aggarwal, Gautam; Worthey, E A; McDonagh, Paul D; Myler, Peter J
2003-06-07
Seattle Biomedical Research Institute (SBRI), as part of the Leishmania Genome Network (LGN), is sequencing chromosomes of the trypanosomatid protozoan species Leishmania major. At SBRI, chromosomal sequence is annotated using a combination of trained and untrained non-consensus gene-prediction algorithms with ARTEMIS, an annotation platform with rich and user-friendly interfaces. Here we describe a methodology used to import results from three different protein-coding gene-prediction algorithms (GLIMMER, TESTCODE and GENESCAN) into the ARTEMIS sequence viewer and annotation tool. Comparison of these methods, along with the CODONUSAGE algorithm built into ARTEMIS, shows the importance of combining methods to more accurately annotate the L. major genomic sequence. An improved and powerful tool for gene prediction has been developed by importing data from widely-used algorithms into an existing annotation platform. This approach is especially fruitful in the Leishmania genome project, where there is a large proportion of novel genes requiring manual annotation.
A cross-validation package driving Netica with python
Fienen, Michael N.; Plant, Nathaniel G.
2014-01-01
Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid the overfitting that results from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and read, rebuild, and learn BNs from data. Insights gained from cross-validation, and its implications for prediction versus description, are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than allowed by supporting data, and that overfitting incurs computational costs as well as causing a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
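The cross-validation logic that CVNetica automates for BNs is generic; the sketch below (using a decision tree as a stand-in model, since Netica itself is proprietary, and with synthetic data) shows how a widening gap between calibration and validation skill exposes overfitting as model complexity grows.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (300, 3))
y = X[:, 0] - 2 * X[:, 1] ** 2 + rng.normal(0, 0.1, 300)

# Compare calibration vs. validation skill as model complexity grows; a gap
# that widens with complexity is the overfitting signature CVNetica detects
# (it varies level of discretization; tree depth is the analogue here).
for depth in (2, 5, 12):
    cal, val = [], []
    for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        model = DecisionTreeRegressor(max_depth=depth).fit(X[train], y[train])
        cal.append(r2_score(y[train], model.predict(X[train])))
        val.append(r2_score(y[test], model.predict(X[test])))
    print(f"depth {depth:2d}: calibration R2 {np.mean(cal):.2f}, "
          f"validation R2 {np.mean(val):.2f}")
```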
2015-01-01
Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7–30.6), 0.6 (0.3–0.9), and 4.7 (2.1–7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches. PMID:26504832
Mansfield, Theodore J; MacDonald Gibson, Jacqueline
2015-01-01
Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7-30.6), 0.6 (0.3-0.9), and 4.7 (2.1-7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches.
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.; ...
2016-10-01
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite moderated gas cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that consistency in the prediction of power densities as well as uranium and plutonium isotopics was shared among the methods within the CRPE tool that predicted the critical position consistently well. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
Convexity of Energy-Like Functions: Theoretical Results and Applications to Power System Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dvijotham, Krishnamurthy; Low, Steven; Chertkov, Michael
2015-01-12
Power systems are undergoing unprecedented transformations with increased adoption of renewables and distributed generation, as well as the adoption of demand response programs. All of these changes, while making the grid more responsive and potentially more efficient, pose significant challenges for power systems operators. Conventional operational paradigms are no longer sufficient as the power system may no longer have big dispatchable generators with sufficient positive and negative reserves. This increases the need for tools and algorithms that can efficiently predict safe regions of operation of the power system. In this paper, we study energy functions as a tool to design algorithms for various operational problems in power systems. These have a long history in power systems and have been primarily applied to transient stability problems. In this paper, we take a new look at power systems, focusing on an aspect that has previously received little attention: Convexity. We characterize the domain of voltage magnitudes and phases within which the energy function is convex in these variables. We show that this corresponds naturally with standard operational constraints imposed in power systems. We show that power flow equations can be solved using this approach, as long as the solution lies within the convexity domain. We outline various desirable properties of solutions in the convexity domain and present simple numerical illustrations supporting our results.
NASA Astrophysics Data System (ADS)
Boemer, Dominik; Ponthot, Jean-Philippe
2017-01-01
Discrete element method simulations of a 1:5-scale laboratory ball mill are presented in this paper to study the influence of the contact parameters on the charge motion and the power draw. The position density limit is introduced as an efficient mathematical tool to describe and to compare the macroscopic charge motion in different scenarios, i.a. with different values of the contact parameters. While the charge motion and the power draw are relatively insensitive to the stiffness and the damping coefficient of the linear spring-slider-damper contact law, the coefficient of friction has a strong influence since it controls the sliding propensity of the charge. Based on the experimental calibration and validation by charge motion photographs and power draw measurements, the descriptive and predictive capabilities of the position density limit and the discrete element method are demonstrated, i.e. the real position of the charge is precisely delimited by the respective position density limit and the power draw can be predicted with an accuracy of about 5 %.
Methods to Measure, Predict and Relate Friction, Wear and Fuel Economy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gravante, Steve; Fenske, George; Demas, Nicholas
High-fidelity measurements of the coefficient of friction and the parasitic friction power of the power cylinder components have been made for the Isuzu 5.2L 4H on-highway engine. In particular, measurements of the asperity friction coefficient were made with test coupons using Argonne National Laboratory's (ANL) reciprocating test rig for the ring-on-liner and skirt-on-liner component pairs. These measurements correlated well with independent measurements made by Electro-Mechanical Associates (EMA). In addition, surface roughness measurements of the Isuzu components were made using a white light interferometer (WLI). The asperity friction and surface characterization are key inputs to advanced CAE simulation tools such as RINGPAK and PISDYN, which are used to predict the friction power and wear rates of power cylinder components. Finally, motored friction tests were successfully performed to quantify the friction mean effective pressure (FMEP) of the power cylinder components for various oils (high-viscosity 15W40, low-viscosity 5W20 with friction modifier (FM), and a specially blended oil consisting of PAO/ZDDP/MoDTC) at 25, 50, and 110°C.
NIR monitoring of in-service wood structures
Michela Zanetti; Timothy G. Rials; Douglas Rammer
2005-01-01
Near infrared spectroscopy (NIRS) was used to study a set of Southern Yellow Pine boards exposed to natural weathering for different periods of exposure time. This non-destructive spectroscopic technique is a very powerful tool to predict the weathering of wood when used in combination with multivariate analysis (Principal Component Analysis, PCA, and Projection to...
Mining LMS Data to Develop an "Early Warning System" for Educators: A Proof of Concept
ERIC Educational Resources Information Center
Macfadyen, Leah P.; Dawson, Shane
2010-01-01
Earlier studies have suggested that higher education institutions could harness the predictive power of Learning Management System (LMS) data to develop reporting tools that identify at-risk students and allow for more timely pedagogical interventions. This paper confirms and extends this proposition by providing data from an international…
Ideas for a Teaching Sequence for the Concept of Energy
ERIC Educational Resources Information Center
Duit, Reinders; Neumann, Knut
2014-01-01
The energy concept is one of the most important ideas for students to understand. Looking at phenomena through the lens of energy provides powerful tools to model, analyse and predict phenomena in the scientific disciplines. The cross-disciplinary nature of the energy concept enables students to look at phenomena from different angles, helping…
The application of nirvana to silvicultural studies
Chi-Leung So; Thomas Elder; Leslie Groom; John S. Kush; Jennifer Myszewski; Todd Shupe
2006-01-01
Previous results from this laboratory have shown that near infrared (NIR) spectroscopy, coupled with multivariate analysis, can be a powerful tool for the prediction of wood quality. While wood quality measurements are of utility, their determination can be both time and labor intensive, thus limiting their use where large sample sizes are concerned. This paper will...
High-fidelity modeling and impact footprint prediction for vehicle breakup analysis
NASA Astrophysics Data System (ADS)
Ling, Lisa
For decades, vehicle breakup analysis had been performed for space missions that used nuclear heater or power units in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative to assess possible environmental impacts, obtain launch approval, and for launch contingency planning. In order to accurately perform a vehicle breakup analysis, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify the additional dynamics modeling and capabilities for the analysis tool with the objectives to (1) have the capability to predict impact point and footprint, (2) increase the fidelity in the prediction of vehicle breakup, and (3) reduce the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degrees-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of impact footprint. The functions to increase the fidelity in the prediction of vehicle breakup included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrary-shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.
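As a flavor of the trajectory-propagation capability described above, the sketch below (a heavily simplified planar point-mass model with an exponential atmosphere and an assumed constant ballistic coefficient; all values are illustrative, and this is not the dissertation's code) integrates a reentry path to ground impact to obtain a downrange impact point.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Planar point-mass reentry: state = [x, h, vx, vh].
# Exponential atmosphere and a constant ballistic coefficient are assumed.
RHO0, H_SCALE, G = 1.225, 7200.0, 9.81   # sea-level density, scale height, gravity
BETA = 350.0                             # ballistic coefficient m/(Cd*A), kg/m^2

def dynamics(t, s):
    x, h, vx, vh = s
    v = np.hypot(vx, vh)
    rho = RHO0 * np.exp(-h / H_SCALE)
    a_per_v = 0.5 * rho * v / BETA       # drag deceleration per unit velocity
    return [vx, vh, -a_per_v * vx, -a_per_v * vh - G]

ground = lambda t, s: s[1]               # event: altitude crosses zero
ground.terminal = True

# Entry at 80 km altitude, 3 km/s horizontal and 0.5 km/s downward velocity.
sol = solve_ivp(dynamics, (0, 2000), [0.0, 80e3, 3000.0, -500.0],
                events=ground, max_step=1.0)
print(f"impact downrange: {sol.y[0, -1] / 1e3:.1f} km after {sol.t[-1]:.0f} s")
```

A full 3-DOF tool adds the third spatial dimension, local winds, and altitude-dependent aerodynamic coefficients, and sweeps entry conditions to build an impact footprint rather than a single point.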
Postma, E
2006-03-01
The ability to predict individual breeding values in natural populations with known pedigrees has provided a powerful tool to separate phenotypic values into their genetic and environmental components in a nonexperimental setting. This has allowed sophisticated analyses of selection, as well as powerful tests of evolutionary change and differentiation. To date, there has, however, been no evaluation of the reliability or potential limitations of the approach. In this article, I address these gaps. In particular, I emphasize the differences between true and predicted breeding values (PBVs), which as yet have largely been ignored. These differences do, however, have important implications for the interpretation of, firstly, the relationship between PBVs and fitness, and secondly, patterns in PBVs over time. I subsequently present guidelines I believe to be essential in the formulation of the questions addressed in studies using PBVs, and I discuss possibilities for future research.
Spanagel, Rainer
2017-01-01
In recent years, animal models in psychiatric research have been criticized for their limited translational value to the clinical situation. Failures in clinical trials have thus often been attributed to the lack of predictive power of preclinical animal models. Here, I argue that animal models of voluntary drug intake—under nonoperant and operant conditions—and addiction models based on the Diagnostic and Statistical Manual of Mental Disorders are crucial and informative tools for the identification of pathological mechanisms, target identification, and drug development. These models provide excellent face validity, and it is assumed that the neurochemical and neuroanatomical substrates involved in drug-intake behavior are similar in laboratory rodents and humans. Consequently, animal models of drug consumption and addiction provide predictive validity. This predictive power is best illustrated in alcohol research, in which three approved medications—acamprosate, naltrexone, and nalmefene—were developed by means of animal models and then successfully translated into the clinical situation. PMID:29302222
NASA Astrophysics Data System (ADS)
Kariniotakis, G.; Anemos Team
2003-04-01
Objectives: Accurate forecasting of wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that assembles research organizations and end-users with extensive experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics, or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits from the use of satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable on a large scale: at the single wind farm, regional, or national level, and for both interconnected and island systems. A major milestone is the online operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will support increased wind integration on two levels: at the operational level, through better management of wind farms, and by contributing to an increase in the installed capacity of wind farms. This is because accurate prediction of the resource reduces the risk for wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.
Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri
2014-01-01
Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the main stream of clinical oncology. © 2014 Wiley Periodicals, Inc.
Using Reanalysis Data for the Prediction of Seasonal Wind Turbine Power Losses Due to Icing
NASA Astrophysics Data System (ADS)
Burtch, D.; Mullendore, G. L.; Delene, D. J.; Storm, B.
2013-12-01
The Northern Plains region of the United States is home to a significant amount of potential wind energy. However, in winter months capturing this potential power is severely impacted by meteorological conditions, in the form of icing. The expected loss in power production due to icing is a valuable parameter that can be used in wind turbine operations, in the determination of wind turbine site locations, and in the long-term energy estimates used for financing purposes. Currently, losses due to icing must be estimated when developing predictions for turbine feasibility and financing studies, while icing maps, a tool commonly used in Europe, are lacking in the United States. This study uses the Modern-Era Retrospective Analysis for Research and Applications (MERRA) dataset in conjunction with turbine production data to investigate various methods of predicting seasonal losses (October-March) due to icing at two wind turbine sites located 121 km apart in North Dakota. The prediction of icing losses is based on temperature and relative humidity thresholds and is accomplished using three methods. For each of the three methods, the required atmospheric variables are determined in one of two ways: using industry-specific software to correlate anemometer data in conjunction with the MERRA dataset, or using only the MERRA dataset for all variables. For each season, a percentage of the total expected generated power lost due to icing is determined and compared to observed losses from the production data. An optimization is performed in order to determine the relative humidity threshold that minimizes the difference between the predicted and observed values. Eight seasons of data are used to determine an optimal relative humidity threshold, and a further three seasons of data are used to test this threshold. Preliminary results have shown that the optimized relative humidity threshold for the northern turbine is higher than that for the southern turbine for all methods. For the three test seasons, the optimized thresholds tend to under-predict the icing losses. However, the threshold determined using boundary layer similarity theory predicts the power losses due to icing more closely than the other methods. For the northern turbine, the average predicted power loss over the three seasons is 4.65% while the observed power loss is 6.22% (average difference of 1.57%). For the southern turbine, the average predicted power loss and observed power loss over the same time period are 4.43% and 6.16%, respectively (average difference of 1.73%). The three-year average, however, does not clearly capture the variability that exists season-to-season. On examination of each of the test seasons individually, the optimized relative humidity threshold methodology performs better than the fixed power loss estimates commonly used in the wind energy industry.
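The threshold-and-optimize procedure can be sketched compactly (our illustration with synthetic hourly data; the temperature rule, RH sweep range, and observed loss figure are illustrative stand-ins for the MERRA and production inputs): flag icing hours where temperature is below freezing and relative humidity exceeds a threshold, compute the fraction of expected energy falling in those hours, and sweep the threshold to match an observed seasonal loss.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic hourly winter data standing in for MERRA temperature (C), relative
# humidity (%), and expected turbine power (kW); values are illustrative only.
n = 24 * 182                                      # one October-March season
temp_c = rng.normal(-4, 8, n)
rh_pct = np.clip(rng.normal(78, 12, n), 0, 100)
expected_kw = np.clip(rng.weibull(2.0, n) * 600, 0, 1500)

def predicted_icing_loss_pct(rh_threshold: float) -> float:
    """Percent of expected seasonal energy lost to icing under the
    temperature / relative-humidity threshold rule."""
    icing = (temp_c < 0.0) & (rh_pct > rh_threshold)
    return 100.0 * expected_kw[icing].sum() / expected_kw.sum()

# Optimize the RH threshold against an observed seasonal loss (here 6.2%).
observed_loss_pct = 6.2
thresholds = np.arange(80.0, 100.0, 0.5)
errors = [abs(predicted_icing_loss_pct(th) - observed_loss_pct) for th in thresholds]
best = thresholds[int(np.argmin(errors))]
print(f"optimized RH threshold: {best:.1f}% "
      f"(predicted loss {predicted_icing_loss_pct(best):.2f}%)")
```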
A Network Selection Algorithm Considering Power Consumption in Hybrid Wireless Networks
NASA Astrophysics Data System (ADS)
Joe, Inwhee; Kim, Won-Tae; Hong, Seokjoon
In this paper, we propose a novel network selection algorithm considering power consumption in hybrid wireless networks for vertical handover. CDMA, WiBro and WLAN networks are the candidate networks for this selection algorithm, which is composed of a power consumption prediction algorithm and a final network selection algorithm. The power consumption prediction algorithm estimates the expected lifetime of the mobile station based on the current battery level, traffic class and power consumption of each network interface card of the mobile station. If the expected lifetime of the mobile station in a certain network is not long enough compared to the handover delay, that network is removed from the candidate list, thereby preventing unnecessary handovers in the preprocessing procedure. The final network selection algorithm consists of AHP (Analytic Hierarchy Process) and GRA (Grey Relational Analysis). The global factors of the network selection structure are QoS, cost and lifetime. If the user preference is lifetime, our selection algorithm selects the network that offers the longest service duration owing to low power consumption. We also conduct simulations using the OPNET simulation tool; the results show that the proposed algorithm provides longer lifetime in the hybrid wireless network environment.
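The two stages of the final selection step, AHP weighting over the global factors followed by GRA ranking of the candidates, can be sketched compactly in Python. All matrices and scores below are placeholders, not values from the paper:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix
    (principal eigenvector, normalized to sum to 1)."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

def gra_grades(options, weights, benefit, rho=0.5):
    """Grey relational grades for candidate networks.
    `options`: rows = networks, cols = criteria (e.g. QoS, cost, lifetime).
    `benefit`: True where larger is better (QoS, lifetime), False for cost."""
    x = np.asarray(options, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)
    norm = np.where(benefit, (x - lo) / (hi - lo), (hi - x) / (hi - lo))
    delta = np.abs(1.0 - norm)                      # distance to the ideal series
    xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return xi @ weights                             # higher grade = better network

# Example with placeholder numbers: three candidates scored on
# (QoS, cost, lifetime), with a comparison matrix favouring lifetime.
w = ahp_weights([[1, 3, 1/5], [1/3, 1, 1/7], [5, 7, 1]])
grades = gra_grades([[0.8, 30, 5.0], [0.9, 50, 2.5], [0.6, 10, 8.0]],
                    w, benefit=np.array([True, False, True]))
print("selected network:", int(np.argmax(grades)))
```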
Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation
Biggs, Matthew B.; Papin, Jason A.
2013-01-01
Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool. PMID:24147108
Attention in the predictive mind.
Ransom, Madeleine; Fazelpour, Sina; Mole, Christopher
2017-01-01
It has recently become popular to suggest that cognition can be explained as a process of Bayesian prediction error minimization. Some advocates of this view propose that attention should be understood as the optimization of expected precisions in the prediction-error signal (Clark, 2013, 2016; Feldman & Friston, 2010; Hohwy, 2012, 2013). This proposal successfully accounts for several attention-related phenomena. We claim that it cannot account for all of them, since there are certain forms of voluntary attention that it cannot accommodate. We therefore suggest that, although the theory of Bayesian prediction error minimization introduces some powerful tools for the explanation of mental phenomena, its advocates have been wrong to claim that Bayesian prediction error minimization is 'all the brain ever does'. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Eck, M.; Mukunda, M.
1988-01-01
Given here are predictions of fragment velocities and azimuths resulting from a Space Transportation System Solid Rocket Motor range destruct or a random failure occurring at any time during the 120 seconds of Solid Rocket Motor burn. Results obtained using the analytical methods described showed good agreement between predictions and observations for two specific events. It was shown that these methods have good potential for use in predicting the fragmentation process of a number of generically similar casing systems. It was concluded that coupled Eulerian-Lagrangian calculational methods of the type described here provide a powerful tool for predicting Solid Rocket Motor response.
Using geostatistics to evaluate cleanup goals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcon, M.F.; Hopkins, L.P.
1995-12-01
Geostatistical analysis is a powerful predictive tool typically used to define spatial variability in environmental data. The information from a geostatistical analysis using kriging, a geostatistical tool, can be taken a step further to optimize sampling location and frequency and to help quantify sampling uncertainty in both the remedial investigation and the remedial design at a hazardous waste site. Geostatistics were used to quantify sampling uncertainty in attainment of a risk-based cleanup goal and to determine the optimal sampling frequency necessary to delineate the horizontal extent of impacted soils at a Gulf Coast waste site.
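Kriging itself reduces to solving a small linear system built from a fitted semivariogram. A minimal ordinary-kriging sketch in Python; the exponential variogram model and its parameters are assumptions that would normally be fitted to the site's empirical variogram:

```python
import numpy as np

def variogram(h, sill=1.0, rng=50.0, nugget=0.0):
    """Exponential semivariogram model (illustrative parameters; in practice
    these are fitted to the empirical variogram of the site data)."""
    return nugget + sill * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_krige(xy, z, target):
    """Ordinary kriging estimate and variance at one target location.
    `xy`: (n, 2) sample coordinates; `z`: (n,) measured values."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)          # semivariances between samples
    A[-1, -1] = 0.0                   # Lagrange multiplier row/column
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - target, axis=1))
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[-1]
    return w @ z, w @ b[:n] + mu      # estimate, kriging variance

# est, var = ordinary_krige(sample_xy, concentrations, np.array([10.0, 20.0]))
```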
Brabrand, Mikkel; Henriksen, Daniel Pilsgaard
2018-06-01
The CURB-65 score is widely implemented as a prediction tool for identifying patients with community-acquired pneumonia (CAP) at increased risk of 30-day mortality. However, since most components of CURB-65 are also used in general prediction tools, it is likely that other prediction tools, e.g. the British National Early Warning Score (NEWS), could be as good as CURB-65 at predicting the fate of CAP patients. To determine whether NEWS is better than CURB-65 at predicting 30-day mortality of CAP patients. This was a single-centre, 6-month observational study using patients' vital signs and demographic information registered upon admission, survival status extracted from the Danish Civil Registration System after discharge, and blood test results extracted from a local database. The study was conducted in the medical admission unit (MAU) at the Hospital of South West Jutland, a regional teaching hospital in Denmark. The participants consisted of 570 CAP patients, 291 female and 279 male, median age 74 (20-102) years. The CURB-65 score had a discriminatory power of 0.728 (0.667-0.789) and NEWS 0.710 (0.645-0.775), both with good calibration and no statistically significant difference between them. CURB-65 was not demonstrated to be statistically significantly better than NEWS at identifying CAP patients at risk of 30-day mortality.
A Noise and Emissions Assessment of the N3-X Transport
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.; Haller, William J.
2014-01-01
Analytical predictions of certification noise and exhaust emissions for NASA's N3-X - a notional, hybrid wingbody airplane - are presented in this paper. The N3-X is a 300-passenger concept transport propelled by an array of fans distributed spanwise near the trailing edge of the wingbody. These fans are driven by electric motors that draw power from twin generators, each driven by a turboshaft engine. Turboelectric distributed hybrid propulsion has the potential to dramatically increase the propulsive efficiency of aircraft. The noise and exhaust emission estimates presented here are generated using NASA's conceptual design systems analysis tools with several key modifications to accommodate this unconventional architecture. These tools predict certification noise and the emissions of oxides of nitrogen by leveraging data generated from a recent analysis of the N3-X propulsion system.
Machine learning for the meta-analyses of microbial pathogens' volatile signatures.
Palma, Susana I C J; Traguedo, Ana P; Porteira, Ana R; Frias, Maria J; Gamboa, Hugo; Roque, Ana C A
2018-02-20
Non-invasive and fast diagnostic tools based on volatolomics hold great promise in the control of infectious diseases; however, tools to identify microbial volatile organic compounds (VOCs) that discriminate between human pathogens are still missing. Artificial intelligence is increasingly recognised as an essential tool in the health sciences. Machine learning algorithms based on support vector machines and feature selection tools were applied here to find sets of microbial VOCs with pathogen-discrimination power. Studies reporting VOCs emitted by human microbial pathogens published between 1977 and 2016 were used as source data. A set of 18 VOCs is sufficient to predict the identity of 11 microbial pathogens with high accuracy (77%) and precision (62-100%). There is one set of VOCs associated with each of the 11 pathogens which can predict the presence of that pathogen in a sample with high accuracy and precision (86-90%). The implemented pathogen classification methodology supports future database updates to include new pathogen-VOC data, which will enrich the classifiers. The sets of VOCs identified have the potential to improve the selectivity of non-invasive infection diagnostics using artificial olfaction devices.
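A sketch of the modelling pattern described, a support vector machine coupled with a feature-selection step that reduces the VOC panel to a small discriminating set, using scikit-learn; the pipeline details are assumptions rather than the authors' published code:

```python
from sklearn.feature_selection import RFE
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def voc_classifier(n_vocs=18):
    """Linear-SVM pipeline with recursive feature elimination down to a small
    discriminating VOC panel (an illustrative setup, not the authors' exact one)."""
    return Pipeline([
        ("scale", StandardScaler()),
        ("select", RFE(SVC(kernel="linear"), n_features_to_select=n_vocs)),
        ("clf", SVC(kernel="linear")),
    ])

# Usage, given X (samples x VOC-abundance features) and y (pathogen labels):
# from sklearn.model_selection import cross_val_score
# print(cross_val_score(voc_classifier(), X, y, cv=5).mean())
```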
Avi Bar Massada; Alexandra D. Syphard; Susan I. Stewart; Volker C. Radeloff
2012-01-01
Wildfire ignition distribution models are powerful tools for predicting the probability of ignitions across broad areas, and identifying their drivers. Several approaches have been used for ignition-distribution modelling, yet the performance of different model types has not been compared. This is unfortunate, given that conceptually similar species-distribution models...
Liang Wei; Marshall John; Jianwei Zhang; Hang Zhou; Robert Powers
2014-01-01
Models can be powerful tools for estimating forest productivity and guiding forest management, but their credibility and complexity are often an issue for forest managers. We parameterized a process-based forest growth model, 3-PG (Physiological Principles Predicting Growth), to simulate growth of ponderosa pine (Pinus ponderosa) plantations in...
Delinquency Level Classification Via the HEW Community Program Youth Impact Scales.
ERIC Educational Resources Information Center
Truckenmiller, James L.
The former HEW National Strategy for Youth Development (NSYD) model was created as a community-based planning and procedural tool to promote youth development and prevent delinquency. To assess the predictive power of NSYD Impact Scales in classifying youths into low, medium, and high delinquency levels, male and female students aged 10-19 years…
Patterns of covariance between forest stand and canopy structure in the Pacific Northwest.
Michael A. Lefsky; Andrew T. Hudak; Warren B. Cohen; S.A. Acker
2005-01-01
In the past decade, LIDAR (light detection and ranging) has emerged as a powerful tool for remotely sensing forest canopy and stand structure, including the estimation of aboveground biomass and carbon storage. Numerous papers have documented the use of LIDAR measurements to predict important aspects of forest stand structure, including aboveground biomass. Other...
Hansen, Bjoern Oest; Meyer, Etienne H; Ferrari, Camilla; Vaid, Neha; Movahedi, Sara; Vandepoele, Klaas; Nikoloski, Zoran; Mutwil, Marek
2018-03-01
Recent advances in gene function prediction rely on ensemble approaches that integrate results from multiple inference methods to produce superior predictions. Yet, these developments remain largely unexplored in plants. We have explored and compared two methods to integrate 10 gene co-function networks for Arabidopsis thaliana and demonstrate how the integration of these networks produces more accurate gene function predictions for a larger fraction of genes with unknown function. These predictions were used to identify genes involved in mitochondrial complex I formation, and for five of them, we confirmed the predictions experimentally. The ensemble predictions are provided as a user-friendly online database, EnsembleNet. The methods presented here demonstrate that ensemble gene function prediction is a powerful method to boost prediction performance, whereas the EnsembleNet database provides a cutting-edge community tool to guide experimentalists. © 2017 The Authors. New Phytologist © 2017 New Phytologist Trust.
Damude, S; Wevers, K P; Murali, R; Kruijff, S; Hoekstra, H J; Bastiaannet, E
2017-09-01
Completion lymph node dissection (CLND) in sentinel node (SN)-positive melanoma patients is accompanied by morbidity, while about 80% of patients yield no additional metastases in non-sentinel nodes (NSNs). A prediction tool for NSN involvement could assist in patient selection for CLND. This study investigated which parameters predict NSN-positivity, and whether the biomarker S-100B improves the accuracy of a prediction model. Recorded clinicopathologic factors were tested for their association with NSN-positivity in 110 SN-positive patients who underwent CLND. A prediction model was developed with multivariable logistic regression, incorporating all predictive factors. Five models were compared for their predictive power by calculating the Area Under the Curve (AUC). A weighted risk score, the 'S-100B Non-Sentinel Node Risk Score' (SN-SNORS), was derived for the model with the highest AUC, and a nomogram was developed as a visual representation. NSN-positivity was present in 24 (21.8%) patients. Sex, ulceration, number of harvested SNs, number of positive SNs, and S-100B value were independently associated with NSN-positivity. The AUC for the model including all these factors was 0.78 (95%CI 0.69-0.88). SN-SNORS was the sum of scores for the five parameters. Scores of ≤9.5, 10-11.5, and ≥12 were associated with low (0%), intermediate (21.0%) and high (43.2%) risk of NSN involvement. A prediction tool based on five parameters, including the biomarker S-100B, showed accurate risk stratification for NSN involvement in SN-positive melanoma patients. If validated in future studies, this tool could help to identify patients with low risk of NSN involvement. Copyright © 2017 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
The prediction of en route noise levels for a DC-9 aircraft
NASA Technical Reports Server (NTRS)
Weir, Donald S.
1988-01-01
En route noise for advanced propfan powered aircraft has become an issue of concern for the Federal Aviation Administration. The NASA Aircraft Noise Prediction Program (ANOPP) is used to demonstrate the source noise and propagation effects for an aircraft in level flight up to 35,000 feet altitude. One-third octave band spectra of the source noise, atmospheric absorption loss, and received noise are presented. The predicted maximum A-weighted sound pressure level is compared to measured data from the Aeronautical Research Institute of Sweden. ANOPP is shown to be an effective tool in evaluating the en route noise characteristics of a DC-9 aircraft.
Use of Transition Modeling to Enable the Computation of Losses for Variable-Speed Power Turbine
NASA Technical Reports Server (NTRS)
Ameri, Ali A.
2012-01-01
To investigate the penalties associated with using a variable speed power turbine (VSPT) in a rotorcraft capable of vertical takeoff and landing, various analysis tools are required. Such analysis tools must be able to model the flow accurately within the operating envelope of the VSPT. For power turbines, this envelope is characterized by low Reynolds numbers and a wide range of incidence angles, both positive and negative, caused by the variation in shaft speed at relatively fixed corrected flows. The flow in the turbine passage is expected to be transitional and separated at high incidence. The turbulence model of Walters and Leylek was implemented in the NASA Glenn-HT code to enable a more accurate analysis of such flows. Two-dimensional heat transfer predictions of flat plate flow and two- and three-dimensional heat transfer predictions on a turbine blade were performed and are reported herein; heat transfer computations were performed because heat transfer is a good marker for transition. The final goal is to be able to compute the aerodynamic losses. Armed with the new transition model, total pressure losses for three-dimensional flow of an Energy Efficient Engine (E3) tip section cascade were computed for a range of incidence angles, in anticipation of the experimental data. The results obtained form a loss bucket for the chosen blade.
Analysis and Design of Rotors at Ultra-Low Reynolds Numbers
NASA Technical Reports Server (NTRS)
Kunz, Peter J.; Strawn, Roger C.
2003-01-01
Design tools have been developed for ultra-low Reynolds number rotors, combining enhanced actuator-ring / blade-element theory with airfoil section data based on two-dimensional Navier-Stokes calculations. This performance prediction method is coupled with an optimizer for both design and analysis applications. Performance predictions from these tools have been compared with three-dimensional Navier Stokes analyses and experimental data for a 2.5 cm diameter rotor with chord Reynolds numbers below 10,000. Comparisons among the analyses and experimental data show reasonable agreement both in the global thrust and power required, but the spanwise distributions of these quantities exhibit significant deviations. The study also reveals that three-dimensional and rotational effects significantly change local airfoil section performance. The magnitude of this issue, unique to this operating regime, may limit the applicability of blade-element type methods for detailed rotor design at ultra-low Reynolds numbers, but these methods are still useful for evaluating concept feasibility and rapidly generating initial designs for further analysis and optimization using more advanced tools.
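For reference, the blade-element/momentum approach mentioned above can be reduced to a few lines for hover, using the standard textbook inflow relation; all parameter values are illustrative, and, as the study notes, such two-dimensional section assumptions lose fidelity at ultra-low Reynolds numbers:

```python
import numpy as np

def bemt_hover(theta0, twist, sigma, a=5.7, cd0=0.02, n=200):
    """Classical blade-element/momentum estimate of hover thrust and power
    coefficients for a linearly twisted blade (no tip-loss factor; the
    lift-curve slope and drag coefficient are illustrative placeholders)."""
    r = np.linspace(0.05, 1.0, n)        # nondimensional radial stations
    dr = r[1] - r[0]
    theta = theta0 + twist * r           # local pitch angle (rad)
    # Combined blade-element/momentum induced inflow (standard textbook form):
    lam = (sigma * a / 16.0) * (np.sqrt(1.0 + 32.0 * theta * r / (sigma * a)) - 1.0)
    dct = 0.5 * sigma * a * (theta * r**2 - lam * r)   # thrust loading per span
    dcp = lam * dct + 0.5 * sigma * cd0 * r**3         # induced + profile power
    return dct.sum() * dr, dcp.sum() * dr

ct, cp = bemt_hover(theta0=np.radians(12), twist=np.radians(-8), sigma=0.1)
print(f"CT = {ct:.4f}, CP = {cp:.4f}")
```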
A new tool called DISSECT for analysing large genomic data sets using a Big Data approach
Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert
2015-01-01
Large-scale genetic and genomic data are increasingly available and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is new, freely available software able to exploit the distributed-memory parallel computational architectures of compute clusters to perform a wide range of genomic and epidemiologic analyses which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010
Molecular beacon sequence design algorithm.
Monroe, W Todd; Haselton, Frederick R
2003-01-01
A method employing Web-based tools is presented for designing optimally functioning molecular beacons. Molecular beacons, fluorogenic hybridization probes, are a powerful tool for the rapid and specific detection of a particular nucleic acid sequence. However, their synthesis costs can be considerable. Since a molecular beacon's performance depends on its sequence, it is imperative to rationally design an optimal sequence before synthesis. The algorithm presented here uses simple Microsoft Excel formulas and macros to rank candidate sequences. This analysis is carried out using mfold structural predictions along with other free Web-based tools. For smaller laboratories where molecular beacons are not the focus of research, the public domain algorithm described here may be usefully employed to aid in molecular beacon design.
Hériché, Jean-Karim; Lees, Jon G.; Morilla, Ian; Walter, Thomas; Petrova, Boryana; Roberti, M. Julia; Hossain, M. Julius; Adler, Priit; Fernández, José M.; Krallinger, Martin; Haering, Christian H.; Vilo, Jaak; Valencia, Alfonso; Ranea, Juan A.; Orengo, Christine; Ellenberg, Jan
2014-01-01
The advent of genome-wide RNA interference (RNAi)–based screens puts us in the position to identify genes for all functions human cells carry out. However, for many functions, assay complexity and cost make genome-scale knockdown experiments impossible. Methods to predict genes required for cell functions are therefore needed to focus RNAi screens from the whole genome on the most likely candidates. Although different bioinformatics tools for gene function prediction exist, they lack experimental validation and are therefore rarely used by experimentalists. To address this, we developed an effective computational gene selection strategy that represents public data about genes as graphs and then analyzes these graphs using kernels on graph nodes to predict functional relationships. To demonstrate its performance, we predicted human genes required for a poorly understood cellular function—mitotic chromosome condensation—and experimentally validated the top 100 candidates with a focused RNAi screen by automated microscopy. Quantitative analysis of the images demonstrated that the candidates were indeed strongly enriched in condensation genes, including the discovery of several new factors. By combining bioinformatics prediction with experimental validation, our study shows that kernels on graph nodes are powerful tools to integrate public biological data and predict genes involved in cellular functions of interest. PMID:24943848
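One widely used kernel on graph nodes is the diffusion kernel, K = exp(-βL) for graph Laplacian L. A minimal Python sketch of kernel-based candidate ranking, as a generic guilt-by-association scheme rather than the authors' exact pipeline:

```python
import numpy as np
from scipy.linalg import expm

def diffusion_kernel(adjacency, beta=1.0):
    """Diffusion kernel K = exp(-beta * L) on graph nodes, where L is the
    graph Laplacian; K[i, j] scores how strongly nodes i and j are related."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    return expm(-beta * L)

def rank_candidates(adjacency, seed_genes, beta=1.0):
    """Rank all nodes by summed kernel similarity to known positives
    (e.g. genes already implicated in the cellular function of interest)."""
    K = diffusion_kernel(adjacency, beta)
    scores = K[:, seed_genes].sum(axis=1)
    return np.argsort(scores)[::-1]      # most promising candidates first
```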
Protein Structure Prediction by Protein Threading
NASA Astrophysics Data System (ADS)
Xu, Ying; Liu, Zhijie; Cai, Liming; Xu, Dong
The seminal work of Bowie, Lüthy, and Eisenberg (Bowie et al., 1991) on "the inverse protein folding problem" laid the foundation of protein structure prediction by protein threading. By using simple measures for fitness of different amino acid types to local structural environments defined in terms of solvent accessibility and protein secondary structure, the authors derived a simple and yet profoundly novel approach to assessing if a protein sequence fits well with a given protein structural fold. Their follow-up work (Elofsson et al., 1996; Fischer and Eisenberg, 1996; Fischer et al., 1996a,b) and the work by Jones, Taylor, and Thornton (Jones et al., 1992) on protein fold recognition led to the development of a new brand of powerful tools for protein structure prediction, which we now term "protein threading." These computational tools have played a key role in extending the utility of all the experimentally solved structures by X-ray crystallography and nuclear magnetic resonance (NMR), providing structural models and functional predictions for many of the proteins encoded in the hundreds of genomes that have been sequenced up to now.
CaFE: a tool for binding affinity prediction using end-point free energy methods.
Liu, Hui; Hou, Tingjun
2016-07-15
Accurate prediction of binding free energy is of particular importance to computational biology and structure-based drug design. Among the methods for binding affinity prediction, end-point approaches such as MM/PBSA and LIE have been widely used because they achieve a good balance between prediction accuracy and computational cost. Here we present an easy-to-use pipeline tool named Calculation of Free Energy (CaFE) to conduct MM/PBSA and LIE calculations. Powered by the VMD and NAMD programs, CaFE is able to handle numerous static coordinate and molecular dynamics trajectory file formats generated by different molecular simulation packages and supports various force field parameters. CaFE source code and documentation are freely available under the GNU General Public License via GitHub at https://github.com/huiliucode/cafe_plugin. It is a VMD plugin written in Tcl and its usage is platform-independent. Contact: tingjunhou@zju.edu.cn. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
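For reference, the end-point idea behind MM/PBSA is the standard decomposition below (a textbook form, not notation specific to CaFE):

```latex
\Delta G_{\mathrm{bind}}
  = \langle G_{\mathrm{complex}} \rangle
  - \langle G_{\mathrm{receptor}} \rangle
  - \langle G_{\mathrm{ligand}} \rangle ,
\qquad
G = E_{\mathrm{MM}} + G_{\mathrm{PB}} + G_{\mathrm{SA}} - T S_{\mathrm{conf}}
```

where the angle brackets denote averages over molecular dynamics snapshots, E_MM is the gas-phase molecular mechanics energy, G_PB the polar solvation free energy from the Poisson-Boltzmann equation, G_SA the nonpolar term estimated from the solvent-accessible surface area, and the conformational entropy term is often omitted when only relative affinities are compared.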
NASA Astrophysics Data System (ADS)
Darmon, David
2018-03-01
In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
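A nonparametric stand-in for this procedure can be sketched in Python: build delay-coordinate vectors, form a k-nearest-neighbour kernel predictive density on held-out points, and choose the embedding dimension that minimizes the negative log-predictive likelihood. The estimator details (kNN mixture, fixed bandwidth) are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np
from scipy.stats import norm
from sklearn.neighbors import NearestNeighbors

def delay_embed(x, dim, tau=1):
    """Delay-coordinate vectors X_t and their one-step-ahead targets."""
    n = len(x) - (dim - 1) * tau - 1
    X = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    y = x[(dim - 1) * tau + 1 : (dim - 1) * tau + 1 + n]
    return X, y

def neg_log_pred_likelihood(x, dim, k=20, bw=0.1, train_frac=0.7):
    """Held-out negative log-predictive likelihood of a kNN kernel predictor;
    lower is better (an illustrative stand-in for the paper's estimator)."""
    X, y = delay_embed(np.asarray(x, dtype=float), dim)
    m = int(train_frac * len(y))
    nn = NearestNeighbors(n_neighbors=k).fit(X[:m])
    _, idx = nn.kneighbors(X[m:])
    # Predictive density at each held-out point: a kernel mixture over the
    # one-step futures of the k nearest training histories.
    dens = norm.pdf(y[m:, None], loc=y[idx], scale=bw).mean(axis=1)
    return -np.log(dens + 1e-300).mean()

# Select the predictively optimal embedding dimension for a series:
# best_dim = min(range(1, 11), key=lambda d: neg_log_pred_likelihood(series, d))
```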
Web tools for predictive toxicology model building.
Jeliazkova, Nina
2012-07-01
The development and use of web tools in chemistry has a history of more than 15 years. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditional for desktop applications. Web platforms integrate data storage, cheminformatics and data analysis tools, and the ease of use and collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of a GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable, model for information access. The expected future convergence between cheminformatics and bioinformatics databases poses new challenges for the management and analysis of large data sets. Web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, and collaborative sharing and secure access.
Lange, Rael T; Brickell, Tracey A; French, Louis M
2015-01-01
The purpose of this study was to examine the clinical utility of two validity scales designed for use with the Neurobehavioral Symptom Inventory (NSI) and the PTSD Checklist-Civilian Version (PCL-C); the Mild Brain Injury Atypical Symptoms Scale (mBIAS) and Validity-10 scale. Participants were 63 U.S. military service members (age: M = 31.9 years, SD = 12.5; 90.5% male) who sustained a mild traumatic brain injury (MTBI) and were prospectively enrolled from Walter Reed National Military Medical Center. Participants were divided into two groups based on the validity scales of the Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF): (a) symptom validity test (SVT)-Fail (n = 24) and (b) SVT-Pass (n = 39). Participants were evaluated on average 19.4 months postinjury (SD = 27.6). Participants in the SVT-Fail group had significantly higher scores (p < .05) on the mBIAS (d = 0.85), Validity-10 (d = 1.89), NSI (d = 2.23), and PCL-C (d = 2.47), and the vast majority of the MMPI-2-RF scales (d = 0.69 to d = 2.47). Sensitivity, specificity, and predictive power values were calculated across the range of mBIAS and Validity-10 scores to determine the optimal cutoff to detect symptom exaggeration. For the mBIAS, a cutoff score of ≥8 was considered optimal, which resulted in low sensitivity (.17), high specificity (1.0), high positive predictive power (1.0), and moderate negative predictive power (.69). For the Validity-10 scale, a cutoff score of ≥13 was considered optimal, which resulted in moderate-high sensitivity (.63), high specificity (.97), and high positive (.93) and negative predictive power (.83). These findings provide strong support for the use of the Validity-10 as a tool to screen for symptom exaggeration when administering the NSI and PCL-C. The mBIAS, however, was not a reliable tool for this purpose and failed to identify the vast majority of people who exaggerated symptoms.
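The cutoff analysis reported above rests on four standard screening statistics, which are straightforward to compute; a minimal Python sketch with hypothetical inputs:

```python
import numpy as np

def screen_metrics(scores, exaggerating, cutoff):
    """Sensitivity, specificity, and positive/negative predictive power of a
    validity-scale cutoff (score >= cutoff flags symptom exaggeration)."""
    scores = np.asarray(scores)
    truth = np.asarray(exaggerating, dtype=bool)
    flag = scores >= cutoff
    tp = (flag & truth).sum();   fp = (flag & ~truth).sum()
    fn = (~flag & truth).sum();  tn = (~flag & ~truth).sum()
    return dict(sensitivity=tp / (tp + fn), specificity=tn / (tn + fp),
                ppp=tp / (tp + fp), npp=tn / (tn + fn))

# Sweeping cutoffs reproduces the optimal-threshold search, e.g.:
# for c in range(5, 20):
#     print(c, screen_metrics(validity10_scores, svt_fail_flags, c))
```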
Hunter, Christopher L; Silvestri, Salvatore; Ralls, George; Stone, Amanda; Walker, Ayanna; Mangalat, Neal; Papa, Linda
2018-05-01
Early identification of sepsis significantly improves outcomes, suggesting a role for prehospital screening. An end-tidal carbon dioxide (ETCO2) value ≤ 25 mmHg predicts mortality and severe sepsis when used as part of a prehospital screening tool. Recently, the Quick Sequential Organ Failure Assessment (qSOFA) score was also derived as a tool for predicting poor outcomes in potentially septic patients. We conducted a retrospective cohort study among patients transported by emergency medical services to compare the use of ETCO2 ≤ 25 mmHg with a qSOFA score of ≥ 2 as a predictor of mortality or diagnosis of severe sepsis in prehospital patients with suspected sepsis. By comparison of receiver operator characteristic curves, ETCO2 had a higher discriminatory power to predict mortality, sepsis, and severe sepsis than qSOFA. Both non-invasive measures were easily obtainable by prehospital personnel, with ETCO2 performing slightly better as an outcome predictor.
Chun, Ting Sie; Malek, M A; Ismail, Amelia Ritahani
2015-01-01
The development of effluent removal prediction is crucial in providing a planning tool necessary for the future development and construction of a septic sludge treatment plant (SSTP), especially in developing countries. In order to investigate the expected compliance with the required standard, the prediction of effluent quality - namely biological oxygen demand, chemical oxygen demand and total suspended solids - of an SSTP was modelled using an artificial intelligence approach. In this paper, we adopt the clonal selection algorithm (CSA) to set up a prediction model, with a well-established method, the least-squares support vector machine (LS-SVM), as a baseline model. The test results of the case study showed that the CSA-based SSTP model predicted well and provided model performance as satisfactory as the LS-SVM model. The CSA approach requires fewer control and training parameters for model simulation than the LS-SVM approach. The ability of a CSA approach to resolve limited data samples, non-linear sample functions and multidimensional pattern recognition makes it a powerful tool for modelling the prediction of effluent removals in an SSTP.
International Space Station Electric Power System Performance Code-SPACE
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey; McKissock, David; Fincannon, James; Green, Robert; Kerslake, Thomas; Delleur, Ann; Follo, Jeffrey; Trudell, Jeffrey; Hoffman, David J.; Jannette, Anthony;
2005-01-01
The System Power Analysis for Capability Evaluation (SPACE) software analyzes and predicts the minute-by-minute state of the International Space Station (ISS) electrical power system (EPS) for upcoming missions, as well as EPS power generation capacity as a function of ISS configuration and orbital conditions. To complete the Certification of Flight Readiness (CoFR) process, in which the mission is certified for flight, each ISS system must thoroughly assess every proposed mission to verify that the system will support the planned mission operations; SPACE is the sole tool used to conduct these assessments of power system capability. SPACE is an integrated power system model that incorporates a variety of modules tied together with integration routines and graphical output. The modules include orbit mechanics, solar array pointing/shadowing/thermal and electrical, battery performance, and power management and distribution performance. These modules are tightly integrated within a flexible architecture featuring data-file-driven configurations, source- or load-driven operation, and event scripting. SPACE also predicts the amount of power available for a given system configuration, spacecraft orientation, solar-array-pointing conditions, orbit, and the like. In the source-driven mode, the model must ensure that energy balance is achieved, meaning that energy removed from the batteries must be restored (or balanced) each and every orbit. This entails an optimization scheme to ensure that energy balance is maintained without violating any other constraints.
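The source-driven energy-balance requirement can be caricatured in a few lines: march the battery state of charge through one orbit and check that the energy removed is restored. The sketch below is a toy model with assumed efficiencies, not SPACE's actual battery or power-distribution models:

```python
def orbit_energy_balance(solar_power_w, load_power_w, dt_s, batt_wh, capacity_wh,
                         charge_eff=0.95, discharge_eff=0.95):
    """March battery energy through one orbit from time series of generated
    and consumed power; returns the final battery energy and whether the
    orbit is energy-balanced (illustrative efficiencies and balance rule)."""
    start_wh = batt_wh
    for gen, load, dt in zip(solar_power_w, load_power_w, dt_s):
        net_wh = (gen - load) * dt / 3600.0
        if net_wh >= 0:   # insolation surplus recharges the battery
            batt_wh = min(capacity_wh, batt_wh + net_wh * charge_eff)
        else:             # eclipse or load deficit discharges it
            batt_wh = max(0.0, batt_wh + net_wh / discharge_eff)
    return batt_wh, batt_wh >= start_wh   # balanced if removed energy restored
```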
ERIC Educational Resources Information Center
Koc, Mustafa
2012-01-01
This study explored (a) pre-service teachers' perceptions of using concept mapping (CM) in one of their pedagogical courses, (b) the predictive power of such implementation in course achievement, and (c) the role of prior experience with CM, type of mapping, and gender on their perceptions and performances in CM and achievement. The subjects were…
An experimental investigation of evolutionary dynamics in the Rock-Paper-Scissors game.
Hoffman, Moshe; Suetens, Sigrid; Gneezy, Uri; Nowak, Martin A
2015-03-06
Game theory describes social behaviors in humans and other biological organisms. By far the most powerful tool available to game theorists is the concept of a Nash Equilibrium (NE), which is motivated by perfect rationality. A NE specifies a strategy for everyone, such that no one would benefit by deviating unilaterally from his or her strategy. Another powerful tool available to game theorists is evolutionary dynamics (ED). Motivated by evolutionary and learning processes, ED specify changes in strategies over time in a population, such that more successful strategies typically become more frequent. A simple game that illustrates interesting ED is the generalized Rock-Paper-Scissors (RPS) game, which extends the children's game to situations where winning or losing can matter more or less relative to tying. Here we investigate experimentally three RPS games, in which the NE is always to randomize with equal probability but the evolutionary stability of this strategy changes. Consistent with the prediction of ED, we find that aggregate behavior is far away from the NE when it is evolutionarily unstable. Our findings add to the growing literature demonstrating the predictive validity of ED in large-scale incentivized laboratory experiments with human subjects.
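For a concrete sense of ED in this game, the replicator dynamics can be simulated directly. A minimal Python sketch of a generalized RPS game in which the loss penalty L tunes the evolutionary stability of the mixed NE; the payoff normalization (win = 1, tie = 0, loss = -L) is an assumption for illustration, not the experimental payoffs:

```python
import numpy as np

def replicator_rps(payoff, x0, steps=5000, dt=0.01):
    """Discrete-time replicator dynamics for a 3-strategy game.
    `payoff` is the row-player payoff matrix, strategies ordered R, P, S."""
    A = np.asarray(payoff, dtype=float)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        f = A @ x                        # fitness of each pure strategy
        x = x + dt * x * (f - x @ f)     # growth proportional to excess fitness
        x = np.clip(x, 1e-12, None)
        x /= x.sum()                     # keep x on the simplex
    return x

# Win = 1, tie = 0, loss = -L: the mixed NE (1/3, 1/3, 1/3) is asymptotically
# stable under these dynamics for L < 1 and unstable for L > 1.
L = 2.0
payoff = [[0, -L, 1], [1, 0, -L], [-L, 1, 0]]
print(replicator_rps(payoff, [0.5, 0.25, 0.25]))  # spirals away from the NE
```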
NASA Astrophysics Data System (ADS)
Ángel Prósper Fernández, Miguel; Casal, Carlos Otero; Canoura Fernández, Felipe; Miguez-Macho, Gonzalo
2017-04-01
Regional meteorological models are becoming a generalized tool for forecasting wind resource, owing to their capacity to simulate the local flow dynamics that impact wind farm production. This study focuses on the production forecast and validation of a real onshore wind farm using high horizontal and vertical resolution WRF (Weather Research and Forecasting) model simulations. The wind farm is located in Galicia, in the northwest of Spain, in a complex terrain region with high wind resource. Using the Fitch scheme, specific to wind farms, a period of one year is simulated with a daily operational forecasting set-up. Power and wind predictions are obtained and compared with real data provided by the management company. The results show that WRF is able to yield good operational wind power predictions for this kind of wind farm, owing to a good representation of the planetary boundary layer behaviour of the region and the good performance of the Fitch scheme under these conditions.
Challenges Facing Design and Analysis Tools
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)
2001-01-01
The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.
NASA Technical Reports Server (NTRS)
Posey, Joe W.; Dunn, M. H.; Farassat, F.
2004-01-01
This paper addresses two aspects of duct propagation and radiation which can contribute to more efficient fan noise predictions. First, we assess the effectiveness of Rayleigh's formula as a ducted fan noise prediction tool. This classical result, which predicts the sound produced by a piston in a flanged duct, is extended to include the uniform axial inflow case. Radiation patterns obtained from Rayleigh's formula with single radial mode input are compared to those obtained from the more precise ducted fan noise prediction code TBIEM3D. Agreement between the two methods is excellent in the peak noise regions, both forward and aft. Next, we use TBIEM3D to calculate generalized radiation impedances and power transmission coefficients. These quantities are computed for a wide range of operating parameters; results were obtained for higher Mach numbers, frequencies, and circumferential mode orders than have previously been published. Viewed as functions of frequency, the calculated trends in lower-order inlet impedances and power transmission coefficients agree with known results, while the relationships are more oscillatory for higher-order modes and higher Mach numbers.
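For reference, Rayleigh's formula for a baffled planar source, and the resulting classical far-field directivity of a flanged circular piston of radius a, take the static-medium form below; the paper's contribution is the extension of this result to uniform axial inflow:

```latex
p(\mathbf{r}, t)
  = \frac{\rho_0}{2\pi} \int_S \frac{\ddot{w}\!\left(\mathbf{r}_s,\; t - R/c\right)}{R}\, \mathrm{d}S ,
\qquad
D(\theta) = \left| \frac{2\, J_1(ka \sin\theta)}{ka \sin\theta} \right|
```

where w is the normal surface displacement, R = |r - r_s| is the distance from the surface element to the observer, k is the acoustic wavenumber and J_1 is the first-order Bessel function of the first kind.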
Dong, Jian-Jun; Li, Qing-Liang; Yin, Hua; Zhong, Cheng; Hao, Jun-Guang; Yang, Pan-Fei; Tian, Yu-Hong; Jia, Shi-Ru
2014-10-15
Sensory evaluation is regarded as a necessary procedure to ensure a reproducible quality of beer. Meanwhile, high-throughput analytical methods provide a powerful tool for analysing the various flavour compounds, such as higher alcohols and esters. In this study, the relationship between flavour compounds and sensory evaluation was established using non-linear models: partial least squares (PLS), genetic algorithm back-propagation neural network (GA-BP) and support vector machine (SVM). SVM with a radial basis function (RBF) kernel achieved better prediction accuracy for both the calibration set (94.3%) and the validation set (96.2%) than the other models. Relatively lower prediction abilities were observed for GA-BP (52.1%) and PLS (31.7%). In addition, the kernel function played an essential role in model training: the prediction accuracy of SVM with a polynomial kernel was only 32.9%. As a powerful multivariate statistical method, SVM holds great potential for assessing beer quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
Soukarieh, Omar; Gaildrat, Pascaline; Hamieh, Mohamad; Drouet, Aurélie; Baert-Desurmont, Stéphanie; Frébourg, Thierry; Tosi, Mario; Martins, Alexandra
2016-01-01
The identification of a causal mutation is essential for molecular diagnosis and clinical management of many genetic disorders. However, even if next-generation exome sequencing has greatly improved the detection of nucleotide changes, the biological interpretation of most exonic variants remains challenging. Moreover, particular attention is typically given to protein-coding changes, often neglecting the potential impact of exonic variants on RNA splicing. Here, we used exon 10 of MLH1, a gene implicated in hereditary cancer, as a model system to assess the prevalence of RNA splicing mutations among all single-nucleotide variants identified in a given exon. We performed comprehensive minigene assays and analyzed patients' RNA when available. Our study revealed a staggering number of splicing mutations in MLH1 exon 10 (77% of the 22 analyzed variants), including mutations directly affecting splice sites and, particularly, mutations altering potential splicing regulatory elements (ESRs). We then used this thoroughly characterized dataset, together with experimental data derived from previous studies on BRCA1, BRCA2, CFTR and NF1, to evaluate the predictive power of three in silico approaches recently described as promising tools for pinpointing ESR mutations. Our results indicate that ΔtESRseq- and ΔHZEI-based approaches not only discriminate which variants affect splicing, but also predict the direction and severity of the induced splicing defects. In contrast, the ΔΨ-based approach did not show compelling predictive power. Our data indicate that exonic splicing mutations are more prevalent than currently appreciated and that they can now be predicted by using bioinformatics methods. These findings have implications for all genetically caused diseases. PMID:26761715
Kengkla, K; Charoensuk, N; Chaichana, M; Puangjan, S; Rattanapornsompong, T; Choorassamee, J; Wilairat, P; Saokaew, S
2016-05-01
Extended-spectrum β-lactamase-producing Escherichia coli (ESBL-EC) has important implications for infection control and empiric antibiotic prescribing. This study aims to develop a risk scoring system for predicting ESBL-EC infection based on local epidemiology. The study retrospectively collected data on eligible patients with a positive culture for E. coli during 2011 to 2014. The risk scoring system was developed using variables independently associated with ESBL-EC infection through logistic regression-based prediction. The area under the receiver-operator characteristic curve (AuROC) was determined to confirm the predictive power of the model. Predictors of ESBL-EC infection were male gender [odds ratio (OR): 1.53], age ≥55 years (OR: 1.50), healthcare-associated infection (OR: 3.21), hospital-acquired infection (OR: 2.28), sepsis (OR: 1.79), prolonged hospitalization (OR: 1.88), history of ESBL infection within one year (OR: 7.88), prior use of broad-spectrum cephalosporins within three months (OR: 12.92), and prior use of other antibiotics within three months (OR: 2.14). Points scored ranged from 0 to 47 and were divided into three groups based on diagnostic performance parameters: low risk (score: 0-8; 44.57%), moderate risk (score: 9-11; 21.85%) and high risk (score: ≥12; 33.58%). The model displayed moderate predictive power (AuROC: 0.773; 95% confidence interval: 0.742-0.805) and good calibration (Hosmer-Lemeshow χ(2) = 13.29; P = 0.065). This tool may optimize the prescribing of empirical antibiotic therapy, minimize the time to identify patients, and prevent the spread of ESBL-EC. Prior to adoption into routine clinical practice, further validation of the tool is needed. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
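The general pattern behind such scores, logistic-regression effect sizes converted to additive integer points, can be sketched from the odds ratios quoted above. The scaling and rounding below are illustrative assumptions and do not reproduce the authors' exact 0-47 point derivation; the published risk strata are used only as default cutoffs:

```python
import math

# Odds ratios reported in the abstract; the x5 log-odds scaling and rounding
# are assumptions for illustration, not the authors' published point values.
ODDS_RATIOS = {
    "male": 1.53, "age_ge_55": 1.50, "healthcare_associated": 3.21,
    "hospital_acquired": 2.28, "sepsis": 1.79, "prolonged_stay": 1.88,
    "esbl_history_1y": 7.88, "prior_cephalosporins_3m": 12.92,
    "prior_other_antibiotics_3m": 2.14,
}
POINTS = {k: round(5 * math.log(or_)) for k, or_ in ODDS_RATIOS.items()}

def esbl_risk(present, low_max=8, moderate_max=11):
    """Sum the points for the risk factors present and map the total to the
    published strata (low <= 8, moderate 9-11, high >= 12)."""
    score = sum(POINTS[f] for f in present)
    stratum = ("low" if score <= low_max
               else "moderate" if score <= moderate_max else "high")
    return score, stratum

print(esbl_risk({"male", "sepsis", "prior_cephalosporins_3m"}))
```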
Disease Staging and Prognosis in Smokers Using Deep Learning in Chest Computed Tomography.
González, Germán; Ash, Samuel Y; Vegas-Sánchez-Ferrero, Gonzalo; Onieva Onieva, Jorge; Rahaghi, Farbod N; Ross, James C; Díaz, Alejandro; San José Estépar, Raúl; Washko, George R
2018-01-15
Deep learning is a powerful tool that may allow for improved outcome prediction. To determine if deep learning, specifically convolutional neural network (CNN) analysis, could detect and stage chronic obstructive pulmonary disease (COPD) and predict acute respiratory disease (ARD) events and mortality in smokers. A CNN was trained using computed tomography scans from 7,983 COPDGene participants and evaluated using 1,000 nonoverlapping COPDGene participants and 1,672 ECLIPSE participants. Logistic regression (C statistic and the Hosmer-Lemeshow test) was used to assess COPD diagnosis and ARD prediction. Cox regression (C index and the Greenwood-Nam-D'Agostino test) was used to assess mortality. In COPDGene, the C statistic for the detection of COPD was 0.856. A total of 51.1% of participants in COPDGene were accurately staged and 74.95% were within one stage. In ECLIPSE, 29.4% were accurately staged and 74.6% were within one stage. In COPDGene and ECLIPSE, the C statistics for ARD events were 0.64 and 0.55, respectively, and the Hosmer-Lemeshow P values were 0.502 and 0.380, respectively, suggesting no evidence of poor calibration. In COPDGene and ECLIPSE, the CNN predicted mortality with fair discrimination (C indices, 0.72 and 0.60, respectively) and without evidence of poor calibration (Greenwood-Nam-D'Agostino P values, 0.307 and 0.331, respectively). A deep-learning approach that uses only computed tomography imaging data can identify those smokers who have COPD and predict who are most likely to have ARD events and the highest mortality. At a population level, CNN analysis may be a powerful tool for risk assessment.
Surjadjaja, Claudia; Mayhew, Susannah H
2011-01-01
The relevance and importance of research for understanding policy processes and influencing policies has been much debated, but studies on the effectiveness of policy theories for predicting and informing opportunities for policy change (i.e. prospective policy analysis) are rare. The case study presented in this paper is drawn from a policy analysis of a contemporary process of policy debate on legalization of abortion in Indonesia, which was in flux at the time of the research and provided a unique opportunity for prospective analysis. Applying a combination of policy analysis theories, this case study provides an analysis of processes, power and relationships between actors involved in the amendment of the Health Law in Indonesia. It uses a series of practical stakeholder mapping tools to identify power relations between key actors and what strategic approaches should be employed to manage these to enhance the possibility of policy change. The findings show how the moves to legalize abortion have been supported or constrained according to the balance of political and religious powers operating in a macro-political context defined increasingly by a polarized Islamic-authoritarian—Western-liberal agenda. The issue of reproductive health constituted a battlefield where these two ideologies met and the debate on the current health law amendment became a contest, which still continues, for the larger future of Indonesia. The findings confirm the utility of policy analysis theories and stakeholder mapping tools for predicting the likelihood of policy change and informing the strategic approaches for achieving such change. They also highlight opportunities and dilemmas in prospective policy analysis and raise questions about whether research on policy processes and actors can or should be used to inform, or even influence, policies in ‘real-time’. PMID:21183461
NASA Astrophysics Data System (ADS)
Pugh, Ray; Huff, Roy
1999-03-01
The importance of infrared (IR) technology and analysis in today's world of predictive maintenance and reliability-centered maintenance cannot be overstated. The use of infrared is especially important in facilities that are required to maintain a high degree of equipment reliability because of plant or public safety concerns. As with all maintenance tools, particularly those used in predictive maintenance approaches, training plays a key role in their effectiveness and the benefit gained from their use. This paper details an effort to transfer IR technology to Soviet-designed nuclear power plants in Russia, Ukraine, and Lithuania. Delivery of this technology and post-delivery training activities have recently been completed at the Chornobyl nuclear power plant in Ukraine. Many interesting challenges were encountered during this effort. Hardware procurement and delivery of IR technology to a sensitive country were complicated by United States regulations. Freight and shipping infrastructure and host-country customs policies complicated hardware transport. Training activities were complicated by special hardware, software and training material translation needs, limited communication opportunities, and site logistical concerns. These challenges and others encountered while supplying the Chornobyl plant with state-of-the-art IR technology are described in this paper.
Burridge-Knopoff Model as an Educational and Demonstrational Tool in Seismicity Prediction
NASA Astrophysics Data System (ADS)
Kato, M.
2007-12-01
While our effort is ongoing, the fact that predicting destructive earthquakes is not a straightforward business is hard to convey to the general public. Japan is prone to two types of destructive earthquakes: interplate events along the Japan Trench and the Nankai Trough, and intraplate events that often occur beneath megacities. The periodicity of interplate earthquakes is usually explained by the elastic rebound theory, but we are aware that the historical seismicity along the Nankai Trough is not simply periodic. Inland intraplate events have geologically postulated recurrence intervals far longer than a human lifetime, and we do not have ample knowledge to model their behavior, including interaction among intraplate and interplate events. To demonstrate that accumulation and release of elastic energy is complex even in a simple system, we propose to utilize the Burridge-Knopoff (BK) model as a demonstrational tool. The original one-dimensional model is easy to construct and handle, which also makes it an effective educational tool for classroom use. Our simulator is a simple realization of the original one-dimensional BK model, consisting of small blocks, springs and a motor. Accumulation and release of strain are directly observable, and by guessing when the next large event will occur we can learn intuitively that observation of strain accumulation is only one element in predicting large events. Quantitative analysis of the system is also possible by measuring the movement of the blocks. While the long-term average of strain energy is controlled by the loading rate, the observed seismicity is neither time-predictable nor slip-predictable, and the time between successive events is never constant. The distribution of released energy obeys a power law, similar to the Ishimoto-Iida and Gutenberg-Richter laws. This tool is also useful for demonstrating the nonlinear behavior of a complex system.
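A convenient numerical counterpart to the physical BK demonstrator is its quasi-static cellular-automaton form (the Olami-Feder-Christensen mapping): load all blocks uniformly, let the most-stressed block slip when it reaches a threshold, and transfer part of its stress to its neighbours. A minimal Python sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def ofc_bk(n=64, alpha=0.2, steps=20000, fc=1.0):
    """Quasi-static spring-block (Burridge-Knopoff-type) cellular automaton
    in the Olami-Feder-Christensen formulation: uniform loading, threshold
    slip, and partial stress transfer to the two 1-D neighbours."""
    f = rng.uniform(0, fc, n)          # initial stress on each block
    sizes = []
    for _ in range(steps):
        f += fc - f.max()              # load until the most-stressed block fails
        unstable = np.flatnonzero(f >= fc)
        size = 0
        while unstable.size:           # avalanche: failures may cascade
            size += unstable.size
            for i in unstable:
                if i > 0: f[i - 1] += alpha * f[i]
                if i < n - 1: f[i + 1] += alpha * f[i]
                f[i] = 0.0
            unstable = np.flatnonzero(f >= fc)
        sizes.append(size)
    return np.array(sizes)             # event sizes follow a power law,
                                       # echoing Gutenberg-Richter statistics

sizes = ofc_bk()
print(np.bincount(sizes)[1:10])        # frequency of small event sizes
```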
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borges, Ronaldo C.; D'Auria, Francesco; Alvim, Antonio Carlos M.
2002-07-01
The Code with the capability of Internal Assessment of Uncertainty (CIAU) is a tool proposed by the 'Dipartimento di Ingegneria Meccanica, Nucleare e della Produzione (DIMNP)' of the University of Pisa. Other institutions, including the nuclear regulatory body of Brazil, 'Comissao Nacional de Energia Nuclear', contributed to the development of the tool. The CIAU aims at providing the currently available Relap5/Mod3.2 system code with the integrated capability of performing not only relevant transient calculations but also the related estimates of uncertainty bands. The Uncertainty Methodology based on Accuracy Extrapolation (UMAE) is used to characterize the uncertainty in the prediction of system code calculations for light water reactors and is internally coupled with the above system code. Following an overview of the CIAU development, the present paper deals with the independent qualification of the tool. The qualification test is performed by estimating the uncertainty bands that should envelope the prediction of the Angra 1 NPP transient RES-11.99, originated by an inadvertent complete load rejection that caused the reactor scram when the unit was operating at 99% of nominal power. The current limitation of the 'error' database implemented into the CIAU prevented a final demonstration of the qualification; however, all the steps of the qualification process are demonstrated. (authors)
Rincon, Sergio A; Paoletti, Anne
2016-01-01
Unveiling the function of a novel protein is a challenging task that requires careful experimental design. Yeast cytokinesis is a conserved process that involves modular structural and regulatory proteins. For such proteins, an important step is to identify their domains and structural organization. Here we briefly discuss a collection of methods commonly used for sequence alignment and prediction of protein structure that represent powerful tools for the identification of homologous domains and for the design of structure-function approaches to test experimentally the function of multi-domain proteins such as those implicated in yeast cytokinesis.
A tool for modeling concurrent real-time computation
NASA Technical Reports Server (NTRS)
Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.
1990-01-01
Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands the use of knowledge-based problem solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment), powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.
Some results regarding stability of photovoltaic maximum-power-point tracking dc-dc converters
NASA Astrophysics Data System (ADS)
Schaefer, John F.
An analytical investigation of a class of photovoltaic (PV) maximum-power-point tracking dc-dc converters has yielded basic results on the stability of such devices. Necessary and sufficient conditions for stable operation are derived, and design tools are given. Specific results have been obtained for arbitrary PV arrays driving converters powering resistive loads and batteries. The analytical techniques are also applicable to inverters. Portions of the theoretical results have been verified in operational devices: a 1500-watt unit has driven a 1-horsepower, 90-volt dc motor powering a water pump jack for over one year. Prior to modification shortly after initial installation, the unit exhibited instability at low levels of irradiance, as predicted by the theory. Two examples are provided.
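The abstract does not state which tracking algorithm the converters use, so as background the sketch below implements perturb-and-observe, the simplest common member of the MPPT family; the toy PV curve, step size, and parameter values are all assumptions for illustration.

```python
import math

def pv_current(v, i_sc=8.0, v_oc=40.0):
    """Very rough PV I(V) curve: near-constant current, sharp knee at v_oc."""
    return max(0.0, i_sc * (1.0 - math.exp((v - v_oc) / 3.0)))

# Perturb-and-observe: nudge the operating voltage, keep the direction
# while power rises, reverse it when power falls.
v, dv = 20.0, 0.5
p_prev = v * pv_current(v)
for _ in range(200):
    v += dv
    p = v * pv_current(v)
    if p < p_prev:
        dv = -dv          # power fell: reverse perturbation direction
    p_prev = p
print(f"oscillating near V = {v:.1f} V, P = {p_prev:.1f} W")
```

Such hill-climbing loops settle into a small oscillation around the maximum power point rather than a fixed operating voltage.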
Lee, Ciaran M; Cradick, Thomas J; Fine, Eli J; Bao, Gang
2016-01-01
The rapid advancement in targeted genome editing using engineered nucleases such as ZFNs, TALENs, and CRISPR/Cas9 systems has resulted in a suite of powerful methods that allows researchers to target any genomic locus of interest. A complementary set of design tools has been developed to aid researchers with nuclease design, target site selection, and experimental validation. Here, we review the various tools available for target selection in designing engineered nucleases, and for quantifying nuclease activity and specificity, including web-based search tools and experimental methods. We also elucidate challenges in target selection, especially in predicting off-target effects, and discuss future directions in precision genome editing and its applications. PMID:26750397
Metabolic network flux analysis for engineering plant systems.
Shachar-Hill, Yair
2013-04-01
Metabolic network flux analysis (NFA) tools have proven themselves to be powerful aids to metabolic engineering of microbes by providing quantitative insights into the flows of material and energy through cellular systems. The development and application of NFA tools for plant systems have advanced in recent years and are yielding significant insights and testable predictions. Plants present substantial opportunities for the practical application of NFA, but they also pose serious challenges related to the complexity of plant metabolic networks and to deficiencies in our knowledge of their structure and regulation. By considering the tools available and selected examples, this article attempts to assess where and how NFA is most likely to have a real impact on plant biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
Srinivasan, M; Shetty, N; Gadekari, S; Thunga, G; Rao, K; Kunhikatta, V
2017-07-01
Severity or mortality prediction of nosocomial pneumonia could aid in the effective triage of patients and assist physicians. To compare various severity assessment scoring systems for predicting intensive care unit (ICU) mortality in nosocomial pneumonia patients. A prospective cohort study was conducted in a tertiary care university-affiliated hospital in Manipal, India. One hundred patients with nosocomial pneumonia, admitted to the ICUs and developing pneumonia >48 h after admission, were included. The Nosocomial Pneumonia Mortality Prediction (NPMP) model, developed in our hospital, was compared with the Acute Physiology and Chronic Health Evaluation II (APACHE II), Mortality Probability Model II at 72 h (MPM II-72), Simplified Acute Physiology Score II (SAPS II), Multiple Organ Dysfunction Score (MODS), Sequential Organ Failure Assessment (SOFA), Clinical Pulmonary Infection Score (CPIS), and Ventilator-Associated Pneumonia Predisposition, Insult, Response, Organ dysfunction (VAP-PIRO) scores. Data and clinical variables were collected on the day of pneumonia diagnosis. The outcome for the study was ICU mortality. The sensitivity and specificity of the various scoring systems were analysed by plotting receiver operating characteristic (ROC) curves and computing the area under the curve for each of the mortality prediction tools. NPMP, APACHE II, SAPS II, MPM II-72, SOFA, and VAP-PIRO were found to have similar and acceptable discrimination power as assessed by the area under the ROC curve. The AUC values for the above scores ranged from 0.735 to 0.762. CPIS and MODS showed the least discrimination. NPMP is a specific tool to predict mortality in nosocomial pneumonia and is comparable to other standard scores. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
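As a sketch of how such comparisons are typically computed, the snippet below scores two of the named systems by area under the ROC curve; the score values and outcomes are simulated stand-ins, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
died = rng.integers(0, 2, size=100)          # simulated ICU mortality outcome
scores = {
    "APACHE II": died * 6 + rng.normal(15, 5, 100),   # synthetic score values
    "SOFA":      died * 3 + rng.normal(6, 3, 100),
}
for name, s in scores.items():
    print(f"{name}: AUC = {roc_auc_score(died, s):.3f}")
```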
Power-Production Diagnostic Tools for Low-Density Wind Farms with Applications to Wake Steering
NASA Astrophysics Data System (ADS)
Takle, E. S.; Herzmann, D.; Rajewski, D. A.; Lundquist, J. K.; Rhodes, M. E.
2016-12-01
Hansen (2011) provided guidelines for wind farm wake analysis with applications to "high-density" wind farms (where the average distance between turbines is less than ten times the rotor diameter). For "low-density" wind farms (average distance greater than fifteen times the rotor diameter), or sections of wind farms, we demonstrate simpler sorting and visualization tools that reveal wake interactions and opportunities for wind farm power prediction and wake steering. SCADA data from a segment of a large mid-continent wind farm, together with surface flux measurements and lidar data, are subjected to analysis and visualization of wake interactions. A time-history animated visualization of a plan view of the power level of individual turbines provides a quick analysis of wake interaction dynamics. Yaw-based sectoral histograms of the enhancement/decline of wind speed and power relative to wind farm reference levels reveal the angular width of wake interactions and identify the turbine(s) responsible for the power reduction. Concurrent surface flux measurements within the wind farm allowed us to evaluate the influence of stability on wake loss. A one-season climatology is used to identify high-priority candidates for wake steering based on estimated power recovery. Typical clearing prices on the day-ahead market are used to estimate the added value of wake steering. Current research is exploring options for identifying candidate locations for wind farm "build-in" in existing low-density wind farms.
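A hedged sketch of the yaw-sector sorting described above: bin records by the upwind turbine's yaw angle and compare a downwind turbine's mean power with a wind-farm reference. The column names and simulated records are hypothetical stand-ins for a SCADA export.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
scada = pd.DataFrame({
    "yaw_deg": rng.uniform(0, 360, 5000),          # upwind turbine yaw
    "power_kw": rng.uniform(200, 1800, 5000),      # downwind turbine power
    "ref_power_kw": rng.uniform(800, 1800, 5000),  # wind-farm reference level
})
scada["sector"] = (scada["yaw_deg"] // 10 * 10).astype(int)  # 10-degree bins
stats = scada.groupby("sector").agg(power=("power_kw", "mean"),
                                    ref=("ref_power_kw", "mean"))
stats["deficit_pct"] = 100 * (1 - stats["power"] / stats["ref"])
# sectors with the largest deficits point at the waking turbine(s)
print(stats.sort_values("deficit_pct", ascending=False).head())
```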
NASA Astrophysics Data System (ADS)
Kurihara, Osamu; Kim, Eunjoo; Kunishima, Naoaki; Tani, Kotaro; Ishikawa, Tetsuo; Furuyama, Kazuo; Hashimoto, Shozo; Akashi, Makoto
2017-09-01
A tool was developed to facilitate the calculation of the early internal doses to residents involved in the Fukushima Nuclear Disaster based on atmospheric transport and dispersion model (ATDM) simulations performed using the Worldwide version of the System for Prediction of Environmental Emergency Information, 2nd version (WSPEEDI-II), together with personal behavior data containing the history of the whereabouts of individuals after the accident. The tool generates hourly-averaged air concentration data for the simulation grid cells nearest to an individual's whereabouts using WSPEEDI-II datasets for the subsequent calculation of internal doses due to inhalation. This paper presents an overview of the developed tool and provides tentative comparisons between direct measurement-based and ATDM-based results regarding the internal doses received by 421 persons for whom personal behavior data were available.
Angioi, Manuela; Metsios, George S; Twitchett, Emily; Koutedakis, Yiannis; Wyon, Matthew
2009-01-01
The physical demands imposed on contemporary dancers by choreographers and performance schedules make their physical fitness just as important to them as skill development. Nevertheless, it remains to be confirmed which physical fitness components are associated with aesthetic competence. The aim of this study was to: 1. replicate and test a novel aesthetic competence tool for reliability, and 2. investigate the association between selected physical fitness components and aesthetic competence by using this new tool. Seventeen volunteers underwent a series of physical fitness tests (body composition, flexibility, muscular power and endurance, and aerobic capacity) and aesthetic competence assessments (seven individual criteria commonly used by selected dance companies). Inter-rater reliability of the aesthetic competence tool was very high (r = 0.96). There were significant correlations between the aesthetic competence score and jump ability and push-ups (r = 0.55 and r = 0.55, respectively). Stepwise backward multiple regression analysis revealed that the best predictor of aesthetic competence was push-ups (R(2) = 0.30, p = 0.03). Univariate analyses also revealed that the interaction of push-ups and jump ability improved the prediction power of aesthetic competence (R(2) = 0.44, p = 0.004). It is concluded that upper body muscular endurance and jump ability best predict aesthetic competence of the present sample of contemporary dancers. Further research is required to investigate the contribution of other components of aesthetic competence, including upper body strength, lower body muscular endurance, general coordination, and static and dynamic balance.
Implementing Machine Learning in the PCWG Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clifton, Andrew; Ding, Yu; Stuart, Peter
The Power Curve Working Group (www.pcwg.org) is an ad-hoc industry-led group to investigate the performance of wind turbines in real-world conditions. As part of ongoing experience-sharing exercises, machine learning has been proposed as a possible way to predict turbine performance. This presentation provides some background information about machine learning and how it might be implemented in the PCWG exercises.
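As a hedged illustration of the proposal, the sketch below learns a turbine's power response from weather inputs with a random forest; the input variables and the toy power relation are assumptions, not PCWG specifications.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
wind = rng.uniform(3, 20, 2000)                      # hub-height wind speed, m/s
ti = rng.uniform(0.05, 0.20, 2000)                   # turbulence intensity
power = np.clip(0.5 * wind**3 * (1 - ti), 0, 2000)   # toy power response, kW

X = np.column_stack([wind, ti])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, power)
print("predicted kW at 10 m/s, TI 0.10:", model.predict([[10.0, 0.10]])[0])
```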
A Comparison of Synoptic Classification Methods for Application to Wind Power Prediction
NASA Astrophysics Data System (ADS)
Fowler, P.; Basu, S.
2008-12-01
Wind energy is a highly variable resource. To make it competitive with other sources of energy for integration on the power grid, at the very least a day-ahead forecast of power output must be available. In many grid operations worldwide, next-day power output is scheduled in 30-minute intervals and grid management routinely occurs in real time. Maintenance and repairs require costly time to complete and must be scheduled alongside normal operations. Revenue is dependent on the reliability of the entire system. In other words, there is financial and managerial benefit in short-term prediction of wind power. One approach to short-term forecasting is to combine a data-centric method such as an artificial neural network with a physically based approach like numerical weather prediction (NWP). The key is in associating high-dimensional NWP model output with the most appropriately trained neural network. Because neural networks perform best in the situations they are designed for, one can hypothesize that if similar recurring states can be identified in historical weather data, these data can be used to train multiple custom-designed neural networks to be used when called upon by numerical prediction. Identifying similar recurring states may offer insight into how a neural network forecast can be improved, but amassing that knowledge and utilizing it efficiently in the time required for power prediction would be difficult for a human to master, which shows the advantage of classification. Classification methods are important tools for short-term forecasting because they can be unsupervised, objective, and computationally quick. They primarily involve categorizing data sets into dominant weather classes, but there are numerous ways to define a class and great variety in the interpretation of the results. In the present study a collection of classification methods is used on a sampling of atmospheric variables from the North American Regional Reanalysis data set. The results will be discussed in relation to their use for short-term wind power forecasting by neural networks.
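One plausible realization of this scheme, sketched below on invented data: cluster daily reanalysis fields into synoptic classes with k-means, then train one small network per class on the days assigned to it. Grid size, cluster count, and network shape are all illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
days, grid = 600, 20 * 20
fields = rng.normal(size=(days, grid))        # stand-in for e.g. 850 hPa winds
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(fields)

wind_power = rng.uniform(0, 1, days)          # stand-in for observed output
# one custom network per synoptic class, trained only on "its" days
models = {
    c: MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(fields[labels == c],
                                        wind_power[labels == c])
    for c in range(6)
}
```

At forecast time, an NWP field would be assigned to its nearest class and handed to that class's network.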
Using Predictive Analytics to Predict Power Outages from Severe Weather
NASA Astrophysics Data System (ADS)
Wanik, D. W.; Anagnostou, E. N.; Hartman, B.; Frediani, M. E.; Astitha, M.
2015-12-01
The distribution of reliable power is essential to businesses, public services, and our daily lives. With the growing abundance of data being collected and created by industry (e.g., outage data), government agencies (e.g., land cover), and academia (e.g., weather forecasts), we can begin to tackle problems that previously seemed too complex to solve. In this session, we will present newly developed tools to aid decision-support challenges at electric distribution utilities that must mitigate, prepare for, respond to and recover from severe weather. We will show a performance evaluation of outage predictive models built for Eversource Energy (formerly Connecticut Light & Power) for storms of all types (e.g., blizzards, thunderstorms and hurricanes) and magnitudes (from 20 to >15,000 outages). High-resolution weather simulations (run with the Weather Research and Forecasting model) were joined with utility outage data to calibrate four types of models: a decision tree (DT), random forest (RF), boosted gradient tree (BT) and an ensemble (ENS) decision tree regression that combined predictions from DT, RF and BT. The study shows that the ENS model forced with weather, infrastructure and land cover data was superior to the other models we evaluated, especially in terms of predicting the spatial distribution of outages. This research has the potential to be applied to other critical infrastructure systems (such as telecommunications, drinking water and gas distribution networks), and can be readily expanded to the entire New England region to facilitate better planning and coordination among decision-makers when severe weather strikes.
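The ENS model is described as a decision-tree regression over the predictions of the three base models; the sketch below reproduces that arrangement on simulated data (in practice the stacking model would be fit on held-out predictions to avoid leakage).

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 8))      # stand-ins for weather/infrastructure inputs
y = np.abs(3 * X[:, 0] + X[:, 1] + rng.normal(size=500))   # outages per area

dt = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X, y)
rf = RandomForestRegressor(random_state=0).fit(X, y)
bt = GradientBoostingRegressor(random_state=0).fit(X, y)

# ENS: a decision tree trained on the three base-model predictions
stacked = np.column_stack([m.predict(X) for m in (dt, rf, bt)])
ens = DecisionTreeRegressor(max_depth=4, random_state=0).fit(stacked, y)
```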
NASA Technical Reports Server (NTRS)
Sandlin, Doral R.; Howard, Kipp E.
1991-01-01
A user friendly FORTRAN code that can be used for preliminary design of V/STOL aircraft is described. The program estimates lift increments, due to power induced effects, encountered by aircraft in V/STOL flight. These lift increments are calculated using empirical relations developed from wind tunnel tests and are due to suckdown, fountain, ground vortex, jet wake, and the reaction control system. The code can be used as a preliminary design tool along with NASA Ames' Aircraft Synthesis design code or as a stand-alone program for V/STOL aircraft designers. The Power Induced Effects (PIE) module was validated using experimental data and data computed from lift increment routines. Results are presented for many flat plate models along with the McDonnell Aircraft Company's MFVT (mixed flow vectored thrust) V/STOL preliminary design and a 15 percent scale model of the YAV-8B Harrier V/STOL aircraft. Trends and magnitudes of lift increments versus aircraft height above the ground were predicted well by the PIE module. The code also provided good predictions of the magnitudes of lift increments versus aircraft forward velocity. More experimental results are needed to determine how well the code predicts lift increments as they vary with jet deflection angle and angle of attack. The FORTRAN code is provided in the appendix.
Solar Field Optical Characterization at Stillwater Geothermal/Solar Hybrid Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Guangdong; Turchi, Craig
2017-01-27
Concentrating solar power (CSP) can provide additional thermal energy to boost geothermal plant power generation. For a newly constructed solar field at a geothermal power plant site, it is critical to properly characterize its performance so that the prediction of thermal power generation can be derived to develop an optimum operating strategy for a hybrid system. In the past, laboratory characterization of a solar collector has often extended into the solar field performance model and has been used to predict the actual solar field performance, disregarding realistic impacting factors. In this work, an extensive measurement of mirror slope error and receiver position error has been performed in the field by using the optical characterization tool called Distant Observer (DO). Combining a solar reflectance sampling procedure, a newly developed solar characterization program called FirstOPTIC, and public software for annual performance modeling called System Advisor Model (SAM), a comprehensive solar field optical characterization has been conducted, thus allowing for an informed prediction of solar field annual performance. The paper illustrates this detailed solar field optical characterization procedure and demonstrates how the results help to quantify an appropriate tracking-correction strategy to improve solar field performance. In particular, it is found that an appropriate tracking-offset algorithm can improve the solar field performance by about 15%. The work here provides a valuable reference for the growing CSP industry.
Nutritional Risk in Emergency-2017: A New Simplified Proposal for a Nutrition Screening Tool.
Marcadenti, Aline; Mendes, Larissa Loures; Rabito, Estela Iraci; Fink, Jaqueline da Silva; Silva, Flávia Moraes
2018-03-13
There are many nutrition screening tools currently being applied in hospitals to identify risk of malnutrition. However, multivariate statistical models are not usually employed to take into account the importance of each variable included in the instrument's development. To develop and evaluate the concurrent and predictive validities of a new screening tool of nutrition risk. A prospective cohort study was developed, in which 4 nutrition screening tools were applied to all patients. Length of stay in hospital and mortality were considered to test the predictive validity, and the concurrent validity was tested by comparing the Nutritional Risk in Emergency (NRE)-2017 to the other tools. A total of 748 patients were included. The final NRE-2017 score was composed of 6 questions (advanced age, metabolic stress of the disease, decreased appetite, change of food consistency, unintentional weight loss, and muscle mass loss) with yes/no answers. The prevalence of nutrition risk was 50.7% and 38.8% considering the cutoff points 1.0 and 1.5, respectively. The NRE-2017 showed a satisfactory power to identify risk of malnutrition (area under the curve >0.790 for all analyses). According to the NRE-2017, patients at risk of malnutrition have a twofold higher relative risk of a very long hospital stay. The hazard ratio for mortality was 2.78 (1.03-7.49) when the cutoff adopted by the NRE-2017 was 1.5 points. NRE-2017 is a new, easy-to-apply nutrition screening tool which uses 6 bi-categoric features to detect the risk of malnutrition, and it presented good concurrent and predictive validity. © 2018 American Society for Parenteral and Enteral Nutrition.
The Zeldovich & Adhesion approximations and applications to the local universe
NASA Astrophysics Data System (ADS)
Hidding, Johan; van de Weygaert, Rien; Shandarin, Sergei
2016-10-01
The Zeldovich approximation (ZA) predicts the formation of a web of singularities. While these singularities may only exist in the most formal interpretation of the ZA, they provide a powerful tool for the analysis of initial conditions. We present a novel method to find the skeleton of the resulting cosmic web based on singularities in the primordial deformation tensor and its higher order derivatives. We show that the A₃ lines predict the formation of filaments in a two-dimensional model. We continue with applications of the adhesion model to visualise structures in the local (z < 0.03) universe.
Recent experience with the CQE™
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, C.D.; Kehoe, D.B.; O`Connor, D.C.
1997-12-31
CQE (the Coal Quality Expert) is a software tool that brings a new level of sophistication to fuel decisions by seamlessly integrating the system-wide effects of fuel purchase decisions on power plant performance, emissions, and power generation costs. The CQE technology, which addresses fuel quality from the coal mine to the busbar and the stack, is an integration and improvement of predecessor software tools including: EPRI's Coal Quality Information System, EPRI's Coal Cleaning Cost Model, EPRI's Coal Quality Impact Model, and EPRI and DOE models to predict slagging and fouling. CQE can be used as a stand-alone workstation or as a network application for utilities, coal producers, and equipment manufacturers to perform detailed analyses of the impacts of coal quality, capital improvements, operational changes, and/or environmental compliance alternatives on power plant emissions, performance and production costs. It can be used as a comprehensive, precise and organized methodology for systematically evaluating all such impacts, or it may be used in pieces with some default data to perform more strategic or comparative studies.
NASA Technical Reports Server (NTRS)
Eck, Marshall; Mukunda, Meera
1988-01-01
A calculational method is described that provides a powerful tool for predicting solid rocket motor (SRM) casing and liquid rocket tankage fragmentation response. The approach properly partitions the available impulse to each major system-mass component. It uses the Pisces code developed by Physics International to couple the forces generated by an Eulerian-modeled gas flow field to a Lagrangian-modeled fuel and casing system. The details of the predictive analytical modeling process and the development of normalized relations for momentum partition as a function of SRM burn time and initial geometry are discussed. Methods for applying similar modeling techniques to liquid-tankage-overpressure failures are also discussed. Good agreement between predictions and observations is obtained for five specific events.
An intelligent load shedding scheme using neural networks and neuro-fuzzy.
Haidar, Ahmed M A; Mohamed, Azah; Al-Dabbagh, Majid; Hussain, Aini; Masoum, Mohammad
2009-12-01
Load shedding is an essential requirement for maintaining the security of modern power systems, particularly in competitive energy markets. This paper proposes an intelligent scheme for fast and accurate load shedding that uses neural networks to predict the possible loss of load at an early stage and neuro-fuzzy logic to determine the amount of load to shed in order to avoid a cascading outage. A large-scale electrical power system has been considered to validate the performance of the proposed technique in determining the amount of load shed. The proposed techniques can provide tools for improving the reliability and continuity of the power supply. This was confirmed by the results obtained in this research, of which sample results are given in this paper.
Chatzigianni, Athina; Halazonetis, Demetrios J
2009-10-01
Cervical vertebrae shape has been proposed as a diagnostic factor for assessing skeletal maturation in orthodontic patients. However, evaluation of vertebral shape is mainly based on qualitative criteria. Comprehensive quantitative measurements of shape and assessments of its predictive power have not been reported. Our aims were to measure vertebral shape by using the tools of geometric morphometrics and to evaluate the correlation and predictive power of vertebral shape on skeletal maturation. Pretreatment lateral cephalograms and corresponding hand-wrist radiographs of 98 patients (40 boys, 58 girls; ages, 8.1-17.7 years) were used. Skeletal age was estimated from the hand-wrist radiographs. The first 4 vertebrae were traced, and 187 landmarks (34 fixed and 153 sliding semilandmarks) were used. Sliding semilandmarks were adjusted to minimize bending energy against the average of the sample. Principal components analysis in shape and form spaces was used for evaluating shape patterns. Shape measures, alone and combined with centroid size and age, were assessed as predictors of skeletal maturation. Shape alone could not predict skeletal maturation better than chronologic age. The best prediction was achieved with the combination of form space principal components and age, giving 90% prediction intervals of approximately 200 maturation units in the girls and 300 units in the boys. Similar predictive power could be obtained by using centroid size and age. Vertebrae C2, C3, and C4 gave similar results when examined individually or combined. C1 showed lower correlations, signifying lower integration with hand-wrist maturation. Vertebral shape is strongly correlated to skeletal age but does not offer better predictive value than chronologic age.
Probabilistic Weather Information Tailored to the Needs of Transmission System Operators
NASA Astrophysics Data System (ADS)
Alberts, I.; Stauch, V.; Lee, D.; Hagedorn, R.
2014-12-01
Reliable and accurate forecasts of wind and photovoltaic (PV) power production are essential for stable transmission systems. A high potential for improving wind and PV power forecasts lies in optimizing the weather forecasts, since these energy sources are highly weather dependent. For this reason, the main objective of the German research project EWeLiNE is to improve the quality of the underlying numerical weather predictions for energy operations. In this project, the German Meteorological Service (DWD), the Fraunhofer Institute for Wind Energy and Energy System Technology, and three of the German transmission system operators (TSOs) are working together to improve the weather and power forecasts. Probabilistic predictions are of particular interest, as the quantification of uncertainties provides an important tool for risk management. Theoretical considerations suggest that it can be advantageous to use probabilistic information to represent and respond to the remaining uncertainties in the forecasts. However, it remains a challenge to integrate this information into the decision making processes related to market participation and power systems operations. The project is planned and carried out in close cooperation with the involved TSOs in order to ensure the usability of the products developed. It will conclude with a demonstration phase, in which the improved models and newly developed products are combined into a process chain and used to provide information to TSOs in a real-time decision support tool. The use of a web-based development platform enables short development cycles and agile adaptation to evolving user needs. This contribution will present the EWeLiNE project and discuss ideas on how to incorporate probabilistic information into the users' current decision making processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanderhoff, J. F.; Rao, G. V.; Stein, A.
2012-07-01
The issue of Flow Accelerated Erosion-Corrosion (FAC) in power plant piping is a known phenomenon that has resulted in material replacements and plant accidents in operating power plants. Therefore, it is important for FAC resistance to be considered in the design of new nuclear power plants. This paper describes the design considerations related to FAC that were used to develop a safe and robust AP1000® plant secondary side piping design. The primary FAC influencing factors include: fluid temperature, pipe geometry/layout, fluid chemistry, fluid velocity, pipe material composition, and moisture content (in steam lines). Due to the unknowns related to the relative impact of the influencing factors and the complexities of the interactions between these factors, it is difficult to accurately predict the expected wear rate in a given piping segment in a new plant. This paper provides: a description of FAC and the factors that influence the FAC degradation rate; an assessment of the level of FAC resistance of AP1000® secondary side system piping; an explanation of options to increase FAC resistance and the associated benefits/costs; and a discussion of the development of a tool for predicting the FAC degradation rate in new nuclear power plants. (authors)
Classical Mathematical Models for Description and Prediction of Experimental Tumor Growth
Benzekry, Sébastien; Lamont, Clare; Beheshti, Afshin; Tracz, Amanda; Ebos, John M. L.; Hlatky, Lynn; Hahnfeldt, Philip
2014-01-01
Despite internal complexity, tumor growth kinetics follow relatively simple laws that can be expressed as mathematical models. To explore this further, quantitative analysis of the most classical of these were performed. The models were assessed against data from two in vivo experimental systems: an ectopic syngeneic tumor (Lewis lung carcinoma) and an orthotopically xenografted human breast carcinoma. The goals were threefold: 1) to determine a statistical model for description of the measurement error, 2) to establish the descriptive power of each model, using several goodness-of-fit metrics and a study of parametric identifiability, and 3) to assess the models' ability to forecast future tumor growth. The models included in the study comprised the exponential, exponential-linear, power law, Gompertz, logistic, generalized logistic, von Bertalanffy and a model with dynamic carrying capacity. For the breast data, the dynamics were best captured by the Gompertz and exponential-linear models. The latter also exhibited the highest predictive power, with excellent prediction scores (≥80%) extending out as far as 12 days in the future. For the lung data, the Gompertz and power law models provided the most parsimonious and parametrically identifiable description. However, not one of the models was able to achieve a substantial prediction rate (≥70%) beyond the next day data point. In this context, adjunction of a priori information on the parameter distribution led to considerable improvement. For instance, forecast success rates went from 14.9% to 62.7% when using the power law model to predict the full future tumor growth curves, using just three data points. These results not only have important implications for biological theories of tumor growth and the use of mathematical modeling in preclinical anti-cancer drug investigations, but also may assist in defining how mathematical models could serve as potential prognostic tools in the clinic. PMID:25167199
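As an illustration of the fitting step behind these comparisons, the snippet below fits the Gompertz model to tumor-volume measurements with SciPy; the data points and starting values are invented, and the parameterization shown is one common form of the law.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, v0, alpha, beta):
    """V(t) = V0 * exp((alpha/beta) * (1 - exp(-beta * t)))."""
    return v0 * np.exp(alpha / beta * (1.0 - np.exp(-beta * t)))

t = np.array([0, 3, 6, 9, 12, 15, 18], dtype=float)          # days
v = np.array([50, 120, 280, 520, 800, 1020, 1180], float)    # mm^3, synthetic
params, _ = curve_fit(gompertz, t, v, p0=(50.0, 0.5, 0.1), maxfev=10_000)
print("V0 = %.1f, alpha = %.3f, beta = %.3f" % tuple(params))
```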
White, David B.
1991-01-01
An electrical safety device for use in power tools that is designed to automatically discontinue operation of the power tool upon physical contact of the tool with a concealed conductive material. A step down transformer is used to supply the operating power for a disconnect relay and a reset relay. When physical contact is made between the power tool and the conductive material, an electrical circuit through the disconnect relay is completed and the operation of the power tool is automatically interrupted. Once the contact between the tool and conductive material is broken, the power tool can be quickly and easily reactivated by a reset push button activating the reset relay. A remote reset is provided for convenience and efficiency of operation.
Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics
NASA Astrophysics Data System (ADS)
Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.
2006-06-01
Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.
Artificial neural networks in gynaecological diseases: current and potential future applications.
Siristatidis, Charalampos S; Chrelias, Charalampos; Pouliakis, Abraham; Katsimanis, Evangelos; Kassanos, Dimitrios
2010-10-01
Current (and probably future) practice of medicine is mostly associated with prediction and accurate diagnosis. Especially in clinical practice, there is an increasing interest in constructing and using valid models of diagnosis and prediction. Artificial neural networks (ANNs) are mathematical systems being used as a prospective tool for reliable, flexible and quick assessment. They demonstrate high power in evaluating multifactorial data, assimilating information from multiple sources and detecting subtle and complex patterns. Their capability and difference from other statistical techniques lies in performing nonlinear statistical modelling. They represent a new alternative to logistic regression, which is the most commonly used method for developing predictive models for outcomes resulting from partitioning in medicine. In combination with the other non-algorithmic artificial intelligence techniques, they provide useful software engineering tools for the development of systems in quantitative medicine. Our paper first presents a brief introduction to ANNs, then, using what we consider the best available evidence through paradigms, we evaluate the ability of these networks to serve as first-line detection and prediction techniques in some of the most crucial fields in gynaecology. Finally, through the analysis of their current application, we explore their dynamics for future use.
Lee, Bum Ju; Kim, Jong Yeol
2015-09-01
Serum high-density lipoprotein (HDL) and low-density lipoprotein (LDL) cholesterol levels are associated with risk factors for various diseases and are related to anthropometric measures. However, controversy remains regarding the best anthropometric indicators of the HDL and LDL cholesterol levels. The objectives of this study were to identify the best predictors of HDL and LDL cholesterol using statistical analyses and two machine learning algorithms and to compare the predictive power of combined anthropometric measures in Korean adults. A total of 13,014 subjects participated in this study. The anthropometric measures were assessed with binary logistic regression (LR) to evaluate statistically significant differences between the subjects with normal and high LDL cholesterol levels and between the subjects with normal and low HDL cholesterol levels. LR and the naive Bayes algorithm (NB), which provides more reasonable and reliable results, were used in the analyses of the predictive power of individual and combined measures. The best predictor of HDL was the rib to hip ratio (p ≤ 0.0001; odds ratio (OR) = 1.895; area under curve (AUC) = 0.681) in women and the waist to hip ratio (WHR) (p ≤ 0.0001; OR = 1.624; AUC = 0.633) in men. In women, the strongest indicator of LDL was age (p ≤ 0.0001; OR = 1.662; AUC by NB = 0.653; AUC by LR = 0.636). Among the anthropometric measures, the body mass index (BMI), WHR, forehead to waist ratio, forehead to rib ratio, and forehead to chest ratio were the strongest predictors of LDL; these measures had similar predictive powers. The strongest predictor in men was BMI (p ≤ 0.0001; OR = 1.369; AUC by NB = 0.594; AUC by LR = 0.595). The predictive power of almost all individual anthropometric measures was higher for HDL than for LDL, and the predictive power for both HDL and LDL was higher in women than in men. A combination of anthropometric measures slightly improved the predictive power for both HDL and LDL cholesterol. The best indicator for HDL and LDL might differ according to the type of cholesterol and the gender. In women, but not men, age was the variable that strongly predicted HDL and LDL cholesterol levels. Our findings provide new information for the development of better initial screening tools for HDL and LDL cholesterol.
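A sketch of the LR-versus-NB comparison on simulated stand-ins for the anthropometric predictors; the feature set, effect sizes, and outcome definition are all assumptions, not the study's data.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(1000, 3))              # stand-ins for e.g. BMI, WHR, age
y = (X @ np.array([0.8, 0.5, 0.9]) + rng.normal(size=1000)) > 0.5

for name, model in [("NB", GaussianNB()), ("LR", LogisticRegression())]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"AUC by {name} = {auc:.3f}")
```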
Analysis of Facial Injuries Caused by Power Tools.
Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug
2016-06-01
The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. Injuries caused by power tool accidents are an issue because they can lead to substantial costs for patients and the national insurance system. The increase in hand surgery as a consequence of power tool use, its economic impact, and the characteristics of hand injuries caused by power saws have been described previously. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data on facial injuries caused by power saws gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of these injuries. The authors found that facial injuries caused by power tools have risen continually. Such injuries are accidental, and they cause permanent facial disfigurement and functional disability. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.
Lepoivre, Cyrille; Bergon, Aurélie; Lopez, Fabrice; Perumal, Narayanan B; Nguyen, Catherine; Imbert, Jean; Puthier, Denis
2012-01-31
Deciphering gene regulatory networks by in silico approaches is a crucial step in the study of the molecular perturbations that occur in diseases. The development of regulatory maps is a tedious process requiring the comprehensive integration of various evidences scattered over biological databases. Thus, the research community would greatly benefit from having a unified database storing known and predicted molecular interactions. Furthermore, given the intrinsic complexity of the data, the development of new tools offering integrated and meaningful visualizations of molecular interactions is necessary to help users drawing new hypotheses without being overwhelmed by the density of the subsequent graph. We extend the previously developed TranscriptomeBrowser database with a set of tables containing 1,594,978 human and mouse molecular interactions. The database includes: (i) predicted regulatory interactions (computed by scanning vertebrate alignments with a set of 1,213 position weight matrices), (ii) potential regulatory interactions inferred from systematic analysis of ChIP-seq experiments, (iii) regulatory interactions curated from the literature, (iv) predicted post-transcriptional regulation by micro-RNA, (v) protein kinase-substrate interactions and (vi) physical protein-protein interactions. In order to easily retrieve and efficiently analyze these interactions, we developed InteractomeBrowser, a graph-based knowledge browser that comes as a plug-in for TranscriptomeBrowser. The first objective of InteractomeBrowser is to provide a user-friendly tool to get new insight into any gene list by providing a context-specific display of putative regulatory and physical interactions. To achieve this, InteractomeBrowser relies on a "cell compartments-based layout" that makes use of a subset of the Gene Ontology to map gene products onto relevant cell compartments. This layout is particularly powerful for visual integration of heterogeneous biological information and is a productive avenue in generating new hypotheses. The second objective of InteractomeBrowser is to fill the gap between interaction databases and dynamic modeling. It is thus compatible with the network analysis software Cytoscape and with the Gene Interaction Network simulation software (GINsim). We provide examples underlying the benefits of this visualization tool for large gene set analysis related to thymocyte differentiation. The InteractomeBrowser plugin is a powerful tool to get quick access to a knowledge database that includes both predicted and validated molecular interactions. InteractomeBrowser is available through the TranscriptomeBrowser framework and can be found at: http://tagc.univ-mrs.fr/tbrowser/. Our database is updated on a regular basis.
Adaptation and Re-Use of Spacecraft Power System Models for the Constellation Program
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Kerslake, Thomas W.; Ayres, Mark; Han, Augustina H.; Adamson, Adrian M.
2008-01-01
NASA's Constellation Program is embarking on a new era of space exploration, returning to the Moon and beyond. The Constellation architecture will consist of a number of new spacecraft elements, including the Orion crew exploration vehicle, the Altair lunar lander, and the Ares family of launch vehicles. Each of these new spacecraft elements will need an electric power system, and those power systems will need to be designed to fulfill unique mission objectives and to survive the unique environments encountered on a lunar exploration mission. As with any new spacecraft power system development, preliminary design work will rely heavily on analysis to select the proper power technologies, size the power system components, and predict the system performance throughout the required mission profile. Constellation projects have the advantage of leveraging power system modeling developments from other recent programs such as the International Space Station (ISS) and the Mars Exploration Program. These programs have developed mature power system modeling tools, which can be quickly modified to meet the unique needs of Constellation, and thus provide a rapid capability for detailed power system modeling that otherwise would not exist.
An integrated workflow for analysis of ChIP-chip data.
Weigelt, Karin; Moehle, Christoph; Stempfl, Thomas; Weber, Bernhard; Langmann, Thomas
2008-08-01
Although ChIP-chip is a powerful tool for genome-wide discovery of transcription factor target genes, the steps involving raw data analysis, identification of promoters, and correlation with binding sites are still laborious processes. Therefore, we report an integrated workflow for the analysis of promoter tiling arrays with the Genomatix ChipInspector system. We compare this tool with open-source software packages to identify PU.1 regulated genes in mouse macrophages. Our results suggest that ChipInspector data analysis, comparative genomics for binding site prediction, and pathway/network modeling significantly facilitate and enhance whole-genome promoter profiling to reveal in vivo sites of transcription factor-DNA interactions.
Computational Methods for Stability and Control (COMSAC): The Time Has Come
NASA Technical Reports Server (NTRS)
Hall, Robert M.; Biedron, Robert T.; Ball, Douglas N.; Bogue, David R.; Chung, James; Green, Bradford E.; Grismer, Matthew J.; Brooks, Gregory P.; Chambers, Joseph R.
2005-01-01
Powerful computational fluid dynamics (CFD) tools have emerged that appear to offer significant benefits as an adjunct to the experimental methods used by the stability and control community to predict aerodynamic parameters. The decreasing cost and increasing availability of computing hours are making these applications increasingly viable. This paper summarizes the efforts of four organizations to utilize high-end CFD tools to address the challenges of the stability and control arena. The general motivation and backdrop for these efforts are summarized, as well as examples of current applications.
GPS-MBA: Computational Analysis of MHC Class II Epitopes in Type 1 Diabetes
Ren, Jian; Ma, Chuang; Gao, Tianshun; Zhou, Yanhong; Yang, Qing; Xue, Yu
2012-01-01
As a severe chronic metabolic disease and autoimmune disorder, type 1 diabetes (T1D) affects millions of people worldwide. Recent advances in antigen-based immunotherapy have provided a great opportunity for further treating T1D with a high degree of selectivity. It is reported that MHC class II I-Ag7 in the non-obese diabetic (NOD) mouse and human HLA-DQ8 are strongly linked to susceptibility to T1D. Thus, the identification of new I-Ag7 and HLA-DQ8 epitopes would be of great help to further experimental and biomedical manipulation efforts. In this study, a novel GPS-MBA (MHC Binding Analyzer) software package was developed for the prediction of I-Ag7 and HLA-DQ8 epitopes. Using experimentally identified epitopes as the training data sets, a previously developed GPS (Group-based Prediction System) algorithm was adopted and improved. By extensive evaluation and comparison, the GPS-MBA performance was found to be much better than that of other tools of this type. With this powerful tool, we predicted a number of potentially new I-Ag7 and HLA-DQ8 epitopes. Furthermore, we designed a T1D epitope database (TEDB) for all of the experimentally identified and predicted T1D-associated epitopes. Taken together, these computational prediction results and analyses provide a starting point for further experimental considerations, and GPS-MBA is demonstrated to be a useful tool for generating starting information for experimentalists. The GPS-MBA is freely accessible for academic researchers at: http://mba.biocuckoo.org. PMID:22479466
Investigating market efficiency through a forecasting model based on differential equations
NASA Astrophysics Data System (ADS)
de Resende, Charlene C.; Pereira, Adriano C. M.; Cardoso, Rodrigo T. N.; de Magalhães, A. R. Bosco
2017-05-01
A new differential equation based model for stock price trend forecasting is proposed as a tool to investigate efficiency in an emerging market. Its predictive power was shown statistically to be higher than that of a completely random model, signaling the presence of arbitrage opportunities. Conditions under which accuracy can be enhanced are investigated, and the application of the model as part of a trading strategy is discussed.
Abdelkader, E H; Feintuch, A; Yao, X; Adams, L A; Aurelio, L; Graham, B; Goldfarb, D; Otting, G
2015-11-14
Quantitative cysteine-independent ligation of a Gd(3+) tag to genetically encoded p-azido-L-phenylalanine via Cu(I)-catalyzed click chemistry is shown to deliver an exceptionally powerful tool for Gd(3+)-Gd(3+) distance measurements by double electron-electron resonance (DEER) experiments, as the position of the Gd(3+) ion relative to the protein can be predicted with high accuracy.
Improved Rainfall Estimates and Predictions for 21st Century Drought Early Warning
NASA Technical Reports Server (NTRS)
Funk, Chris; Peterson, Pete; Shukla, Shraddhanand; Husak, Gregory; Landsfeld, Marty; Hoell, Andrew; Pedreros, Diego; Roberts, J. B.; Robertson, F. R.; Tadesse, Tsegae;
2015-01-01
As temperatures increase, the onset and severity of droughts are likely to become more intense. Improved tools for understanding, monitoring and predicting droughts will be a key component of 21st century climate adaptation. The best drought monitoring systems will bring together accurate precipitation estimates with skillful climate and weather forecasts. Such systems combine the predictive power inherent in the current land surface state with the predictive power inherent in low frequency ocean-atmosphere dynamics. To this end, researchers at the Climate Hazards Group (CHG), in collaboration with partners at the USGS and NASA, have developed i) a long (1981-present) quasi-global (50degS-50degN, 180degW-180degE) high resolution (0.05deg) homogeneous precipitation data set designed specifically for drought monitoring, ii) tools for understanding and predicting East African boreal spring droughts, and iii) an integrated land surface modeling (LSM) system that combines rainfall observations and predictions to provide effective drought early warning. This talk briefly describes these three components. Component 1: CHIRPS. The Climate Hazards group InfraRed Precipitation with Stations (CHIRPS) data set blends station data with geostationary satellite observations to provide global near-real-time daily, pentadal and monthly precipitation estimates. We describe the CHIRPS algorithm and compare CHIRPS and other estimates to validation data. The CHIRPS is shown to have high correlation, low systematic errors (bias) and low mean absolute errors. Component 2: Hybrid statistical-dynamic forecast strategies. East African droughts have increased in frequency, but have become more predictable as Indo-Pacific SST gradients and Walker circulation disruptions intensify. We describe hybrid statistical-dynamic forecast strategies that are far superior to the raw output of coupled forecast models. These forecasts can be translated into probabilities that can be used to generate bootstrapped ensembles describing future climate conditions. Component 3: Assimilation using LSMs. CHIRPS rainfall observations (component 1) and bootstrapped forecast ensembles (component 2) can be combined using LSMs to predict soil moisture deficits. We evaluate the skill of such a system in East Africa, and demonstrate results for 2013.
Predicting rates of isotopic turnover across the animal kingdom: a synthesis of existing data.
Thomas, Stephen M; Crowther, Thomas W
2015-05-01
The stable isotopes of carbon ((12)C, (13)C) and nitrogen ((14)N, (15)N) represent powerful tools in food web ecology, providing a wide range of dietary information in animal consumers. However, identifying the temporal window over which a consumer's isotopic signature reflects its diet requires an understanding of elemental incorporation, a process that varies from days to years across species and tissue types. Though theory predicts body size and temperature are likely to control incorporation rates, this has not been tested empirically across a morphologically and phylogenetically diverse range of taxa. Readily available estimates of this relationship would, however, aid in the design of stable isotope food web investigations and improve the interpretation of isotopic data collected from natural systems. Using literature-derived turnover estimates from animal species ranging in size from 1 mg to 2000 kg, we develop a predictive tool for stable isotope ecologists, allowing for estimation of incorporation rates in the structural tissues of entirely novel taxa. In keeping with metabolic scaling theory, we show that isotopic turnover rates of carbon and nitrogen in whole organisms and muscle tissue scale allometrically with body mass raised approximately to the power -0.19, an effect modulated by body temperature. This relationship did not, however, apply to incorporation rates in splanchnic tissues, which were instead dependent on the thermoregulation tactic employed by an organism, being considerably faster in endotherms than ectotherms. We believe the predictive turnover equations we provide can improve the design of experiments and interpretation of results obtained in future stable isotopic food web studies. © 2014 The Authors. Journal of Animal Ecology © 2014 British Ecological Society.
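Only the approximate -0.19 mass exponent and the direction of the temperature effect come from the text above; the coefficient and the Q10-style temperature term in this sketch are hypothetical placeholders, so the printed numbers illustrate the scaling shape, not published rates.

```python
import math

def turnover_rate(mass_g, body_temp_c, a=0.05, q10=2.0):
    """Per-day turnover rate ~ a * M**-0.19, modulated by body temperature.
    a and q10 are illustrative placeholders, not fitted values."""
    return a * mass_g**-0.19 * q10**((body_temp_c - 20.0) / 10.0)

# spanning roughly the 1 mg to 2000 kg range mentioned above
for mass, temp in [(1e-3, 15.0), (500.0, 37.0), (2e6, 37.0)]:
    half_life = math.log(2) / turnover_rate(mass, temp)
    print(f"mass {mass:g} g, {temp:.0f} C: half-life ~ {half_life:.1f} days")
```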
Comparison of Non-Parabolic Hydrodynamic Simulations for Semiconductor Devices
NASA Technical Reports Server (NTRS)
Smith, A. W.; Brennan, K. F.
1996-01-01
Parabolic drift-diffusion simulators are common engineering-level design tools for semiconductor devices. Hydrodynamic simulators, based on the parabolic band approximation, are becoming more prevalent as device dimensions shrink and energy transport effects begin to dominate device characteristics. However, band structure effects present in state-of-the-art devices necessitate relaxing the parabolic band approximation. This paper presents simulations of ballistic diodes, a benchmark device, of Si and GaAs using two different non-parabolic hydrodynamic formulations. The first formulation uses the Kane dispersion relationship in the derivation of the conservation equations. The second model uses a power law dispersion relation, (hbar k)^2/2m = xW^y. Current-voltage relations show that for the ballistic diodes considered, the non-parabolic formulations predict less current than the parabolic case. Explanations of this will be provided by examination of velocity and energy profiles. At low bias, the simulations based on the Kane formulation predict greater current flow than the power law formulation. As the bias is increased this trend changes and the power law predicts greater current than the Kane formulation. It will be shown that the non-parabolicity and energy range of the hydrodynamic model based on the Kane dispersion relation are limited due to the binomial approximation which was utilized in the derivation.
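For readers comparing the two formulations, the following sketch evaluates the carrier energy W(k) under the parabolic, Kane, and power-law dispersion relations side by side. The non-parabolicity parameter alpha and the power-law coefficients x and y are illustrative values, not those used in the paper.

```python
# A sketch comparing three dispersion relations: parabolic, Kane
# (W(1 + alpha*W) = (hbar k)^2 / 2m), and power law ((hbar k)^2 / 2m = x*W^y).
# alpha, x, y are illustrative, not the paper's values.
import numpy as np

HBAR = 1.054571817e-34   # J s
M0 = 9.1093837015e-31    # kg
Q = 1.602176634e-19      # J per eV

m_eff = 0.067 * M0       # GaAs-like effective mass
alpha = 0.64             # Kane non-parabolicity parameter (1/eV), illustrative
x, y = 1.0, 1.3          # power-law coefficients, illustrative

k = np.linspace(1e8, 1e9, 5)                  # wavevectors in 1/m
lhs_eV = (HBAR * k) ** 2 / (2 * m_eff) / Q    # (hbar k)^2 / 2m in eV

W_parabolic = lhs_eV                          # W = (hbar k)^2 / 2m
# Kane: alpha*W^2 + W - lhs = 0, take the positive root
W_kane = (-1 + np.sqrt(1 + 4 * alpha * lhs_eV)) / (2 * alpha)
W_power = (lhs_eV / x) ** (1 / y)             # from x * W^y = (hbar k)^2 / 2m

for row in zip(k, W_parabolic, W_kane, W_power):
    print("k=%.2e  parab=%.3f eV  kane=%.3f eV  power=%.3f eV" % row)
```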
NASA Astrophysics Data System (ADS)
Salmaso, Veronica; Sturlese, Mattia; Cuzzolin, Alberto; Moro, Stefano
2018-01-01
Molecular docking is a powerful tool in the field of computer-aided molecular design. In particular, it is the technique of choice for the prediction of a ligand pose within its target binding site. A multitude of docking methods is available nowadays, whose performance may vary depending on the data set. Therefore, some non-trivial choices should be made before starting a docking simulation. In the same framework, the selection of the target structure to use can be challenging, since the number of available experimental structures is increasing. Both issues have been explored within this work. The pose prediction of a pool of 36 compounds provided by the D3R Grand Challenge 2 organizers was preceded by a pipeline to choose the best protein/docking-method couple for each blind ligand. An integrated benchmark approach including ligand shape comparison and cross-docking evaluations was implemented inside our DockBench software. The results are encouraging and show that careful attention to the fundamental components of a docking simulation improves the results of binding mode predictions.
Gómez-Banoy, Nicolás; Cuevas, Virginia; Soler, Fernando; Pineda, Maria Fernanda; Mockus, Ismena
2017-01-01
This cross-sectional study intended to evaluate two bedside tests (Neuropad and VibraTip) as screening tools for distal symmetrical polyneuropathy (DSPN) in Latin American patients with type 2 diabetes mellitus (T2D). Ninety-three Colombian patients diagnosed with T2D were recruited. Anthropometric variables, glycemic control parameters, lipid profile and renal function were assessed for each patient. DSPN was defined by a Michigan Neuropathy Screening Instrument (MNSI) clinical score greater than 2. Both Neuropad and VibraTip tests were applied to each patient. Contingency analyses were performed to evaluate the diagnostic power of both tools. The prevalence of DSPN determined clinically by MNSI was 25.8%. DSPN in these patients was associated with age, worsening renal function, and insulin treatment. The sensitivity and specificity of the Neuropad test for DSPN were 66.6% and 63%, respectively. Its negative predictive value (NPV) was 84.6%. The VibraTip test exhibited a sensitivity of 54.1% and specificity of 91.3%, with an NPV of 85.1%. Neuropad and VibraTip are reliable screening tools for DSPN in a Latin American population. VibraTip presents considerable diagnostic power for DSPN in this population. Further studies regarding the cost-effectiveness of these tools in clinical practice are needed.
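The reported test properties follow from standard 2x2 contingency-table arithmetic, sketched below; the counts are hypothetical, chosen only to illustrate the calculation, not the study's data.

```python
# A minimal sketch of the contingency-table arithmetic behind sensitivity,
# specificity, PPV and NPV; the counts below are hypothetical.
def test_properties(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical 2x2 table for a bedside screen vs. the MNSI reference
print(test_properties(tp=16, fp=25, fn=8, tn=44))
```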
U.S. Geological Survey science for the Wyoming Landscape Conservation Initiative—2014 annual report
Bowen, Zachary H.; Aldridge, Cameron L.; Anderson, Patrick J.; Assal, Timothy J.; Bartos, Timothy T.; Biewick, Laura R; Boughton, Gregory K.; Chalfoun, Anna D.; Chong, Geneva W.; Dematatis, Marie K.; Eddy-Miller, Cheryl A.; Garman, Steven L.; Germaine, Stephen S.; Homer, Collin G.; Huber, Christopher; Kauffman, Matthew J.; Latysh, Natalie; Manier, Daniel; Melcher, Cynthia P.; Miller, Alexander; Miller, Kirk A.; Olexa, Edward M.; Schell, Spencer; Walters, Annika W.; Wilson, Anna B.; Wyckoff, Teal B.
2015-01-01
Finally, capabilities of the WLCI Web site and the USGS ScienceBase infrastructure were maintained and upgraded to help ensure access to and efficient use of all the WLCI data, products, assessment tools, and outreach materials that have been developed. Of particular note is the completion of three Web applications developed for mapping (1) the 1900−2008 progression of oil and gas development; (2) the predicted distributions of Wyoming's Species of Greatest Conservation Need; and (3) the locations of coal and wind energy production, sage-grouse distribution and core management areas, and alternative routes for transmission lines within the WLCI region. Collectively, these applications provide WLCI planners and managers with powerful tools for better understanding the distributions of wildlife species and potential alternatives for energy development.
Liu, Yun; Scirica, Benjamin M; Stultz, Collin M; Guttag, John V
2016-10-06
Frequency domain measures of heart rate variability (HRV) are associated with adverse events after a myocardial infarction. However, patterns in the traditional frequency domain (measured in Hz, or cycles per second) may capture different cardiac phenomena at different heart rates. An alternative is to consider frequency with respect to heartbeats, or beatquency. We compared the use of frequency and beatquency domains to predict patient risk after an acute coronary syndrome. We then determined whether machine learning could further improve the predictive performance. We first evaluated the use of pre-defined frequency and beatquency bands in a clinical trial dataset (N = 2302) for the HRV risk measure LF/HF (the ratio of low frequency to high frequency power). Relative to frequency, beatquency improved the ability of LF/HF to predict cardiovascular death within one year (Area Under the Curve, or AUC, of 0.730 vs. 0.704, p < 0.001). Next, we used machine learning to learn frequency and beatquency bands with optimal predictive power, which further improved the AUC for beatquency to 0.753 (p < 0.001), but not for frequency. Results in additional validation datasets (N = 2255 and N = 765) were similar. Our results suggest that beatquency and machine learning provide valuable tools in physiological studies of HRV.
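As a point of reference for the frequency-domain baseline the paper starts from, a minimal sketch of the conventional LF/HF computation follows, using a simulated RR-interval series and the standard band edges; the beatquency variant would instead express frequency per heartbeat, which is not reproduced here.

```python
# A sketch of conventional frequency-domain LF/HF from an evenly resampled
# RR-interval series. The series is simulated; band edges are the standard ones.
import numpy as np
from scipy.signal import welch

fs = 4.0                                   # Hz, resampled RR-interval series
t = np.arange(0, 300, 1 / fs)
rr = 0.8 + 0.02 * np.sin(2 * np.pi * 0.10 * t) \
         + 0.01 * np.sin(2 * np.pi * 0.25 * t)   # LF and HF components

f, psd = welch(rr - rr.mean(), fs=fs, nperseg=256)

def band_power(f, psd, lo, hi):
    m = (f >= lo) & (f < hi)
    return np.trapz(psd[m], f[m])          # integrate PSD over the band

lf = band_power(f, psd, 0.04, 0.15)        # low-frequency power
hf = band_power(f, psd, 0.15, 0.40)        # high-frequency power
print("LF/HF =", lf / hf)
```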
A simulation of cross-country skiing on varying terrain by using a mathematical power balance model
Moxnes, John F; Sandbakk, Øyvind; Hausken, Kjell
2013-01-01
The current study simulated cross-country skiing on varying terrain by using a power balance model. By applying the hypothetical inductive deductive method, we compared the simulated position along the track with actual skiing on snow, and calculated the theoretical effect of friction and air drag on skiing performance. As input values in the model, air drag and friction were estimated from the literature, whereas the model included relationships between heart rate, metabolic rate, and work rate based on the treadmill roller-ski testing of an elite cross-country skier. We verified this procedure by testing four models of metabolic rate against experimental data on the treadmill. The experimental data corresponded well with the simulations, with the best fit when work rate was increased on uphill and decreased on downhill terrain. The simulations predicted that skiing time increases by 3%–4% when either friction or air drag increases by 10%. In conclusion, the power balance model was found to be a useful tool for predicting how various factors influence racing performance in cross-country skiing. PMID:24379718
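A minimal power-balance sketch of the kind the abstract describes is shown below: metabolic power output is balanced against friction, air drag, gravity, and the change in kinetic energy. All parameter values are illustrative, not those estimated in the study.

```python
# A minimal power-balance sketch under stated assumptions: given power P,
# speed evolves so that P = (friction + drag + gravity) * v + d/dt(KE).
# Parameter values are illustrative, not the paper's.
import numpy as np

m, g = 80.0, 9.81    # skier mass (kg), gravity (m/s^2)
mu = 0.04            # snow friction coefficient (illustrative)
cda = 0.55           # drag area Cd*A (m^2, illustrative)
rho = 1.2            # air density (kg/m^3)

def step(v, P, slope, dt=0.1):
    """Advance speed v one time step on a slope (radians) at power P (W)."""
    f_res = mu * m * g * np.cos(slope) + 0.5 * rho * cda * v**2 \
            + m * g * np.sin(slope)
    a = (P / max(v, 0.5) - f_res) / m      # propulsive force = P / v
    return v + a * dt

v, x = 3.0, 0.0
for _ in range(600):                       # 60 s on a 3% uphill at 350 W
    v = step(v, P=350.0, slope=np.arctan(0.03))
    x += v * 0.1
print(f"speed {v:.2f} m/s, distance {x:.0f} m")
```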
Jet Mixing Noise Scaling Laws SHJAR Data Vs. Predictions
NASA Technical Reports Server (NTRS)
Khavaran, Abbas; Bridges, James
2008-01-01
High quality jet noise spectral data measured at the anechoic dome at the NASA Glenn Research Center is used to examine a number of jet noise scaling laws. Configurations considered in the present study consist of convergent as well as convergent-divergent axisymmetric nozzles. The spectral measurements are shown in narrow band and cover 8193 equally spaced points in a typical Strouhal number range of 0.01-10.0. Measurements are reported as lossless (i.e., atmospheric attenuation is added to as-measured data), and at 24 equally spaced angles (50deg to 165deg) on a 100-diameter arc. Following the work of Viswanathan [Ref. 1], velocity power laws are derived using a least-squares fit on spectral power density as a function of jet temperature and observer angle. The goodness of the fit is studied at each angle, and alternative relationships are proposed to improve the spectral collapse when certain conditions are met. On the application side, power laws are extremely useful in identifying components from various noise generation mechanisms. From this analysis, jet noise prediction tools can be developed with physics derived from the different spectral components.
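The least-squares derivation of a velocity power law can be sketched in a few lines: regress overall sound pressure level on 10 log10 of the velocity ratio at a fixed angle. The data points below are synthetic placeholders, chosen only to be consistent with a roughly V^8 law.

```python
# A sketch of extracting a velocity power law by least squares:
# OASPL ~ n * 10*log10(Vj/c) + const at a fixed observer angle.
# The data points are synthetic placeholders.
import numpy as np

v_ratio = np.array([0.5, 0.6, 0.7, 0.8, 0.9, 1.0])  # jet velocity / sound speed
oaspl = np.array([95.2, 101.5, 106.9, 111.6, 115.8, 119.5])  # dB, synthetic

x = 10 * np.log10(v_ratio)
n, c = np.polyfit(x, oaspl, 1)     # slope n is the velocity exponent
resid = oaspl - (n * x + c)
print(f"velocity exponent n ~ {n:.1f}, "
      f"rms residual {np.sqrt(np.mean(resid**2)):.2f} dB")
```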
Performance of Reclassification Statistics in Comparing Risk Prediction Models
Paynter, Nina P.
2012-01-01
Concerns have been raised about the use of traditional measures of model fit in evaluating risk prediction models for clinical use, and reclassification tables have been suggested as an alternative means of assessing the clinical utility of a model. Several measures based on the table have been proposed, including the reclassification calibration (RC) statistic, the net reclassification improvement (NRI), and the integrated discrimination improvement (IDI), but the performance of these in practical settings has not been fully examined. We used simulations to estimate the type I error and power for these statistics in a number of scenarios, as well as the impact of the number and type of categories, when adding a new marker to an established or reference model. The type I error was found to be reasonable in most settings, and power was highest for the IDI, which was similar to the test of association. The relative power of the RC statistic, a test of calibration, and the NRI, a test of discrimination, varied depending on the model assumptions. These tools provide unique but complementary information. PMID:21294152
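For concreteness, a minimal sketch of the categorical NRI computation on synthetic risk predictions follows; the cutoffs and data are hypothetical, and the RC statistic and IDI are not reproduced here.

```python
# A minimal sketch of the categorical net reclassification improvement (NRI)
# from predicted risks of a reference and an extended model; data are synthetic.
import numpy as np

def nri(y, p_old, p_new, cutoffs=(0.05, 0.20)):
    """NRI = net upward reclassification in events minus non-events."""
    bins = lambda p: np.digitize(p, cutoffs)          # risk category index
    up = bins(p_new) > bins(p_old)
    down = bins(p_new) < bins(p_old)
    ev, ne = (y == 1), (y == 0)
    return (up[ev].mean() - down[ev].mean()) - (up[ne].mean() - down[ne].mean())

rng = np.random.default_rng(0)
y = rng.binomial(1, 0.1, 2000)
p_old = np.clip(0.10 + 0.05 * rng.standard_normal(2000), 0.01, 0.99)
p_new = np.clip(p_old + 0.04 * (y - 0.1)
                + 0.01 * rng.standard_normal(2000), 0.01, 0.99)
print("NRI =", round(nri(y, p_old, p_new), 3))
```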
NASA Astrophysics Data System (ADS)
Haack, Lukas; Peniche, Ricardo; Sommer, Lutz; Kather, Alfons
2017-06-01
At early project stages, the main CSP plant design parameters such as turbine capacity, solar field size, and thermal storage capacity are varied during the techno-economic optimization to determine the most suitable plant configurations. In general, a typical meteorological year with at least hourly time resolution is used to analyze each plant configuration. Different software tools are available to simulate the annual energy yield. Software tools offering a thermodynamic modeling approach of the power block and the CSP thermal cycle, such as EBSILONProfessional®, allow a flexible definition of plant topologies. In EBSILON, the thermodynamic equilibrium for each time step is calculated iteratively (quasi steady state), which requires approximately 45 minutes to process one year with hourly time resolution. For better representation of gradients, 10 min time resolution is recommended, which increases processing time by a factor of 5. Therefore, when analyzing the large number of plant sensitivities required during the techno-economic optimization procedure, the detailed thermodynamic simulation approach becomes impracticable. Suntrace has developed an in-house CSP simulation tool (CSPsim), based on EBSILON and applying predictive models, to approximate CSP plant performance for central receiver and parabolic trough technology. CSPsim increases the speed of energy yield calculations by a factor of 35 or more and has automated the simulation runs of all predefined design configurations in sequential order during the optimization procedure. To develop the predictive models, multiple linear regression techniques and Design of Experiments methods are applied. The annual energy yield and derived LCOE calculated by the predictive model deviate less than ±1.5% from the thermodynamic simulation in EBSILON and effectively identify the optimal range of main design parameters for further, more specific analysis.
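The surrogate-modeling idea can be sketched compactly: run a designed set of slow simulations once, then fit a multiple linear regression that answers subsequent design queries almost instantly. The "simulator" below is a crude stand-in for EBSILON, and all numbers are illustrative.

```python
# A sketch of the surrogate-model idea: fit a multiple linear regression on a
# designed set of slow simulator runs. slow_simulator is a placeholder, not EBSILON.
import numpy as np
from itertools import product

def slow_simulator(sm, tes, cap):
    # placeholder stand-in for a full thermodynamic run (illustrative physics)
    return 2.1 * cap * sm * (1 - np.exp(-tes / 10.0)) / (1 + 0.1 * sm)

# Design of Experiments: full factorial over solar multiple, storage hours, capacity
grid = list(product([1.5, 2.0, 2.5], [4, 8, 12], [50, 100, 150]))
X = np.array([(sm, tes, cap, sm * tes) for sm, tes, cap in grid])  # + interaction
y = np.array([slow_simulator(sm, tes, cap) for sm, tes, cap in grid])

A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # ordinary least squares
pred = A @ coef
print("max relative error: %.2f%%" % (100 * np.max(np.abs(pred - y) / y)))
```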
Hoshi, Masayuki; Hozawa, Atsushi; Kuriyama, Shinichi; Nakaya, Naoki; Ohmori-Matsuda, Kaori; Sone, Toshimasa; Kakizaki, Masako; Niu, Kaijun; Fujita, Kazuki; Ueki, Shouzoh; Haga, Hiroshi; Nagatomi, Ryoichi; Tsuji, Ichiro
2012-08-01
To compare the predictive power of physical function assessed by questionnaire and physical performance measures for subsequent disability in community-dwelling elderly persons. Prospective cohort study. Participants were 813 community-dwelling elderly Japanese aged 70 years and older, included in the Tsurugaya Project, who were not disabled at the baseline in 2003. Physical function was assessed by the questionnaire of the Motor Fitness Scale. Physical performance measures consisted of maximum walking velocity, timed up and go test (TUG), leg extension power, and functional reach test. The area under the curve (AUC) of the receiver operating characteristic curve for disability was used to compare screening accuracy between the Motor Fitness Scale and physical performance measures. Incident disability, defined as certification for long-term care insurance, was used as the endpoint. We observed 135 cases of incident disability during follow-up. The third or fourth quartile for each measure was associated with a significantly increased risk of disability in comparison with the highest quartile. The AUC was 0.70, 0.72, 0.70, 0.68, 0.69 and 0.74 for the Motor Fitness Scale, maximum walking velocity, TUG, leg extension power, functional reach test, and total performance score, respectively. The predictive power of physical function assessed by the Motor Fitness Scale was equivalent to that assessed by physical performance measures. Since the Motor Fitness Scale can evaluate physical function safely and simply in comparison with physical performance tests, it would be a practical tool for screening persons at high risk of disability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pennock, Kenneth; Makarov, Yuri V.; Rajagopal, Sankaran
The need for proactive closed-loop integration of uncertainty information into system operations and probability-based controls is widely recognized, but rarely implemented in system operations. Proactive integration for this project means that information concerning expected uncertainty ranges for net load and balancing requirements, including required balancing capacity, ramping and ramp duration characteristics, will be fed back into the generation commitment and dispatch algorithms to modify their performance so that potential shortages of these characteristics can be prevented. This basic, yet important, premise is the motivating factor for this project. The achieved project goal is to demonstrate the benefit of such a system. The project quantifies future uncertainties, predicts additional system balancing needs including the prediction intervals for capacity and ramping requirements of future dispatch intervals, evaluates the impacts of uncertainties on transmission including the risk of overloads and voltage problems, and explores opportunities for intra-hour generation adjustments helping to provide more flexibility for system operators. The resulting benefits culminate in more reliable grid operation in the face of increased system uncertainty and variability caused by solar power. The project identifies that solar power does not require special separate penetration level restrictions or penalization for its intermittency. Ultimately, the collective consideration of all sources of intermittency distributed over a wide area, unified with the comprehensive evaluation of various elements of the balancing process, i.e., capacity, ramping, and energy requirements, helps system operators more robustly and effectively balance generation against load and interchange. This project showed that doing so can facilitate more solar and other renewable resources on the grid without compromising reliability and control performance. Efforts during the project included developing and integrating advanced probabilistic solar forecasts, including distributed PV forecasts, into closed-loop decision making processes. Additionally, new uncertainty quantification methods and tools for the direct integration of uncertainty and variability information into grid operations at the transmission and distribution levels were developed and tested. During Phase 1, project work focused heavily on the design, development and demonstration of a set of processes and tools that could reliably and efficiently incorporate solar power into California's grid operations. In Phase 2, connectivity between the ramping analysis tools and market applications software was completed, multiple dispatch scenarios demonstrated a successful reduction of overall uncertainty, an analysis quantified increases in system operator reliability, and the transmission and distribution system uncertainty prediction tool was introduced to system operation engineers in a live webinar. The project met its goals; the experiments prove that the advancements to methods and tools, when working together, are beneficial not only to the California Independent System Operator but also transferable to other system operators in the United States.
Integrated CFD modeling of gas turbine combustors
NASA Technical Reports Server (NTRS)
Fuller, E. J.; Smith, C. E.
1993-01-01
3D, curvilinear, multi-domain CFD analysis is becoming a valuable tool in gas turbine combustor design. Used as a supplement to experimental testing, CFD analysis can provide improved understanding of combustor aerodynamics and can be used to qualitatively assess new combustor designs. This paper discusses recent advancements in CFD combustor methodology, including the timely integration of the design (i.e., CAD) and analysis (i.e., CFD) processes. AlliedSignal's F124 combustor was analyzed at maximum power conditions. The assumption of turbulence levels at the nozzle/swirler inlet was shown to be very important in the prediction of combustor exit temperatures. Predicted exit temperatures were compared to experimental rake data, and good overall agreement was seen. Exit radial temperature profiles were well predicted, while the predicted pattern factor was 25 percent higher than the harmonic-averaged experimental pattern factor.
NASA Technical Reports Server (NTRS)
Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.
1980-01-01
The rigid lid model was developed to predict three-dimensional temperature and velocity distributions in lakes. This model was verified at various sites (Lake Belews, Biscayne Bay, etc.) and the verification at Lake Keowee was the last of this series of verification runs. The verification at Lake Keowee included the following: (1) selecting the domain of interest, grid systems, and comparing the preliminary results with archival data; (2) obtaining actual ground truth and infrared scanner data both for summer and winter; and (3) using the model to predict the measured data for the above periods and comparing the predicted results with the actual data. The model results compared well with measured data. Thus, the model can be used as an effective predictive tool for future sites.
NASA Technical Reports Server (NTRS)
Potteiger, Timothy R.; Eure, Kenneth W.; Levenstein, David
2017-01-01
Prediction methods concerning remaining charge in lithium-ion batteries that power unmanned aerial vehicles are of critical concern for the safe fulfillment of mission objectives. In recent years, lithium-ion batteries have been the power source for both fixed wing and vertical lift electric vehicles. The purpose of this document is to describe in detail the implementation of a battery health monitor for estimating the state of charge of a lithium-ion battery and a lithium-ion polymer battery that is used to power a vertical lift aircraft test-bed. It will be demonstrated that an electro-chemistry based state of charge estimator effectively tracks battery discharge characteristics and may be employed as a useful tool in monitoring battery health.
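By way of contrast with the electro-chemistry based estimator described, the simplest possible state-of-charge tracker is a coulomb counter, sketched below; the capacity and current profile are made-up values used only to illustrate the bookkeeping.

```python
# A minimal coulomb-counting sketch for state-of-charge tracking. This is a
# simpler technique than the paper's electro-chemistry based estimator; the
# capacity and current trace are made-up values.
def track_soc(currents_a, dt_s, capacity_ah, soc0=1.0):
    """Integrate discharge current (A, positive = discharge) into SoC."""
    soc = soc0
    for i in currents_a:
        soc -= i * dt_s / (capacity_ah * 3600.0)   # Ah drawn / Ah capacity
        soc = min(max(soc, 0.0), 1.0)              # clamp to [0, 1]
        yield soc

currents = [15.0] * 60 + [25.0] * 30 + [10.0] * 60   # a hover-heavy profile
for k, soc in enumerate(track_soc(currents, dt_s=10.0, capacity_ah=5.0)):
    if k % 50 == 0:
        print(f"t={10*k:4.0f} s  SoC={soc:.2f}")
```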
Aircraft Particle Emissions eXperiment (APEX)
NASA Technical Reports Server (NTRS)
Wey, C. C.; Anderson, B. E.; Hudgins, C.; Wey, C.; Li-Jones, X.; Winstead, E.; Thornhill, L. K.; Lobo, P.; Hagen, D.; Whitefield, P.
2006-01-01
APEX systematically investigated the gas-phase and particle emissions from a CFM56-2C1 engine on NASA's DC-8 aircraft as functions of engine power, fuel composition, and exhaust plume age. Emissions parameters were measured at 11 engine power settings, ranging from idle to maximum thrust, in samples collected at 1, 10, and 30 m downstream of the exhaust plane as the aircraft burned three fuels to stress relevant chemistry. Gas-phase emission indices measured at 1 m were in good agreement with the ICAO data and predictions provided by GEAE empirical modeling tools. Soot particles emitted by the engine exhibited a log-normal size distribution peaked between 15 and 40 nm, depending on engine power. Samples collected 30 m downstream of the engine exhaust plane exhibited a prominent nucleation mode.
Protein function prediction--the power of multiplicity.
Rentzsch, Robert; Orengo, Christine A
2009-04-01
Advances in experimental and computational methods have quietly ushered in a new era in protein function annotation. This 'age of multiplicity' is marked by the notion that only the use of multiple tools, multiple evidence and considering the multiple aspects of function can give us the broad picture that 21st century biology will need to link and alter micro- and macroscopic phenotypes. It might also help us to undo past mistakes by removing errors from our databases and prevent us from producing more. On the downside, multiplicity is often confusing. We therefore systematically review methods and resources for automated protein function prediction, looking at individual (biochemical) and contextual (network) functions, respectively.
An approach to adjustment of relativistic mean field model parameters
NASA Astrophysics Data System (ADS)
Bayram, Tuncay; Akkoyun, Serkan
2017-09-01
The Relativistic Mean Field (RMF) model with a small number of adjusted parameters is a powerful tool for correct predictions of various ground-state properties of nuclei. Its success in describing nuclear properties is directly related to the adjustment of its parameters using experimental data. In the present study, the Artificial Neural Network (ANN) method, which mimics brain functionality, has been employed for improvement of the RMF model parameters. In particular, the ANN method's ability to capture the relations between the RMF model parameters and their predictions for the binding energies (BEs) of 58Ni and 208Pb has been demonstrated, with results in agreement with literature values.
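A minimal sketch of the ANN idea follows: fit a small feed-forward network mapping parameter sets to a predicted observable, then query it cheaply during parameter adjustment. The training data here are random placeholders rather than actual RMF calculations.

```python
# A sketch of the ANN surrogate idea under stated assumptions: learn the map
# from model parameters to a predicted observable. Training data are random
# placeholders, not RMF calculations.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
params = rng.uniform(-1, 1, size=(200, 7))   # 7 scaled model parameters
# stand-in "binding energy" response with a mild nonlinearity
be = params @ rng.uniform(-2, 2, 7) + 0.3 * np.sin(3 * params[:, 0])

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(params[:150], be[:150])              # train on 150 parameter sets
print("held-out R^2:", round(net.score(params[150:], be[150:]), 3))
```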
Breast magnetic resonance elastography: a review of clinical work and future perspectives.
Bohte, A E; Nelissen, J L; Runge, J H; Holub, O; Lambert, S A; de Graaf, L; Kolkman, S; van der Meij, S; Stoker, J; Strijkers, G J; Nederveen, A J; Sinkus, R
2018-05-30
This review on magnetic resonance elastography (MRE) of the breast provides an overview of available literature and describes current developments in the field of breast MRE, including new transducer technology for data acquisition and multi-frequency-derived power-law behaviour of tissue. Moreover, we discuss the future potential of breast MRE, which goes beyond its original application as an additional tool in differentiating benign from malignant breast lesions. These areas of ongoing and future research include MRE for pre-operative tumour delineation, staging, monitoring and predicting response to treatment, as well as prediction of the metastatic potential of primary tumours. Copyright © 2018 John Wiley & Sons, Ltd.
The Realization of Drilling Fault Diagnosis Based on Hybrid Programming with Matlab and VB
NASA Astrophysics Data System (ADS)
Wang, Jiangping; Hu, Yingcai
This paper presents a method using hybrid programming with Matlab and VB, based on ActiveX, to design a system for drilling accident prediction and diagnosis, so that the powerful computational and graphical display functions of Matlab are fully combined with the visual development interface of VB. The main interface of the diagnosis system is compiled in VB, and the analysis and fault diagnosis are implemented by the neural network toolboxes in Matlab. The system has a user-friendly interactive interface, and validation against fault examples shows that the diagnosis results are feasible and can meet the demands of drilling accident prediction and diagnosis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Xiao; Blazek, Jonathan A.; McEwen, Joseph E.
Cosmological perturbation theory is a powerful tool to predict the statistics of large-scale structure in the weakly non-linear regime, but even at 1-loop order it results in computationally expensive mode-coupling integrals. Here we present a fast algorithm for computing 1-loop power spectra of quantities that depend on the observer's orientation, thereby generalizing the FAST-PT framework (McEwen et al., 2016) that was originally developed for scalars such as the matter density. This algorithm works for an arbitrary input power spectrum and substantially reduces the time required for numerical evaluation. We apply the algorithm to four examples: intrinsic alignments of galaxies in the tidal torque model; the Ostriker-Vishniac effect; the secondary CMB polarization due to baryon flows; and the 1-loop matter power spectrum in redshift space. Code implementing this algorithm and these applications is publicly available at https://github.com/JoeMcEwen/FAST-PT.
Operations of the External Conjugate-T Matching System for the A2 ICRH Antennas at JET
NASA Astrophysics Data System (ADS)
Monakhov, I.; Graham, M.; Blackman, T.; Mayoral, M.-L.; Nightingale, M.; Sheikh, H.; Whitehurst, A.
2009-11-01
The External Conjugate-T (ECT) matching system was successfully commissioned on two A2 ICRH antennas at JET in 2009. The system allows trip-free injection of RF power into ELMy H-mode plasmas in the 32-52 MHz band without antenna phasing restrictions. The ECT demonstrates robust and predictable performance and high load-tolerance during routine operations, injecting up to 4 MW average power into H-mode plasma with Type-I ELMs. The total power coupled to ELMy plasma by all the A2 antennas using the ECT and 3dB systems has been increased to 7 MW. Antenna arcing during ELMs has been identified as a new challenge to high-power ICRH operations in H-mode plasma. The implemented Advanced Wave Amplitude Comparison System (AWACS) has proven to be an efficient protection tool for the ECT scheme.
Lomnitz, Jason G.; Savageau, Michael A.
2016-01-01
Mathematical models of biochemical systems provide a means to elucidate the link between the genotype, environment, and phenotype. A subclass of mathematical models, known as mechanistic models, quantitatively describe the complex non-linear mechanisms that capture the intricate interactions between biochemical components. However, the study of mechanistic models is challenging because most are analytically intractable and involve large numbers of system parameters. Conventional methods to analyze them rely on local analyses about a nominal parameter set and they do not reveal the vast majority of potential phenotypes possible for a given system design. We have recently developed a new modeling approach that does not require estimated values for the parameters initially and inverts the typical steps of the conventional modeling strategy. Instead, this approach relies on architectural features of the model to identify the phenotypic repertoire and then predict values for the parameters that yield specific instances of the system that realize desired phenotypic characteristics. Here, we present a collection of software tools, the Design Space Toolbox V2 based on the System Design Space method, that automates (1) enumeration of the repertoire of model phenotypes, (2) prediction of values for the parameters for any model phenotype, and (3) analysis of model phenotypes through analytical and numerical methods. The result is an enabling technology that facilitates this radically new, phenotype-centric, modeling approach. We illustrate the power of these new tools by applying them to a synthetic gene circuit that can exhibit multi-stability. We then predict values for the system parameters such that the design exhibits 2, 3, and 4 stable steady states. In one example, inspection of the basins of attraction reveals that the circuit can count between three stable states by transient stimulation through one of two input channels: a positive channel that increases the count, and a negative channel that decreases the count. This example shows the power of these new automated methods to rapidly identify behaviors of interest and efficiently predict parameter values for their realization. These tools may be applied to understand complex natural circuitry and to aid in the rational design of synthetic circuits. PMID:27462346
29 CFR 1910.242 - Hand and portable powered tools and equipment, general.
Code of Federal Regulations, 2011 CFR
2011-07-01
... to less than 30 p.s.i. and then only with effective chip guarding and personal protective equipment. (29 CFR Subpart P, Hand and Portable Powered Tools and Other Hand-Held Equipment, § 1910.242, Hand and portable powered tools and equipment, general.)
29 CFR 1910.242 - Hand and portable powered tools and equipment, general.
Code of Federal Regulations, 2010 CFR
2010-07-01
... to less than 30 p.s.i. and then only with effective chip guarding and personal protective equipment. (29 CFR Subpart P, Hand and Portable Powered Tools and Other Hand-Held Equipment, § 1910.242, Hand and portable powered tools and equipment, general.)
Sreerangaiah, Dee; Grayer, Michael; Fisher, Benjamin A; Ho, Meilien; Abraham, Sonya; Taylor, Peter C
2016-01-01
To assess the value of quantitative vascular imaging by power Doppler US (PDUS) as a tool that can be used to stratify patient risk of joint damage in early seropositive RA while still biologic naive but on synthetic DMARD treatment. Eighty-five patients with seropositive RA of <3 years duration had clinical, laboratory and imaging assessments at 0 and 12 months. Imaging assessments consisted of radiographs of the hands and feet, two-dimensional (2D) high-frequency and PDUS imaging of 10 MCP joints that were scored for erosions and vascularity and three-dimensional (3D) PDUS of MCP joints and wrists that were scored for vascularity. Severe deterioration on radiographs and ultrasonography was seen in 45 and 28% of patients, respectively. The 3D power Doppler volume and 2D vascularity scores were the most useful US predictors of deterioration. These variables were modelled in two equations that estimate structural damage over 12 months. The equations had a sensitivity of 63.2% and specificity of 80.9% for predicting radiographic structural damage and a sensitivity of 54.2% and specificity of 96.7% for predicting structural damage on ultrasonography. In seropositive early RA, quantitative vascular imaging by PDUS has clinical utility in predicting which patients will derive benefit from early use of biologic therapy. © The Author 2015. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
A summary of wind power prediction methods
NASA Astrophysics Data System (ADS)
Wang, Yuqi
2018-06-01
The deterministic prediction of wind power, probability prediction, and the prediction of wind power ramp events are introduced in this paper. Deterministic prediction includes statistical-learning prediction based on historical data and physical-model prediction based on NWP data. Due to the great impact of wind power ramp events on the power system, this paper also introduces the prediction of wind power ramp events. Finally, the evaluation indicators for all kinds of prediction are given. Wind power prediction can be a good solution to the adverse effects on the power system caused by the abruptness, intermittency and undulation of wind power.
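The statistical-learning branch of deterministic prediction can be illustrated with a small autoregression fitted to historical data, compared against simple persistence; the wind power series below is synthetic, and the NWP-based physical route is not reproduced.

```python
# A sketch of statistical deterministic prediction: fit an AR(2) model to a
# synthetic, persistent wind power series and compare a one-step-ahead
# forecast against persistence. Numbers are illustrative.
import numpy as np

rng = np.random.default_rng(2)
p = np.zeros(500)
for t in range(1, 500):                    # synthetic autocorrelated series
    p[t] = 0.9 * p[t - 1] + 0.1 * rng.standard_normal()

# Fit AR(2) by least squares on the history
X = np.column_stack([p[1:-1], p[:-2]])     # lags 1 and 2
a, b = np.linalg.lstsq(X, p[2:], rcond=None)[0]

forecast = a * p[-1] + b * p[-2]           # one-step-ahead AR forecast
persistence = p[-1]                        # persistence baseline
print(f"AR(2) coefs a={a:.2f} b={b:.2f}; "
      f"forecast {forecast:.3f} vs persistence {persistence:.3f}")
```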
Hur, Manhoi; Campbell, Alexis Ann; Almeida-de-Macedo, Marcia; Li, Ling; Ransom, Nick; Jose, Adarsh; Crispin, Matt; Nikolau, Basil J; Wurtele, Eve Syrkin
2013-04-01
Discovering molecular components and their functionality is key to the development of hypotheses concerning the organization and regulation of metabolic networks. The iterative experimental testing of such hypotheses is the trajectory that can ultimately enable accurate computational modelling and prediction of metabolic outcomes. This information can be particularly important for understanding the biology of natural products, whose metabolism itself is often only poorly defined. Here, we describe factors that must be in place to optimize the use of metabolomics in predictive biology. A key to achieving this vision is a collection of accurate time-resolved and spatially defined metabolite abundance data and associated metadata. One formidable challenge associated with metabolite profiling is the complexity and analytical limits associated with comprehensively determining the metabolome of an organism. Further, for metabolomics data to be efficiently used by the research community, it must be curated in publicly available metabolomics databases. Such databases require clear, consistent formats, easy access to data and metadata, data download, and accessible computational tools to integrate genome system-scale datasets. Although transcriptomics and proteomics integrate the linear predictive power of the genome, the metabolome represents the nonlinear, final biochemical products of the genome, which results from the intricate system(s) that regulate genome expression. For example, the relationship of metabolomics data to the metabolic network is confounded by redundant connections between metabolites and gene-products. However, connections among metabolites are predictable through the rules of chemistry. Therefore, enhancing the ability to integrate the metabolome with anchor-points in the transcriptome and proteome will enhance the predictive power of genomics data. We detail a public database repository for metabolomics, tools and approaches for statistical analysis of metabolomics data, and methods for integrating these datasets with transcriptomic data to create hypotheses concerning specialized metabolisms that generate the diversity in natural product chemistry. We discuss the importance of close collaborations among biologists, chemists, computer scientists and statisticians throughout the development of such integrated metabolism-centric databases and software.
The ITER ICRF Antenna Design with TOPICA
NASA Astrophysics Data System (ADS)
Milanesio, Daniele; Maggiora, Riccardo; Meneghini, Orso; Vecchi, Giuseppe
2007-11-01
TOPICA (Torino Polytechnic Ion Cyclotron Antenna) code is an innovative tool for the 3D/1D simulation of Ion Cyclotron Radio Frequency (ICRF) antennas, i.e., accounting for antennas in a realistic 3D geometry and with an accurate 1D plasma model [1]. The TOPICA code has been deeply parallelized and has already been proved to be a reliable tool for antenna design and performance prediction. A detailed analysis of the 24-strap ITER ICRF antenna geometry has been carried out, underlining the strong dependence and asymmetries of the antenna input parameters due to the ITER plasma response. We optimized the antenna array geometry dimensions to maximize loading, lower mutual couplings and mitigate sheath effects. The calculated antenna input impedance matrices are TOPICA results of paramount importance for the tuning and matching system design. Electric field distributions have also been calculated, and they are used as the main input for the power flux estimation tool. The designed optimized antenna is capable of coupling 20 MW of power to plasma in the 40-55 MHz frequency range with a maximum voltage of 45 kV in the feeding coaxial cables. [1] V. Lancellotti et al., Nuclear Fusion, 46 (2006) S476-S499
NASA Astrophysics Data System (ADS)
Zhang, M.; Nunes, V. D.; Burbey, T. J.; Borggaard, J.
2012-12-01
More than 1.5 m of subsidence has been observed in Las Vegas Valley since 1935 as a result of groundwater pumping that commenced in 1905 (Bell, 2002). The compaction of the aquifer system has led to several large subsidence bowls and deleterious earth fissures. The highly heterogeneous aquifer system with its variably thick interbeds makes predicting the magnitude and location of subsidence extremely difficult. Several numerical groundwater flow models of the Las Vegas basin have been developed previously; however, none of them have been able to accurately simulate the observed subsidence patterns or magnitudes because of inadequate parameterization. To better manage groundwater resources and predict future subsidence, we have developed an updated, more accurate groundwater management model for Las Vegas Valley by creating a new adjoint parameter estimation package (APE) that is used in conjunction with UCODE along with MODFLOW and the SUB (subsidence) and HFB (horizontal flow barrier) packages. The APE package is used with UCODE to automatically identify suitable parameter zonations and inversely calculate parameter values from hydraulic head and subsidence measurements, which are highly sensitive to both elastic (Ske) and inelastic (Skv) storage coefficients. With the advent of InSAR (interferometric synthetic aperture radar), distributed spatial and temporal subsidence measurements can be obtained, which greatly enhance the accuracy of parameter estimation. This automation process can remove user bias and provide a far more accurate and robust parameter zonation distribution. The outcome of this work is the most accurate and powerful tool for managing groundwater resources in Las Vegas Valley to date.
NASA Astrophysics Data System (ADS)
Iungo, Giacomo Valerio; Camarri, Simone; Ciri, Umberto; El-Asha, Said; Leonardi, Stefano; Rotea, Mario A.; Santhanagopalan, Vignesh; Viola, Francesco; Zhan, Lu
2016-11-01
Site conditions, such as topography and local climate, as well as wind farm layout, strongly affect the performance of a wind power plant. Therefore, predictions of wake interactions and their effects on power production still remain a great challenge in wind energy. For this study, an onshore wind turbine array was monitored through lidar measurements, SCADA and met-tower data. Power losses due to wake interactions were estimated to be approximately 4% and 2% of the total power production under stable and convective conditions, respectively. This dataset was then leveraged for the calibration of a data-driven RANS (DDRANS) solver, which is a compelling tool for the prediction of wind turbine wakes and power production. DDRANS is characterized by a computational cost as low as that of engineering wake models, with adequate accuracy achieved through data-driven tuning of the turbulence closure model. DDRANS is based on a parabolic formulation and axisymmetry and boundary layer approximations, which allow low computational costs to be achieved. The turbulence closure consists of a mixing-length model, which is optimally calibrated with the experimental dataset. Assessment of DDRANS is then performed through lidar and SCADA data for different atmospheric conditions. This material is based upon work supported by the National Science Foundation under the I/UCRC WindSTAR, NSF Award IIP 1362033.
NASA Astrophysics Data System (ADS)
Zhang, G. Q.; To, S.
2014-08-01
Cutting force and its power spectrum analysis is thought to be an effective method for monitoring tool wear in many cutting processes, and a significant body of research has been conducted in this area. However, relatively little similar research exists for ultra-precision fly cutting. In this paper, a group of experiments was carried out to investigate the cutting forces and their power spectrum characteristics at different tool wear stages. Results reveal that the cutting force increases as tool wear progresses. The cutting force signals at different tool wear stages were analyzed using power spectrum analysis. The analysis indicates that a characteristic frequency does exist in the power spectrum of the cutting force, whose power spectral density increases with increasing tool wear; this characteristic frequency could be adopted to monitor diamond tool wear in ultra-precision fly cutting.
Radiation Mitigation and Power Optimization Design Tools for Reconfigurable Hardware in Orbit
NASA Technical Reports Server (NTRS)
French, Matthew; Graham, Paul; Wirthlin, Michael; Wang, Li; Larchev, Gregory
2005-01-01
The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable gate array (FPGA) technology. In the second year of the project, design tools that leverage an established FPGA design environment have been created to visualize and analyze an FPGA circuit for radiation weaknesses and power inefficiencies. For radiation, a single-event upset (SEU) emulator, a persistence analysis tool, and a half-latch removal tool for Xilinx/Virtex-II devices have been created. Research is underway on a persistence mitigation tool and multiple-bit upset (MBU) studies. For power, synthesis-level dynamic power visualization and analysis tools have been completed. Power optimization tools are under development, and preliminary test results are positive.
A simulation model for risk assessment of turbine wheels
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Hage, Richard T.
1991-01-01
A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.
Predictive Mining of Time Series Data
NASA Astrophysics Data System (ADS)
Java, A.; Perlman, E. S.
2002-05-01
All-sky monitors are a relatively new development in astronomy, and their data represent a largely untapped resource. Proper utilization of this resource could lead to important discoveries not only in the physics of variable objects, but in how one observes such objects. We discuss the development of a Java toolbox for astronomical time series data. Rather than using methods conventional in astronomy (e.g., power spectrum and cross-correlation analysis) we employ rule discovery techniques commonly used in analyzing stock-market data. By clustering patterns found within the data, rule discovery allows one to build predictive models, allowing one to forecast when a given event might occur or whether the occurrence of one event will trigger a second. We have tested the toolbox and accompanying display tool on datasets (representing several classes of objects) from the RXTE All Sky Monitor. We use these datasets to illustrate the methods and functionality of the toolbox. We have found predictive patterns in several ASM datasets. We also discuss problems faced in the development process, particularly the difficulties of dealing with discretized and irregularly sampled data. A possible application would be in scheduling target of opportunity observations where the astronomer wants to observe an object when a certain event or series of events occurs. By combining such a toolbox with an automatic, Java query tool which regularly gathers data on objects of interest, the astronomer or telescope operator could use the real-time datastream to efficiently predict the occurrence of (for example) a flare or other event. By combining the toolbox with dynamic time warping data-mining tools, one could predict events which may happen on variable time scales.
Predictors of self-rated health in patients with chronic nonmalignant pain.
Siedlecki, Sandra L
2006-09-01
Self-rated health (SRH) is an important outcome measure that has been found to accurately predict mortality, morbidity, function, and psychologic well-being. Chronic nonmalignant pain presents with a pattern that includes low levels of power and high levels of pain, depression, and disability. Differences in SRH may be related to variations within this pattern. The purpose of this analysis was to identify determinants of SRH and test their ability to predict SRH in patients with chronic nonmalignant pain. SRH was measured by response to a single three-option age-comparative question. The Power as Knowing Participation in Change Tool, McGill Pain Questionnaire Short Form, Center for Epidemiological Studies Depression Scale, and Pain Disability Index were used to measure independent variables. Multivariate analysis of variance revealed significant differences (p = .001) between SRH categories on the combined dependent variable. Analysis of variance conducted as a follow-up identified significant differences for power (p < .001) and depression (p = .003), but not for pain or pain-related disability; and discriminant analysis found that power and depression correctly classified patients with 75% accuracy. Findings suggest pain interventions designed to improve mood and provide opportunities for knowing participation may have a greater impact on overall health than those that target only pain and disability.
Utilizing Dental Electronic Health Records Data to Predict Risk for Periodontal Disease.
Thyvalikakath, Thankam P; Padman, Rema; Vyawahare, Karnali; Darade, Pratiksha; Paranjape, Rhucha
2015-01-01
Periodontal disease is a major cause for tooth loss and adversely affects individuals' oral health and quality of life. Research shows its potential association with systemic diseases like diabetes and cardiovascular disease, and social habits such as smoking. This study explores mining potential risk factors from dental electronic health records to predict and display patients' contextualized risk for periodontal disease. We retrieved relevant risk factors from structured and unstructured data on 2,370 patients who underwent comprehensive oral examinations at the Indiana University School of Dentistry, Indianapolis, IN, USA. Predicting overall risk and displaying relationships between risk factors and their influence on the patient's oral and general health can be a powerful educational and disease management tool for patients and clinicians at the point of care.
Integrated Computational Solution for Predicting Skin Sensitization Potential of Molecules
Desai, Aarti; Singh, Vivek K.; Jere, Abhay
2016-01-01
Introduction: Skin sensitization forms a major toxicological endpoint for dermatology and cosmetic products. The recent ban on animal testing for cosmetics demands alternative methods. We developed an integrated computational solution (SkinSense) that offers a robust solution and addresses the limitations of existing computational tools, i.e., high false positive rates and/or limited coverage. Results: The key components of our solution include: QSAR models selected from a combinatorial set, similarity information and literature-derived sub-structure patterns of known skin protein reactive groups. Its prediction performance on a challenge set of molecules showed accuracy = 75.32%, CCR = 74.36%, sensitivity = 70.00% and specificity = 78.72%, which is better than several existing tools including VEGA (accuracy = 45.00% and CCR = 54.17% with 'High' reliability scoring), DEREK (accuracy = 72.73% and CCR = 71.44%) and TOPKAT (accuracy = 60.00% and CCR = 61.67%). Although TIMES-SS showed higher predictive power (accuracy = 90.00% and CCR = 92.86%), the coverage was very low (only 10 out of 77 molecules were predicted reliably). Conclusions: Owing to improved prediction performance and coverage, our solution can serve as a useful expert system towards Integrated Approaches to Testing and Assessment for skin sensitization. It would be invaluable to the cosmetic/dermatology industry for pre-screening their molecules, and reducing time, cost and animal testing. PMID:27271321
A Predictive Model for Readmissions Among Medicare Patients in a California Hospital.
Duncan, Ian; Huynh, Nhan
2017-11-17
Predictive models for hospital readmission rates are in high demand because of the Centers for Medicare & Medicaid Services (CMS) Hospital Readmission Reduction Program (HRRP). The LACE index is one of the most popular predictive tools among hospitals in the United States. The LACE index is a simple tool with 4 parameters: Length of stay, Acuity of admission, Comorbidity, and Emergency visits in the previous 6 months. The authors applied logistic regression to develop a predictive model for a medium-sized not-for-profit community hospital in California using patient-level data with more specific patient information (including 13 explanatory variables). Specifically, the logistic regression is applied to 2 populations: a general population including all patients and the specific group of patients targeted by the CMS penalty (characterized as ages 65 or older with select conditions). The 2 resulting logistic regression models have a higher sensitivity rate compared to the sensitivity of the LACE index. The C statistic values of the model applied to both populations demonstrate moderate levels of predictive power. The authors also build an economic model to demonstrate the potential financial impact of the use of the model for targeting high-risk patients in a sample hospital and demonstrate that, on balance, whether the hospital gains or loses from reducing readmissions depends on its margin and the extent of its readmission penalties.
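For readers unfamiliar with the LACE arithmetic, a sketch follows. The point values are taken from the commonly cited van Walraven et al. (2010) scheme and should be treated as an assumption here, since the abstract does not list them.

```python
# A sketch of LACE index scoring. Point values follow the commonly cited
# van Walraven et al. (2010) scheme -- an assumption, not taken from the abstract.
def lace_score(los_days: int, acute: bool, charlson: int, ed_visits: int) -> int:
    # L: length of stay points
    if los_days < 1: l = 0
    elif los_days == 1: l = 1
    elif los_days == 2: l = 2
    elif los_days == 3: l = 3
    elif los_days <= 6: l = 4
    elif los_days <= 13: l = 5
    else: l = 7
    a = 3 if acute else 0                  # A: acute/emergent admission
    c = charlson if charlson <= 3 else 5   # C: Charlson comorbidity points
    e = min(ed_visits, 4)                  # E: ED visits in prior 6 months
    return l + a + c + e

print(lace_score(los_days=5, acute=True, charlson=2, ed_visits=1))  # -> 10
```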
JNSViewer—A JavaScript-based Nucleotide Sequence Viewer for DNA/RNA secondary structures
Dong, Min; Graham, Mitchell; Yadav, Nehul
2017-01-01
Many tools are available for visualizing RNA or DNA secondary structures, but few JavaScript implementations provide seamless integration with the increasingly popular web computational platforms. We have developed JNSViewer, a highly interactive web service, which is bundled with several popular tools for DNA/RNA secondary structure prediction and can provide precise and interactive correspondence among nucleotides, dot-bracket data, secondary structure graphs, and genic annotations. In JNSViewer, users can perform RNA secondary structure predictions with different programs and settings, add customized genic annotations in GFF format to structure graphs, search for specific linear motifs, and extract relevant structure graphs of sub-sequences. JNSViewer also allows users to choose a transcript or specific segment of Arabidopsis thaliana genome sequences and predict the corresponding secondary structure. Popular genome browsers (i.e., JBrowse and BrowserGenome) were integrated into JNSViewer to provide powerful visualizations of chromosomal locations, genic annotations, and secondary structures. In addition, we used StructureFold with default settings to predict some RNA structures for Arabidopsis by incorporating in vivo high-throughput RNA structure profiling data and stored the results in our web server, which might be a useful resource for RNA secondary structure studies in plants. JNSViewer is available at http://bioinfolab.miamioh.edu/jnsviewer/index.html. PMID:28582416
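As an illustration of the dot-bracket correspondence such a viewer must maintain, here is a small stack-based parser; this is a generic sketch, not JNSViewer code, and it ignores pseudoknots:

```python
# Sketch: map dot-bracket notation to base-pair indices with a stack.
def dot_bracket_pairs(structure):
    stack, pairs = [], []
    for i, ch in enumerate(structure):
        if ch == '(':
            stack.append(i)
        elif ch == ')':
            pairs.append((stack.pop(), i))  # IndexError if unbalanced
    if stack:
        raise ValueError("unbalanced dot-bracket string")
    return sorted(pairs)

print(dot_bracket_pairs("((..((...)).))"))  # [(0, 13), (1, 11), (4, 10), (5, 9)]
```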
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tratnyek, Paul G.; Bylaska, Eric J.; Weber, Eric J.
2017-01-01
Quantitative structure–activity relationships (QSARs) have long been used in the environmental sciences. More recently, molecular modeling and chemoinformatic methods have become widespread. These methods have the potential to expand and accelerate advances in environmental chemistry because they complement observational and experimental data with “in silico” results and analysis. The opportunities and challenges that arise at the intersection between statistical and theoretical in silico methods are most apparent in the context of properties that determine the environmental fate and effects of chemical contaminants (degradation rate constants, partition coefficients, toxicities, etc.). The main example of this is the calibration of QSARs using descriptor variable data calculated from molecular modeling, which can make QSARs more useful for predicting property data that are unavailable, but also can make them more powerful tools for diagnosis of fate determining pathways and mechanisms. Emerging opportunities for “in silico environmental chemical science” are to move beyond the calculation of specific chemical properties using statistical models and toward more fully in silico models, prediction of transformation pathways and products, incorporation of environmental factors into model predictions, integration of databases and predictive models into more comprehensive and efficient tools for exposure assessment, and extending the applicability of all the above from chemicals to biologicals and materials.
Predictive power of Koplik's spots for the diagnosis of measles.
Zenner, Dominik; Nacul, Luis
2012-03-12
Measles is a major cause of mortality globally. In many countries, management of measles is based on clinical suspicion, but the predictive value of clinical diagnosis depends on knowledge and population prevalence of measles. In the pre-vaccine era with high measles incidence, Koplik's spots (KS) were said to be "pathognomonic". This study prospectively evaluated test properties and diagnostic odds ratios (OR) of KS. Data including KS status were prospectively collected for a six-month period on all suspected measles cases reported to the North-West London Health Protection Unit. Saliva test kits were sent to all cases and KS test properties were analysed against measles confirmation by PCR or IgM testing (gold standard). The positive predictive value (PPV) of clinically suspecting measles was 50%. Using KS as a diagnostic tool improved the PPV to 80%, and the presence of KS was associated with confirmed measles in the multivariable analysis (OR 7.2, 95% Confidence Interval 2.1-24.9, p=0.001). We found that Koplik's spots were highly predictive of confirmed measles and could be a good clinical tool to enable prompt measles management and control measures, as action often needs to be taken in the absence of laboratory confirmation. We suggest that current clinical case definitions might benefit from the inclusion of KS.
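For reference, a diagnostic odds ratio and its Woolf (log-method) confidence interval can be computed from a 2x2 table as below. The counts are hypothetical, chosen only to be consistent with the reported PPV of 80% and OR of 7.2; they do not reproduce the study's exact interval:

```python
# Sketch: diagnostic odds ratio with a 95% CI from a 2x2 table, plus PPV.
import math

def diagnostic_or(tp, fp, fn, tn):
    or_ = (tp * tn) / (fp * fn)
    se = math.sqrt(1/tp + 1/fp + 1/fn + 1/tn)   # SE of log(OR), Woolf method
    lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
    ppv = tp / (tp + fp)
    return or_, (lo, hi), ppv

# Hypothetical counts: PPV = 24/30 = 0.80 and OR = (24*36)/(6*20) = 7.2.
print(diagnostic_or(tp=24, fp=6, fn=20, tn=36))
```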
29 CFR 1926.304 - Woodworking tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Tools-Hand and Power § 1926.304 Woodworking tools. (a) Disconnect switches. All fixed power driven woodworking tools shall be provided with a disconnect..., power-driven circular saws shall be equipped with guards above and below the base plate or shoe. The...
Evaluating O, C, and N isotopes in human hair as a forensic tool to reconstruct travel
NASA Astrophysics Data System (ADS)
Ehleringer, Jim; Chesson, Lesley; Cerling, Thure; Valenzuela, Luciano
2014-05-01
Oxygen isotope ratios in the proteins of human scalp hair have been proposed and modeled as a tool for reconstructing the movements of humans and evaluating the likelihood that an individual is a resident or non-resident of a particular geographic region. Carbon and nitrogen isotope ratios reflect dietary input and complement oxygen isotope data interpretation when it is necessary to distinguish potential location overlap among continents. The combination of a time sequence analysis in hair segments and spatial models that describe predicted geographic variation in hair isotope values represents a potentially powerful tool for forensic investigations. The applications of this technique have thus far been to provide assistance to law enforcement with information on the predicted geographical travel histories of unidentified murder victims. Here we review multiple homicide cases from the USA where stable isotope analysis of hair has been applied and for which we now know the travel histories of the murder victims. We also provide information on the robustness of the original data sets used to test these models by evaluating the travel histories of randomly collected hair discarded in Utah barbershops.
Nilsson, Lisbeth; Durkin, Josephine
2017-10-01
To explore the knowledge necessary for adoption and implementation of the Assessment of Learning Powered mobility use (ALP) tool in different practice settings for both adults and children. To consult with a diverse population of professionals working with adults and children, in different countries and various settings, who were learning about or using the ALP tool, as part of exploring and implementing research findings. Classical grounded theory with a rigorous comparative analysis of data from informants together with reflections on our own rich experiences of powered mobility practice and comparisons with the literature. A core category, learning tool use, and a new theory of cognizing tool use, with its interdependent properties motivation, confidence, permissiveness, attentiveness and co-construction, have emerged which explain in greater depth what enables the application of the ALP tool. The scientific knowledge base on tool use learning and the new theory convey the information practitioners need to understand how to apply the learning approach of the ALP tool in order to enable tool use learning through powered mobility practice as a therapeutic intervention in its own right. This opens up the possibility for more children and adults to have access to learning through powered mobility practice. Implications for rehabilitation Tool use learning through powered mobility practice is a therapeutic intervention in its own right. Powered mobility practice can be used as a rehabilitation tool with individuals who may not need to become powered wheelchair users. Motivation, confidence, permissiveness, attentiveness and co-construction are key properties for enabling the application of the learning approach of the ALP tool. Labelling and the use of language, together with honing observational skills through viewing video footage, are key to developing successful learning partnerships.
Biberger, Thomas; Ewert, Stephan D
2017-08-01
The generalized power spectrum model [GPSM; Biberger and Ewert (2016). J. Acoust. Soc. Am. 140, 1023-1038], combining the "classical" concept of the power-spectrum model (PSM) and the envelope power spectrum model (EPSM), was demonstrated to account for several psychoacoustic and speech intelligibility (SI) experiments. The PSM path of the model uses long-time power signal-to-noise ratios (SNRs), while the EPSM path uses short-time envelope power SNRs. A systematic comparison of existing SI models for several spectro-temporal manipulations of speech maskers and gender combinations of target and masker speakers [Schubotz et al. (2016). J. Acoust. Soc. Am. 140, 524-540] showed the importance of short-time power features. Conversely, Jørgensen et al. [(2013). J. Acoust. Soc. Am. 134, 436-446] demonstrated a higher predictive power of short-time envelope power SNRs than of power SNRs using reverberation and spectral subtraction. Here the GPSM was extended to utilize short-time power SNRs and was shown to account for all psychoacoustic and SI data of the three mentioned studies. The best processing strategy was to exclusively use either power or envelope-power SNRs, depending on the experimental task. By analyzing both domains, the suggested model might provide a useful tool for clarifying the contribution of amplitude modulation masking and energetic masking.
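A heavily simplified sketch of the two SNR domains contrasted above: a long-time power SNR versus an envelope-power SNR, with the envelope taken from the Hilbert transform and its AC power normalized by the squared mean (DC) power, in the spirit of the EPSM. This illustrates the idea only and is not the GPSM implementation:

```python
# Sketch: power SNR vs. envelope-power SNR for a modulated tone in noise.
import numpy as np
from scipy.signal import hilbert

def power_snr_db(speech, noise):
    return 10 * np.log10(np.mean(speech**2) / np.mean(noise**2))

def envelope_power_snr_db(speech, noise):
    def env_power(x):
        env = np.abs(hilbert(x))               # temporal envelope
        return np.var(env) / np.mean(env)**2   # AC power normalized by DC^2
    return 10 * np.log10(env_power(speech) / env_power(noise))

fs = 16000
t = np.arange(fs) / fs
speech = np.sin(2*np.pi*500*t) * (1 + 0.8*np.sin(2*np.pi*4*t))  # 4 Hz modulation
noise = np.random.default_rng(1).normal(scale=0.5, size=fs)
print(power_snr_db(speech, noise), envelope_power_snr_db(speech, noise))
```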
Accuracy of three-dimensional multislice view Doppler in diagnosis of morbid adherent placenta
Abdel Moniem, Alaa M.; Ibrahim, Ahmed; Akl, Sherif A.; Aboul-Enen, Loay; Abdelazim, Ibrahim A.
2015-01-01
Objective To detect the accuracy of the three-dimensional multislice view (3D MSV) Doppler in the diagnosis of morbid adherent placenta (MAP). Material and Methods Fifty pregnant women at ≥28 weeks gestation with suspected MAP were included in this prospective study. A two-dimensional (2D) trans-abdominal gray-scale ultrasound scan was performed for all subjects to confirm the gestational age, placental location, and findings suggestive of MAP, followed by the 3D power Doppler and then the 3D MSV Doppler to confirm the diagnosis of MAP. Intraoperative findings and histopathology results of removed uteri in cases managed by emergency hysterectomy were compared with preoperative sonographic findings to detect the accuracy of the 3D MSV Doppler in the diagnosis of MAP. Results The 3D MSV Doppler increased the accuracy and predictive values of the diagnostic criteria of MAP compared with the 3D power Doppler. The sensitivity and negative predictive value (NPV) (79.6% and 82.2%, respectively) of crowded vessels over the peripheral sub-placental zone to detect difficult placental separation and considerable intraoperative blood loss in cases of MAP using the 3D power Doppler were increased to 82.6% and 84%, respectively, using the 3D MSV Doppler. In addition, the sensitivity, specificity, and positive predictive value (PPV) (90.9%, 68.8%, and 47%, respectively) of the disruption of the uterine serosa-bladder interface for the detection of emergency hysterectomy in cases of MAP using the 3D power Doppler were increased to 100%, 71.8%, and 50%, respectively, using the 3D MSV Doppler. Conclusion The 3D MSV Doppler is a useful adjunctive tool to the 3D power Doppler or color Doppler to refine the diagnosis of MAP. PMID:26401104
CFD Validation with Experiment and Verification with Physics of a Propellant Damping Device
NASA Technical Reports Server (NTRS)
Yang, H. Q.; Peugeot, John
2011-01-01
This paper will document our effort in validating a coupled fluid-structure interaction CFD tool in predicting a damping device's performance under laboratory conditions. Consistently good comparisons of "blind" CFD predictions against experimental data under various operating conditions, design parameters, and cryogenic environments will be presented. The power of the coupled CFD-structure interaction code in explaining some unexpected phenomena of the device observed during the technology development will be illustrated. The evolution of the damper device design inside the LOX tank will be used to demonstrate the contribution of the tool to the understanding, optimization and implementation of the LOX damper in the Ares I vehicle. Owing to the present validation effort, the LOX damper technology has matured to TRL 5. The present effort has also contributed to the transition of the technology from an early conceptual observation to the baseline design of thrust oscillation mitigation for the Ares I within a 10 month period.
Finite Element Modeling, Simulation, Tools, and Capabilities at Superform
NASA Astrophysics Data System (ADS)
Raman, Hari; Barnes, A. J.
2010-06-01
Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which, when applied to SPF, provides a host of benefits, including accurate prediction of strain levels in a part, detection of wrinkles, and prediction of pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform has applied, and continues to apply, to successfully form complex-shaped superplastic parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.
Artificial Intelligence Tools for Scaling Up of High Shear Wet Granulation Process.
Landin, Mariana
2017-01-01
The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales from 25L to 600L. The combination of neurofuzzy logic and gene expression programming technologies allowed the modeling of the impeller power as a function of operating conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R² > 86.78%) for all equipment sizes. Gene expression programming allowed the modeling of the granulation process for granulators of similar and dissimilar geometries and can be improved by implementing additional characteristics of the process, such as composition variables or operation parameters (e.g., batch size, chopper speed). The principles and the methodology proposed here can be applied to understand and control manufacturing processes using any other granulation equipment, including continuous granulation processes. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis
Gong, Xiajing; Hu, Meng
2018-01-01
Abstract Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time‐to‐event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high‐dimensional data featured by a large number of predictor variables. Our results showed that ML‐based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high‐dimensional data. The prediction performances of ML‐based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML‐based methods provide a powerful tool for time‐to‐event analysis, with a built‐in capacity for high‐dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. PMID:29536640
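Since the concordance index is the headline comparison metric here, a self-contained sketch of its pairwise definition follows (higher predicted risk should accompany earlier events; only subjects with observed events anchor comparable pairs). The data are toy values, not the simulated scenarios of the paper:

```python
# Sketch: concordance index (C-index) over comparable pairs.
import numpy as np

def concordance_index(time, event, risk):
    num, den = 0.0, 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                  # censored subjects anchor no pairs
        for j in range(n):
            if time[i] < time[j]:     # subject j outlived subject i
                den += 1
                if risk[i] > risk[j]:
                    num += 1.0        # concordant pair
                elif risk[i] == risk[j]:
                    num += 0.5        # tied risk counts half
    return num / den

time  = np.array([5., 8., 12., 3., 9.])
event = np.array([1, 0, 1, 1, 0])     # 1 = event observed, 0 = censored
risk  = np.array([0.9, 0.4, 0.2, 0.8, 0.3])
print(concordance_index(time, event, risk))
```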
Suppa, Per; Hampel, Harald; Kepp, Timo; Lange, Catharina; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph
2016-01-01
MRI-based hippocampus volume, a core feasible biomarker of Alzheimer's disease (AD), is not yet widely used in clinical patient care, partly due to lack of validation of software tools for hippocampal volumetry that are compatible with routine workflow. Here, we evaluate fully-automated and computationally efficient hippocampal volumetry with FSL-FIRST for prediction of AD dementia (ADD) in subjects with amnestic mild cognitive impairment (aMCI) from phase 1 of the Alzheimer's Disease Neuroimaging Initiative. Receiver operating characteristic analysis of FSL-FIRST hippocampal volume (corrected for head size and age) revealed an area under the curve of 0.79, 0.70, and 0.70 for prediction of aMCI-to-ADD conversion within 12, 24, or 36 months, respectively. Thus, FSL-FIRST provides about the same power for prediction of progression to ADD in aMCI as other volumetry methods.
Photovoltaic performance models: an evaluation with actual field data
NASA Astrophysics Data System (ADS)
TamizhMani, Govindasamy; Ishioye, John-Paul; Voropayev, Arseniy; Kang, Yi
2008-08-01
Prediction of energy production is crucial to the design and installation of building integrated photovoltaic systems. This prediction should be attainable based on commonly available parameters such as system size, orientation and tilt angle. Several commercially available as well as freely downloadable software tools exist to predict energy production. Six software models have been evaluated in this study: PV Watts, PVsyst, MAUI, Clean Power Estimator, Solar Advisor Model (SAM) and RETScreen. This evaluation has been done by comparing the monthly, seasonal and annual predictions with the actual field data obtained over a year period on a large number of residential PV systems ranging between 2 and 3 kWdc. All the systems are located in Arizona, within the Phoenix metropolitan area, which lies at latitude 33° North and longitude 112° West, and are all connected to the electrical grid.
2017-12-01
people eagerly anticipated failed to materialize. Instead, the country fractured into a collection of well-organized Islamic militias armed with...is continuously updated in near real time. When applied to certain models, ICEWS can be a powerful predictive tool to "forecast select events of...
Lee, Tai-Sung; Hu, Yuan; Sherborne, Brad; Guo, Zhuyan; York, Darrin M
2017-07-11
We report the implementation of the thermodynamic integration method on the pmemd module of the AMBER 16 package on GPUs (pmemdGTI). The pmemdGTI code typically delivers over 2 orders of magnitude of speed-up relative to a single CPU core for the calculation of ligand-protein binding affinities with no statistically significant numerical differences and thus provides a powerful new tool for drug discovery applications.
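For context, thermodynamic integration estimates a free-energy difference as the integral over the coupling parameter of the per-window ensemble average of dU/dlambda. A sketch of the final quadrature step, with made-up numbers standing in for the per-window averages a code like pmemdGTI would produce:

```python
# Sketch: Delta G = integral_0^1 <dU/dlambda>_lambda dlambda, evaluated by
# trapezoidal quadrature over per-window averages. Values are illustrative.
import numpy as np

lambdas = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
dudl_means = np.array([12.1, 8.7, 5.2, 1.9, -1.4, -4.0])  # kcal/mol, made up

delta_g = np.trapz(dudl_means, lambdas)
print(f"Delta G ~= {delta_g:.2f} kcal/mol")
```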
Spacecraft Electrical Power System (EPS) generic analysis tools and techniques
NASA Technical Reports Server (NTRS)
Morris, Gladys M.; Sheppard, Mark A.
1992-01-01
An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.
30 CFR 56.14116 - Hand-held power tools.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...
30 CFR 56.14116 - Hand-held power tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...
30 CFR 56.14116 - Hand-held power tools.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...
30 CFR 57.14116 - Hand-held power tools.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...
30 CFR 56.14116 - Hand-held power tools.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...
30 CFR 57.14116 - Hand-held power tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...
30 CFR 57.14116 - Hand-held power tools.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...
30 CFR 56.14116 - Hand-held power tools.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Hand-held power tools. 56.14116 Section 56... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-SURFACE METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 56.14116 Hand-held power tools. (a) Power drills...
30 CFR 57.14116 - Hand-held power tools.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...
30 CFR 57.14116 - Hand-held power tools.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Hand-held power tools. 57.14116 Section 57... MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES Machinery and Equipment Safety Devices and Maintenance Requirements § 57.14116 Hand-held power tools. (a) Power drills...
Power Plant Model Validation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information. The tool automatically adjusts all required EPCL scripts and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; a database of projects (model validation studies); a database of historic events; a database of power plants; advanced visualization capabilities; and automatic report generation.
A new solar power output prediction based on hybrid forecast engine and decomposition model.
Zhang, Weijiang; Dang, Hongshe; Simoes, Rolando
2018-06-12
Given the growing role of photovoltaic (PV) energy as a clean energy source in electrical networks and its uncertain nature, PV energy prediction has been studied by researchers in recent decades. This problem directly affects power network operation and, due to the high volatility of the PV signal, demands an accurate prediction model. A new prediction model based on the Hilbert Huang transform (HHT) and the integration of improved empirical mode decomposition (IEMD) with feature selection and a forecast engine is presented in this paper. The proposed approach is divided into three main sections. In the first section, the signal is decomposed by the proposed IEMD as an accurate decomposition tool. To increase the accuracy of the proposed method, a new interpolation method has been used instead of cubic spline curve (CSC) fitting in EMD. Then the obtained output is entered into the new feature selection procedure to choose the best candidate inputs. Finally, the signal is predicted by a hybrid forecast engine composed of support vector regression (SVR) based on an intelligent algorithm. The effectiveness of the proposed approach has been verified over a number of real-world engineering test cases in comparison with other well-known models. The obtained results confirm the validity of the proposed method. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
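A sketch of the final forecast-engine stage under stated assumptions: the paper's IEMD decomposition and feature selection are assumed to have already produced a component series (synthetic here), and a standard SVR is trained on lagged samples. Hyperparameters and window sizes are placeholders:

```python
# Sketch: SVR forecast engine on lagged samples of one decomposed component.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.arange(500)
component = np.sin(2*np.pi*t/48) + 0.1*rng.normal(size=t.size)  # stand-in IMF

lags = 24
X = np.array([component[i:i+lags] for i in range(len(component) - lags)])
y = component[lags:]

engine = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:-48], y[:-48])
pred = engine.predict(X[-48:])                    # one-step-ahead test window
rmse = np.sqrt(np.mean((pred - y[-48:])**2))
print(f"test RMSE: {rmse:.3f}")
```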
Internal exposure dynamics drive the Adverse Outcome Pathways of synthetic glucocorticoids in fish
NASA Astrophysics Data System (ADS)
Margiotta-Casaluci, Luigi; Owen, Stewart F.; Huerta, Belinda; Rodríguez-Mozaz, Sara; Kugathas, Subramanian; Barceló, Damià; Rand-Weaver, Mariann; Sumpter, John P.
2016-02-01
The Adverse Outcome Pathway (AOP) framework represents a valuable conceptual tool to systematically integrate existing toxicological knowledge from a mechanistic perspective to facilitate predictions of chemical-induced effects across species. However, its application for decision-making requires the transition from qualitative to quantitative AOP (qAOP). Here we used a fish model and the synthetic glucocorticoid beclomethasone dipropionate (BDP) to investigate the role of chemical-specific properties, pharmacokinetics, and internal exposure dynamics in the development of qAOPs. We generated a qAOP network based on drug plasma concentrations and focused on immunodepression, skin androgenisation, disruption of gluconeogenesis and reproductive performance. We showed that internal exposure dynamics and chemical-specific properties influence the development of qAOPs and their predictive power. Comparing the effects of two different glucocorticoids, we highlight how relatively similar in vitro hazard-based indicators can lead to different in vivo risk. This discrepancy can be predicted by their different uptake potential, pharmacokinetic (PK) and pharmacodynamic (PD) profiles. We recommend that the development phase of qAOPs should include the application of species-specific uptake and physiologically-based PK/PD models. This integration will significantly enhance the predictive power, enabling a more accurate assessment of the risk and the reliable transferability of qAOPs across chemicals.
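A minimal sketch of the PK/PD coupling invoked above: a one-compartment kinetic model for plasma concentration driving an Emax response. All parameter values are illustrative placeholders, not fish- or BDP-specific estimates:

```python
# Sketch: one-compartment PK driving an Emax pharmacodynamic response.
import numpy as np

def plasma_concentration(t, dose=10.0, volume=2.0, ke=0.3):
    """One-compartment bolus: C(t) = (dose/V) * exp(-ke * t)."""
    return dose / volume * np.exp(-ke * t)

def emax_effect(c, emax=1.0, ec50=1.5):
    """Emax model linking internal exposure to effect magnitude."""
    return emax * c / (ec50 + c)

t = np.linspace(0, 24, 25)   # hours
c = plasma_concentration(t)
print(emax_effect(c)[:5])    # effect tracks the declining exposure
```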
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2015-06-01
WEC-Sim is a DOE-funded software tool being jointly developed by NREL and SNL. WEC-Sim computationally models wave energy converters (WECs), devices that generate electricity using the movement of water systems such as oceans and rivers. There is great potential for WECs to generate electricity, but the industry has yet to establish a commercially viable concept. Modeling, design, and simulation tools are essential to the successful development of WECs. Commercial WEC modeling software tools can't be modified by the user. In contrast, WEC-Sim is free, open-source, and flexible enough to be modified to meet the rapidly evolving needs of the WEC industry. By modeling the power generation performance and dynamic loads of WEC designs, WEC-Sim can help support the development of new WEC devices by optimizing designs for cost of energy and competitiveness. By being easily accessible, WEC-Sim promises to help level the playing field in the WEC industry. Importantly, WEC-Sim is also excellent at its job! In 2014, WEC-Sim was used in conjunction with NREL’s FAST modeling software to win a hydrodynamic modeling competition. WEC-Sim and FAST performed very well at predicting the motion of a test device in comparison to other modeling tools. The most recent version of WEC-Sim (v1.1) was released in April 2015.
NASA Astrophysics Data System (ADS)
Franz, S.
2004-10-01
Since the discovery of the renormalization group theory in statistical physics, the realm of applications of the concepts of scale invariance and criticality has pervaded several fields of natural and social sciences. This is the leitmotiv of Didier Sornette's book, who in Critical Phenomena in Natural Sciences reviews three decades of developments and applications of the concepts of criticality, scale invariance and power law behaviour from statistical physics, to earthquake prediction, ruptures, plate tectonics, modelling biological and economic systems and so on. This strongly interdisciplinary book addresses students and researchers in disciplines where concepts of criticality and scale invariance are appropriate: mainly geology from which most of the examples are taken, but also engineering, biology, medicine, economics, etc. A good preparation in quantitative science is assumed but the presentation of statistical physics principles, tools and models is self-contained, so that little background in this field is needed. The book is written in a simple informal style encouraging intuitive comprehension rather than stressing formal derivations. Together with the discussion of the main conceptual results of the discipline, great effort is devoted to providing applied scientists with the tools of data analysis and modelling necessary to analyse, understand, make predictions and simulate systems undergoing complex collective behaviour. The book starts from a purely descriptive approach, explaining basic probabilistic and geometrical tools to characterize power law behaviour and scale invariant sets. Probability theory is introduced by a detailed discussion of interpretative issues warning the reader about the use and misuse of probabilistic concepts when the emphasis is on prediction of low probability rare---and often catastrophic---events. Then, concepts that have proved useful in risk evaluation, extreme value statistics, limit theorems for sums of independent variables with power-law distributions, random walks, fractals and multifractal formalisms, etc, are discussed in an immediate and direct way so as to provide ready-to-use tools for analysing and representing power law behaviour in natural phenomena. The exposition then continues discussing the main developments, allowing the reader to understand theoretically and model strongly correlated behaviour. After a concise, but useful, introduction to the fundamentals of statistical physics a discussion of equilibrium critical phenomena and the renormalization group is proposed to the reader. With the centrality of the problem of non-equilibrium behaviour in mind, a discussion is devoted to tentative applications of the concept of temperature in the off-equilibrium context. Particular emphasis is given to the development of long range correlation and of precursors of phase transitions, and their role in the prediction of catastrophic events. Then, basic models such as percolation and rupture models are described. A central position in the book is occupied by a chapter on mechanisms for power laws and a subsequent one on self-organized criticality as a general paradigm for critical behaviour as proposed by P Bak and collaborators. The book concludes with a chapter on the prediction of fields generated by a random distribution of sources. The book maintains the promise of the title of providing concepts and tools to tackle criticality and self-organization.
The second edition, while retaining the structure of the first edition, considerably extends the scope with new examples and applications of a research field which is constantly growing. Any scientific book has to solve the dichotomy between the depth of discussion, the pedagogical character of exposition and the quantity of material discussed. In general the book, which evolved from a graduate student course, favours these last two aspects at the expense of the first one. This makes the book very readable and means that, while complicated concepts are always explained by means of simple examples, important results are often mentioned but not derived or discussed in depth. Most of the time this style of exposition manages to successfully convey the essential information, other times unfortunately, e.g. in the case of the chapter on disordered systems, the presentation appears rather superficial. This is the price we pay for a book covering an impressively vast subject area and the huge bibliography (more than 1000 references) furnishes a necessary guide for acquiring the working knowledge of the subject covered. I would recommend it to teachers planning introductory courses on the field of complex systems and to researchers wanting to learn about an area of great contemporary interest.
Lo, Yuan-Chieh; Hu, Yuh-Chung; Chang, Pei-Zen
2018-01-01
Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors develop a Bluetooth Temperature Sensor Module (BTSM) which accompanies three types of temperature-sensing probes (magnetic, screw, and probe). Through experimental tests, its specifications achieve a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle correlated with rotating speed are derived based on the theory of heat transfer and empirical formulas. The predictive TNM of spindles was developed by grey-box estimation and experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with a normalized-mean-square-error agreement of 99.5%, and the present approach is transferable to other spindles with a similar structure. For realizing edge computing in smart manufacturing, a reduced-order TNM is constructed by the Model Order Reduction (MOR) technique and implemented into the real-time embedded system. PMID:29473877
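A toy version of such a lumped-parameter thermal network, reduced to a single node and integrated with forward Euler; R, C, and the heat input are placeholders, not the paper's identified parameters:

```python
# Sketch: one-node lumped thermal network, dT/dt = (Q_in - (T - T_amb)/R) / C.
import numpy as np

def simulate_spindle(t_end=3600.0, dt=1.0, T_amb=25.0,
                     R=0.5, C=800.0, heat_in=40.0):
    """Forward-Euler integration of a single thermal node (toy values)."""
    n = int(t_end / dt)
    T = np.empty(n + 1)
    T[0] = T_amb
    for k in range(n):
        T[k+1] = T[k] + dt * (heat_in - (T[k] - T_amb) / R) / C
    return T

T = simulate_spindle()
print(f"steady-state rise ~ {T[-1] - 25.0:.1f} K")  # approaches Q_in * R = 20 K
```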
A computational model that predicts behavioral sensitivity to intracortical microstimulation
Kim, Sungshin; Callier, Thierri; Bensmaia, Sliman J.
2016-01-01
Objective Intracortical microstimulation (ICMS) is a powerful tool to investigate the neural mechanisms of perception and can be used to restore sensation for patients who have lost it. While sensitivity to ICMS has previously been characterized, no systematic framework has been developed to summarize the detectability of individual ICMS pulse trains or the discriminability of pairs of pulse trains. Approach We develop a simple simulation that describes the responses of a population of neurons to a train of electrical pulses delivered through a microelectrode. We then perform an ideal observer analysis on the simulated population responses to predict the behavioral performance of non-human primates in ICMS detection and discrimination tasks. Main results Our computational model can predict behavioral performance across a wide range of stimulation conditions with high accuracy (R² = 0.97) and generalizes to novel ICMS pulse trains that were not used to fit its parameters. Furthermore, the model provides a theoretical basis for the finding that amplitude discrimination based on ICMS violates Weber's law. Significance The model can be used to characterize the sensitivity to ICMS across the range of perceptible and safe stimulation regimes. As such, it will be a useful tool for both neuroscience and neuroprosthetics. PMID:27977419
Design of a high altitude long endurance flying-wing solar-powered unmanned air vehicle
NASA Astrophysics Data System (ADS)
Alsahlani, A. A.; Johnston, L. J.; Atcliffe, P. A.
2017-06-01
The low-Reynolds number environment of high-altitude flight places severe demands on the aerodynamic design and stability and control of a high altitude, long endurance (HALE) unmanned air vehicle (UAV). The aerodynamic efficiency of a flying-wing configuration makes it an attractive design option for such an application and is investigated in the present work. The proposed configuration has a high-aspect ratio, swept-wing planform, the wing sweep being necessary to provide an adequate moment arm for outboard longitudinal and lateral control surfaces. A design optimization framework is developed under a MATLAB environment, combining aerodynamic, structural, and stability analysis. Low-order analysis tools are employed to facilitate efficient computations, which is important when there are multiple optimization loops for the various engineering analyses. In particular, a vortex-lattice method is used to compute the wing planform aerodynamics, coupled to a two-dimensional (2D) panel method to derive aerofoil sectional characteristics. Integral boundary-layer methods are coupled to the panel method in order to predict flow separation boundaries during the design iterations. A quasi-analytical method is adapted for application to flying-wing configurations to predict the wing weight and a linear finite-beam element approach is used for structural analysis of the wing-box. Stability is a particular concern in the low-density environment of high-altitude flight for flying-wing aircraft and so provision of adequate directional stability and control power forms part of the optimization process. At present, a modified Genetic Algorithm is used in all of the optimization loops. Each of the low-order engineering analysis tools is validated using higher-order methods to provide confidence in the use of these computationally-efficient tools in the present design-optimization framework. This paper includes the results of employing the present optimization tools in the design of a HALE, flying-wing UAV to indicate that this is a viable design configuration option.
Analysis of log-periodic power law singularity patterns in time series related to credit risk
NASA Astrophysics Data System (ADS)
Wosnitza, Jan Henrik; Sornette, Didier
2015-04-01
The log-periodic (super-exponential) power law singularity (LPPLS) has become a promising tool for predicting extreme behavior of self-organizing systems in natural sciences and finance. Some researchers have recently proposed to employ the LPPLS on credit risk markets. The review article at hand summarizes four papers in this field and shows how they are linked. After structuring the research questions, we collect the corresponding answers from the four articles. This eventually gives us an overall picture of the application of the LPPLS to credit risk data. Our literature review begins with grounding the view that credit default swap (CDS) spreads are hotbeds for LPPLS patterns and it ends up with drawing attention to the recently proposed alarm index for the prediction of institutional bank runs. By presenting a new field of application for the LPPLS, the reviewed strand of literature further substantiates the LPPLS hypothesis. Moreover, the results suggest that CDS spread trajectories belong to a different universality class than, for instance, stock prices.
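For reference, the LPPLS is commonly written as the following log-price signature approaching a critical time t_c:

```latex
% Canonical LPPLS form: expected log-price approaching the critical time t_c,
% with 0 < m < 1, log-periodic angular frequency omega, and B < 0 for a
% super-exponential (faster-than-exponential) run-up.
\[
  \ln p(t) \;=\; A + B\,(t_c - t)^{m}
  + C\,(t_c - t)^{m}\cos\!\bigl(\omega \ln(t_c - t) - \phi\bigr)
\]
```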
NASA Technical Reports Server (NTRS)
Sree, Dave
2015-01-01
Near-field acoustic power level analysis of the F31A31 open rotor model has been performed to determine its noise characteristics at simulated cruise flight conditions. The non-proprietary parts of the test data obtained from experiments in the 8x6 supersonic wind tunnel were provided by NASA Glenn Research Center. The tone and broadband components of total noise have been separated from the raw test data by using a new data analysis tool. Results in terms of sound pressure levels, acoustic power levels, and their variations with rotor speed, freestream Mach number, and input shaft power, with different blade-pitch setting angles at simulated cruise flight conditions, are presented and discussed. Empirical equations relating the model's acoustic power level and input shaft power have been developed. The near-field acoustic efficiency of the model at simulated cruise conditions is also determined. It is hoped that the results presented in this work will serve as a database for comparison and improvement of other open rotor blade designs and also for validating open rotor noise prediction codes.
EMU battery/SMM power tool characterization study
NASA Technical Reports Server (NTRS)
Palandati, C.
1982-01-01
The power tool that will be used to replace the attitude control system in the SMM spacecraft was modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery was tested for the power tool application. The results show that the EMU battery is capable of operating the power tool within the pulse current range of 2.0 to 15.0 amperes and the battery temperature range of -10 to 40 degrees Celsius.
NASA Astrophysics Data System (ADS)
Declair, Stefan; Saint-Drenan, Yves-Marie; Potthast, Roland
2017-04-01
Determining the amount of weather-dependent renewable energy is a demanding task for transmission system operators (TSOs), and wind and photovoltaic (PV) prediction errors require the use of reserve power, which generates costs and can - in extreme cases - endanger the security of supply. In the project EWeLiNE, funded by the German government, the German Weather Service and the Fraunhofer Institute for Wind Energy and Energy System Technology develop innovative weather and power forecasting models and tools for grid integration of weather-dependent renewable energy. The key part in energy prediction process chains is the numerical weather prediction (NWP) system. Irradiation forecasts from NWP systems are, however, subject to several sources of error. For PV power prediction, weaknesses of the NWP model in correctly forecasting, e.g., low stratus, absorption by condensed water, or aerosol optical depth are the main sources of error. Inaccurate radiation schemes (e.g., the two-stream parametrization) are also a known deficit of NWP systems with regard to irradiation forecasts. To mitigate errors like these, the latest observations can be used in a pre-processing technique called data assimilation (DA). In DA, not only are the initial fields provided, but the model is also synchronized with reality - the observations - and hence forecast errors are reduced. Besides conventional observation networks like radiosondes, synoptic observations or aircraft reports of wind, pressure and humidity, the number of observations measuring meteorological information indirectly by means of remote sensing, such as satellite radiances, radar reflectivities or GPS slant delays, is strongly increasing. The numerous PV plants installed in Germany potentially represent a dense meteorological network assessing irradiation through their power measurements. Forecast accuracy may thus be enhanced by extending the observations in the assimilation with this new source of information. PV power plants can provide information on clouds, aerosol optical depth or low stratus in the manner of remote sensing: the power output is strongly dependent on perturbations along the slant between the sun position and the PV panel. Since these data are not limited to the vertical column above or below the detector, they may complement satellite data and compensate for weaknesses in the radiation scheme. In this contribution, the DA technique used (Local Ensemble Transform Kalman Filter, LETKF) is briefly sketched. Furthermore, the computation of the model power equivalents is described and first results are presented and discussed.
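A toy analysis step in the ensemble Kalman family, using the simpler stochastic (perturbed-observations) EnKF rather than the LETKF itself, with a linear observation operator standing in for the PV power model equivalents; dimensions and values are illustrative only:

```python
# Sketch: stochastic EnKF analysis step with a linear observation operator H.
import numpy as np

def enkf_update(ensemble, H, y_obs, obs_var, rng):
    """ensemble: (n_members, n_state); returns the analysis ensemble."""
    n_members = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)             # state anomalies
    Y = X @ H.T                                      # observation-space anomalies
    P_yy = Y.T @ Y / (n_members - 1) + np.diag(obs_var)
    P_xy = X.T @ Y / (n_members - 1)
    K = P_xy @ np.linalg.inv(P_yy)                   # Kalman gain
    y_pert = y_obs + rng.normal(scale=np.sqrt(obs_var),
                                size=(n_members, len(y_obs)))
    return ensemble + (y_pert - ensemble @ H.T) @ K.T

rng = np.random.default_rng(0)
ens = rng.normal(loc=280.0, scale=2.0, size=(40, 3))  # 40 members, 3 state vars
H = np.array([[1.0, 0.0, 0.0]])                       # observe first variable
analysis = enkf_update(ens, H, y_obs=np.array([282.0]),
                       obs_var=np.array([0.25]), rng=rng)
print(ens[:, 0].mean(), analysis[:, 0].mean())        # mean pulled toward 282
```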
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casarini, L.; Bonometto, S.A.; Tessarotto, E.
2016-08-01
We discuss an extension of the Coyote emulator to predict non-linear matter power spectra of dark energy (DE) models with a scale factor dependent equation of state of the form w = w_0 + (1 - a)w_a. The extension is based on the mapping rule between non-linear spectra of DE models with constant equation of state and those with time varying one originally introduced in ref. [40]. Using a series of N-body simulations we show that the spectral equivalence is accurate to sub-percent level across the same range of modes and redshift covered by the Coyote suite. Thus, the extended emulator provides a very efficient and accurate tool to predict non-linear power spectra for DE models with the w_0-w_a parametrization. According to the same criteria we have developed a numerical code that we have implemented in a dedicated module for the CAMB code, that can be used in combination with the Coyote Emulator in likelihood analyses of non-linear matter power spectrum measurements. All codes can be found at https://github.com/luciano-casarini/pkequal.
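For reference, the w_0-w_a (CPL) parametrization and the exact dark-energy density evolution it implies, a standard result obtained from the continuity equation:

```latex
% CPL equation of state and the dark-energy density evolution it implies.
\[
  w(a) = w_0 + (1 - a)\,w_a,
  \qquad
  \rho_{\mathrm{DE}}(a) = \rho_{\mathrm{DE},0}\,
  a^{-3(1 + w_0 + w_a)}\, e^{-3 w_a (1 - a)}
\]
```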
The wind power prediction research based on mind evolutionary algorithm
NASA Astrophysics Data System (ADS)
Zhuang, Ling; Zhao, Xinjian; Ji, Tianming; Miao, Jingwen; Cui, Haina
2018-04-01
When wind power is connected to the power grid, its fluctuating, intermittent, and random character affects the stability of the power system. Wind power prediction can help guarantee power quality and reduce the operating cost of the power system. Several traditional wind power prediction methods have limitations. On this basis, a wind power prediction method based on the Mind Evolutionary Algorithm (MEA) is put forward and a prediction model is provided. The experimental results demonstrate that MEA performs efficiently in terms of wind power prediction. The MEA method has broad prospects for engineering application.
Esmaeilzadeh, Sina; Cesme, Fatih; Oral, Aydan; Yaliman, Ayse; Sindel, Dilsad
2016-08-01
Dual-energy X-ray absorptiometry (DXA) is considered the "gold standard" in predicting osteoporotic fractures. Calcaneal quantitative ultrasound (QUS) variables are also known to predict fractures. Fracture risk assessment tools may also guide us for the detection of individuals at high risk for fractures. The aim of this case-control study was to evaluate the utility of DXA bone mineral density (BMD), calcaneal QUS parameters, FRAX® (Fracture Risk Assessment Tool), and the Osteoporosis Risk Assessment Instrument (ORAI) for the discrimination of women with distal forearm or hip fractures. This case-control study included 20 women with a distal forearm fracture and 18 women with a hip fracture as cases, and 76 age-matched women served as controls. BMD at the spine, proximal femur, and radius was measured using DXA and acoustic parameters of bone were obtained using a calcaneal QUS device. FRAX® 10-year probability of fracture and ORAI scores were also calculated in all participants. Receiver operating characteristic (ROC) analysis was used to assess the fracture discriminatory power of all the tools. While all DXA BMD and QUS variables and FRAX® fracture probabilities demonstrated significant areas under the ROC curves for the discrimination of hip-fractured women and those without, only 33% radius BMD, broadband ultrasound attenuation (BUA), and FRAX® major osteoporotic fracture probability calculated without BMD showed significant discriminatory power for distal forearm fractures. It can be concluded that QUS variables, particularly BUA, and FRAX® major osteoporotic fracture probability without BMD are good candidates for the identification of both hip and distal forearm fractures.
Multiscale modeling of mucosal immune responses.
Mei, Yongguo; Abedi, Vida; Carbo, Adria; Zhang, Xiaoying; Lu, Pinyi; Philipson, Casandra; Hontecillas, Raquel; Hoops, Stefan; Liles, Nathan; Bassaganya-Riera, Josep
2015-01-01
Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes, and computer simulations guide and underpin experimental and clinical efforts. This study presents the ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling mucosal immune responses. Biological systems are inherently multiscale, from molecules to tissues and from nanoseconds to a lifespan of several years or decades. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue-level events such as lesion formation, and its architecture integrates multiple modeling technologies, including agent-based modeling (ABM), ordinary differential equations (ODEs), stochastic differential equations (SDEs), and partial differential equations (PDEs). This paper focuses on the implementation and developmental challenges of ENISI, examining and summarizing its technical details from the initial version to the latest implementation. An object-oriented programming approach was adopted to develop a suite of tools based on ENISI; multiple modeling technologies are integrated to visualize tissues, cells, and proteins, and performance matching between the scales is addressed. To illustrate the capabilities, power, and scope of ENISI MSM, a multiscale model of mucosal immune responses during colonic inflammation was developed, including CD4+ T cell differentiation and tissue-level cell-cell interactions. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. PMID:26329787
Decoding genes with coexpression networks and metabolomics - 'majority report by precogs'.
Saito, Kazuki; Hirai, Masami Y; Yonekura-Sakakibara, Keiko
2008-01-01
Following the sequencing of whole genomes of model plants, high-throughput decoding of gene function is a major challenge in modern plant biology. In view of remarkable technical advances in transcriptomics and metabolomics, integrated analysis of these 'omics' by data-mining informatics is an excellent tool for prediction and identification of gene function, particularly for genes involved in complicated metabolic pathways. The availability of Arabidopsis public transcriptome datasets containing data of >1000 microarrays reinforces the potential for prediction of gene function by transcriptome coexpression analysis. Here, we review the strategy of combining transcriptome and metabolome as a powerful technology for studying the functional genomics of model plants and also crop and medicinal plants.
NASA Technical Reports Server (NTRS)
2013-01-01
Topics covered include: Water Treatment Technologies Inspire Healthy Beverages; Dietary Formulas Fortify Antioxidant Supplements; Rovers Pave the Way for Hospital Robots; Dry Electrodes Facilitate Remote Health Monitoring; Telescope Innovations Improve Speed, Accuracy of Eye Surgery; Superconductors Enable Lower Cost MRI Systems; Anti-Icing Formulas Prevent Train Delays; Shuttle Repair Tools Automate Vehicle Maintenance; Pressure-Sensitive Paints Advance Rotorcraft Design Testing; Speech Recognition Interfaces Improve Flight Safety; Polymers Advance Heat Management Materials for Vehicles; Wireless Sensors Pinpoint Rotorcraft Troubles; Ultrasonic Detectors Safely Identify Dangerous, Costly Leaks; Detectors Ensure Function, Safety of Aircraft Wiring; Emergency Systems Save Tens of Thousands of Lives; Oxygen Assessments Ensure Safer Medical Devices; Collaborative Platforms Aid Emergency Decision Making; Space-Inspired Trailers Encourage Exploration on Earth; Ultra-Thin Coatings Beautify Art; Spacesuit Materials Add Comfort to Undergarments; Gigapixel Images Connect Sports Teams with Fans; Satellite Maps Deliver More Realistic Gaming; Elemental Scanning Devices Authenticate Works of Art; Microradiometers Reveal Ocean Health, Climate Change; Sensors Enable Plants to Text Message Farmers; Efficient Cells Cut the Cost of Solar Power; Shuttle Topography Data Inform Solar Power Analysis; Photocatalytic Solutions Create Self-Cleaning Surfaces; Concentrators Enhance Solar Power Systems; Innovative Coatings Potentially Lower Facility Maintenance Costs; Simulation Packages Expand Aircraft Design Options; Web Solutions Inspire Cloud Computing Software; Behavior Prediction Tools Strengthen Nanoelectronics; Power Converters Secure Electronics in Harsh Environments; Diagnostics Tools Identify Faults Prior to Failure; Archiving Innovations Preserve Essential Historical Records; Meter Designs Reduce Operation Costs for Industry; Commercial Platforms Allow Affordable Space Research; Fiber Optics Deliver Real-Time Structural Monitoring; Camera Systems Rapidly Scan Large Structures; Terahertz Lasers Reveal Information for 3D Images; Thin Films Protect Electronics from Heat and Radiation; Interferometers Sharpen Measurements for Better Telescopes; and Vision Systems Illuminate Industrial Processes.
Analysis of high vacuum systems using SINDA'85
NASA Technical Reports Server (NTRS)
Spivey, R. A.; Clanton, S. E.; Moore, J. D.
1993-01-01
The theory, algorithms, and test data correlation analysis of a math model developed to predict performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software are discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated and results from the model are correlated with test data. The model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System. However, the unique use of the user subroutines developed in this model and written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
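To make the resistance-capacitance analogy concrete, here is a minimal sketch (in Python, not SINDA'85/FLUINT) of a transient solve for a two-node vacuum line; all volumes, conductances, and pressures are illustrative values, not Space Station data.

```python
import numpy as np

# Minimal R-C sketch of transient gas flow in a vacuum line, in the spirit of
# the approach described above. Node "capacitance" is chamber volume; the
# "resistance" is the inverse of a tube conductance C_t. All numbers are
# illustrative stand-ins.

V = np.array([0.05, 0.02])          # node volumes [m^3]
C_t = np.array([0.01, 0.03])        # tube conductances [m^3/s]: node0->node1, node1->pump
p = np.array([101325.0, 101325.0])  # initial pressures [Pa]
p_pump = 1e-3                       # effective pump inlet pressure [Pa]

dt, t_end = 1e-3, 5.0
for _ in range(int(t_end / dt)):
    q01 = C_t[0] * (p[0] - p[1])    # throughput between nodes [Pa*m^3/s]
    q1p = C_t[1] * (p[1] - p_pump)  # throughput to the pump
    p[0] -= dt * q01 / V[0]         # dp/dt = Q_net / V (isothermal)
    p[1] += dt * (q01 - q1p) / V[1]

print(f"final pressures: {p[0]:.3e} Pa, {p[1]:.3e} Pa")
```

Each node integrates dp/dt = Q_net/V, which is exactly the update a thermal analyzer performs with temperatures and capacitances; this is the sense in which a thermal R-C solver can host a vacuum-system model.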
Prediction of sweetness and amino acid content in soybean crops from hyperspectral imagery
NASA Astrophysics Data System (ADS)
Monteiro, Sildomar Takahashi; Minekawa, Yohei; Kosugi, Yukio; Akazawa, Tsuneya; Oda, Kunio
Hyperspectral image data provides a powerful tool for non-destructive crop analysis. This paper investigates a hyperspectral image data-processing method to predict the sweetness and amino acid content of soybean crops. Regression models based on artificial neural networks were developed to calculate the levels of sucrose, glucose, fructose, and nitrogen concentrations, which can be related to the sweetness and amino acid content of vegetables. A performance analysis was conducted comparing regression models obtained using different preprocessing methods, namely raw reflectance, second derivative, and principal component analysis. The method is demonstrated using high-resolution hyperspectral data spanning wavelengths from the visible to the near infrared, acquired from an experimental field of green vegetable soybeans. The best predictions were achieved using a nonlinear regression model of the second-derivative-transformed dataset. Glucose could be predicted with the greatest accuracy, followed by sucrose, fructose, and nitrogen. The proposed method makes it possible to produce relatively accurate maps of predicted chemical content across soybean crop fields.
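As a rough illustration of the preprocessing-plus-regression pipeline described above, the following sketch applies a Savitzky-Golay second derivative to synthetic spectra and fits a small neural network; the band count, window settings, and data are stand-ins, not the paper's.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for field data: 200 spectra with 150 bands, and a
# sucrose-like target. With random data the score is meaningless; the point
# is the shape of the pipeline, not the numbers.
rng = np.random.default_rng(0)
X = rng.random((200, 150))
y = rng.random(200) * 10

# Second-derivative preprocessing (Savitzky-Golay), as in the best model above.
X_d2 = savgol_filter(X, window_length=11, polyorder=3, deriv=2, axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X_d2, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out spectra:", model.score(X_te, y_te))
```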
On the predictive ability of mechanistic models for the Haitian cholera epidemic.
Mari, Lorenzo; Bertuzzo, Enrico; Finger, Flavio; Casagrandi, Renato; Gatto, Marino; Rinaldo, Andrea
2015-03-06
Predictive models of epidemic cholera need to resolve at suitable aggregation levels spatial data pertaining to local communities, epidemiological records, hydrologic drivers, waterways, patterns of human mobility and proxies of exposure rates. We address the above issue in a formal model comparison framework and provide a quantitative assessment of the explanatory and predictive abilities of various model settings with different spatial aggregation levels and coupling mechanisms. Reference is made to records of the recent Haiti cholera epidemics. Our intensive computations and objective model comparisons show that spatially explicit models accounting for spatial connections have better explanatory power than spatially disconnected ones for short-to-intermediate calibration windows, while parsimonious, spatially disconnected models perform better with long training sets. On average, spatially connected models show better predictive ability than disconnected ones. We discuss the limits and validity of the various approaches, as well as the pathway towards the development of case-specific predictive tools in the context of emergency management. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Laine, Elodie; Carbone, Alessandra
2015-01-01
Protein-protein interactions (PPIs) are essential to all biological processes and represent increasingly important therapeutic targets. Here, we present a new method for accurately predicting protein-protein interfaces and for understanding their properties, origins, and binding to multiple partners. In contrast to machine learning approaches, our method combines, in a rational and very straightforward way, three sequence- and structure-based descriptors of protein residues: evolutionary conservation, physico-chemical properties and local geometry. The implemented strategy yields very precise predictions for a wide range of protein-protein interfaces and discriminates them from small-molecule binding sites. Beyond its predictive power, the approach permits dissection of interaction surfaces and unravels their complexity. We show how the analysis of the predicted patches can foster new strategies for PPI modulation and interaction surface redesign. The approach is implemented in JET2, an automated tool based on the Joint Evolutionary Trees (JET) method for sequence-based protein interface prediction. JET2 is freely available at www.lcqb.upmc.fr/JET2. PMID:26690684
The Influence of Landslides on Channel Flood Response: A Case Study from the Colorado Front Range
NASA Astrophysics Data System (ADS)
Bennett, G. L.; Ryan, S. E.; Sholtes, J.; Rathburn, S. L.
2016-12-01
Studies have identified the role of thresholds and gradients in stream power in inducing geomorphic change during floods. At much longer time scales, empirical and modeling studies suggest a role for landslides in modifying channel response to external forcing (e.g. tectonic uplift); landslide-delivered sediment may behave as a tool, enhancing channel incision, or as cover, reducing channel incision. However, the influence of landslides on channel response to an individual flood event remains to be elucidated. Here we explore the influence of landslides on channel response to a 200-yr flood in Colorado, USA. From 9 to 15 September 2013, up to 450 mm of rain fell across a 100 km-wide swath of the Colorado Front Range, triggering >1000 landslides and inducing major flooding in several catchments. The flood caused extensive channel erosion, deposition and planform change, resulting in significant damage to property and infrastructure and even loss of life. We use a combination of pre- and post-flood LiDAR and field mapping to quantify geomorphic change in several catchments spanning the flooded region. We make a reach-by-reach analysis of channel geomorphic change metrics (e.g. volume of erosion) in relation to landslide sediment input and total stream power as calculated from radar-based rainfall measurements. Preliminary results suggest that landslide sediment input may complicate the predictive relationship between channel erosion and stream power. Low volumes of landslide sediment input appear to enhance channel erosion (a tools effect), whilst very large volumes appear to reduce channel erosion (a cover effect). These results have implications for predicting channel response to floods and for flood planning and mitigation.
NASA Astrophysics Data System (ADS)
Leclercq, Sylvain; Lidbury, David; Van Dyck, Steven; Moinereau, Dominique; Alamo, Ana; Mazouzi, Abdou Al
2010-11-01
In nuclear power plants, materials may undergo degradation due to severe irradiation conditions that may limit their operational life. Utilities that operate these reactors need to quantify the ageing and potential degradation of essential structures of the power plant to ensure safe and reliable plant operation. So far, the material databases needed to account for these degradations in the design and safe operation of installations rely mainly on long-term irradiation programs in test reactors as well as on mechanical or corrosion testing in specialized hot cells. Continuous progress in the physical understanding of the phenomena involved in irradiation damage, together with continuous progress in computer science, has now made possible the development of multi-scale numerical tools able to simulate the effects of irradiation on materials microstructure. A first step towards this goal was successfully reached through the development of the RPV-2 and Toughness Module numerical tools by the scientific community created around the FP6 PERFECT project. These tools make it possible to simulate irradiation effects on the constitutive behaviour of the reactor pressure vessel low-alloy steel, as well as on its failure properties. Building on the existing PERFECT roadmap, the four-year collaborative project PERFORM 60 has as its main objective the development of multi-scale tools for predicting the combined effects of irradiation and corrosion on internals (austenitic stainless steels), together with the improvement of existing tools for the RPV (bainitic steels). PERFORM 60 is based on two technical sub-projects: (i) RPV and (ii) internals. In addition to these technical sub-projects, the Users' Group and Training sub-project shall allow representatives of constructors, utilities, research organizations… from Europe, the USA and Japan to receive the information and training needed to form their own appraisal of the limits and potential of the developed tools. An important effort will also be made to train young researchers in the field of materials degradation. PERFORM 60 officially started on 1 March 2009, with 20 European organizations and universities involved in the nuclear field.
Comparison of RNA-seq and microarray-based models for clinical endpoint prediction.
Zhang, Wenqian; Yu, Ying; Hertwig, Falk; Thierry-Mieg, Jean; Zhang, Wenwei; Thierry-Mieg, Danielle; Wang, Jian; Furlanello, Cesare; Devanarayan, Viswanath; Cheng, Jie; Deng, Youping; Hero, Barbara; Hong, Huixiao; Jia, Meiwen; Li, Li; Lin, Simon M; Nikolsky, Yuri; Oberthuer, André; Qing, Tao; Su, Zhenqiang; Volland, Ruth; Wang, Charles; Wang, May D; Ai, Junmei; Albanese, Davide; Asgharzadeh, Shahab; Avigad, Smadar; Bao, Wenjun; Bessarabova, Marina; Brilliant, Murray H; Brors, Benedikt; Chierici, Marco; Chu, Tzu-Ming; Zhang, Jibin; Grundy, Richard G; He, Min Max; Hebbring, Scott; Kaufman, Howard L; Lababidi, Samir; Lancashire, Lee J; Li, Yan; Lu, Xin X; Luo, Heng; Ma, Xiwen; Ning, Baitang; Noguera, Rosa; Peifer, Martin; Phan, John H; Roels, Frederik; Rosswog, Carolina; Shao, Susan; Shen, Jie; Theissen, Jessica; Tonini, Gian Paolo; Vandesompele, Jo; Wu, Po-Yen; Xiao, Wenzhong; Xu, Joshua; Xu, Weihong; Xuan, Jiekun; Yang, Yong; Ye, Zhan; Dong, Zirui; Zhang, Ke K; Yin, Ye; Zhao, Chen; Zheng, Yuanting; Wolfinger, Russell D; Shi, Tieliu; Malkas, Linda H; Berthold, Frank; Wang, Jun; Tong, Weida; Shi, Leming; Peng, Zhiyu; Fischer, Matthias
2015-06-25
Gene expression profiling is being widely applied in cancer research to identify biomarkers for clinical endpoint prediction. Since RNA-seq provides a powerful tool for transcriptome-based applications beyond the limitations of microarrays, we sought to systematically evaluate the performance of RNA-seq-based and microarray-based classifiers in this MAQC-III/SEQC study for clinical endpoint prediction using neuroblastoma as a model. We generate gene expression profiles from 498 primary neuroblastomas using both RNA-seq and 44 k microarrays. Characterization of the neuroblastoma transcriptome by RNA-seq reveals that more than 48,000 genes and 200,000 transcripts are being expressed in this malignancy. We also find that RNA-seq provides much more detailed information on specific transcript expression patterns in clinico-genetic neuroblastoma subgroups than microarrays. To systematically compare the power of RNA-seq and microarray-based models in predicting clinical endpoints, we divide the cohort randomly into training and validation sets and develop 360 predictive models on six clinical endpoints of varying predictability. Evaluation of factors potentially affecting model performances reveals that prediction accuracies are most strongly influenced by the nature of the clinical endpoint, whereas technological platforms (RNA-seq vs. microarrays), RNA-seq data analysis pipelines, and feature levels (gene vs. transcript vs. exon-junction level) do not significantly affect performances of the models. We demonstrate that RNA-seq outperforms microarrays in determining the transcriptomic characteristics of cancer, while RNA-seq and microarray-based models perform similarly in clinical endpoint prediction. Our findings may be valuable to guide future studies on the development of gene expression-based predictive models and their implementation in clinical practice.
NASA Astrophysics Data System (ADS)
Pedretti, Daniele; Bianchi, Marco
2018-03-01
Breakthrough curves (BTCs) observed during tracer tests in highly heterogeneous aquifers display strong tailing. Power laws are popular models both for the empirical fitting of these curves and for the prediction of transport using upscaling models based on best-fitted estimated parameters (e.g. the power law slope or exponent). The predictive capacity of power-law-based upscaling models can, however, be questioned owing to the difficulty of linking model parameters to the aquifers' physical properties. This work analyzes two aspects that can limit the use of power laws as effective predictive tools: (a) the implications of statistical subsampling, which often renders power laws indistinguishable from other heavily tailed distributions, such as the logarithmic (LOG); (b) the difficulty of reconciling fitting parameters obtained from models with different formulations, such as the presence of a late-time cutoff in the power law model. Two rigorous and systematic stochastic analyses, one based on benchmark distributions and the other on BTCs obtained from transport simulations, are considered. It is found that a power law model without cutoff (PL) results in best-fitted exponents (αPL) falling in the range of typical experimental values reported in the literature (1.5 < αPL < 4). The PL exponent tends to lower values as the tailing becomes heavier. Strong fluctuations occur when the number of samples is limited, due to the effects of subsampling. On the other hand, when the power law model embeds a cutoff (PLCO), the best-fitted exponent (αCO) is insensitive to the degree of tailing and to the effects of subsampling and tends to a constant αCO ≈ 1. In the PLCO model, the cutoff rate (λ) is the parameter that fully reproduces the persistence of the tailing and is shown to be inversely correlated with the LOG scale parameter (i.e. with the skewness of the distribution). The theoretical results are consistent with the fitting analysis of a tracer test performed during the MADE-5 experiment. It is shown that a simple mechanistic upscaling model based on the PLCO formulation is able to predict the ensemble of BTCs from the stochastic transport simulations without the need for any fitted parameters. The model embeds the constant αCO = 1 and relies on a stratified description of the transport mechanisms to estimate λ. The PL fails to reproduce the ensemble of BTCs at late time, while the LOG model provides results consistent with the PLCO model, albeit without a clear mechanistic link between physical properties and model parameters. It is concluded that, while all parametric models may work equally well (or equally poorly) for the empirical fitting of experimental BTC tails due to the effects of subsampling, for predictive purposes this is not true. Careful selection of the proper heavily tailed model and corresponding parameters is required to ensure physically based transport predictions.
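A minimal sketch of the two competing fits discussed above, assuming a synthetic tail generated from a PLCO with αCO = 1 (echoing the paper's finding); the pure power law inflates its exponent to mimic the cutoff, illustrating why exponents from the two formulations cannot be compared directly.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic late-time tail from a PLCO with alpha = 1 and lambda = 0.05.
# Fitting is done in log space so every decade of the tail carries weight.
# Values are illustrative, not MADE-5 data.
rng = np.random.default_rng(1)
t = np.logspace(0, 3, 80)
c = t**-1.0 * np.exp(-0.05 * t) * np.exp(0.05 * rng.standard_normal(t.size))

def log_pl(t, a, alpha):            # pure power law, no cutoff (PL)
    return a - alpha * np.log(t)

def log_plco(t, a, alpha, lam):     # power law with exponential cutoff (PLCO)
    return a - alpha * np.log(t) - lam * t

p_pl, _ = curve_fit(log_pl, t, np.log(c), p0=[0.0, 2.0])
p_plco, _ = curve_fit(log_plco, t, np.log(c), p0=[0.0, 2.0, 0.01])
print(f"PL  : alpha = {p_pl[1]:.2f}")                              # inflated
print(f"PLCO: alpha = {p_plco[1]:.2f}, lambda = {p_plco[2]:.3f}")  # ~1, ~0.05
```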
Development of Asset Management Decision Support Tools for Power Equipment
NASA Astrophysics Data System (ADS)
Okamoto, Tatsuki; Takahashi, Tsuguhiro
Development of asset management decision support tools has intensified in order to reduce the maintenance costs of power equipment following the liberalization of the power business. This article reviews some aspects of the present status of asset management decision support tool development for power equipment, based on papers published in international conferences, domestic conventions, and several journals.
modPDZpep: a web resource for structure based analysis of human PDZ-mediated interaction networks.
Sain, Neetu; Mohanty, Debasisa
2016-09-21
PDZ domains recognize short sequence stretches usually present in the C-terminus of their interaction partners. Because of the involvement of PDZ domains in many important biological processes, several attempts have been made to develop bioinformatics tools for genome-wide identification of PDZ interaction networks. Currently available tools for predicting the interaction partners of PDZ domains utilize a machine learning approach. Since they have been trained using experimental substrate specificity data for specific PDZ families, their applicability is limited to PDZ families closely related to the training set. These tools also do not allow analysis of PDZ-peptide interaction interfaces. We have used a structure-based approach to develop modPDZpep, a program to predict the interaction partners of human PDZ domains and analyze structural details of PDZ interaction interfaces. modPDZpep predicts interaction partners by using structural models of PDZ-peptide complexes and evaluating binding energy scores using residue-based statistical pair potentials. Since it does not require training on experimental peptide binding affinity data, it can predict substrates for diverse PDZ families. Because it uses a simple scoring function for binding energy, it is also fast enough for genome-scale structure-based analysis of PDZ interaction networks. Benchmarking using artificial as well as real negative datasets indicates good predictive power, with ROC-AUC values in the range of 0.7 to 0.9 for a large number of human PDZ domains. Another novel feature of modPDZpep is its ability to map novel PDZ-mediated interactions in human protein-protein interaction networks, either by utilizing available experimental phage display data or by structure-based predictions. In summary, we have developed modPDZpep, a web server for structure-based analysis of human PDZ domains. It is freely available at http://www.nii.ac.in/modPDZpep.html or http://202.54.226.235/modPDZpep.html . This article was reviewed by Michael Gromiha and Zoltán Gáspári.
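For illustration only, a toy version of residue-based pair-potential scoring; the energy table and contact list below are invented stand-ins, and modPDZpep's actual potentials and interface definition differ.

```python
# Toy residue-based statistical pair-potential scoring for a PDZ-peptide
# interface, in the spirit described above. PAIR_E energies are invented.

PAIR_E = {
    ("F", "I"): -1.2, ("F", "V"): -1.0,  # hydrophobic contacts favored
    ("D", "K"): -0.8, ("E", "R"): -0.9,  # salt bridges favored
    ("D", "E"): +0.7,                    # like charges penalized
}

def pair_energy(a, b):
    # Symmetric lookup; unlisted pairs are treated as neutral.
    return PAIR_E.get((a, b), PAIR_E.get((b, a), 0.0))

def interface_score(contacts):
    """Sum pair potentials over (pocket residue, peptide residue) contacts
    taken from a structural model of the complex; lower = better binding."""
    return sum(pair_energy(a, b) for a, b in contacts)

# Hypothetical contact list from a modeled PDZ-peptide complex:
contacts = [("F", "V"), ("D", "K"), ("F", "I")]
print("binding energy score:", interface_score(contacts))
```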
TU-AB-202-03: Prediction of PET Transfer Uncertainty by DIR Error Estimating Software, AUTODIRECT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, H; Chen, J; Phillips, J
2016-06-15
Purpose: Deformable image registration (DIR) is a powerful tool, but DIR errors can adversely affect its clinical applications. To estimate voxel-specific DIR uncertainty, a software tool called AUTODIRECT (automated DIR evaluation of confidence tool) has been developed and validated. This work tests the ability of this software to predict uncertainty in the transfer of standard uptake values (SUVs) from positron-emission tomography (PET) with DIR. Methods: Virtual phantoms are used for this study. Each phantom has a planning computed tomography (CT) image and a diagnostic PET-CT image set. A deformation was digitally applied to the diagnostic CT to create the planning CT image and establish a known deformation between the images. One lung and three rectum patient datasets were employed to create the virtual phantoms. Both of these sites have difficult deformation scenarios associated with them, which can affect DIR accuracy (lung tissue sliding and changes in rectal filling). The virtual phantoms were created to simulate these scenarios by introducing discontinuities in the deformation field at the lung and rectum borders. The DIR algorithm from the Plastimatch software was applied to these phantoms. The SUV mapping errors from the DIR were then compared with those predicted by AUTODIRECT. Results: The SUV error distributions closely followed the AUTODIRECT-predicted error distribution for the four test cases. The minimum and maximum PET SUVs were produced from AUTODIRECT at the 95% confidence interval before applying gradient-based SUV segmentation for each of these volumes. Notably, 93.5% of the target volume warped by the true deformation was included within the AUTODIRECT-predicted maximum SUV volume after the segmentation, while 78.9% of the target volume was within the target volume warped by Plastimatch. Conclusion: The AUTODIRECT framework is able to predict PET transfer uncertainty caused by DIR, which enables an understanding of the associated target volume uncertainty.
Zhang, Xinyan; Li, Bingzong; Han, Huiying; Song, Sha; Xu, Hongxia; Hong, Yating; Yi, Nengjun; Zhuang, Wenzhuo
2018-05-10
Multiple myeloma (MM), like other cancers, is caused by the accumulation of genetic abnormalities. Heterogeneity exists in patients' responses to treatments, for example, bortezomib. This urges efforts to identify biomarkers from numerous molecular features and to build predictive models for identifying patients who can benefit from a certain treatment scheme. However, previous studies treated the multi-level ordinal drug response as a binary response, where only responsive and non-responsive groups are considered. It is desirable to analyze the multi-level drug response directly, rather than collapsing it into two groups. In this study, we present a novel method to identify significantly associated biomarkers and then develop an ordinal genomic classifier using the hierarchical ordinal logistic model. The proposed hierarchical ordinal logistic model employs a heavy-tailed Cauchy prior on the coefficients and is fitted by an efficient quasi-Newton algorithm. We apply our hierarchical ordinal regression approach to analyze two publicly available datasets for MM with a five-level drug response and numerous gene expression measures. Our results show that our method is able to identify genes associated with the multi-level drug response and to generate powerful predictive models for predicting the multi-level response. The proposed method allows us to jointly fit numerous correlated predictors and thus build efficient models for predicting the multi-level drug response. The predictive model for the multi-level drug response can be more informative than previous approaches. Thus, the proposed approach provides a powerful tool for predicting multi-level drug response and has an important impact on cancer studies.
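A minimal sketch of the model class described above: a proportional-odds (ordinal logistic) likelihood with a Cauchy prior on the coefficients, fitted by a quasi-Newton optimizer. The data are synthetic and the parametrization is simplified relative to the paper's hierarchical model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Synthetic stand-ins: n samples, p gene-expression-like predictors,
# K ordered response levels (0..4, mimicking a five-level drug response).
rng = np.random.default_rng(2)
n, p, K = 300, 20, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:3] = [1.5, -1.0, 0.8]
latent = X @ beta_true + rng.logistic(size=n)
y = np.digitize(latent, np.quantile(latent, [0.2, 0.4, 0.6, 0.8]))

def neg_log_posterior(theta, scale=1.0):
    beta, raw = theta[:p], theta[p:]
    # Cutpoints kept ordered by parametrizing positive increments.
    cuts = np.cumsum(np.concatenate([raw[:1], np.exp(raw[1:])]))
    eta = X @ beta
    upper = np.where(y < K - 1, expit(cuts[np.minimum(y, K - 2)] - eta), 1.0)
    lower = np.where(y > 0, expit(cuts[np.maximum(y - 1, 0)] - eta), 0.0)
    loglik = np.log(np.clip(upper - lower, 1e-12, None)).sum()
    log_prior = -np.log1p((beta / scale) ** 2).sum()  # Cauchy(0, scale)
    return -(loglik + log_prior)

theta0 = np.concatenate([np.zeros(p), [-1.0], np.zeros(K - 2)])
fit = minimize(neg_log_posterior, theta0, method="L-BFGS-B")  # quasi-Newton
print("leading coefficients:", np.round(fit.x[:5], 2))
```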
Myofiber metabolic type determination by mass spectrometry imaging.
Centeno, Delphine; Vénien, Annie; Pujos-Guillot, Estelle; Astruc, Thierry; Chambon, Christophe; Théron, Laëtitia
2017-08-01
Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry imaging is a powerful tool that opens new research opportunities in the field of biology. In this work, a predictive model was developed to discriminate metabolic myofiber types using MALDI spectral data. Rat skeletal muscles are constituted of type I and type IIA fibers, which have an oxidative metabolism for glycogen degradation, and type IIX and type IIB fibers, which have a glycolytic metabolism; these are present in different proportions according to muscle function and physiological state. So far, myofiber type has been determined by histological methods that are time consuming. Thanks to the predictive model, we were able to predict not only the metabolic fiber type but also its location, on the same muscle section that was used for MALDI imaging. Copyright © 2017 John Wiley & Sons, Ltd.
Evidence for complex contagion models of social contagion from observational data
Sprague, Daniel A.
2017-01-01
Social influence can lead to behavioural ‘fads’ that are briefly popular and quickly die out. Various models have been proposed for these phenomena, but empirical evidence of their accuracy as real-world predictive tools has so far been absent. Here we find that a ‘complex contagion’ model accurately describes the spread of behaviours driven by online sharing. We found that standard, ‘simple’, contagion often fails to capture both the rapid spread and the long tails of popularity seen in real fads, where our complex contagion model succeeds. Complex contagion also has predictive power: it successfully predicted the peak time and duration of the ALS Icebucket Challenge. The fast spread and longer duration of fads driven by complex contagion has important implications for activities such as publicity campaigns and charity drives. PMID:28686719
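A toy simulation contrasting the two mechanisms; the network, threshold, and transmission probability are arbitrary choices for illustration, not fitted to the sharing data analysed above.

```python
import numpy as np

# Simple vs complex contagion on a random network. In simple contagion each
# contact with an active neighbour independently transmits; in complex
# contagion a node activates only after a threshold number of active
# neighbours. All parameters are illustrative.

rng = np.random.default_rng(3)
N = 2000
A = rng.random((N, N)) < 0.005
A = (A | A.T)
np.fill_diagonal(A, False)
A = A.astype(int)                   # adjacency matrix, degree ~ 20

def spread(threshold, p_transmit, steps=30):
    active = np.zeros(N, bool)
    active[rng.choice(N, 10, replace=False)] = True
    counts = []
    for _ in range(steps):
        exposed = A @ active.astype(int)          # active-neighbour counts
        if threshold > 1:                         # complex: need >= threshold
            new = (exposed >= threshold) & ~active
        else:                                     # simple: per-contact chance
            new = (rng.random(N) < 1 - (1 - p_transmit) ** exposed) & ~active
        active |= new
        counts.append(int(active.sum()))
    return counts

print("simple :", spread(threshold=1, p_transmit=0.05)[:10])
print("complex:", spread(threshold=2, p_transmit=0.0)[:10])
```

Complex contagion typically starts slower (nodes wait for reinforcement) but can saturate a clustered region more completely, which is the qualitative signature the paper exploits.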
NASA Technical Reports Server (NTRS)
Ali, Ashraf; Lovell, Michael
1995-01-01
This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.
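For reference, one common two-parameter hyperelastic strain-energy form for nearly incompressible rubber is the Mooney-Rivlin model, written here in generic notation that is assumed rather than taken from the presentation:

```latex
% Two-parameter Mooney-Rivlin strain energy; \bar{I}_1, \bar{I}_2 are the
% deviatoric strain invariants, J the volume ratio, and the volumetric term
% with small parameter d enforces near-incompressibility (J ~ 1):
W = C_{10}\,(\bar{I}_1 - 3) + C_{01}\,(\bar{I}_2 - 3) + \frac{1}{d}\,(J - 1)^2
```

The stiff volumetric term is precisely where the numerical difficulty with nearly incompressible materials arises, motivating the special element formulations the presentation discusses.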
Resch, K J; Walther, P; Zeilinger, A
2005-02-25
We have performed the first experimental tomographic reconstruction of a three-photon polarization state. Quantum state tomography is a powerful tool for fully describing the density matrix of a quantum system. We measured 64 three-photon polarization correlations and used a "maximum-likelihood" reconstruction method to reconstruct the Greenberger-Horne-Zeilinger state. The entanglement class has been characterized using an entanglement witness operator and the maximum predicted values for the Mermin inequality were extracted.
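For context, the target state and the Mermin operator take the standard forms below (notation assumed); local hidden-variable theories bound |⟨M⟩| by 2, while the ideal three-photon GHZ state reaches the quantum maximum of 4.

```latex
% Three-photon GHZ state in the horizontal/vertical polarization basis,
% and the Mermin operator built from Pauli measurements on each photon:
|\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt{2}}\left(|HHH\rangle + |VVV\rangle\right),
\qquad
M = \sigma_x\sigma_x\sigma_x - \sigma_x\sigma_y\sigma_y
  - \sigma_y\sigma_x\sigma_y - \sigma_y\sigma_y\sigma_x
```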
Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Yannan; Hou, Zhangshuan; Meng, Da
2016-07-17
In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools, including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as to their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
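A compact sketch of that workflow, assuming synthetic loads for three correlated areas; the ARIMA order and the statsmodels/scikit-learn combination are illustrative choices, not the study's configuration.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.decomposition import PCA

# Per-area ARIMA forecasts plus a principal-component view of the cross-area
# forecast-error structure. Synthetic loads with a shared daily cycle stand
# in for geographically related areas.
rng = np.random.default_rng(4)
hours = np.arange(24 * 60)
common = 10 * np.sin(2 * np.pi * hours / 24)
loads = np.column_stack(
    [100 + common + 2 * rng.standard_normal(hours.size) for _ in range(3)]
)

errors = []
for k in range(loads.shape[1]):
    train, test = loads[:-24, k], loads[-24:, k]
    fc = ARIMA(train, order=(2, 0, 1)).fit().forecast(steps=24)
    errors.append(test - fc)
errors = np.column_stack(errors)

# A dominant first component indicates the areas' errors move together,
# which is the inter-dependency the study aims to preserve.
pca = PCA().fit(errors)
print("variance explained by PC1:", round(pca.explained_variance_ratio_[0], 2))
```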
Chiral phosphoric acid catalysis: from numbers to insights.
Maji, Rajat; Mallojjala, Sharath Chandra; Wheeler, Steven E
2018-02-19
Chiral phosphoric acids (CPAs) have emerged as powerful organocatalysts for asymmetric reactions, and applications of computational quantum chemistry have revealed important insights into the activity and selectivity of these catalysts. In this tutorial review, we provide an overview of computational tools at the disposal of computational organic chemists and demonstrate their application to a wide array of CPA catalysed reactions. Predictive models of the stereochemical outcome of these reactions are discussed along with specific examples of representative reactions and an outlook on remaining challenges in this area.
NASA Astrophysics Data System (ADS)
Homuth, S.; Götz, A. E.; Sass, I.
2015-06-01
The Upper Jurassic carbonates of the southern German Molasse Basin have been the target of numerous geothermal combined heat and power production projects since the year 2000. A production-orientated reservoir characterization is therefore of high economic interest. Outcrop analogue studies enable reservoir property prediction by determination and correlation of lithofacies-related thermo- and petrophysical parameters. A thermofacies classification of the carbonate formations serves to identify heterogeneities and production zones. The hydraulic conductivity is mainly controlled by tectonic structures and karstification, whilst the type and grade of karstification are facies-related. The rock permeability has only a minor effect on the reservoir's sustainability. Physical parameters determined on oven-dried samples have to be corrected to water-saturated reservoir conditions by applying reservoir transfer models. To validate these calculated parameters, a Thermo-Triaxial-Cell simulating the temperature and pressure conditions of the reservoir is used, and calorimetric and thermal conductivity measurements under elevated temperature conditions are performed. Additionally, core and cutting material from a 1600 m deep research drilling and a 4850 m deep well (total vertical depth; measured depth: 6020 m) are used to validate the reservoir property predictions. Under reservoir conditions a decrease in permeability of 2-3 orders of magnitude is observed due to the thermal expansion of the rock matrix. For tight carbonates the matrix permeability is temperature-controlled, whereas the thermophysical matrix parameters are density-controlled. Density typically increases with depth and especially with higher dolomite content; thermal conductivity therefore increases, although temperature, the dominant factor, acts to decrease it. Specific heat capacity typically increases with increasing depth and temperature. The lithofacies-related characterization and prediction of reservoir properties based on outcrop and drilling data demonstrates that this approach is a powerful tool for the exploration and operation of geothermal reservoirs.
Alphs, Larry; Morlock, Robert; Coon, Cheryl; Cazorla, Pilar; Szegedi, Armin; Panagides, John
2011-06-01
The 16-item Negative Symptom Assessment (NSA-16) scale is a validated tool for evaluating negative symptoms of schizophrenia. The psychometric properties and predictive power of a four-item version (NSA-4) were compared with those of the NSA-16. Baseline data from 561 patients with predominant negative symptoms of schizophrenia who participated in two identically designed clinical trials were evaluated. In an ordered logistic regression analysis, ratings on the NSA-4 and NSA-16 were compared with ratings on several other standard tools to determine predictive validity and construct validity. Internal consistency and test-retest reliability were also analyzed. NSA-16 and NSA-4 scores were both predictive of scores on the NSA global rating (odds ratio = 0.83-0.86) and the Clinical Global Impressions-Severity scale (odds ratio = 0.91-0.93). NSA-16 and NSA-4 showed high correlation with each other (Pearson r = 0.85), similarly high correlations with other measures of negative symptoms (demonstrating convergent validity), and lesser correlations with measures of other forms of psychopathology (demonstrating divergent validity). NSA-16 and NSA-4 both showed acceptable internal consistency (Cronbach α, 0.85 and 0.64, respectively) and test-retest reliability (intraclass correlation coefficient, 0.87 and 0.82). This study demonstrates that the NSA-4 offers accuracy comparable to the NSA-16 in rating negative symptoms in patients with schizophrenia. Copyright © 2011 John Wiley & Sons, Ltd.
PC Software graphics tool for conceptual design of space/planetary electrical power systems
NASA Technical Reports Server (NTRS)
Truong, Long V.
1995-01-01
This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.
NASA Astrophysics Data System (ADS)
Kumbur, E. C.; Sharp, K. V.; Mench, M. M.
Developing a robust, intelligent design tool for multivariate optimization of multi-phase transport in fuel cell diffusion media (DM) is of utmost importance for developing advanced DM materials. This study explores the development of a DM design algorithm based on an artificial neural network (ANN) that can be used as a powerful tool for predicting the capillary transport characteristics of fuel cell DM. Direct measurements of drainage capillary pressure-saturation curves of differently engineered DMs (5, 10 and 20 wt.% PTFE) were performed at room temperature under three compressions (0, 0.6 and 1.4 MPa) [E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1295-B1304; E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1305-B1314; E.C. Kumbur, K.V. Sharp, M.M. Mench, J. Electrochem. Soc. 154(12) (2007) B1315-B1324]. The generated benchmark data were utilized to systematically train a three-layered ANN framework employing the feed-forward, error back-propagation methodology. The designed ANN successfully predicts the measured capillary pressures within an average uncertainty of ±5.1% of the measured data, confirming that the present ANN model can be used as a design tool within the range of tested parameters. The ANN simulations reveal that tailoring the DM with high PTFE loading and applying high compression pressure leads to a higher capillary pressure, thereby promoting liquid water transport within the pores of the DM. Any increase in the hydrophobicity of the DM is found to amplify the compression effect, thus yielding a higher capillary pressure for the same saturation level and compression.
The environment power system analysis tool development program
NASA Technical Reports Server (NTRS)
Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.
1990-01-01
The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining the performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.
NASA Astrophysics Data System (ADS)
McAteer, R. T. James
2015-08-01
"My soul is spiraling in frozen fractals all around, And one thought crystallizes like an icy blast, I'm never going back, the past is in the past." Elsa, from Disney's Frozen, characterizes two fundamental aspects of scale-free processes in Nature: fractals are everywhere in space, and fractals can be used to probe changes in time. Self-Organized Criticality (SOC) provides a powerful set of tools to study scale-free processes. It connects spatial fractals (more generically, multifractals) to temporal evolution. The drawback is that this usually results in scale-free, unit-less indices, which can be difficult to connect to everyday physics. Here, I show a novel method that connects one of the most powerful SOC tools - the wavelet transform modulus maxima (WTMM) approach to calculating multifractality - to one of the most powerful equations in all of physics - Ampere's law. In doing so I show how the multifractal spectra can be expressed in terms of current density, and how current density can then be used for the prediction of future energy release from such a system. Our physical understanding of the solar magnetic field structure, and hence our ability to predict solar activity, is limited by the type of data currently available. I show that the multifractal spectrum provides a powerful physical connection between the details of photospheric magnetic gradients in current data and the coronal magnetic structure. By decomposing Ampere's law and comparing it to the wavelet transform modulus maxima method, I show how the scale-free Hölder exponent provides a direct measure of current density across all relevant sizes. The prevalence of this current density across various scales is connected to its stability in time, and hence to the ability of the magnetic structure to store and then release energy. Hence (spatial) multifractals inform us about (future) solar activity. Finally, I discuss how such an approach can be used in any study of scale-free processes, and highlight the key steps in identifying the nature of the mother wavelet to ensure the viability of this powerful connection.
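A sketch of the asserted connection, in assumed notation rather than the paper's: the vertical component of Ampere's law ties the photospheric field gradient to current density, and a local Hölder exponent h(x) of the magnetogram controls how that gradient scales with size.

```latex
% Ampere's law (z-component) links field gradients to current density;
% if the field increments scale with the local Holder exponent h(x),
% the implied current density scales accordingly:
\mu_0 J_z = \left(\nabla \times \mathbf{B}\right)_z
          = \partial_x B_y - \partial_y B_x,
\qquad
|B(x + \ell) - B(x)| \sim \ell^{\,h(x)}
\;\Rightarrow\;
J \sim \ell^{\,h(x) - 1}
```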
Functional specifications for AI software tools for electric power applications. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faught, W.S.
1985-08-01
The principal barrier to the introduction of artificial intelligence (AI) technology to the electric power industry has not been a lack of interest or appropriate problems, for the industry abounds in both. Like most others, however, the electric power industry lacks the personnel - knowledge engineers - with the special combination of training and skills AI programming demands. Conversely, very few AI specialists are conversant with electric power industry problems and applications. The recent availability of sophisticated AI programming environments is doing much to alleviate this shortage. These products provide a set of powerful and usable software tools that enable even non-AI scientists to rapidly develop AI applications. The purpose of this project was to develop functional specifications for programming tools that, when integrated with existing general-purpose knowledge engineering tools, would expedite the production of AI applications for the electric power industry. Twelve potential applications, representative of major problem domains within the nuclear power industry, were analyzed in order to identify those tools that would be of greatest value in application development. Eight tools were specified, including facilities for power plant modeling, database inquiry, simulation, and machine-machine interface.
EPA Green Power Partners can access tools and resources to help promote their green power commitments. Partners use these tools to communicate the benefits of their green power use to their customers, stakeholders, and the general public.
Yi, Hai-Cheng; You, Zhu-Hong; Huang, De-Shuang; Li, Xiao; Jiang, Tong-Hai; Li, Li-Ping
2018-06-01
The interactions between non-coding RNAs (ncRNAs) and proteins play an important role in many biological processes, and their biological functions are primarily achieved by binding with a variety of proteins. High-throughput biological techniques can be used to identify protein molecules bound to a specific ncRNA, but they are usually expensive and time-consuming. Deep learning provides a powerful solution for computationally predicting RNA-protein interactions. In this work, we propose the RPI-SAN model, which uses a deep-learning stacked auto-encoder network to mine hidden high-level features from RNA and protein sequences and feeds them into a random forest (RF) model to predict ncRNA binding proteins. Stacked assembling is further used to improve the accuracy of the proposed method. Four benchmark datasets, including RPI2241, RPI488, RPI1807, and NPInter v2.0, were employed for the unbiased evaluation of five established prediction tools: RPI-Pred, IPMiner, RPISeq-RF, lncPro, and RPI-SAN. The experimental results show that our RPI-SAN model achieves much better performance than the other methods, with accuracies of 90.77%, 89.7%, 96.1%, and 99.33%, respectively. It is anticipated that RPI-SAN can be used as an effective computational tool for future biomedical research and can accurately predict potential ncRNA-protein interaction pairs, which provides reliable guidance for biological research. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
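A minimal single-layer sketch of the auto-encoder-plus-random-forest idea; RPI-SAN stacks several layers and uses real sequence encodings, whereas the features and labels here are synthetic stand-ins.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestClassifier

# Learn a compressed representation of sequence-derived feature vectors with
# an auto-encoder, then classify interacting vs non-interacting pairs with a
# random forest. One encoder layer shown; data are random stand-ins.
rng = np.random.default_rng(5)
X = rng.random((500, 256))      # encoded RNA+protein sequence pairs (toy)
y = rng.integers(0, 2, 500)     # 1 = interacting pair (toy labels)

# Auto-encoder: train an MLP to reconstruct its input, then reuse the
# first-layer weights to compute hidden-layer features.
ae = MLPRegressor(hidden_layer_sizes=(64,), activation="relu",
                  max_iter=300, random_state=0).fit(X, X)
W, b = ae.coefs_[0], ae.intercepts_[0]
H = np.maximum(0.0, X @ W + b)  # ReLU hidden activations as learned features

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(H, y)
print("training accuracy:", rf.score(H, y))
```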
Elsaadany, Mostafa; Yan, Karen Chang; Yildirim-Ayan, Eda
2017-06-01
Successful tissue engineering and regenerative therapy necessitate extensive knowledge of the mechanical milieu in engineered tissues and their resident cells. In this study, we have merged two powerful analysis tools, namely finite element analysis and stochastic analysis, to understand the mechanical strain within the tissue scaffold and residing cells and to predict cell viability upon applying mechanical strains. A continuum-based multi-length-scale finite element model (FEM) was created to simulate physiologically relevant equiaxial strain exposure on a cell-embedded tissue scaffold and to calculate the strain transferred to the tissue scaffold (macro-scale) and residing cells (micro-scale) upon various equiaxial strains. The data from the FEM were used to predict cell viability under various equiaxial strain magnitudes using stochastic damage criterion analysis. The model validation was conducted by mechanically straining cardiomyocyte-encapsulated collagen constructs using a custom-built mechanical loading platform (EQUicycler). The FEM quantified the strain gradients over the radial and longitudinal directions of the scaffolds and the cells residing in different areas of interest. With the use of the experimental viability data, the stochastic damage criterion, and the average cellular strains obtained from the multi-length-scale models, cellular viability was predicted and successfully validated. This methodology can provide a great tool to characterize the mechanical stimulation of bioreactors used in tissue engineering applications by quantifying mechanical strain and predicting cellular viability variations due to applied mechanical strain.
Prognostic and Prediction Tools in Bladder Cancer: A Comprehensive Review of the Literature.
Kluth, Luis A; Black, Peter C; Bochner, Bernard H; Catto, James; Lerner, Seth P; Stenzl, Arnulf; Sylvester, Richard; Vickers, Andrew J; Xylinas, Evanguelos; Shariat, Shahrokh F
2015-08-01
This review focuses on risk assessment and prediction tools for bladder cancer (BCa). To review the current knowledge on risk assessment and prediction tools to enhance clinical decision making and counseling of patients with BCa. A literature search in English was performed using PubMed in July 2013. Relevant risk assessment and prediction tools for BCa were selected. More than 1600 publications were retrieved. Special attention was given to studies that investigated the clinical benefit of a prediction tool. Most prediction tools for BCa focus on the prediction of disease recurrence and progression in non-muscle-invasive bladder cancer or disease recurrence and survival after radical cystectomy. Although these tools are helpful, recent prediction tools aim to address a specific clinical problem, such as the prediction of organ-confined disease and lymph node metastasis to help identify patients who might benefit from neoadjuvant chemotherapy. Although a large number of prediction tools have been reported in recent years, many of them lack external validation. Few studies have investigated the clinical utility of any given model as measured by its ability to improve clinical decision making. There is a need for novel biomarkers to improve the accuracy and utility of prediction tools for BCa. Decision tools hold the promise of facilitating the shared decision process, potentially improving clinical outcomes for BCa patients. Prediction models need external validation and assessment of clinical utility before they can be incorporated into routine clinical care. We looked at models that aim to predict outcomes for patients with bladder cancer (BCa). We found a large number of prediction models that hold the promise of facilitating treatment decisions for patients with BCa. However, many models are missing confirmation in a different patient cohort, and only a few studies have tested the clinical utility of any given model as measured by its ability to improve clinical decision making. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed
2017-01-01
For the first time, a new variable selection method based on swarm intelligence, namely the firefly algorithm, is coupled with three different multivariate calibration models, namely concentration residual augmented classical least squares, artificial neural network, and support vector regression, for UV spectral data. A comparative study between the firefly algorithm and the well-known genetic algorithm was carried out. The discussion revealed the superiority of this new, powerful algorithm over the well-known genetic algorithm. Moreover, different statistical tests were performed and no significant differences were found between the models regarding their predictive abilities. Thus, simpler and faster models were obtained without any deterioration of calibration quality.
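For orientation, a simplified continuous firefly algorithm in its standard form (attractiveness decaying with squared distance); in the calibration setting above the objective would be a cross-validated prediction error over selected wavelengths, and all parameter values here are generic defaults, not the paper's.

```python
import numpy as np

# Simplified continuous firefly algorithm minimizing a stand-in objective.
# In wavelength selection, f(x) would score a calibration model built from
# the variables encoded by x.
rng = np.random.default_rng(6)
f = lambda x: np.sum(x**2, axis=-1)       # toy objective (minimize)

n, dim, iters = 25, 10, 200
beta0, gamma, alpha = 1.0, 1.0, 0.1       # attractiveness, absorption, noise
X = rng.uniform(-5, 5, (n, dim))

for _ in range(iters):
    I = f(X)                              # brightness = objective value
    for i in range(n):
        for j in range(n):
            if I[j] < I[i]:               # firefly i moves toward brighter j
                r2 = np.sum((X[i] - X[j])**2)
                beta = beta0 * np.exp(-gamma * r2)
                X[i] += beta * (X[j] - X[i]) + alpha * rng.uniform(-0.5, 0.5, dim)
        I[i] = f(X[i])
    alpha *= 0.98                         # gradually cool the random walk

print("best objective found:", f(X).min())
```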
Electromagnetic Cavity Effects from Transmitters Inside a Launch Vehicle Fairing
NASA Technical Reports Server (NTRS)
Trout, Dawn; Stanley, James; Wahid, Parveen
2009-01-01
This paper provides insight into a difficult analytical issue for launch vehicles and spacecraft that also has applicability outside the launch industry. Radiation from spacecraft or launch vehicle antennas located within enclosures in the launch vehicle generates an electromagnetic environment that is difficult to predict accurately. This paper discusses test results for power levels produced by a transmitter within a representative scaled vehicle fairing model and provides preliminary modeling results at the low end of the frequency test range using a commercial tool. Initially, the walls of the fairing are aluminum; later they are layered with materials that simulate the acoustic blanketing structures typical of payload fairings. The effects of these blanketing materials on the power levels within the fairing are examined.
Recent developments in measurement and evaluation of FAC damage in power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garud, Y.S.; Besuner, P.; Cohn, M.J.
1999-11-01
This paper describes some recent developments in the measurement and evaluation of flow-accelerated corrosion (FAC) damage in power plants. The evaluation focuses on data checking and smoothing to account for gross errors, noise, and uncertainty in the wall thickness measurements from ultrasonic or pulsed eddy-current data. Also, the evaluation method utilizes advanced regression analysis for the spatial and temporal evolution of the wall loss, providing statistically robust predictions of wear rates and the associated uncertainty. Results of the application of these new tools are presented for several components in actual service. More importantly, the practical implications of using these advances are discussed in relation to the likely impact on the scope and effectiveness of FAC-related inspection programs.
Power flow as a complement to statistical energy analysis and finite element analysis
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
Present methods for analyzing the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly; SEA methods, on the other hand, can only predict an average response level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies, and it can be integrated with established FE models at low frequencies and SEA models at high frequencies to form a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
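The mobility relations underlying this approach take the standard forms below (notation assumed, not taken from the paper): response velocity via a transfer mobility, time-averaged input power at the drive point, and the power flowing through a coupling point c.

```latex
% Transfer mobility Y_ts relates the force at source point s to the velocity
% at receiver point t; input power follows from the drive-point mobility
% Y_in; power flow through a coupling point uses force and velocity there:
v_t = Y_{ts}\,F_s,
\qquad
P_{\mathrm{in}} = \tfrac{1}{2}\,|F_s|^2\,\mathrm{Re}\{Y_{\mathrm{in}}\},
\qquad
P_c = \tfrac{1}{2}\,\mathrm{Re}\{F_c\,v_c^{*}\}
```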
Development of Northeast Asia Nuclear Power Plant Accident Simulator.
Kim, Juyub; Kim, Juyoul; Po, Li-Chi Cliff
2017-06-15
A conclusion from the lessons learned after the March 2011 Fukushima Daiichi accident was that Korea needs a tool to estimate the consequences of a major accident occurring at a nuclear power plant located in a neighboring country. This paper describes a suite of computer-based codes to be used by Korea's nuclear emergency response staff for training and potentially for operational support in Korea's national emergency preparedness and response program. The system of codes, the Northeast Asia Nuclear Accident Simulator (NANAS), consists of three modules: source-term estimation, atmospheric dispersion prediction, and dose assessment. To quickly assess potential doses to the public in Korea, NANAS includes specific reactor data from the nuclear power plants in China, Japan and Taiwan. The completed simulator is demonstrated using data for a hypothetical release. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Warped linear mixed models for the genetic analysis of transformed phenotypes
Fusi, Nicolo; Lippert, Christoph; Lawrence, Neil D.; Stegle, Oliver
2014-01-01
Linear mixed models (LMMs) are a powerful and established tool for studying genotype–phenotype relationships. A limitation of the LMM is that the model assumes Gaussian distributed residuals, a requirement that rarely holds in practice. Violations of this assumption can lead to false conclusions and loss in power. To mitigate this problem, it is common practice to pre-process the phenotypic values to make them as Gaussian as possible, for instance by applying logarithmic or other nonlinear transformations. Unfortunately, different phenotypes require different transformations, and choosing an appropriate transformation is challenging and subjective. Here we present an extension of the LMM that estimates an optimal transformation from the observed data. In simulations and applications to real data from human, mouse and yeast, we show that using transformations inferred by our model increases power in genome-wide association studies and increases the accuracy of heritability estimation and phenotype prediction. PMID:25234577
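A minimal sketch of the idea, with a one-parameter Box-Cox family standing in for the paper's flexible monotone warping; the phenotype data are synthetic and the LMM random effects are omitted for brevity.

```python
import numpy as np

# Choose the transformation from the data rather than fixing it: pick the
# Box-Cox parameter lambda that maximizes the Gaussian log-likelihood of the
# transformed phenotype, including the Jacobian term of the transformation.
rng = np.random.default_rng(7)
y = np.exp(0.5 * rng.standard_normal(1000) + 2.0)   # skewed phenotype (toy)

def boxcox(y, lam):
    return np.log(y) if lam == 0 else (y**lam - 1) / lam

def profile_loglik(lam):
    z = boxcox(y, lam)
    n = y.size
    # Gaussian log-likelihood at the MLE variance, plus log|dz/dy| Jacobian.
    return -0.5 * n * np.log(np.var(z)) + (lam - 1) * np.log(y).sum()

grid = np.linspace(-1, 2, 61)
best = grid[np.argmax([profile_loglik(l) for l in grid])]
print("selected lambda:", round(best, 2))  # ~0 (log) for log-normal data
```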
Warped linear mixed models for the genetic analysis of transformed phenotypes.
Fusi, Nicolo; Lippert, Christoph; Lawrence, Neil D; Stegle, Oliver
2014-09-19
Linear mixed models (LMMs) are a powerful and established tool for studying genotype-phenotype relationships. A limitation of the LMM is that the model assumes Gaussian distributed residuals, a requirement that rarely holds in practice. Violations of this assumption can lead to false conclusions and loss in power. To mitigate this problem, it is common practice to pre-process the phenotypic values to make them as Gaussian as possible, for instance by applying logarithmic or other nonlinear transformations. Unfortunately, different phenotypes require different transformations, and choosing an appropriate transformation is challenging and subjective. Here we present an extension of the LMM that estimates an optimal transformation from the observed data. In simulations and applications to real data from human, mouse and yeast, we show that using transformations inferred by our model increases power in genome-wide association studies and increases the accuracy of heritability estimation and phenotype prediction.
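As a rough illustration of estimating a phenotype transformation from data, the sketch below (a minimal stand-in, not the authors' code) fits a Box-Cox transformation by maximum likelihood with SciPy; the warped LMM generalises this idea by learning a flexible monotone warping jointly with the mixed-model parameters.

    import numpy as np
    from scipy import stats

    # Simulate a skewed phenotype: the exponential of a Gaussian signal.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=1000)
    phenotype = np.exp(0.5 * latent + rng.normal(scale=0.3, size=1000))

    # Maximum-likelihood Box-Cox transformation (lambda near 0 => log-like).
    transformed, lmbda = stats.boxcox(phenotype)
    print(f"estimated lambda = {lmbda:.2f}")
    print(f"skewness before/after: {stats.skew(phenotype):.2f} / {stats.skew(transformed):.2f}")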
A critical assessment of topologically associating domain prediction tools
Dali, Rola
2017-01-01
Topologically associating domains (TADs) have been proposed to be the basic unit of chromosome folding and have been shown to play key roles in genome organization and gene regulation. Several different tools are available for TAD prediction, but their properties have never been thoroughly assessed. In this manuscript, we compare the output of seven different TAD prediction tools on two published Hi-C data sets. TAD predictions varied greatly between tools in number, size distribution and other biological properties. Assessed against a manual annotation of TADs, individual TAD boundary predictions were found to be quite reliable, but their assembly into complete TAD structures was much less so. In addition, many tools were sensitive to sequencing depth and resolution of the interaction frequency matrix. This manuscript provides users and designers of TAD prediction tools with information that will help guide the choice of tools and the interpretation of their predictions. PMID:28334773
Development of a High-Order Space-Time Matrix-Free Adjoint Solver
NASA Technical Reports Server (NTRS)
Ceze, Marco A.; Diosady, Laslo T.; Murman, Scott M.
2016-01-01
The growth in computational power and algorithm development in the past few decades has granted the science and engineering community the ability to simulate flows over complex geometries, thus making Computational Fluid Dynamics (CFD) tools indispensable in analysis and design. Currently, one of the pacing items limiting the utility of CFD for general problems is the prediction of unsteady turbulent flows [1-3]. Reynolds-averaged Navier-Stokes (RANS) methods, which predict a time-invariant mean flowfield, struggle to provide consistent predictions when encountering even mild separation, such as the side-of-body separation at a wing-body junction. NASA's Transformative Tools and Technologies project is developing both numerical methods and physical modeling approaches to improve the prediction of separated flows. A major focus of this effort is efficient methods for resolving the unsteady fluctuations occurring in these flows to provide valuable engineering data of the time-accurate flow field for buffet analysis, vortex shedding, etc. This approach encompasses unsteady RANS (URANS), large-eddy simulations (LES), and hybrid LES-RANS approaches such as Detached Eddy Simulations (DES). These unsteady approaches are inherently more expensive than traditional engineering RANS approaches, hence every effort to mitigate this cost must be leveraged. Arguably, the most cost-effective approach to improve the efficiency of unsteady methods is the optimal placement of the spatial and temporal degrees of freedom (DOF) using solution-adaptive methods.
Dragon-Kings, Black-Swans and Prediction (Invited)
NASA Astrophysics Data System (ADS)
Sornette, D.
2010-12-01
Extreme fluctuations or events are often associated with power law statistics. Indeed, it is a popular belief that "wild randomness" is deeply associated with distributions with power law tails characterized by small exponents. In other words, power law tails are often seen as the epitome of extreme events (the "Black Swan" story). Here, we document in very different systems that there is life beyond power law tails: power laws can be superseded by "dragon-kings", monster events that occur beyond (or changing) the power law tail. Dragon-kings reveal hidden mechanisms that are only transiently active and that amplify the normal fluctuations (often described by the power laws of the normal regime). The goal of this lecture is to catalyze the interest of the community of geophysicists across all fields of geosciences so that the "invisible gorilla" fallacy may be avoided. Our own research illustrates that new statistics or representations of data are often necessary to identify dragon-kings, with strategies guided by the underlying mechanisms. Paradoxically, the monsters may be ignored or hidden by the use of inappropriate analysis or statistical tools that amount to cutting a mammoth into small pieces, leading to the incorrect belief that only mice exist. In order to stimulate further research, we will document and discuss the dragon-king phenomenon on the statistics of financial losses, economic geography, hydrodynamic turbulence, mechanical ruptures, avalanches in complex heterogeneous media, earthquakes, and epileptic seizures. The special status of dragon-kings opens a new research program on their predictability, based on the fact that they belong to a class of their own and express specific mechanisms amplifying the normal dynamics via positive feedbacks. We will present evidence of these claims for the predictions of material rupture, financial crashes and epileptic seizures. As a bonus, a few remarks will be offered at the end on how the dragon-king phenomenon allows us to understand the present world financial crisis as underpinned by two decades of successive financial and economic bubbles, inflating the mother of all bubbles, with new monster dragon-kings on the horizon. The consequences in terms of a new "normal" are eye-opening. Ref: D. Sornette, Dragon-Kings, Black Swans and the Prediction of Crises, International Journal of Terraspace Science and Engineering 1(3), 1-17 (2009) (http://arXiv.org/abs/0907.4290) and (http://ssrn.com/abstract=1470006)
Bade, Richard; Bijlsma, Lubertus; Sancho, Juan V; Hernández, Felix
2015-07-01
There has been great interest in environmental analytical chemistry in developing screening methods based on liquid chromatography-high resolution mass spectrometry (LC-HRMS) for emerging contaminants. Using HRMS, compound identification relies on the high mass resolving power and mass accuracy attainable by these analyzers. When dealing with wide-scope screening, retention time prediction can be a complementary tool for the identification of compounds, and can also reduce tedious data processing when several peaks appear in the extracted ion chromatograms. There are many in silico, Quantitative Structure-Retention Relationship methods available for the prediction of retention time for LC. However, most of these methods use commercial software to predict retention time based on various molecular descriptors. This paper explores the applicability of, and critically discusses, a far simpler and cheaper approach to predict retention times by using LogKow. The predictor was based on a database of 595 compounds, their respective LogKow values and a chromatographic run time of 18 min. Approximately 95% of the compounds were found within 4.0 min of their actual retention times, and 70% within 2.0 min. A predictor based purely on pesticides was also made, enabling 80% of these compounds to be found within 2.0 min of their actual retention times. To demonstrate the utility of the predictors, they were successfully used as an additional tool in the identification of 30 commonly found emerging contaminants in water. Furthermore, a comparison was made by using different mass extraction windows to minimize the number of false positives obtained. Copyright © 2015 Elsevier B.V. All rights reserved.
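A minimal sketch of the kind of one-descriptor predictor described above, using hypothetical (LogKow, retention time) pairs rather than the paper's 595-compound database:

    import numpy as np

    # Hypothetical training data: LogKow values and retention times (min).
    log_kow = np.array([1.2, 2.0, 2.8, 3.5, 4.1, 5.0])
    rt_obs = np.array([4.5, 6.8, 8.9, 10.7, 12.0, 14.2])

    # One-descriptor linear predictor, as in the paper's approach.
    slope, intercept = np.polyfit(log_kow, rt_obs, 1)

    def predicted_rt(kow):
        return slope * kow + intercept

    # Screening use: accept a tentative identification only if the observed
    # peak falls within a +/- 2 min window of the predicted retention time.
    candidate_kow, candidate_rt = 3.0, 9.1
    ok = abs(candidate_rt - predicted_rt(candidate_kow)) <= 2.0
    print(predicted_rt(candidate_kow), ok)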
Canary in a coal mine: does the plastic surgery market predict the American economy?
Wong, Wendy W; Davis, Drew G; Son, Andrew K; Camp, Matthew C; Gupta, Subhas C
2010-08-01
Economic tools have been used in the past to predict the trends in plastic surgery procedures. Since 1992, U.S. cosmetic surgery volumes have increased overall, but the exact relationship between economic downturns and procedural volumes remains elusive. If an economic predicting role can be established from plastic surgery indicators, this could prove to be a very powerful tool. A rolling 3-month revenue average of an eight-surgeon plastic surgery practice and various economic indicators were plotted and compared. An investigation of the U.S. procedural volumes was performed from the American Society of Plastic Surgeons statistics between 1996 and 2008. The correlations of different economic variables with plastic surgery volumes were evaluated. Lastly, search term frequencies were examined from 2004 to July of 2009 to study potential patient interest in major plastic surgery procedures. The self-payment revenue of the plastic surgery group consistently proved indicative of the market trends approximately 1 month in advance. The Standard and Poor's 500, Dow Jones Industrial Average, National Association of Securities Dealers Automated Quotations, and Standard and Poor's Retail Index demonstrated a very close relationship with the income of our plastic surgery group. The frequency of Internet search terms showed a constant level of interest in the patient population despite economic downturns. The data demonstrate that examining plastic surgery revenue can be a useful tool to analyze and possibly predict trends, as it is driven by a market and shows a close correlation to many leading economic indicators. The persisting and increasing interest in plastic surgery suggests hope for a recovering and successful market in the near future.
Parallel Implementation of MAFFT on CUDA-Enabled Graphics Hardware.
Zhu, Xiangyuan; Li, Kenli; Salah, Ahmad; Shi, Lin; Li, Keqin
2015-01-01
Multiple sequence alignment (MSA) constitutes an extremely powerful tool for many biological applications including phylogenetic tree estimation, secondary structure prediction, and critical residue identification. However, aligning large biological sequences with popular tools such as MAFFT requires long runtimes on sequential architectures. Due to the ever increasing sizes of sequence databases, there is increasing demand to accelerate this task. In this paper, we demonstrate how graphics processing units (GPUs), powered by the compute unified device architecture (CUDA), can be used as an efficient computational platform to accelerate the MAFFT algorithm. To fully exploit the GPU's capabilities for accelerating MAFFT, we have optimized the sequence data organization to eliminate the bandwidth bottleneck of memory access, designed a memory allocation and reuse strategy to make full use of the limited memory of GPUs, proposed a new modified-run-length encoding (MRLE) scheme to reduce memory consumption, and used high-performance shared memory to speed up I/O operations. Our implementation, tested on three NVIDIA GPUs, achieves speedups of up to 11.28× on a Tesla K20m GPU compared to the sequential MAFFT 7.015.
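For a feel of why run-length schemes help here, the sketch below shows plain run-length encoding of an alignment row (Python rather than CUDA, and ordinary RLE rather than the paper's modified MRLE layout); gap-heavy MSA rows are exactly where such schemes cut memory consumption.

    def rle_encode(seq):
        # 'AAAB--' -> [('A', 3), ('B', 1), ('-', 2)]
        runs = []
        for ch in seq:
            if runs and runs[-1][0] == ch:
                runs[-1][1] += 1
            else:
                runs.append([ch, 1])
        return [(c, n) for c, n in runs]

    def rle_decode(runs):
        return "".join(c * n for c, n in runs)

    aligned = "ACGT----ACGTAAAA----"
    enc = rle_encode(aligned)
    assert rle_decode(enc) == aligned  # lossless round trip
    print(enc)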
Grid Stability Awareness System (GSAS) Final Scientific/Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feuerborn, Scott; Ma, Jian; Black, Clifton
The project team developed a software suite named Grid Stability Awareness System (GSAS) for power system near real-time stability monitoring and analysis based on synchrophasor measurement. The software suite consists of five analytical tools: an oscillation monitoring tool, a voltage stability monitoring tool, a transient instability monitoring tool, an angle difference monitoring tool, and an event detection tool. These tools have been integrated into one framework to provide power grid operators with both real-time and near real-time stability status of a power grid and historical information about system stability status. These tools are being considered for real-time use in the operation environment.
The Acoustic Analogy: A Powerful Tool in Aeroacoustics with Emphasis on Jet Noise Prediction
NASA Technical Reports Server (NTRS)
Farassat, F.; Doty, Michael J.; Hunter, Craig A.
2004-01-01
The acoustic analogy introduced by Lighthill to study jet noise is now over 50 years old. In the present paper, Lighthill's Acoustic Analogy is revisited together with a brief evaluation of the state-of-the-art of the subject and an exploration of the possibility of further improvements in jet noise prediction from analytical methods, computational fluid dynamics (CFD) predictions, and measurement techniques. Experimental Particle Image Velocimetry (PIV) data are used both to evaluate turbulent statistics from Reynolds-averaged Navier-Stokes (RANS) CFD and to propose correlation models for the Lighthill stress tensor. The NASA Langley Jet3D code is used to study the effect of these models on jet noise prediction. From the analytical investigation, a retarded-time correction is shown to reduce the over-prediction of aft-arc jet noise by Jet3D by approximately 8 dB. In the experimental investigation, the PIV data agree well with the CFD mean flow predictions, with room for improvement in Reynolds stress predictions. Initial modifications, suggested by the PIV data, to the form of the Jet3D correlation model showed no noticeable improvements in jet noise prediction.
Saliba, Christopher M; Clouthier, Allison L; Brandon, Scott C E; Rainbow, Michael J; Deluzio, Kevin J
2018-05-29
Abnormal loading of the knee joint contributes to the pathogenesis of knee osteoarthritis. Gait retraining is a non-invasive intervention that aims to reduce knee loads by providing audible, visual, or haptic feedback of gait parameters. The computational expense of joint contact force prediction has limited real-time feedback to surrogate measures of the contact force, such as the knee adduction moment. We developed a method to predict knee joint contact forces using motion analysis and a statistical regression model that can be implemented in near real-time. Gait waveform variables were deconstructed using principal component analysis and a linear regression was used to predict the principal component scores of the contact force waveforms. Knee joint contact force waveforms were reconstructed using the predicted scores. We tested our method using a heterogeneous population of asymptomatic controls and subjects with knee osteoarthritis. The reconstructed contact force waveforms had mean (SD) RMS differences of 0.17 (0.05) bodyweight compared to the contact forces predicted by a musculoskeletal model. Our method successfully predicted subject-specific shape features of contact force waveforms and is a potentially powerful tool in biofeedback and clinical gait analysis.
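A minimal sketch of the deconstruct-regress-reconstruct pipeline described above, using synthetic stand-in arrays (the paper also deconstructs the gait inputs with PCA; here they are used directly):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    n_subjects, n_frames = 80, 101  # waveforms sampled at 101 gait-cycle points

    # Synthetic stand-ins for gait variables and contact force waveforms.
    gait = rng.normal(size=(n_subjects, 6 * n_frames))
    contact_force = rng.normal(size=(n_subjects, n_frames))

    # 1) Deconstruct the target waveforms with PCA.
    pca = PCA(n_components=5).fit(contact_force)
    scores = pca.transform(contact_force)

    # 2) Regress the principal component scores on the gait variables.
    reg = LinearRegression().fit(gait, scores)

    # 3) Reconstruct predicted waveforms from the predicted scores.
    predicted = pca.inverse_transform(reg.predict(gait))
    rms = np.sqrt(np.mean((predicted - contact_force) ** 2))
    print(f"RMS reconstruction error: {rms:.2f} (arbitrary units)")

Because only a regression and a matrix product are needed at run time, the reconstruction is cheap enough for near real-time biofeedback.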
Hand and power tools: A compilation
NASA Technical Reports Server (NTRS)
1976-01-01
Some hand and power tools were described. Section One describes several tools and shop techniques that may be useful in the home or commercial shop. Section Two contains descriptions of tools that are particularly applicable to industrial work, and in Section Three a number of metal working tools are presented.
Continuum Electrostatics Approaches to Calculating pKas and Ems in Proteins
Gunner, MR; Baker, Nathan A.
2017-01-01
Proteins change their charge state through protonation and redox reactions as well as through binding charged ligands. The free energy of these reactions are dominated by solvation and electrostatic energies and modulated by protein conformational relaxation in response to the ionization state changes. Although computational methods for calculating these interactions can provide very powerful tools for predicting protein charge states, they include several critical approximations of which users should be aware. This chapter discusses the strengths, weaknesses, and approximations of popular computational methods for predicting charge states and understanding their underlying electrostatic interactions. The goal of this chapter is to inform users about applications and potential caveats of these methods as well as outline directions for future theoretical and computational research. PMID:27497160
To Collapse or not to Collapse: The Life of a Primordial Black Hole
NASA Astrophysics Data System (ADS)
Craig, Robert; Bloomfield, Jolyon; Face, Stephen
2016-03-01
Primordial black holes offer insights into topics ranging from cosmological questions about inflationary models to astrophysical questions regarding supermassive black holes. Such insights depend on being able to predict the number density of black holes that form from primordial fluctuations. Traditionally this has been done by means of a "rule-of-thumb" developed by Carr in the 1980s, but recent numerical studies have shown that this predictor is a coarse tool at best. We present a two-parameter predictor with much more discrimination power that can be straightforwardly used to compute number densities. We also discuss challenges that face this type of prediction strategy, both analytically and numerically, and possible ways to circumvent them.
A thermal sensation prediction tool for use by the profession
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fountain, M.E.; Huizenga, C.
1997-12-31
As part of a recent ASHRAE research project (781-RP), a thermal sensation prediction tool has been developed. This paper introduces the tool, describes the component thermal sensation models, and presents examples of how the tool can be used in practice. Since the main end product of the HVAC industry is the comfort of occupants indoors, tools for predicting occupant thermal response can be an important asset to designers of indoor climate control systems. The software tool presented in this paper incorporates several existing models for predicting occupant comfort.
Increasing power generation in horizontal axis wind turbines using optimized flow control
NASA Astrophysics Data System (ADS)
Cooney, John A., Jr.
In order to effectively realize future goals for wind energy, the efficiency of wind turbines must increase beyond existing technology. One direct method for achieving increased efficiency is by improving the individual power generation characteristics of horizontal axis wind turbines. The potential for additional improvement by traditional approaches is diminishing rapidly however. As a result, a research program was undertaken to assess the potential of using distributed flow control to increase power generation. The overall objective was the development of validated aerodynamic simulations and flow control approaches to improve wind turbine power generation characteristics. BEM analysis was conducted for a general set of wind turbine models encompassing last, current, and next generation designs. This analysis indicated that rotor lift control applied in Region II of the turbine power curve would produce a notable increase in annual power generated. This was achieved by optimizing induction factors along the rotor blade for maximum power generation. In order to demonstrate this approach and other advanced concepts, the University of Notre Dame established the Laboratory for Enhanced Wind Energy Design (eWiND). This initiative includes a fully instrumented meteorological tower and two pitch-controlled wind turbines. The wind turbines are representative in their design and operation to larger multi-megawatt turbines, but of a scale that allows rotors to be easily instrumented and replaced to explore new design concepts. Baseline data detailing typical site conditions and turbine operation is presented. To realize optimized performance, lift control systems were designed and evaluated in CFD simulations coupled with shape optimization tools. These were integrated into a systematic design methodology involving BEM simulations, CFD simulations and shape optimization, and selected experimental validation. To refine and illustrate the proposed design methodology, a complete design cycle was performed for the turbine model incorporated in the wind energy lab. Enhanced power generation was obtained through passive trailing edge shaping aimed at reaching lift and lift-to-drag goals predicted to optimize performance. These targets were determined by BEM analysis to improve power generation characteristics and annual energy production (AEP) for the wind turbine. A preliminary design was validated in wind tunnel experiments on a 2D rotor section in preparation for testing in the full atmospheric environment of the eWiND Laboratory. These tests were performed for the full-scale geometry and atmospheric conditions. Upon making additional improvements to the shape optimization tools, a series of trailing edge additions were designed to optimize power generation. The trailing edge additions were predicted to increase the AEP by up to 4.2% at the White Field site. The pieces were rapid-prototyped and installed on the wind turbine in March, 2014. Field tests are ongoing.
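The Region II induction-factor optimisation mentioned above rests on the actuator-disc relation Cp = 4a(1 - a)^2, which is maximised at a = 1/3 (the Betz limit of 16/27). A short sketch, with air density, rotor radius and wind speed assumed purely for illustration:

    import numpy as np

    # Power coefficient of an ideal actuator disc vs axial induction factor a.
    a = np.linspace(0.0, 0.5, 501)
    cp = 4.0 * a * (1.0 - a) ** 2

    i = np.argmax(cp)
    print(f"optimal a = {a[i]:.3f}, Cp = {cp[i]:.4f}")  # ~1/3 and ~16/27

    # Ideal Region II power for an assumed rotor and wind speed.
    rho, R, U = 1.225, 40.0, 8.0  # kg/m^3, m, m/s
    P = 0.5 * rho * np.pi * R ** 2 * U ** 3 * cp[i]
    print(f"ideal power: {P / 1e6:.2f} MW")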
Comprehensive lipid analysis: a powerful metanomic tool for predictive and diagnostic medicine.
Watkins, S M
2000-09-01
The power and accuracy of predictive diagnostics stand to improve dramatically as a result of lipid metanomics. The high definition of data obtained with this approach allows multiple rather than single metabolites to be used as markers for a group. Since as many as 40 fatty acids are quantified from each lipid class, and up to 15 lipid classes can be quantified easily, more than 600 individual lipid metabolites can be measured routinely for each sample. Because these analyses are comprehensive, only the most appropriate and unique metabolites are selected for their predictive value. Thus, comprehensive lipid analysis promises to greatly improve predictive diagnostics for phenotypes that directly or peripherally involve lipids. A broader and possibly more exciting aspect of this technology is the generation of metabolic profiles that are not simply markers for disease, but metabolic maps that can be used to identify specific genes or activities that cause or influence the disease state. Metanomics is, in essence, functional genomics from metabolite analysis. By defining the metabolic basis for phenotype, researchers and clinicians will have an extraordinary opportunity to understand and treat disease. Much in the same way that gene chips allow researchers to observe the complex expression response to a stimulus, metanomics will enable researchers to observe the complex metabolic interplay responsible for defining phenotype. By extending this approach beyond the observation of individual dysregulations, medicine will begin to profile not single diseases, but health. As health is the proper balance of all vital metabolic pathways, comprehensive or metanomic analysis lends itself very well to identifying the metabolite distributions necessary for optimum health. Comprehensive and quantitative analysis of lipids would provide this degree of diagnostic power to researchers and clinicians interested in mining metabolic profiles for biological meaning.
NASA Astrophysics Data System (ADS)
Cipcigan, Flaviu S.; Sokhan, Vlad P.; Crain, Jason; Martyna, Glenn J.
2016-12-01
One key factor that limits the predictive power of molecular dynamics simulations is the accuracy and transferability of the input force field. Force fields are challenged by heterogeneous environments, where electronic responses give rise to biologically important forces such as many-body polarisation and dispersion. The importance of polarisation in the condensed phase was recognised early on, as described by Cochran in 1959 [Philosophical Magazine 4 (1959) 1082-1086] [32]. Currently in molecular simulation, dispersion forces are treated at the two-body level and in the dipole limit, although the importance of three-body terms in the condensed phase was demonstrated by Barker in the 1980s [Phys. Rev. Lett. 57 (1986) 230-233] [72]. One approach for treating both polarisation and dispersion on an equal basis is to coarse grain the electrons surrounding a molecular moiety to a single quantum harmonic oscillator (cf. Hirschfelder, Curtiss and Bird 1954 [The Molecular Theory of Gases and Liquids (1954)] [37]). The approach, when solved in strong coupling beyond the dipole limit, gives a description of long-range forces that includes two- and many-body terms to all orders. In the last decade, the tools necessary to implement the strong coupling limit have been developed, culminating in a transferable model of water with excellent predictive power across the phase diagram. Transferability arises since the environment automatically identifies the important long range interactions, rather than the modeller through a limited set of expressions. Here, we discuss the role of electronic coarse-graining in predictive multiscale materials modelling and describe the first implementation of the method in a general purpose molecular dynamics software: QDO_MD.
A reliable ground bounce noise reduction technique for nanoscale CMOS circuits
NASA Astrophysics Data System (ADS)
Sharma, Vijay Kumar; Pattanaik, Manisha
2015-11-01
Power gating is the most effective method to reduce the standby leakage power by adding header/footer high-VTH sleep transistors between actual and virtual power/ground rails. When a power gating circuit transitions from sleep mode to active mode, a large instantaneous charge current flows through the sleep transistors. Ground bounce noise (GBN) is the high voltage fluctuation on the real ground rail during sleep-to-active mode transitions of power gating circuits. GBN disturbs the logic states of internal nodes of circuits. A novel and reliable power gating structure is proposed in this article to reduce the problem of GBN. The proposed structure contains low-VTH transistors in place of the high-VTH footer. The proposed power gating structure not only reduces the GBN but also improves other performance metrics. A large mitigation of leakage power in both modes eliminates the need for high-VTH transistors. A comprehensive and comparative evaluation of the proposed technique is presented in this article for a chain of 5 CMOS inverters. The simulation results are compared to other well-known GBN reduction circuit techniques at the 22 nm predictive technology model (PTM) bulk CMOS node using the HSPICE tool. Robustness against process, voltage and temperature (PVT) variations is estimated through Monte-Carlo simulations.
The use of power tools in the insertion of cortical bone screws.
Elliott, D
1992-01-01
Cortical bone screws are commonly used in fracture surgery; most patterns are non-self-tapping and require a thread to be pre-cut. This is traditionally performed using hand tools rather than their powered counterparts. Reasons given usually imply that power tools are more dangerous and cut a less precise thread, but there is no evidence to support this supposition. A series of experiments has been performed which shows that the thread pattern cut with either method is identical and that over-penetration with the powered tap is easy to control. The conclusion reached is that both methods produce consistently reliable results but use of power tools is much faster.
A Comparison of Predictive Thermo and Water Solvation Property Prediction Tools and Experimental Data for Selected ...
2017-04-01
Measurement of erosion in helicon plasma thrusters using the VASIMR® VX-CR device
NASA Astrophysics Data System (ADS)
Del Valle Gamboa, Juan Ignacio; Castro-Nieto, Jose; Squire, Jared; Carter, Mark; Chang-Diaz, Franklin
2015-09-01
The helicon plasma source is one of the principal stages of the high-power VASIMR® electric propulsion system. The VASIMR® VX-CR experiment focuses solely on this stage, exploring the erosion and long-term operation effects of the VASIMR helicon source. We report on the design and operational parameters of the VX-CR experiment, and the development of modeling tools and characterization techniques allowing the study of erosion phenomena in helicon plasma sources in general, and stand-alone helicon plasma thrusters (HPTs) in particular. A thorough understanding of the erosion phenomena within HPTs will enable better predictions of their behavior as well as more accurate estimations of their expected lifetime. We present a simplified model of the plasma-wall interactions within HPTs based on current models of the plasma density distributions in helicon discharges. Results from this modeling tool are used to predict the erosion within the plasma-facing components of the VX-CR device. Experimental techniques to measure actual erosion, including the use of coordinate-measuring machines and microscopy, will be discussed.
Space station interior noise analysis program
NASA Technical Reports Server (NTRS)
Stusnick, E.; Burn, M.
1987-01-01
Documentation is provided for a microcomputer program which was developed to evaluate the effect of the vibroacoustic environment on speech communication inside a space station. The program, entitled Space Station Interior Noise Analysis Program (SSINAP), combines a Statistical Energy Analysis (SEA) prediction of sound and vibration levels within the space station with a speech intelligibility model based on the Modulation Transfer Function and the Speech Transmission Index (MTF/STI). The SEA model provides an effective analysis tool for predicting the acoustic environment based on proposed space station designs. The MTF/STI model provides a method for evaluating speech communication in the relatively reverberant and potentially noisy environments that are likely to occur in space stations. The combination of these two models provides a powerful analysis tool for optimizing the acoustic design of space stations from the point of view of speech communications. The mathematical algorithms used in SSINAP to implement the SEA and MTF/STI models are presented. An appendix provides an explanation of the operation of the program along with details of the program structure and code.
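A simplified sketch of the MTF-to-STI chain used by such a model; the full IEC procedure adds octave-band weighting omitted here, and the modulation values below are assumed for illustration.

    import numpy as np

    def sti_from_mtf(m):
        # Map modulation transfer values m to an apparent signal-to-noise
        # ratio, clip to the perceptually useful +/- 15 dB range, and
        # average the resulting 0..1 transmission indices.
        m = np.clip(np.asarray(m, dtype=float), 1e-6, 1 - 1e-6)
        snr = 10.0 * np.log10(m / (1.0 - m))
        snr = np.clip(snr, -15.0, 15.0)
        return np.mean((snr + 15.0) / 30.0)

    # A reverberant, noisy module reduces modulation depth, lowering the STI.
    print(sti_from_mtf([0.9, 0.8, 0.7, 0.55, 0.4]))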
Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.
Gong, Xiajing; Hu, Meng; Zhao, Liang
2018-05-01
Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
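Since the concordance index is the yardstick here, a self-contained sketch of Harrell's C on synthetic censored data (numpy only; all simulation settings are assumed):

    import numpy as np

    def concordance_index(time, event, risk):
        # Among comparable pairs (the earlier time is an observed event),
        # count the fraction where the higher predicted risk belongs to
        # the earlier failure; ties in risk count 0.5.
        n_conc = n_comp = 0.0
        n = len(time)
        for i in range(n):
            for j in range(n):
                if event[i] == 1 and time[i] < time[j]:
                    n_comp += 1
                    if risk[i] > risk[j]:
                        n_conc += 1
                    elif risk[i] == risk[j]:
                        n_conc += 0.5
        return n_conc / n_comp

    rng = np.random.default_rng(2)
    x = rng.normal(size=200)
    t = rng.exponential(scale=np.exp(-x))  # higher x => higher hazard
    e = (rng.uniform(size=200) < 0.7).astype(int)  # ~30% random censoring
    print(concordance_index(t, e, risk=x))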
Phonon-tunnelling dissipation in mechanical resonators
Cole, Garrett D.; Wilson-Rae, Ignacio; Werbach, Katharina; Vanner, Michael R.; Aspelmeyer, Markus
2011-01-01
Microscale and nanoscale mechanical resonators have recently emerged as ubiquitous devices for use in advanced technological applications, for example, in mobile communications and inertial sensors, and as novel tools for fundamental scientific endeavours. Their performance is in many cases limited by the deleterious effects of mechanical damping. In this study, we report a significant advancement towards understanding and controlling support-induced losses in generic mechanical resonators. We begin by introducing an efficient numerical solver, based on the 'phonon-tunnelling' approach, capable of predicting the design-limited damping of high-quality mechanical resonators. Further, through careful device engineering, we isolate support-induced losses and perform a rigorous experimental test of the strong geometric dependence of this loss mechanism. Our results are in excellent agreement with the theory, demonstrating the predictive power of our approach. In combination with recent progress on complementary dissipation mechanisms, our phonon-tunnelling solver represents a major step towards accurate prediction of the mechanical quality factor. PMID:21407197
An elastic-plastic contact model for line contact structures
NASA Astrophysics Data System (ADS)
Zhu, Haibin; Zhao, Yingtao; He, Zhifeng; Zhang, Ruinan; Ma, Shaopeng
2018-06-01
Although numerical simulation tools are now very powerful, the development of analytical models remains important for predicting the mechanical behaviour of line contact structures, both for a deeper understanding of contact problems and for engineering applications. For the line contact structures widely used in the engineering field, few analytical models are available for predicting the mechanical behaviour when the structures deform plastically, as classic Hertzian theory is then invalid. Thus, the present study proposed an elastic-plastic model for line contact structures based on an understanding of the yield mechanism. A mathematical expression describing the global relationship between load history and contact width evolution of line contact structures was obtained. The proposed model was verified through an actual line contact test and a corresponding numerical simulation. The results confirmed that this model can be used to accurately predict the elastic-plastic mechanical behaviour of a line contact structure.
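For reference, the elastic (Hertzian) baseline that an elastic-plastic line contact model must reduce to is sketched below; the first-yield rule of thumb p0 ≈ 1.67 σy is an assumed textbook criterion, not the paper's, and the load and material values are illustrative only.

    import numpy as np

    def hertz_line_contact(w, R, E1, nu1, E2, nu2):
        # Elastic line contact: half-width b and peak pressure p0 for a
        # load per unit length w and effective contact radius R.
        E_star = 1.0 / ((1 - nu1 ** 2) / E1 + (1 - nu2 ** 2) / E2)
        b = np.sqrt(4.0 * w * R / (np.pi * E_star))
        p0 = 2.0 * w / (np.pi * b)
        return b, p0

    w, R = 2.0e5, 0.01  # N/m and m (assumed: steel cylinder on a steel flat)
    b, p0 = hertz_line_contact(w, R, 210e9, 0.3, 210e9, 0.3)

    # Crude elastic-plastic check: beyond roughly p0 = 1.67 * sigma_y the
    # Hertz solution is invalid, which is where elastic-plastic models
    # such as the one proposed in the paper take over.
    sigma_y = 400e6
    print(f"b = {b * 1e6:.1f} um, p0 = {p0 / 1e9:.2f} GPa, elastic: {p0 < 1.67 * sigma_y}")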
Interfacial charge transfer absorption: Application to metal-molecule assemblies
NASA Astrophysics Data System (ADS)
Creutz, Carol; Brunschwig, Bruce S.; Sutin, Norman
2006-05-01
Optically induced charge transfer between adsorbed molecules and a metal electrode was predicted by Hush to lead to new electronic absorption features, but has been only rarely observed experimentally. Interfacial charge transfer absorption (IFCTA) provides information concerning the barriers to charge transfer between molecules and the metal/semiconductor and the magnitude of the electronic coupling and could thus provide a powerful tool for understanding interfacial charge-transfer kinetics. Here, we utilize a previously published model [C. Creutz, B.S. Brunschwig, N. Sutin, J. Phys. Chem. B 109 (2005) 10251] to predict IFCTA spectra of metal-molecule assemblies and compare the literature observations to these predictions. We conclude that, in general, the electronic coupling between molecular adsorbates and the metal levels is so small that IFCTA is not detectable. However, few experiments designed to detect IFCTA have been done. We suggest approaches to optimizing the conditions for observing the process.
Design and Development of ChemInfoCloud: An Integrated Cloud Enabled Platform for Virtual Screening.
Karthikeyan, Muthukumarasamy; Pandit, Deepak; Bhavasar, Arvind; Vyas, Renu
2015-01-01
The power of cloud computing and distributed computing has been harnessed to handle the vast and heterogeneous data required to be processed in any virtual screening protocol. A cloud computing platform, ChemInfoCloud, was built and integrated with several chemoinformatics and bioinformatics tools. The robust engine performs the core chemoinformatics tasks of lead generation, lead optimisation and property prediction in a fast and efficient manner. It has also been provided with bioinformatics functionalities including sequence alignment, active site pose prediction and protein-ligand docking. Text mining, NMR chemical shift (1H, 13C) prediction and reaction fingerprint generation modules for efficient lead discovery are also implemented in this platform. We have developed an integrated problem solving cloud environment for virtual screening studies that also provides workflow management, better usability and interaction with end users using container based virtualization, OpenVz.
Ando, Tatsuya; Suguro, Miyuki; Hanai, Taizo; Kobayashi, Takeshi; Seto, Masao
2002-01-01
Diffuse large B‐cell lymphoma (DLBCL) is the largest category of aggressive lymphomas. Less than 50% of patients can be cured by combination chemotherapy. Microarray technologies have recently shown that the response to chemotherapy reflects the molecular heterogeneity in DLBCL. On the basis of published microarray data, we attempted to develop a long‐overdue method for the precise and simple prediction of survival of DLBCL patients. We developed a fuzzy neural network (FNN) model to analyze gene expression profiling data for DLBCL. From data on 5857 genes, this model identified four genes (CD10, AA807551, AA805611 and IRF‐4) that could be used to predict prognosis with 93% accuracy. FNNs are powerful tools for extracting significant biological markers affecting prognosis, and are applicable to various kinds of expression profiling data for any malignancy. PMID:12460461
2010-01-01
Atomistic Molecular Dynamics provides powerful and flexible tools for the prediction and analysis of molecular and macromolecular systems. Specifically, it provides a means by which we can measure theoretically that which cannot be measured experimentally: the dynamic time-evolution of complex systems comprising atoms and molecules. It is particularly suitable for the simulation and analysis of the otherwise inaccessible details of MHC-peptide interaction and, on a larger scale, the simulation of the immune synapse. Progress has been relatively tentative yet the emergence of truly high-performance computing and the development of coarse-grained simulation now offers us the hope of accurately predicting thermodynamic parameters and of simulating not merely a handful of proteins but larger, longer simulations comprising thousands of protein molecules and the cellular scale structures they form. We exemplify this within the context of immunoinformatics. PMID:21067546
Lu, Qiongshi; Hu, Yiming; Sun, Jiehuan; Cheng, Yuwei; Cheung, Kei-Hoi; Zhao, Hongyu
2015-05-27
Identifying functional regions in the human genome is a major goal in human genetics. Great efforts have been made to functionally annotate the human genome either through computational predictions, such as genomic conservation, or high-throughput experiments, such as the ENCODE project. These efforts have resulted in a rich collection of functional annotation data of diverse types that need to be jointly analyzed for integrated interpretation and annotation. Here we present GenoCanyon, a whole-genome annotation method that performs unsupervised statistical learning using 22 computational and experimental annotations thereby inferring the functional potential of each position in the human genome. With GenoCanyon, we are able to predict many of the known functional regions. The ability of predicting functional regions as well as its generalizable statistical framework makes GenoCanyon a unique and powerful tool for whole-genome annotation. The GenoCanyon web server is available at http://genocanyon.med.yale.edu.
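A toy version of this style of unsupervised functional scoring, with a two-component Gaussian mixture standing in for GenoCanyon's statistical model and synthetic annotation tracks in place of the real 22 annotations:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(6)

    # Synthetic stand-in: 22 annotation tracks at 5000 positions, with a
    # minority "functional" component whose annotation means are shifted.
    functional = rng.uniform(size=5000) < 0.1
    X = rng.normal(size=(5000, 22)) + functional[:, None] * 1.5

    # Unsupervised fit; the posterior probability of the enriched
    # component plays the role of a functional-potential score.
    gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
    post = gmm.predict_proba(X)
    func_comp = np.argmax(gmm.means_.mean(axis=1))
    score = post[:, func_comp]
    print(f"mean score, functional: {score[functional].mean():.2f}; "
          f"background: {score[~functional].mean():.2f}")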
Kell, Alexander J E; Yamins, Daniel L K; Shook, Erica N; Norman-Haignere, Sam V; McDermott, Josh H
2018-05-02
A core goal of auditory neuroscience is to build quantitative models that predict cortical responses to natural sounds. Reasoning that a complete model of auditory cortex must solve ecologically relevant tasks, we optimized hierarchical neural networks for speech and music recognition. The best-performing network contained separate music and speech pathways following early shared processing, potentially replicating human cortical organization. The network performed both tasks as well as humans and exhibited human-like errors despite not being optimized to do so, suggesting common constraints on network and human performance. The network predicted fMRI voxel responses substantially better than traditional spectrotemporal filter models throughout auditory cortex. It also provided a quantitative signature of cortical representational hierarchy-primary and non-primary responses were best predicted by intermediate and late network layers, respectively. The results suggest that task optimization provides a powerful set of tools for modeling sensory systems. Copyright © 2018 Elsevier Inc. All rights reserved.
Bevelhimer, Mark S.; DeRolph, Christopher R.; Schramm, Michael P.
2016-06-06
Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multidisciplinary explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements have been a result of a range of factors, from biological and hydrological to political and cultural. Furthermore, project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation.
A global model for steady state and transient S.I. engine heat transfer studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bohac, S.V.; Assanis, D.N.; Baker, D.M.
1996-09-01
A global, systems-level model which characterizes the thermal behavior of internal combustion engines is described in this paper. Based on resistor-capacitor thermal networks, either steady-state or transient thermal simulations can be performed. A two-zone, quasi-dimensional spark-ignition engine simulation is used to determine in-cylinder gas temperature and convection coefficients. Engine heat fluxes and component temperatures can subsequently be predicted from specification of general engine dimensions, materials, and operating conditions. Emphasis has been placed on minimizing the number of model inputs and keeping them as simple as possible to make the model practical and useful as an early design tool. The success of the global model depends on properly scaling the general engine inputs to accurately model engine heat flow paths across families of engine designs. The development and validation of suitable, scalable submodels is described in detail in this paper. Simulation sub-models and overall system predictions are validated with data from two spark ignition engines. Several sensitivity studies are performed to determine the most significant heat transfer paths within the engine and exhaust system. Overall, it has been shown that the model is a powerful tool in predicting steady-state heat rejection and component temperatures, as well as transient component temperatures.
DeRolph, Christopher R; Schramm, Michael P; Bevelhimer, Mark S
2016-10-01
Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multi-faceted explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements are functions of a range of factors, from biophysical to socio-political. Project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation. Copyright © 2016 Elsevier B.V. All rights reserved.
Kaserer, Teresa; Temml, Veronika; Kutil, Zsofia; Vanek, Tomas; Landa, Premysl; Schuster, Daniela
2015-01-01
Computational methods can be applied in drug development for the identification of novel lead candidates, but also for the prediction of pharmacokinetic properties and potential adverse effects, thereby helping to prioritize and identify the most promising compounds. In principle, several techniques are available for this purpose; however, which one is the most suitable for a specific research objective still requires further investigation. Within this study, the performance of several programs, representing common virtual screening methods, was compared in a prospective manner. First, we selected top-ranked virtual screening hits from the three methods pharmacophore modeling, shape-based modeling, and docking. For comparison, these hits were then additionally predicted by external pharmacophore- and 2D similarity-based bioactivity profiling tools. Subsequently, the biological activities of the selected hits were assessed in vitro, which allowed for evaluating and comparing the prospective performance of the applied tools. Although all methods performed well, considerable differences were observed concerning hit rates, true positive and true negative hits, and hitlist composition. Our results suggest that a rational selection of the applied method represents a powerful strategy to maximize the success of a research project, tightly linked to its aims. We employed cyclooxygenase as an application example; however, the focus of this study lay on highlighting the differences in the virtual screening tool performances and not on the identification of novel COX inhibitors. Copyright © 2015 The Authors. Published by Elsevier Masson SAS. All rights reserved.
How Not To Drown in Data: A Guide for Biomaterial Engineers.
Vasilevich, Aliaksei S; Carlier, Aurélie; de Boer, Jan; Singh, Shantanu
2017-08-01
High-throughput assays that produce hundreds of measurements per sample are powerful tools for quantifying cell-material interactions. With advances in automation and miniaturization in material fabrication, hundreds of biomaterial samples can be rapidly produced, which can then be characterized using these assays. However, the resulting deluge of data can be overwhelming. To the rescue are computational methods that are well suited to these problems. Machine learning techniques provide a vast array of tools to make predictions about cell-material interactions and to find patterns in cellular responses. Computational simulations allow researchers to pose and test hypotheses and perform experiments in silico. This review describes approaches from these two domains that can be brought to bear on the problem of analyzing biomaterial screening data. Copyright © 2017 Elsevier Ltd. All rights reserved.
Using prediction markets to estimate the reproducibility of scientific research.
Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus
2015-12-15
Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
Using prediction markets to estimate the reproducibility of scientific research
Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus
2015-01-01
Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988
Dissolved oxygen content prediction in crab culture using a hybrid intelligent method
Yu, Huihui; Chen, Yingyi; Hassan, ShahbazGul; Li, Daoliang
2016-01-01
A precise predictive model is needed to obtain a clear understanding of the changing dissolved oxygen content in outdoor crab ponds, to assess how to reduce risk and to optimize water quality management. The uncertainties in the data from multiple sensors are a significant factor when building a dissolved oxygen content prediction model. To increase prediction accuracy, a new hybrid dissolved oxygen content forecasting model based on the radial basis function neural networks (RBFNN) data fusion method and a least squares support vector machine (LSSVM) with an optimal improved particle swarm optimization (IPSO) is developed. In the modelling process, the RBFNN data fusion method is used to improve information accuracy and provide more trustworthy training samples for the IPSO-LSSVM prediction model. The LSSVM is a powerful tool for achieving nonlinear dissolved oxygen content forecasting. In addition, an improved particle swarm optimization algorithm is developed to determine the optimal parameters for the LSSVM with high accuracy and generalizability. In this study, the comparison of the prediction results of different traditional models validates the effectiveness and accuracy of the proposed hybrid RBFNN-IPSO-LSSVM model for dissolved oxygen content prediction in outdoor crab ponds. PMID:27270206
Dissolved oxygen content prediction in crab culture using a hybrid intelligent method.
Yu, Huihui; Chen, Yingyi; Hassan, ShahbazGul; Li, Daoliang
2016-06-08
A precise predictive model is needed to obtain a clear understanding of the changing dissolved oxygen content in outdoor crab ponds, to assess how to reduce risk and to optimize water quality management. The uncertainties in the data from multiple sensors are a significant factor when building a dissolved oxygen content prediction model. To increase prediction accuracy, a new hybrid dissolved oxygen content forecasting model based on the radial basis function neural networks (RBFNN) data fusion method and a least squares support vector machine (LSSVM) with an optimal improved particle swarm optimization (IPSO) is developed. In the modelling process, the RBFNN data fusion method is used to improve information accuracy and provide more trustworthy training samples for the IPSO-LSSVM prediction model. The LSSVM is a powerful tool for achieving nonlinear dissolved oxygen content forecasting. In addition, an improved particle swarm optimization algorithm is developed to determine the optimal parameters for the LSSVM with high accuracy and generalizability. In this study, the comparison of the prediction results of different traditional models validates the effectiveness and accuracy of the proposed hybrid RBFNN-IPSO-LSSVM model for dissolved oxygen content prediction in outdoor crab ponds.
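A rough sketch of swarm-tuned kernel regression in this spirit, with an RBF-kernel SVR from scikit-learn standing in for the LSSVM and a plain particle swarm rather than the paper's improved IPSO; the data are synthetic stand-ins for fused sensor features.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    X = rng.normal(size=(300, 5))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=300)

    def fitness(log_c, log_g):
        model = SVR(kernel="rbf", C=10.0 ** log_c, gamma=10.0 ** log_g)
        return cross_val_score(model, X, y, cv=3,
                               scoring="neg_mean_squared_error").mean()

    # Minimal particle swarm over (log10 C, log10 gamma).
    n_particles, n_iter = 10, 15
    pos = rng.uniform([-1.0, -3.0], [3.0, 0.0], size=(n_particles, 2))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([fitness(*p) for p in pos])
    gbest = pbest[np.argmax(pbest_f)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.uniform(size=(2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([fitness(*p) for p in pos])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[np.argmax(pbest_f)].copy()

    print("best (log10 C, log10 gamma):", gbest)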
Ma, Zhao; Yang, Yong; Lin, JiSheng; Zhang, XiaoDong; Meng, Qian; Wang, BingQiang; Fei, Qi
2016-01-01
Purpose To develop a simple new clinical screening tool to identify primary osteoporosis by dual-energy X-ray absorptiometry (DXA) in postmenopausal women and to compare its validity with the Osteoporosis Self-Assessment Tool for Asians (OSTA) in a Han Chinese population. Methods A cross-sectional study was conducted, enrolling 1,721 community-dwelling postmenopausal Han Chinese women. All the subjects completed a structured questionnaire and had their bone mineral density measured using DXA. Using logistic regression analysis, we assessed the ability of numerous potential risk factors examined in the questionnaire to identify women with osteoporosis. Based on this analysis, we built a new predictive model, the Beijing Friendship Hospital Osteoporosis Self-Assessment Tool (BFH-OST). Receiver operating characteristic curves were generated to compare the validity of the new model and OSTA in identifying postmenopausal women at increased risk of primary osteoporosis as defined according to the World Health Organization criteria. Results At screening, it was found that of the 1,721 subjects with DXA, 22.66% had osteoporosis and a further 47.36% had osteopenia. Of the items screened in the questionnaire, it was found that age, weight, height, body mass index, personal history of fracture after the age of 45 years, history of fragility fracture in either parent, current smoking, and consumption of three or more alcoholic drinks per day were all predictive of osteoporosis. However, age at menarche and menopause, years since menopause, and number of pregnancies and live births were irrelevant in this study. The logistic regression analysis and item reduction yielded a final tool (BFH-OST) based on age, body weight, height, and history of fracture after the age of 45 years. The BFH-OST index (cutoff =9.1), which performed better than OSTA, had a sensitivity of 73.6% and a specificity of 72.7% for identifying osteoporosis, with an area under the receiver operating characteristic curve of 0.797. Conclusion BFH-OST may be a powerful and cost-effective new clinical risk assessment tool for prescreening postmenopausal women at increased risk for osteoporosis by DXA, especially for Han Chinese women. PMID:27536085
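The modelling recipe above (logistic regression on a handful of clinical predictors, then a cutoff chosen from the ROC curve) can be sketched as follows; all data are synthetic stand-ins whose effect directions loosely mimic the reported risk factors, not the study's cohort.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(4)
    n = 1000
    age = rng.normal(65, 8, n)
    weight = rng.normal(62, 10, n)
    height = rng.normal(158, 6, n)
    prior_fracture = rng.binomial(1, 0.2, n)

    # Synthetic outcome: older, lighter, prior fracture => higher risk.
    logit = 0.08 * (age - 65) - 0.07 * (weight - 62) + 0.9 * prior_fracture - 1.2
    osteoporosis = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = np.column_stack([age, weight, height, prior_fracture])
    clf = LogisticRegression(max_iter=1000).fit(X, osteoporosis)
    prob = clf.predict_proba(X)[:, 1]
    print(f"AUC = {roc_auc_score(osteoporosis, prob):.3f}")

    # Choose a screening cutoff from the ROC curve (Youden's J), as done
    # when fixing the BFH-OST index threshold.
    fpr, tpr, thr = roc_curve(osteoporosis, prob)
    best = np.argmax(tpr - fpr)
    print(f"cutoff p = {thr[best]:.2f}, sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f}")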
VLSI design of lossless frame recompression using multi-orientation prediction
NASA Astrophysics Data System (ADS)
Lee, Yu-Hsuan; You, Yi-Lun; Chen, Yi-Guo
2016-01-01
The pursuit of high-end visual quality drives demand for higher display resolutions and higher frame rates. Hence, many powerful coding tools are aggregated in emerging video coding standards to improve coding efficiency. This also leaves video coding standards with two design challenges: heavy computation and tremendous memory bandwidth. The first issue can be properly solved by careful hardware architecture design with advanced semiconductor processes. Nevertheless, the second has become a critical design bottleneck for a modern video coding system. In this article, a lossless frame recompression technique using multi-orientation prediction is proposed to overcome this bottleneck. This work is realized as a silicon chip in a TSMC 0.18 µm CMOS process. Its encoding capability reaches full-HD (1920 × 1080)@48 fps. The chip power consumption is 17.31 mW@100 MHz. Core area and chip area are 0.83 × 0.83 mm2 and 1.20 × 1.20 mm2, respectively. Experimental results demonstrate that this work exhibits outstanding lossless compression ratios with competitive hardware performance.
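To make the prediction step concrete, here is a toy software rendition of multi-orientation prediction: a block is predicted from its left, top, or top-left neighbors, the orientation with the lowest residual energy is selected, and the residuals would then be entropy-coded (not shown). This illustrates the general technique, not the chip's actual datapath.

```python
import numpy as np

def predict_block(block):
    """Return the best-orientation residual block and the chosen mode."""
    left = np.pad(block, ((0, 0), (1, 0)), mode="edge")[:, :-1]   # horizontal
    top  = np.pad(block, ((1, 0), (0, 0)), mode="edge")[:-1, :]   # vertical
    diag = np.pad(block, ((1, 0), (1, 0)), mode="edge")[:-1, :-1] # diagonal
    residuals = {m: block.astype(int) - p
                 for m, p in {"H": left, "V": top, "D": diag}.items()}
    mode = min(residuals, key=lambda m: np.abs(residuals[m]).sum())
    return residuals[mode], mode

frame = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
res, mode = predict_block(frame)
print(mode, np.abs(res).sum())  # lower residual energy -> better compression
```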
Applicability of internet search index for asthma admission forecast using machine learning.
Luo, Li; Liao, Chengcheng; Zhang, Fengyi; Zhang, Wei; Li, Chunyang; Qiu, Zhixin; Huang, Debin
2018-04-15
This study aimed to determine whether a search index could provide insight into trends in asthma admissions in China. An Internet search index is a powerful tool to monitor and predict epidemic outbreaks, but whether it can significantly improve asthma admission forecasts remains unknown. The long-term goal is to develop a surveillance system to support early detection of and intervention for asthma and to avoid asthma health care resource shortages in advance. In this study, we used a search index combined with air pollution data, weather data, and historical admissions data to forecast asthma admissions using machine learning. The best area under the curve achieved in the test set was 0.832, using all of the predictors mentioned above. A search index is a powerful predictor in asthma admission forecasting, and a recent search index can reflect current asthma admissions with a certain lag effect. The addition of a real-time, easily accessible search index improves forecasting capability and demonstrates the predictive potential of search indices. Copyright © 2018 John Wiley & Sons, Ltd.
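A minimal sketch of this kind of forecasting setup follows: a gradient-boosted classifier scored by AUC on synthetic stand-ins for the search-index, pollution, weather, and lagged-admission features. The feature set and data are assumptions for illustration, not the study's dataset.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([
    rng.normal(size=n),   # asthma search-index (lagged)
    rng.normal(size=n),   # PM2.5
    rng.normal(size=n),   # temperature
    rng.normal(size=n),   # admissions, previous week
])
y = (X @ [1.0, 0.6, -0.4, 0.8] + rng.normal(scale=1.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("test AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```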
Light Water Reactor Sustainability Program: Survey of Models for Concrete Degradation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin W.; Huang, Hai
Concrete is widely used in the construction of nuclear facilities because of its structural strength and its ability to shield radiation. The use of concrete in nuclear facilities for containment and shielding of radiation and radioactive materials has made its performance crucial for the safe operation of the facility. As such, when life extension is considered for nuclear power plants, it is critical to have predictive tools to address concerns related to aging processes of concrete structures and the capacity of structures subjected to age-related degradation. The goal of this report is to review and document the main aging mechanisms of concern for concrete structures in nuclear power plants (NPPs) and the models used in simulations of concrete aging and the structural response of degraded concrete structures. This is in preparation for future work to develop and apply models for aging processes and the response of aged NPP concrete structures in the Grizzly code. To that end, this report also provides recommendations for developing more robust predictive models for aging effects on the performance of concrete.
Liu, Feng; Liu, Wenhui; Tian, Shuge
2014-09-01
A combination of an orthogonal L16(4^4) test design and a three-layer artificial neural network (ANN) model was applied to optimize the hot-water extraction of polysaccharides from Althaea rosea seeds. The highest optimal experimental yield of A. rosea seed polysaccharides (ARSPs), 59.85 mg/g, was obtained using three extractions, a 113 min extraction time, 60.0% ethanol concentration, and a 1:41 solid-liquid ratio. Under these optimized conditions, the ARSP experimental yield was very close to the predicted yield of 60.07 mg/g and was higher than the orthogonal test result (40.86 mg/g). Structural characterization was conducted using physicochemical property and FTIR analyses. In addition, the study of ARSP antioxidant activity demonstrated that the polysaccharides exhibited high superoxide dismutase activity, strong reducing power, and positive scavenging activity against superoxide anion, hydroxyl radical, and 2,2-diphenyl-1-picrylhydrazyl. Our results indicated that ANNs are efficient quantitative tools for predicting total ARSP content. Copyright © 2014 Elsevier B.V. All rights reserved.
Performance simulation of a grid connected photovoltaic power system using TRNSYS 17
NASA Astrophysics Data System (ADS)
Raja Sekhar, Y.; Ganesh, D.; Kumar, A. Suresh; Abraham, Raju; Padmanathan, P.
2017-11-01
Energy plays an important role in a country's economic growth. In the current energy scenario, the major problem is that non-renewable energy sources are being depleted faster than they are formed. One of the prominent solutions is to minimize the use of fossil fuels by utilizing renewable energy resources. A photovoltaic system is an efficient option for utilizing the solar energy resource. The electricity output produced by photovoltaic systems depends upon the incident solar radiation. This paper examines the performance simulation of a 200 kW photovoltaic power system at VIT University, Vellore. The main objective of this paper is to correlate the predicted simulation data with the experimental data. The simulation tool used here is TRNSYS. Using TRNSYS modelling, the electricity produced throughout the year can be predicted with the help of the TRNSYS weather station. The simulated results deviate from the experimental results depending on the choice of weather station. Results from the field test and the simulation are to be correlated to attain the maximum performance of the system.
Very-short-term wind power prediction by a hybrid model with single- and multi-step approaches
NASA Astrophysics Data System (ADS)
Mohammed, E.; Wang, S.; Yu, J.
2017-05-01
Very-short-term wind power prediction (VSTWPP) plays an essential role in the operation of electric power systems. This paper aims at improving and applying a hybrid method for VSTWPP based on historical data. The hybrid method combines multiple linear regression and least squares (MLR&LS) and is intended to reduce prediction errors. The predicted values are obtained through two sub-processes: 1) transform the time-series data of actual wind power into a power ratio, and then predict the power ratio; 2) use the predicted power ratio to predict the wind power. The proposed method includes two prediction approaches: single-step prediction (SSP) and multi-step prediction (MSP). The method is tested against an auto-regressive moving average (ARMA) model by comparing predicted values and errors. The validity of the proposed hybrid method is confirmed by error analysis using the probability density function (PDF), mean absolute percent error (MAPE), and mean square error (MSE). Meanwhile, comparison of the correlation coefficients between the actual and predicted values for different prediction times and windows confirms that the MSP approach using the hybrid model is the most accurate compared with the SSP approach and ARMA. The MLR&LS method is accurate and promising for solving problems in wind power prediction.
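The two-stage idea can be sketched as follows, assuming the power ratio is actual power divided by rated capacity (the paper's exact ratio definition is not reproduced here): a least-squares linear autoregression predicts the next ratio, which is then mapped back to wind power. Iterating the predictor on its own outputs would give the multi-step (MSP) variant.

```python
import numpy as np

rng = np.random.default_rng(0)
rated = 50.0                                   # MW, hypothetical rated capacity
power = 25 + 10 * np.sin(np.arange(300) / 15) + rng.normal(size=300)
ratio = power / rated                          # stage 1: power-ratio series

lags = 4
X = np.column_stack([ratio[i:len(ratio) - lags + i] for i in range(lags)])
y = ratio[lags:]
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]), y, rcond=None)

def predict_next(history):                     # single-step prediction (SSP)
    return coef[0] + history[-lags:] @ coef[1:]

r_hat = predict_next(ratio)                    # stage 2: map back to power
print("next-step power forecast:", r_hat * rated, "MW")
```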
Impact of design features upon perceived tool usability and safety
NASA Astrophysics Data System (ADS)
Wiker, Steven F.; Seol, Mun-Su
2005-11-01
While injuries from powered hand tools are caused by a number of factors, this study looks specifically at the impact of a tool's design features on perceived tool usability and safety. The tools used in this study were circular saws, power drills, and power nailers. Sixty-nine males and thirty-two females completed an anonymous web-based questionnaire that provided orthogonal-view photographs of the various tools. The analysis comprised: 1) a description of the respondents or raters, 2) a description of the raters' responses, and 3) an analysis of the interrelationships among respondent ratings of tool safety and usability, physical metrics of the tools, and rater demographic information. The study found that perceived safety and usability depended materially upon rater history of use and experience, but not upon training in safety and usability or the quality of the design features of the tools (e.g., grip diameters, trigger design, guards, etc.). Thus, positive and negative transfer of prior experience with powered hand tools is far more important than any expectancy driven by prior safety and usability training or by the visual cues provided by the engineering design of the tool.
Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim
2013-08-01
A huge number of risk assessment tools have been developed, but far from all have been validated in external studies, many lack methodological and transparent evidence, and few are integrated into national guidelines. We therefore performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for the prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use and to examine whether the complexity of the tools influenced their discriminative power. We searched the PubMed, Embase, and Cochrane databases and evaluated the retrieved papers for methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated, but only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well as or better than more complex tools (i.e., Simple Calculated Risk Estimation Score [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and Qfracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies with randomized designs and population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.
2012-03-01
In a study focused on Baltimore, MD, researchers have found that data culled from Google Flu Trends, a free Internet-based influenza surveillance system, shows strong correlation with hikes in ED visits from patients with flu-like symptoms. While the approach has yet to be validated in other cities or regions, experts recommend that ED administrators and providers familiarize themselves with the new surveillance tool and stay abreast of developments regarding similar surveillance mechanisms. Google Flu Trends (www.google.org/flutrends/) is a free Internet-based tool that monitors Internet-based searches for flu information. Users can customize their search by location (city, state, country). Researchers say the advantage of this approach over traditional surveillance methods is that it provides real-time data about flu-related activity in a city or region. Traditional approaches, which rely on case reports from the Centers for Disease Control and Prevention, are delayed. Researchers hope to eventually leverage this tool, and perhaps other surveillance data, into a powerful early-warning mechanism that EDs can use to better plan for patient surges due to influenza.
Zhang, X L; Su, G F; Yuan, H Y; Chen, J G; Huang, Q Y
2014-09-15
Atmospheric dispersion models play an important role in nuclear power plant accident management. A reliable estimate of the radioactive material distribution at short range (about 50 km) is urgently needed for population sheltering and evacuation planning. However, the meteorological data and the source term, which greatly influence the accuracy of atmospheric dispersion models, are usually poorly known in the early phase of an emergency. In this study, a modified ensemble Kalman filter data assimilation method, in conjunction with a Lagrangian puff model, is proposed to simultaneously improve the model prediction and reconstruct the source term for short-range atmospheric dispersion using off-site environmental monitoring data. Four main uncertain parameters are considered: source release rate, plume rise height, wind speed, and wind direction. Twin experiments show that the method effectively improves the predicted concentration distribution, and the temporal profiles of source release rate and plume rise height are also successfully reconstructed. Moreover, the time lag in the response of the ensemble Kalman filter is shortened. The method proposed here can be a useful tool not only in nuclear power plant accident emergency management but also in other similar situations where hazardous material is released into the atmosphere. Copyright © 2014 Elsevier B.V. All rights reserved.
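For concreteness, the sketch below implements a generic perturbed-observation EnKF analysis step of the kind such methods build on, with a four-parameter state standing in for release rate, plume rise, wind speed, and direction. The observation operator and all dimensions are illustrative, not the paper's configuration.

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """X: (n_state, n_ens) forecast ensemble; y: observations; H: obs operator."""
    n_obs, n_ens = len(y), X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)
    P_yy = HA @ HA.T / (n_ens - 1) + R           # innovation covariance
    P_xy = A @ HA.T / (n_ens - 1)                # state-obs cross covariance
    K = P_xy @ np.linalg.inv(P_yy)               # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    return X + K @ (Y - HX)                      # perturbed-obs analysis

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 100))                    # 4 uncertain parameters, 100 members
H = np.array([[1.0, 0.5, 0.0, 0.0]])             # toy mapping to one dose-rate obs
R = np.array([[0.1]])
X_a = enkf_update(X, np.array([1.2]), H, R, rng)
print("analysis mean:", X_a.mean(axis=1))
```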
NASA Astrophysics Data System (ADS)
Sbarufatti, Claudio; Corbetta, Matteo; Giglio, Marco; Cadini, Francesco
2017-03-01
Lithium-Ion rechargeable batteries are widespread power sources with applications in consumer electronics, electrical vehicles, unmanned aerial and spatial vehicles, etc. Failure to supply the required power levels may lead to severe safety and economic consequences. Thus, in view of the implementation of adequate maintenance strategies, the development of diagnostic and prognostic tools for monitoring the state of health of batteries and predicting their remaining useful life is becoming a crucial task. Here, we propose a method for predicting the end of discharge of Li-Ion batteries, which stems from the combination of particle filters with radial basis function neural networks. The major innovation lies in the fact that the radial basis function model is adaptively trained on-line, i.e., its parameters are identified in real time by the particle filter as new observations of the battery terminal voltage become available. By doing so, the prognostic algorithm achieves the flexibility needed to provide sound end-of-discharge time predictions as the charge-discharge cycles progress, even in the presence of anomalous behaviors due to failures or unforeseen operating conditions. The method is demonstrated with reference to actual Li-Ion battery discharge data contained in the prognostics data repository of the NASA Ames Research Center.
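A drastically simplified sketch of the core idea follows: a particle filter identifies the weights of a small RBF network on-line as terminal-voltage observations arrive. The fixed centers and widths, the noise levels, and the toy voltage stream are all assumptions made for this illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
centers = np.linspace(0, 1, 5)          # fixed RBF centers over normalized time
width = 0.15                            # fixed, hypothetical kernel width

def rbf(t):
    """RBF basis functions evaluated at normalized discharge time t."""
    return np.exp(-(t - centers) ** 2 / (2 * width ** 2))

n_p = 500
particles = rng.normal(2.5, 0.3, size=(n_p, 5))   # particles over RBF weights
w = np.ones(n_p) / n_p
sigma_v = 0.3                           # assumed voltage measurement noise (V)

# Toy stream of (normalized time, terminal voltage) observations.
for t, v_obs in [(0.1, 4.0), (0.3, 3.8), (0.5, 3.6)]:
    particles += rng.normal(0, 0.02, particles.shape)  # random-walk evolution
    v_pred = particles @ rbf(t)                        # each particle's voltage
    w *= np.exp(-0.5 * ((v_obs - v_pred) / sigma_v) ** 2)
    w /= w.sum()
    idx = rng.choice(n_p, n_p, p=w)                    # multinomial resampling
    particles, w = particles[idx], np.ones(n_p) / n_p

# Extrapolate the identified model forward, e.g. toward end of discharge.
print("voltage estimate at t=0.6:", float((particles @ rbf(0.6)).mean()))
```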
Analyses of Field Test Data at the Atucha-1 Spent Fuel Pools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sitaraman, S.
A field test was conducted at the Atucha-1 spent nuclear fuel pools to validate a software package for gross defect detection that is used in conjunction with the inspection tool, the Spent Fuel Neutron Counter (SFNC). A set of measurements was taken with the SFNC, and the software predictions were compared with these data and analyzed. The data spanned a wide range of cooling times and a set of burnup levels, leading to count rates from several hundred down to around twenty per second. The current calibration in the software, which uses linear fitting, required multiple calibration factors to cover the entire range of count rates recorded. The solution was to use power regression data fitting to normalize the predicted response and derive one calibration factor that can be applied to the entire set of data. The resulting comparisons between the predicted and measured responses were generally good and provided a quantitative method of detecting missing fuel in virtually all situations. Since the current version of the software uses the linear calibration method, it would need to be updated with the new power regression method to make it more user-friendly for real-time verification and fieldable for the range of responses that will be encountered.
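The power-regression normalization amounts to fitting count_rate ≈ a·x^b in log-log space so that a single calibration factor spans the full range of responses. A minimal sketch with synthetic stand-in data:

```python
import numpy as np

x = np.array([1.0, 2.0, 5.0, 10.0, 50.0, 200.0])   # predicted response
y = np.array([1.2, 2.3, 5.8, 11.5, 60.0, 250.0])   # measured count rate
b, log_a = np.polyfit(np.log(x), np.log(y), 1)     # slope b, intercept ln(a)
a = np.exp(log_a)
calibrated = a * x ** b                            # one factor for all rates
print(f"y ≈ {a:.3f} * x^{b:.3f}; max rel. error:",
      np.max(np.abs(calibrated - y) / y))
```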
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tournier, J.; El-Genk, M.S.; Huang, L.
1999-01-01
The Institute of Space and Nuclear Power Studies at the University of New Mexico has developed a computer simulation of cylindrical geometry alkali metal thermal-to-electric converter cells using a standard Fortran 77 computer code. The objective and use of this code was to compare the experimental measurements with computer simulations, upgrade the model as appropriate, and conduct investigations of various methods to improve the design and performance of the devices for improved efficiency, durability, and longer operational lifetime. The Institute of Space and Nuclear Power Studies participated in vacuum testing of PX series alkali metal thermal-to-electric converter cells and developed the alkali metal thermal-to-electric converter Performance Evaluation and Analysis Model. This computer model consisted of a sodium pressure loss model, a cell electrochemical and electric model, and a radiation/conduction heat transfer model. The code closely predicted the operation and performance of a wide variety of PX series cells, which led to suggestions for improvements to both lifetime and performance. The code provides valuable insight into the operation of the cell, predicts parameters of components within the cell, and is a useful tool for predicting both the transient and steady state performance of systems of cells.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravishankar, Udhay; Manic, Milos
2013-08-01
This paper presents a micro-grid simulator tool (SGridSim) useful for implementing and testing multi-agent controllers. As a common engineering practice, it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators, such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. The SGridSim micro-grid simulator tool presented here simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of components that typically make up a micro-grid. The term EN2NCI model means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence, the benefits of the presented SGridSim tool are: 1) simulation of a micro-grid is performed strictly in the complex domain; and 2) faster simulation of a micro-grid by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
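A toy illustration of the EN2NCI modeling style: components become single complex impedances between voltage nodes, the grid reduces to a nodal admittance equation Y·V = I, and the solve stays entirely in the complex domain. The three-node network and its values are invented for this example and are not taken from the paper.

```python
import numpy as np

Z = {(0, 1): 0.5 + 1.0j, (1, 2): 0.4 + 0.8j}   # node-to-node impedances (ohm)
n = 3
Y = np.zeros((n, n), dtype=complex)            # nodal admittance matrix
for (i, j), z in Z.items():
    y = 1 / z
    Y[i, i] += y; Y[j, j] += y
    Y[i, j] -= y; Y[j, i] -= y

# Node 0 is the slack source at 1.0 pu; node 2 draws a constant current load.
V0 = 1.0 + 0.0j
I_load = np.array([0.0, 0.0, -0.2 + 0.05j])

# Eliminate the known slack voltage and solve the reduced complex system.
V_rest = np.linalg.solve(Y[1:, 1:], I_load[1:] - Y[1:, 0] * V0)
V = np.concatenate(([V0], V_rest))
print("node voltages:", np.round(V, 4))
print("complex power into node 2:", V[2] * np.conj(I_load[2]))
```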
Prediction of invasion from the early stage of an epidemic
Pérez-Reche, Francisco J.; Neri, Franco M.; Taraskin, Sergei N.; Gilligan, Christopher A.
2012-01-01
Predictability of undesired events is a question of great interest in many scientific disciplines, including seismology, economics, and epidemiology. Here, we focus on the predictability of invasion for a broad class of epidemics caused by diseases that lead to permanent immunity of infected hosts after recovery or death. We approach the problem from the perspective of the science of complexity by proposing and testing several strategies for the estimation of important characteristics of epidemics, such as the probability of invasion. Our results suggest that parsimonious approximate methodologies may lead to the most reliable and robust predictions. The proposed methodologies are first applied to the analysis of experimentally observed epidemics: invasion of the fungal plant pathogen Rhizoctonia solani in replicated host microcosms. We then consider numerical experiments of the susceptible–infected–removed model to investigate the performance of the proposed methods in further detail. The suggested framework can be used as a valuable tool for quick assessment of epidemic threat at the stage when epidemics are only starting to develop. Moreover, our work amplifies the significance of small-scale and finite-time microcosm realizations of epidemics, revealing their predictive power. PMID:22513723
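In the spirit of the numerical SIR experiments mentioned above, the sketch below estimates an invasion probability by repeated stochastic simulation of the embedded SIR jump chain. The parameters and the "invasion" threshold are arbitrary; for R0 = 2 the branching-process approximation gives an invasion probability of 1 − 1/R0 = 0.5.

```python
import numpy as np

def embedded_sir(beta, gamma, N, I0, rng):
    """Simulate the embedded jump chain of a stochastic SIR epidemic;
    return the final number removed."""
    S, I, R = N - I0, I0, 0
    while I > 0:
        rate_inf, rate_rec = beta * S * I / N, gamma * I
        if rng.random() < rate_inf / (rate_inf + rate_rec):
            S -= 1; I += 1          # infection event
        else:
            I -= 1; R += 1          # recovery/removal event
    return R

rng = np.random.default_rng(3)
finals = [embedded_sir(beta=2.0, gamma=1.0, N=200, I0=1, rng=rng)
          for _ in range(2000)]
p_invade = np.mean([r > 20 for r in finals])   # "invasion" = sizable outbreak
print(f"estimated invasion probability: {p_invade:.2f}")
```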
Rae, L S; Vankan, D M; Rand, J S; Flickinger, E A; Ward, L C
2016-06-01
Thirty-five healthy, neutered, mixed-breed dogs were used to determine the ability of multifrequency bioelectrical impedance analysis (MFBIA) to accurately predict fat-free mass (FFM) in dogs, using dual-energy X-ray absorptiometry (DXA)-measured FFM as the reference. A second aim was to compare MFBIA predictions with morphometric predictions. MFBIA-based predictors provided an accurate measure of FFM, within 1.5% of DXA-derived FFM, in normal-weight dogs. FFM estimates were most highly correlated with DXA-measured FFM when the prediction equation included the resistance quotient, bodyweight, and body condition score. At the population level, the inclusion of impedance as a predictor variable did not add substantially to the predictive power achieved with morphometric variables alone; in individual dogs, however, impedance predictors were more valuable than morphometric predictors. These results indicate that, following further validation, MFBIA could provide a useful tool in clinical practice to objectively measure FFM in canine patients and help improve compliance with prevention and treatment programs for obesity in dogs. Copyright © 2016. Published by Elsevier Ltd.
Raharimanantsoa, Mahery; Zingg, Tobias; Thiery, Alicia; Brigand, Cécile; Delhorme, Jean-Baptiste; Romain, Benoît
2017-12-14
Blunt bowel and mesenteric injuries (BBMI) are regularly missed by abdominal computed tomography (CT) scans. The aim of this study was to develop a risk assessment tool for BBMI to help clinicians in decision-making for blunt trauma after road traffic crashes (RTCs). This was a single-center retrospective study of trauma patients from January 2010 to April 2015. All patients admitted to our hospital after blunt trauma following RTCs who underwent a CT scan at admission were assessed. Of the 394 patients included, 78 (19.8%) required surgical exploration and 34 (43.6%) of these had a significant BBMI. Univariate and multivariate analyses were performed comparing patients with BBMI (n = 34) and patients without BBMI (n = 360). A score with a range from 0 to 13 was created. Scores from 8 to 9 were associated with a 5-25% BBMI risk. A score ≥ 8 predicted a surgically significant BBMI with a sensitivity of 96%, a specificity of 86.4%, a positive predictive value (PPV) of 48%, and a negative predictive value (NPV) of 99.4%. This score could be a valuable tool for the management of blunt trauma patients after RTCs without a clear indication for laparotomy but at risk for BBMI. The outcome of this study suggests selective diagnostic laparoscopy for a score ≥ 8 in obtunded patients and ≥ 10 in all others. To assess the value and accuracy of this new score, a prospective validation of these retrospective findings is needed.
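The four reported metrics follow directly from a 2×2 confusion table; a minimal helper, with purely illustrative counts (chosen near the study's marginals, not its actual table):

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening metrics from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

# Hypothetical counts: 33 of 34 BBMI patients above the cutoff.
print(screening_metrics(tp=33, fp=49, fn=1, tn=311))
```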
Advanced Self-Calibrating, Self-Repairing Data Acquisition System
NASA Technical Reports Server (NTRS)
Medelius, Pedro J. (Inventor); Eckhoff, Anthony J. (Inventor); Angel, Lucena R. (Inventor); Perotti, Jose M. (Inventor)
2002-01-01
An improved self-calibrating and self-repairing Data Acquisition System (DAS) for use in inaccessible areas, such as onboard spacecraft, capable of autonomously performing required system health checks and failure detection. When required, self-repair is implemented utilizing a "spare parts/tool box" system. The available number of spare components primarily depends upon each component's predicted reliability, which may be determined using Mean Time Between Failures (MTBF) analysis. Failing or degrading components are electronically removed and disabled to reduce power consumption before being electronically replaced with spare components.
Emission color tuning in AlQ3 complexes with extended conjugated chromophores.
Pohl, Radek; Anzenbacher, Pavel
2003-08-07
A new method for the synthesis of 5-arylethynyl-8-hydroxyquinoline ligands using Sonogashira-Hagihara coupling was developed. The electronic nature of arylethynyl substituents affects the emission color and quantum yield of the resulting Al(III) complex. Photophysical properties of the metallocomplexes correspond to the electron-withdrawing/-donating character of the arylethynyl substituents. Optical properties of such Al(III) complexes correlate with the Hammett constant values of the respective substituents. This strategy offers a powerful tool for the preparation of electroluminophores with predictable photophysical properties.
Molecular imaging to track Parkinson's disease and atypical parkinsonisms: New imaging frontiers.
Strafella, Antonio P; Bohnen, Nicolaas I; Perlmutter, Joel S; Eidelberg, David; Pavese, Nicola; Van Eimeren, Thilo; Piccini, Paola; Politis, Marios; Thobois, Stephane; Ceravolo, Roberto; Higuchi, Makoto; Kaasinen, Valtteri; Masellis, Mario; Peralta, M Cecilia; Obeso, Ignacio; Pineda-Pardo, Jose Ángel; Cilia, Roberto; Ballanger, Benedicte; Niethammer, Martin; Stoessl, Jon A
2017-02-01
Molecular imaging has proven to be a powerful tool for investigation of parkinsonian disorders. One current challenge is to identify biomarkers of early changes that may predict the clinical trajectory of parkinsonian disorders. Exciting new tracer developments hold the potential for in vivo markers of underlying pathology. Herein, we provide an overview of molecular imaging advances and how these approaches help us to understand PD and atypical parkinsonisms. © 2016 International Parkinson and Movement Disorder Society.
Theoretical analysis of microwave propagation
NASA Astrophysics Data System (ADS)
Parl, S.; Malaga, A.
1984-04-01
This report documents a comprehensive investigation of microwave propagation. The structure of line-of-sight multipath is determined and the impact on practical diversity is discussed. A new model of diffraction propagation for multiple rounded obstacles is developed. A troposcatter model valid at microwave frequencies is described. New results for the power impulse response, and the delay spread and Doppler spread are developed. A 2-component model separating large and small scale scatter effects is proposed. The prediction techniques for diffraction and troposcatter have been implemented in a computer program intended as a tool to analyze propagation experiments.
Using artificial intelligence to control fluid flow computations
NASA Technical Reports Server (NTRS)
Gelsey, Andrew
1992-01-01
Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.
Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed
2017-01-05
For the first time, a new variable selection method based on swarm intelligence, namely the firefly algorithm, is coupled with three different multivariate calibration models, namely concentration residual augmented classical least squares, artificial neural networks, and support vector regression, applied to UV spectral data. A comparative study between the firefly algorithm and the well-known genetic algorithm was performed, revealing the superiority of this new, powerful algorithm over the genetic algorithm. Moreover, different statistical tests were performed, and no significant differences were found among the models regarding their predictive abilities. Simpler and faster models were thus obtained without any deterioration in calibration quality. Copyright © 2016 Elsevier B.V. All rights reserved.
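A bare-bones continuous firefly algorithm is sketched below to show the mechanics: fireflies move toward brighter (better) ones with an attractiveness that decays with distance, plus a cooling random walk. The published method runs this over wavelength-selection masks; the toy 2-D continuous objective here is only for illustration.

```python
import numpy as np

def firefly(obj, n=20, dim=2, iters=100, beta0=1.0, gamma=1.0, alpha=0.2,
            seed=4):
    """Minimize obj over R^dim with a basic firefly algorithm."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-2, 2, (n, dim))
    f = np.array([obj(x) for x in X])
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f[j] < f[i]:                      # j is brighter (lower cost)
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    X[i] += beta * (X[j] - X[i]) + alpha * rng.normal(size=dim)
                    f[i] = obj(X[i])
        alpha *= 0.97                                # cool the random walk
    best = np.argmin(f)
    return X[best], f[best]

best_x, best_f = firefly(lambda x: np.sum(x ** 2))   # toy objective
print(best_x, best_f)
```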
Thermal modeling and analysis of thin-walled structures in micro milling
NASA Astrophysics Data System (ADS)
Zhang, J. F.; Ma, Y. H.; Feng, C.; Tang, W.; Wang, S.
2017-11-01
A numerical analytical model has been developed to predict thermal effects in thin-walled structures during micro-milling. In order to investigate the temperature distribution around the micro-edge of the cutter, it is necessary to consider the friction power, the shearing power, and the shear area between the tool micro-edge and the material. Because the micro-cutting area is difficult to measure accurately, the minimum chip thickness is also introduced as a critical factor. Finite element-based simulation was performed with AdvantEdge for the machining of Ti-6Al-4V over a range of uncut chip thicknesses. Results from the proposed model successfully account for the effects of thermal softening of the material.
Towards a generalized energy prediction model for machine tools
Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H.; Dornfeld, David A.; Helu, Moneer; Rachuri, Sudarsan
2017-01-01
Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process. PMID:28652687
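A compact sketch of this modeling step with scikit-learn's GP regressor follows: energy is predicted from three process parameters with a ±2σ uncertainty interval, in the spirit of the approach described above. The feature names and synthetic data are stand-ins for the actual machine-tool dataset.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, (80, 3))   # e.g. feed rate, spindle speed, depth of cut
y = 50 * X[:, 0] + 20 * X[:, 1] ** 2 + 5 * X[:, 2] \
    + rng.normal(0, 1, 80)       # energy consumed (J), synthetic

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)
mu, sd = gp.predict(X[:3], return_std=True)
for m, s in zip(mu, sd):         # ±2σ gives an ~95% uncertainty interval
    print(f"predicted energy: {m:.1f} J  (±{2 * s:.1f})")
```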
Strategic prospects of the electric power industry of Russia
NASA Astrophysics Data System (ADS)
Makarov, A. A.; Veselov, F. V.; Makarova, A. S.; Novikova, T. V.; Pankrushina, T. G.
2017-11-01
The prospects for the development of the electric power industry of Russia, adopted at a regular stage of working out the Energy Strategy and the General Plan of Distribution of the Electric Power Facilities, are discussed. Monitoring of the progress in implementing the Energy Strategies for the periods until 2020 and 2030, adopted in 2003 and 2009, has in general confirmed the estimated volumes of energy resource production, although domestic demand was overestimated owing to an excessively optimistic forecast of the development of the economy. The priority lines of the national energy policy in the electric power and allied industries proposed in the Energy Strategy for the period until 2035 are considered. The tools for implementing most of the proposals, and the effectiveness of their implementation, have yet to be defined more concretely. The development of the energy sector and the electric power industry under the conservative and optimistic scenarios of the development of the country's economy has been predicted using the SCANER modeling and information system: the dynamics of domestic consumption, export, and production of primary energy and electric power have been determined, and the commissioning and structure of the required generating capacities, as well as the consumption of the basic types of energy resources by the electric power industry and the centralized heat supply systems, have been optimized. Changes in the economic efficiency of nuclear and thermal power plants under the expected improvements in their cost and performance characteristics and an increase in domestic fuel prices are presented. The competitiveness of wind and solar power production under Russian conditions has been evaluated, considering the necessity of reserving and partially duplicating their capacities when they operate in power supply systems. In optimizing the electric power industry as a subsystem of the country's energy sector, the required amounts of capital investment in the industry have been assessed. Based on the obtained data and the predicted fuel prices in the main pricing zones of Russia, the ranges of changes in electric power prices consistent with the macroeconomic restrictions on their dynamics have been calculated.
Effects of 31 FDA approved small-molecule kinase inhibitors on isolated rat liver mitochondria.
Zhang, Jun; Salminen, Alec; Yang, Xi; Luo, Yong; Wu, Qiangen; White, Matthew; Greenhaw, James; Ren, Lijun; Bryant, Matthew; Salminen, William; Papoian, Thomas; Mattes, William; Shi, Qiang
2017-08-01
The FDA has approved 31 small-molecule kinase inhibitors (KIs) for human use as of November 2016, with six having black box warnings for hepatotoxicity (BBW-H) in product labeling. The precise mechanisms and risk factors for KI-induced hepatotoxicity are poorly understood. Here, the 31 KIs were tested in isolated rat liver mitochondria, an in vitro system recently proposed as a useful tool to predict drug-induced hepatotoxicity in humans. The KIs were incubated with mitochondria or submitochondrial particles at concentrations ranging from therapeutic maximal blood concentrations (Cmax) to 100-fold Cmax. Ten endpoints were measured, including oxygen consumption rate, inner membrane potential, cytochrome c release, swelling, reactive oxygen species, and individual respiratory chain complex (I-V) activities. Of the 31 KIs examined, only three (sorafenib, regorafenib, and pazopanib), all of which are hepatotoxic, caused significant mitochondrial toxicity at concentrations equal to the Cmax, indicating that mitochondrial toxicity likely contributes to the pathogenesis of hepatotoxicity associated with these KIs. At concentrations equal to 100-fold Cmax, 18 KIs were found to be toxic to mitochondria, and among the six KIs with BBW-H, mitochondrial injury was induced by regorafenib, lapatinib, idelalisib, and pazopanib, but not by ponatinib or sunitinib. Mitochondrial liability at 100-fold Cmax had a positive predictive value (PPV) of 72% and a negative predictive value (NPV) of 33% in predicting human KI hepatotoxicity as defined by product labeling, with the sensitivity and specificity being 62% and 44%, respectively. Similar predictive power was obtained using the criterion of Cmax ≥1.1 µM or daily dose ≥100 mg. Mitochondrial liability at 1-2.5-fold Cmax showed a 100% PPV and specificity, though the NPV and sensitivity were 32% and 14%, respectively. These data provide novel mechanistic insights into KI hepatotoxicity and indicate that mitochondrial toxicity at therapeutic levels can help identify hepatotoxic KIs.
A Tool for Model-Based Generation of Scenario-driven Electric Power Load Profiles
NASA Technical Reports Server (NTRS)
Rozek, Matthew L.; Donahue, Kenneth M.; Ingham, Michel D.; Kaderka, Justin D.
2015-01-01
Power consumption during all phases of spacecraft flight is of great interest to the aerospace community. As a result, significant analysis effort is exerted to understand the rates of electrical energy generation and consumption under many operational scenarios of the system. Previously, no standard tool existed for creating and maintaining a power equipment list (PEL) of spacecraft components that consume power, and no standard tool existed for generating power load profiles based on this PEL information during mission design phases. This paper presents the Scenario Power Load Analysis Tool (SPLAT) as a model-based systems engineering tool aiming to solve those problems. SPLAT is a plugin for MagicDraw (No Magic, Inc.) that aids in creating and maintaining a PEL, and also generates a power and temporal variable constraint set, in Maple language syntax, based on specified operational scenarios. The constraint set can be solved in Maple to show electric load profiles (i.e. power consumption from loads over time). SPLAT creates these load profiles from three modeled inputs: 1) a list of system components and their respective power modes, 2) a decomposition hierarchy of the system into these components, and 3) the specification of at least one scenario, which consists of temporal constraints on component power modes. In order to demonstrate how this information is represented in a system model, a notional example of a spacecraft planetary flyby is introduced. This example is also used to explain the overall functionality of SPLAT, and how this is used to generate electric power load profiles. Lastly, a cursory review of the usage of SPLAT on the Cold Atom Laboratory project is presented to show how the tool was used in an actual space hardware design application.
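A drastically simplified picture of what such a tool automates is sketched below: a power equipment list with per-mode draws, a scenario assigning modes over time, and the summed load profile. The component names and numbers are invented for illustration and are not SPLAT's actual data model.

```python
# Power equipment list: component -> {mode: watts}.
PEL = {
    "radio":  {"off": 0, "idle": 2, "transmit": 35},
    "camera": {"off": 0, "imaging": 12},
    "heater": {"off": 0, "on": 20},
}

# Scenario: (start_minute, component, mode), held until the next change.
scenario = [(0, "radio", "idle"), (0, "camera", "off"), (0, "heater", "on"),
            (10, "camera", "imaging"), (15, "radio", "transmit"),
            (20, "heater", "off")]

modes = {c: "off" for c in PEL}
profile = []
for minute in range(30):
    for t, comp, mode in scenario:       # apply any mode changes due now
        if t == minute:
            modes[comp] = mode
    profile.append(sum(PEL[c][m] for c, m in modes.items()))

print(profile[9:16])   # load (W) around the imaging/transmit events
```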
Das, Koel; Giesbrecht, Barry; Eckstein, Miguel P
2010-07-15
Within the past decade, computational approaches adopted from the field of machine learning have provided neuroscientists with powerful new tools for analyzing neural data. For instance, previous studies have applied pattern classification algorithms to electroencephalography data to predict the category of presented visual stimuli, human observer decision choices, and task difficulty. Here, we quantitatively compare the ability of pattern classifiers and three ERP metrics (peak amplitude, mean amplitude, and onset latency of the face-selective N170) to predict variations across individuals' behavioral performance in a difficult perceptual task: identifying images of faces and cars embedded in noise. We investigate three different pattern classifiers (Classwise Principal Component Analysis, CPCA; Linear Discriminant Analysis, LDA; and Support Vector Machine, SVM), five training methods differing in the selection of training data sets, and three analysis procedures for the ERP measures. We show that all three pattern classifier algorithms surpass traditional ERP measurements in their ability to predict individual differences in performance. Although the differences across pattern classifiers were not large, the CPCA method, with training data sets restricted to EEG activity for trials in which observers expressed high confidence about their decisions, performed best at predicting the perceptual performance of observers. We also show that the neural activity predicting performance across individuals was distributed through time, starting at 120 ms, and, unlike the face-selective ERP response, was sustained for more than 400 ms after stimulus presentation, indicating that both early and late components contain information correlated with observers' behavioral performance. Together, our results further demonstrate the potential of pattern classifiers compared to more traditional ERP techniques as an analysis tool for modeling the spatiotemporal dynamics of the human brain and relating neural activity to behavior. Copyright 2010 Elsevier Inc. All rights reserved.
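As a sketch of the classifier-based analysis, the snippet below cross-validates an LDA decoder on synthetic single-trial EEG feature vectors; real pipelines would use channel-by-time voltages per trial and the individual-differences regression described above. The data and the injected class difference are illustrative assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_trials, n_features = 200, 64            # e.g. 64 electrodes at one latency
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, n_trials)          # face vs. car decision label
X[y == 1, :8] += 0.5                      # inject a weak class difference

acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")
```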
EMU Battery/module Service Tool Characterization Study
NASA Technical Reports Server (NTRS)
Palandati, C. F.
1984-01-01
The power tool which will be used to replace the attitude control system in the SMM spacecraft is being modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery, a silver-zinc battery, was tested for the power tool application. The results obtained during testing show that the EMU battery is capable of operating the power tool within a pulse current range of 2.0 to 15.0 amperes and a battery temperature range of -10 to 40 degrees Celsius.
Green Power Community Tools and Resources
GPP supplies GPCs with tools to promote their status. GPCs are a subset of the Green Power Partnership: municipalities or tribal governments where government, businesses, and residents collectively use enough green power to meet GPP requirements.
Aberrant prefrontal beta oscillations predict episodic memory encoding deficits in schizophrenia.
Meconi, Federica; Anderl-Straub, Sarah; Raum, Heidelore; Landgrebe, Michael; Langguth, Berthold; Bäuml, Karl-Heinz T; Hanslmayr, Simon
Verbal episodic memory is one of the core cognitive functions affected in patients with schizophrenia (SZ). Although this verbal memory impairment in SZ is a well-known finding, our understanding of its underlying neurophysiological mechanisms is rather scarce. Here we address this issue by recording brain oscillations during a memory task in a sample of healthy controls and patients with SZ. Brain oscillations represent spectral fingerprints of specific neurocognitive operations and are therefore a promising tool to identify neurocognitive mechanisms that are affected by SZ. Healthy controls showed a prominent suppression of left prefrontal beta oscillatory activity during successful memory formation, which replicates several previous oscillatory memory studies. In contrast, patients failed to exhibit such a left prefrontal beta power suppression. Utilizing a new topographical pattern similarity approach, we further demonstrate that the degree of similarity between a patient's beta power decrease and that of the controls reliably predicted memory performance. This relationship between beta power decreases and memory was such that the patients' memory performance improved as they showed a topographical beta desynchronization pattern more similar to that of healthy controls. Together, these findings support left prefrontal beta desynchronization as the spectral fingerprint of verbal episodic memory formation, likely indicating deep semantic processing of verbal material. These findings also demonstrate that left prefrontal beta power suppression (or lack thereof) during memory encoding is a reliable biomarker for the observed verbal memory encoding impairments in SZ.
Cellular automata and its applications in protein bioinformatics.
Xiao, Xuan; Wang, Pu; Chou, Kuo-Chen
2011-09-01
With the explosion of protein sequences generated in the postgenomic era, it is highly desirable to develop high-throughput tools for rapidly and reliably identifying various attributes of uncharacterized proteins based on their sequence information alone. The knowledge thus obtained can help us make timely use of these newly found protein sequences for both basic research and drug discovery. Many bioinformatics tools have been developed by means of machine learning methods. This review is focused on the applications of a new kind of science (cellular automata) in protein bioinformatics. A cellular automaton (CA) is an open, flexible, and discrete dynamic model that holds enormous potential for modeling complex systems, in spite of the simplicity of the model itself. Researchers, scientists, and practitioners from different fields have utilized cellular automata for visualizing protein sequences, investigating their evolution processes, and predicting their various attributes. Owing to its impressive power, intuitiveness, and relative simplicity, the CA approach has great potential for use as a tool in bioinformatics.
Genome engineering and plant breeding: impact on trait discovery and development.
Nogué, Fabien; Mara, Kostlend; Collonnier, Cécile; Casacuberta, Josep M
2016-07-01
New tools for the precise modification of crop genes are now available for the engineering of new ideotypes. A future challenge in this emerging field of genome engineering is to develop efficient methods for allele mining. Genome engineering tools are now available in plants, including major crops, to modify a given gene in a predictable manner. These new techniques have tremendous potential for a spectacular acceleration of the plant breeding process. Here, we discuss how genetic diversity has always been the raw material for breeders and how they have always taken advantage of the best available science to use, and when possible, increase, this genetic diversity. We present why the advent of these new techniques gives breeders extremely powerful tools for crop breeding, but also why this will require breeders and researchers to characterize the genes underlying this genetic diversity more precisely. Tackling these challenges should permit the engineering of optimized allele assortments in an unprecedented and controlled way.
Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis
2015-01-01
Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
NASA Astrophysics Data System (ADS)
Abdul-Aziz, O. I.; Ishtiaq, K. S.
2015-12-01
We present a user-friendly modeling tool, built in MS Excel, to predict greenhouse gas (GHG) fluxes and estimate potential carbon sequestration in coastal wetlands. The dominant controls of wetland GHG fluxes and their relative mechanistic linkages with various hydro-climatic, sea level, biogeochemical, and ecological drivers were first determined by employing a systematic data-analytics method, including a Pearson correlation matrix, principal component and factor analyses, and exploratory partial least squares regressions. This mechanistic understanding was then utilized to develop parsimonious non-linear (power-law) models to predict wetland carbon dioxide (CO2) and methane (CH4) fluxes from a sub-set of climatic, hydrologic, and environmental drivers such as photosynthetically active radiation, soil temperature, water depth, and soil salinity. The models were tested with field data for multiple sites and seasons (2012-13) collected from Waquoit Bay, MA. The model estimated the annual wetland carbon storage by up-scaling the instantaneous predicted fluxes to an extended growing season (e.g., May-October) and by accounting for the net annual lateral carbon fluxes between the wetlands and the estuary. The Excel spreadsheet model is a simple ecological engineering tool for coastal carbon management and its incorporation into a potential carbon market under a changing climate, sea level, and environment. Specifically, the model can help determine appropriate GHG offset protocols and monitoring plans for projects that focus on tidal wetland restoration and maintenance.
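A hedged sketch of the parsimonious power-law form follows: CO2 flux modeled as a·PAR^b·Tsoil^c, fit by ordinary least squares after a log transform. The exponents and data below are synthetic; the paper calibrates against the Waquoit Bay measurements.

```python
import numpy as np

rng = np.random.default_rng(7)
PAR = rng.uniform(100, 1500, 120)     # photosynthetically active radiation
Tsoil = rng.uniform(8, 28, 120)       # soil temperature (deg C)
flux = 0.02 * PAR**0.7 * Tsoil**0.5 * np.exp(rng.normal(0, 0.1, 120))

# Log transform turns the power law into a linear regression problem.
A = np.column_stack([np.ones(120), np.log(PAR), np.log(Tsoil)])
coef, *_ = np.linalg.lstsq(A, np.log(flux), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"flux ≈ {a:.3f} * PAR^{b:.2f} * Tsoil^{c:.2f}")
```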
Wang, Yuan; Gao, Ying; Battsend, Munkhzul; Chen, Kexin; Lu, Wenli; Wang, Yaogang
2014-11-01
The optimal approach regarding breast cancer screening for Chinese women is unclear due to the relatively low incidence rate. A risk assessment tool may be useful for selecting high-risk subsets of the population for mammography screening in a low-incidence, resource-limited developing country. The odds ratios for six main risk factors of breast cancer were pooled using Review Manager after a systematic literature search. A health risk appraisal (HRA) model was developed to predict an individual's risk of developing breast cancer in the next 5 years from her current age. The performance of this HRA model was assessed based on a first-round screening database. Estimated risk of breast cancer increased with age. Increases in the 5-year risk of developing breast cancer were found with the presence of any of the included risk factors. When individuals with risk above the median (3.3‰) were selected from the validation database, the sensitivity was 60.0% and the specificity 47.8%. The unweighted area under the curve (AUC) was 0.64 (95% CI = 0.50-0.78). The risk-prediction model reported in this article is based on a combination of risk factors and shows good overall predictive power, but it is still weak at predicting which particular women will develop the disease. The current model would be greatly improved if more population-based prospective follow-up studies were available for validation.
Optimization of multi-environment trials for genomic selection based on crop models.
Rincent, R; Kuhn, E; Monod, H; Oury, F-X; Rousset, M; Allard, V; Le Gouis, J
2017-08-01
We propose a statistical criterion to optimize multi-environment trials in order to predict genotype × environment interactions more efficiently, by combining crop growth models and genomic selection models. Genotype × environment interactions (GEI) are common in plant multi-environment trials (METs). In this context, models developed for genomic selection (GS), which refers to the use of genome-wide information for predicting breeding values of selection candidates, need to be adapted. One promising way to increase prediction accuracy in various environments is to combine ecophysiological and genetic modelling through crop growth models (CGM) incorporating genetic parameters. The efficiency of this approach relies on the quality of the parameter estimates, which depends on the environments composing the MET used for calibration. The objective of this study was to determine a method to optimize the set of environments composing the MET for estimating genetic parameters in this context. A criterion called OptiMET was defined for this purpose and was evaluated on simulated and real data, using wheat phenology as an example. METs defined with OptiMET allowed the genetic parameters to be estimated with lower error, leading to higher QTL detection power and higher prediction accuracy, and were on average more efficient than random METs composed of twice as many environments, in terms of the quality of the parameter estimates. OptiMET is thus a valuable tool to determine optimal experimental conditions to best exploit METs and the phenotyping tools that are currently being developed.
NASA Astrophysics Data System (ADS)
Aparanji, Santosh; Balaswamy, V.; Arun, S.; Supradeepa, V. R.
2018-02-01
In this work, we report and analyse the surprising observation of a rainbow of visible colors, spanning 390 nm to 620 nm, in silica-based, near-infrared, continuous-wave, cascaded Raman fiber lasers. The cascaded Raman laser is pumped at 1117 nm with around 200 W, and at full power we obtain 100 W at 1480 nm. With increasing pump power at 1117 nm, the fiber constituting the Raman laser glows in various hues along its length. Spectroscopic analysis identified the emitted visible light as harmonic and sum-frequency components of various locally propagating wavelength components. In addition to third-harmonic components, surprisingly, even second-harmonic components were observed. Although this is a continuous-wave laser, we expect phase matching between the core-propagating NIR light and the cladding-propagating visible wavelengths, together with the intensity fluctuations characteristic of Raman lasers, to have played a major role in the generation of visible light. In addition, this surprising generation of visible light provides a powerful non-contact method to deduce the spectrum of light propagating in the fiber. Using static images of the fiber captured by a standard visible camera such as a DSLR, we demonstrate novel, image-processing-based techniques to deduce the wavelength components propagating in the fiber at any given spatial location. This provides a powerful diagnostic tool for both length- and power-resolved spectral analysis in Raman fiber lasers, and helps accurately predict the optimal length of fiber required for complete and efficient conversion to a given Stokes wavelength.
Common features of microRNA target prediction tools
Peterson, Sarah M.; Thompson, Jeffrey A.; Ufkin, Melanie L.; Sathyanarayana, Pradeep; Liaw, Lucy; Congdon, Clare Bates
2014-01-01
The human genome encodes for over 1800 microRNAs (miRNAs), which are short non-coding RNA molecules that function to regulate gene expression post-transcriptionally. Due to the potential for one miRNA to target multiple gene transcripts, miRNAs are recognized as a major mechanism to regulate gene expression and mRNA translation. Computational prediction of miRNA targets is a critical initial step in identifying miRNA:mRNA target interactions for experimental validation. The available tools for miRNA target prediction encompass a range of different computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to miRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities, and these are DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. In comparison across all miRNA target prediction tools, four main aspects of the miRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MiRNA target prediction is a dynamic field with increasing attention on development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output. PMID:24600468
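To make the seed-match feature concrete, here is a minimal Python sketch that checks for a canonical 7mer-m8 seed site. It is a toy illustration of the feature, not the scoring used by any of the tools named above; the sequences are invented examples.

```python
def revcomp(seq: str) -> str:
    """Reverse complement of an RNA sequence (A/U/C/G)."""
    return seq.translate(str.maketrans("AUCG", "UAGC"))[::-1]

def has_7mer_m8_site(mirna: str, utr: str) -> bool:
    """Check for a canonical 7mer-m8 seed site: positions 2-8 of the
    miRNA (5'->3') pairing perfectly with the target 3'UTR."""
    seed = mirna[1:8]        # positions 2-8 as a 0-based slice
    site = revcomp(seed)     # what a perfect match looks like in the UTR
    return site in utr

# Toy example with a let-7a-like miRNA and a made-up target fragment
print(has_7mer_m8_site("UGAGGUAGUAGGUUGUAUAGUU",
                       "AGCACUAUACAACCUACUACCUCAGG"))  # True
```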
Comparison of Performance Predictions for New Low-Thrust Trajectory Tools
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Kos, Larry; Hopkins, Randall; Crane, Tracie
2006-01-01
Several low-thrust trajectory optimization tools have been developed over the last 3½ years by the Low Thrust Trajectory Tools development team. This toolset includes both low-medium fidelity and high fidelity tools, which allow the analyst to quickly survey a wide mission trade space and perform advanced mission design. These tools were tested using a set of reference trajectories that exercised each tool's unique capabilities. This paper compares the performance predictions of the various tools against several of the reference trajectories. The intent is to verify agreement between the high fidelity tools and to quantify the performance prediction differences between tools of different fidelity levels.
Prediction of intracellular exposure bridges the gap between target- and cell-based drug discovery
Gordon, Laurie J.; Wayne, Gareth J.; Almqvist, Helena; Axelsson, Hanna; Seashore-Ludlow, Brinton; Treyer, Andrea; Lundbäck, Thomas; West, Andy; Hann, Michael M.; Artursson, Per
2017-01-01
Inadequate target exposure is a major cause of high attrition in drug discovery. Here, we show that a label-free method for quantifying the intracellular bioavailability (Fic) of drug molecules predicts drug access to intracellular targets and hence, pharmacological effect. We determined Fic in multiple cellular assays and cell types representing different targets from a number of therapeutic areas, including cancer, inflammation, and dementia. Both cytosolic targets and targets localized in subcellular compartments were investigated. Fic gives insights on membrane-permeable compounds in terms of cellular potency and intracellular target engagement, compared with biochemical potency measurements alone. Knowledge of the amount of drug that is locally available to bind intracellular targets provides a powerful tool for compound selection in early drug discovery. PMID:28701380
Variability in Humoral Immunity to Measles Vaccine: New Developments
Haralambieva, Iana H.; Kennedy, Richard B.; Ovsyannikova, Inna G.; Whitaker, Jennifer A.; Poland, Gregory A.
2015-01-01
Despite the existence of an effective measles vaccine, resurgence in measles cases in the United States and across Europe has occurred, including in individuals vaccinated with two doses of the vaccine. Host genetic factors result in inter-individual variation in measles vaccine-induced antibodies, and play a role in vaccine failure. Studies have identified HLA and non-HLA genetic influences that individually or jointly contribute to the observed variability in the humoral response to vaccination among healthy individuals. In this exciting era, new high-dimensional approaches and techniques including vaccinomics, systems biology, GWAS, epitope prediction and sophisticated bioinformatics/statistical algorithms, provide powerful tools to investigate immune response mechanisms to the measles vaccine. These might predict, on an individual basis, outcomes of acquired immunity post measles vaccination. PMID:26602762
Continuum Electrostatics Approaches to Calculating pKas and Ems in Proteins.
Gunner, M R; Baker, N A
2016-01-01
Proteins change their charge state through protonation and redox reactions as well as through binding charged ligands. The free energy of these reactions is dominated by solvation and electrostatic energies and modulated by protein conformational relaxation in response to the ionization state changes. Although computational methods for calculating these interactions can provide very powerful tools for predicting protein charge states, they include several critical approximations of which users should be aware. This chapter discusses the strengths, weaknesses, and approximations of popular computational methods for predicting charge states and understanding the underlying electrostatic interactions. The goal of this chapter is to inform users about applications and potential caveats of these methods as well as outline directions for future theoretical and computational research. © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Modesitt, Kenneth L.
1990-01-01
A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsoneva, N., E-mail: Nadia.Tsoneva@theo.physik.uni-giessen.de; Lenske, H.
During the last decade, a theoretical method based on energy-density functional theory and the quasiparticle-phonon model, including up to three-phonon configurations, was developed. The main advantages of the method are that it incorporates a self-consistent mean field and multi-configuration mixing, which are found to be of crucial importance for systematic investigations of nuclear low-energy excitations and pygmy and giant resonances in a unified way. In particular, the theoretical approach has proven very successful in the prediction of new modes of excitation, namely the pygmy quadrupole resonance, which has also lately been observed experimentally. Recently, our microscopically obtained dipole strength functions have been implemented in predictions of nucleon-capture reaction rates of astrophysical importance. A comparison to available experimental data is discussed.
Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally
2018-02-01
1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUCinf) of dalbavancin is a key parameter, and the AUCinf/MIC ratio is a critical pharmacodynamic marker. 2. Using the end-of-intravenous-infusion concentration (i.e. Cmax), the Cmax versus AUCinf relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. Predictions of AUCinf were performed by applying the regression equations to published Cmax data. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE)/root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. Cmax versus AUCinf exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference) with an RMSE < 10.3%. The external data evaluation showed that the models predicted AUCinf with an RMSE of 3.02-27.46%, with the fold difference largely contained within 0.64-1.48. 5. Regardless of the regression model, a single-time-point strategy using Cmax (i.e. at the end of the 30-min infusion) is amenable as a prospective tool for predicting AUCinf of dalbavancin in patients.
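As an illustration of the single-time-point idea, the sketch below fits the power model AUCinf = a * Cmax^b on synthetic pairs and reports the fold difference, RMSE and r; the numbers are invented stand-ins, not the paper's subject data.

```python
import numpy as np

# Synthetic (Cmax, AUCinf) pairs standing in for the 21 subjects.
cmax = np.array([250.0, 300.0, 280.0, 320.0, 270.0, 310.0])
auc = np.array([11000.0, 13500.0, 12400.0, 14600.0, 11900.0, 14000.0])

# Power model AUCinf = a * Cmax**b, fitted as a line in log-log space.
b, log_a = np.polyfit(np.log(cmax), np.log(auc), 1)
pred = np.exp(log_a) * cmax ** b

fold = auc / pred                                 # observed / predicted
rmse_pct = 100 * np.sqrt(np.mean(((auc - pred) / auc) ** 2))
r = np.corrcoef(cmax, auc)[0, 1]
print(f"b = {b:.3f}; fold difference {fold.min():.2f}-{fold.max():.2f}; "
      f"RMSE = {rmse_pct:.1f}%; r = {r:.4f}")
```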
Predictive models in cancer management: A guide for clinicians.
Kazem, Mohammed Ali
2017-04-01
Predictive tools in cancer management are used to predict different outcomes, including survival probability or risk of recurrence. The uptake of these tools by clinicians involved in cancer management has not been as common as that of other clinical tools, which may be due to the complexity of some of these tools or a lack of understanding of how they can aid decision-making in particular clinical situations. The aim of this article is to improve clinicians' knowledge and understanding of predictive tools used in cancer management, including how they are built, how they can be applied to medical practice, and what their limitations may be. A literature review was conducted to investigate the role of predictive tools in cancer management. All predictive models share similar characteristics, but depending on the type of tool, its ability to predict an outcome will differ. Each type has its own pros and cons, and its generalisability will depend on the cohort used to build the tool. These factors will affect the clinician's decision on whether to apply the model to their cohort. Before a model is used in clinical practice, it is important to appreciate how the model is constructed, what its use may add over and above traditional decision-making tools, and what problems or limitations may be associated with it. Understanding all of the above is an important step for any clinician who wants to decide whether or not to use predictive tools in their practice. Copyright © 2016 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Callaghan, Michael E., E-mail: elspeth.raymond@health.sa.gov.au; Freemasons Foundation Centre for Men's Health, University of Adelaide; Urology Unit, Repatriation General Hospital, SA Health, Flinders Centre for Innovation in Cancer
Purpose: To identify, through a systematic review, all validated tools used for the prediction of patient-reported outcome measures (PROMs) in patients being treated with radiation therapy for prostate cancer, and to provide a comparative summary of accuracy and generalizability. Methods and Materials: PubMed and EMBASE were searched from July 2007. Title/abstract screening, full text review, and critical appraisal were undertaken by 2 reviewers, whereas data extraction was performed by a single reviewer. Eligible articles had to provide a summary measure of accuracy and undertake internal or external validation. Tools were recommended for clinical implementation if they had been externally validated and found to have accuracy ≥70%. Results: The search strategy identified 3839 potential studies, of which 236 progressed to full text review and 22 were included. From these studies, 50 tools predicted gastrointestinal/rectal symptoms, 29 tools predicted genitourinary symptoms, 4 tools predicted erectile dysfunction, and no tools predicted quality of life. For patients treated with external beam radiation therapy, 3 tools could be recommended for the prediction of rectal toxicity, gastrointestinal toxicity, and erectile dysfunction. For patients treated with brachytherapy, 2 tools could be recommended for the prediction of urinary retention and erectile dysfunction. Conclusions: A large number of tools for the prediction of PROMs in prostate cancer patients treated with radiation therapy have been developed. Only a small minority are accurate and have been shown to be generalizable through external validation. This review provides an accessible catalogue of tools that are ready for clinical implementation, as well as those that should be prioritized for validation.
Watson, Robert A
2014-08-01
To test the hypothesis that machine learning algorithms increase the predictive power to classify surgical expertise using surgeons' hand motion patterns. In 2012 at the University of North Carolina at Chapel Hill, 14 surgical attendings and 10 first- and second-year surgical residents each performed two bench model venous anastomoses. During the simulated tasks, the participants wore an inertial measurement unit on the dorsum of their dominant (right) hand to capture their hand motion patterns. The pattern from each bench model task performed was preprocessed into a symbolic time series and labeled as expert (attending) or novice (resident). The labeled hand motion patterns were processed and used to train a Support Vector Machine (SVM) classification algorithm. The trained algorithm was then tested for discriminative/predictive power against unlabeled (blinded) hand motion patterns from tasks not used in the training. The Lempel-Ziv (LZ) complexity metric was also measured from each hand motion pattern, with an optimal threshold calculated to separately classify the patterns. The LZ metric classified unlabeled (blinded) hand motion patterns into expert and novice groups with an accuracy of 70% (sensitivity 64%, specificity 80%). The SVM algorithm had an accuracy of 83% (sensitivity 86%, specificity 80%). The results confirmed the hypothesis. The SVM algorithm increased the predictive power to classify blinded surgical hand motion patterns into expert versus novice groups. With further development, the system used in this study could become a viable tool for low-cost, objective assessment of procedural proficiency in a competency-based curriculum.
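A minimal sketch of the two classification routes described above, assuming a median-threshold symbolization (the paper's exact preprocessing may differ) and toy motion data; the Lempel-Ziv phrase counter is a simplified LZ76 variant.

```python
import numpy as np
from sklearn.svm import SVC

def symbolize(x: np.ndarray) -> str:
    """Binarize a hand-motion series around its median (one common
    symbolization choice; the paper's preprocessing may differ)."""
    return "".join("1" if v > np.median(x) else "0" for v in x)

def lz_complexity(s: str) -> int:
    """Count Lempel-Ziv phrases scanning left to right (a simplified
    LZ76 variant): each new phrase is the shortest block not seen in
    the preceding prefix."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i]:
            l += 1
        c += 1
        i += l
    return c

rng = np.random.default_rng(1)
def pattern(smooth: int) -> np.ndarray:
    """Toy motion pattern: 'experts' smoother, 'novices' noisier."""
    return np.convolve(rng.normal(size=200),
                       np.ones(smooth) / smooth, mode="same")

series = [pattern(20) for _ in range(14)] + [pattern(2) for _ in range(10)]
y = np.array([1] * 14 + [0] * 10)            # 1 = expert, 0 = novice
X = np.array([[lz_complexity(symbolize(s)), np.std(np.diff(s))]
              for s in series])              # two simple features
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))                       # training accuracy of the toy model
```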
Geographically distributed real-time digital simulations using linear prediction
Liu, Ren; Mohanpurkar, Manish; Panwar, Mayank; ...
2016-07-04
Real time simulation is a powerful tool for analyzing, planning, and operating modern power systems. For analyzing ever-evolving power systems and understanding complex dynamic and transient interactions, larger real time computation capabilities are essential. These facilities are interspersed all over the globe, and to leverage unique facilities, geographically-distributed real-time co-simulation for analyzing power systems is pursued and presented. However, the communication latency between different simulator locations may lead to inaccuracy in geographically distributed real-time co-simulations. In this paper, the effect of communication latency on geographically distributed real-time co-simulation is introduced and discussed. In order to reduce the effect of the communication latency, a real-time data predictor, based on linear curve fitting, is developed and integrated into the distributed real-time co-simulation. Two digital real time simulators are used to perform dynamic and transient co-simulations with communication latency and the predictor. Results demonstrate the effect of the communication latency and the performance of the real-time data predictor in compensating for it.
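A minimal sketch of the latency compensation idea, assuming the predictor is a least-squares line fitted to the most recent exchanged samples and extrapolated by one latency; the signal name, window length and step size are illustrative.

```python
import numpy as np

def predict_ahead(t: np.ndarray, y: np.ndarray, latency: float) -> float:
    """Fit a line to the most recent samples of a quantity exchanged
    between the simulators and extrapolate it one communication
    latency ahead, so the remote side receives a time-aligned value."""
    a, b = np.polyfit(t, y, 1)        # least-squares line y ~ a*t + b
    return a * (t[-1] + latency) + b

# Last five samples of, say, a tie-line voltage at 1 ms steps,
# compensated for an 8 ms network latency.
t = np.array([0.000, 0.001, 0.002, 0.003, 0.004])
v = np.array([0.98, 0.99, 1.00, 1.02, 1.03])
print(predict_ahead(t, v, latency=0.008))
```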
NASA Technical Reports Server (NTRS)
Seybert, A. F.; Wu, X. F.; Oswald, Fred B.
1992-01-01
Analytical and experimental validation of methods to predict structural vibration and radiated noise are presented. A rectangular box excited by a mechanical shaker was used as the vibrating structure. Combined finite element method (FEM) and boundary element method (BEM) models of the apparatus were used to predict the noise radiated from the box. The FEM was used to predict the vibration, and the surface vibration was used as input to the BEM to predict the sound intensity and sound power. Vibration predicted by the FEM model was validated by experimental modal analysis. Noise predicted by the BEM was validated by sound intensity measurements. Three types of results are presented for the total radiated sound power: (1) sound power predicted by the BEM model using vibration data measured on the surface of the box; (2) sound power predicted by the FEM/BEM model; and (3) sound power measured by a sound intensity scan. The sound power predicted from the BEM model using measured vibration data yields an excellent prediction of radiated noise. The sound power predicted by the combined FEM/BEM model also gives a good prediction of radiated noise, except for a shift of the natural frequencies due to limitations in the FEM model.
NASA Astrophysics Data System (ADS)
Hansson, Linus; Guédez, Rafael; Larchet, Kevin; Laumert, Bjorn
2017-06-01
The dispatchability offered by thermal energy storage (TES) in concentrated solar power (CSP) and solar hybrid plants based on such technology presents the most important difference compared to power generation based only on photovoltaics (PV). This has also been one reason for recent hybridization efforts of the two technologies and the creation of Power Purchase Agreement (PPA) payment schemes offering higher payment multiples during daily hours of higher (peak or priority) demand. Recent studies involving plant-level thermal energy storage control strategies are, however, largely based on pre-determined approaches, thereby not taking into account the actual dynamics of thermal energy storage system operation. In this study, the implementation of a dynamic dispatch strategy, in the form of a TRNSYS controller for hybrid PV-CSP plants in the power-plant modelling tool DYESOPT, is presented. The aim was to gauge the benefits of incorporating a day-ahead approach to dispatch control compared to a fully pre-determined approach that sets the hourly dispatch only once, prior to annual simulation. By implementing a dynamic strategy, it was found possible to enhance the technical and economic performance of CSP-only plants designed for peaking operation and featuring low values of the solar multiple. This was achieved by enhancing dispatch control, primarily by taking storage levels at the beginning of every simulation day into account. The sequential prediction of the TES level could therefore be improved, notably for evaluated plants without integrated PV, for which the predicted storage levels deviated less than when PV was present in the design. While also featuring dispatch performance gains, optimal plant configurations for hybrid PV-CSP were found to present a trade-off in economic performance: an increase in break-even electricity price when using the dynamic strategy, offset to some extent by a reduction in upfront investment cost. An increase in turbine starts for the implemented strategy, however, highlights where further improvements can be made.
Thermodynamics of reformulated automotive fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zudkevitch, D.; Murthy, A.K.S.; Gmehling, J.
1995-06-01
Two methods for predicting Reid vapor pressure (Rvp) and initial vapor emissions of reformulated gasoline blends that contain one or more oxygenated compounds show excellent agreement with experimental data. In the first method, method A, D-86 distillation data for gasoline blends are used for predicting Rvp from a simulation of the mini dry vapor pressure equivalent (Dvpe) experiment. The other method, method B, relies on analytical information (PIANO analyses) of the base gasoline and uses classical thermodynamics for simulating the same Rvp equivalent (Rvpe) mini experiment. Method B also predicts the composition and other properties of the fuel's initial vapor emission. Method B, although complex, is more useful in that it can predict properties of blends without a D-86 distillation. An important aspect of method B is its capability to predict the composition of initial vapor emissions from gasoline blends. Thus, it offers a powerful tool to planners of gasoline blending. Method B uses theoretically sound formulas and rigorous thermodynamic routines, and draws on data and correlations of physical properties that are in the public domain. Results indicate that predictions made with both methods agree very well with experimental values of Dvpe. The computer simulation methods were programmed and tested.
Hadjithomas, Michalis; Chen, I-Min Amy; Chu, Ken; Ratner, Anna; Palaniappan, Krishna; Szeto, Ernest; Huang, Jinghua; Reddy, T B K; Cimermančič, Peter; Fischbach, Michael A; Ivanova, Natalia N; Markowitz, Victor M; Kyrpides, Nikos C; Pati, Amrita
2015-07-14
In the discovery of secondary metabolites, analysis of sequence data is a promising exploration path that remains largely underutilized due to the lack of computational platforms that enable such a systematic approach on a large scale. In this work, we present IMG-ABC (https://img.jgi.doe.gov/abc), an atlas of biosynthetic gene clusters within the Integrated Microbial Genomes (IMG) system, which is aimed at harnessing the power of "big" genomic data for discovering small molecules. IMG-ABC relies on IMG's comprehensive integrated structural and functional genomic data for the analysis of biosynthetic gene clusters (BCs) and associated secondary metabolites (SMs). SMs and BCs serve as the two main classes of objects in IMG-ABC, each with a rich collection of attributes. A unique feature of IMG-ABC is the incorporation of both experimentally validated and computationally predicted BCs in genomes as well as metagenomes, thus identifying BCs in uncultured populations and rare taxa. We demonstrate the strength of IMG-ABC's focused integrated analysis tools in enabling the exploration of microbial secondary metabolism on a global scale, through the discovery of phenazine-producing clusters for the first time in Alphaproteobacteria. IMG-ABC strives to fill the long-existent void of resources for computational exploration of the secondary metabolism universe; its underlying scalable framework enables traversal of uncovered phylogenetic and chemical structure space, serving as a doorway to a new era in the discovery of novel molecules. IMG-ABC is the largest publicly available database of predicted and experimental biosynthetic gene clusters and the secondary metabolites they produce. The system also includes powerful search and analysis tools that are integrated with IMG's extensive genomic/metagenomic data and analysis tool kits. As new research on biosynthetic gene clusters and secondary metabolites is published and more genomes are sequenced, IMG-ABC will continue to expand, with the goal of becoming an essential component of any bioinformatic exploration of the secondary metabolism world. Copyright © 2015 Hadjithomas et al.
NASA Technical Reports Server (NTRS)
Smith, Mark S.; Bui, Trong T.; Garcia, Christian A.; Cumming, Stephen B.
2016-01-01
A pair of compliant trailing edge flaps was flown on a modified GIII airplane. Prior to flight test, multiple analysis tools of various levels of complexity were used to predict the aerodynamic effects of the flaps. Vortex lattice, full potential flow, and full Navier-Stokes aerodynamic analysis software programs were used for prediction, in addition to another program that used empirical data. After the flight-test series, lift and pitching moment coefficient increments due to the flaps were estimated from flight data and compared to the results of the predictive tools. The predicted lift increments matched flight data well for all predictive tools for small flap deflections. All tools over-predicted lift increments for large flap deflections. The potential flow and Navier-Stokes programs predicted pitching moment coefficient increments better than the other tools.
Power counting and Wilsonian renormalization in nuclear effective field theory
NASA Astrophysics Data System (ADS)
Valderrama, Manuel Pavón
2016-05-01
Effective field theories are the most general tool for the description of low energy phenomena. They are universal and systematic: they can be formulated for any low energy systems we can think of and offer a clear guide on how to calculate predictions with reliable error estimates, a feature that is called power counting. These properties can be easily understood in Wilsonian renormalization, in which effective field theories are the low energy renormalization group evolution of a more fundamental — perhaps unknown or unsolvable — high energy theory. In nuclear physics they provide the possibility of a theoretically sound derivation of nuclear forces without having to solve quantum chromodynamics explicitly. However there is the problem of how to organize calculations within nuclear effective field theory: the traditional knowledge about power counting is perturbative but nuclear physics is not. Yet power counting can be derived in Wilsonian renormalization and there is already a fairly good understanding of how to apply these ideas to non-perturbative phenomena and in particular to nuclear physics. Here we review a few of these ideas, explain power counting in two-nucleon scattering and reactions with external probes and hint at how to extend the present analysis beyond the two-body problem.
NASA Technical Reports Server (NTRS)
Ferraro, R.; Some, R.
2002-01-01
The growth in data rates of instruments on future NASA spacecraft continues to outstrip the improvement in communications bandwidth and processing capabilities of radiation-hardened computers. Sophisticated autonomous operations strategies will further increase the processing workload. Given the reductions in spacecraft size and available power, standard radiation-hardened computing systems alone will not be able to address the requirements of future missions. The REE project was intended to overcome this obstacle by developing a COTS-based supercomputer suitable for use as a science and autonomy data processor in most space environments. This development required a detailed knowledge of system behavior in the presence of Single Event Effect (SEE) induced faults, so that mitigation strategies could be designed to recover system-level reliability while maintaining the COTS throughput advantage. The REE project has developed a suite of tools and a methodology for predicting SEU-induced transient fault rates in a range of natural space environments from ground-based radiation testing of component parts. In this paper we provide an overview of this methodology and tool set, with a concentration on the radiation fault model and its use in the REE system development methodology. Using test data reported elsewhere in this and other conferences, we predict upset rates for a particular COTS single-board computer configuration in several space environments.
Jiang, Luohua; Yang, Jing; Huang, Haixiao; Johnson, Ann; Dill, Edward J; Beals, Janette; Manson, Spero M; Roubideaux, Yvette
2016-05-01
Participant attrition in clinical trials and community-based interventions is a serious, common, and costly problem. In order to develop a simple predictive scoring system that can quantify the risk of participant attrition in a lifestyle intervention project, we analyzed data from the Special Diabetes Program for Indians Diabetes Prevention Program (SDPI-DP), an evidence-based lifestyle intervention to prevent diabetes in 36 American Indian and Alaska Native communities. SDPI-DP participants were randomly divided into a derivation cohort (n = 1600) and a validation cohort (n = 801). Logistic regressions were used to develop a scoring system from the derivation cohort. The discriminatory power and calibration properties of the system were assessed using the validation cohort. Seven independent factors predicted program attrition: gender, age, household income, comorbidity, chronic pain, site's user population size, and average age of site staff. Six factors predicted long-term attrition: gender, age, marital status, chronic pain, site's user population size, and average age of site staff. Each model exhibited moderate to fair discriminatory power (C statistic in the validation set: 0.70 for program attrition, and 0.66 for long-term attrition) and excellent calibration. The resulting scoring system offers a low-technology approach to identify participants at elevated risk for attrition in future similar behavioral modification intervention projects, which may inform appropriate allocation of retention resources. This approach also serves as a model for other efforts to prevent participant attrition.
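To illustrate how logistic-regression coefficients become a simple predictive scoring system, here is a hedged Python sketch; the coefficient values, intercept and factor names are hypothetical, not the SDPI-DP estimates reported above.

```python
import math

# Hypothetical coefficients for the seven program-attrition factors;
# the fitted SDPI-DP values are reported in the paper.
COEFS = {
    "male": 0.45, "age_under_40": 0.60, "low_income": 0.38,
    "comorbidity": 0.30, "chronic_pain": 0.35,
    "large_site": 0.25, "young_staff": 0.20,
}
INTERCEPT = -1.8

def attrition_probability(flags: dict) -> float:
    """Predicted attrition probability from binary factor flags."""
    z = INTERCEPT + sum(COEFS[k] * v for k, v in flags.items())
    return 1.0 / (1.0 + math.exp(-z))

def risk_score(flags: dict) -> int:
    """Integer score (1 point per 0.1 log-odds), the usual way such
    models are turned into simple scoring systems."""
    return round(10 * sum(COEFS[k] * v for k, v in flags.items()))

all_present = {k: 1 for k in COEFS}
print(f"p = {attrition_probability(all_present):.2f}, "
      f"score = {risk_score(all_present)}")
```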
Quantitative Reactivity Scales for Dynamic Covalent and Systems Chemistry.
Zhou, Yuntao; Li, Lijie; Ye, Hebo; Zhang, Ling; You, Lei
2016-01-13
Dynamic covalent chemistry (DCC) has become a powerful tool for the creation of molecular assemblies and complex systems in chemistry and materials science. Herein we developed for the first time quantitative reactivity scales capable of correlation and prediction of the equilibrium of dynamic covalent reactions (DCRs). The reference reactions are based upon universal DCRs between imines, one of the most utilized structural motifs in DCC, and a series of O-, N-, and S-mononucleophiles. Aromatic imines derived from pyridine-2-carboxyaldehyde exhibit capability for controlling the equilibrium through distinct substituent effects. Electron-donating groups (EDGs) stabilize the imine through quinoidal resonance, while electron-withdrawing groups (EWGs) stabilize the adduct by enhancing intramolecular hydrogen bonding, resulting in curvature in the Hammett analysis. Notably, unique nonlinearity induced by both EDGs and EWGs emerged in the Hammett plot when cyclic secondary amines were used. This is the first time such a behavior has been observed in a thermodynamically controlled system, to the best of our knowledge. Unified quantitative reactivity scales were proposed for DCC, defined by the correlation log K = S_N(R_N + R_E). Nucleophilicity parameters (R_N and S_N) and electrophilicity parameters (R_E) were then developed from the DCRs discovered. Furthermore, the predictive power of those parameters was verified by successful correlation of other DCRs, validating our reactivity scales as a general and useful tool for the evaluation and modeling of DCRs. The reactivity parameters proposed here should be complementary to well-established kinetics-based parameters and find applications in many aspects, such as DCR discovery, bioconjugation, and catalysis.
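The correlation can be evaluated directly; a minimal sketch with illustrative parameter values follows (the fitted S_N, R_N, R_E scales themselves are tabulated in the paper).

```python
def equilibrium_constant(S_N: float, R_N: float, R_E: float) -> float:
    """Evaluate the proposed correlation log K = S_N * (R_N + R_E):
    nucleophile parameters S_N and R_N combined with the imine's
    electrophilicity parameter R_E give the DCR equilibrium constant."""
    return 10 ** (S_N * (R_N + R_E))

# Illustrative values only; the fitted scales are tabulated in the paper.
print(equilibrium_constant(S_N=1.0, R_N=2.5, R_E=-1.2))  # K ~ 20
```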
NASA Astrophysics Data System (ADS)
Declair, Stefan; Saint-Drenan, Yves-Marie; Potthast, Roland
2016-04-01
Determining the amount of weather dependent renewable energy is a demanding task for transmission system operators (TSOs), and wind and photovoltaic (PV) prediction errors require the use of reserve power, which generates costs and can, in extreme cases, endanger the security of supply. In the project EWeLiNE, funded by the German government, the German Weather Service and the Fraunhofer Institute on Wind Energy and Energy System Technology are developing innovative weather and power forecasting models and tools for the grid integration of weather dependent renewable energy. The key part in energy prediction process chains is the numerical weather prediction (NWP) system. Wind speed and irradiation forecasts from NWP systems are, however, subject to several sources of error. The quality of the wind power prediction is mainly penalized by forecast errors of the NWP model in the planetary boundary layer (PBL), which is characterized by high spatial and temporal fluctuations of the wind speed. For PV power prediction, weaknesses of the NWP model in correctly forecasting, for example, low stratus, the absorption of condensed water, or aerosol optical depth are the main sources of error. Inaccurate radiation schemes (e.g. the two-stream parametrization) are also a known deficit of NWP systems with regard to irradiation forecasts. To mitigate errors like these, NWP model data can be corrected by post-processing techniques such as model output statistics and calibration using historical observational data. Additionally, the latest observations can be used in a pre-processing technique called data assimilation (DA). In DA, not only are the initial fields provided, but the model is also synchronized with reality, the observations, and hence the model error is reduced in the forecast. Besides conventional observation networks like radiosondes, synoptic observations or air reports of wind, pressure and humidity, the number of observations measuring meteorological information indirectly, such as satellite radiances, radar reflectivities or GPS slant delays, is increasing strongly. The numerous wind farms and PV plants installed in Germany potentially represent a dense meteorological network assessing irradiation and wind speed through their power measurements. The accuracy of the NWP data may thus be enhanced by extending the observations in the assimilation with this new source of information. Wind power data can serve as indirect measurements of wind speed at hub height. The impact on the NWP model is potentially interesting since the conventional observation network lacks measurements in this part of the PBL. Photovoltaic power plants can provide information on clouds, aerosol optical depth or low stratus in the sense of remote sensing: the power output is strongly dependent on perturbations along the slant between sun position and PV panel. Additionally, this kind of data is not limited to the vertical column above or below the detector; it may thus complement satellite data and compensate for weaknesses in the radiation scheme. In this contribution, the DA method (Local Ensemble Transform Kalman Filter, LETKF) is briefly sketched. Furthermore, the computation of the model power equivalents is described, and first assimilation results are presented and discussed.
Zou, Lingyun; Wang, Zhengzhi; Huang, Jiaomin
2007-12-01
Subcellular location is one of the key biological characteristics of proteins. Position-specific profiles (PSP) have been introduced as important characteristics of proteins in this article. In this study, to obtain position-specific profiles, the Position Specific Iterative-Basic Local Alignment Search Tool (PSI-BLAST) has been used to search for protein sequences in a database. Position-specific scoring matrices are extracted from the profiles as one class of characteristics. Four-part amino acid compositions and 1st-7th order dipeptide compositions have also been calculated as the other two classes of characteristics. Therefore, twelve characteristic vectors are extracted from each of the protein sequences. Next, the characteristic vectors are weighed by a simple weighing function and inputted into a BP neural network predictor named PSP-Weighted Neural Network (PSP-WNN). The Levenberg-Marquardt algorithm is employed to adjust the weight matrices and thresholds during the network training instead of the error back propagation algorithm. With a jackknife test on the RH2427 dataset, PSP-WNN has achieved a higher overall prediction accuracy of 88.4% rather than the prediction results by the general BP neural network, Markov model, and fuzzy k-nearest neighbors algorithm on this dataset. In addition, the prediction performance of PSP-WNN has been evaluated with a five-fold cross validation test on the PK7579 dataset and the prediction results have been consistently better than those of the previous method on the basis of several support vector machines, using compositions of both amino acids and amino acid pairs. These results indicate that PSP-WNN is a powerful tool for subcellular localization prediction. At the end of the article, influences on prediction accuracy using different weighting proportions among three characteristic vector categories have been discussed. An appropriate proportion is considered by increasing the prediction accuracy.
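A minimal sketch of the weighing step, assuming a simple per-category scalar weight applied before concatenation; the vector sizes and weight values are illustrative, not those of PSP-WNN.

```python
import numpy as np

def weighted_input(pssm: np.ndarray, aa_comp: np.ndarray,
                   dipep: np.ndarray, w=(0.5, 0.3, 0.2)) -> np.ndarray:
    """Scale the three characteristic-vector categories (PSSM-derived,
    amino acid composition, dipeptide composition) by per-category
    weights before concatenating them into one network input vector."""
    return np.concatenate([w[0] * pssm, w[1] * aa_comp, w[2] * dipep])

rng = np.random.default_rng(0)
x = weighted_input(rng.random(40), rng.random(20), rng.random(400))
print(x.shape)   # (460,)
```

Sweeping the weight tuple and re-evaluating prediction accuracy is the kind of proportion study the article discusses at its end.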
Gao, Xiang-Ming; Yang, Shi-Feng; Pan, San-Bo
2017-01-01
To predict the output power of photovoltaic systems, which is nonstationary and random, an output power prediction model for grid-connected PV systems is proposed based on empirical mode decomposition (EMD) and a support vector machine (SVM) optimized with an artificial bee colony (ABC) algorithm. First, according to the weather forecast data sets for the prediction date, the time series data of output power on a similar day, at 15-minute intervals, are built. Second, the time series data of the output power are decomposed into a series of components, including intrinsic mode components IMFn and a trend component Res, at different scales using EMD. A corresponding SVM prediction model is established for each IMF component and the trend component, and the SVM model parameters are optimized with the artificial bee colony algorithm. Finally, the prediction results of each model are reconstructed, and the predicted values of the output power of the grid-connected PV system are obtained. The prediction model is tested with actual data, and the results show that the power prediction model based on EMD and ABC-SVM has a faster calculation speed and higher prediction accuracy than both the single SVM prediction model and the EMD-SVM prediction model without optimization.
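A hedged sketch of the decompose-predict-reconstruct pipeline, assuming the PyEMD package for EMD and scikit-learn's SVR, with a plain grid search standing in for the artificial bee colony optimizer; the PV signal is synthetic.

```python
import numpy as np
from PyEMD import EMD                 # pip install EMD-signal
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / 96)           # 4 days at 15-minute resolution
power = np.clip(np.sin(np.pi * (t % 1.0)), 0, None) \
        + 0.05 * rng.normal(size=t.size)

emd = EMD()
emd.emd(power)
imfs, residue = emd.get_imfs_and_residue()
components = list(imfs) + [residue]   # IMFn components plus trend Res

def lagged(x, n_lags=4):
    """Build (lagged samples -> next value) training pairs."""
    X = np.column_stack([x[i:len(x) - n_lags + i] for i in range(n_lags)])
    return X, x[n_lags:]

forecast = 0.0
for comp in components:               # one SVR per component
    X, y = lagged(comp)
    # plain grid search standing in for the bee-colony tuning of C, gamma
    best = min((SVR(C=C, gamma=g).fit(X, y)
                for C in (1.0, 10.0, 100.0) for g in (0.01, 0.1, 1.0)),
               key=lambda m: np.mean((m.predict(X) - y) ** 2))
    forecast += best.predict(X[-1:])[0]

print(f"next-step PV output forecast: {forecast:.3f}")
```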
Maier-Kiener, Verena; Schuh, Benjamin; George, Easo P.; ...
2016-11-19
The equiatomic high-entropy alloy (HEA) CrMnFeCoNi has recently been shown to be microstructurally unstable, resulting in a multi-phase microstructure after intermediate-temperature annealing treatments. The decomposition occurs rapidly in the nanocrystalline (NC) state and after longer annealing times in coarse-grained states. In this work, nanoindentation was used to characterize the mechanical properties of differently annealed NC states containing multiple phases. The results revealed, besides drastic changes in hardness, for the first time significant changes in the Young's modulus and strain rate sensitivity. Nanoindentation of NC HEAs is therefore a useful complementary screening tool, with high potential as a high-throughput approach to detect phase decomposition, which can also be used to qualitatively predict the long-term stability of single-phase HEAs.
DNA-binding specificity prediction with FoldX.
Nadra, Alejandro D; Serrano, Luis; Alibés, Andreu
2011-01-01
With the advent of Synthetic Biology, a field between basic science and applied engineering, new computational tools are needed to help scientists reach their design goals while optimizing resources. In this chapter, we present a simple and powerful method to either determine the DNA specificity of a wild-type protein or design new specificities by using the protein design algorithm FoldX. The only basic requirement is a good-resolution structure of the complex. Protein-DNA interaction design may aid the development of new parts designed to be orthogonal, decoupled, and precise in their targets. Further, it could help to fine-tune systems in terms of specificity, discrimination, and binding constants. In the age of newly developed devices and invented systems, computer-aided engineering promises to be an invaluable tool. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Bavuso, Salvatore J.
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. The Hybrid Automated Reliability Predictor (HARP) tutorial provides insight into HARP modeling techniques and the interactive textual prompting input language via a step-by-step explanation and demonstration of HARP's fault occurrence/repair model and the fault/error handling models. Example applications are worked in their entirety and the HARP tabular output data are presented for each. Simple models are presented at first with each succeeding example demonstrating greater modeling power and complexity. This document is not intended to present the theoretical and mathematical basis for HARP.
A computer controlled power tool for the servicing of the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Richards, Paul W.; Konkel, Carl; Smith, Chris; Brown, Lee; Wagner, Ken
1996-01-01
The Hubble Space Telescope (HST) Pistol Grip Tool (PGT) is a self-contained, microprocessor-controlled, battery-powered, 3/8-inch-drive hand-held tool. The PGT also functions as a non-powered ratchet wrench. This tool will be used by astronauts during Extravehicular Activity (EVA) to apply torque to the HST and HST Servicing Support Equipment mechanical interfaces and fasteners. Numerous torque, speed, and turn-or-angle limits are programmed into the PGT for use during various missions. Batteries are replaceable during ground operations, Intravehicular Activities, and EVAs.
A Nuclear Renaissance: The Role of Nuclear Power in Mitigating Climate Change
NASA Astrophysics Data System (ADS)
Winslow, Anne
2011-06-01
The U. N. Framework Convention on Climate Change calls for the stabilization of greenhouse gas (GHG) emissions at double the preindustrial atmospheric carbon dioxide concentration to avoid dangerous anthropogenic interference with the climate system. To achieve this goal, carbon emissions in 2050 must not exceed their current level, despite predictions of a dramatic increase in global electricity demand. The need to reduce GHG emissions and simultaneously provide for additional electricity demand has led to a renewed interest in the expansion of alternatives to fossil fuels, particularly renewable energy and nuclear power. As renewable energy sources are often constrained by the intermittency of natural energy forms, scalability concerns, cost, and environmental barriers, many governments and even prominent environmentalists turn to nuclear energy as a source of clean, reliable base-load electricity. Described by some as a "nuclear renaissance", this trend of embracing nuclear power as a tool to mitigate climate change will dramatically influence the feasibility of emerging nuclear programs around the world.
Optics assembly for high power laser tools
Fraze, Jason D.; Faircloth, Brian O.; Zediker, Mark S.
2016-06-07
There is provided a high power laser rotational optical assembly for use with, or in high power laser tools for performing high power laser operations. In particular, the optical assembly finds applications in performing high power laser operations on, and in, remote and difficult to access locations. The optical assembly has rotational seals and bearing configurations to avoid contamination of the laser beam path and optics.
PredictSNP: Robust and Accurate Consensus Classifier for Prediction of Disease-Related Mutations
Bendl, Jaroslav; Stourac, Jan; Salanda, Ondrej; Pavelka, Antonin; Wieben, Eric D.; Zendulka, Jaroslav; Brezovsky, Jan; Damborsky, Jiri
2014-01-01
Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for the prediction of the effects of mutations on protein function are very important for analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement is hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicities, inconsistencies and mutations previously used in the training of evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into a consensus classifier PredictSNP, resulting into significantly improved prediction performance, and at the same time returned results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp. PMID:24453961
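The general consensus idea can be illustrated with a weighted majority vote over per-tool calls; this sketch is an illustration only, since PredictSNP's actual combination scheme and confidence weighting are described in the paper, and the votes below are invented.

```python
from collections import Counter

def consensus_predict(votes: dict, weights: dict = None) -> str:
    """Combine per-tool predictions ('deleterious'/'neutral') by a
    weighted majority vote -- the general idea behind a consensus
    classifier such as PredictSNP."""
    weights = weights or {tool: 1.0 for tool in votes}
    tally = Counter()
    for tool, call in votes.items():
        tally[call] += weights.get(tool, 1.0)
    return tally.most_common(1)[0][0]

votes = {"MAPP": "deleterious", "PhD-SNP": "deleterious",
         "PolyPhen-2": "neutral", "SIFT": "deleterious",
         "SNAP": "deleterious", "PANTHER": "neutral"}
print(consensus_predict(votes))   # 'deleterious'
```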
1998-01-01
equipped with a constant-pressure switch or control: drills; tappers; fastener drivers; horizontal, vertical, and angle grinders with wheels more than ... hand-held power tools must be equipped with either a positive “on-off” control switch, a constant-pressure switch, or a “lock-on” control: disc sanders ... percussion tools with no means of holding accessories securely must be equipped with a constant-pressure switch that will shut off the power when the
NASA Astrophysics Data System (ADS)
Kapur, Pawan
The miniaturization paradigm for silicon integrated circuits has resulted in a tremendous cost and performance advantage. Aggressive shrinking of devices provides faster transistors and greater functionality for circuit design. However, scaling-induced smaller wire cross sections, coupled with longer lengths owing to larger chip areas, result in a steady deterioration of interconnects. This degradation in interconnect trends threatens to slow down the rapid growth along Moore's law. This work predicts that the situation is worse than anticipated. It shows that, in light of technology and reliability constraints, the scaling-induced increase in electron surface scattering, the fractional cross-section area occupied by the highly resistive barrier, and realistic interconnect operating temperatures will lead to a significant rise in the effective resistivity of modern copper-based interconnects. We start by discussing various technology factors affecting copper resistivity. We next develop simulation tools to model these effects. Using these tools, we quantify the increase in realistic copper resistivity as a function of future technology nodes, under various technology assumptions. Subsequently, we evaluate the impact of these technology effects on the delay and power dissipation of global signaling interconnects. Modern long on-chip wires use repeaters, which dramatically improve their delay and bandwidth. We quantify repeated-wire delays and power dissipation using realistic resistance trends at future nodes. With the motivation of reducing power, we formalize a methodology that trades power with delay very efficiently for repeated wires. Using this method, we find that although the repeater power comes down, the total power dissipation due to wires is still very large at future nodes. Finally, we explore optical interconnects as a possible substitute for specific interconnect applications. We model an optical receiver and waveguides, and use these models to assess future optical system performance. We then compare the delay and power of future metal interconnects with those of optical interconnects for the global signaling application, and compare the power dissipation of the two approaches for an upper-level clock distribution application. We find that for long on-chip communication links, optical interconnects have lower latencies than future metal interconnects at comparable levels of power dissipation.
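As an example of the kind of repeated-wire estimate discussed above, the sketch below evaluates the standard first-order (Bakoglu-style) delay per unit length of an optimally repeated wire; the parameter values are illustrative, not the thesis's extracted numbers.

```python
import math

def repeated_wire_delay_per_mm(r_ohm_per_mm: float, c_f_per_mm: float,
                               R0: float, C0: float) -> float:
    """First-order estimate of delay per unit length of an optimally
    repeated wire: t/L ~= 2.5 * sqrt(r * c * R0 * C0), where r and c
    are the wire resistance and capacitance per mm and R0, C0
    characterize a minimum-size driver."""
    return 2.5 * math.sqrt(r_ohm_per_mm * c_f_per_mm * R0 * C0)

# e.g. 300 ohm/mm and 0.2 pF/mm wire; 5 kohm, 1 fF driver
d = repeated_wire_delay_per_mm(300.0, 0.2e-12, 5e3, 1e-15)
print(f"{d * 1e12:.1f} ps/mm")   # ~43 ps/mm for these values
```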
Power-Tool Adapter For T-Handle Screws
NASA Technical Reports Server (NTRS)
Deloach, Stephen R.
1992-01-01
Proposed adapter enables use of pneumatic drill, electric drill, electric screwdriver, or similar power tool to tighten or loosen T-handled screws. Notched tube with perpendicular rod welded to it inserted in chuck of tool. Notched end of tube slipped over screw handle.
iDrug: a web-accessible and interactive drug discovery and design platform
2014-01-01
Background The progress in computer-aided drug design (CADD) approaches over the past decades has accelerated early-stage pharmaceutical research. Many powerful standalone tools for CADD have been developed in academia. As these programs are developed by various research groups, a consistent, user-friendly online graphical working environment, combining computational techniques such as pharmacophore mapping, similarity calculation, scoring, and target identification, is needed. Results We present a versatile, user-friendly, and efficient online tool for computer-aided drug design based on pharmacophore and 3D molecular similarity searching. The web interface enables binding site detection, virtual screening hit identification, and drug target prediction in an interactive manner through a seamless interface to all adapted packages (e.g., Cavity, PocketV.2, PharmMapper, SHAFTS). Several commercially available compound databases for hit identification and a well-annotated pharmacophore database for drug target prediction were integrated into iDrug as well. The web interface provides tools for real-time molecular building/editing, converting, displaying, and analyzing. All customized configurations of the functional modules can be accessed through the featured session files provided, which can be saved to the local disk and uploaded to resume or update previous work. Conclusions iDrug is easy to use and provides a novel, fast and reliable tool for conducting drug design experiments. Using iDrug, various molecular design processing tasks can be submitted and visualized simply in one browser, without locally installing any standalone modeling software. iDrug is accessible free of charge at http://lilab.ecust.edu.cn/idrug. PMID:24955134
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammel, T.E.; Srinivas, V.
1978-11-01
This initial definition of the power degradation prediction technique outlines a model for predicting SIG/Galileo mean EOM power using component test data and data from a module power degradation demonstration test program. (LCL)
NASA Technical Reports Server (NTRS)
Sree, Dave
2015-01-01
Far-field acoustic power level and performance analyses of the open rotor model F31/A31 have been performed to determine its noise characteristics at simulated scaled takeoff, nominal takeoff, and approach flight conditions. The nonproprietary parts of the data obtained from experiments in the 9- by 15-Foot Low-Speed Wind Tunnel (9×15 LSWT) were provided by NASA Glenn Research Center to perform the analyses. The tone and broadband noise components have been separated from the raw test data by using a new data analysis tool. Results in terms of sound pressure levels, acoustic power levels, and their variations with rotor speed, angle of attack, thrust, and input shaft power are presented and discussed. The effect of an upstream pylon on the noise levels of the model is addressed. Empirical equations relating the model's acoustic power level, thrust, and input shaft power have been developed. The far-field acoustic efficiency of the model is also determined for various simulated flight conditions. It is intended that the results presented in this work will serve as a database for comparison and improvement of other open rotor blade designs and for validating open rotor noise prediction codes.
Karaismailoğlu, Eda; Dikmen, Zeliha Günnur; Akbıyık, Filiz; Karaağaoğlu, Ahmet Ergun
2018-04-30
Background/aim: Myoglobin, cardiac troponin T, B-type natriuretic peptide (BNP), and creatine kinase isoenzyme MB (CK-MB) are frequently used biomarkers for evaluating the risk of patients admitted to an emergency department with chest pain. Recently, time-dependent receiver operating characteristic (ROC) analysis has been used to evaluate the predictive power of biomarkers where disease status can change over time. We aimed to determine the best set of biomarkers for estimating cardiac death during follow-up. We also obtained optimal cut-off values of these biomarkers, which differentiate between patients with and without risk of death. A web tool was developed to estimate the time intervals during which patients are at risk. Materials and methods: A total of 410 patients admitted to the emergency department with chest pain and shortness of breath were included. Cox regression analysis was used to determine an optimal set of biomarkers for estimating cardiac death and to combine the significant biomarkers. Time-dependent ROC analysis was performed to evaluate the performance of the significant biomarkers and a combined biomarker during 240 h. The bootstrap method was used to assess statistical significance and the Youden index was used to determine optimal cut-off values. Results: Myoglobin and BNP were significant by multivariate Cox regression analysis. Areas under the time-dependent ROC curves of myoglobin and BNP were about 0.80 during 240 h, and that of the combined biomarker (myoglobin + BNP) increased to 0.90 during the first 180 h. Conclusion: Although myoglobin is not clinically specific to a cardiac event, in our study both myoglobin and BNP were found to be statistically significant for estimating cardiac death. Using the combined biomarker may increase predictive power. Our web tool can be useful for evaluating the risk status of new patients and helping clinicians make decisions.
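As an illustration of the cut-off selection step, the following is a minimal sketch of choosing an optimal threshold with the Youden index from an ordinary ROC curve; it assumes a simple binary outcome at a fixed horizon, whereas the study used time-dependent ROC analysis that accounts for censoring, and all data below are synthetic.

```python
# Minimal sketch: choosing an optimal biomarker cut-off with the Youden index.
# Illustrative only -- the study used time-dependent ROC analysis with censoring;
# here we assume a simple binary outcome at a fixed time horizon.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical data: combined biomarker score and 240 h cardiac-death indicator.
score = np.concatenate([rng.normal(1.0, 1.0, 60), rng.normal(0.0, 1.0, 350)])
died = np.concatenate([np.ones(60), np.zeros(350)])

fpr, tpr, thresholds = roc_curve(died, score)
youden = tpr - fpr                  # Youden index J = sensitivity + specificity - 1
best = np.argmax(youden)
print(f"AUC = {roc_auc_score(died, score):.2f}")
print(f"Optimal cut-off = {thresholds[best]:.2f} "
      f"(sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f})")
```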
Statistical modeling to support power system planning
NASA Astrophysics Data System (ADS)
Staid, Andrea
This dissertation focuses on data-analytic approaches that improve our understanding of power system applications to promote better decision-making. It tackles issues of risk analysis, uncertainty management, resource estimation, and the impacts of climate change. Tools of data mining and statistical modeling are used to bring new insight to a variety of complex problems facing today's power system. The overarching goal of this research is to better understand the power system risk environment in support of improved operation, investment, and planning decisions. The first chapter introduces some challenges faced in planning for a sustainable power system. Chapter 2 analyzes the driving factors behind the disparity in wind energy investments among states, with the goal of determining the impact that state-level policies have on incentivizing wind energy. Findings show that policy differences do not explain the disparities; physical and geographical factors are more important. Chapter 3 extends conventional wind forecasting to a risk-based focus of predicting maximum wind speeds, which are dangerous for offshore operations. Statistical models are presented that issue probabilistic predictions for the highest wind speed expected in a three-hour interval. These models achieve a high degree of accuracy, and their use can improve safety and reliability in practice. Chapter 4 examines the challenges of wind power estimation for onshore wind farms. Several methods for wind power resource assessment are compared, and the weaknesses of the Jensen model are demonstrated. For two onshore farms, statistical models outperform other methods, even when very little information is known about the wind farm. Lastly, Chapter 5 focuses on the power system more broadly in the context of the risks expected from tropical cyclones in a changing climate. Risks to U.S. power system infrastructure are simulated under different scenarios of tropical cyclone behavior that may result from climate change. The scenario-based approach allows me to address the deep uncertainty present by quantifying the range of impacts, identifying the most critical parameters, and assessing the sensitivity of local areas to a changing risk. Overall, this body of work quantifies the uncertainties present in several operational and planning decisions for power system applications.
Hedger, George; Sansom, Mark S. P.
2017-01-01
Lipid molecules are able to selectively interact with specific sites on integral membrane proteins, and modulate their structure and function. Identification and characterisation of these sites is of importance for our understanding of the molecular basis of membrane protein function and stability, and may facilitate the design of lipid-like drug molecules. Molecular dynamics simulations provide a powerful tool for the identification of these sites, complementing advances in membrane protein structural biology and biophysics. We describe recent notable biomolecular simulation studies which have identified lipid interaction sites on a range of different membrane proteins. The sites identified in these simulation studies agree well with those identified by complementary experimental techniques. This demonstrates the power of the molecular dynamics approach in the prediction and characterization of lipid interaction sites on integral membrane proteins. PMID:26946244
Loads produced by a suited subject performing tool tasks without the use of foot restraints
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar L.; Poliner, Jeffrey; Klute, Glenn K.
1993-01-01
With an increase in the frequency of extravehicular activities (EVAs) aboard the Space Shuttle, NASA is interested in determining the capabilities of suited astronauts performing manual tasks during an EVA, in particular in situations where portable foot restraints are not used to stabilize the astronauts. Efforts were made to document the forces that are transmitted to the spacecraft while pushing and pulling an object as well as while operating a standard wrench and an automatic power tool. The six subjects studied aboard the KC-135 reduced-gravity aircraft were asked to exert a maximum torque and to maintain a constant level of torque with a wrench, to push and pull an EVA handrail, and to operate a Hubble Space Telescope (HST) power tool. The results give an estimate of the forces and moments that an operator will transmit to the handrail as well as to the supporting structure. In general, it was more effective to work the tool inward, toward the body, rather than away from the body. There were no differences in strength capabilities between the right and left hands. The power tool was difficult to use, and it is suggested that ergonomic redesign of the power tool may increase the efficiency of its use.
How animals move: comparative lessons on animal locomotion.
Schaeffer, Paul J; Lindstedt, Stan L
2013-01-01
Comparative physiology often provides unique insights into animal structure and function. It is specifically through this lens that we discuss the fundamental properties of skeletal muscle and animal locomotion, incorporating variation in body size and evolved differences among species. For example, muscle frequencies in vivo are highly constrained by body size, which apparently tunes muscle use to maximize recovery of elastic recoil potential energy. Secondary to this constraint, there is an expected linking of skeletal muscle structural and functional properties. Muscle is relatively simple structurally, but by changing the proportions of its few components, a diverse range of functional outputs is possible. Thus, there is a consistent and predictable relation between muscle function and myocyte composition that illuminates animal locomotion. When animals move, the mechanical properties of muscle diverge from the static textbook force-velocity relations described by A. V. Hill, as recovery of elastic potential energy together with force and power enhancement with activation during stretch combine to modulate performance. These relations are best understood through the tool of work loops. Also, when animals move, locomotion is often conveniently categorized energetically. Burst locomotion is typified by high power outputs and short durations, while sustained, cyclic locomotion engages a smaller fraction of the muscle tissue, yielding lower force and power. However, closer examination reveals that the energetics of locomotion is a continuum rather than a dichotomy. There is a remarkably predictable relationship between duration of activity and peak sustainable performance.
Predicting the impact of chromium on flow-accelerated corrosion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chexal, B.; Goyette, L.F.; Horowitz, J.S.
1996-12-01
Flow-Accelerated Corrosion (FAC) continues to cause problems in nuclear and fossil power plants. Many experiments have been performed to understand the mechanism of FAC. For approximately twenty years, it has been widely recognized that the presence of small amounts of chromium will reduce the rate of FAC. This effect was quantified in the eighties by research performed in France, Germany, and the Netherlands. The results of this research have been incorporated into the computer-based tools used by utility engineers to deal with this issue. For some time, plant data from Diablo Canyon have suggested that the existing correlations relating the concentration of chromium to the rate of FAC are conservative. Laboratory examinations have supported this observation. It appears that the existing correlations fail to capture a change in mechanism from a FAC process with linear kinetics to a general corrosion process with parabolic kinetics. This change in mechanism occurs at a chromium level of approximately 0.1%, within the allowable alloy range of the typical carbon steel (ASTM/ASME A106 Grade B) used in power piping in most domestic plants. It has been difficult to obtain plant data with sufficient chromium to develop a new correlation. Data from Diablo Canyon and the Dukovany Power Plant in the Czech Republic will be used to develop a new chromium correlation for predicting FAC rates.
Decision-making tools in prostate cancer: from risk grouping to nomograms.
Fontanella, Paolo; Benecchi, Luigi; Grasso, Angelica; Patel, Vipul; Albala, David; Abbou, Claude; Porpiglia, Francesco; Sandri, Marco; Rocco, Bernardo; Bianchi, Giampaolo
2017-12-01
Prostate cancer (PCa) is the most common solid neoplasm and the second leading cause of cancer death in men. After the Partin tables were developed, a number of predictive and prognostic tools became available for risk stratification. These tools have allowed the urologist to better characterize this disease and have led to more confident treatment decisions for patients. The purpose of this study is to critically review the decision-making tools currently available to the urologist, from the moment when PCa is first diagnosed until patients experience metastatic progression and death. A systematic and critical analysis through the Medline, EMBASE, Scopus, and Web of Science databases was carried out in February 2016 as per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. The search was conducted using the following key words: "prostate cancer," "prediction tools," "nomograms." Seventy-two studies were identified in the literature search. We summarized the results into six sections: tools for prediction of life expectancy (before treatment), tools for prediction of pathological stage (before treatment), tools for prediction of survival and cancer-specific mortality (before/after treatment), tools for prediction of biochemical recurrence (before/after treatment), tools for prediction of metastatic progression (after treatment), and, in the last section, biomarkers and genomics. The management of PCa patients requires a tailored approach to deliver a truly personalized treatment. The currently available tools greatly assist the urologist in the decision-making process. These tests perform very well in high-grade and low-grade disease, while for intermediate-grade disease further research is needed. Newly discovered markers, genomic tests, and advances in imaging acquisition through mpMRI will help in instilling confidence that the appropriate treatments are being offered to patients with prostate cancer.
Implementation Analysis of Cutting Tool Carbide with Cast Iron Material S45 C on Universal Lathe
NASA Astrophysics Data System (ADS)
Junaidi; Hestukoro, Soni; Yanie, Ahmad; Jumadi; Eddy
2017-12-01
Cutting tools are the working tools of a lathe. This study analyses the cutting process of a carbide tool on cast iron material on a universal lathe with respect to several aspects, namely cutting force, cutting speed, cutting power, indicated cutting power, temperature in zone 1, and temperature in zone 2. The purpose of this study was to determine the cutting speed, cutting power, electromotor power, and zone 1 and zone 2 temperatures involved in driving the carbide cutting tool in the process of turning cast iron material. The cutting force was obtained from graphical analysis of the relationship between the recommended cutting-force component and the plane of the cut, and the cutting speed was obtained from graphical analysis of the relationship between the recommended cutting speed and the feed rate.
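The quantities analysed are tied together by standard machining relations; the sketch below uses the textbook forms (cutting power as cutting force times cutting speed, and motor power as cutting power over drivetrain efficiency), which are not necessarily the exact expressions used in the paper, with hypothetical numbers.

```python
# Standard machining relations (not necessarily the paper's exact expressions):
# cutting power Pc = Fc * Vc, and required electromotor power Pm = Pc / eta,
# where eta is the drivetrain efficiency. All numbers below are hypothetical.

def cutting_power(cutting_force_n: float, cutting_speed_m_min: float) -> float:
    """Cutting power in kW from force (N) and speed (m/min)."""
    return cutting_force_n * (cutting_speed_m_min / 60.0) / 1000.0

def electromotor_power(p_cut_kw: float, efficiency: float = 0.8) -> float:
    """Motor power (kW) needed to deliver the cutting power at the tool."""
    return p_cut_kw / efficiency

pc = cutting_power(cutting_force_n=1200.0, cutting_speed_m_min=90.0)
print(f"Cutting power: {pc:.2f} kW, motor power: {electromotor_power(pc):.2f} kW")
```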
REVEL: An Ensemble Method for Predicting the Pathogenicity of Rare Missense Variants.
Ioannidis, Nilah M; Rothstein, Joseph H; Pejaver, Vikas; Middha, Sumit; McDonnell, Shannon K; Baheti, Saurabh; Musolf, Anthony; Li, Qing; Holzinger, Emily; Karyadi, Danielle; Cannon-Albright, Lisa A; Teerlink, Craig C; Stanford, Janet L; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan M; Schleutker, Johanna; Carpten, John D; Powell, Isaac J; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William D; Mandal, Diptasri; Eeles, Rosalind A; Kote-Jarai, Zsofia; Bustamante, Carlos D; Schaid, Daniel J; Hastie, Trevor; Ostrander, Elaine A; Bailey-Wilson, Joan E; Radivojac, Predrag; Thibodeau, Stephen N; Whittemore, Alice S; Sieh, Weiva
2016-10-06
The vast majority of coding variants are rare, and assessment of the contribution of rare variants to complex traits is hampered by low statistical power and limited functional data. Improved methods for predicting the pathogenicity of rare coding variants are needed to facilitate the discovery of disease variants from exome sequencing studies. We developed REVEL (rare exome variant ensemble learner), an ensemble method for predicting the pathogenicity of missense variants on the basis of individual tools: MutPred, FATHMM, VEST, PolyPhen, SIFT, PROVEAN, MutationAssessor, MutationTaster, LRT, GERP, SiPhy, phyloP, and phastCons. REVEL was trained with recently discovered pathogenic and rare neutral missense variants, excluding those previously used to train its constituent tools. When applied to two independent test sets, REVEL had the best overall performance (p < 10^-12) as compared to any individual tool and seven ensemble methods: MetaSVM, MetaLR, KGGSeq, Condel, CADD, DANN, and Eigen. Importantly, REVEL also had the best performance for distinguishing pathogenic from rare neutral variants with allele frequencies <0.5%. The area under the receiver operating characteristic curve (AUC) for REVEL was 0.046-0.182 higher in an independent test set of 935 recent SwissVar disease variants and 123,935 putatively neutral exome sequencing variants and 0.027-0.143 higher in an independent test set of 1,953 pathogenic and 2,406 benign variants recently reported in ClinVar than the AUCs for other ensemble methods. We provide pre-computed REVEL scores for all possible human missense variants to facilitate the identification of pathogenic variants in the sea of rare variants discovered as sequencing studies expand in scale. Copyright © 2016 American Society of Human Genetics. All rights reserved.
Integrating remotely sensed fires for predicting deforestation for REDD.
Armenteras, Dolors; Gibbes, Cerian; Anaya, Jesús A; Dávalos, Liliana M
2017-06-01
Fire is an important tool in tropical forest management, as it alters forest composition, structure, and the carbon budget. The United Nations program on Reducing Emissions from Deforestation and Forest Degradation (REDD+) aims to sustainably manage forests, as well as to conserve and enhance their carbon stocks. Despite the crucial role of fire management, decision-making on REDD+ interventions fails to systematically include fires. Here, we address this critical knowledge gap in two ways. First, we review REDD+ projects and programs to assess the inclusion of fires in monitoring, reporting, and verification (MRV) systems. Second, we model the relationship between fire and forest for a pilot site in Colombia using near-real-time (NRT) fire monitoring data derived from the Moderate Resolution Imaging Spectroradiometer (MODIS). The literature review revealed that fire has yet to be incorporated as a key component of MRV systems. Spatially explicit modeling of land use change showed that the probability of deforestation declined sharply with increasing distance to the nearest fire in the preceding year (multi-year model area under the curve [AUC] 0.82). Deforestation predictions based on the model performed better than the official REDD early-warning system: the model AUC for 2013 and 2014 was 0.81, compared to 0.52 for the early-warning system in 2013 and 0.68 in 2014. This demonstrates that NRT fire monitoring is a powerful tool for predicting sites of deforestation. Applying new, publicly available, open-access NRT fire data should be an essential element of early-warning systems to detect and prevent deforestation. Our results provide tools for improving both the current MRV systems and the deforestation early-warning system in Colombia. © 2017 by the Ecological Society of America.
NASA Astrophysics Data System (ADS)
Kiliclar, Yalin; Laurischkat, Roman; Vladimirov, Ivaylo N.; Reese, Stefanie
2011-08-01
The presented project deals with a robot-based incremental sheet metal forming process, called roboforming, which has been developed at the Chair of Production Systems. It is characterized by flexible shaping using a freely programmable, path-synchronous movement of two industrial robots. The final shape is produced by the incremental infeed of the forming tool in the depth direction and its movement along the part contour in the lateral direction. However, the geometries formed in roboforming deviate several millimeters from the reference geometry. This results from the compliance of the involved machine structures and the springback effects of the workpiece. The project aims to predict these compliance-induced deviations and to carry out a compensative path planning based on this prediction. A planning tool is therefore implemented that compensates for the robots' compliance and the springback effects of the sheet metal. The forming process is simulated by means of a finite element analysis using a material model developed at the Institute of Applied Mechanics (IFAM). It is based on the multiplicative split of the deformation gradient in the context of hyperelasticity and combines nonlinear kinematic and isotropic hardening. Low-order finite elements used to simulate thin sheet structures, such as those used in the experiments, suffer from locking, a nonphysical stiffening effect. For an efficient finite element analysis, a special solid-shell finite element formulation based on reduced integration with hourglass stabilization has been developed. To circumvent the different locking effects, the enhanced assumed strain (EAS) and the assumed natural strain (ANS) concepts are included in this formulation. With such powerful tools available, more accurate geometries are obtained.
2011-01-01
Background Age-related bone loss is asymptomatic, and the morbidity of osteoporosis is secondary to the fractures that occur. Common sites of fracture include the spine, hip, forearm, and proximal humerus. Fractures at the hip incur the greatest morbidity and mortality and give rise to the highest direct costs for health services. Their incidence increases exponentially with age. Independently of changes in population demography, the age- and sex-specific incidence of osteoporotic fractures appears to be increasing in developing and developed countries. This could mean more than a doubling of the expected burden of osteoporotic fractures in the next 50 years. Methods/Design To assess the predictive power of the WHO FRAX™ tool to identify the subjects with the highest absolute risk of fragility fracture at 10 years in a Spanish population, a predictive validation study of the tool will be carried out. For this purpose, the participants recruited by 1999 will be assessed. These were referred to the DXA-scan department from primary healthcare centres and non-hospital and hospital consultations. Study population: patients attended within the national health services and integrated into the FRIDEX cohort, with at least one dual-energy X-ray absorptiometry (DXA) measurement and one extensive questionnaire related to fracture risk factors. Measurements: at baseline, bone mineral density measurement using DXA, a clinical fracture risk factor questionnaire, dietary calcium intake assessment, history of previous fractures, and related drugs. Follow-up by telephone interview to ascertain fragility fractures over the 10 years, with verification in electronic medical records, and also to record the number of falls in the last year. The absolute risk of fracture will be estimated using the FRAX™ tool from the official web site. Discussion For more than 10 years, numerous publications have recognised the importance of risk factors other than low BMD for new osteoporotic fractures. The extension of a method for calculating the risk (probability) of fractures using the FRAX™ tool is foreseeable in Spain, and this would justify a study such as this one to allow the necessary calibration adjustments of the parameters included in the logarithmic formula that constitutes FRAX™. PMID:21272372
Rocha, R R A; Thomaz, S M; Carvalho, P; Gomes, L C
2009-06-01
The need for prediction is widely recognized in limnology. In this study, data from 25 lakes of the Upper Paraná River floodplain were used to build models to predict chlorophyll-a and dissolved oxygen concentrations. Akaike's information criterion (AIC) was used for model selection. Models were validated with independent data obtained in the same lakes in 2001. Predictor variables that significantly explained chlorophyll-a concentration were pH, electrical conductivity, total seston (positive correlations), and nitrate (negative correlation). This model explained 52% of chlorophyll variability. Variables that significantly explained dissolved oxygen concentration were pH, lake area, and nitrate (all positive correlations); water temperature and electrical conductivity were negatively correlated with oxygen. This model explained 54% of oxygen variability. Validation with independent data showed that both models had the potential to predict algal biomass and dissolved oxygen concentration in these lakes. These findings suggest that multiple regression models are valuable and practical tools for understanding the dynamics of ecosystems and that predictive limnology may still be considered a powerful approach in aquatic ecology.
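A minimal sketch of AIC-based selection among candidate multiple regression models, in the spirit of the study; the predictor names echo the abstract, but the data and candidate sets below are synthetic.

```python
# Minimal sketch of AIC-based selection of a multiple regression model.
# Variable names echo the abstract; the data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 25  # one row per lake
df = pd.DataFrame({
    "pH": rng.normal(7, 0.5, n),
    "conductivity": rng.normal(50, 10, n),
    "seston": rng.normal(20, 5, n),
    "nitrate": rng.normal(30, 8, n),
})
df["chlorophyll"] = (2 * df["pH"] + 0.5 * df["seston"]
                     - 0.3 * df["nitrate"] + rng.normal(0, 2, n))

candidates = [["pH"], ["pH", "seston"], ["pH", "seston", "nitrate"],
              ["pH", "conductivity", "seston", "nitrate"]]
fits = {tuple(c): sm.OLS(df["chlorophyll"], sm.add_constant(df[c])).fit()
        for c in candidates}
best = min(fits, key=lambda c: fits[c].aic)  # lowest AIC wins
print("Best predictor set:", best, "AIC =", round(fits[best].aic, 1))
```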
MLBCD: a machine learning tool for big clinical data.
Luo, Gang
2015-01-01
Predictive modeling is fundamental for extracting value from large clinical data sets, or "big clinical data," advancing clinical research, and improving healthcare. Machine learning is a powerful approach to predictive modeling. Two factors make machine learning challenging for healthcare researchers. First, before training a machine learning model, the values of one or more model parameters called hyper-parameters must typically be specified. Due to their inexperience with machine learning, it is hard for healthcare researchers to choose an appropriate algorithm and hyper-parameter values. Second, many clinical data are stored in a special format. These data must be iteratively transformed into the relational table format before conducting predictive modeling. This transformation is time-consuming and requires computing expertise. This paper presents our vision for and design of MLBCD (Machine Learning for Big Clinical Data), a new software system aiming to address these challenges and facilitate building machine learning predictive models using big clinical data. The paper describes MLBCD's design in detail. By making machine learning accessible to healthcare researchers, MLBCD will open the use of big clinical data and increase the ability to foster biomedical discovery and improve care.
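As an illustration of the hyper-parameter burden MLBCD aims to lift, the following is a minimal sketch of automated hyper-parameter selection via cross-validated grid search; the algorithm, grid, and synthetic data are illustrative assumptions, not MLBCD's actual search procedure.

```python
# Minimal sketch of automated hyper-parameter selection, the kind of burden
# MLBCD aims to lift from healthcare researchers. Data here are synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Each grid point is a candidate hyper-parameter setting; cross-validation
# picks the best one automatically instead of requiring manual choices.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 200], "max_depth": [3, None]},
    cv=5, scoring="roc_auc",
)
grid.fit(X, y)
print("Best hyper-parameters:", grid.best_params_)
print("Cross-validated AUC:", round(grid.best_score_, 3))
```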
Bekiaris, Georgios; Bruun, Sander; Peltre, Clément; Houot, Sabine; Jensen, Lars S
2015-05-01
Fourier transform infrared (FTIR) spectroscopy has been used for several years as a fast, low-cost, reliable technique for characterising a large variety of materials. However, the strong influence of sample particle size and the inability to measure the absorption of very dark and opaque samples have made conventional FTIR unsuitable for many waste materials. FTIR photoacoustic spectroscopy (FTIR-PAS) can eliminate some of the shortcomings of traditional FTIR caused by scattering effects and reflection issues, and recent advances in PAS technology have made commercial instruments available. In this study, FTIR-PAS was used to characterise a wide range of organic waste products and predict their labile carbon fraction, which is normally determined by time-consuming assays. FTIR-PAS was found to be capable of predicting the labile fraction of carbon as efficiently as near-infrared spectroscopy (NIR), and furthermore of identifying the compounds that are correlated with the predicted parameter, thus facilitating a more mechanistic interpretation. Copyright © 2015 Elsevier Ltd. All rights reserved.
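Spectra-to-property prediction of this kind is commonly done with partial least squares (PLS) regression; the abstract does not name the authors' chemometric method, so the following is only a generic sketch with synthetic spectra and a made-up hidden absorption band.

```python
# Generic chemometrics sketch: predicting a property (e.g., labile carbon
# fraction) from spectra with PLS regression. The paper's actual modelling
# method is not specified here; data below are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_samples, n_wavenumbers = 60, 300
spectra = rng.normal(size=(n_samples, n_wavenumbers)).cumsum(axis=1)  # smooth-ish
labile_c = spectra[:, 120] * 0.01 + rng.normal(0, 0.05, n_samples)    # hidden band

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, spectra, labile_c, cv=5, scoring="r2")
print("Cross-validated R^2:", np.round(r2.mean(), 2))
```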
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, H; Chen, J; Pouliot, J
2015-06-15
Purpose: Deformable image registration (DIR) is a powerful tool with the potential to deformably map dose from one computed-tomography (CT) image to another. Errors in the DIR, however, will produce errors in the transferred dose distribution. We have proposed a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), which predicts voxel-specific dose mapping errors on a patient-by-patient basis. This work validates the effectiveness of AUTODIRECT in predicting dose mapping errors with virtual and physical phantom datasets. Methods: AUTODIRECT requires four inputs: moving and fixed CT images and two noise scans of a water phantom (for noise characterization). AUTODIRECT then uses algorithms to generate test deformations and applies them to the moving and fixed images (along with processing) to digitally create sets of test images with known ground-truth deformations that are similar to the actual one. The clinical DIR algorithm is then applied to these test image sets (currently four). From these tests, AUTODIRECT generates spatial and dose uncertainty estimates for each image voxel based on a Student's t distribution. This work compares these uncertainty estimates to the actual errors made by the Velocity Deformable Multi Pass algorithm on 11 virtual and 1 physical phantom datasets. Results: For 11 of the 12 tests, the predicted dose error distributions from AUTODIRECT matched the actual error distributions within 1-6% for the 10 virtual phantoms and within 9% for the physical phantom. For one case, however, the predictions underestimated the errors in the tail of the distribution. Conclusion: Overall, the AUTODIRECT algorithm performed well on the 12 phantom cases for Velocity and was shown to generate accurate estimates of dose-warping uncertainty. AUTODIRECT is able to automatically generate patient-, organ-, and voxel-specific DIR uncertainty estimates. This ability would be useful for patient-specific DIR quality assurance.
Computational tool for simulation of power and refrigeration cycles
NASA Astrophysics Data System (ADS)
Córdoba Tuta, E.; Reyes Orozco, M.
2016-07-01
A small improvement in the thermal efficiency of power cycles brings huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles makes it possible to model the optimal changes for best performance. There is also a boom in research on the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration, with a refrigerant usually serving as the working fluid. A tool for designing the elements of an ORC cycle and selecting the working fluid would be helpful, because heat sources for cogeneration vary widely and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed using the multiplatform Qt environment; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles and the selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculates the plant efficiency, identifies the flow fractions in each branch, and finally generates a highly instructive report in PDF format via LaTeX.
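As a sketch of the kind of calculation such a tool performs, the following evaluates the thermal efficiency of a simple ideal Rankine cycle using CoolProp's Python bindings (the tool itself links the C++ CoolProp library); the pressures, turbine inlet temperature, and choice of water as working fluid are hypothetical.

```python
# Sketch: thermal efficiency of a simple ideal Rankine cycle using CoolProp's
# Python bindings. Pressures, temperature, and fluid choice are hypothetical.
from CoolProp.CoolProp import PropsSI

fluid = "Water"
p_cond, p_boil, T_turbine_in = 10e3, 8e6, 773.15  # Pa, Pa, K

# State 1: saturated liquid leaving the condenser.
h1 = PropsSI("H", "P", p_cond, "Q", 0, fluid)
v1 = 1 / PropsSI("D", "P", p_cond, "Q", 0, fluid)
h2 = h1 + v1 * (p_boil - p_cond)          # ideal pump work on incompressible liquid

# State 3: superheated steam at turbine inlet; state 4: isentropic expansion.
h3 = PropsSI("H", "P", p_boil, "T", T_turbine_in, fluid)
s3 = PropsSI("S", "P", p_boil, "T", T_turbine_in, fluid)
h4 = PropsSI("H", "P", p_cond, "S", s3, fluid)

eta = ((h3 - h4) - (h2 - h1)) / (h3 - h2)  # net work / heat input
print(f"Thermal efficiency: {eta:.1%}")
```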
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, Alexander V.; Rodrigues, George, E-mail: george.rodrigues@lhsc.on.ca; Department of Epidemiology/Biostatistics, University of Western Ontario, London, ON
Purpose: To compare the quality-adjusted life expectancy and overall survival in patients with Stage I non-small-cell lung cancer (NSCLC) treated with either stereotactic body radiation therapy (SBRT) or surgery. Methods and Materials: We constructed a Markov model to describe health states after either SBRT or lobectomy for Stage I NSCLC for a 5-year time frame. We report various treatment strategy survival outcomes stratified by age, sex, and pack-year history of smoking, and compared these with an external outcome prediction tool (Adjuvant! Online). Results: Overall survival, cancer-specific survival, and other causes of death as predicted by our model correlated closely with those predicted by the external prediction tool. Overall survival at 5 years as predicted by baseline analysis of our model is in favor of surgery, with a benefit ranging from 2.2% to 3.0% for all cohorts. Mean quality-adjusted life expectancy ranged from 3.28 to 3.78 years after surgery and from 3.35 to 3.87 years for SBRT. The utility threshold for preferring SBRT over surgery was 0.90. Outcomes were sensitive to quality of life, the proportion of local and regional recurrences treated with standard vs. palliative treatments, and the surgery- and SBRT-related mortalities. Conclusions: The role of SBRT in the medically operable patient is yet to be defined. Our model indicates that SBRT may offer comparable overall survival and quality-adjusted life expectancy as compared with surgical resection. Well-powered prospective studies comparing surgery vs. SBRT in early-stage lung cancer are warranted to further investigate the relative survival, quality of life, and cost characteristics of both treatment paradigms.
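The abstract does not give the model's states or transition probabilities, so the following is only a generic sketch of a discrete-time Markov cohort model, with made-up states, annual transition probabilities, and utilities, showing how quality-adjusted life expectancy falls out of such a model.

```python
# Generic three-state Markov cohort model sketch (not the authors' model):
# states and annual transition probabilities below are made up for illustration.
import numpy as np

states = ["disease-free", "recurrence", "dead"]
P = np.array([  # row = from-state, column = to-state, per 1-year cycle
    [0.85, 0.10, 0.05],
    [0.00, 0.80, 0.20],
    [0.00, 0.00, 1.00],
])
utilities = np.array([0.85, 0.60, 0.0])  # quality-of-life weight per state

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts disease-free
qale = 0.0
for year in range(5):                # 5-year horizon, as in the study
    qale += cohort @ utilities       # QALYs accrued this cycle
    cohort = cohort @ P              # advance the cohort one cycle
print(f"Quality-adjusted life expectancy over 5 years: {qale:.2f} QALYs")
```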
Fisher, Jolene H; Al-Hejaili, Faris; Kandel, Sonja; Hirji, Alim; Shapera, Shane; Mura, Marco
2017-04-01
The heterogeneous progression of idiopathic pulmonary fibrosis (IPF) makes prognostication difficult and contributes to high mortality on the waitlist for lung transplantation (LTx). Multi-dimensional scores (Composite Physiologic Index [CPI], Gender-Age-Physiology [GAP], RIsk Stratification scorE [RISE]) have demonstrated enhanced predictive power towards outcome in IPF. The lung allocation score (LAS) is a multi-dimensional tool commonly used to stratify patients assessed for LTx. We sought to investigate whether IPF-specific multi-dimensional scores predict mortality in patients with IPF assessed for LTx. The study included 302 patients with IPF who underwent LTx assessment (2003-2014). Multi-dimensional scores were calculated. The primary outcome was 12-month mortality after assessment. LTx was considered as a competing event in all analyses. At the end of the observation period, there were 134 transplants and 63 deaths, and 105 patients were alive without LTx. Multi-dimensional scores predicted mortality with accuracy similar to the LAS and superior to that of individual variables: the area under the curve (AUC) for LAS was 0.78 (sensitivity 71%, specificity 86%); CPI 0.75 (sensitivity 67%, specificity 82%); GAP 0.67 (sensitivity 59%, specificity 74%); RISE 0.78 (sensitivity 71%, specificity 84%). A separate analysis conducted only in patients actively listed for LTx (n = 247; 50 deaths) yielded similar results. In patients with IPF assessed for LTx, as well as in those actually listed, multi-dimensional scores predict mortality better than individual variables and with accuracy similar to the LAS. If validated, multi-dimensional scores may serve as inexpensive tools to guide decisions on the timing of referral and listing for LTx. Copyright © 2017 Elsevier Ltd. All rights reserved.
Power load prediction based on GM (1,1)
NASA Astrophysics Data System (ADS)
Wu, Di
2017-05-01
Chinese power load prediction is currently a focus of intense attention. This paper studies grey prediction in depth and applies it to Chinese electricity consumption over the past 14 years; posterior-error testing shows that the grey prediction model adapts well to medium- and long-term power load forecasting.
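A minimal sketch of the GM(1,1) grey prediction model follows; the construction (accumulated generating operation, least-squares fit of the grey differential equation, then differencing back) is the standard one, while the consumption series and horizon are made up.

```python
# Minimal GM(1,1) grey prediction sketch; the demand series is made up.
import numpy as np

def gm11_forecast(x0: np.ndarray, horizon: int) -> np.ndarray:
    """Fit GM(1,1) to a positive series x0 and forecast `horizon` steps ahead."""
    n = len(x0)
    x1 = np.cumsum(x0)                           # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                # background (mean) sequence
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]  # developing coeff. a, grey input b
    k = np.arange(n + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0] - x0[0])  # restore original scale
    return x0_hat[n:]                            # out-of-sample forecasts only

# Hypothetical annual electricity consumption (arbitrary units).
consumption = np.array([112.0, 120.5, 130.2, 141.0, 153.8, 166.9, 181.4])
print(gm11_forecast(consumption, horizon=3))
```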
Effect of accuracy of wind power prediction on power system operator
NASA Technical Reports Server (NTRS)
Schlueter, R. A.; Sigari, G.; Costi, T.
1985-01-01
This research project proposed a modified unit commitment that schedules the connection and disconnection of generating units in response to load. A modified generation control is also proposed that controls steam units under automatic generation control; fast-responding diesels, gas turbines, and hydro units under a feedforward control; and wind turbine array output under a closed-loop array control. This modified generation control and unit commitment require prediction of the trend wind power variation one hour ahead and prediction of the error in this trend wind power prediction one half hour ahead. An improved meter for predicting trend wind speed variation was developed. Methods for accurately simulating the wind array power from a limited number of wind speed prediction records were developed. Finally, two methods for predicting the error in the trend wind power prediction were developed. This research provides a foundation for testing and evaluating the modified unit commitment and generation control that were developed to maintain operating reliability at a greatly reduced overall production cost for utilities with wind generation capacity.
Noguchi, Shingo; Yatera, Kazuhiro; Kawanami, Toshinori; Fujino, Yoshihisa; Moro, Hiroshi; Aoki, Nobumasa; Komiya, Kosaku; Kadota, Jun-Ichi; Shime, Nobuaki; Tsukada, Hiroki; Kohno, Shigeru; Mukae, Hiroshi
2017-01-01
In contrast to community-acquired pneumonia (CAP), healthcare-associated pneumonia (HCAP) has no specific severity assessment tools developed for clinical practice. In this review, we assessed the clinical significance of severity assessment tools for HCAP. We identified related articles from the PubMed database. The eligibility criteria were original research articles evaluating severity scoring tools and reporting mortality outcomes in patients with HCAP. Eight articles were included in the meta-analysis. The PORT score and CURB-65 were evaluated in 7 and 8 studies, respectively. Using cutoff values of ≥IV and V for the PORT score, the diagnostic odds ratios (DORs) were 5.28 (2.49-11.17) and 3.76 (2.88-4.92), respectively, and the areas under the curve (AUCs) were 0.68 (0.64-0.72) and 0.71 (0.67-0.75), respectively. When applied only to nonimmunocompromised patients, the AUCs for ≥IV and V were 0.71 (0.67-0.76) and 0.74 (0.70-0.78), respectively. In contrast, using cutoff values of ≥2 and ≥3 for CURB-65, the DORs were 3.35 (2.26-4.97) and 2.65 (2.05-3.43), respectively, and the AUCs were 0.65 (0.61-0.69) and 0.66 (0.62-0.70), respectively. When applied only to nonimmunocompromised patients, the AUCs for ≥2 and ≥3 were 0.65 (0.61-0.69) and 0.68 (0.64-0.72), respectively. The PORT score and CURB-65 do not have substantial predictive power compared with their performance in CAP patients, although the PORT score is more useful than CURB-65 for predicting mortality in HCAP patients. According to our results, however, these tools, especially the PORT score, can be more useful when limited to nonimmunocompromised patients. © 2017 S. Karger AG, Basel.
Strategies for Near Real Time Estimation of Precipitable Water Vapor
NASA Technical Reports Server (NTRS)
Bar-Sever, Yoaz E.
1996-01-01
Traditionally used for high precision geodesy, the GPS system has recently emerged as an equally powerful tool in atmospheric studies, in particular, climatology and meteorology. There are several products of GPS-based systems that are of interest to climatologists and meteorologists. One of the most useful is the GPS-based estimate of the amount of Precipitable Water Vapor (PWV) in the troposphere. Water vapor is an important variable in the study of climate changes and atmospheric convection (Yuan et al., 1993), and is of crucial importance for severe weather forecasting and operational numerical weather prediction (Kuo et al., 1993).
Power generation using sugar cane bagasse: A heat recovery analysis
NASA Astrophysics Data System (ADS)
Seguro, Jean Vittorio
The sugar industry is facing the need to improve its performance by increasing efficiency and developing profitable by-products. An important possibility is the production of electrical power for sale. Co-generation has been practiced in the sugar industry for a long time in a very inefficient way, with the main purpose of getting rid of the bagasse. The goal of this research was to develop a software tool that could be used to improve the way that bagasse is used to generate power. Special focus was given to the heat recovery components of the co-generation plant (economizer, air pre-heater, and bagasse dryer) to determine whether one, or a combination, of them led to a more efficient co-generation cycle. An extensive review of the state of the art of power generation in the sugar industry was conducted and is summarized in this dissertation. Based on this review, models were developed. After testing the models and comparing the results with the data collected from the literature, a software application that integrated all these models was developed to simulate the complete co-generation plant. Seven different cycles, three different pressures, and sixty-eight distributions of the flue gas through the heat recovery components can be simulated. The software includes an economic analysis tool that can help the designer determine the economic feasibility of different options. Results from running the simulation are presented that demonstrate its effectiveness in evaluating and comparing the different heat recovery components and power generation cycles. These results indicate that the economizer is the most beneficial option for heat recovery and that the use of waste heat in a bagasse dryer is the least desirable option. Quantitative comparisons of several possible cycle options with the widely used traditional back-pressure turbine cycle are given. These indicate that a double-extraction condensing cycle is best for co-generation purposes. Power generation gains between 40 and 100% are predicted for some cycles with the addition of optimum heat recovery systems.
NASA Astrophysics Data System (ADS)
Matetic, Rudy J.
Over-exposure to noise remains a widespread and serious health hazard in the U.S. mining industries despite 25 years of regulation. Every day, 80% of the nation's miners go to work in an environment where the time-weighted average (TWA) noise level exceeds 85 dBA, and more than 25% of miners are exposed to a TWA noise level that exceeds 90 dBA, the permissible exposure limit (PEL). Additionally, MSHA coal noise sample data collected from 2000 to 2002 show that 65% of the equipment whose operators exceeded 100% noise dosage comprises only seven types of machines: auger miners, bulldozers, continuous miners, front-end loaders, roof bolters, shuttle cars (electric), and trucks. The MSHA data also indicate that the roof bolter ranks third among all equipment, and second among underground coal equipment, whose operators exceed 100% dosage. A research program was implemented to: (1) determine, characterize, and measure the sound power levels radiated by a roof bolting machine under differing drilling configurations (thrust, rotational speed, penetration rate, etc.) and differing drilling methods in high-compressive-strength rock media (>20,000 psi); this laboratory characterization provided the mining industry with empirical data on using differing noise control technologies (drilling configurations and drilling methods) to reduce the sound power level emissions of a roof bolting machine; (2) consolidate and correlate the empirical data into one statistically valid equation, giving the mining industry a tool to predict the overall sound power level of a roof bolting machine for any drilling configuration and drilling method used in industry; (3) provide the mining industry with several approaches to predict or determine sound pressure levels in an underground coal mine using laboratory test results from a roof bolting machine; and (4) describe a method for determining an operator's noise dosage on a roof bolting machine using predicted or determined sound pressure levels.
Modeling of power electronic systems with EMTP
NASA Technical Reports Server (NTRS)
Tam, Kwa-Sur; Dravid, Narayan V.
1989-01-01
In view of the potential impact of power electronics on power systems, there is a need for a computer modeling/analysis tool to perform simulation studies on power systems with power electronic components, as well as to educate engineering students about such systems. The modeling of the major power electronic components of the NASA Space Station Freedom Electric Power System is described along with the ElectroMagnetic Transients Program (EMTP), and it is demonstrated that EMTP can serve as a very useful tool for teaching, design, analysis, and research in the area of power systems with power electronic components. EMTP modeling of power electronic circuits is described and simulation results are presented.
Predicting High-Power Performance in Professional Cyclists.
Sanders, Dajo; Heijboer, Mathieu; Akubat, Ibrahim; Meijer, Kenneth; Hesselink, Matthijs K
2017-03-01
To assess whether short-duration (5 to ~300 s) high-power performance can be accurately predicted using the anaerobic power reserve (APR) model in professional cyclists. Data from 4 professional cyclists from a World Tour cycling team were used. Using the maximal aerobic power, sprint peak power output, and an exponential constant describing the decrement in power over time, a power-duration relationship was established for each participant. To test the predictive accuracy of the model, several all-out field trials of different durations were performed by each cyclist. The power output achieved during the all-out trials was compared with the power output predicted by the APR model. The power output predicted by the model showed very large to nearly perfect correlations with the actual power output obtained during the all-out trials for each cyclist (r = .88 ± .21, .92 ± .17, .95 ± .13, and .97 ± .09). Power output during the all-out trials remained within an average of 6.6% (53 W) of the power output predicted by the model. This preliminary pilot study presents 4 case studies on the applicability of the APR model in professional cyclists using a field-based approach. The decrement in all-out performance during high-intensity exercise seems to conform to a general relationship, with a single exponential-decay model describing the decrement in power with increasing duration. These results are in line with previous studies using the APR model to predict performance during brief all-out trials. Future research should evaluate the APR model with a larger sample size of elite cyclists.
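The power-duration relationship the abstract describes, maximal aerobic power plus an exponentially decaying anaerobic component, is commonly written P(t) = MAP + (Pmax - MAP)·exp(-k·t); a minimal sketch with hypothetical parameter values:

```python
# Minimal sketch of the anaerobic power reserve (APR) power-duration model,
# commonly written P(t) = MAP + (Pmax - MAP) * exp(-k * t).
# Parameter values below are hypothetical, not those of the studied cyclists.
import math

MAP = 450.0     # maximal aerobic power, W
P_MAX = 1400.0  # sprint peak power output, W
K = 0.025       # exponential decay constant, 1/s

def predicted_power(duration_s: float) -> float:
    """Predicted mean power (W) for an all-out effort of a given duration."""
    return MAP + (P_MAX - MAP) * math.exp(-K * duration_s)

for t in (5, 30, 120, 300):
    print(f"{t:>4} s all-out effort: {predicted_power(t):7.1f} W")
```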
Automated benchmarking of peptide-MHC class I binding predictions.
Trolle, Thomas; Metushi, Imir G; Greenbaum, Jason A; Kim, Yohan; Sidney, John; Lund, Ole; Sette, Alessandro; Peters, Bjoern; Nielsen, Morten
2015-07-01
Numerous in silico methods predicting peptide binding to major histocompatibility complex (MHC) class I molecules have been developed over the last decades. However, the multitude of available prediction tools makes it non-trivial for the end-user to select which tool to use for a given task. To provide a solid basis on which to compare different prediction tools, we here describe a framework for the automated benchmarking of peptide-MHC class I binding prediction tools. The framework runs weekly benchmarks on data that are newly entered into the Immune Epitope Database (IEDB), giving the public access to frequent, up-to-date performance evaluations of all participating tools. To overcome potential selection bias in the data included in the IEDB, a strategy was implemented that suggests a set of peptides for which different prediction methods give divergent predictions as to their binding capability. Upon experimental binding validation, these peptides entered the benchmark study. The benchmark has run for 15 weeks and includes evaluation of 44 datasets covering 17 MHC alleles and more than 4000 peptide-MHC binding measurements. Inspection of the results allows the end-user to make educated selections between participating tools. Of the four participating servers, NetMHCpan performed the best, followed by ANN, SMM and finally ARB. Up-to-date performance evaluations of each server can be found online at http://tools.iedb.org/auto_bench/mhci/weekly. All prediction tool developers are invited to participate in the benchmark. Sign-up instructions are available at http://tools.iedb.org/auto_bench/mhci/join. mniel@cbs.dtu.dk or bpeters@liai.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Catanzaro, Daniele; Schäffer, Alejandro A.; Schwartz, Russell
2016-01-01
Ductal Carcinoma In Situ (DCIS) is a precursor lesion of Invasive Ductal Carcinoma (IDC) of the breast. Investigating its temporal progression could provide fundamental new insights for the development of better diagnostic tools to predict which cases of DCIS will progress to IDC. We investigate the problem of reconstructing a plausible progression from single-cell sampled data of an individual with Synchronous DCIS and IDC. Specifically, by using a number of assumptions derived from the observation of cellular atypia occurring in IDC, we design a possible predictive model using integer linear programming (ILP). Computational experiments carried out on a preexisting data set of 13 patients with simultaneous DCIS and IDC show that the corresponding predicted progression models are classifiable into categories having specific evolutionary characteristics. The approach provides new insights into mechanisms of clonal progression in breast cancers and helps illustrate the power of the ILP approach for similar problems in reconstructing tumor evolution scenarios under complex sets of constraints. PMID:26353381
Inhomogeneous distribution of water droplets in cloud turbulence
NASA Astrophysics Data System (ADS)
Fouxon, Itzhak; Park, Yongnam; Harduf, Roei; Lee, Changhoon
2015-09-01
We consider the sedimentation of small particles in a turbulent flow where fluid accelerations are much smaller than the acceleration of gravity g. The particles are dragged by the flow through a linear friction force. We demonstrate that the pair-correlation function of the particles' concentration diverges with decreasing separation as a power law with a negative exponent. This manifests a fractal distribution of particles in space. We find that the exponent is proportional to the ratio of the integral of the turbulence energy spectrum, weighted by the wave number, to g. The proportionality coefficient is a universal number independent of particle size. We derive the spectrum of Lyapunov exponents that describes the evolution of small patches of particles. It is demonstrated that particles separate dominantly in the horizontal plane. This provides a theory for the recently observed vertical columns formed by the particles. We confirm the predictions by direct numerical simulations of Navier-Stokes turbulence. The predictions include conditions that hold for water droplets in warm clouds, thus providing a tool for the prediction of rain formation.
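In equation form, the stated scaling reads roughly as below, where n is the particle concentration, E(k) the turbulence energy spectrum, and c the universal, particle-size-independent constant the abstract refers to; this is a transcription of the abstract's claim, not a derivation.

```latex
% Pair-correlation power law and exponent as described in the abstract;
% c denotes the universal, particle-size-independent constant.
\langle n(\mathbf{x})\, n(\mathbf{x}+\mathbf{r}) \rangle \propto r^{-\alpha},
\qquad
\alpha \;=\; \frac{c}{g} \int_0^{\infty} k\, E(k)\, \mathrm{d}k .
```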
Evaluating Functional Annotations of Enzymes Using the Gene Ontology.
Holliday, Gemma L; Davidson, Rebecca; Akiva, Eyal; Babbitt, Patricia C
2017-01-01
The Gene Ontology (GO) (Ashburner et al., Nat Genet 25(1):25-29, 2000) is a powerful tool in the informatics arsenal of methods for evaluating annotations in a protein dataset. It supports tasks ranging from identifying the nearest well-annotated homologue of a protein of interest, to predicting where misannotation has occurred, to knowing how confident you can be in the annotations assigned to those proteins. In this chapter we explore what makes an enzyme unique and how we can use GO to infer aspects of protein function based on sequence similarity. These can range from identification of misannotation or other errors in a predicted function to accurate function prediction for an enzyme of entirely unknown function. Although GO annotation applies to any gene products, we focus here on describing our approach for hierarchical classification of enzymes in the Structure-Function Linkage Database (SFLD) (Akiva et al., Nucleic Acids Res 42(Database issue):D521-530, 2014) as a guide for informed utilisation of annotation transfer based on GO terms.
De Novo Chromosome Structure Prediction
NASA Astrophysics Data System (ADS)
di Pierro, Michele; Cheng, Ryan R.; Lieberman-Aiden, Erez; Wolynes, Peter G.; Onuchic, José N.
Chromatin consists of DNA and hundreds of proteins that interact with the genetic material. In vivo, chromatin folds into nonrandom structures. The physical mechanism leading to these characteristic conformations, however, remains poorly understood. We recently introduced MiChroM, a model that generates chromosome conformations by using the idea that chromatin can be subdivided into types based on its biochemical interactions. Here we extend and complete our previous findings by showing that structural chromatin types can be inferred from ChIP-Seq data. Chromatin types, which are distinct from DNA sequence, are partially epigenetically controlled and change during cell differentiation, thus constituting a link between epigenetics, chromosomal organization, and cell development. We show that, for GM12878 lymphoblastoid cells, we are able to predict accurate chromosome structures with genomic data as the only input. The degree of accuracy achieved by our prediction supports the viability of the proposed physical mechanism of chromatin folding and makes the computational model a powerful tool for future investigations.
Gaussian processes with optimal kernel construction for neuro-degenerative clinical onset prediction
NASA Astrophysics Data System (ADS)
Canas, Liane S.; Yvernault, Benjamin; Cash, David M.; Molteni, Erika; Veale, Tom; Benzinger, Tammie; Ourselin, Sébastien; Mead, Simon; Modat, Marc
2018-02-01
Gaussian Processes (GP) are a powerful tool to capture the complex time-variations of a dataset. In the context of medical imaging analysis, they allow robust modelling even in the case of highly uncertain or incomplete datasets. Predictions from a GP depend on the covariance kernel function selected to explain the data variance. To overcome this limitation, we propose a framework to identify the optimal covariance kernel function to model the data. The optimal kernel is defined as a composition of base kernel functions used to identify correlation patterns between data points. Our approach includes a modified version of the Compositional Kernel Learning (CKL) algorithm, in which we score the kernel families using a new energy function that depends on both the Bayesian Information Criterion (BIC) and the explained variance score. We applied the proposed framework to model the progression of neurodegenerative diseases over time, in particular the progression of autosomal dominantly inherited Alzheimer's disease, and use it to predict the time to clinical onset of subjects carrying the genetic mutation.
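A minimal sketch of compositional kernel selection for a GP, scored here with a plain BIC-style criterion (the paper's energy function also involves an explained-variance score); the candidate kernels and one-dimensional synthetic data are illustrative assumptions.

```python
# Sketch of compositional kernel choice for a Gaussian process, scored with a
# BIC-style criterion; data and candidate kernels here are illustrative only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, RationalQuadratic, WhiteKernel

rng = np.random.default_rng(3)
X = np.linspace(0, 10, 40).reshape(-1, 1)          # e.g., years before onset
y = np.sin(X).ravel() + 0.1 * rng.normal(size=40)  # hypothetical biomarker

candidates = {
    "RBF + noise": RBF() + WhiteKernel(),
    "RatQuad + noise": RationalQuadratic() + WhiteKernel(),
    "RBF * RatQuad + noise": RBF() * RationalQuadratic() + WhiteKernel(),
}
for name, kernel in candidates.items():
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    lml = gp.log_marginal_likelihood_value_
    n_params = kernel.theta.size
    bic = n_params * np.log(len(X)) - 2 * lml      # lower is better
    print(f"{name:>22}: BIC = {bic:7.1f}")
```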
Reducing Brain Signal Noise in the Prediction of Economic Choices: A Case Study in Neuroeconomics
Sundararajan, Raanju R.; Palma, Marco A.; Pourahmadi, Mohsen
2017-01-01
In order to reduce the noise of brain signals, neuroeconomic experiments typically aggregate data from hundreds of trials collected from a few individuals. This contrasts with the principle of simple and controlled designs in experimental and behavioral economics. We use a frequency-domain variant of the stationary subspace analysis (SSA) technique, denoted DSSA, to filter out the noise (nonstationary sources) in EEG brain signals. The nonstationary sources in the brain signal are associated with variations in the mental state that are unrelated to the experimental task. DSSA is a powerful tool for reducing the number of trials needed from each participant in neuroeconomic experiments and also for improving the prediction performance of an economic choice task. For a single trial, when DSSA is used as a noise reduction technique, the prediction model in a food snack choice experiment shows increases in overall accuracy of around 10%, in sensitivity and specificity of around 20%, and in AUC of around 30%. PMID:29311784