Sample records for baseline method development

  1. A novel approach for baseline correction in 1H-MRS signals based on ensemble empirical mode decomposition.

    PubMed

    Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh

    2014-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted by a wideband baseline signal generated by macromolecules. Several methods have recently been developed to correct such baseline signals; however, most of them are unable to estimate the baseline in complex, overlapped signals. In this study, a novel automatic baseline correction method is proposed for (1)H-MRS spectra based on ensemble empirical mode decomposition (EEMD). The method was applied to both simulated data and in-vivo (1)H-MRS signals of the human brain. Results demonstrate the efficiency of the proposed method in removing the baseline from (1)H-MRS signals.
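
    A minimal sketch of the EEMD-based idea, not the authors' implementation: decompose a synthetic spectrum into intrinsic mode functions and treat the slowest modes as the baseline. It assumes the PyEMD package (pip install EMD-signal) and a hand-picked number of low-frequency modes; the paper selects baseline-related modes automatically.

    ```python
    import numpy as np
    from PyEMD import EEMD  # assumption: PyEMD's EEMD class

    x = np.linspace(0, 1, 1024)
    peaks = np.exp(-((x - 0.3) ** 2) / 2e-4) + 0.6 * np.exp(-((x - 0.5) ** 2) / 1e-4)
    drift = 0.5 * np.sin(2 * np.pi * 0.7 * x) + 0.3 * x   # slow, wideband baseline
    signal = peaks + drift + 0.01 * np.random.randn(x.size)

    eemd = EEMD(trials=50)          # ensemble of noise-assisted EMD runs
    imfs = eemd.eemd(signal, x)     # rows: IMFs, fastest oscillations first

    # Heuristic: the slowest modes approximate the baseline (tunable guess).
    n_slow = 2
    est_baseline = imfs[-n_slow:].sum(axis=0)
    corrected = signal - est_baseline
    ```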

  2. A New Statistics-Based Online Baseline Restorer for a High Count-Rate Fully Digital System.

    PubMed

    Li, Hongdi; Wang, Chao; Baghaei, Hossain; Zhang, Yuxuan; Ramirez, Rocio; Liu, Shitao; An, Shaohui; Wong, Wai-Hoi

    2010-04-01

    The goal of this work is to develop a novel, accurate, real-time digital baseline restorer using online statistical processing for a high count-rate digital system such as positron emission tomography (PET). In high count-rate nuclear instrumentation applications, analog signals are DC-coupled for better performance. However, the detectors, pre-amplifiers, and other front-end electronics cause a signal baseline drift in a DC-coupled system, which degrades energy resolution and positioning accuracy. Event pileup is common in a high count-rate system, and baseline drift introduces errors into the event pileup correction. Hence, a baseline restorer (BLR) is required in a high count-rate system to remove the DC drift ahead of the pileup correction. Many BLR methods have been reported, from classic analog methods to digital filter solutions. However, a single-channel analog BLR can only operate at count rates below 500 kcps, and an analog front-end application-specific integrated circuit (ASIC) is normally required for applications involving hundreds of BLRs, such as a PET camera. We have developed a simple statistics-based online baseline restorer (SOBLR) for a high count-rate fully digital system. In this method, we acquire additional samples, excluding the real gamma pulses, from the existing free-running ADC in the digital system, and perform online statistical processing to generate a baseline value. This baseline value is subtracted from the digitized waveform to retrieve the original pulse with zero baseline drift. The method self-tracks the baseline without involving a micro-controller. The circuit consists of two digital counter/timers, one comparator, one register, and one subtraction unit. Simulation shows a single channel operating at a 30 Mcps count rate under pileup conditions. 336 baseline restorer circuits have been implemented in 12 field-programmable gate arrays (FPGAs) for our new fully digital PET system.
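
    A rough software analogue of the SOBLR idea as described above (the real design is an FPGA circuit): a comparator rejects samples belonging to gamma pulses, the remaining samples update a running baseline via a counter/register pair, and the baseline is subtracted from the waveform. Thresholds and window length are illustrative guesses.

    ```python
    import numpy as np

    def soblr(samples, pulse_threshold=50.0, window=256):
        baseline = 0.0
        acc, count = 0.0, 0                 # counter/register pair in hardware
        restored = np.empty_like(samples, dtype=float)
        for i, s in enumerate(samples):
            if abs(s - baseline) < pulse_threshold:   # comparator: not a pulse
                acc += s
                count += 1
                if count == window:                   # timer expires: update
                    baseline = acc / window
                    acc, count = 0.0, 0
            restored[i] = s - baseline                # subtraction unit
        return restored
    ```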

  3. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at the NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, evaluation of sensor vault design, and assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
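
    A sketch of the spectral-baseline idea in outline only (it mimics, not reproduces, the PQLX/PSD-PDF approach): compute power spectral densities over many time windows, then take per-frequency percentiles as the station's noise baseline. Sample rate, window lengths, and percentiles are illustrative.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 40.0                                    # sample rate (Hz), assumed
    data = np.random.randn(int(fs * 3600 * 24))  # stand-in for a day of ground motion

    seg = int(fs * 3600)                         # one-hour windows
    psds = []
    for start in range(0, data.size - seg, seg):
        f, pxx = welch(data[start:start + seg], fs=fs, nperseg=4096)
        psds.append(10 * np.log10(pxx))          # dB, as in PSD PDFs
    psds = np.array(psds)

    low, median, high = np.percentile(psds, [10, 50, 90], axis=0)
    # (low, high) per frequency bin bracket the station's nominal noise;
    # a new PSD straying outside this envelope flags out-of-nominal noise.
    ```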

  4. Development of the Phase-up Technology of the Radio Telescopes: 6.7 GHz Methanol Maser Observations with Phased Hitachi 32 m and Takahagi 32 m Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Takefuji, K.; Sugiyama, K.; Yonekura, Y.; Saito, T.; Fujisawa, K.; Kondo, T.

    2017-11-01

    For the sake of high-sensitivity 6.7 GHz methanol maser observations, we developed a new technology for coherently combining the two signals from the Hitachi 32 m radio telescope and the Takahagi 32 m radio telescope of the Japanese Very Long Baseline Interferometer Network (JVN), where the two telescopes are separated by about 260 m. After the two telescopes were phased as a single telescope of twice the size, the mean signal-to-noise ratio (S/N) of the 6.7 GHz methanol masers observed by the phased telescopes was 1.254-fold higher than that of a single dish, as verified through a very long baseline interferometry (VLBI) experiment on the 50 km baseline to the Kashima 34 m telescope and the 1000 km baseline to the Yamaguchi 32 m telescope. Furthermore, we compared the S/Ns of the 6.7 GHz maser spectra for two methods: a VLBI method, and the newly developed digital position switching, a technology similar to that used in noise-canceling headphones. Finally, we confirmed that the mean S/N of the digital position-switching method (ON-OFF) was 1.597-fold higher than that of the VLBI method.
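
    A toy illustration of why phasing two dishes raises sensitivity: coherent summation adds signal amplitudes while independent noise adds only in power, giving an ideal S/N gain of sqrt(2) ≈ 1.414 for two equal telescopes (the record reports 1.254, somewhat below ideal). All numbers are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    tone = np.cos(2 * np.pi * 0.01 * np.arange(n))
    maser = 0.05 * tone                       # common weak signal
    s1 = maser + rng.normal(size=n)           # telescope 1
    s2 = maser + rng.normal(size=n)           # telescope 2, phase-aligned

    def snr(x):
        # matched-filter S/N against the known tone
        return abs(np.mean(x * tone)) / np.std(x)

    print(snr((s1 + s2) / 2) / snr(s1))       # -> close to sqrt(2)
    ```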

  5. Pretreatment data is highly predictive of liver chemistry signals in clinical trials

    PubMed Central

    Cai, Zhaohui; Bresell, Anders; Steinberg, Mark H; Silberg, Debra G; Furlong, Stephen T

    2012-01-01

    Purpose: The goal of this retrospective analysis was to assess how well predictive models could determine which patients would develop liver chemistry signals during clinical trials based on their pretreatment (baseline) information. Patients and methods: Based on data from 24 late-stage clinical trials, classification models were developed to predict liver chemistry outcomes using baseline information, which included demographics, medical history, concomitant medications, and baseline laboratory results. Results: Predictive models using baseline data predicted which patients would develop liver signals during the trials with average validation accuracy around 80%. Baseline levels of individual liver chemistry tests were most important for predicting their own elevations during the trials. High bilirubin levels at baseline were not uncommon and were associated with a high risk of developing biochemical Hy’s law cases. Baseline γ-glutamyltransferase (GGT) level appeared to have some predictive value, but did not increase predictability beyond using established liver chemistry tests. Conclusion: It is possible to predict which patients are at a higher risk of developing liver chemistry signals using pretreatment (baseline) data. Derived knowledge from such predictions may allow proactive and targeted risk management, and the type of analysis described here could help determine whether new biomarkers offer improved performance over established ones. PMID:23226004
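
    A hedged sketch of the kind of baseline-only prediction the study describes: a classifier trained on pretreatment features to flag patients at risk of on-trial liver chemistry signals. Feature names, the simulated outcome, and the model choice are all invented for illustration.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n = 2000
    X = np.column_stack([
        rng.normal(25, 8, n),      # baseline ALT (U/L), assumed feature
        rng.normal(0.6, 0.2, n),   # baseline bilirubin (mg/dL)
        rng.normal(30, 12, n),     # baseline GGT (U/L)
        rng.integers(18, 80, n),   # age
    ])
    # Toy outcome: higher baseline ALT raises the odds of an on-trial signal.
    p = 1 / (1 + np.exp(-(X[:, 0] - 25) / 8))
    y = rng.random(n) < 0.2 * p

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())   # validation accuracy
    ```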

  6. A new estimator for VLBI baseline length repeatability

    NASA Astrophysics Data System (ADS)

    Titov, O.

    2009-11-01

    The goal of this paper is to introduce a more effective technique for approximating the “repeatability-baseline length” relationship that is used to evaluate the quality of geodetic VLBI results. Traditionally, this relationship is approximated by a quadratic function of baseline length over all baselines. The new model incorporates the mean number of observed group delays of the reference radio sources (i.e. those estimated as global parameters) used in the estimation of each baseline. It is shown that the new method provides a better approximation of the “repeatability-baseline length” relationship than the traditional model. Further development of the new approach comes down to modeling the repeatability as a function of two parameters: baseline length and baseline slewing rate. Within the framework of this new approach the station vertical and horizontal uncertainties can be treated as functions of baseline length. While the previous relationship indicated that station vertical uncertainties are generally 4-5 times larger than horizontal uncertainties, the vertical uncertainties determined by the new method are only larger by a factor of 1.44 over all baseline lengths.
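
    An illustration of the traditional quadratic "repeatability vs. baseline length" fit next to a variant that also uses the mean number of observations per baseline, in the spirit of the paper. The functional form of the second model and all data values here are guesses, not the author's formula or results.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    L = np.array([500., 1000, 3000, 6000, 9000, 12000])   # baseline length (km), invented
    N = np.array([8000., 5000, 2000, 900, 400, 150])      # mean no. of group delays, invented
    wrms = np.array([2.1, 2.9, 5.5, 9.8, 15.0, 22.0])     # repeatability (mm), invented

    quad = lambda L, a, b, c: a + b * L + c * L**2        # traditional model
    p_quad, _ = curve_fit(quad, L, wrms)

    # hypothetical variant: observation count damps the length term
    with_n = lambda X, a, b, c: a + b * X[0] + c * X[0] / np.sqrt(X[1])
    p_n, _ = curve_fit(with_n, (L, N), wrms)

    print(np.sum((wrms - quad(L, *p_quad)) ** 2),
          np.sum((wrms - with_n((L, N), *p_n)) ** 2))     # residual comparison
    ```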

  7. Development of a S/w System for Relative Positioning Using GPS Carrier Phase

    NASA Astrophysics Data System (ADS)

    Ahn, Yong-Won; Kim, Chun-Hwey; Park, Pil-Ho; Park, Jong-Uk; Jo, Jeong-Ho

    1997-12-01

    We developed a GPS phase data processing S/W system which calculates baseline vectors and distances between two points located on the surface of the Earth. For this development, a double-difference method and L1 carrier phase data from GPS (Global Positioning System) were used. This S/W system consists of four main parts: satellite position calculation, single-difference equation, double-difference equation, and correlation. To verify our S/W, we fixed KAO (N36.37, E127.37, H77.61 m), one of the International GPS Service for Geodynamics stations, located at Tae-Jon, and we measured baseline vectors and relative distances with data from observations at approximate baseline distances of 2.7, 42.1, 81.1, and 146.6 km. We then compared the vectors and distances with those obtained from the GPSurvey S/W system using the L1/L2 ionosphere-free method and broadcast ephemeris. From this comparison, we found that the baseline vectors X, Y, Z and the baseline distances matched to within 50 cm and 10 cm, respectively.
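
    A minimal sketch of forming an L1 double difference, the core observable used by the S/W described above. Receiver names and carrier-phase values (in cycles) are invented for one epoch and one satellite pair.

    ```python
    # phase[receiver][satellite] at a single epoch (invented values)
    phase = {
        "KAO":   {"G01": 1234567.125, "G07": 2345678.250},
        "ROVER": {"G01": 1234010.400, "G07": 2345121.800},
    }

    def single_difference(sv, a="KAO", b="ROVER"):
        # between-receiver difference: cancels the satellite clock error
        return phase[a][sv] - phase[b][sv]

    def double_difference(sv1="G01", sv2="G07"):
        # between-satellite difference of single differences:
        # cancels the receiver clock errors as well
        return single_difference(sv1) - single_difference(sv2)

    print(double_difference())   # feeds the least-squares baseline solution
    ```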

  8. Baseline estimation from simultaneous satellite laser tracking

    NASA Technical Reports Server (NTRS)

    Dedes, George C.

    1987-01-01

    Simultaneous Range Differences (SRDs) to Lageos are obtained by dividing the observing stations into pairs with quasi-simultaneous observations. For each of those pairs the station with the least number of observations is identified, and at its observing epochs interpolated ranges for the alternate station are generated. The SRD observables are obtained by subtracting the actually observed laser range of the station having the least number of observations from the interpolated ranges of the alternate station. On the basis of these observables, semidynamic single-baseline solutions were performed. The aim of these solutions is to further develop and implement the SRD method in a real-data environment and to assess its accuracy and its advantages and disadvantages relative to the range dynamic mode methods when the baselines are the only parameters of interest. Baselines were also estimated through the purely geometric method using simultaneous laser range observations to Lageos. These baselines formed the standards of comparison in the accuracy assessment of the SRD method against the range dynamic mode methods. On the basis of this comparison it was concluded that for baselines of regional extent the SRD method is very effective, efficient, and at least as accurate as the range dynamic mode methods, and this on the basis of simple orbital modeling and a limited orbit adjustment. The SRD method is insensitive to the inconsistencies affecting the terrestrial reference frame, and simultaneous adjustment of the Earth Rotation Parameters (ERPs) is not necessary.
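
    A sketch of constructing an SRD observable as the record describes: interpolate one station's Lageos ranges to the other station's epochs and subtract. Epochs and ranges are synthetic; real processing works on quasi-simultaneous laser passes.

    ```python
    import numpy as np

    t_a = np.array([0.0, 30.0, 60.0, 90.0])           # station A epochs (s), invented
    r_a = np.array([5.90e6, 5.85e6, 5.82e6, 5.81e6])  # station A ranges (m), invented

    t_b = np.array([10.0, 45.0, 80.0])                # station B (fewer observations)
    r_b = np.array([6.10e6, 6.02e6, 5.98e6])

    # interpolate A's ranges to B's observing epochs, then difference
    r_a_interp = np.interp(t_b, t_a, r_a)
    srd = r_a_interp - r_b                            # SRD observables
    print(srd)
    ```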

  9. Improved Modeling of Finite-Rate Turbulent Combustion Processes in Research Combustors

    NASA Technical Reports Server (NTRS)

    VanOverbeke, Thomas J.

    1998-01-01

    The objective of this thesis is to further develop and test a stochastic model of turbulent combustion in recirculating flows. There is a requirement to increase the accuracy of multi-dimensional combustion predictions. As turbulence affects reaction rates, this interaction must be more accurately evaluated. In this work a more physically correct way of handling the interaction of turbulence and combustion is further developed and tested. As turbulence involves randomness, stochastic modeling is used. Averaged values such as temperature and species concentration are found by integrating the probability density function (pdf) over the range of the scalar. The model in this work does not assume the pdf type, but solves for the evolution of the pdf using the Monte Carlo solution technique. The model is further developed by including a more robust reaction solver, by using accurate thermodynamics, and by using more accurate transport elements. The stochastic method is used with the Semi-Implicit Method for Pressure-Linked Equations (SIMPLE), which solves for velocity, pressure, turbulent kinetic energy, and dissipation; the pdf solver solves for temperature and species concentration. Thus, the method is partially familiar to combustor engineers. The method is compared to benchmark experimental data and baseline calculations. The baseline method was tested on isothermal flows, evaporating sprays, and combusting sprays. Pdf and baseline predictions were performed for three diffusion flames and one premixed flame. The pdf method predicted lower combustion rates than the baseline method, in agreement with the data, except for the premixed flame, where the baseline and stochastic predictions bounded the experimental data. The use of a continuous mixing model or a relax-to-mean mixing model had little effect on the prediction of average temperature. Two grids were used in a hydrogen diffusion flame simulation; grid density did not affect the predictions except for peak temperature and tangential velocity. The hybrid pdf method takes longer and requires more memory, but has a theoretical basis for extension to many reaction steps, which cannot be said of current turbulent combustion models.
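
    A toy Monte Carlo illustration of the pdf-transport idea: a notional scalar carried by an ensemble of particles relaxes toward its mean, in the spirit of the "relax-to-mean" mixing model mentioned above. All constants are illustrative, not from the thesis.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    phi = rng.uniform(0.0, 1.0, 10_000)   # particle ensemble: mixture fraction
    c_mix, tau, dt = 2.0, 1e-3, 1e-5      # mixing constant, turbulence time, step

    for _ in range(200):
        mean = phi.mean()
        phi += -0.5 * c_mix * (phi - mean) / tau * dt   # relax-to-mean step

    # Variance decays while the mean is preserved; averaged temperature and
    # species would be integrals over this evolving ensemble (the pdf).
    print(phi.mean(), phi.var())
    ```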

  10. Linear MALDI-ToF simultaneous spectrum deconvolution and baseline removal.

    PubMed

    Picaud, Vincent; Giovannelli, Jean-Francois; Truntzer, Caroline; Charrier, Jean-Philippe; Giremus, Audrey; Grangeat, Pierre; Mercier, Catherine

    2018-04-05

    Thanks to reasonable cost and a simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new peak extraction algorithm operating on the raw spectrum. With this method the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model consisting of a smooth baseline part plus a sparse peak list convolved with a known peak shape. The model is then fitted under a Gaussian noise model. The proposed method is well suited to processing low-resolution spectra with prominent baselines and unresolved peaks. We developed a new peak deconvolution procedure. The paper describes the method derivation and discusses some of its interpretations. The algorithm is then described in pseudo-code form, where the required optimization procedure is detailed. For synthetic data the method is compared to a more conventional approach. The new method reduces artifacts caused by the usual two-step procedure of baseline removal followed by peak extraction. Finally, some results on real linear MALDI-ToF spectra are provided. We introduced a new method for peak picking, in which peak deconvolution and baseline computation are performed jointly. On simulated data we showed that this global approach performs better than a classical one in which baseline and peaks are processed sequentially. A dedicated experiment was conducted on real spectra: a collection of spectra of spiked proteins was acquired and then analyzed. Better performance of the proposed method, in terms of accuracy and reproducibility, was observed and validated by an extended statistical analysis.
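
    A hedged sketch of the joint "smooth baseline plus sparse peaks" model class: build a dictionary of known peak shapes alongside low-order baseline terms and fit with an L1 penalty on the amplitudes. This is a generic reimplementation of the idea, not the authors' algorithm (they penalize baseline smoothness rather than applying L1 to everything).

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    n = 500
    x = np.arange(n)
    shape = lambda c: np.exp(-0.5 * ((x - c) / 4.0) ** 2)   # known peak shape

    # dictionary: candidate peaks every 2 channels + polynomial baseline terms
    peak_dict = np.column_stack([shape(c) for c in range(0, n, 2)])
    base_dict = np.column_stack([x ** k for k in range(4)]).astype(float)
    base_dict /= np.abs(base_dict).max(axis=0)               # scale columns

    y = 3 * shape(120) + 2 * shape(300) + 0.5 + 0.002 * x \
        + 0.05 * np.random.randn(n)                          # synthetic spectrum

    A = np.hstack([peak_dict, base_dict])
    model = Lasso(alpha=0.01, positive=True, max_iter=50_000).fit(A, y)

    amps = model.coef_[:peak_dict.shape[1]]                  # sparse peak list
    print(np.nonzero(amps > 0.1)[0] * 2)                     # ~ peak centers 120, 300
    ```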

  11. Predicting infant cortical surface development using a 4D varifold-based learning framework and local topography-based shape morphing.

    PubMed

    Rekik, Islem; Li, Gang; Lin, Weili; Shen, Dinggang

    2016-02-01

    Longitudinal neuroimaging analysis methods have remarkably advanced our understanding of early postnatal brain development. However, predictive models that trace the evolution trajectories of both normal and abnormal cortical shapes remain broadly absent. To fill this critical gap, we pioneered the first prediction model for longitudinal developing cortical surfaces in infants using a spatiotemporal current-based learning framework solely from the baseline cortical surface. In this paper, we detail this prediction model and further improve its performance by introducing two key variants. First, we use the varifold metric to overcome the limitations of the current metric for surface registration that was used in our preliminary study. We also extend the conventional varifold-based surface registration model for pairwise registration to a spatiotemporal surface regression model. Second, we propose a morphing process of the baseline surface using its topographic attributes such as normal direction and principal curvature sign. Specifically, our method learns from longitudinal data both the geometric (vertex positions) and dynamic (temporal evolution trajectories) features of the infant cortical surface, and comprises a training stage and a prediction stage. In the training stage, we use the proposed varifold-based shape regression model to estimate geodesic cortical shape evolution trajectories for each training subject. We then build an empirical mean spatiotemporal surface atlas. In the prediction stage, given an infant, we select the best learnt features from training subjects to simultaneously predict the cortical surface shapes at all later timepoints, based on similarity metrics between this baseline surface and the learnt baseline population average surface atlas. We used a leave-one-out cross-validation method to predict the inner cortical surface shape at 3, 6, 9 and 12 months of age from the baseline cortical surface shape at birth. Our method attained a higher prediction accuracy and better captured the spatiotemporal dynamic change of the highly folded cortical surface than the previously proposed prediction method. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Comparison of Three Methods Estimating Baseline Creatinine For Acute Kidney Injury in Hospitalized Patients: a Multicentre Survey in Third-Level Urban Hospitals of China.

    PubMed

    Lang, Xia-Bing; Yang, Yi; Yang, Ju-Rong; Wan, Jian-Xin; Yu, Sheng-Qiang; Cui, Jiong; Tang, Xiao-Jing; Chen, Jianghua

    2018-01-01

    A lack of baseline serum creatinine (SCr) data leads to underestimation of the burden caused by acute kidney injury (AKI) in developing countries. The goal of this study was to investigate the effects of various baseline SCr analysis methods on the current diagnosis of AKI in hospitalized patients. Patients with at least one SCr value during their hospital stay between January 1, 2011 and December 31, 2012 were retrospectively included in the study. The baseline SCr was determined either by the minimum SCr (SCrMIN) or the estimated SCr using the MDRD formula (SCrGFR-75). We also used the dynamic baseline SCr (SCrdynamic) in accordance with the 7 day/48 hour time window. AKI was defined based on the KDIGO SCr criteria. Of 562,733 hospitalized patients, 350,458 (62.3%) had at least one SCr determination, and 146,185 (26.0%) had repeat SCr tests. AKI was diagnosed in 13,883 (2.5%) patients using the SCrMIN, 21,281 (3.8%) using the SCrGFR-75 and 9,288 (1.7%) using the SCrdynamic. Compared with the non-AKI patients, AKI patients had a higher in-hospital mortality rate regardless of the baseline SCr analysis method. Because of the scarcity of SCr data, imputation of the baseline SCr is necessary to remedy the missing data. The detection rate of AKI varies depending on the different imputation methods. SCrGFR-75 can identify more AKI cases than the other two methods. © 2018 The Author(s). Published by S. Karger AG, Basel.
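
    A worked example of the SCrGFR-75 imputation: back-solve the 4-variable MDRD equation for serum creatinine assuming eGFR = 75 mL/min/1.73 m². The 175 / -1.154 / -0.203 / 0.742 coefficients are the standard MDRD values; the study may use a population-specific variant.

    ```python
    def baseline_scr_gfr75(age_years, female, gfr=75.0):
        factor = 175.0 * age_years ** -0.203
        if female:
            factor *= 0.742
        # gfr = factor * scr**-1.154  =>  scr = (factor / gfr)**(1 / 1.154)
        return (factor / gfr) ** (1.0 / 1.154)   # mg/dL

    print(baseline_scr_gfr75(60, female=False))  # ~1.0 mg/dL
    ```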

  13. Subsonic panel method for designing wing surfaces from pressure distribution

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.; Hawk, J. D.

    1983-01-01

    An iterative method has been developed for designing wing section contours corresponding to a prescribed subcritical distribution of pressure. The calculations are initialized by using a surface panel method to analyze a baseline wing or wing-fuselage configuration. A first-order expansion to the baseline panel method equations is then used to calculate a matrix containing the partial derivative of potential at each control point with respect to each unknown geometry parameter. In every iteration cycle, the matrix is used both to calculate the geometry perturbation and to analyze the perturbed geometry. The distribution of potential on the perturbed geometry is established by simple linear extrapolation from the baseline solution. The extrapolated potential is converted to pressure by Bernoulli's equation. Not only is the accuracy of the approach good for very large perturbations, but the computing cost of each complete iteration cycle is substantially less than one analysis solution by a conventional panel method.
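
    An abstract sketch of the iteration the record describes: a matrix of partial derivatives of potential with respect to geometry parameters drives a Newton-style update toward a target distribution. The "panel solver" here is a stand-in nonlinear map, not an aerodynamic code; in the real method, velocity from the differentiated potential feeds Bernoulli's equation, Cp = 1 - (V/Vinf)².

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 40                                            # control points / parameters
    J = np.eye(n) + 0.1 * rng.normal(size=(n, n))     # d(potential)/d(geometry)

    def potential(g):                                 # pretend panel analysis
        return J @ g + 0.05 * np.tanh(g)              # mildly nonlinear

    g = np.zeros(n)                                   # baseline geometry
    target_phi = potential(rng.normal(scale=0.2, size=n))  # realizable target

    for _ in range(10):
        phi = potential(g)                            # analyze current geometry
        g += np.linalg.solve(J, target_phi - phi)     # perturbation from J

    print(np.abs(potential(g) - target_phi).max())    # converges rapidly here
    ```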

  14. EFPI sensor utilizing optical spectrum analyzer with tunable laser: detection of baseline oscillations faster than spectrum acquisition rate

    NASA Astrophysics Data System (ADS)

    Ushakov, Nikolai; Liokumovich, Leonid

    2014-05-01

    A novel approach to extrinsic Fabry-Perot interferometer baseline measurement has been developed. The principles of frequency-scanning interferometry are utilized to register the interferometer's spectral function, from which the baseline is demodulated. The proposed approach enables one to capture absolute baseline variations at frequencies much higher than the spectral acquisition rate. In contrast to conventional approaches, which associate a single baseline value with each registered spectrum, the proposed method applies a modified frequency detection procedure to the spectrum. This provides the ability to capture baseline variations that took place during the spectrum acquisition. Limits on the parameters of the baseline variations that can be registered are formulated. Experimental verification of the proposed approach for different perturbations has been performed.

  15. Comparison of baseline removal methods for laser-induced breakdown spectroscopy of geological samples

    NASA Astrophysics Data System (ADS)

    Dyar, M. Darby; Giguere, Stephen; Carey, CJ; Boucher, Thomas

    2016-12-01

    This project examines the causes, effects, and optimization of continuum removal in laser-induced breakdown spectroscopy (LIBS) to produce the best possible prediction accuracy of elemental composition in geological samples. We compare prediction accuracy resulting from several different techniques for baseline removal, including asymmetric least squares (ALS), adaptive iteratively reweighted penalized least squares (Air-PLS), fully automatic baseline correction (FABC), continuous wavelet transformation, median filtering, polynomial fitting, the iterative thresholding Dietrich method, convex hull/rubber band techniques, and a newly-developed technique for Custom baseline removal (BLR). We assess the predictive performance of these methods using partial least-squares analysis for 13 elements of geological interest, expressed as the weight percentages of SiO2, Al2O3, TiO2, FeO, MgO, CaO, Na2O, K2O, and the parts per million concentrations of Ni, Cr, Zn, Mn, and Co. We find that previously published methods for baseline subtraction generally produce equivalent prediction accuracies for major elements. When those pre-existing methods are used, automated optimization of their adjustable parameters is always necessary to wring the best predictive accuracy out of a data set; ideally, it should be done for each individual variable. The new technique of Custom BLR produces significant improvements in prediction accuracy over existing methods across varying geological data sets, instruments, and varying analytical conditions. These results also demonstrate the dual objectives of the continuum removal problem: removing a smooth underlying signal to fit individual peaks (univariate analysis) versus using feature selection to select only those channels that contribute to best prediction accuracy for multivariate analyses. Overall, the current practice of using generalized, one-method-fits-all-spectra baseline removal results in poorer predictive performance for all methods. The extra steps needed to optimize baseline removal for each predicted variable and empower multivariate techniques with the best possible input data for optimal prediction accuracy are shown to be well worth the slight increase in necessary computations and complexity.
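
    A reference sketch of asymmetric least squares (ALS), the first of the baseline-removal methods compared above, following the widely used Eilers formulation: lam controls smoothness and p the asymmetry (points above the baseline, i.e. peaks, get weight p << 1). Parameter values are illustrative.

    ```python
    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import spsolve

    def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
        n = y.size
        D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))  # 2nd difference
        w = np.ones(n)
        for _ in range(n_iter):
            W = sparse.diags(w)
            z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
            w = np.where(y > z, p, 1 - p)     # asymmetric reweighting
        return z

    x = np.linspace(0, 1, 800)
    spectrum = np.exp(-((x - 0.4) / 0.01) ** 2) + 2 * x + 1   # peak + drift
    corrected = spectrum - als_baseline(spectrum)
    ```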

  16. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Configuration Management Plan

    NASA Technical Reports Server (NTRS)

    Cavanaugh, J.

    1996-01-01

    The purpose of this plan is to identify the baseline to be established during the development life cycle of the integrated AMSU-A, and to define the methods and procedures that Aerojet will follow in the implementation of configuration control for each established baseline. This plan also establishes the Configuration Management process to be used for the deliverable hardware, software, and firmware of the integrated AMSU-A during development, design, fabrication, test, and delivery.

  17. Extending the Operational Envelope of a Turbofan Engine Simulation into the Sub-Idle Region

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Hamley, Andrew J.; Guo, Ten-Huei; Litt, Jonathan S.

    2016-01-01

    In many non-linear gas turbine simulations, operation in the sub-idle region can lead to model instability. This paper lays out a method for extending the operational envelope of a map-based gas turbine simulation to include the sub-idle region. This method develops a multi-simulation solution where the baseline component maps are extrapolated below the idle level and an alternate model is developed to serve as a safety net when the baseline model becomes unstable or unreliable. Sub-idle model development takes place in two distinct operational areas, windmilling/shutdown and purge/cranking/startup. These models are based on derived steady state operating points with transient values extrapolated between initial (known) and final (assumed) states. Model transitioning logic is developed to predict baseline model sub-idle instability, and transition smoothly and stably to the backup sub-idle model. Results from the simulation show a realistic approximation of sub-idle behavior as compared to generic sub-idle engine performance that allows the engine to operate continuously and stably from shutdown to full power.
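
    A conceptual sketch of the multi-simulation hand-off: run the baseline map model while it is trusted, and blend toward a simplified sub-idle model as shaft speed drops below a validity floor. Model forms, thresholds, and units are placeholders, not engine data.

    ```python
    def baseline_model(n_shaft):
        return 50.0 + 2.0 * n_shaft          # e.g., thrust from extrapolated maps

    def subidle_model(n_shaft):
        return 0.6 * n_shaft                 # derived steady-state sub-idle points

    def blended_thrust(n_shaft, n_lo=15.0, n_hi=25.0):
        if n_shaft >= n_hi:                  # baseline maps valid
            return baseline_model(n_shaft)
        if n_shaft <= n_lo:                  # baseline unstable: safety net only
            return subidle_model(n_shaft)
        a = (n_shaft - n_lo) / (n_hi - n_lo) # smooth transition band
        return a * baseline_model(n_shaft) + (1 - a) * subidle_model(n_shaft)

    for n in (40.0, 20.0, 5.0):
        print(n, blended_thrust(n))
    ```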

  18. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan

    2015-10-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
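
    A minimal example of the persistence reference model often used as a solar forecasting baseline: tomorrow's forecast at each hour equals today's observation at the same hour. The data values are placeholders.

    ```python
    import numpy as np

    power_today = np.array([0, 0, 5, 20, 45, 60, 65, 55, 35, 12, 1, 0])  # MW
    persistence_forecast = power_today.copy()    # day-ahead persistence baseline

    # A new forecasting method must beat this (and an NWP-based baseline) on
    # metrics such as RMSE to demonstrate real forecasting skill.
    ```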

  19. Operationalizing clean development mechanism baselines: A case study of China's electrical sector

    NASA Astrophysics Data System (ADS)

    Steenhof, Paul A.

    The global carbon market is rapidly developing as the first commitment period of the Kyoto Protocol draws closer and Parties to the Protocol with greenhouse gas (GHG) emission reduction targets seek alternative ways to reduce their emissions. The Protocol includes the Clean Development Mechanism (CDM), a tool that encourages project-based investments to be made in developing nations that will lead to an additional reduction in emissions. Due to its economic size and rate of growth, technological characteristics, and reliance on coal, China contains a large proportion of the global CDM potential. As China's economy modernizes, more technologies and processes are requiring electricity and demand for this energy source is accelerating rapidly. Relatively inefficient electricity generation technology thereby gives China's electrical sector substantial GHG emission reduction opportunities as related to the CDM. In order to ensure the credibility of the CDM in leading to a reduction in GHG emissions, it is important that the baseline method used in the CDM approval process is scientifically sound and accessible both for others to use and for evaluation purposes. Three different methods for assessing CDM baselines and environmental additionality are investigated in the context of China's electrical sector: a method based on a historical perspective of the electrical sector (factor decomposition), a method structured upon a current perspective (operating and build margins), and a simulation of the future (dispatch analysis). Assessing future emission levels for China's electrical sector is a very challenging task given the complexity of the system, its dynamics, and its exposure to internal and external forces, but of the different baseline methods investigated, dispatch modelling is best suited to the Chinese context as it is able to consider the important regional and temporal dimensions of its economy and its future development. For China, the most promising options for promoting sustainable development, one of the goals of the Kyoto Protocol, appear to be tied to increasing electrical end-use and generation efficiency, particularly clean coal technology for electricity generation, since coal will likely continue to be a dominant primary fuel.

  1. Investigating the Baseline Skills of Research Students Using a Competency-Based Self-Assessment Method

    ERIC Educational Resources Information Center

    Bromley, Anthony P.; Boran, James R.; Myddelton, William A.

    2007-01-01

    Recent government-led initiatives are changing the nature of the UK PhD to support the greater development of transferable skills. There are similar initiatives internationally. A key requirement and challenge is to effectively assess the "baseline" skills of a cohort on entry to a research programme and then monitor their progress in…

  2. Testing a machine-learning algorithm to predict the persistence and severity of major depressive disorder from baseline self-reports.

    PubMed

    Kessler, R C; van Loo, H M; Wardenaar, K J; Bossarte, R M; Brenner, L A; Cai, T; Ebert, D D; Hwang, I; Li, J; de Jonge, P; Nierenberg, A A; Petukhova, M V; Rosellini, A J; Sampson, N A; Schoevers, R A; Wilcox, M A; Zaslavsky, A M

    2016-10-01

    Heterogeneity of major depressive disorder (MDD) illness course complicates clinical decision-making. Although efforts to use symptom profiles or biomarkers to develop clinically useful prognostic subtypes have had limited success, a recent report showed that machine-learning (ML) models developed from self-reports about incident episode characteristics and comorbidities among respondents with lifetime MDD in the World Health Organization World Mental Health (WMH) Surveys predicted MDD persistence, chronicity and severity with good accuracy. We report results of model validation in an independent prospective national household sample of 1056 respondents with lifetime MDD at baseline. The WMH ML models were applied to these baseline data to generate predicted outcome scores that were compared with observed scores assessed 10-12 years after baseline. ML model prediction accuracy was also compared with that of conventional logistic regression models. Area under the receiver operating characteristic curve based on ML (0.63 for high chronicity and 0.71-0.76 for the other prospective outcomes) was consistently higher than for the logistic models (0.62-0.70) despite the latter models including more predictors. A total of 34.6-38.1% of respondents with subsequent high persistence chronicity and 40.8-55.8% with the severity indicators were in the top 20% of the baseline ML-predicted risk distribution, while only 0.9% of respondents with subsequent hospitalizations and 1.5% with suicide attempts were in the lowest 20% of the ML-predicted risk distribution. These results confirm that clinically useful MDD risk-stratification models can be generated from baseline patient self-reports and that ML methods improve on conventional methods in developing such models.
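
    A sketch of the core validation arithmetic described above: apply previously learnt risk scores to an independent baseline sample, measure AUC against outcomes observed years later, and check the share of cases falling in the top 20% of predicted risk. All data are simulated stand-ins.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    n = 1056                                   # matches the validation sample size
    risk = rng.normal(size=n)                  # pretrained ML risk score (simulated)
    outcome = rng.random(n) < 1 / (1 + np.exp(-1.5 * risk))  # later persistence

    print("AUC:", roc_auc_score(outcome, risk))

    top20 = risk >= np.quantile(risk, 0.8)
    print("share of cases in top 20% of predicted risk:",
          (outcome & top20).sum() / outcome.sum())
    ```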

  3. Testing a machine-learning algorithm to predict the persistence and severity of major depressive disorder from baseline self-reports

    PubMed Central

    Kessler, Ronald C.; van Loo, Hanna M.; Wardenaar, Klaas J.; Bossarte, Robert M.; Brenner, Lisa A.; Cai, Tianxi; Ebert, David Daniel; Hwang, Irving; Li, Junlong; de Jonge, Peter; Nierenberg, Andrew A.; Petukhova, Maria V.; Rosellini, Anthony J.; Sampson, Nancy A.; Schoevers, Robert A.; Wilcox, Marsha A.; Zaslavsky, Alan M.

    2015-01-01

    Heterogeneity of major depressive disorder (MDD) illness course complicates clinical decision-making. While efforts to use symptom profiles or biomarkers to develop clinically useful prognostic subtypes have had limited success, a recent report showed that machine learning (ML) models developed from self-reports about incident episode characteristics and comorbidities among respondents with lifetime MDD in the World Health Organization World Mental Health (WMH) Surveys predicted MDD persistence, chronicity, and severity with good accuracy. We report results of model validation in an independent prospective national household sample of 1,056 respondents with lifetime MDD at baseline. The WMH ML models were applied to these baseline data to generate predicted outcome scores that were compared to observed scores assessed 10–12 years after baseline. ML model prediction accuracy was also compared to that of conventional logistic regression models. Area under the receiver operating characteristic curve (AUC) based on ML (.63 for high chronicity and .71–.76 for the other prospective outcomes) was consistently higher than for the logistic models (.62–.70) despite the latter models including more predictors. 34.6–38.1% of respondents with subsequent high persistence-chronicity and 40.8–55.8% with the severity indicators were in the top 20% of the baseline ML predicted risk distribution, while only 0.9% of respondents with subsequent hospitalizations and 1.5% with suicide attempts were in the lowest 20% of the ML predicted risk distribution. These results confirm that clinically useful MDD risk stratification models can be generated from baseline patient self-reports and that ML methods improve on conventional methods in developing such models. PMID:26728563

  4. Methods for network meta-analysis of continuous outcomes using individual patient data: a case study in acupuncture for chronic pain.

    PubMed

    Saramago, Pedro; Woods, Beth; Weatherly, Helen; Manca, Andrea; Sculpher, Mark; Khan, Kamran; Vickers, Andrew J; MacPherson, Hugh

    2016-10-06

    Network meta-analysis methods, which are an extension of the standard pair-wise synthesis framework, allow for the simultaneous comparison of multiple interventions and consideration of the entire body of evidence in a single statistical model. There are well-established advantages to using individual patient data to perform network meta-analysis, and methods for network meta-analysis of individual patient data have already been developed for dichotomous and time-to-event data. This paper describes appropriate methods for the network meta-analysis of individual patient data on continuous outcomes. It introduces network meta-analysis models for individual patient data on continuous outcomes using the analysis of covariance framework. Comparisons are made between this approach and the change score and final score only approaches, which are frequently used and have been proposed in the methodological literature. A motivating example on the effectiveness of acupuncture for chronic pain is used to demonstrate the methods. Individual patient data on 28 randomised controlled trials were synthesised. Consistency of endpoints across the evidence base was obtained through standardisation and mapping exercises. Individual patient data availability avoided the use of non-baseline-adjusted models, allowing instead for analysis of covariance models to be applied and thus improving the precision of treatment effect estimates while adjusting for baseline imbalance. The network meta-analysis of individual patient data using the analysis of covariance approach is advocated as the most appropriate modelling approach for network meta-analysis of continuous outcomes, particularly in the presence of baseline imbalance. Further methodological development is required to address the challenge of analysing aggregate-level data in the presence of baseline imbalance.
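
    A sketch of the analysis-of-covariance model the paper favours: final score regressed on baseline score plus treatment, which adjusts for baseline imbalance. Two simulated arms only; a full network model would add trial effects and more treatments.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 300
    baseline = rng.normal(60, 10, n)          # e.g., 0-100 pain scale, invented
    treat = rng.integers(0, 2, n)             # 0 = control, 1 = acupuncture
    final = 0.7 * baseline - 8.0 * treat + rng.normal(0, 8, n)

    X = sm.add_constant(np.column_stack([baseline, treat]))
    fit = sm.OLS(final, X).fit()
    print(fit.params)   # [intercept, baseline coefficient, treatment effect ~ -8]
    ```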

  5. Development of new methodologies for evaluating the energy performance of new commercial buildings

    NASA Astrophysics Data System (ADS)

    Song, Suwon

    The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V and analysis methods for measuring energy savings from new buildings, which must be compared against hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings: (1) a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed: (1) a new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) a new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) an analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. Third, an analysis of the actual energy savings compared to three different energy baselines was performed: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) new comparisons against Standards 90.1-1989 and 90.1-2001, and (3) a new evaluation of the performance of selected Energy Conservation Design Measures (ECDMs). Finally, potential energy savings were also simulated for selected improvements, including minimum supply air flow, undocumented exhaust air, and daylighting.

  6. Uncertainty Estimation Improves Energy Measurement and Verification Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, Travis; Price, Phillip N.; Sohn, Michael D.

    2014-05-14

    Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates which can provide actionable decision making information for investing in energy conservation measures.
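
    A sketch of using cross-validation to put an uncertainty band on a baseline energy model, in the spirit of the paper: hold out blocks of meter data, predict them from the rest, and take the spread of errors as prediction uncertainty. The regression itself is a simple temperature-response stand-in.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(6)
    temp = rng.uniform(0, 35, 500)                 # outdoor temperature (C), invented
    load = 80 + 3.0 * np.clip(temp - 18, 0, None) + rng.normal(0, 6, 500)  # kWh

    errors = []
    for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(temp):
        Xtr = np.clip(temp[train, None] - 18, 0, None)
        Xte = np.clip(temp[test, None] - 18, 0, None)
        model = LinearRegression().fit(Xtr, load[train])
        errors.append(load[test] - model.predict(Xte))
    errors = np.concatenate(errors)

    # e.g., a ~95% band on baseline predictions, and hence on estimated savings
    print(np.percentile(errors, [2.5, 97.5]))
    ```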

  7. Ground deformation monitoring using small baseline DInSAR technique: A case study in Taiyuan City from 2003 to 2009

    USGS Publications Warehouse

    Wu, H.-A.; Zhang, Y.-H.; Chen, X.-Y.; Lu, T.; Du, J.; Sun, Z.-H.; Sun, G.-T.

    2011-01-01

    DInSAR techniques based on time series of SAR images, such as the permanent scatterers (PS) method, the small baseline subsets (SBAS) method, and the coherent targets (CT) method, have become very popular in recent years for monitoring slow ground deformation. Taking advantage of the PS and CT methods, in this paper the small baseline DInSAR technique is used to investigate the ground deformation of Taiyuan City, Shanxi Province, from 2003 to 2009 using 23 ENVISAT ASAR images. The experimental results demonstrate that: (1) during this period four significant subsidence centers developed in Taiyuan, namely Xiayuan, Wujiabu, Xiaodian, and Sunjiazhai. The largest subsidence center is Sunjiazhai, with an average subsidence rate of -77.28 mm/a; (2) the subsidence of the old center Wanbolin has slowed down, the subsidence in the northern region has stopped, and some areas have even rebounded; (3) the change of subsidence centers indicates that the control measures of "closing wells and reducing exploitation" taken by the Taiyuan government have achieved initial effects; (4) the experimental results have been validated with leveling data, and the accuracy is 2.90 mm, which shows that the small baseline DInSAR technique can be used to monitor urban ground deformation.

  8. Increasing conclusiveness of clinical breath analysis by improved baseline correction of multi capillary column - ion mobility spectrometry (MCC-IMS) data.

    PubMed

    Szymańska, Ewa; Tinnevelt, Gerjen H; Brodrick, Emma; Williams, Mark; Davies, Antony N; van Manen, Henk-Jan; Buydens, Lutgarde M C

    2016-08-05

    Current challenges of clinical breath analysis include large data size and non-clinically relevant variations observed in exhaled breath measurements, which should be urgently addressed with competent scientific data tools. In this study, three different baseline correction methods are evaluated within a previously developed data size reduction strategy for multi capillary column - ion mobility spectrometry (MCC-IMS) datasets. Introduced for the first time in breath data analysis, the Top-hat method is presented as the optimum baseline correction method. A refined data size reduction strategy is employed in the analysis of a large breathomic dataset on a healthy and respiratory disease population. New insights into MCC-IMS spectra differences associated with respiratory diseases are provided, demonstrating the additional value of the refined data analysis strategy in clinical breath analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
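
    An illustration of Top-hat baseline correction, the method the study found optimal: a morphological opening (erosion then dilation) estimates the baseline, and the top-hat transform is the signal minus that opening. The structuring-element size is an illustrative guess and the data are synthetic.

    ```python
    import numpy as np
    from scipy.ndimage import grey_opening

    x = np.linspace(0, 1, 1000)
    peaks = np.exp(-((x - 0.35) / 0.005) ** 2) + 0.7 * np.exp(-((x - 0.6) / 0.004) ** 2)
    drift = 0.4 + 0.5 * x                      # slowly varying baseline
    signal = peaks + drift

    baseline = grey_opening(signal, size=101)  # element wider than any peak
    tophat = signal - baseline                 # white top-hat: peaks only
    ```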

  9. Development of MCAERO wing design panel method with interactive graphics module

    NASA Technical Reports Server (NTRS)

    Hawk, J. D.; Bristow, D. R.

    1984-01-01

    A reliable and efficient iterative method has been developed for designing wing section contours corresponding to a prescribed subcritical pressure distribution. The design process is initialized by using MCAERO (MCAIR 3-D Subsonic Potential Flow Analysis Code) to analyze a baseline configuration. A second program DMCAERO is then used to calculate a matrix containing the partial derivative of potential at each control point with respect to each unknown geometry parameter by applying a first-order expansion to the baseline equations in MCAERO. This matrix is calculated only once but is used in each iteration cycle to calculate the geometry perturbation and to analyze the perturbed geometry. The potential on the new geometry is calculated by linear extrapolation from the baseline solution. This extrapolated potential is converted to velocity by numerical differentiation, and velocity is converted to pressure by using Bernoulli's equation. There is an interactive graphics option which allows the user to graphically display the results of the design process and to interactively change either the geometry or the prescribed pressure distribution.

  10. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    PubMed

    Chen, Yunliang; Dai, Liankui

    2018-05-01

    Raman spectra usually suffer from baseline drift caused by fluorescence or other factors. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method first adaptively determines the structuring element and then gradually removes the spectral peaks during iteration to obtain an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible enough to handle different kinds of baselines in various practical situations. Comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method can hopefully be applied to baseline correction of other analytical instrument signals, such as IR spectra and chromatograms.
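
    A generic sketch of iterative morphological baseline estimation (not the authors' exact algorithm): repeatedly clip the spectrum to the average of a morphological opening and closing until the estimate stops changing, leaving the slowly varying baseline. Element size and tolerances are illustrative.

    ```python
    import numpy as np
    from scipy.ndimage import grey_closing, grey_opening

    def morphological_baseline(y, size=71, tol=1e-6, max_iter=50):
        b = y.copy()
        for _ in range(max_iter):
            smoothed = 0.5 * (grey_opening(b, size=size) + grey_closing(b, size=size))
            new_b = np.minimum(b, smoothed)   # peaks are eroded, baseline kept
            if np.max(np.abs(new_b - b)) < tol:
                break
            b = new_b
        return b

    x = np.linspace(0, 1, 1200)
    raman = np.exp(-((x - 0.5) / 0.003) ** 2) + 0.3 * np.sin(3 * x) + 0.5
    corrected = raman - morphological_baseline(raman)
    ```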

  11. Laser tracker orientation in confined space using on-board targets

    NASA Astrophysics Data System (ADS)

    Gao, Yang; Kyle, Stephen; Lin, Jiarui; Yang, Linghui; Ren, Yu; Zhu, Jigui

    2016-08-01

    This paper presents a novel orientation method for two laser trackers using on-board targets attached to the tracker head and rotating with it. The technique extends an existing method developed for theodolite intersection systems which are now rarely used. This method requires only a very narrow space along the baseline between the instrument heads, in order to establish the orientation relationship. This has potential application in environments where space is restricted. The orientation parameters can be calculated by means of two-face reciprocal measurements to the on-board targets, and measurements to a common point close to the baseline. An accurate model is then applied which can be solved through nonlinear optimization. Experimental comparison has been made with the conventional orientation method, which is based on measurements to common intersection points located off the baseline. This requires more space and the comparison has demonstrated the feasibility of the more compact technique presented here. Physical setup and testing suggest that the method is practical. Uncertainties estimated by simulation indicate good performance in terms of measurement quality.

  12. Management of the baseline shift using a new and simple method for respiratory-gated radiation therapy: Detectability and effectiveness of a flexible monitoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tachibana, Hidenobu; Kitamura, Nozomi; Ito, Yasushi

    2011-07-15

    Purpose: In respiratory-gated radiation therapy, a baseline shift decreases the accuracy of target coverage and organs-at-risk (OAR) sparing. The effectiveness of audio feedback and audio-visual feedback in correcting the baseline shift in the breathing pattern of the patient has been demonstrated previously. However, the baseline shift derived from the intrafraction motion of the patient's body cannot be corrected by these methods. In the present study, the authors designed and developed a simple and flexible system. Methods: The system consisted of a web camera and a computer running the authors' in-house software. The in-house software was adapted to template matching and also to no pre-image processing. The system was capable of monitoring the baseline shift in the intrafraction motion of the patient's body. Another marker box was used to monitor the baseline shift due to the flexible setups required of a marker box for gated signals. The system accuracy was evaluated by employing a respiratory motion phantom and was found to be within AAPM Task Group 142 tolerance (positional accuracy <2 mm and temporal accuracy <100 ms) for respiratory-gated radiation therapy. Additionally, the effectiveness of this flexible and independent system in gated treatment was investigated in healthy volunteers in terms of the differences in detectable baseline shift between the marker positions, which the authors evaluated statistically. Results: The movement of the marker on the sternum [1.599 ± 0.622 mm (1 SD)] was substantially smaller than that on the abdomen [6.547 ± 0.962 mm (1 SD)]. Additionally, in all of the volunteers, the baseline shifts for the sternum [-0.136 ± 0.868 mm (2 SD)] were in better agreement with the nominal baseline shifts than was the case for the abdomen [-0.722 ± 1.56 mm (2 SD)]. The baseline shifts could be accurately measured and detected using the monitoring system, which could acquire the movement of the marker on the sternum. The baseline shift-monitoring system with displacement-based methods should be used for highly accurate respiratory-gated treatments to make the most of displacement-based gating methods. Conclusions: The advent of intensity-modulated radiation therapy and volumetric-modulated radiation therapy facilitates margin reduction for the planning target volumes and the OARs, but highly accurate irradiation is needed to achieve target coverage and OAR sparing with a small margin. Baseline shifts can affect treatment both with and without the respiratory gating system. The authors' system can manage the baseline shift and also enables treatment irradiation to be undertaken with high accuracy.
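
    A sketch of the camera-based marker tracking such a system relies on: OpenCV template matching locates the marker in each webcam frame, and the drift of the matched vertical position over time is the baseline shift. The camera index, template region, and frame count are placeholders, and this is a schematic analogue, not the authors' software.

    ```python
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)                   # web camera (index is a placeholder)
    ok, frame = cap.read()
    template = frame[200:240, 300:360].copy()   # marker patch chosen once, by hand

    positions = []
    while len(positions) < 300:                 # ~10 s at 30 fps
        ok, frame = cap.read()
        if not ok:
            break
        res = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(res)
        positions.append(max_loc[1])            # vertical pixel position

    baseline_shift = np.array(positions) - positions[0]   # drift from first frame
    ```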

  13. A pragmatic randomized comparative effectiveness trial of transitional care for a socioeconomically diverse population: Design, rationale and baseline characteristics.

    PubMed

    Schaeffer, Christine; Teter, Caroline; Finch, Emily A; Hurt, Courtney; Keeter, Mary Kate; Liss, David T; Rogers, Angela; Sheth, Avani; Ackermann, Ronald

    2018-02-01

    Transitional care programs have been widely used to reduce readmissions and improve the quality and safety of the handoff process between hospital and outpatient providers. Very little is known about effective transitional care interventions among patients who are uninsured or covered by Medicaid. This paper describes the design and baseline characteristics of a pragmatic randomized comparative effectiveness trial of transitional care. The Northwestern Medical Group-Transitional Care (NMG-TC) care model was developed to address the needs of patients with multiple medical problems that require lifestyle changes and are amenable to office-based management. We present the design, evaluation methods, and baseline characteristics of NMG-TC trial patients. Baseline demographic characteristics indicate that our patient population is predominantly male, Medicaid-insured, and non-white. This study will evaluate two methods for implementing an effective transitional care model in a medically complex and socioeconomically diverse population. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan

    2015-08-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.

  15. Delirium as a Predictor of Physical and Cognitive Function in Individuals Aged 80 and Older After Transcatheter Aortic Valve Implantation or Surgical Aortic Valve Replacement.

    PubMed

    Eide, Leslie S P; Ranhoff, Anette H; Fridlund, Bengt; Haaverstad, Rune; Hufthammer, Karl Ove; Kuiper, Karel K J; Nordrehaug, Jan E; Norekvål, Tone M

    2016-06-01

    To determine how development of delirium after surgical aortic valve replacement (SAVR) or transcatheter aortic valve implantation (TAVI) could predict activity of daily living (ADL) and instrumental ADL (IADL) disability, cognitive function, and self-reported health in individuals aged 80 and older. Prospective cohort study. Tertiary university hospital. Individuals aged 80 and older undergoing elective SAVR or TAVI (N = 136). Delirium was assessed for 5 days using the Confusion Assessment Method. The Barthel Index, Nottingham Extended ADL Scale, and Medical Outcomes Study 12-item Short-Form Health Survey (SF-12) were used to determine ADL and IADL ability and self-reported health at baseline and 1- and 6-month follow-up. Cognition was assessed using the Mini-Mental State Examination at baseline and 6-month follow-up. Participants had lower IADL scores 1 month after SAVR than at baseline (baseline 58, 1 month: delirium 42, no delirium 50, P ≤ .02), but scores had returned to baseline levels at 6 months. The SF-12 Physical Component Summary (PCS) score was higher at 6-month follow-up (48) than at baseline (39), especially in participants who did not develop delirium (P < .001). No differences in other outcomes were found. Regression models suggest that delirium may help predict IADL disability 1 month after baseline (P ≤ .07) but does not predict large differences in ADL disability, cognitive function, or SF-12 scores. Individuals who underwent TAVI and developed delirium had lower ADL (baseline 19, 1-month 16, P < .001) and IADL (baseline 49, 1-month 40, P = .003) scores at 1-month follow-up. The SF-12 PCS score (baseline 30) increased from baseline to 1-month (35, P = .04) and 6-month (35, P = .02) follow-up in individuals who underwent TAVI and did not develop delirium. Delirium after TAVI predicted greater ADL and IADL disability at 1-month but not at 6-month follow-up. Individuals who develop delirium after SAVR and TAVI have poorer short-term IADL function but do not seem to have long-term reductions in physical, mental, or self-reported health. © 2016 The Authors. The Journal of the American Geriatrics Society published by Wiley Periodicals, Inc. on behalf of The American Geriatrics Society.

  16. Artificial intelligence (AI) based tactical guidance for fighter aircraft

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.; Goodrich, Kenneth H.

    1990-01-01

    A research program investigating the use of artificial intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range air combat engagements is discussed. The application of AI programming and problem-solving methods in the development and implementation of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS), a second-generation TDG, is presented. The knowledge-based systems used by CLAWS to aid in the tactical decision-making process are outlined in detail, and the results of tests comparing the performance of CLAWS against a baseline TDG, developed in FORTRAN to run in real time in the Langley Differential Maneuvering Simulator, are presented. To date, these test results have shown significant performance gains with respect to the baseline TDG in one-versus-one air combat engagements, and the AI-based TDG software has proven to be much easier to modify and maintain than the baseline FORTRAN TDG programs.

  17. Airfoil/Wing Flow Control Using Flexible Extended Trailing Edge

    DTIC Science & Technology

    2009-02-27

    [Figure and caption fragments recovered from extraction: (b) power spectra of the drag coefficient; Figure 4, mean velocity profiles for the baseline NACA0012 at 18 deg and 20 deg angle of attack; fin dynamics, (a) fin amplitude and (b) power spectrum of fin amplitude; headings on development of computational tools and simulations of time-dependent deformation. Surviving abstract text: the flexible extended trailing edge was studied by a combination of experimental, computational, and theoretical methods; compared with the Gurney flap and a conventional flap, this device enhanced lift at a smaller ...]

  18. Chiral separation of G-type chemical warfare nerve agents via analytical supercritical fluid chromatography.

    PubMed

    Kasten, Shane A; Zulli, Steven; Jones, Jonathan L; Dephillipo, Thomas; Cerasoli, Douglas M

    2014-12-01

    Chemical warfare nerve agents (CWNAs) are extremely toxic organophosphorus compounds that contain a chiral phosphorus center. Undirected synthesis of G-type CWNAs produces stereoisomers of tabun, sarin, soman, and cyclosarin (GA, GB, GD, and GF, respectively). Analytical-scale methods were developed using a supercritical fluid chromatography (SFC) system in tandem with a mass spectrometer for the separation, quantitation, and isolation of individual stereoisomers of GA, GB, GD, and GF. Screening various chiral stationary phases (CSPs) for the capacity to provide full baseline separation of the CWNAs revealed that a Regis WhelkO1 (SS) column was capable of separating the enantiomers of GA, GB, and GF, with elution of the P(+) enantiomer preceding elution of the corresponding P(-) enantiomer; two WhelkO1 (SS) columns had to be connected in series to achieve complete baseline resolution. The four diastereomers of GD were also resolved using two tandem WhelkO1 (SS) columns, with complete baseline separation of the two P(+) epimers. A single WhelkO1 (RR) column with inverse stereochemistry resulted in baseline separation of the GD P(-) epimers. The analytical methods described can be scaled to allow isolation of individual stereoisomers to assist in screening and development of countermeasures to organophosphorus nerve agents. © 2014 The Authors. Chirality published by John Wiley Periodicals, Inc.

  19. Updating the 2001 National Land Cover Database Impervious Surface Products to 2006 using Landsat imagery change detection methods

    USGS Publications Warehouse

    Xian, George; Homer, Collin G.

    2010-01-01

    A prototype method was developed to update the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 to a nominal date of 2006. NLCD 2001 is widely used as a baseline for national land cover and impervious cover conditions. To enable the updating of this database in an optimal manner, the methods were designed to be applied scene by scene to individual Landsat scenes. Using conservative change thresholds based on land cover classes, areas of change and no-change were segregated from change vectors calculated from normalized Landsat scenes from 2001 and 2006. By sampling from NLCD 2001 impervious surface in unchanged areas, impervious surface predictions were estimated for changed areas within an urban extent defined by a companion land cover classification. Methods were developed and tested for national application across six study sites containing a variety of urban impervious surface. Results show that the vast majority of impervious surface change associated with urban development was captured, with overall RMSE values ranging from 6.86% to 13.12% across these areas. Changes in urban development density were also evaluated by characterizing the categories of change by impervious-surface percentile. This prototype method provides a relatively low-cost, flexible approach to generating updated impervious surface estimates using NLCD 2001 as the baseline.
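
    A minimal sketch of the change-vector screening step described above, assuming normalized band stacks for the two dates; the band values, class codes, and per-class thresholds are placeholders, not the NLCD production values.

```python
import numpy as np

# Flag pixels whose spectral change between the 2001 and 2006 scenes
# exceeds a conservative, class-dependent threshold.
bands_2001 = np.random.rand(6, 100, 100)   # placeholder normalized scene, 6 bands
bands_2006 = np.random.rand(6, 100, 100)
land_cover = np.random.randint(1, 5, (100, 100))  # placeholder class codes

# Change-vector magnitude across all bands.
cv_mag = np.sqrt(((bands_2006 - bands_2001) ** 2).sum(axis=0))

# Hypothetical per-class thresholds (conservative = few false changes).
thresholds = {1: 0.90, 2: 0.80, 3: 0.85, 4: 0.95}
thr = np.vectorize(thresholds.get)(land_cover)

changed = cv_mag > thr   # only these pixels get new impervious estimates
print(f"{changed.mean():.1%} of pixels flagged as change")
```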

  20. Developing dementia prevention trials: baseline report of the Home-Based Assessment study.

    PubMed

    Sano, Mary; Egelko, Susan; Donohue, Michael; Ferris, Steven; Kaye, Jeffrey; Hayes, Tamara L; Mundt, James C; Sun, Chung-Kai; Paparello, Silvia; Aisen, Paul S

    2013-01-01

    This report describes the baseline experience of the multicenter Home-Based Assessment study, designed to develop methods for dementia prevention trials using novel technologies for test administration and data collection. Nondemented individuals aged 75 years or older were recruited and evaluated in person using established clinical trial outcomes of cognition and function, and randomized to one of 3 assessment methodologies: (1) mail-in questionnaire/live telephone interviews [mail-in/phone (MIP)]; (2) automated telephone with interactive voice recognition; and (3) internet-based computer Kiosk. Brief versions of cognitive and noncognitive outcomes were adapted to each methodology and administered at baseline and repeatedly over a 4-year period. "Efficiency" measures assessed the time from screening to baseline and the staff time required for each methodology. A total of 713 individuals signed consent and were screened; 640 met eligibility and were randomized to one of the 3 assessment arms; and 581 completed baseline. Dropout, time from screening to baseline, and total staff time were highest among those assigned to the internet-based computer Kiosk. However, efficiency measures were driven by nonrecurring start-up activities, suggesting that differences may be mitigated over a long trial. Performance among Home-Based Assessment instruments collected through different technologies will be compared with established outcomes over this 4-year study.

  1. Developing Dementia Prevention Trials: Baseline Report of the Home-Based Assessment Study

    PubMed Central

    Sano, Mary; Egelko, Susan; Donohue, Michael; Ferris, Steven; Kaye, Jeffrey; Hayes, Tamara L.; Mundt, James C.; Sun, C.K.; Paparello, Silvia; Aisen, Paul S.

    2014-01-01

    This report describes the baseline experience of the multi-center Home Based Assessment (HBA) study, designed to develop methods for dementia prevention trials using novel technologies for test administration and data collection. Non-demented individuals ≥ 75 years old were recruited and evaluated in-person using established clinical trial outcomes of cognition and function, and randomized to one of 3 assessment methodologies: 1) mail-in questionnaire/live telephone interviews (MIP); 2) automated telephone with interactive voice recognition (IVR); and 3) internet-based computer Kiosk (KIO). Brief versions of cognitive and non-cognitive outcomes were adapted to each methodology and administered at baseline and repeatedly over a 4-year period. “Efficiency” measures assessed the time from screening to baseline and the staff time required for each methodology. A total of 713 individuals signed consent and were screened; 640 met eligibility and were randomized to one of 3 assessment arms, and 581 completed baseline. Dropout, time from screening to baseline, and total staff time were highest among those assigned to KIO. However, efficiency measures were driven by non-recurring start-up activities, suggesting that differences may be mitigated over a long trial. Performance among HBA instruments collected via different technologies will be compared to established outcomes over this 4-year study. PMID:23151596

  2. SU-E-T-468: Implementation of the TG-142 QA Process for Seven Linacs with Enhanced Beam Conformance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woollard, J; Ayan, A; DiCostanzo, D

    2015-06-15

    Purpose: To develop a TG-142 compliant QA process for 7 Varian TrueBeam linear accelerators (linacs) with enhanced beam conformance and dosimetrically matched beam models. To ensure consistent performance of all 7 linacs, the QA process should include a common set of baseline values for use in routine QA on all linacs. Methods: The TG-142 report provides recommended tests, tolerances, and frequencies for quality assurance of medical accelerators. Based on the guidance provided in the report, measurement tests were developed to evaluate each of the applicable parameters listed for daily, monthly, and annual QA. These tests were then performed on each of our 7 new linacs as they came on line at our institution. Results: The tolerance values specified in TG-142 for each QA test are either absolute tolerances (i.e., ±2 mm) or require a comparison to a baseline value. The results of our QA tests were first used to ensure that all 7 linacs were operating within the suggested tolerance values provided in TG-142 for those tests with absolute tolerances and that the performance of the linacs was adequately matched. The QA test results were then used to develop a set of common baseline values for those QA tests that require comparison to a baseline value at routine monthly and annual QA. The procedures and baseline values were incorporated into spreadsheets for use in monthly and annual QA. Conclusion: We have developed a set of procedures for daily, monthly, and annual QA of our linacs that are consistent with the TG-142 report. A common set of baseline values was developed for routine QA tests. The use of this common set of baseline values for comparison at monthly and annual QA will ensure consistent performance of all 7 linacs.

  3. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    PubMed

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline from a set of sampled basis vectors obtained by applying PCA to a previously composed learning matrix of continuous spectra. The parametric method, however, uses an ANN to filter out the baseline. Previous studies have demonstrated that the ANN method is one of the most effective for baseline removal. The evaluation of both methods was carried out by using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. In addition, to demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
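
    A minimal sketch of the PCA-based idea (not the authors' code): learn a low-dimensional basis from a matrix of continuous baselines, then least-squares-fit that basis to an observed spectrum and subtract. The synthetic baseline shapes and the choice of three components are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 500)

# Learning matrix: synthetic continuous baselines (placeholder shapes).
baselines = np.stack([a * x + b * x**2 + c
                      for a, b, c in rng.uniform(0, 1, (100, 3))])

# PCA via SVD of the mean-centered learning matrix.
mean = baselines.mean(axis=0)
_, _, vt = np.linalg.svd(baselines - mean, full_matrices=False)
basis = vt[:3]                        # keep the first few components

def remove_baseline(spectrum):
    """Estimate the baseline as mean + least-squares mix of components.
    A real implementation would fit only in peak-free regions."""
    coeff, *_ = np.linalg.lstsq(basis.T, spectrum - mean, rcond=None)
    return spectrum - (mean + basis.T @ coeff)

peak = np.exp(-((x - 0.5) / 0.01) ** 2)           # toy emission line
corrected = remove_baseline(peak + 0.3 * x + 0.1)  # line + drifting baseline
```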

  4. Quantification of baseline pupillary response and task-evoked pupillary response during constant and incremental task load.

    PubMed

    Mosaly, Prithima R; Mazur, Lukasz M; Marks, Lawrence B

    2017-10-01

    The methods employed to quantify the baseline pupil size and task-evoked pupillary response (TEPR) may affect overall study results. To test this hypothesis, the objective of this study was to assess variability in baseline pupil size and TEPR during two basic working memory tasks: a constant load of 3-letter memorisation-recall (10 trials), and incremental-load memorisation-recall (two trials at each load level), using two commonly used methods: (1) change from the trial/load-specific baseline, and (2) change from a constant baseline. Results indicated that there was a significant shift in baseline between trials for the constant load, and between load levels for the incremental load. The TEPR was independent of shifts in baseline using method 1 only for the constant load, and using method 2 only for the higher levels of the incremental load condition. These important findings suggest that the assessment of both the baseline and the methods used to quantify TEPR are critical in ergonomics applications, especially in studies with a small number of trials per subject per condition. Practitioner Summary: Quantification of TEPR can be affected by shifts in baseline pupil size that are most likely caused by non-cognitive factors when other external factors are kept constant. Therefore, the quantification methods employed to compute both baseline and TEPR are critical to understanding the information processing of humans in practical ergonomics settings.
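
    The two quantification methods compared above differ only in the reference value subtracted from the task-window mean. A sketch with toy pupil traces; the window indices and drift magnitude are illustrative.

```python
import numpy as np

def tepr_trial_baseline(pupil, base_win=slice(0, 50), task_win=slice(50, 300)):
    """Method 1: change from the trial-specific pre-stimulus baseline."""
    return pupil[task_win].mean() - pupil[base_win].mean()

def tepr_constant_baseline(pupil, constant_baseline, task_win=slice(50, 300)):
    """Method 2: change from one constant (e.g., session-start) baseline."""
    return pupil[task_win].mean() - constant_baseline

# Toy traces whose baseline drifts upward across trials.
trials = [np.random.default_rng(i).normal(4.0 + 0.02 * i, 0.05, 300)
          for i in range(10)]
session_baseline = trials[0][:50].mean()

m1 = [tepr_trial_baseline(t) for t in trials]
m2 = [tepr_constant_baseline(t, session_baseline) for t in trials]
# With a drifting baseline, m2 inflates with trial number while m1 does not.
```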

  5. Propensity score to detect baseline imbalance in cluster randomized trials: the role of the c-statistic.

    PubMed

    Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno

    2016-01-22

    Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, the diagnosis of such an imbalance is essential so that statistical analyses can be adjusted if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for preselection of the covariates to be included in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
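
    A sketch of the diagnostic itself, under the simplifying assumption of a plain logistic PS model that ignores clustering (a real CRT analysis would account for it): fit arm membership on baseline covariates and read off the c-statistic, where values near 0.5 indicate balance.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 500                                    # per-arm size, as in the simulations
X = rng.normal(size=(2 * n, 10))           # placeholder baseline covariates
arm = np.repeat([0, 1], n)
X[arm == 1, 0] += 0.4                      # induce imbalance on one covariate

# Propensity score model: predict arm from baseline covariates.
ps_model = LogisticRegression(max_iter=1000).fit(X, arm)
c_stat = roc_auc_score(arm, ps_model.predict_proba(X)[:, 1])
print(f"propensity-score c-statistic: {c_stat:.3f}")   # ~0.5 means balanced
```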

  6. Automatically finding relevant citations for clinical guideline development.

    PubMed

    Bui, Duy Duc An; Jonnalagadda, Siddhartha; Del Fiol, Guilherme

    2015-10-01

    Literature database search is a crucial step in the development of clinical practice guidelines and systematic reviews. Even in the age of information technology, the process of literature search is still conducted manually; it is therefore costly, slow, and subject to human error. In this research, we sought to improve the traditional search approach using innovative query expansion and citation ranking approaches. We developed a citation retrieval system composed of query expansion and citation ranking methods. The methods are unsupervised and easily integrated over the PubMed search engine. To validate the system, we developed a gold standard consisting of citations that were systematically searched and screened to support the development of cardiovascular clinical practice guidelines. The expansion and ranking methods were evaluated separately and compared with baseline approaches. Compared with the baseline PubMed expansion, the query expansion algorithm improved recall (80.2% vs. 51.5%) with a small loss in precision (0.4% vs. 0.6%). The algorithm could find all citations used to support a larger number of guideline recommendations than the baseline approach (64.5% vs. 37.2%, p<0.001). In addition, the citation ranking approach performed better than PubMed's "most recent" ranking (average precision +6.5%, recall@k +21.1%, p<0.001), PubMed's rank by "relevance" (average precision +6.1%, recall@k +14.8%, p<0.001), and the machine learning classifier that identifies scientifically sound studies from MEDLINE citations (average precision +4.9%, recall@k +4.2%, p<0.001). Our unsupervised query expansion and ranking techniques are more flexible and effective than PubMed's default search engine behavior and the machine learning classifier. Automated citation finding is promising to augment the traditional literature search. Copyright © 2015 Elsevier Inc. All rights reserved.
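
    The ranking comparisons above rest on recall@k and average precision. A self-contained sketch of those two metrics over a ranked citation list, with hypothetical PMIDs:

```python
def recall_at_k(ranked_ids, relevant_ids, k):
    """Fraction of relevant citations found in the top k results."""
    hits = sum(1 for pmid in ranked_ids[:k] if pmid in relevant_ids)
    return hits / len(relevant_ids)

def average_precision(ranked_ids, relevant_ids):
    """Mean of precision values at each rank where a relevant item appears."""
    hits, total = 0, 0.0
    for i, pmid in enumerate(ranked_ids, start=1):
        if pmid in relevant_ids:
            hits += 1
            total += hits / i
    return total / len(relevant_ids) if relevant_ids else 0.0

gold = {"11111", "22222", "33333"}                 # hypothetical gold-standard PMIDs
ranking = ["22222", "99999", "11111", "88888", "33333"]
print(recall_at_k(ranking, gold, 3), average_precision(ranking, gold))
```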

  7. Chiral Separation of G-type Chemical Warfare Nerve Agents via Analytical Supercritical Fluid Chromatography

    PubMed Central

    Kasten, Shane A; Zulli, Steven; Jones, Jonathan L; Dephillipo, Thomas; Cerasoli, Douglas M

    2014-01-01

    Chemical warfare nerve agents (CWNAs) are extremely toxic organophosphorus compounds that contain a chiral phosphorus center. Undirected synthesis of G-type CWNAs produces stereoisomers of tabun, sarin, soman, and cyclosarin (GA, GB, GD, and GF, respectively). Analytical-scale methods were developed using a supercritical fluid chromatography (SFC) system in tandem with a mass spectrometer for the separation, quantitation, and isolation of individual stereoisomers of GA, GB, GD, and GF. Screening various chiral stationary phases (CSPs) for the capacity to provide full baseline separation of the CWNAs revealed that a Regis WhelkO1 (SS) column was capable of separating the enantiomers of GA, GB, and GF, with elution of the P(+) enantiomer preceding elution of the corresponding P(–) enantiomer; two WhelkO1 (SS) columns had to be connected in series to achieve complete baseline resolution. The four diastereomers of GD were also resolved using two tandem WhelkO1 (SS) columns, with complete baseline separation of the two P(+) epimers. A single WhelkO1 (RR) column with inverse stereochemistry resulted in baseline separation of the GD P(–) epimers. The analytical methods described can be scaled to allow isolation of individual stereoisomers to assist in screening and development of countermeasures to organophosphorus nerve agents. Chirality 26:817–824, 2014. © 2014 The Authors. Chirality published by John Wiley Periodicals, Inc. PMID:25298066

  8. Characterization of fission gas bubbles in irradiated U-10Mo fuel

    DOE PAGES

    Casella, Andrew M.; Burkes, Douglas E.; MacFarlan, Paul J.; ...

    2017-06-06

    A simple, repeatable method for characterization of fission gas bubbles in irradiated U-Mo fuels has been developed. This method involves mechanical potting and polishing of samples along with examination with a scanning electron microscope located outside of a hot cell. The commercially available software packages CellProfiler, MATLAB, and Mathematica are used to segment and analyze the captured images. The results are compared and contrasted. Finally, baseline methods for fission gas bubble characterization are suggested for consideration and further development.
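
    The paper's pipeline uses CellProfiler, MATLAB, and Mathematica; the following scikit-image sketch shows an equivalent segment-and-measure step on a placeholder micrograph, with made-up smoothing and size parameters.

```python
import numpy as np
from skimage import filters, measure, morphology

# Placeholder "micrograph": smoothed noise stands in for an SEM image of
# irradiated U-10Mo; real inputs would be the polished-sample SEM captures.
sem = filters.gaussian(np.random.default_rng(4).random((512, 512)), sigma=3)

thresh = filters.threshold_otsu(sem)          # separate bubbles from matrix
mask = sem > thresh
mask = morphology.remove_small_objects(mask, min_size=20)   # despeckle

labels = measure.label(mask)                  # one label per bubble
props = measure.regionprops(labels)
diameters = [p.equivalent_diameter for p in props]
print(f"{len(props)} bubbles, mean equivalent diameter "
      f"{np.mean(diameters):.1f} px")
```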

  9. ANSI/ASHRAE/IES Standard 90.1-2010 Performance Rating Method Reference Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goel, Supriya; Rosenberg, Michael I.

    This document is intended to be a reference manual for the Appendix G Performance Rating Method (PRM) of ANSI/ASHRAE/IES Standard 90.1-2010 (Standard 90.1-2010). The PRM is used for rating the energy efficiency of commercial and high-rise residential buildings with designs that exceed the requirements of Standard 90.1. The procedures and processes described in this manual are designed to provide consistency and accuracy by filling in gaps and providing additional details needed by users of the PRM. It should be noted that this document was created independently from ASHRAE and SSPC 90.1 and is neither sanctioned nor approved by either of those entities. Potential users of this manual include energy modelers, software developers, and implementers of “beyond code” energy programs. Energy modelers using ASHRAE Standard 90.1-2010 for beyond-code programs can use this document as a reference manual for interpreting requirements of the Performance Rating Method. Software developers building tools for automated creation of the baseline model can use this reference manual as a guideline for developing the rules for the baseline model.

  10. Dynamic baseline detection method for power data network service

    NASA Astrophysics Data System (ADS)

    Chen, Wei

    2017-08-01

    This paper proposes a dynamic-baseline traffic detection method based on historical traffic data for the power data network. The method uses Cisco's NetFlow acquisition tool to collect the original historical traffic data from network elements at fixed intervals. It uses three dimensions of information: the communication port, time, and traffic volume (number of bytes or number of packets). By filtering, removing deviant values, calculating the dynamic baseline value, and comparing the actual value with the baseline value, the method can detect whether the current network traffic is abnormal.
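
    A minimal sketch of the detection loop described above, assuming five-minute NetFlow byte counts arranged as days x time slots; the trimming fraction and three-sigma band are illustrative choices, not values from the paper.

```python
import numpy as np

# 30 days of history, 288 five-minute slots per day (placeholder counts).
history = np.random.default_rng(2).poisson(1000, size=(30, 288))

def dynamic_baseline(hist, trim=0.1):
    """Per-slot baseline: mean of history after trimming deviant values."""
    lo, hi = np.quantile(hist, [trim, 1 - trim], axis=0)
    clipped = np.where((hist >= lo) & (hist <= hi), hist, np.nan)
    return np.nanmean(clipped, axis=0)

baseline = dynamic_baseline(history)
tolerance = 3 * history.std(axis=0)        # illustrative per-slot band

def is_abnormal(slot, traffic):
    """Compare a live value against the baseline for its time slot."""
    return abs(traffic - baseline[slot]) > tolerance[slot]

print(is_abnormal(42, 5000))   # flag a slot with an unusual byte count
```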

  11. Baseline Risk Factors that Predict the Development of Open-angle Glaucoma in a Population: The Los Angeles Latino Eye Study

    PubMed Central

    Jiang, Xuejuan; Varma, Rohit; Wu, Shuang; Torres, Mina; Azen, Stanley P; Francis, Brian A.; Chopra, Vikas; Nguyen, Betsy Bao-Thu

    2012-01-01

    Objective To determine which baseline socio-demographic, lifestyle, anthropometric, clinical, and ocular risk factors predict the development of open-angle glaucoma (OAG) in an adult population. Design A population-based, prospective cohort study. Participants A total of 3,772 self-identified Latinos aged 40 years and older from Los Angeles, California, who were free of OAG at baseline. Methods Participants from the Los Angeles Latino Eye Study had standardized study visits at baseline and 4-year follow-up with structured interviews and a comprehensive ophthalmologic examination. OAG was defined as the presence of an open angle and a glaucomatous visual field abnormality and/or evidence of glaucomatous optic nerve damage in at least one eye. Multivariate logistic regression with stepwise selection was performed to determine which potential baseline risk factors independently predict the development of OAG. Main Outcome Measure Odds ratios for various risk factors. Results Over the 4-year follow-up, 87 participants developed OAG. The baseline risk factors that predict the development of OAG include: older age (odds ratio [OR] per decade, 2.19; 95% confidence interval [CI], 1.74-2.75; P<0.001), higher intraocular pressure (OR per mmHg, 1.18; 95% CI, 1.10-1.26; P<0.001), longer axial length (OR per mm, 1.48; 95% CI, 1.22-1.80; P<0.001), thinner central cornea (OR per 40 μm thinner, 1.30; 95% CI, 1.00-1.70; P=0.050), higher waist-to-hip ratio (OR per 0.05 higher, 1.21; 95% CI, 1.05-1.39; P=0.007), and lack of vision insurance (OR, 2.08; 95% CI, 1.26-3.41; P=0.004). Conclusions Despite a mean baseline IOP of 14 mmHg in Latinos, higher intraocular pressure is an important risk factor for developing OAG. Biometric measures suggestive of less structural support, such as longer axial length and thinner central corneal thickness (CCT), were identified as important risk factors. Lack of health insurance reduces access to eye care and increases the burden of OAG by reducing the likelihood of early detection and treatment of OAG. PMID:22796305

  12. Instantaneous Real-Time Kinematic Decimeter-Level Positioning with BeiDou Triple-Frequency Signals over Medium Baselines.

    PubMed

    He, Xiyang; Zhang, Xiaohong; Tang, Long; Liu, Wanke

    2015-12-22

    Many applications, such as marine navigation and land vehicle location, require real-time precise positioning under medium or long baseline conditions. In this contribution, we develop a model for real-time kinematic decimeter-level positioning with BeiDou Navigation Satellite System (BDS) triple-frequency signals over medium distances. The ambiguities of two extra-wide-lane (EWL) combinations are fixed first, and then a wide-lane (WL) combination is reformed based on the two EWL combinations for positioning. Theoretical and empirical analyses are given of the ambiguity fixing rate and the positioning accuracy of the presented method. The results indicate that the ambiguity fixing rate can be more than 98% when using BDS medium baseline observations, which is much higher than that of the dual-frequency Hatch-Melbourne-Wübbena (HMW) method. As for positioning accuracy, decimeter-level accuracy can be achieved with this method, which is comparable to that of the carrier-smoothed code differential positioning method. A signal-interruption simulation experiment indicates that the proposed method can realize fast high-precision positioning, whereas the carrier-smoothed code differential positioning method needs several hundred seconds to obtain high-precision results. We conclude that a relatively high accuracy and high fixing rate can be achieved with the triple-frequency WL method using single-epoch observations, a significant advantage over the traditional carrier-smoothed code differential positioning method.
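
    The combination arithmetic behind the method can be sketched directly. The BDS B1I/B2I/B3I carrier frequencies are public; the (i, j, k) coefficient choices shown are common triple-frequency combinations and not necessarily the exact ones used in the paper.

```python
C = 299_792_458.0                                  # speed of light, m/s
F1, F2, F3 = 1561.098e6, 1207.140e6, 1268.520e6    # BDS B1I, B2I, B3I (Hz)

def combo_wavelength(i, j, k):
    """Wavelength of the (i, j, k) carrier-phase combination."""
    return C / (i * F1 + j * F2 + k * F3)

print(f"EWL (0,-1,1): {combo_wavelength(0, -1, 1):.3f} m")   # ~4.88 m
print(f"WL  (1,-1,0): {combo_wavelength(1, -1, 0):.3f} m")   # ~0.85 m
# Long combination wavelengths make the integer ambiguities far easier to
# fix instantaneously, which is what enables single-epoch positioning.
```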

  13. Multiple Acquisition InSAR Analysis: Persistent Scatterer and Small Baseline Approaches

    NASA Astrophysics Data System (ADS)

    Hooper, A.

    2006-12-01

    InSAR techniques that process data from multiple acquisitions enable us to form time series of deformation and also allow us to reduce error terms present in single interferograms. There are currently two broad categories of methods that deal with multiple images: persistent scatterer methods and small baseline methods. The persistent scatterer approach relies on identifying pixels whose scattering properties vary little with time and look angle. Pixels that are dominated by a singular scatterer best meet these criteria; therefore, images are processed at full resolution to both increase the chance of there being only one dominant scatterer present, and to reduce the contribution from other scatterers within each pixel. In images where most pixels contain multiple scatterers of similar strength, even at the highest possible resolution, the persistent scatterer approach is less optimal, as the scattering characteristics of these pixels vary substantially with look angle. In this case, an approach that interferes only pairs of images for which the difference in look angle is small makes better sense, and resolution can be sacrificed to reduce the effects of the look angle difference by band-pass filtering. This is the small baseline approach. Existing small baseline methods depend on forming a series of multilooked interferograms and unwrapping each one individually. This approach fails to take advantage of two of the benefits of processing multiple acquisitions, however, which are usually embodied in persistent scatterer methods: the ability to find and extract the phase for single-look pixels with good signal-to-noise ratio that are surrounded by noisy pixels, and the ability to unwrap more robustly in three dimensions, the third dimension being that of time. We have developed, therefore, a new small baseline method to select individual single-look pixels that behave coherently in time, so that isolated stable pixels may be found. After correction for various error terms, the phase values of the selected pixels are unwrapped using a new three-dimensional algorithm. We apply our small baseline method to an area in southern Iceland that includes Katla and Eyjafjallajökull volcanoes, and retrieve a time series of deformation that shows transient deformation due to intrusion of magma beneath Eyjafjallajökull. We also process the data using the Stanford method for persistent scatterers (StaMPS) for comparison.

  14. Baseline Obesity Status Modifies Effectiveness of Adapted Diabetes Prevention Program Lifestyle Interventions for Weight Management in Primary Care

    PubMed Central

    Azar, Kristen M. J.; Xiao, Lan; Ma, Jun

    2013-01-01

    Objective. To examine whether baseline obesity severity modifies the effects of two different, primary care-based, technology-enhanced lifestyle interventions among overweight or obese adults with prediabetes and/or metabolic syndrome. Patients and Methods. We compared mean differences in changes from baseline to 15 months in clinical measures of general and central obesity among participants randomized to usual care alone (n = 81) or usual care plus a coach-led group (n = 79) or self-directed individual (n = 81) intervention, stratified by baseline body mass index (BMI) category. Results. Participants with baseline BMI 35+ had greater reductions in mean BMI, body weight (as percentage change), and waist circumference in the coach-led group intervention, compared to usual care and the self-directed individual intervention (P < 0.05 for all). In contrast, the self-directed intervention was more effective than usual care only among participants with baseline BMIs of 25 to <35. Mean weight loss exceeded 5% in the coach-led intervention regardless of baseline BMI category, but this was achieved only among self-directed intervention participants with baseline BMIs <35. Conclusions. Baseline BMI may influence behavioral weight-loss treatment effectiveness. Researchers and clinicians should take an individual's baseline BMI into account when developing or recommending a lifestyle-focused treatment strategy. This trial is registered with ClinicalTrials.gov NCT00842426. PMID:24369008

  15. A distribution method for analysing the baseline of pulsatile endocrine signals as exemplified by 24-hour growth hormone profiles.

    PubMed

    Matthews, D R; Hindmarsh, P C; Pringle, P J; Brook, C G

    1991-09-01

    To develop a method for quantifying the distribution of concentrations present in hormone profiles, which would allow an observer-unbiased estimate of the time-concentration attribute and an assessment of the baseline. The log-transformed concentrations (regardless of their temporal attribute) are sorted and allocated to class intervals. The number of observations in each interval is then determined and expressed as a percentage of the total number of samples drawn in the study period. The data may be displayed as a frequency distribution or as a cumulative distribution. Cumulative distributions may be plotted as sigmoidal ogives or can be transformed into discrete probabilities (linear probits), which are then linear and amenable to regression analysis. Probability analysis gives estimates of the mean (the value below which 50% of the observed concentrations lie, which we term 'OC50'). 'Baseline' can be defined in terms of percentage occupancy: the 'Observed Concentration for 5%' (which we term 'OC5') is the threshold at or below which the hormone concentrations are measured 5% of the time. We report the use of this method on 24-hour growth hormone (GH) profiles from 63 children, 26 adults, and one giant. We demonstrate that GH effects (growth or gigantism) in these groups are more related to the baseline OC5 concentration than to peak concentration (OC5 ± 95% confidence limits: adults 0.05 ± 0.04, peak-height-velocity pubertal 0.39 ± 0.22, giant 8.9 mU/l). Pulsatile hormone profiles can be analysed using this method in order to assess baseline and other concentration domains.
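
    The OC50 and OC5 statistics reduce to percentiles of the pooled concentrations; the paper reads them off probit-transformed cumulative ogives, but the same quantities follow directly, as this sketch with a placeholder 24-hour profile shows.

```python
import numpy as np

# Placeholder 24-h GH profile sampled every 10 min (144 values, mU/l);
# the lognormal shape is illustrative of a pulsatile hormone series.
profile = np.random.default_rng(3).lognormal(mean=1.0, sigma=1.2, size=144)

# Pool the concentrations, ignoring time, and read off the distribution:
# OC5 = concentration at or below which the hormone sits 5% of the time
# (the "baseline"); OC50 = the distributional mean of the method.
oc5, oc50 = np.percentile(profile, [5, 50])
print(f"OC5 (baseline) = {oc5:.2f} mU/l, OC50 = {oc50:.2f} mU/l")
```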

  16. Selectivity optimization in green chromatography by gradient stationary phase optimized selectivity liquid chromatography.

    PubMed

    Chen, Kai; Lynen, Frédéric; De Beer, Maarten; Hitzel, Laure; Ferguson, Paul; Hanna-Brown, Melissa; Sandra, Pat

    2010-11-12

    Stationary phase optimized selectivity liquid chromatography (SOSLC) is a promising technique to optimize the selectivity of a given separation by using a combination of different stationary phases. Previous work has shown that SOSLC offers excellent possibilities for method development, especially after the recent modification towards linear gradient SOSLC. The present work is aimed at developing and extending the SOSLC approach towards selectivity optimization and method development for green chromatography. Contrary to current LC practices, a green mobile phase (water/ethanol/formic acid) is hereby preselected and the composition of the stationary phase is optimized under a given gradient profile to obtain baseline resolution of all target solutes in the shortest possible analysis time. With the algorithm adapted to the high viscosity property of ethanol, the principle is illustrated with a fast, full baseline resolution for a randomly selected mixture composed of sulphonamides, xanthine alkaloids and steroids. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Smoothness of In vivo Spectral Baseline Determined by Mean Squared Error

    PubMed Central

    Zhang, Yan; Shen, Jun

    2013-01-01

    Purpose A nonparametric smooth line is usually added to the spectral model to account for background signals in in vivo magnetic resonance spectroscopy (MRS). The assumed smoothness of the baseline significantly influences quantitative spectral fitting. In this paper, a method is proposed to minimize baseline influences on estimated spectral parameters. Methods The non-parametric baseline function with a given smoothness was treated as a function of spectral parameters. Its uncertainty was measured by root-mean-squared error (RMSE). The proposed method was demonstrated with a simulated spectrum and in vivo spectra of both short echo time (TE) and averaged echo times. The estimated in vivo baselines were compared with the metabolite-nulled spectra and the LCModel-estimated baselines. The accuracies of the estimated baseline and metabolite concentrations were further verified by cross-validation. Results An optimal smoothness condition was found that led to the minimal baseline RMSE. In this condition, the best fit was balanced against minimal baseline influences on metabolite concentration estimates. Conclusion Baseline RMSE can be used to indicate estimated baseline uncertainties and serve as the criterion for determining the baseline smoothness of in vivo MRS. PMID:24259436

  18. Estimating effectiveness of HPV vaccination against HPV infection from post-vaccination data in the absence of baseline data.

    PubMed

    Vänskä, Simopekka; Söderlund-Strand, Anna; Uhnoo, Ingrid; Lehtinen, Matti; Dillner, Joakim

    2018-04-28

    HPV vaccination programs have been introduced in large parts of the world, but monitoring of effectiveness is not routinely performed. Many countries introduced vaccination programs without establishing the baseline of HPV prevalences. We developed and validated methods to estimate the protective effectiveness (PE) of vaccination from post-vaccination data alone, using references that are invariant under HPV vaccination. Type-specific HPV prevalence data for 15-39-year-old women were collected from the pre- and post-vaccination era in a region in southern Sweden. In a region in middle Sweden, where no baseline data had been collected, only post-vaccination data were collected. The age-specific baseline prevalences of vaccine HPV types (vtHPV; HPV 6, 11, 16, 18) were reconstructed as Beta distributions from post-vaccination data by applying the reference odds ratios between the target HPV type and non-vaccine-type HPV (nvtHPV) prevalences. Older non-vaccinated age cohorts and the southern Sweden region were used as the references. The methods for baseline reconstruction were validated by computing the Bhattacharyya coefficient (BC), a measure of divergence, between the reconstructed and the actually observed prevalences for vaccine HPV types in southern Sweden and, in addition, for non-vaccine types in both regions. The PE estimates among 18-21-year-old women were validated by comparing the PE estimates based on the reconstructed baseline prevalences against the PE estimates based on the actual baseline prevalences. In southern Sweden the PEs against vtHPV were 52.2% (95% CI: 44.9-58.5) using the reconstructed baseline and 49.6% (43.2-55.5) using the actual baseline, with a high BC of 82.7% between the reconstructed and actual baselines. In the middle Sweden region, where baseline data were missing, the PE was estimated at 40.5% (31.6-48.5). The protective effectiveness of HPV vaccination can thus be estimated from post-vaccination data alone by reconstructing the baseline using non-vaccine HPV type data. Copyright © 2018 Elsevier Ltd. All rights reserved.
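
    A sketch of the reconstruction logic with illustrative numbers (the study additionally models the baseline as a Beta distribution to carry uncertainty): since non-vaccine-type prevalence is assumed invariant under vaccination, the reference vt:nvt odds ratio lets one back out the vaccine-type baseline and hence the protective effectiveness.

```python
def odds(p):
    return p / (1 - p)

def inv_odds(o):
    return o / (1 + o)

# Reference (non-vaccinated cohorts/region): observed vt and nvt prevalences.
p_vt_ref, p_nvt_ref = 0.12, 0.20                 # hypothetical values
or_ref = odds(p_vt_ref) / odds(p_nvt_ref)        # reference odds ratio

# Target population post-vaccination: nvt observed, vt baseline reconstructed.
p_nvt_post, p_vt_post = 0.21, 0.06               # hypothetical values
p_vt_baseline = inv_odds(or_ref * odds(p_nvt_post))

pe = 1 - p_vt_post / p_vt_baseline               # protective effectiveness
print(f"reconstructed baseline {p_vt_baseline:.3f}, PE = {pe:.1%}")
```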

  19. The Danish Organic Action Plan 2020: assessment method and baseline status of organic procurement in public kitchens.

    PubMed

    Sørensen, Nina N; Lassen, Anne D; Løje, Hanne; Tetens, Inge

    2015-09-01

    With political support from the Danish Organic Action Plan 2020, organic public procurement in Denmark is expected to increase. In order to evaluate changes in organic food procurement in Danish public kitchens, reliable methods are needed. The present study aimed to compare organic food procurement measurements by two methods and to collect and discuss baseline organic food procurement measurements from public kitchens participating in the Danish Organic Action Plan 2020. Comparison study measuring organic food procurement by applying two different methods, one based on the use of procurement invoices (the Organic Cuisine Label method) and the other on self-reported procurement (the Dogme method). Baseline organic food procurement status was based on organic food procurement measurements and background information from public kitchens. Public kitchens participating in the six organic food conversion projects funded by the Danish Organic Action Plan 2020 during 2012 and 2013. Twenty-six public kitchens (comparison study) and 345 public kitchens (baseline organic food procurement status). A high significant correlation coefficient was found between the two organic food procurement measurement methods (r=0·83, P<0·001) with measurements relevant for the baseline status. Mean baseline organic food procurement was found to be 24 % when including measurements from both methods. The results indicate that organic food procurement measurements by both methods were valid for the baseline status report of the Danish Organic Action Plan 2020. Baseline results in Danish public kitchens suggest there is room for more organic as well as sustainable public procurement in Denmark.

  20. A long baseline global stereo matching based upon short baseline estimation

    NASA Astrophysics Data System (ADS)

    Li, Jing; Zhao, Hong; Li, Zigang; Gu, Feifei; Zhao, Zixin; Ma, Yueyang; Fang, Meiqi

    2018-05-01

    In global stereo vision, balancing matching efficiency against computing accuracy seems impossible because the two contradict each other. In the case of a long baseline, this contradiction becomes more prominent. In order to solve this difficult problem, this paper proposes a novel idea to improve both the efficiency and the accuracy of global stereo matching for a long baseline. First, reference images located between the long-baseline image pairs are chosen to form new image pairs with short baselines. The relationship between the disparities of pixels in image pairs with different baselines is revealed by considering the quantization error, so that the disparity search range under the long baseline can be reduced by guidance of the short baseline to gain matching efficiency. Then, the novel idea is integrated into graph cuts (GCs) to form a multi-step GC algorithm based on the short baseline estimation, by which the disparity map under the long baseline can be calculated iteratively on the basis of the previous matching. Furthermore, the image information from pixels that are non-occluded under the short baseline but occluded for the long baseline can be employed to improve the matching accuracy. Although the time complexity of the proposed method depends on the locations of the chosen reference images, it is usually much lower for long-baseline stereo matching than that of the traditional GC algorithm. Finally, the validity of the proposed method is examined by experiments on benchmark datasets. The results show that the proposed method is superior to the traditional GC method in terms of both efficiency and accuracy, and thus it is suitable for long-baseline stereo matching.
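
    The guidance step rests on the fact that, for rectified cameras viewing the same scene, disparity scales linearly with baseline length. A sketch with placeholder values; the 2-pixel margin stands in for the paper's quantization-error analysis.

```python
import numpy as np

B_SHORT, B_LONG = 0.1, 0.4            # baseline lengths, metres (placeholders)

# Placeholder short-baseline disparity map (e.g., from a first GC pass).
d_short = np.random.randint(5, 20, size=(64, 64))

d_pred = d_short * (B_LONG / B_SHORT)   # predicted long-baseline disparity
margin = 2                              # illustrative quantization/noise allowance
search_lo, search_hi = d_pred - margin, d_pred + margin

# A global matcher (e.g., graph cuts) now only evaluates disparity labels
# within [search_lo, search_hi] per pixel instead of the full range.
print(search_lo[0, 0], search_hi[0, 0])
```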

  1. Development of Temporomandibular Disorders is associated with greater bodily pain experience

    PubMed Central

    Lim, Pei Feng; Smith, Shad; Bhalang, Kanokporn; Slade, Gary D.; Maixner, William

    2009-01-01

    Objectives The aim of this study is to examine the difference in the report of bodily pain experienced by subjects who develop temporomandibular disorders (TMD) and by those who do not develop TMD over a 3 year observation period. Methods This is a 3 year prospective study of 266 females aged 18–34 years initially free of TMD pain. All subjects completed the Symptom Report Questionnaire (SRQ) at baseline and yearly intervals, and at the time they developed TMD (if applicable). The SRQ is a self-report instrument evaluating the extent and location of pain experienced in the prior 6 months. Statistical analysis was carried out using repeated measures ANOVA. Results Over the 3 year period, 16 subjects developed TMD based on the Research Diagnostic Criteria for TMD. Subjects who developed TMD reported more headaches (P=0.0089), muscle soreness or pain (P=0.005), joint soreness or pain (P=0.0012), back pain (P=0.0001), chest pain (P=0.0004), abdominal pain (P=0.0021), and menstrual pain (P=0.0036) than subjects who did not develop TMD at both the baseline and final visits. Subjects who developed TMD also reported significantly more headache (P=0.0006), muscle soreness or pain (P=0.0059), and other pains (P=0.0188) when they were diagnosed with TMD compared to the baseline visit. Discussion The development of TMD was accompanied by increases in headaches, muscle soreness or pain, and other pains that were not observed in the subjects who did not develop TMD. Subjects who developed TMD also report higher experience of joint, back, chest and menstrual pain at baseline. PMID:20090437

  2. Updating the 2001 National Land Cover Database land cover classification to 2006 by using Landsat imagery change detection methods

    USGS Publications Warehouse

    Xian, George; Homer, Collin G.; Fry, Joyce

    2009-01-01

    The recent release of the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001, which represents the nation's land cover status based on a nominal date of 2001, is widely used as a baseline for national land cover conditions. To enable the updating of this land cover information in a consistent and continuous manner, a prototype method was developed to update land cover by an individual Landsat path and row. This method updates NLCD 2001 to a nominal date of 2006 by using both Landsat imagery and data from NLCD 2001 as the baseline. Pairs of Landsat scenes in the same season in 2001 and 2006 were acquired according to satellite paths and rows and normalized to allow calculation of change vectors between the two dates. Conservative thresholds based on Anderson Level I land cover classes were used to segregate the change vectors and determine areas of change and no-change. Once change areas had been identified, land cover classifications at the full NLCD resolution for 2006 areas of change were completed by sampling from NLCD 2001 in unchanged areas. Methods were developed and tested across five Landsat path/row study sites that contain several metropolitan areas including Seattle, Washington; San Diego, California; Sioux Falls, South Dakota; Jackson, Mississippi; and Manchester, New Hampshire. Results from the five study areas show that the vast majority of land cover change was captured and updated with overall land cover classification accuracies of 78.32%, 87.5%, 88.57%, 78.36%, and 83.33% for these areas. The method optimizes mapping efficiency and has the potential to provide users a flexible method to generate updated land cover at national and regional scales by using NLCD 2001 as the baseline.

  3. On Possibility of Direct Asteroid Deflection by Electric Solar Wind Sail

    NASA Astrophysics Data System (ADS)

    Merikallio, Sini; Janhunen, Pekka

    2010-05-01

    The Electric Solar Wind Sail (E-sail) is a new propulsion method for interplanetary travel which was invented in 2006 and is currently under development. The E-sail uses charged tethers to extract momentum from the solar wind particles to obtain propulsive thrust. According to current estimates, the E-sail is 2-3 orders of magnitude better than traditional propulsion methods (chemical rockets and ion engines) in terms of produced lifetime-integrated impulse per propulsion system mass. Here we analyze the problem of using the E-sail for directly deflecting an Earth-threatening asteroid. The problem then culminates into how to attach the E-sail device to the asteroid. We assess a number of alternative attachment strategies and arrive at a recommendation of using the gravity tractor method because of its workability for a wide variety of asteroid types. We also consider possible techniques to scale up the E-sail force beyond the baseline one Newton level to deal with more imminent or larger asteroid or cometary threats. As a baseline case we consider a 3 million ton asteroid which can be deflected with a baseline 1 N E-sail in 5-10 years. Once developed, the E-sail would appear to provide a safe and reasonably low-cost way of deflecting dangerous asteroids and other heavenly bodies in cases where the collision threat becomes known several years in advance.

  4. Development of an approach to correcting MicroPEM baseline drift.

    PubMed

    Zhang, Ting; Chillrud, Steven N; Pitiranggon, Masha; Ross, James; Ji, Junfeng; Yan, Beizhan

    2018-07-01

    Fine particulate matter (PM2.5) is associated with various adverse health outcomes. The MicroPEM (RTI, NC), a miniaturized real-time portable particulate sensor with an integrated filter for collecting particles, has been widely used for personal PM2.5 exposure assessment. Five-day deployments were targeted, with a total of 142 deployments (personal or residential) conducted to obtain real-time PM2.5 levels from children living in New York City and Baltimore. Among these 142 deployments, 79 applied high-efficiency particulate air (HEPA) filters in the field at the beginning and end of each deployment to adjust the zero level of the nephelometer. However, unacceptable baseline drift was observed in a large fraction (>40%) of acquisitions in this study even after HEPA correction. This drift issue has been observed in several other studies as well. The purpose of the present study is to develop an algorithm to correct the baseline drift in MicroPEM data based on central-site ambient data during inactive time periods. A running baseline & gravimetric correction (RBGC) method was developed based on the comparison of MicroPEM readings during inactive periods to ambient PM2.5 levels provided by fixed monitoring sites and on the gravimetric weight of PM2.5 collected on the MicroPEM filters. The results after RBGC correction were compared with those using the HEPA approach and gravimetric correction alone. Seven pairs of duplicate acquisitions were used to validate the RBGC method. The percentages of acquisitions with baseline drift problems were 42%, 53%, and 10% for raw, HEPA-corrected, and RBGC-corrected data, respectively. Pearson correlation analysis of duplicates showed an increase in the coefficient of determination from 0.75 for raw data to 0.97 after RBGC correction. In addition, the slope of the regression line increased from 0.60 for raw data to 1.00 after RBGC correction. The RBGC approach corrected the baseline drift issue associated with MicroPEM data. The algorithm developed has the potential for use with data generated from other types of PM sensors that contain a filter for weighing as well. In addition, this approach can be applied in many other regions, given widely available ambient PM data from monitoring networks, especially in urban areas. Copyright © 2018 Elsevier Inc. All rights reserved.
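
    A minimal sketch of the RBGC idea (not RTI's code), assuming known inactive windows and a fixed-site ambient series on the same time grid; all numbers are placeholders.

```python
import numpy as np

# Estimate drift as the offset between MicroPEM readings and ambient PM2.5
# during inactive periods, interpolate it as a running baseline, subtract,
# then scale the trace so its mean matches the filter's gravimetric value.
t = np.arange(0, 7200, 10.0)                       # seconds, placeholder grid
neph = np.random.rand(t.size) * 20 + 0.004 * t     # drifting nephelometer trace
ambient = np.full(t.size, 12.0)                    # fixed-site PM2.5, placeholder
inactive = (t % 3600) < 600                        # assumed inactive windows

drift = neph[inactive] - ambient[inactive]         # offset during inactivity
baseline = np.interp(t, t[inactive], drift)        # running baseline estimate
corrected = neph - baseline

grav_conc = 15.0                                   # filter-based mean, ug/m3
corrected *= grav_conc / corrected.mean()          # gravimetric correction
```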

  5. Association of Fitness With Incident Dyslipidemias Over 25 Years in the Coronary Artery Risk Development in Young Adults Study

    PubMed Central

    Sarzynski, Mark A.; Schuna, John M.; Carnethon, Mercedes R.; Jacobs, David R.; Lewis, Cora E.; Quesenberry, Charles P.; Sidney, Stephen; Schreiner, Pamela J.; Sternfeld, Barbara

    2015-01-01

    Introduction Few studies have examined the longitudinal associations of fitness, or of changes in fitness, with the risk of developing dyslipidemias. This study examined the associations of (1) baseline fitness with 25-year dyslipidemia incidence and (2) 20-year fitness change with dyslipidemia development in middle age in the Coronary Artery Risk Development in Young Adults (CARDIA) study. Methods Multivariable Cox proportional hazards regression models were used to test the association of baseline fitness (1985–1986) with dyslipidemia incidence over 25 years (2010–2011) in CARDIA (N=4,898). Modified Poisson regression models were used to examine the association of 20-year change in fitness with dyslipidemia incidence between Years 20 and 25 (n=2,487). Data were analyzed in June 2014 and February 2015. Results In adjusted models, the risk of incident low high-density lipoprotein cholesterol (HDL-C), high triglycerides, and high low-density lipoprotein cholesterol (LDL-C) was significantly lower, by 9%, 16%, and 14%, respectively, for each 2.0-minute increase in baseline treadmill endurance. After additional adjustment for baseline trait level, the associations remained significant for incident high triglycerides and high LDL-C in the total population and for incident high triglycerides in both men and women. In race-stratified models, these associations appeared to be limited to whites. In adjusted models, change in fitness did not predict 5-year incidence of dyslipidemias, whereas baseline fitness significantly predicted 5-year incidence of high triglycerides. Conclusions Our findings demonstrate the importance of cardiorespiratory fitness in young adulthood as a risk factor for developing dyslipidemias, particularly high triglycerides, during the transition to middle age. PMID:26165197

  6. Longitudinal Association between Periodontitis and Development of Diabetes.

    PubMed

    Joshipura, Kaumudi J; Muñoz-Torres, Francisco J; Dye, Bruce A; Leroux, Brian G; Ramírez-Vick, Margarita; Pérez, Cynthia M

    2018-04-18

    Clinical trials have shown very modest short-term improvements in glycemic control among participants with diabetes after periodontitis treatment. Few longitudinal studies suggest that periodontitis may be related to prediabetes/diabetes risk. We evaluated 1,206 diabetes-free participants in the San Juan Overweight Adults Longitudinal Study (SOALS); 941 participants with complete 3-year follow-up data were included. The National Health and Nutrition Examination Survey (NHANES) methods were used to assess periodontitis. Diabetes and prediabetes were classified using American Diabetes Association cutoffs for fasting and 2-hour post-load glucose and HbA1c. We used Poisson regression adjusting for baseline age, gender, smoking, education, family history of diabetes, physical activity, waist circumference, and alcohol intake. Over the 3-year follow-up, 69 (7.3%) of the 941 individuals developed type 2 diabetes, and 142 (34.9%) of the 407 with normal glycemia at baseline developed prediabetes. In multivariable models, greater mean pocket depth and mean attachment loss at baseline were associated with lower risk of developing prediabetes/diabetes over the follow-up (IRR=0.81; 95% CI: 0.67-0.99, and IRR=0.86; 95% CI: 0.74-0.99, respectively). An increase in periodontal attachment loss from baseline to follow-up was associated with higher prediabetes/diabetes risk (multivariate IRR=1.25; 95% CI: 1.09-1.42), and an increase in pocket depth was associated with a >20% increase in fasting glucose (multivariate IRR=1.43; 95% CI: 1.14-1.79). The inverse associations persisted after additionally adjusting for baseline income, sugar-sweetened beverages, number of teeth, oral hygiene, glycemia, or previous periodontal therapy. There was no consistent association between periodontitis and risk of prediabetes/diabetes in this longitudinal study. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Behavioral repertoire of larval zebrafish: Baseline activity and response to drug treatment.

    EPA Science Inventory

    As part of the EPA’s effort to develop an in vivo, vertebrate screen for toxic chemicals, we have begun to characterize basic behaviors of 6-day post-fertilization (dpf) zebrafish (Danio rerio) larvae in a microtiter plate format. Our main goal is to develop a method for rapidly ...

  8. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

    DOE PAGES

    Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; ...

    2015-11-10

    Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.

  9. Macular Atrophy Development and Subretinal Drusenoid Deposits in Anti-Vascular Endothelial Growth Factor Treated Age-Related Macular Degeneration

    PubMed Central

    Zarubina, Anna V.; Gal-Or, Orly; Huisingh, Carrie E.; Owsley, Cynthia

    2017-01-01

    Purpose To explore the association between the presence of subretinal drusenoid deposits (SDD) at baseline in eyes with neovascular age-related macular degeneration (nAMD) and the development of macular atrophy (MA) during anti-vascular endothelial growth factor (VEGF) therapy. Methods Seventy-four eyes without pre-existing MA that received anti-VEGF therapy for nAMD for 2 years or longer were analyzed. At least two image modalities, including spectral-domain optical coherence tomography, near-infrared reflectance, fluorescein angiography, and color fundus photos, were used to assess SDD presence, phenotype (dot and ribbon), and location, as well as neovascularization type and MA. Logistic regression models using generalized estimating equations assessed the association between SDD and the development of MA, adjusting for age, neovascularization type, and choroidal thickness. Results SDD were present in 46 eyes (63%) at baseline. MA developed in 38 eyes (51%) during the mean 4.7 ± 1.2 years of follow-up. Compared with eyes without SDD, those with SDD at baseline were 3.0 times (95% confidence interval [CI] 1.1–8.5, P = 0.0343) more likely to develop MA. Eyes with SDD present in the inferior macula and inferior extramacular fields at baseline were 3.0 times and 6.5 times more likely to develop MA at follow-up than eyes without SDD in these locations (95% CI 1.0–8.9, P = 0.0461 and 95% CI 1.3–32.4, P = 0.0218, respectively). MA development was not associated with a specific SDD phenotype. Conclusions MA frequently developed in eyes during anti-VEGF treatment. SDD were independently associated with MA development. The extension of SDD into the inferior fundus, particularly the inferior extramacular field, conferred higher odds of subsequent MA development. PMID:29196768

  10. Conserving biodiversity and ecosystem function through limited development: an empirical evaluation.

    PubMed

    Milder, Jeffrey C; Lassoie, James P; Bedford, Barbara L

    2008-02-01

    Suburban, exurban, and rural development in the United States consumes nearly 1 million hectares of land per year and is a leading threat to biodiversity. In response to this threat, conservation development has been advanced as a way to combine land development and land conservation while providing functional protection for natural resources. Yet, although conservation development techniques have been in use for decades, there have been few critical evaluations of their conservation effectiveness. We addressed this deficiency by assessing the conservation outcomes of one type of conservation development project: conservation and limited development projects (CLDPs). Conducted by land trusts, landowners, and developers, CLDPs use revenue from limited development to finance the protection of land and natural resources. We compared a sample of 10 CLDPs from the eastern United States with their respective baseline scenarios (conventional development) and with a sample of conservation subdivisions--a different conservation development technique characterized by higher-density development. To measure conservation success, we created an evaluation method containing eight indicators that quantify project impacts to terrestrial and aquatic ecosystems at the site and in the surrounding landscape. The CLDPs protected and managed threatened natural resources including rare species and ecological communities. In terms of conservation benefits, the CLDPs significantly outperformed their respective baseline scenarios and the conservation subdivisions. These results imply that CLDPs can offer a low-impact alternative to conventional development and a low-cost method for protecting land when conventional conservation techniques are too expensive. In addition, our evaluation method demonstrates how planners and developers can incorporate appropriate ecological considerations when designing, reviewing, and evaluating conservation development projects.

  11. GNSS Single Frequency, Single Epoch Reliable Attitude Determination Method with Baseline Vector Constraint.

    PubMed

    Gong, Ang; Zhao, Xiubin; Pang, Chunlei; Duan, Rong; Wang, Yong

    2015-12-02

    For Global Navigation Satellite System (GNSS) single frequency, single epoch attitude determination, this paper proposes a new reliable method with a baseline vector constraint. First, prior knowledge of baseline length, heading, and pitch obtained from other navigation equipment or sensors is used to rigorously reconstruct the objective function. Second, the search strategy is improved: a gradually enlarged ellipsoidal search space is substituted for the non-ellipsoidal search space to ensure that the correct ambiguity candidates lie within it, allowing the search to be carried out directly by the least-squares ambiguity decorrelation adjustment (LAMBDA) method. Candidate vectors are then further screened using a derived approximate inequality, which accelerates the search. Experimental results show that, compared to the traditional method with only a baseline length constraint, the new method can exploit a priori three-dimensional baseline knowledge to fix ambiguities reliably and achieve a high success rate. Experimental tests also verify that it is not very sensitive to baseline vector error and performs robustly when the angular error is moderate.
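
    The pruning effect of a known baseline vector can be shown with a brute-force stand-in (deliberately not the LAMBDA search used in the paper; the wavelength is the GPS L1 carrier, all other names and tolerances are illustrative):

        import itertools
        import numpy as np

        WAVELENGTH = 0.1903  # GPS L1 carrier wavelength (m)

        def resolve_ambiguities(phases, los, length, tol=0.02, search=2):
            # phases: carrier-phase single differences (cycles), shape (m,)
            # los:    unit line-of-sight vectors to m satellites, shape (m, 3)
            # length: known baseline length (m) used to prune candidates
            best = None
            for cand in itertools.product(range(-search, search + 1),
                                          repeat=len(phases)):
                rng = (phases - np.array(cand)) * WAVELENGTH  # implied ranges (m)
                b, *_ = np.linalg.lstsq(los, rng, rcond=None)  # LS baseline
                miss = abs(np.linalg.norm(b) - length)
                if miss < tol and (best is None or miss < best[0]):
                    best = (miss, np.array(cand), b)
            return best  # (length misfit, integer set, baseline vector) or None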

  12. GPS Attitude Determination Using Deployable-Mounted Antennas

    NASA Technical Reports Server (NTRS)

    Osborne, Michael L.; Tolson, Robert H.

    1996-01-01

    The primary objective of this investigation is to develop a method to solve for spacecraft attitude in the presence of potentially incomplete antenna deployment. Most research on the use of the Global Positioning System (GPS) in attitude determination has assumed that the antenna baselines are known to less than 5 centimeters, or one quarter of the GPS signal wavelength. However, if the GPS antennas are mounted on a deployable fixture such as a solar panel, the actual antenna positions will not necessarily be within 5 cm of nominal. Incomplete antenna deployment could cause the baselines to be grossly in error, perhaps by as much as a meter. Overcoming this large uncertainty in order to accurately determine attitude is the focus of this study. To this end, a two-step solution method is proposed. The first step uses a least-squares estimate of the baselines to geometrically calculate the deployment angle errors of the solar panels. For the spacecraft under investigation, the first step determines the baselines to 3-4 cm with 4-8 minutes of data. A Kalman filter is then used to complete the attitude determination process, resulting in typical attitude errors of 0.5 deg.

  13. Impact of the Birkman Method Assessment on Pharmacy Student Self-Confidence, Self-Perceptions, and Self-Awareness

    PubMed Central

    Grant, Amy D.; Fabel, Patricia H.; Worrall, Cathy; Brittain, Kristy; Martinez, Breanne; Lu, Z. Kevin; Davis, Robert; Doran, Georgia H.; Ziegler, Bryan

    2016-01-01

    Objective. To identify changes in pharmacy student self-confidence, self-perceptions, and self-awareness after completing the Birkman Method assessment and training program. Methods. Survey tools were developed to evaluate students at baseline and following the co-curricular Birkman Method program. Following IRB approval, students participating in the Birkman Method program were recruited for enrollment in this survey-based study. Results. Student self-confidence was high at baseline (mean=4 out of 5) and did not significantly change after Birkman Method testing and training. Self-perceptions regarding usual and stressed communication style and behaviors and behavioral needs under stress changed significantly after Birkman Method testing and training for these endpoints. The Birkman Method intervention resulted in a significant improvement in self-awareness, as indicated by a mean self-perception accuracy score increase of 1.6 points (95% CI: 1.3-1.9). Conclusions. A Birkman Method assessment and training program is an effective self-assessment tool for students, and may be useful for accomplishing Accreditation Council for Pharmacy Education (ACPE) 2016 Standard 4 affective domain elements, particularly self-awareness. PMID:28090097

  14. Operational Dynamic Configuration Analysis

    NASA Technical Reports Server (NTRS)

    Lai, Chok Fung; Zelinski, Shannon

    2010-01-01

    Sectors may combine or split within areas of specialization in response to changing traffic patterns. This method of managing capacity and controller workload could be made more flexible by dynamically modifying sector boundaries. Much work has been done on methods for dynamically creating new sector boundaries [1-5]. Many assessments of dynamic configuration methods assume the current-day baseline configuration remains fixed [6-7]. A challenging question is how to select a dynamic configuration baseline to assess potential benefits of proposed dynamic configuration concepts. Bloem used operational sector reconfigurations as a baseline [8]. The main difficulty is that operational reconfiguration data is noisy. Reconfigurations often occur frequently to accommodate staff training or breaks, or to complete a more complicated reconfiguration through a rapid sequence of simpler reconfigurations. Gupta quantified a few aspects of airspace boundary changes from this data [9]. Most of these metrics are unique to sector combining operations and not applicable to more flexible dynamic configuration concepts. To better understand what sort of reconfigurations are acceptable or beneficial, more configuration change metrics should be developed and their distribution in current practice should be computed. This paper proposes a method to select a simple sequence of configurations among operational configurations to serve as a dynamic configuration baseline for future dynamic configuration concept assessments. New configuration change metrics are applied to the operational data to establish current-day thresholds for these metrics. These thresholds are then corroborated, refined, or dismissed based on airspace practitioner feedback. The dynamic configuration baseline selection method uses a k-means clustering algorithm to select the sequence of configurations and trigger times from a given day of operational sector combination data. The clustering algorithm selects a simplified schedule containing k configurations based on the stability scores of the sector combinations among the raw operational configurations. In addition, the number of selected configurations is chosen to balance accuracy against assessment complexity.
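
    A minimal sketch of the clustering step, under the assumption that each observed configuration is encoded as a feature vector (the paper's stability-score computation and configuration encoding are not reproduced here):

        import numpy as np
        from sklearn.cluster import KMeans

        def simplify_schedule(snapshots, times, k):
            # snapshots: (n, d) array, one feature vector per observed
            #            configuration (e.g., binary sector-combination flags)
            # times:     (n,) array of snapshot times (hours)
            km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(snapshots)
            schedule = []
            for c in range(k):
                idx = np.where(km.labels_ == c)[0]
                # Representative = member closest to the cluster centre.
                rep = idx[np.argmin(np.linalg.norm(
                    snapshots[idx] - km.cluster_centers_[c], axis=1))]
                schedule.append((float(times[idx].min()), int(rep)))
            return sorted(schedule)  # (trigger time, configuration index) pairs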

  15. Geodetic and Astrometric Measurements with Very-Long-Baseline Interferometry. Ph.D. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Robertson, D. S.

    1975-01-01

    The use of very-long-baseline interferometry (VLBI) observations for the estimation of geodetic and astrometric parameters is discussed. Analytic models for the dependence of delay and delay rate on these parameters are developed and used for parameter estimation by the method of weighted least squares. Results are presented from approximately 15,000 delay and delay-rate observations, obtained in a series of nineteen VLBI experiments involving a total of five stations on two continents. The closure of baseline triangles is investigated and found to be consistent with the scatter of the various baseline-component results. Estimates are made of the wobble of the earth's pole and of the irregularities in the earth's rotation rate. Estimates are also made of the precession constant and of the vertical Love number, for which a value of 0.55 ± 0.05 was obtained.

  16. Increasing ambulatory pulse pressure predicts the development of left ventricular hypertrophy during long-term follow-up.

    PubMed

    Pääkkö, Tero J W; Perkiömäki, Juha S; Kesäniemi, Y Antero; Ylitalo, Antti S; Lumme, Jarmo A; Huikuri, Heikki V; Ukkola, Olavi H

    2018-03-01

    Ambulatory blood pressure (ABP) has been shown to have an association with left ventricular hypertrophy (LVH). We evaluated the association between ABP characteristics and the development of LVH during long-term follow-up (20 years) in 420 middle-aged subjects from the OPERA cohort. ABP measurements (ABPM) were recorded and echocardiographic examinations were performed at baseline and at the revisit. Anthropometrics were measured and laboratory analyses performed at each visit. The questionnaire presented to all participants elicited detailed information about their habits. Left ventricular mass index (LVMI) was calculated according to Troy's method. Baseline LVMI was a significant independent predictor of LVMI change (p < 0.001). None of the baseline continuous ABPM variables predicted the change in LVMI. A greater increase in daytime and night-time systolic blood pressure (BP) (p from 0.006 to 0.048) and in 24-h, daytime, and night-time pulse pressure (PP) (p from 0.005 to 0.034) predicted a greater increase in LVMI. In particular, increases in night-time SBP (p = 0.006) and PP (p = 0.005) predicted a greater increase in LVMI. We also considered circadian BP profiles among subjects whose ABPM at baseline and echocardiographic measurements at both baseline and follow-up were available. Diastolic non-dippers showed a greater increase in LVMI than diastolic dippers (10.6 ± 33.0 g/m² vs. 7.0 ± 28.8 g/m², p = 0.032) when baseline LVMI and in-office DBP were taken into account. These findings suggest that increasing ambulatory PP, and possibly a diastolic non-dipping status, increase the risk for the development of LVH during the later life course.

  17. Obstructive Sleep Apnea (OSA) in Preadolescent Girls is Associated with Delayed Breast Development Compared to Girls without OSA

    PubMed Central

    Shaw, Natalie D.; Goodwin, James L.; Silva, Graciela E.; Hall, Janet E.; Quan, Stuart F.; Malhotra, Atul

    2013-01-01

    Study Objective: Adults with obstructive sleep apnea (OSA) have lower sex steroid levels than controls. We sought to determine whether OSA also interferes with reproductive hormones in adolescence by tracking the pace of pubertal development. Methods: One hundred seventy-two children in the Tucson Children's Assessment of Sleep Apnea study (TuCASA) underwent two home polysomnographic studies, spaced 4-5 years apart. Height and weight were measured at both visits, and Tanner staging of breasts/genitals and pubic hair was self-assessed by a pictorial questionnaire at follow-up. Results: Eighty-seven girls and 85 boys, age 8.9 ± 1.6 years (mean ± SD) at baseline and 13.4 ± 1.6 years at follow-up, participated. Twenty-seven percent of participants were overweight or obese at baseline, and the majority remained so at follow-up. Twenty-six percent of girls and 28% of boys met criteria for OSA, defined as a respiratory disturbance index (RDI) ≥ 1/h associated with a 3% desaturation (RDI 3%), at baseline. There was an inverse relationship between baseline log RDI 3% and Tanner breast stage at follow-up (coefficient -1.3, p = 0.02) in girls after adjusting for age (p < 0.001), body mass index (p < 0.005), and ethnicity. Girls with OSA at baseline were more than 1 Tanner breast stage behind girls without OSA at follow-up. OSA did not affect genital development in boys or pubic hair development in either sex. Conclusions: OSA in preadolescent girls predicts delayed breast development relative to girls without OSA. Sleep fragmentation and/or hypoxia seen in OSA may interfere with reproductive development in girls. Citation: Shaw ND; Goodwin JL; Silva GE; Hall JE; Quan SF; Malhotra A. Obstructive sleep apnea (OSA) in preadolescent girls is associated with delayed breast development compared to girls without OSA. J Clin Sleep Med 2013;9(8):813-818. PMID:23946712

  18. Leveraging probabilistic peak detection to estimate baseline drift in complex chromatographic samples.

    PubMed

    Lopatka, Martin; Barcaru, Andrei; Sjerps, Marjan J; Vivó-Truyols, Gabriel

    2016-01-29

    Accurate analysis of chromatographic data often requires the removal of baseline drift. A frequently employed strategy strives to determine asymmetric weights in order to fit a baseline model by regression. Unfortunately, chromatograms characterized by a very high peak saturation pose a significant challenge to such algorithms. In addition, a low signal-to-noise ratio (i.e. s/n<40) also adversely affects accurate baseline correction by asymmetrically weighted regression. We present a baseline estimation method that leverages a probabilistic peak detection algorithm. A posterior probability of being affected by a peak is computed for each point in the chromatogram, leading to a set of weights that allow non-iterative calculation of a baseline estimate. For extremely saturated chromatograms, the peak weighted (PW) method demonstrates notable improvement compared to the other methods examined. However, in chromatograms characterized by low-noise and well-resolved peaks, the asymmetric least squares (ALS) and the more sophisticated Mixture Model (MM) approaches achieve superior results in significantly less time. We evaluate the performance of these three baseline correction methods over a range of chromatographic conditions to demonstrate the cases in which each method is most appropriate. Copyright © 2016 Elsevier B.V. All rights reserved.
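
    The core of a peak-weighted baseline estimate can be sketched as a single weighted regression, assuming peak posterior probabilities are already available (the polynomial baseline model and parameter names below are illustrative, not the authors' exact formulation):

        import numpy as np

        def peak_weighted_baseline(x, y, p_peak, degree=3):
            # Points likely to belong to peaks get low weight, so one
            # non-iterative weighted fit yields the baseline estimate.
            w = 1.0 - p_peak  # posterior P(point affected by a peak)
            # np.polyfit multiplies residuals by w before squaring, so pass
            # sqrt to obtain effective squared-error weights equal to w.
            coeffs = np.polyfit(x, y, degree, w=np.sqrt(w))
            return np.polyval(coeffs, x)

        # Illustrative use: subtract the estimate to correct the drift.
        x = np.linspace(0.0, 10.0, 500)
        y = 0.2 * x + np.exp(-((x - 5.0) ** 2) / 0.05)   # drift + one peak
        p_peak = np.exp(-((x - 5.0) ** 2) / 0.1)          # assumed posteriors
        corrected = y - peak_weighted_baseline(x, y, p_peak)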

  19. Treatment decisions based on scalar and functional baseline covariates.

    PubMed

    Ciarleglio, Adam; Petkova, Eva; Ogden, R Todd; Tarpey, Thaddeus

    2015-12-01

    The amount and complexity of patient-level data being collected in randomized controlled trials offer both opportunities and challenges for developing personalized rules for assigning treatment for a given disease or ailment. For example, trials examining treatments for major depressive disorder are not only collecting typical baseline data such as age, gender, or scores on various tests, but also data that measure the structure and function of the brain such as images from magnetic resonance imaging (MRI), functional MRI (fMRI), or electroencephalography (EEG). These latter types of data have an inherent structure and may be considered as functional data. We propose an approach that uses baseline covariates, both scalars and functions, to aid in the selection of an optimal treatment. In addition to providing information on which treatment should be selected for a new patient, the estimated regime has the potential to provide insight into the relationship between treatment response and the set of baseline covariates. Our approach can be viewed as an extension of "advantage learning" to include both scalar and functional covariates. We describe our method and how to implement it using existing software. Empirical performance of our method is evaluated with simulated data in a variety of settings and also applied to data arising from a study of patients with major depressive disorder from whom baseline scalar covariates as well as functional data from EEG are available. © 2015, The International Biometric Society.

  20. Association between Carotid Plaque Characteristics and Cerebral White Matter Lesions: One-Year Follow-Up Study by MRI

    PubMed Central

    Kwee, Robert M.; Hofman, Paul A. M.; Gronenschild, Ed H. B. M.; van Oostenbrugge, Robert J.; Mess, Werner H.; Berg, Johannes W. M. ter.; Franke, Cees L.; Korten, Arthur G. G. C.; Meems, Bé J.; van Engelshoven, Jos M. A.; Wildberger, Joachim E.; Kooi, M. Eline

    2011-01-01

    Objective To prospectively assess the relation between carotid plaque characteristics and the development of new cerebral white matter lesions (WMLs) at MRI. Methods Fifty TIA/stroke patients with ipsilateral 30–69% carotid stenosis underwent MRI of the plaque at baseline. Total plaque volume and markers of vulnerability to thromboembolism (lipid-rich necrotic core [LRNC] volume, fibrous cap [FC] status, and presence of intraplaque hemorrhage [IPH]) were assessed. All patients also underwent brain MRI at baseline and after one year. Ipsilateral cerebral WMLs were quantified with a semiautomatic method. Results Mean WML volume significantly increased over a one-year period (6.52 vs. 6.97 mm3, P = 0.005). WML volume at baseline and WML progression did not significantly differ (P>0.05) between patients with 30–49% and patients with 50–69% stenosis. There was a significant correlation between total plaque volume and baseline ipsilateral WML volume (Spearman ρ = 0.393, P = 0.005). There was no significant correlation between total plaque volume and ipsilateral WML progression. There were no significant associations between LRNC volume and WML volume at baseline and WML progression. WML volume at baseline and WML progression did not significantly differ between patients with a thick and intact FC and patients with a thin and/or ruptured FC. WML volume at baseline and WML progression also did not significantly differ between patients with and without IPH. Conclusion The results of this study indicate that carotid plaque burden is significantly associated with WML severity, but that there is no causal relationship between carotid plaque vulnerability and the occurrence of WMLs. PMID:21347225

  1. Artificial Intelligence (AI) Based Tactical Guidance for Fighter Aircraft

    NASA Technical Reports Server (NTRS)

    McManus, John W.; Goodrich, Kenneth H.

    1990-01-01

    A research program investigating the use of Artificial Intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range (WVR) air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS), a second generation TDG, is presented. The Knowledge-Based Systems used by CLAWS to aid in the tactical decision-making process are outlined in detail, and the results of tests to evaluate the performance of CLAWS versus a baseline TDG developed in FORTRAN to run in real-time in the Langley Differential Maneuvering Simulator (DMS), are presented. To date, these test results have shown significant performance gains with respect to the TDG baseline in one-versus-one air combat engagements, and the AI-based TDG software has proven to be much easier to modify and maintain than the baseline FORTRAN TDG programs. Alternate computing environments and programming approaches, including the use of parallel algorithms and heterogeneous computer networks are discussed, and the design and performance of a prototype concurrent TDG system are presented.

  2. Premorbid determinants of left ventricular dysfunction in a novel model of gradually induced pressure overload in the adult canine

    NASA Technical Reports Server (NTRS)

    Koide, M.; Nagatsu, M.; Zile, M. R.; Hamawaki, M.; Swindle, M. M.; Keech, G.; DeFreyte, G.; Tagawa, H.; Cooper, G. 4th; Carabello, B. A.

    1997-01-01

    BACKGROUND: When a pressure overload is placed on the left ventricle, some patients develop relatively modest hypertrophy whereas others develop extensive hypertrophy. Likewise, the occurrence of contractile dysfunction also is variable. The cause of this heterogeneity is not well understood. METHODS AND RESULTS: We recently developed a model of gradual proximal aortic constriction in the adult canine that mimicked the heterogeneity of the hypertrophic response seen in humans. We hypothesized that differences in outcome were related to differences present before banding. Fifteen animals were studied initially. Ten developed left ventricular dysfunction (dys group). Five dogs maintained normal function (nl group). At baseline, the nl group had a lower mean systolic wall stress (96 ± 9 kdyne/cm²; dys group, 156 ± 7 kdyne/cm²; P < .0002) and greater relative left ventricular mass (left ventricular weight [g]/body wt [kg], 5.1 ± 0.36; dys group, 3.9 ± 0.26; P < .02). On the basis of differences in mean systolic wall stress at baseline, we predicted outcome in the next 28 dogs by using a cutoff of 115 kdyne/cm². Eighteen of 20 dogs with baseline mean systolic stress > 115 kdyne/cm² developed dysfunction whereas 6 of 8 dogs with resting stress ≤ 115 kdyne/cm² maintained normal function. CONCLUSIONS: We conclude that this canine model mimicked the heterogeneous hypertrophic response seen in humans. In the group that eventually developed dysfunction there was less cardiac mass despite 60% higher wall stress at baseline, suggesting a different set point for regulating myocardial growth in the two groups.

  3. Overcoming Recruitment Challenges of Web-based Interventions for Tobacco Use: The Case of Web-based Acceptance and Commitment Therapy for Smoking Cessation

    PubMed Central

    Heffner, Jaimee L; Wyszynski, Christopher M; Comstock, Bryan; Mercer, Laina D.; Bricker, Jonathan

    2013-01-01

    Web-based behavioral interventions for substance use are being developed at a rapid pace, yet there is a dearth of information regarding the most effective methods for recruiting participants into web-based intervention trials. In this paper, we describe our successful recruitment of participants into a pilot trial of web-based Acceptance and Commitment Therapy (ACT) for smoking cessation and compare traditional and web-based methods of recruitment in terms of their effects on baseline participant characteristics, association with study retention and treatment outcome, yield, and cost-effectiveness. Over a 10-week period starting June 15, 2010, we recruited 222 smokers for a web-based smoking cessation study using a variety of recruitment methods. The largest portion of randomized participants were recruited through Google AdWords (36%), followed by medical Internet media (23%), standard media (14%), word of mouth (12%), broadcast emails (11%), and social media (6%). Recruitment source was not related to baseline participant characteristics, 3-month data retention, or 30-day point prevalence smoking abstinence at the 3-month outcome assessment. Cost per randomized participant ranged from $5.27/participant for word of mouth to $172.76/participant for social media, with a mean cost of $42.48/participant. Our diversified approach to recruitment, including both traditional and web-based methods, enabled timely enrollment of participants into the study. Because there was no evidence of a substantive difference in baseline characteristics, retention, or outcomes based on recruitment channel, the yield and cost-effectiveness of recruitment methods may be the more critical considerations in developing a feasible recruitment plan for a web-based smoking cessation intervention study. PMID:23770645

  4. Overcoming recruitment challenges of web-based interventions for tobacco use: the case of web-based acceptance and commitment therapy for smoking cessation.

    PubMed

    Heffner, Jaimee L; Wyszynski, Christopher M; Comstock, Bryan; Mercer, Laina D; Bricker, Jonathan

    2013-10-01

    Web-based behavioral interventions for substance use are being developed at a rapid pace, yet there is a dearth of information regarding the most effective methods for recruiting participants into web-based intervention trials. In this paper, we describe our successful recruitment of participants into a pilot trial of web-based Acceptance and Commitment Therapy (ACT) for smoking cessation and compare traditional and web-based methods of recruitment in terms of their effects on baseline participant characteristics, association with study retention and treatment outcome, yield, and cost-effectiveness. Over a 10-week period starting June 15, 2010, we recruited 222 smokers for a web-based smoking cessation study using a variety of recruitment methods. The largest portion of randomized participants were recruited through Google AdWords (36%), followed by medical Internet media (23%), standard media (14%), word of mouth (12%), broadcast emails (11%), and social media (6%). Recruitment source was not related to baseline participant characteristics, 3-month data retention, or 30-day point prevalence smoking abstinence at the 3-month outcome assessment. Cost per randomized participant ranged from $5.27/participant for word of mouth to $172.76/participant for social media, with a mean cost of $42.48/participant. Our diversified approach to recruitment, including both traditional and web-based methods, enabled timely enrollment of participants into the study. Because there was no evidence of a substantive difference in baseline characteristics, retention, or outcomes based on recruitment channel, the yield and cost-effectiveness of recruitment methods may be the more critical considerations in developing a feasible recruitment plan for a web-based smoking cessation intervention study. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Method and apparatus for reliable inter-antenna baseline determination

    NASA Technical Reports Server (NTRS)

    Wilson, John M. (Inventor)

    2001-01-01

    Disclosed is a method for inter-antenna baseline determination that uses an antenna configuration comprising a pair of relatively closely spaced antennas and other pairs of distant antennas. The closely spaced pair provides a short baseline having an integer ambiguity that may be searched exhaustively to identify the correct set of integers. This baseline is then used as a priori information to aid the determination of longer baselines that, once determined, may be used for accurate run time attitude determination.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jianfei; Wang, Shijun; Turkbey, Evrim B.

    Purpose: Renal calculi are common extracolonic incidental findings on computed tomographic colonography (CTC). This work aims to develop a fully automated computer-aided diagnosis system to accurately detect renal calculi on CTC images. Methods: The authors developed a total variation (TV) flow method to reduce image noise within the kidneys while maintaining the characteristic appearance of renal calculi. Maximally stable extremal region (MSER) features were then calculated to robustly identify calculi candidates. Finally, the authors computed texture and shape features that were imported to support vector machines for calculus classification. The method was validated on a dataset of 192 patients and compared to a baseline approach that detects calculi by thresholding. The authors also compared their method with the detection approaches using anisotropic diffusion and nonsmoothing. Results: At a false positive rate of 8 per patient, the sensitivities of the new method and the baseline thresholding approach were 69% and 35% (p < 0.001) on all calculi from 1 to 433 mm³ in the testing dataset. The sensitivities of the detection methods using anisotropic diffusion and nonsmoothing were 36% and 0%, respectively. The sensitivity of the new method increased to 90% if only larger and more clinically relevant calculi were considered. Conclusions: Experimental results demonstrated that TV-flow and MSER features are efficient means to robustly and accurately detect renal calculi on low-dose, high noise CTC images. Thus, the proposed method can potentially improve diagnosis.
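
    The first two stages of such a pipeline can be sketched with standard libraries (scikit-image total-variation denoising as a stand-in for the authors' TV flow, and OpenCV's MSER; the SVM classification stage is omitted, and the weight parameter is an assumption):

        import cv2
        import numpy as np
        from skimage.restoration import denoise_tv_chambolle

        def detect_calculus_candidates(ct_slice):
            # Stage 1: total-variation smoothing suppresses noise while
            # preserving the sharp, bright appearance of calculi.
            smooth = denoise_tv_chambolle(ct_slice.astype(float), weight=0.1)
            # Rescale to 8-bit for the OpenCV MSER detector.
            img8 = cv2.normalize(smooth, None, 0, 255,
                                 cv2.NORM_MINMAX).astype(np.uint8)
            # Stage 2: MSER finds stable bright blobs as candidates.
            mser = cv2.MSER_create()
            regions, bboxes = mser.detectRegions(img8)
            return regions, bboxes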

  7. AGM-88E Advanced Anti-Radiation Guided Missile (AGM-88E AARGM)

    DTIC Science & Technology

    2015-12-01

    [Table residue from the original report: Selected Acquisition Report (SAR) cost summary in then-year $M, Acquisition Program Baseline (APB) breach confidence levels, and changes from the SAR baseline to the current SAR baseline in PAUC development and production estimates (Econ/Qty/Sch/Eng/Est/Oth/Spt categories). A surviving text fragment notes that the estimate assumes normal conditions, with average levels of technical, schedule, and programmatic risk and external interference.]

  8. Determination of human body burden baseline data of platinum through autopsy tissue analysis.

    PubMed Central

    Vandiver, F; Duffield, F V; Yoakum, A; Bumgarner, J; Moran, J

    1976-01-01

    Results of analysis for platinum in 97 autopsy sets are presented. Analysis was performed by a specially developed emission spectrochemical method. Almost half of the individuals studied were found to have detectable platinum in one or more tissue samples. Platinum was found to be deposited in 13 of 21 tissue types investigated. Surprisingly high values were observed in subcutaneous fat, previously not considered to be a target site for platinum deposition. These data will serve as a human tissue platinum burden baseline in EPA's Catalyst Research Program. PMID:1001291

  9. Comparison of demons deformable registration-based methods for texture analysis of serial thoracic CT scans

    NASA Astrophysics Data System (ADS)

    Cunliffe, Alexandra R.; Al-Hallaq, Hania A.; Fei, Xianhan M.; Tuohy, Rachel E.; Armato, Samuel G.

    2013-02-01

    To determine how 19 image texture features may be altered by three image registration methods, "normal" baseline and follow-up computed tomography (CT) scans from 27 patients were analyzed. Nineteen texture feature values were calculated in over 1,000 32x32-pixel regions of interest (ROIs) randomly placed in each baseline scan. All three methods used demons registration to map baseline scan ROIs to anatomically matched locations in the corresponding transformed follow-up scan. For the first method, the follow-up scan transformation was subsampled to achieve a voxel size identical to that of the baseline scan. For the second method, the follow-up scan was transformed through affine registration to achieve global alignment with the baseline scan. For the third method, the follow-up scan was directly deformed to the baseline scan using demons deformable registration. Feature values in matched ROIs were compared using Bland-Altman 95% limits of agreement. For each feature, the range spanned by the 95% limits was normalized to the mean feature value to obtain the normalized range of agreement, nRoA. Wilcoxon signed-rank tests were used to compare nRoA values across features for the three methods. Significance for individual tests was adjusted using the Bonferroni method. nRoA was significantly smaller for affine-registered scans than for the resampled scans (p=0.003), indicating lower feature value variability between baseline and follow-up scan ROIs using this method. For both of these methods, however, nRoA was significantly higher than when feature values were calculated directly on demons-deformed follow-up scans (p<0.001). Across features and methods, nRoA values remained below 26%.
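
    The agreement statistic itself is straightforward to compute; a short sketch assuming paired arrays of matched ROI feature values:

        import numpy as np

        def normalized_range_of_agreement(baseline_vals, followup_vals):
            # Bland-Altman 95% limits of agreement on the paired differences,
            # with the spanned range normalized to the mean feature value
            # (nRoA, expressed in percent).
            diff = followup_vals - baseline_vals
            lo = diff.mean() - 1.96 * diff.std(ddof=1)
            hi = diff.mean() + 1.96 * diff.std(ddof=1)
            mean_feature = np.mean(np.concatenate([baseline_vals, followup_vals]))
            return 100.0 * (hi - lo) / mean_feature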

  10. The Construction and Validation of All-Atom Bulk-Phase Models of Amorphous Polymers Using the TIGER2/TIGER3 Empirical Sampling Method

    PubMed Central

    Li, Xianfeng; Murthy, Sanjeeva; Latour, Robert A.

    2011-01-01

    A new empirical sampling method termed “temperature intervals with global exchange of replicas and reduced radii” (TIGER3) is presented and demonstrated to efficiently equilibrate entangled long-chain molecular systems such as amorphous polymers. The TIGER3 algorithm is a replica exchange method in which simulations are run in parallel over a range of temperature levels at and above a designated baseline temperature. The replicas sampled at temperature levels above the baseline are run through a series of cycles with each cycle containing four stages – heating, sampling, quenching, and temperature level reassignment. The method allows chain segments to pass through one another at elevated temperature levels during the sampling stage by reducing the van der Waals radii of the atoms, thus eliminating chain entanglement problems. Atomic radii are then returned to their regular values and re-equilibrated at elevated temperature prior to quenching to the baseline temperature. Following quenching, replicas are compared using a Metropolis Monte Carlo exchange process for the construction of an approximate Boltzmann-weighted ensemble of states and then reassigned to the elevated temperature levels for additional sampling. Further system equilibration is performed by periodic implementation of the previously developed TIGER2 algorithm between cycles of TIGER3, which applies thermal cycling without radii reduction. When coupled with a coarse-grained modeling approach, the combined TIGER2/TIGER3 algorithm yields fast equilibration of bulk-phase models of amorphous polymer, even for polymers with complex, highly branched structures. The developed method was tested by modeling the polyethylene melt. The calculated properties of chain conformation and chain segment packing agreed well with published data. The method was also applied to generate equilibrated structural models of three increasingly complex amorphous polymer systems: poly(methyl methacrylate), poly(butyl methacrylate), and DTB-succinate copolymer. Calculated glass transition temperature (Tg) and structural parameter profile (S(q)) for each resulting polymer model were found to be in close agreement with experimental Tg values and structural measurements obtained by x-ray diffraction, thus validating that the developed methods provide realistic models of amorphous polymer structure. PMID:21769156
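
    The baseline-temperature exchange test at the heart of such replica methods can be written as a standard Metropolis criterion; this is a simplification, and the full TIGER2/TIGER3 cycle bookkeeping and the unit convention below are assumptions:

        import math
        import random

        K_B = 0.0019872  # Boltzmann constant, kcal/(mol K)

        def accept_as_baseline(e_candidate, e_baseline, t_baseline):
            # After a replica is quenched to the baseline temperature, it
            # replaces the current baseline state with probability
            # min(1, exp(-dE / (k_B * T_base))), preserving an approximate
            # Boltzmann-weighted ensemble at the baseline temperature.
            d_e = e_candidate - e_baseline
            return d_e <= 0 or random.random() < math.exp(-d_e / (K_B * t_baseline))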

  11. Introducing gender equity to adolescent school children: A mixed methods' study.

    PubMed

    Syed, Saba

    2017-01-01

    Over the past decade, gender equality and women's empowerment have been explicitly recognized as key not only to the health of nations but also to social and economic development. The aim of the present study was to assess the effectiveness of a mixed-methods participatory group education approach to introducing gender equity to adolescent school children. It also assessed baseline and postintervention knowledge, attitudes, and practices regarding gender equity and sexual and reproductive health among adolescent students in government-aided schools, and, finally, compared pre- and post-intervention gender equitable (GE) attitudes among the study participants. A government-aided school was selected by nonprobabilistic intentional sampling. On 5 predesignated days, willing students took part in the intervention, which included a pretest, a group-education-based participatory mixed-methods intervention, and a posttest assessment. A total of 186 students participated in the study. Girls had better baseline GE scores than boys and also improved more over their baseline scores following the intervention. The present mixed-methods approach to introducing gender equity to adolescent school children through a group-education-based intervention proved effective in initiating dialog and sensitizing adolescents to gender equity and violence within a school setting.

  12. Decreased N-Acetyl Aspartate/Myo-Inositol Ratio in the Posterior Cingulate Cortex Shown by Magnetic Resonance Spectroscopy May Be One of the Risk Markers of Preclinical Alzheimer’s Disease: A 7-Year Follow-Up Study

    PubMed Central

    Waragai, Masaaki; Moriya, Masaru; Nojo, Takeshi

    2017-01-01

    Although molecular positron emission tomography imaging of amyloid and tau proteins can facilitate the detection of preclinical Alzheimer’s disease (AD) pathology, it is not useful in clinical practice. More practical surrogate markers for preclinical AD would provide valuable tools. Thus, we sought to validate the utility of conventional magnetic resonance spectroscopy (MRS) as a screening method for preclinical AD. A total of 289 older participants who were cognitively normal at baseline were clinically followed up for analysis of MRS metabolites, including N-acetyl aspartate (NAA) and myo-inositol (MI) in the posterior cingulate cortex (PCC) for 7 years. The 289 participants were retrospectively divided into five groups 7 years after baseline: 200 (69%) remained cognitively normal; 53 (18%) developed mild cognitive impairment (MCI); 21 (7%) developed AD; eight (2%) developed Parkinson’s disease with normal cognition, and seven (2%) developed dementia with Lewy bodies (DLB). The NAA/MI ratios of the PCC in the AD, MCI, and DLB groups were significantly decreased compared with participants who maintained normal cognition from baseline to 7 years after baseline. MMSE scores 7 years after baseline were significantly correlated with MI/Cr and NAA/MI ratios in the PCC. These results suggest that cognitively normal elderly subjects with low NAA/MI ratios in the PCC might be at risk of progression to clinical AD. Thus, the NAA/MI ratio in the PCC measured with conventional 1H MRS should be reconsidered as a possible adjunctive screening marker of preclinical AD in clinical practice. PMID:28968236

  13. Simple automatic strategy for background drift correction in chromatographic data analysis.

    PubMed

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-03

    Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
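
    A compact sketch of the strategy, with the outlier-recognition rule simplified to a z-score test (the paper's iterative update differs in detail; thresholds here are assumptions):

        import numpy as np
        from scipy.signal import argrelmin

        def background_drift(y, n_iter=20, z=2.0):
            y = np.asarray(y, dtype=float)
            # 1) Candidate baseline points: local minima of the chromatogram.
            idx = argrelmin(y, order=3)[0]
            vals = y[idx]
            keep = np.ones(len(idx), dtype=bool)
            # 2) Iteratively discard minima that sit on peaks (outliers well
            #    above the remaining baseline points) until convergence.
            for _ in range(n_iter):
                mu, sd = vals[keep].mean(), vals[keep].std()
                new_keep = vals <= mu + z * sd
                if np.array_equal(new_keep, keep):
                    break
                keep = new_keep
            # 3) Expand the optimized baseline vector over the full signal
            #    by linear interpolation.
            x = np.arange(len(y))
            return np.interp(x, idx[keep], vals[keep])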

  14. Risk factors associated with the development of overt nephropathy in type 2 diabetes patients: A 12 years observational study

    PubMed Central

    Viswanathan, Vijay; Tilak, Priyanka; Kumpatla, Satyavani

    2012-01-01

    Background & objectives: Diabetic nephropathy (DN) is the leading cause of chronic kidney disease and end-stage renal disease in developing countries. Early detection and risk reduction measures can prevent DN. The aim of the study was to determine the risk factors for the development of proteinuria over a period of 12 years of follow up in normoalbuminuric type 2 diabetes patients attending a specialized centre. Methods: Of the 2630 type 2 diabetes subjects newly registered in 1996, 152 (M:F;92:60) normoalbuminuric subjects had baseline and subsequent measurements of anthropometric, haemodynamic and biochemical details spanning 12 years. The subjects were divided into 2 groups based on the renal status during follow up visits. Group 1 (non-progressors) had persistent normoalbuminuria and group 2 (progressors) had persistent proteinuria. Presence of other diabetic complications during follow up and details on antidiabetic and antihypertensive agents were noted. Results: During median follow up of 11 years in subjects with normal renal function at baseline, 44.1 per cent developed proteinuria at follow up. Glucose levels, HbA1c, systolic blood pressure (SBP), triglycerides, and urea levels were significantly higher at baseline among progressors than non-progressors. Progressors had a longer duration of diabetes and significant fall in estimated glomerular filtration rate (eGFR) levels at follow up. In Cox's regression analysis, baseline age, duration of diabetes, baseline HbA1c and mean values of HbA1c, triglycerides, SBP and presence of retinopathy showed significant association with the development of macroalbuminuria. Interpretation & conclusions: Type 2 diabetes patients with uncontrolled diabetes and increase in blood pressure are at high risk of developing nephropathy. Age, long duration of diabetes, elevated BP, poor glycaemic control and presence of retinopathy were significantly associated with the progression of diabetic nephropathy. PMID:22885263

  15. A method for analyzing clustered interval-censored data based on Cox's model.

    PubMed

    Kor, Chew-Teng; Cheng, Kuang-Fu; Chen, Yi-Hau

    2013-02-28

    Methods for analyzing interval-censored data are well established. Unfortunately, these methods are inappropriate for studies with correlated data. In this paper, we focus on developing a method for analyzing clustered interval-censored data. Our method is based on Cox's proportional hazards model with a piecewise-constant baseline hazard function. The correlation structure of the data can be modeled by using Clayton's copula or an independence model with proper adjustment in the covariance estimation. We establish estimating equations for the regression parameters and baseline hazards (and a copula parameter) simultaneously. Simulation results confirm that the point estimators follow a multivariate normal distribution, and our proposed variance estimations are reliable. In particular, we found that the approach with the independence model worked well even when the true correlation model was derived from Clayton's copula. We applied our method to a family-based cohort study of pandemic H1N1 influenza in Taiwan during 2009-2010. Using the proposed method, we investigate the impact of vaccination and family contacts on the incidence of pH1N1 influenza. Copyright © 2012 John Wiley & Sons, Ltd.
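
    Writing the model down makes the structure concrete; as a notational sketch (the symbols and cut points below are chosen here, not taken from the paper), with cut points 0 = t_0 < t_1 < ... < t_K the hazard is

        \lambda(t \mid x) = \lambda_k \exp(\beta^\top x),
            \qquad t \in (t_{k-1}, t_k], \quad k = 1, \dots, K,

    so the survival function needed for an observation known only to fall in an interval (l, r] is

        S(t \mid x) = \exp\!\Big( -e^{\beta^\top x} \sum_{k=1}^{K}
            \lambda_k \, \max\big(0, \min(t, t_k) - t_{k-1}\big) \Big),
        \qquad P\{\, T \in (l, r] \mid x \,\} = S(l \mid x) - S(r \mid x).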

  16. Determination of patellofemoral pain sub-groups and development of a method for predicting treatment outcome using running gait kinematics.

    PubMed

    Watari, Ricky; Kobsar, Dylan; Phinyomark, Angkoon; Osis, Sean; Ferber, Reed

    2016-10-01

    Not all patients with patellofemoral pain exhibit successful outcomes following exercise therapy. Thus, the ability to identify patellofemoral pain subgroups related to treatment response is important for the development of optimal therapeutic strategies to improve rehabilitation outcomes. The purpose of this study was to use baseline running gait kinematic and clinical outcome variables to retrospectively classify patellofemoral pain patients by treatment response. Forty-one individuals with patellofemoral pain who underwent a 6-week exercise intervention program were sub-grouped as treatment Responders (n=28) and Non-responders (n=13) based on self-reported measures of pain and function. Baseline three-dimensional running kinematics and self-reported measures underwent a linear discriminant analysis of the principal components of the variables to retrospectively classify participants based on treatment response. The significance of the discriminant function was verified with a Wilks' lambda test (α=0.05). The model selected 2 gait principal components and had a 78.1% classification accuracy. Overall, Non-responders exhibited greater ankle dorsiflexion, knee abduction, and hip flexion during the swing phase and greater ankle inversion during the stance phase, compared to Responders. This is the first study to investigate an objective method that uses baseline kinematic and self-reported outcome variables to classify patellofemoral pain patients by treatment outcome. This study represents a significant first step toward a method to help clinicians make evidence-informed decisions regarding optimal treatment strategies for patients with patellofemoral pain. Copyright © 2016 Elsevier Ltd. All rights reserved.
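
    The analysis pattern (discriminant analysis on principal components) can be sketched with scikit-learn; the component count and validation scheme below are assumptions, not the study's settings:

        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline

        def responder_classifier(X, y, n_components=2):
            # X: (n_patients, n_features) baseline gait + self-report variables
            # y: 1 = responder, 0 = non-responder
            model = make_pipeline(PCA(n_components=n_components),
                                  LinearDiscriminantAnalysis())
            acc = cross_val_score(model, X, y, cv=5).mean()  # rough accuracy check
            return model.fit(X, y), acc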

  17. Moving an asteroid with electric solar wind sail

    NASA Astrophysics Data System (ADS)

    Merikallio, S.; Janhunen, P.

    2010-12-01

    The electric solar wind sail (E-Sail) is a new propulsion method for interplanetary travel which was invented in 2006 and is currently under development. The E-Sail uses charged tethers to extract momentum from the solar wind particles to obtain propulsive thrust. According to current estimates, the E-Sail is 2-3 orders of magnitude better than traditional propulsion methods (chemical rockets and ion engines) in terms of produced lifetime-integrated impulse per propulsion system mass. Here we analyze the problem of using the E-Sail for directly deflecting an Earth-threatening asteroid. The problem then culminates in how to attach the E-Sail device to the asteroid. We assess alternative attachment strategies, namely straightforward direct towing with a cable and the gravity tractor method, which works for a wider variety of situations. We also consider possible techniques to scale up the E-Sail force beyond the baseline one Newton level to deal with more imminent or larger asteroid or cometary threats. As a baseline case we consider an asteroid of effective diameter of 140 m and mass of 3 million tons, which can be deflected with a baseline 1 N E-Sail within 10 years. With a 5 N E-Sail the deflection could be achieved in 5 years. Once developed, the E-Sail would appear to provide a safe and reasonably low-cost way of deflecting dangerous asteroids and other heavenly bodies in cases where the collision threat becomes known several years in advance.
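
    The quoted timescales are consistent with a crude constant-thrust, straight-line estimate (ignoring orbital dynamics, which generally help by converting along-track delta-v into a growing position offset):

        YEAR = 3.156e7   # seconds per year
        MASS = 3.0e9     # kg (3 million tons)

        for force, years in ((1.0, 10), (5.0, 5)):
            t = years * YEAR
            dv = force * t / MASS                    # accumulated delta-v (m/s)
            drift = 0.5 * (force / MASS) * t ** 2    # straight-line displacement (m)
            print(f"{force:.0f} N for {years} yr: "
                  f"dv = {dv:.3f} m/s, drift ~ {drift / 1e3:.0f} km")

    Both cases accumulate a displacement of well over one Earth radius (roughly 6,400 km), which is the natural scale for a miss distance under this simplified model.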

  18. Increase of EEG Spectral Theta Power Indicates Higher Risk of the Development of Severe Cognitive Decline in Parkinson’s Disease after 3 Years

    PubMed Central

    Cozac, Vitalii V.; Chaturvedi, Menorca; Hatz, Florian; Meyer, Antonia; Fuhr, Peter; Gschwandtner, Ute

    2016-01-01

    Objective: We investigated quantitative electroencephalography (qEEG) and clinical parameters as potential risk factors of severe cognitive decline in Parkinson’s disease. Methods: We prospectively investigated 37 patients with Parkinson’s disease at baseline and follow-up (after 3 years). Patients had no severe cognitive impairment at baseline. We used a summary score of cognitive tests as the outcome at follow-up. At baseline we assessed motor, cognitive, and psychiatric factors; qEEG variables [global relative median power (GRMP) spectra] were obtained by a fully automated processing of high-resolution EEG (256-channels). We used linear regression models with calculation of the explained variance to evaluate the relation of baseline parameters with cognitive deterioration. Results: The following baseline parameters significantly predicted severe cognitive decline: GRMP theta (4–8 Hz), cognitive task performance in executive functions and working memory. Conclusions: Combination of neurocognitive tests and qEEG improves identification of patients with higher risk of cognitive decline in PD. PMID:27965571

  19. Investigation of direct solar-to-microwave energy conversion techniques

    NASA Technical Reports Server (NTRS)

    Chatterton, N. E.; Mookherji, T. K.; Wunsch, P. K.

    1978-01-01

    Alternative methods of producing microwave energy from solar radiation, for purposes of directing power to the Earth from space, are identified. Specifically, methods of converting optical radiation into microwave radiation by the most direct means are investigated. Approaches based on demonstrated device functioning and basic phenomenologies are developed. No system concept was developed that is competitive with current baseline concepts. The most direct methods of conversion appear to require an initial step of producing coherent laser radiation. Other methods generally require the production of electron streams for use in solid-state or cavity-oscillator systems. Further development is suggested as worthwhile for the proposed devices and for concepts utilizing a free-electron stream as the intra-space-station power transport mechanism.

  20. Absolute-length determination of a long-baseline Fabry-Perot cavity by means of resonating modulation sidebands.

    PubMed

    Araya, A; Telada, S; Tochikubo, K; Taniguchi, S; Takahashi, R; Kawabe, K; Tatsumi, D; Yamazaki, T; Kawamura, S; Miyoki, S; Moriwaki, S; Musha, M; Nagano, S; Fujimoto, M K; Horikoshi, K; Mio, N; Naito, Y; Takamori, A; Yamamoto, K

    1999-05-01

    A new method has been demonstrated for absolute-length measurements of a long-baseline Fabry-Perot cavity by use of phase-modulated light. This method is based on determination of a free spectral range (FSR) of the cavity from the frequency difference between a carrier and phase-modulation sidebands, both of which resonate in the cavity. Sensitive response of the Fabry-Perot cavity near resonant frequencies ensures accurate determination of the FSR and thus of the absolute length of the cavity. This method was applied to a 300-m Fabry-Perot cavity of the TAMA gravitational wave detector that is being developed at the National Astronomical Observatory, Tokyo. With a modulation frequency of approximately 12 MHz, we successfully determined the absolute cavity length with resolution of 1 μm (3 × 10⁻⁹ in strain) and observed local ground strain variations of 6 × 10⁻⁸.
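
    The length recovery itself is simple arithmetic once a sideband is locked to a cavity resonance; in the sketch below the measured frequency is a hypothetical placeholder, not a value from the paper:

        C = 299_792_458.0                  # speed of light (m/s)

        nominal_length = 300.0             # m
        fsr = C / (2.0 * nominal_length)   # free spectral range ~ 499.654 kHz

        # A phase-modulation sideband resonates when f_mod = n * FSR; the
        # ~12 MHz modulation used here corresponds to n = 24.
        n = round(12.0e6 / fsr)            # -> 24

        # Given the measured resonant sideband frequency, the absolute
        # length follows directly (f_measured is illustrative only):
        f_measured = 11_991_698.3          # Hz, hypothetical locked value
        absolute_length = n * C / (2.0 * f_measured)
        print(n, absolute_length)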

  1. Monitoring urban land cover change by updating the national land cover database impervious surface products

    USGS Publications Warehouse

    Xian, George Z.; Homer, Collin G.

    2009-01-01

    The U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 is widely used as a baseline for national land cover and impervious conditions. To ensure timely and relevant data, it is important to update this base to a more recent time period. A prototype method was developed to update the land cover and impervious surface by individual Landsat path and row. This method updates NLCD 2001 to a nominal date of 2006 by using both Landsat imagery and data from NLCD 2001 as the baseline. Pairs of Landsat scenes in the same season from both 2001 and 2006 were acquired according to satellite paths and rows and normalized to allow calculation of change vectors between the two dates. Conservative thresholds based on Anderson Level I land cover classes were used to segregate the change vectors and determine areas of change and no-change. Once change areas had been identified, impervious surface was estimated for areas of change by sampling from NLCD 2001 in unchanged areas. Methods were developed and tested across five Landsat path/row study sites that contain a variety of metropolitan areas. Results from the five study areas show that the vast majority of impervious surface changes associated with urban developments were accurately captured and updated. The approach optimizes mapping efficiency and can provide users a flexible method to generate updated impervious surface at national and regional scales.
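
    The change-vector step reduces to a per-pixel spectral distance plus a threshold; the sketch below uses one global threshold where the method applies conservative per-class (Anderson Level I) thresholds, so the array names and threshold are illustrative:

        import numpy as np

        def change_mask(scene_2001, scene_2006, threshold):
            # Per-pixel magnitude of the spectral difference between two
            # normalized, same-season Landsat scenes, shape (bands, rows, cols).
            cv = np.linalg.norm(scene_2006.astype(float) -
                                scene_2001.astype(float), axis=0)
            return cv > threshold  # True where land cover likely changed;
                                   # elsewhere, NLCD 2001 labels are carried over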

  2. Task 2 Report: Algorithm Development and Performance Analysis

    DTIC Science & Technology

    1993-07-01

    [Extraction residue from the report's list of figures and body text: the figure captions describe example GC data for Schedule 3 phosphites, comparing an integration method that more closely follows the baseline with one that produces unwanted results; a surviving body fragment notes that correlated chromatography can resolve much of the ambiguity that arises in GC/MS analysis of trace environmental samples.]

  3. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    PubMed Central

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-01-01

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. Either targets or virtual points corresponding to reconstructable features in the scene are used as feature points. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis. PMID:28029121

  5. Comparison of fluorescence rejection methods of baseline correction and shifted excitation Raman difference spectroscopy

    NASA Astrophysics Data System (ADS)

    Cai, Zhijian; Zou, Wenlong; Wu, Jianhong

    2017-10-01

    Raman spectroscopy has been extensively used in biochemical testing, explosive detection, food additive analysis and environmental pollutant monitoring. However, fluorescence interference poses a serious problem for applications of portable Raman spectrometers. Currently, baseline correction and shifted-excitation Raman difference spectroscopy (SERDS) are the most prevalent fluorescence suppression methods. In this paper, we compared the performance of baseline correction and SERDS, both experimentally and in simulation. The comparison demonstrates that baseline correction can produce an acceptable fluorescence-removed Raman spectrum if the original Raman signal has a good signal-to-noise ratio, but it cannot recover small Raman signals from a large noise background. With the SERDS method, Raman signals can be clearly extracted and the fluorescence background completely rejected, even when the signals are very weak compared with the fluorescence intensity and noise level. The Raman spectrum recovered by SERDS has a good signal-to-noise ratio. This suggests that baseline correction is more suitable for large bench-top Raman systems with better signal quality, while SERDS is more suitable for noisy devices, especially portable Raman spectrometers.
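
    A toy simulation of the SERDS principle described above: the narrow Raman band follows the excitation shift while the broad fluorescence does not, so the difference spectrum cancels the background. All line shapes, shift sizes, and noise levels are illustrative, not from the paper.

    ```python
    # SERDS toy example: two acquisitions with slightly shifted excitation,
    # difference cancels the (excitation-independent) fluorescence background.
    import numpy as np

    rng = np.random.default_rng(1)
    wn = np.linspace(400, 1800, 1400)                      # Raman shift axis, cm^-1
    delta = 10.0                                           # excitation shift, cm^-1
    fluor = 50.0 * np.exp(-((wn - 1100.0) / 900.0) ** 2)   # broad fluorescence

    def measure(peak_center):
        """One acquisition: narrow Raman band + fixed fluorescence + noise."""
        raman = np.exp(-((wn - peak_center) / 6.0) ** 2)
        return raman + fluor + rng.normal(0.0, 0.05, wn.size)

    s1 = measure(1000.0)            # first excitation wavelength
    s2 = measure(1000.0 + delta)    # shifted excitation: Raman moves, fluorescence doesn't
    serds = s1 - s2                 # derivative-like difference, background-free

    # Approximate reconstruction: integrate the difference and rescale by the shift.
    recon = np.cumsum(serds) * (wn[1] - wn[0]) / delta
    ```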

  6. Improving Thermal Dose Accuracy in Magnetic Resonance-Guided Focused Ultrasound Surgery: Long-Term Thermometry Using a Prior Baseline as a Reference

    PubMed Central

    Bitton, Rachel R.; Webb, Taylor D.; Pauly, Kim Butts; Ghanouni, Pejman

    2015-01-01

    Purpose: To investigate thermal dose volume (TDV) and non-perfused volume (NPV) of magnetic resonance-guided focused ultrasound (MRgFUS) treatments in patients with soft tissue tumors, and to describe a method for MR thermal dosimetry using a baseline reference. Materials and Methods: Agreement between TDV and immediate post-treatment NPV was evaluated from MRgFUS treatments of five patients with biopsy-proven desmoid tumors. Thermometry data (gradient echo, 3T) were analyzed over the entire course of the treatments to discern temperature errors in the standard approach. The technique searches previously acquired baseline images for a match using 2D normalized cross-correlation and a weighted mean of phase difference images. Thermal dose maps and TDVs were recalculated using the matched baseline and compared to NPV. Results: TDV and NPV showed between 47% and 91% disagreement when the standard immediate-baseline method was used for calculating TDV. Long-term thermometry showed a nonlinear local temperature accrual, where peak additional temperature varied from 4 to 13°C (mean = 7.8°C) across patients. The prior baseline method could be implemented by finding a previously acquired matching baseline 61% ± 8% (mean ± SD) of the time. We found that 7%–42% of the disagreement between TDV and NPV was due to errors in thermometry caused by heat accrual. For all patients, the prior baseline method increased the estimated treatment volume and reduced the discrepancies between TDV and NPV (P = 0.023). Conclusion: This study presents a mismatch between in-treatment and post-treatment efficacy measures. The prior baseline approach accounts for local heating and improves the accuracy of thermal dose-predicted volume. PMID:26119129
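
    A minimal sketch of the prior-baseline lookup, assuming a library of complex baseline images; the paper's weighted mean of phase-difference images is simplified to a single phase difference, and the phase-to-temperature constant is purely illustrative.

    ```python
    # Match a current complex MR image to the best prior baseline via 2D
    # normalized cross-correlation on magnitude, then convert the phase
    # difference to a temperature-change map (PRF constant is a placeholder).
    import numpy as np

    def ncc(a, b):
        """Zero-lag 2D normalized cross-correlation score."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return (a * b).mean()

    def best_baseline(cur, library):
        scores = [ncc(np.abs(cur), np.abs(base)) for base in library]
        return library[int(np.argmax(scores))]

    def temperature_change(cur, base, rad_per_degc=0.01):  # made-up constant
        dphi = np.angle(cur * np.conj(base))               # phase difference image
        return dphi / rad_per_degc
    ```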

  7. Clustering of longitudinal data by using an extended baseline: A new method for treatment efficacy clustering in longitudinal data.

    PubMed

    Schramm, Catherine; Vial, Céline; Bachoud-Lévi, Anne-Catherine; Katsahian, Sandrine

    2018-01-01

    Heterogeneity in treatment efficacy is a major concern in clinical trials. Clustering may help to identify the treatment responders and the non-responders. In the context of longitudinal cluster analyses, sample size and variability of the times of measurement are the main issues with current methods. Here, we propose a new two-step method for the Clustering of Longitudinal data by using an Extended Baseline. The first step relies on a piecewise linear mixed model for repeated measurements with a treatment-time interaction. The second step clusters the random predictions and considers several parametric (model-based) and non-parametric (partitioning, ascendant hierarchical clustering) algorithms. A simulation study compared all options of the extended baseline clustering method with the latent-class mixed model. The extended baseline method with the two model-based algorithms was the most robust. With the non-parametric algorithms, the method failed when there were unequal variances of the treatment effect between clusters or when the subgroups had unbalanced sample sizes. The latent-class mixed model failed when the between-patient slope variability was high. Two real data sets, on a neurodegenerative disease and on obesity, illustrate the extended baseline method and show how clustering may help to identify the marker(s) of the treatment response. Applying the method in exploratory analysis, as a first stage before setting up stratified designs, can provide a better estimation of the treatment effect in future clinical trials.
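
    A sketch of the two-step idea, assuming a plain linear (rather than piecewise) mixed model, illustrative column names, and one model-based clustering option:

    ```python
    # Step 1: mixed model with random intercept/slope and a treatment-time
    # interaction; Step 2: cluster the predicted random effects per patient.
    # Columns `y`, `time`, `trt`, `id` are assumptions about the data frame.
    import pandas as pd
    import statsmodels.formula.api as smf
    from sklearn.mixture import GaussianMixture

    def cluster_responders(df, n_clusters=2):
        model = smf.mixedlm("y ~ time + time:trt", df,
                            groups=df["id"], re_formula="~time")
        fit = model.fit()
        # Stack each patient's predicted random intercept and slope.
        re = pd.DataFrame(fit.random_effects).T.to_numpy()
        # Model-based clustering of the random predictions.
        return GaussianMixture(n_components=n_clusters,
                               random_state=0).fit_predict(re)
    ```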

  8. A novel baseline-correction method for standard addition based derivative spectra and its application to quantitative analysis of benzo(a)pyrene in vegetable oil samples.

    PubMed

    Li, Na; Li, Xiu-Ying; Zou, Zhe-Xiang; Lin, Li-Rong; Li, Yao-Qun

    2011-07-07

    In the present work, a baseline-correction method based on peak-to-derivative-baseline measurement is proposed for eliminating the complex matrix interference, mainly caused by unknown components and/or background, in the analysis of derivative spectra. This method is applicable particularly when the interfering matrix components show a broad spectral band, which is common in practical analysis. The derivative baseline is established by connecting two crossing points of the spectral curves obtained with a standard addition method (SAM). The applicability and reliability of the proposed method were demonstrated through both theoretical simulation and practical application. First, Gaussian bands were used to simulate 'interfering' and 'analyte' bands to investigate the effect of different parameters of the interfering band on the derivative baseline. This simulation verified that the accuracy of the proposed method is remarkably better than that of conventional approaches such as peak-to-zero, tangent, and peak-to-peak measurements. The proposed baseline-correction method was then applied to the determination of benzo(a)pyrene (BaP) in vegetable oil samples by second-derivative synchronous fluorescence spectroscopy. Satisfactory results were obtained when the new method was used to analyze a certified reference material (coconut oil, BCR(®)-458), with a relative error of -3.2% from the certified BaP concentration. Potentially, the proposed method can be applied to various types of derivative spectra in different fields such as UV-visible absorption spectroscopy, fluorescence spectroscopy and infrared spectroscopy.
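
    A sketch of the peak-to-derivative-baseline measurement, assuming second-derivative spectra from the standard addition series on a common wavelength axis and at least one pair of crossing points inside the chosen window:

    ```python
    # Baseline = straight line through the two crossing points of two SAM
    # derivative spectra; the analyte signal is read relative to that line.
    import numpy as np

    def crossing_points(x, d2_a, d2_b, window):
        """First and last crossing of two derivative spectra in a window."""
        lo, hi = np.searchsorted(x, window)
        sign = np.sign(d2_a[lo:hi] - d2_b[lo:hi])
        idx = np.where(np.diff(sign) != 0)[0] + lo   # assumes crossings exist
        return idx[0], idx[-1]

    def peak_to_derivative_baseline(x, d2, i1, i2, peak_idx):
        """Signal height measured from the line connecting the crossings."""
        slope = (d2[i2] - d2[i1]) / (x[i2] - x[i1])
        baseline_at_peak = d2[i1] + slope * (x[peak_idx] - x[i1])
        return d2[peak_idx] - baseline_at_peak
    ```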

  9. A Simple Method to Control Positive Baseline Trend within Data Nonoverlap

    ERIC Educational Resources Information Center

    Parker, Richard I.; Vannest, Kimberly J.; Davis, John L.

    2014-01-01

    Nonoverlap is widely used as a statistical summary of data; however, these analyses rarely correct unwanted positive baseline trend. This article presents and validates the graph rotation for overlap and trend (GROT) technique, a hand calculation method for controlling positive baseline trend within an analysis of data nonoverlap. GROT is…

  10. Pushing CHARA to its Limit: A Pathway Toward 80X80 Pixel Images of Stellar Surfaces

    NASA Astrophysics Data System (ADS)

    Norris, Ryan

    2018-04-01

    Imagine a future with 80x80 pixel images of stellar surfaces. With a maximum baseline of 330 m, the CHARA Array is already capable of achieving 0.5 mas resolution, sufficient for imaging the red supergiant Betelgeuse (d = 42.3 mas ) at such a scale. However several issues have hampered attempts to image the largest stars at CHARA, including a lack of baselines shorter than 34 m and instrument sensitivities unable to measure the faintest fringes. Here we discuss what is needed to achieve imaging of large stars at CHARA. We will present suggestions for future telescope placement, describing the advantages of a short baseline, while also considering the needs of other imaging targets that might benefit from additional baselines. We will also present developments in image reconstruction methods that can improve the resolution of images today, albeit of smaller targets and at a lesser scale. Of course, there will be example images, created using simulated oifits data and state of the art reconstruction techniques!
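
    A back-of-envelope check of the quoted numbers, assuming H-band observations at λ = 1.65 μm (the abstract does not state the band):

    ```latex
    \theta \approx \frac{\lambda}{2B_{\max}}
           = \frac{1.65\times10^{-6}\,\mathrm{m}}{2\times330\,\mathrm{m}}
           \approx 2.5\times10^{-9}\,\mathrm{rad}
           \approx 0.52\,\mathrm{mas},
    \qquad
    \frac{42.3\,\mathrm{mas}}{0.52\,\mathrm{mas}} \approx 81
    ```

    Under that assumption, the disk of Betelgeuse spans roughly 80 resolution elements, consistent with the 80x80 pixel goal.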

  11. Single-Trial Normalization for Event-Related Spectral Decomposition Reduces Sensitivity to Noisy Trials

    PubMed Central

    Grandchamp, Romain; Delorme, Arnaud

    2011-01-01

    In electroencephalography, the classical event-related potential model often proves to be a limited method for studying complex brain dynamics. For this reason, spectral techniques adapted from signal processing, such as event-related spectral perturbation (ERSP) and its variants event-related synchronization and event-related desynchronization, have been used over the past 20 years. They represent average spectral changes in response to a stimulus. There is no strong consensus on how these spectral methods should compare pre- and post-stimulus activity. When computing ERSP, pre-stimulus baseline removal is usually performed after averaging the spectral estimates of multiple trials. Correcting the baseline of each single trial prior to averaging spectral estimates is an alternative baseline correction method; however, we show that this approach leads to positively skewed post-stimulus ERSP values. We then present new single-trial-based ERSP baseline correction methods that perform trial normalization or centering prior to applying classical baseline correction. We show that single-trial correction methods minimize the contribution of artifactual data trials with high-amplitude spectral estimates and are robust to outliers when performing statistical inference testing. Finally, we characterize these methods in terms of their time–frequency responses and behavior compared to classical ERSP methods. PMID:21994498
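
    A minimal sketch of full-epoch single-trial normalization followed by a classical divisive (dB) baseline correction; the array layout and pre-stimulus window are assumptions, and this is one of several normalization variants the paper considers:

    ```python
    # tf: single-trial power, shape (n_trials, n_freqs, n_times);
    # base: index (e.g. slice) of pre-stimulus time samples.
    import numpy as np

    def ersp_single_trial_norm(tf, base):
        # Divide each trial/frequency row by its full-epoch mean power so
        # high-amplitude outlier trials cannot dominate the average.
        norm = tf / tf.mean(axis=2, keepdims=True)
        # Classical divisive baseline correction applied after averaging.
        avg = norm.mean(axis=0)
        return 10 * np.log10(avg / avg[:, base].mean(axis=1, keepdims=True))

    # Example: ersp = ersp_single_trial_norm(tf, base=slice(0, 50))
    ```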

  12. Computer-aided detection of renal calculi from noncontrast CT images using TV-flow and MSER features

    PubMed Central

    Liu, Jianfei; Wang, Shijun; Turkbey, Evrim B.; Linguraru, Marius George; Yao, Jianhua; Summers, Ronald M.

    2015-01-01

    Purpose: Renal calculi are common extracolonic incidental findings on computed tomographic colonography (CTC). This work aims to develop a fully automated computer-aided diagnosis system to accurately detect renal calculi on CTC images. Methods: The authors developed a total variation (TV) flow method to reduce image noise within the kidneys while maintaining the characteristic appearance of renal calculi. Maximally stable extremal region (MSER) features were then calculated to robustly identify calculi candidates. Finally, the authors computed texture and shape features that were imported to support vector machines for calculus classification. The method was validated on a dataset of 192 patients and compared to a baseline approach that detects calculi by thresholding. The authors also compared their method with the detection approaches using anisotropic diffusion and nonsmoothing. Results: At a false positive rate of 8 per patient, the sensitivities of the new method and the baseline thresholding approach were 69% and 35% (p < 0.001) on all calculi from 1 to 433 mm3 in the testing dataset. The sensitivities of the detection methods using anisotropic diffusion and nonsmoothing were 36% and 0%, respectively. The sensitivity of the new method increased to 90% if only larger and more clinically relevant calculi were considered. Conclusions: Experimental results demonstrated that TV-flow and MSER features are efficient means to robustly and accurately detect renal calculi on low-dose, high noise CTC images. Thus, the proposed method can potentially improve diagnosis. PMID:25563255

  13. Estimation of daily mean streamflow for ungaged stream locations in the Delaware River Basin, water years 1960–2010

    USGS Publications Warehouse

    Stuckey, Marla H.

    2016-06-09

    The ability to characterize baseline streamflow conditions, compare them with current conditions, and assess effects of human activities on streamflow is fundamental to water-management programs addressing water allocation, human-health issues, recreation needs, and establishment of ecological flow criteria. The U.S. Geological Survey, through the National Water Census, has developed the Delaware River Basin Streamflow Estimator Tool (DRB-SET) to estimate baseline (minimally altered) and altered (affected by regulation, diversion, mining, or other anthropogenic activities) streamflow at a daily time step for ungaged stream locations in the Delaware River Basin for water years 1960–2010. Daily mean baseline streamflow is estimated by using the QPPQ method to equate streamflow expressed as a percentile from the flow-duration curve (FDC) for a particular day at an ungaged stream location with the percentile from a FDC for the same day at a hydrologically similar gaged location where streamflow is measured. Parameter-based regression equations were developed for 22 exceedance probabilities from the FDC for ungaged stream locations in the Delaware River Basin. Water-use data from 2010 are used to adjust the baseline daily mean streamflow generated from the QPPQ method at ungaged stream locations in the Delaware River Basin to reflect current, or altered, conditions. To evaluate the effectiveness of the overall QPPQ method contained within DRB-SET, a comparison of observed and estimated daily mean streamflows was performed for 109 reference streamgages in and near the Delaware River Basin. The Nash-Sutcliffe efficiency (NSE) values were computed as a measure of goodness of fit. The NSE values (using log10 streamflow values) ranged from 0.22 to 0.98 (median of 0.90) for 45 streamgages in the Upper Delaware River Basin and from -0.37 to 0.98 (median of 0.79) for 41 streamgages in the Lower Delaware River Basin.
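
    A sketch of the QPPQ transfer itself, assuming empirical flow-duration curves stored as ascending arrays; DRB-SET's regression-estimated exceedance probabilities and water-use adjustment are not reproduced here.

    ```python
    # QPPQ: Q -> P at the gaged index site, then P -> Q at the ungaged site.
    import numpy as np

    def qppq(q_index_daily, fdc_index_q, fdc_index_p,
             fdc_ungaged_q, fdc_ungaged_p):
        """FDC arrays must be ascending in flow and matching percentile."""
        # Percentile of each daily flow on the index gage's FDC.
        p = np.interp(q_index_daily, fdc_index_q, fdc_index_p)
        # Flow with the same percentile on the ungaged site's FDC.
        return np.interp(p, fdc_ungaged_p, fdc_ungaged_q)
    ```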

  14. Prevalence of polymorphisms with significant resistance to NS5A inhibitors in treatment-naive patients with hepatitis C virus genotypes 1a and 3a in Sweden.

    PubMed

    Lindström, Ida; Kjellin, Midori; Palanisamy, Navaneethan; Bondeson, Kåre; Wesslén, Lars; Lannergard, Anders; Lennerstrand, Johan

    2015-08-01

    The future treatment of hepatitis C virus (HCV) infection will be combinations of direct-acting antivirals (DAAs) that not only target multiple viral targets, but are also effective against different HCV genotypes. Among the many drug targets in HCV, one promising target is the non-structural 5A protein (NS5A), against which inhibitors, namely daclatasvir, ledipasvir and ombitasvir, have shown potent efficacy. However, since HCV has very high sequence diversity, development of resistance is a problem for NS5A inhibitors, as it is for NS3 protease and NS5B non-nucleoside inhibitors, when they are used in suboptimal combinations. Furthermore, natural resistance against DAAs has been shown to be present in treatment-naïve patients, and such baseline resistance will potentially complicate future treatment strategies. A pan-genotypic population-sequencing method with degenerate primers targeting the NS5A region was developed. We investigated the prevalence of baseline resistant variants in 127 treatment-naïve patients with HCV genotypes 1a, 1b, 2b and 3a. The method successfully sequenced more than 95% of genotype 1a, 1b and 3a samples. Fold-resistance data against the NS5A inhibitors were interpreted with the help of earlier published phenotypic data. Baseline variants associated with high resistance (1000-50,000-fold) were found in three patients: Q30H or Y93N in genotype 1a patients, and Y93H in a genotype 3a patient. Using this method, baseline resistance can be examined, and the data could have a potential role in selecting the optimal and cost-efficient treatment for each patient.

  15. Detection of Bursts and Pauses in Spike Trains

    PubMed Central

    Ko, D.; Wilson, C. J.; Lobb, C. J.; Paladini, C. A.

    2012-01-01

    Midbrain dopaminergic neurons in vivo exhibit a wide range of firing patterns. They normally fire constantly at a low rate, and speed up, firing a phasic burst when reward exceeds prediction, or pause when an expected reward does not occur. Therefore, the detection of bursts and pauses from spike train data is a critical problem when studying the role of phasic dopamine (DA) in reward related learning, and other DA dependent behaviors. However, few statistical methods have been developed that can identify bursts and pauses simultaneously. We propose a new statistical method, the Robust Gaussian Surprise (RGS) method, which performs an exhaustive search of bursts and pauses in spike trains simultaneously. We found that the RGS method is adaptable to various patterns of spike trains recorded in vivo, and is not influenced by baseline firing rate, making it applicable to all in vivo spike trains where baseline firing rates vary over time. We compare the performance of the RGS method to other methods of detecting bursts, such as the Poisson Surprise (PS), Rank Surprise (RS), and Template methods. Analysis of data using the RGS method reveals potential mechanisms underlying how bursts and pauses are controlled in DA neurons. PMID:22939922
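
    For orientation, here is a minimal sketch of the classic Poisson Surprise statistic that RGS is compared against (not the RGS method itself): the surprise of a candidate burst is the negative log probability of observing at least that many spikes in that interval under a homogeneous Poisson process at the baseline rate.

    ```python
    # Poisson Surprise for a candidate burst window [start, stop].
    import numpy as np
    from scipy.stats import poisson

    def poisson_surprise(spike_times, start, stop, baseline_rate):
        n = np.sum((spike_times >= start) & (spike_times <= stop))
        expected = baseline_rate * (stop - start)
        # P(N >= n) for a Poisson count at the expected baseline rate;
        # larger surprise means a less likely (more burst-like) window.
        return -np.log(poisson.sf(n - 1, expected))
    ```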

  16. Nontangent, Developed Contour Bulkheads for a Single-Stage Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Wu, K. Chauncey; Lepsch, Roger A., Jr.

    2000-01-01

    Dry weights for single-stage launch vehicles that incorporate nontangent, developed contour bulkheads are estimated and compared to a baseline vehicle with 1.414 aspect ratio ellipsoidal bulkheads. Weights, volumes, and heights of optimized bulkhead designs are computed using a preliminary design bulkhead analysis code. The dry weights of vehicles that incorporate the optimized bulkheads are predicted using a vehicle weights and sizing code. Two optimization approaches are employed. A structural-level method, where the vehicle's three major bulkhead regions are optimized separately and then incorporated into a model for computation of the vehicle dry weight, predicts a reduction of 4365 lb (2.2%) from the 200,679-lb baseline vehicle dry weight. In the second, vehicle-level, approach, the vehicle dry weight is the objective function for the optimization. For the vehicle-level analysis, modified bulkhead designs are analyzed and incorporated into the weights model for computation of a dry weight. The optimizer simultaneously manipulates design variables for all three bulkheads to reduce the dry weight. The vehicle-level analysis predicts a dry weight reduction of 5129 lb, a 2.6% reduction from the baseline weight. Based on these results, nontangent, developed contour bulkheads may provide substantial weight savings for single-stage vehicles.

  17. Improving the performance of US Environmental Protection Agency Method 300.1 for monitoring drinking water compliance.

    PubMed

    Wagner, Herbert P; Pepich, Barry V; Hautman, Daniel P; Munch, David J

    2003-09-05

    In 1998, the United States Environmental Protection Agency (EPA) promulgated the maximum contaminant level (MCL) for bromate in drinking water at 10 microg/l, and the method for compliance monitoring of bromate in drinking water was established under Stage 1 of the Disinfectants/Disinfection By-Products Rule (D/DBP) as EPA Method 300.1. In January 2002, the United States Food and Drug Administration (FDA) regulated the bromate concentration in bottled waters at 10 microg/l. In the Stage 2 DBP Rule, EPA anticipates proposing additional methods with improved performance for bromate monitoring alongside EPA Method 300.1. Until the Stage 2 Rule is promulgated, EPA Method 300.1 will continue to be the only method approved for compliance monitoring of bromate. This manuscript describes the work completed at EPA's Technical Support Center (TSC) to assess the performance of recently developed suppressor technologies toward improving the trace level performance of EPA Method 300.1, specifically for the analysis of trace levels of bromate in high ionic matrices. Three different types of Dionex suppressors were evaluated. The baseline noise, return to baseline after the water dip, detection limits, precision and accuracy, and advantages/disadvantages of each suppressor are discussed. Performance data for the three suppressors indicate that chemical suppression of the eluent, using the AMMS III suppressor, is the most effective means to reduce baseline noise, resulting in the best resolution and the lowest bromate detection limits, even when a high ionic matrix is analyzed. Incorporation of the AMMS III suppressor improves the performance of EPA Method 300.1 at and below 5.0 microg/l and is a quick way for laboratories to improve their bromate compliance monitoring.

  18. Assessing Stress Responses in Beaked and Sperm Whales in the Bahamas

    DTIC Science & Technology

    2015-09-30

    [Excerpt fragments:] IMPACT/APPLICATIONS: Developing methods to better understand the sub-lethal, physiologic consequences of underwater noise disturbance on species of … baseline ranges of stress-related fecal hormones are being developed and can be applied in the future to assess physiologic responses to elevated acoustic … fecal aldosterone assays as an additional measure of adrenal activation during stress responses in North Atlantic right whales (Eubalaena glacialis).

  19. Measuring Historical Coastal Change using GIS and the Change Polygon Approach

    USGS Publications Warehouse

    Smith, M.J.; Cromley, R.G.

    2012-01-01

    This study compares two automated approaches, the transect-from-baseline technique and a new change polygon method, for quantifying historical coastal change over time. The study shows that the transect-from-baseline technique is complicated by the choice of a proper baseline as well as by transects that intersect with each other rather than with the nearest shoreline. The change polygon method captures the full spatial difference between the positions of the two shorelines, and average coastal change is then defined as the ratio of the net area divided by the shoreline length. Although the change polygon method is sensitive to the definition and measurement of shoreline length, the results are more invariant to parameter changes than the transect-from-baseline method, suggesting that the change polygon technique may be a more robust coastal change method. © 2012 Blackwell Publishing Ltd.
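
    A minimal sketch of the change polygon measure using shapely; taking the older shoreline's length as the denominator is an assumption here, and is exactly the kind of choice the study reports sensitivity to.

    ```python
    # Close the two shoreline polylines into a polygon; average change is the
    # enclosed (net) area divided by a shoreline length.
    from shapely.geometry import LineString, Polygon

    def average_coastal_change(shore_t1_coords, shore_t2_coords):
        polygon = Polygon(list(shore_t1_coords) + list(reversed(shore_t2_coords)))
        length = LineString(shore_t1_coords).length    # older shoreline length
        return polygon.area / length                   # mean displacement

    # Example: a shoreline that retreated by roughly one unit on average.
    old = [(0, 0), (5, 0), (10, 0)]
    new = [(0, -1), (5, -1.2), (10, -0.8)]
    print(average_coastal_change(old, new))            # ~1.05
    ```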

  20. Defining the baseline for inhibition concentration calculations for hormetic hazards.

    PubMed

    Bailer, A J; Oris, J T

    2000-01-01

    The use of endpoint estimates based on modeling inhibition of test organism response relative to a baseline response is an important tool in the testing and evaluation of aquatic hazards. In the presence of a hormetic hazard, the definition of the baseline response is not clear because non-zero levels of the hazard stimulate an enhanced response prior to inhibition. In the present study, the methodology and implications of how one defines a baseline response for inhibition concentration estimation in aquatic toxicity tests were evaluated. Three possible baselines were considered: the control response level; the pooling of responses, including controls and all concentration conditions with responses enhanced relative to controls; and, finally, the maximal response. The statistical methods associated with estimating inhibition relative to the first two baseline definitions were described, and a method for estimating inhibition relative to the third baseline definition was derived. These methods were illustrated with data from a standard aquatic zooplankton reproductive toxicity test in which the number of young produced in three broods of a cladoceran exposed to effluent was modeled as a function of effluent concentration. Copyright 2000 John Wiley & Sons, Ltd.

  1. A Fully Customized Baseline Removal Framework for Spectroscopic Applications.

    PubMed

    Giguere, Stephen; Boucher, Thomas; Carey, C J; Mahadevan, Sridhar; Dyar, M Darby

    2017-07-01

    The task of proper baseline or continuum removal is common to nearly all types of spectroscopy. Its goal is to remove any portion of a signal that is irrelevant to features of interest while preserving any predictive information. Despite the importance of baseline removal, median or guessed default parameters are commonly employed, often using commercially available software supplied with instruments. Several published baseline removal algorithms have been shown to be useful for particular spectroscopic applications but their generalizability is ambiguous. The new Custom Baseline Removal (Custom BLR) method presented here generalizes the problem of baseline removal by combining operations from previously proposed methods to synthesize new correction algorithms. It creates novel methods for each technique, application, and training set, discovering new algorithms that maximize the predictive accuracy of the resulting spectroscopic models. In most cases, these learned methods either match or improve on the performance of the best alternative. Examples of these advantages are shown for three different scenarios: quantification of components in near-infrared spectra of corn and laser-induced breakdown spectroscopy data of rocks, and classification/matching of minerals using Raman spectroscopy. Software to implement this optimization is available from the authors. By removing subjectivity from this commonly encountered task, Custom BLR is a significant step toward completely automatic and general baseline removal in spectroscopic and other applications.

  2. Ensembles of novelty detection classifiers for structural health monitoring using guided waves

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Karpenko, Oleksii; Koricho, Ermias; Khomenko, Anton; Haq, Mahmoodul; Udpa, Lalita

    2018-01-01

    Guided wave structural health monitoring uses sparse sensor networks embedded in sophisticated structures for defect detection and characterization. The biggest challenge for these sensor networks is developing robust techniques for reliable damage detection under changing environmental and operating conditions (EOC). To address this challenge, we develop a novelty classifier for damage detection based on one-class support vector machines. We identify appropriate features for damage detection and introduce a feature aggregation method which quadratically increases the number of available training observations. We adopt a two-level voting scheme using an ensemble of classifiers and an ensemble of predictions: each classifier is trained on a different segment of the guided wave signal, and each classifier makes an ensemble of predictions based on a single observation. Using this approach, the classifier can be trained with a small number of baseline signals. We study the performance using Monte Carlo simulations of an analytical model and data from impact damage experiments on a glass fiber composite plate. We also demonstrate the classifier performance using two types of baseline signals: a fixed and a rolling baseline training set. The former requires prior knowledge of baseline signals from all EOC, while the latter does not, and leverages the fact that EOC vary slowly over time and can be modeled as a Gaussian process.
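
    A minimal sketch of the segment-wise one-class SVM ensemble with majority voting, assuming one feature vector per signal segment; feature extraction and the paper's feature aggregation step are not shown.

    ```python
    # Train one one-class SVM per signal segment on baseline data only, then
    # vote across segments for a new observation: -1 votes signal novelty.
    import numpy as np
    from sklearn.svm import OneClassSVM

    def train_segment_classifiers(baseline_features):
        """baseline_features: list over segments of (n_obs, n_feat) arrays."""
        return [OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(X)
                for X in baseline_features]

    def vote_damage(classifiers, test_features):
        """test_features: list over segments of 1-D feature vectors."""
        votes = np.array([clf.predict(x[None, :])[0]
                          for clf, x in zip(classifiers, test_features)])
        return (votes == -1).mean() > 0.5   # True -> flag damage
    ```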

  3. Development of a hospital-based patient-reported outcome framework for lung cancer patients: a study protocol.

    PubMed

    Moloczij, Natasha; Gough, Karla; Solomon, Benjamin; Ball, David; Mileshkin, Linda; Duffy, Mary; Krishnasamy, Mei

    2018-01-11

    Patient-reported outcome (PRO) data are central to the delivery of quality health care. Establishing sustainable, reliable and cost-efficient methods for routine collection and integration of PRO data into health information systems is challenging. This protocol paper describes the design and structure of a study to develop and pilot test a PRO framework to systematically and longitudinally collect PRO data from a cohort of lung cancer patients at a comprehensive cancer centre in Australia. Best-practice guidelines for developing registries aimed at collecting PROs informed the development of this PRO framework. Framework components included: achieving consensus on the purpose of the framework, the PRO measures to be included, the data collection time points and collection methods (electronic and paper), establishing processes to safeguard the quality of the data collected, and linking the PRO framework to an existing hospital-based lung cancer clinical registry. Lung cancer patients will be invited to give feedback on the PRO measures (PROMs) chosen and the data collection time points and methods. Implementation of the framework will be piloted for 12 months. A mixed-methods approach will then be used to explore patient and multidisciplinary perspectives on the feasibility of implementing the framework and linking it to the lung cancer clinical registry, its clinical utility, perceptions of data collection burden, and a preliminary assessment of the resource costs to integrate, implement and sustain the PRO framework. The PRO data set will include: a quality of life questionnaire (EORTC-QLQ-C30) and the EORTC lung cancer specific module (QLQC-LC-13). These will be collected pre-treatment (baseline) and 2, 6 and 12 months post-baseline. Also, four social isolation questions (PROMIS) will be collected at baseline. Identifying and deciding on the overall purpose, clinical utility of data and which PROs to collect from patients requires careful consideration. Our study will explore how PRO data collection processes that link to a clinical data set can be developed and integrated; how PRO systems that are easy for patients to complete and for professionals to use in practice can be achieved; and will provide indicative costs of developing and integrating a longitudinal PRO framework into routine hospital data collection systems. This study is not a clinical trial and is therefore not registered in any trial registry. However, it has received human research ethics approval (LNR/16/PMCC/45).

  4. External validation of a Cox prognostic model: principles and methods

    PubMed Central

    2013-01-01

    Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
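
    With level 3 information, predicted curves for new patients follow directly from the published coefficients and the (approximated) baseline survival function; a minimal sketch with illustrative variable names:

    ```python
    # S(t|x) = S0(t) ** exp(PI(x) - PI_mean), where S0(t) is the baseline
    # survival curve at the mean prognostic index of the derivation sample.
    import numpy as np

    def predicted_survival(x, beta, x_mean, s0_curve):
        """Predicted survival over time for covariates x."""
        pi = float(np.dot(beta, x))
        pi_mean = float(np.dot(beta, x_mean))
        return np.asarray(s0_curve) ** np.exp(pi - pi_mean)

    # Calibration can then be assessed by comparing Kaplan-Meier curves of
    # PI-defined risk groups in the validation sample with the group means
    # of these predicted curves.
    ```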

  5. Passive interior noise reduction analysis of King Air 350 turboprop aircraft using boundary element method/finite element method (BEM/FEM)

    NASA Astrophysics Data System (ADS)

    Dandaroy, Indranil; Vondracek, Joseph; Hund, Ron; Hartley, Dayton

    2005-09-01

    The objective of this study was to develop a vibro-acoustic computational model of the Raytheon King Air 350 turboprop aircraft with an intent to reduce propfan noise in the cabin. To develop the baseline analysis, an acoustic cavity model of the aircraft interior and a structural dynamics model of the aircraft fuselage were created. The acoustic model was an indirect boundary element method representation using SYSNOISE, while the structural model was a finite-element method normal modes representation in NASTRAN and subsequently imported to SYSNOISE. In the acoustic model, the fan excitation sources were represented employing the Ffowcs Williams-Hawkings equation. The acoustic and the structural models were fully coupled in SYSNOISE and solved to yield the baseline response of acoustic pressure in the aircraft interior and vibration on the aircraft structure due to fan noise. Various vibration absorbers, tuned to fundamental blade passage tone (100 Hz) and its first harmonic (200 Hz), were applied to the structural model to study their effect on cabin noise reduction. Parametric studies were performed to optimize the number and location of these passive devices. Effects of synchrophasing and absorptive noise treatments applied to the aircraft interior were also investigated for noise reduction.

  6. Development Of Regional Climate Mitigation Baseline For A DominantAgro-Ecological Zone Of Karnataka, India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudha, P.; Shubhashree, D.; Khan, H.

    2007-06-01

    Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baselines. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baselines. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of the baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for the project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to the regional baseline is 1.02, and that of fallow lands in the project to the regional baseline is 0.97. The cost of conducting field studies for determination of a regional baseline is about a quarter of the cost of developing a project-specific baseline on a per hectare basis. The study has shown the reliability, feasibility and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.

  7. U.S. Geological Survey experience with the residual absolutes method

    NASA Astrophysics Data System (ADS)

    Worthington, E. William; Matzka, Jürgen

    2017-10-01

    The U.S. Geological Survey (USGS) Geomagnetism Program has developed and tested the residual method of absolutes, with the assistance of the Danish Technical University's (DTU) Geomagnetism Program. Three years of testing were performed at College Magnetic Observatory (CMO), Fairbanks, Alaska, to compare the residual method with the null method. Results show that the two methods compare very well with each other and both sets of baseline data were used to process the 2015 definitive data. The residual method will be implemented at the other USGS high-latitude geomagnetic observatories in the summer of 2017 and 2018.

  8. Conservation of water for washing beef heads at harvest

    USDA-ARS?s Scientific Manuscript database

    The objective of this research was to develop methods to conserve water necessary to cleanse beef heads prior to USDA–FSIS inspection. This was to be accomplished by establishing a baseline for the minimum amount of water necessary to adequately wash a head and application of image analysis to provi...

  9. General Practitioners' Management of Psychostimulant Drug Misuse: Implications for Education and Training

    ERIC Educational Resources Information Center

    Alkhamis, Ahmed; Matheson, Catriona; Bond, Christine

    2009-01-01

    Aims: To provide baseline data regarding GPs' knowledge, experience, and attitudes toward the management of PsychoStimulant Drug Misuse (PSDM) patients to inform future education and training initiatives. Methods: A structured cross-sectional postal questionnaire was developed following initial content setting interviews, piloted then sent to a…

  10. Hampton Roads climate impact quantification initiative : baseline assessment of the transportation assets & overview of economic analyses useful in quantifying impacts

    DOT National Transportation Integrated Search

    2016-09-13

    The Hampton Roads Climate Impact Quantification Initiative (HRCIQI) is a multi-part study sponsored by the U.S. Department of Transportation (DOT) Climate Change Center with the goals that include developing a cost tool that provides methods for volu...

  11. Mindful "Vitality in Practice": an intervention to improve the work engagement and energy balance among workers; the development and design of the randomised controlled trial

    PubMed Central

    2011-01-01

    Background Modern working life has become more mental and less physical in nature, contributing to impaired mental health and a disturbed energy balance. This may result in mental health problems and overweight. Both are significant threats to the health of workers and thus also a financial burden for society, including employers. Targeting work engagement and energy balance could prevent impaired mental health and overweight, respectively. Methods/Design The study population consists of highly educated workers in two Dutch research institutes. The intervention was systematically developed, based on the Intervention Mapping (IM) protocol, involving workers and management in the process. The workers' needs were assessed by combining the results of interviews, focus group discussions and a questionnaire with available literature. Suitable methods and strategies were selected resulting in an intervention including: eight weeks of customized mindfulness training, followed by eight sessions of e-coaching and supporting elements, such as providing fruit and snack vegetables at the workplace, lunch walking routes, and a buddy system. The effects of the intervention will be evaluated in a RCT, with measurements at baseline, six months (T1) and 12 months (T2). In addition, cost-effectiveness and process of the intervention will also be evaluated. Discussion At baseline the level of work engagement of the sample was "average". Of the study population, 60.1% did not engage in vigorous physical activity at all. An average working day consists of eight sedentary hours. For the Phase II RCT, there were no significant differences between the intervention and the control group at baseline, except for vigorous physical activity. The baseline characteristics of the study population were congruent with the results of the needs assessment. The IM protocol used for the systematic development of the intervention produced an appropriate intervention to test in the planned RCT. Trial registration number Netherlands Trial Register (NTR): NTR2199 PMID:21951433

  12. No Clinically Significant Changes in Pulmonary Function Following Stereotactic Body Radiation Therapy for Early- Stage Peripheral Non-Small Cell Lung Cancer: An Analysis of RTOG 0236

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanic, Sinisa, E-mail: sinisa.stanic@carle.com; Paulus, Rebecca; Timmerman, Robert D.

    2014-04-01

    Purpose: To investigate pulmonary function test (PFT) results and arterial blood gas changes (complete PFT) following stereotactic body radiation therapy (SBRT) and to see whether baseline PFT correlates with lung toxicity and overall survival in medically inoperable patients receiving SBRT for early stage, peripheral, non-small cell lung cancer (NSCLC). Methods and Materials: During the 2-year follow-up, PFT data were collected for patients with T1-T2N0M0 peripheral NSCLC who received effectively 18 Gy × 3 in a phase 2 North American multicenter study (Radiation Therapy Oncology Group [RTOG] protocol 0236). Pulmonary toxicity was graded by using the RTOG SBRT pulmonary toxicity scale. Paired Wilcoxon signed rank test, logistic regression model, and Kaplan-Meier method were used for statistical analysis. Results: At 2 years, mean percentage predicted forced expiratory volume in the first second and diffusing capacity for carbon monoxide declines were 5.8% and 6.3%, respectively, with minimal changes in arterial blood gases and no significant decline in oxygen saturation. Baseline PFT was not predictive of any pulmonary toxicity following SBRT. Whole-lung V5 (the percentage of normal lung tissue receiving 5 Gy), V10, V20, and mean dose to the whole lung were almost identical between patients who developed pneumonitis and patients who were pneumonitis-free. Poor baseline PFT did not predict decreased overall survival. Patients with poor baseline PFT as the reason for medical inoperability had higher median and overall survival rates than patients with normal baseline PFT values but with cardiac morbidity. Conclusions: Poor baseline PFT did not appear to predict pulmonary toxicity or decreased overall survival after SBRT in this medically inoperable population.

  14. A Fatigue Crack Size Evaluation Method Based on Lamb Wave Simulation and Limited Experimental Data

    PubMed Central

    He, Jingjing; Ran, Yunmeng; Liu, Bin; Yang, Jinsong; Guan, Xuefei

    2017-01-01

    This paper presents a systematic and general method for Lamb wave-based crack size quantification using finite element simulations and Bayesian updating. The method consists of construction of a baseline quantification model using finite element simulation data and Bayesian updating with limited Lamb wave data from target structure. The baseline model correlates two proposed damage sensitive features, namely the normalized amplitude and phase change, with the crack length through a response surface model. The two damage sensitive features are extracted from the first received S0 mode wave package. The model parameters of the baseline model are estimated using finite element simulation data. To account for uncertainties from numerical modeling, geometry, material and manufacturing between the baseline model and the target model, Bayesian method is employed to update the baseline model with a few measurements acquired from the actual target structure. A rigorous validation is made using in-situ fatigue testing and Lamb wave data from coupon specimens and realistic lap-joint components. The effectiveness and accuracy of the proposed method is demonstrated under different loading and damage conditions. PMID:28902148
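
    A sketch of the quantification step under simplifying assumptions: polynomial response surfaces stand in for the paper's baseline model, and the Bayesian update is reduced to a grid posterior over crack length with Gaussian measurement noise (coefficients and noise levels are placeholders, not the calibrated values from the paper).

    ```python
    # Grid posterior over crack length given the two damage-sensitive features.
    import numpy as np

    def response_surface(a, coef_amp, coef_phase):
        """Polynomial maps from crack length to the two features."""
        return np.polyval(coef_amp, a), np.polyval(coef_phase, a)

    def crack_length_posterior(amp_obs, phase_obs, coef_amp, coef_phase,
                               sigma_amp=0.05, sigma_phase=0.05):
        grid = np.linspace(0.0, 20.0, 2001)        # candidate lengths, mm
        amp_pred, phase_pred = response_surface(grid, coef_amp, coef_phase)
        loglik = (-0.5 * ((amp_obs - amp_pred) / sigma_amp) ** 2
                  - 0.5 * ((phase_obs - phase_pred) / sigma_phase) ** 2)
        post = np.exp(loglik - loglik.max())       # flat prior over the grid
        post /= np.trapz(post, grid)               # normalize to a density
        return grid, post
    ```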

  15. An Integrated Ransac and Graph Based Mismatch Elimination Approach for Wide-Baseline Image Matching

    NASA Astrophysics Data System (ADS)

    Hasheminasab, M.; Ebadi, H.; Sedaghat, A.

    2015-12-01

    In this paper we propose an integrated approach to increase the precision of feature point matching. Many algorithms have been developed to optimize short-baseline image matching, but wide-baseline matching remains difficult to handle because of illumination differences and viewpoint changes. Fortunately, recent developments in the automatic extraction of local invariant features have made wide-baseline image matching possible. Matching algorithms based on the local feature similarity principle use feature descriptors to establish correspondence between feature point sets. To date, the most remarkable descriptor is the scale-invariant feature transform (SIFT) descriptor, which is invariant to image rotation and scale and remains robust across a substantial range of affine distortion, presence of noise, and changes in illumination. The epipolar constraint based on RANSAC (random sample consensus) is a conventional model for mismatch elimination, particularly in computer vision. Because only the distance from the epipolar line is considered, a few false matches remain in results selected on the basis of epipolar geometry and RANSAC. Aguilariu et al. proposed the Graph Transformation Matching (GTM) algorithm to remove outliers, but it has difficulties when the mismatched points are surrounded by the same local neighbor structure. In this study, to overcome these limitations, a new three-step matching scheme is presented in which the SIFT algorithm is used to obtain initial corresponding point sets. In the second step, the RANSAC algorithm is applied to reduce the outliers. Finally, to remove the remaining mismatches, GTM is implemented on the adjacent K-NN graph. Four close-range image datasets with changes in viewpoint are used to evaluate the performance of the proposed method, and the experimental results indicate its robustness and capability.
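
    The first two steps map naturally onto OpenCV; a minimal sketch with SIFT matching, a Lowe ratio test, and RANSAC on the fundamental-matrix (epipolar) constraint. The final GTM graph step is not shown.

    ```python
    # SIFT + ratio test + RANSAC epipolar filtering on two grayscale images.
    import cv2
    import numpy as np

    def sift_ransac_matches(img1, img2):
        sift = cv2.SIFT_create()
        k1, d1 = sift.detectAndCompute(img1, None)
        k2, d2 = sift.detectAndCompute(img2, None)
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        good = [m for m, n in matcher.knnMatch(d1, d2, k=2)
                if m.distance < 0.75 * n.distance]        # Lowe ratio test
        pts1 = np.float32([k1[m.queryIdx].pt for m in good])
        pts2 = np.float32([k2[m.trainIdx].pt for m in good])
        F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC,
                                         ransacReprojThreshold=1.0)
        keep = mask.ravel().astype(bool)                  # RANSAC inliers
        return pts1[keep], pts2[keep]
    ```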

  16. Personalised news filtering and recommendation system using Chi-square statistics-based K-nearest neighbour (χ2SB-KNN) model

    NASA Astrophysics Data System (ADS)

    Adeniyi, D. A.; Wei, Z.; Yang, Y.

    2017-10-01

    Recommendation problems have been extensively studied by researchers in the fields of data mining, databases and information retrieval. This study presents the design and realisation of an automated, personalised news recommendation system based on a Chi-square statistics-based K-nearest neighbour (χ2SB-KNN) model. The proposed χ2SB-KNN model has the potential to overcome computational complexity and information-overload problems, reducing runtime through the use of the critical value of the χ2 distribution. The proposed recommendation engine can alleviate scalability challenges through combined online pattern discovery and pattern matching for real-time recommendations. This work also showcases a novel feature selection method, referred to as Data Discretisation-Based feature selection, used to select the best features for the proposed χ2SB-KNN algorithm at the preprocessing stage of the classification procedure. The implementation of the proposed χ2SB-KNN model is achieved through an in-house Java program on an experimental website called the OUC newsreaders' website. Finally, we compared the performance of our system with two baseline methods, traditional Euclidean-distance K-nearest neighbour and naive Bayes. The results show a significant improvement of our method over the baseline methods studied.
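
    For the chi-square flavor of the pipeline, here is a standard scikit-learn sketch of chi-square feature selection on a toy corpus; it stands in for, and is not, the paper's Java implementation or its Data Discretisation-Based selection method.

    ```python
    # Chi-square scores rank terms by class association; keep the top k and
    # feed the reduced matrix to a K-NN classifier.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.feature_selection import SelectKBest, chi2

    docs = ["stocks rally as bank earnings beat forecasts",
            "bank stocks slip on rate fears",
            "striker scores twice as team wins final",
            "team seals title with late goal"]
    labels = [0, 0, 1, 1]                   # 0 = finance news, 1 = sport news

    vec = CountVectorizer()
    X = vec.fit_transform(docs)
    selector = SelectKBest(chi2, k=3).fit(X, labels)
    print(vec.get_feature_names_out()[selector.get_support()])
    # -> ['bank' 'stocks' 'team']: the class-discriminative terms survive.
    ```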

  17. Parameterization of spectral baseline directly from short echo time full spectra in 1H-MRS.

    PubMed

    Lee, Hyeong Hun; Kim, Hyeonjin

    2017-09-01

    To investigate the feasibility of parameterizing macromolecule (MM) resonances directly from short echo time (TE) spectra, rather than from pre-acquired, T1-weighted, metabolite-nulled spectra, in 1H-MRS. Initial line parameters for metabolites and MMs were set for rat brain spectra acquired at 9.4 Tesla upon a priori knowledge. Then, MM line parameters were optimized over several steps with fixed metabolite line parameters. The proposed method was tested by estimating metabolite T1. The results were compared with those obtained with two existing methods. Furthermore, subject-specific, spin density-weighted, MM model spectra were generated according to the MM line parameters from the proposed method for metabolite quantification. The results were compared with those obtained with subject-specific, T1-weighted, metabolite-nulled spectra. The metabolite T1 values were largely in close agreement among the three methods. The spin density-weighted MM resonances from the proposed method were in good agreement with the T1-weighted, metabolite-nulled spectra except for the MM resonance at ∼3.2 ppm. The metabolite concentrations estimated by incorporating these two different spectral baselines were also in good agreement except for several metabolites with resonances at ∼3.2 ppm. MM parameterization directly from short-TE spectra is feasible. Further development of the method may allow for better representation of the spectral baseline with negligible T1-weighting. Magn Reson Med 78:836-847, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  18. Pretreatment data is highly predictive of liver chemistry signals in clinical trials.

    PubMed

    Cai, Zhaohui; Bresell, Anders; Steinberg, Mark H; Silberg, Debra G; Furlong, Stephen T

    2012-01-01

    The goal of this retrospective analysis was to assess how well predictive models could determine which patients would develop liver chemistry signals during clinical trials based on their pretreatment (baseline) information. Based on data from 24 late-stage clinical trials, classification models were developed to predict liver chemistry outcomes using baseline information, which included demographics, medical history, concomitant medications, and baseline laboratory results. Predictive models using baseline data predicted which patients would develop liver signals during the trials with average validation accuracy around 80%. Baseline levels of individual liver chemistry tests were most important for predicting their own elevations during the trials. High bilirubin levels at baseline were not uncommon and were associated with a high risk of developing biochemical Hy's law cases. Baseline γ-glutamyltransferase (GGT) level appeared to have some predictive value, but did not increase predictability beyond using established liver chemistry tests. It is possible to predict which patients are at a higher risk of developing liver chemistry signals using pretreatment (baseline) data. Derived knowledge from such predictions may allow proactive and targeted risk management, and the type of analysis described here could help determine whether new biomarkers offer improved performance over established ones.

  19. Subfoveal choroidal thickness predicts macular atrophy in age-related macular degeneration: results from the TREX-AMD trial.

    PubMed

    Fan, Wenying; Abdelfattah, Nizar Saleh; Uji, Akihito; Lei, Jianqin; Ip, Michael; Sadda, SriniVas R; Wykoff, Charles C

    2018-03-01

    Our purpose was to evaluate the relationship between subfoveal choroidal thickness (SCT) and development of macular atrophy (MA) in eyes with age-related macular degeneration (AMD). This was a prospective, multicenter study. Sixty participants (120 eyes) in the TREX-AMD trial (NCT01648292) with treatment-naïve neovascular AMD (NVAMD) in at least one eye were included. SCT was measured by certified reading center graders at baseline using spectral domain optical coherence tomography (SDOCT). The baseline SCT was correlated with the presence of MA at baseline and development of incident MA by month 18. Generalized estimating equations were used to account for information from both eyes. Baseline SCT in eyes with MA was statistically significantly less than in those without MA in both the dry AMD (DAMD) (P = 0.04) and NVAMD (P = 0.01) groups. Comparison of baseline SCT between MA developers and non-MA developers revealed a statistically significant difference (P = 0.03). Receiver operating characteristic curve (ROC) analysis showed the cut-off threshold of SCT for predicting the development of MA in cases without MA at baseline was 124 μm (AUC = 0.772; Sensitivity = 0.923; Specificity = 0.5). Among eyes without MA at baseline, those with baseline SCT ≤124 μm were 4.3 times more likely to develop MA (Odds ratio: 4.3, 95% confidence interval: 1.6-12, P = 0.005) than those with baseline SCT >124 μm. Eyes with AMD and MA had less SCT than those without MA. Eyes with less baseline SCT also appear to be at higher risk to develop MA within 18 months.

  20. Development of NTCP models for head and neck cancer patients treated with three-dimensional conformal radiotherapy for xerostomia and sticky saliva: the role of dosimetric and clinical factors.

    PubMed

    Beetz, Ivo; Schilstra, Cornelis; Burlage, Fred R; Koken, Phil W; Doornaert, Patricia; Bijl, Henk P; Chouvalova, Olga; Leemans, C René; de Bock, Geertruida H; Christianen, Miranda E M C; van der Laan, Bernard F A M; Vissink, Arjan; Steenbakkers, Roel J H M; Langendijk, Johannes A

    2012-10-01

    The purpose of this multicentre prospective study was to investigate the significance of the radiation dose in the major and minor salivary glands, and other pre-treatment and treatment factors, with regard to the development of patient-rated xerostomia and sticky saliva among head and neck cancer (HNC) patients treated with primary (chemo-) radiotherapy ((CH)RT). The study population was composed of 167 consecutive HNC patients treated with three-dimensional conformal (3D-CRT) (CH)RT. The primary endpoint was moderate to severe xerostomia (XER6m) as assessed by the EORTC QLQ-H&N35 at 6 months after completing (CH)RT. The secondary endpoint was moderate to severe sticky saliva at 6 months (STIC6m). All organs at risk (OARs) potentially involved in salivary function were delineated on planning-CT, including the parotid, submandibular and sublingual glands and the minor glands in the soft palate, cheeks and lips. Patients with moderate to severe xerostomia or sticky saliva at baseline were excluded. The optimum number of variables for a multivariate logistic regression model was determined using a bootstrapping method. The multivariate analysis showed the mean parotid dose, age and baseline xerostomia (none versus a bit) to be the most important predictors for XER6m. The risk of developing xerostomia increased with age and was higher when minor baseline xerostomia was present in comparison with patients without any xerostomia complaints at baseline. Model performance was good with an area under the curve (AUC) of 0.82. For STIC6m, the mean submandibular dose, age, the mean sublingual dose and baseline sticky saliva (none versus a bit) were most predictive for sticky saliva. The risk of developing STIC6m increased with age and was higher when minor baseline sticky saliva was present in comparison with patients without any sticky saliva complaints at baseline. Model performance was good with an AUC of 0.84. Dose distributions in the minor salivary glands in patients receiving 3D-CRT have limited significance with regard to patient-rated symptoms related to salivary dysfunction. Besides the parotid and submandibular glands, only the sublingual glands were significantly associated with sticky saliva. In addition, reliable risk estimation also requires information from other factors such as age and baseline subjective scores. When these selected factors are included in predictive models, instead of only dose volume histogram parameters, model performance can be improved significantly. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
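
    The NTCP model here is a multivariate logistic regression; a minimal sketch of that functional form with the three XER6m predictors named above follows, with invented coefficients rather than the published fit.

        import numpy as np

        def ntcp_xer6m(mean_parotid_dose_gy, age_y, baseline_xer):
            """Toy logistic NTCP model; the weights are illustrative
            placeholders, not the coefficients from the paper."""
            s = (-9.0 + 0.10 * mean_parotid_dose_gy
                 + 0.08 * age_y + 1.0 * baseline_xer)   # linear predictor
            return 1.0 / (1.0 + np.exp(-s))             # logistic link

        # risk without and with minor baseline xerostomia ("a bit")
        print(ntcp_xer6m(30, 60, 0), ntcp_xer6m(30, 60, 1))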

  1. Improved Topographic Mapping Through Multi-Baseline SAR Interferometry with MAP Estimation

    NASA Astrophysics Data System (ADS)

    Dong, Yuting; Jiang, Houjun; Zhang, Lu; Liao, Mingsheng; Shi, Xuguo

    2015-05-01

    There is an inherent contradiction between the sensitivity of height measurement and the accuracy of phase unwrapping for SAR interferometry (InSAR) over rough terrain. This contradiction can be resolved by multi-baseline InSAR analysis, which exploits multiple phase observations with different normal baselines to improve phase unwrapping accuracy, or even to avoid phase unwrapping. In this paper we propose a maximum a posteriori (MAP) estimation method assisted by SRTM DEM data for multi-baseline InSAR topographic mapping. Based on our method, a data processing flow is established and applied to a multi-baseline ALOS/PALSAR dataset. The accuracy of the resultant DEMs is evaluated using a standard Chinese national DEM at 1:10,000 scale as reference. The results show that multi-baseline InSAR can improve DEM accuracy compared with the single-baseline case. It is noteworthy that phase unwrapping is avoided and the quality of the multi-baseline InSAR DEM can meet the DTED-2 standard.
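
    A toy sketch of DEM-assisted MAP height estimation over a grid of candidate heights, assuming Gaussian phase noise set by coherence and a Gaussian prior centred on the SRTM height; the height-to-phase factors, coherences, and prior width below are invented, and the paper's full processing flow is not reproduced.

        import numpy as np

        def map_height(phases, h2phi, gamma, h_prior, sigma_prior,
                       h_grid=np.linspace(-200, 200, 4001)):
            """phases: wrapped phases observed in each interferogram (rad)
            h2phi: height-to-phase factor 2*pi/h_amb per baseline
            gamma: coherence per interferogram (sets phase noise)"""
            logpost = -0.5 * ((h_grid - h_prior) / sigma_prior) ** 2  # DEM prior
            for phi, k, g in zip(phases, h2phi, gamma):
                # wrapped residual between observed and predicted phase
                resid = np.angle(np.exp(1j * (phi - k * h_grid)))
                sigma_phi = np.sqrt((1 - g ** 2) / (2 * g ** 2))  # CRB approx
                logpost += -0.5 * (resid / sigma_phi) ** 2
            return h_grid[np.argmax(logpost)]  # no unwrapping step needed

        # toy usage: two baselines, true height 37 m, prior centred at 30 m
        true_h = 37.0
        print(map_height(
            phases=[np.angle(np.exp(1j * 0.12 * true_h)),
                    np.angle(np.exp(1j * 0.51 * true_h))],
            h2phi=[0.12, 0.51], gamma=[0.8, 0.7],
            h_prior=30.0, sigma_prior=10.0))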

  2. Method, system, and computer-readable medium for determining performance characteristics of an object undergoing one or more arbitrary aging conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gering, Kevin L.

    A method, system, and computer-readable medium are described for characterizing performance loss of an object undergoing an arbitrary aging condition. Baseline aging data may be collected from the object for at least one known baseline aging condition over time, determining baseline multiple sigmoid model parameters from the baseline data, and performance loss of the object may be determined over time through multiple sigmoid model parameters associated with the object undergoing the arbitrary aging condition using a differential deviation-from-baseline approach from the baseline multiple sigmoid model parameters. The system may include an object, monitoring hardware configured to sample performance characteristics of the object, and a processor coupled to the monitoring hardware. The processor is configured to determine performance loss for the arbitrary aging condition from a comparison of the performance characteristics of the object deviating from baseline performance characteristics associated with a baseline aging condition.
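
    A small sketch of fitting "multiple sigmoid model parameters" to baseline aging data with SciPy, assuming the model is a sum of logistic terms in time; the functional form, seed values, and synthetic data are guesses at the patent's terminology, not its actual model.

        import numpy as np
        from scipy.optimize import curve_fit

        def multi_sigmoid(t, a1, k1, t1, a2, k2, t2):
            # sum of two logistic (sigmoid) terms in time
            s = lambda a, k, t0: a / (1.0 + np.exp(-k * (t - t0)))
            return s(a1, k1, t1) + s(a2, k2, t2)

        t = np.linspace(0, 100, 50)                        # weeks of aging
        loss = multi_sigmoid(t, 4, 0.1, 20, 6, 0.05, 70)   # synthetic loss, %
        loss += np.random.default_rng(1).normal(0, 0.2, t.size)

        p0 = [5, 0.1, 25, 5, 0.05, 60]                     # rough seeds
        params, _ = curve_fit(multi_sigmoid, t, loss, p0=p0, maxfev=20000)
        print(params)   # baseline multiple-sigmoid parameters for this object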

  3. Machine Learning Algorithms Utilizing Quantitative CT Features May Predict Eventual Onset of Bronchiolitis Obliterans Syndrome After Lung Transplantation.

    PubMed

    Barbosa, Eduardo J Mortani; Lanclus, Maarten; Vos, Wim; Van Holsbeke, Cedric; De Backer, William; De Backer, Jan; Lee, James

    2018-02-19

    Long-term survival after lung transplantation (LTx) is limited by bronchiolitis obliterans syndrome (BOS), defined as a sustained decline in forced expiratory volume in the first second (FEV 1 ) not explained by other causes. We assessed whether machine learning (ML) utilizing quantitative computed tomography (qCT) metrics can predict eventual development of BOS. Paired inspiratory-expiratory CT scans of 71 patients who underwent LTx were analyzed retrospectively (BOS [n = 41] versus non-BOS [n = 30]), using at least two different time points. The BOS cohort experienced a reduction in FEV 1 of >10% compared to baseline FEV 1 post LTx. Multifactor analysis correlated declining FEV 1 with qCT features linked to acute inflammation or BOS onset. Student t test and ML were applied on baseline qCT features to identify lung transplant patients at baseline that eventually developed BOS. The FEV 1 decline in the BOS cohort correlated with an increase in the lung volume (P = .027) and in the central airway volume at functional residual capacity (P = .018), not observed in non-BOS patients, whereas the non-BOS cohort experienced a decrease in the central airway volume at total lung capacity with declining FEV 1 (P = .039). Twenty-three baseline qCT parameters could significantly distinguish between non-BOS patients and eventual BOS developers (P < .05), whereas no pulmonary function testing parameters could. Using ML methods (support vector machine), we could identify BOS developers at baseline with an accuracy of 85%, using only three qCT parameters. ML utilizing qCT could discern distinct mechanisms driving FEV 1 decline in BOS and non-BOS LTx patients and predict eventual onset of BOS. This approach may become useful to optimize management of LTx patients. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
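
    A compact sketch of the baseline-qCT SVM classification step on synthetic features; the three qCT parameters actually selected by the study, and its 85% accuracy, are not reproduced here.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(71, 3))   # stand-ins for 3 baseline qCT features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.7, 71)) > 0  # BOS y/n

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        print(cross_val_score(clf, X, y, cv=5).mean())  # CV accuracy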

  4. Treatment of chronic telogen effluvium with oral minoxidil: A retrospective study.

    PubMed

    Perera, Eshini; Sinclair, Rodney

    2017-01-01

    Background: Chronic telogen effluvium (CTE) may be primary or secondary to various causes, including drug reaction, nutritional deficiency and female pattern hair loss (FPHL). Oral minoxidil stimulates hair growth, and topical minoxidil is used in the treatment of FPHL and male androgenetic alopecia. Oral minoxidil has not been used to treat CTE. This study aimed to assess the treatment of CTE with once daily oral minoxidil. Methods: Women with a diagnosis of CTE based on a >6 month history of increased telogen hair shedding, no visible mid frontal scalp hair loss (Sinclair stage 1) and no hair follicle miniaturization on scalp biopsy were treated with once daily oral minoxidil. Hair shedding scores (HSS) at baseline, 6 and 12 months were analysed using the Wilcoxon rank sum test for pair-wise comparisons. Results: Thirty-six women were treated with oral minoxidil (range, 0.25-2.5 mg) daily for 6 months. Mean age was 46.9 years (range 20-83), mean HSS at baseline was 5.64, and mean duration of diagnosis was 6.55 years (range 1-27). There was a reduction in mean HSS from baseline to 6 months of 1.7 (p<0.001) and from baseline to 12 months of 2.58 (p<0.001). Five women who described trichodynia at baseline noted improvement or resolution within 3 months. Mean change in blood pressure was -0.5 mmHg systolic and +2.1 mmHg diastolic. Two patients developed transient postural dizziness that resolved with continued treatment. One patient developed ankle oedema. Thirteen women developed facial hypertrichosis. For 6 patients this was mild and did not require treatment; 4 had waxing of their upper lip or forehead; 3 had laser hair removal. No patients developed any haematological abnormality. All 36 women completed 12 months of treatment. Conclusions: Once daily oral minoxidil appears to reduce hair shedding in CTE. Placebo controlled studies are recommended to further assess this response.

  5. Detection of QT prolongation using a novel ECG analysis algorithm applying intelligent automation: Prospective blinded evaluation using the Cardiac Safety Research Consortium ECG database

    PubMed Central

    Green, Cynthia L.; Kligfield, Paul; George, Samuel; Gussak, Ihor; Vajdic, Branislav; Sager, Philip; Krucoff, Mitchell W.

    2013-01-01

    Background The Cardiac Safety Research Consortium (CSRC) provides both “learning” and blinded “testing” digital ECG datasets from thorough QT (TQT) studies annotated for submission to the US Food and Drug Administration (FDA) to developers of ECG analysis technologies. This manuscript reports the first results from a blinded “testing” dataset that examines Developer re-analysis of original Sponsor-reported core laboratory data. Methods 11,925 anonymized ECGs including both moxifloxacin and placebo arms of a parallel-group TQT in 191 subjects were blindly analyzed using a novel ECG analysis algorithm applying intelligent automation. Developer measured ECG intervals were submitted to CSRC for unblinding, temporal reconstruction of the TQT exposures, and statistical comparison to core laboratory findings previously submitted to FDA by the pharmaceutical sponsor. Primary comparisons included baseline-adjusted interval measurements, baseline- and placebo-adjusted moxifloxacin QTcF changes (ddQTcF), and associated variability measures. Results Developer and Sponsor-reported baseline-adjusted data were similar with average differences less than 1 millisecond (ms) for all intervals. Both Developer and Sponsor-reported data demonstrated assay sensitivity with similar ddQTcF changes. Average within-subject standard deviation for triplicate QTcF measurements was significantly lower for Developer than Sponsor-reported data (5.4 ms and 7.2 ms, respectively; p<0.001). Conclusion The virtually automated ECG algorithm used for this analysis produced similar yet less variable TQT results compared to the Sponsor-reported study, without the use of a manual core laboratory. These findings indicate CSRC ECG datasets can be useful for evaluating novel methods and algorithms for determining QT/QTc prolongation by drugs. While the results should not constitute endorsement of specific algorithms by either CSRC or FDA, the value of a public domain digital ECG warehouse to provide prospective, blinded comparisons of ECG technologies applied for QT/QTc measurement is illustrated. PMID:22424006

  6. Spacecraft drag-free technology development: On-board estimation and control synthesis

    NASA Technical Reports Server (NTRS)

    Key, R. W.; Mettler, E.; Milman, M. H.; Schaechter, D. B.

    1982-01-01

    Estimation and control methods for a Drag-Free spacecraft are discussed. The functional and analytical synthesis of on-board estimators and controllers for an integrated attitude and translation control system is presented. A framework for the detailed definition and design of the baseline drag-free system is created. The techniques for solving the self-gravity and electrostatic charging problems are generally applicable, as is the control system development.

  7. Monitoring ship noise to assess the impact of coastal developments on marine mammals.

    PubMed

    Merchant, Nathan D; Pirotta, Enrico; Barton, Tim R; Thompson, Paul M

    2014-01-15

    The potential impacts of underwater noise on marine mammals are widely recognised, but uncertainty over variability in baseline noise levels often constrains efforts to manage these impacts. This paper characterises natural and anthropogenic contributors to underwater noise at two sites in the Moray Firth Special Area of Conservation, an important marine mammal habitat that may be exposed to increased shipping activity from proposed offshore energy developments. We aimed to establish a pre-development baseline, and to develop ship noise monitoring methods using Automatic Identification System (AIS) and time-lapse video to record trends in noise levels and shipping activity. Our results detail the noise levels currently experienced by a locally protected bottlenose dolphin population, explore the relationship between broadband sound exposure levels and the indicators proposed in response to the EU Marine Strategy Framework Directive, and provide a ship noise assessment toolkit which can be applied in other coastal marine environments. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Improving thermal dose accuracy in magnetic resonance-guided focused ultrasound surgery: Long-term thermometry using a prior baseline as a reference.

    PubMed

    Bitton, Rachel R; Webb, Taylor D; Pauly, Kim Butts; Ghanouni, Pejman

    2016-01-01

    To investigate thermal dose volume (TDV) and non-perfused volume (NPV) of magnetic resonance-guided focused ultrasound (MRgFUS) treatments in patients with soft tissue tumors, and to describe a method for MR thermal dosimetry using a prior baseline as reference. Agreement between TDV and immediate post-treatment NPV was evaluated from MRgFUS treatments of five patients with biopsy-proven desmoid tumors. Thermometry data (gradient echo, 3T) were analyzed over the entire course of the treatments to discern temperature errors in the standard approach. The technique searches previously acquired baseline images for a match using 2D normalized cross-correlation and a weighted mean of phase difference images. Thermal dose maps and TDVs were recalculated using the matched baseline and compared to NPV. TDV and NPV disagreed by 47%-91% when the standard immediate-baseline method was used to calculate TDV. Long-term thermometry showed a nonlinear local temperature accrual, where the peak additional temperature varied between 4 and 13°C (mean = 7.8°C) across patients. The prior baseline method could be implemented by finding a previously acquired matching baseline 61% ± 8% (mean ± SD) of the time. We found that 7%-42% of the disagreement between TDV and NPV was due to errors in thermometry caused by heat accrual. For all patients, the prior baseline method increased the estimated treatment volume and reduced the discrepancies between TDV and NPV (P = 0.023). This study presents a mismatch between in-treatment and post-treatment efficacy measures. The prior baseline approach accounts for local heating and improves the accuracy of the thermal dose-predicted volume. © 2015 Wiley Periodicals, Inc.
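
    A minimal sketch of the baseline-matching step, scoring a current magnitude image against previously acquired baselines with 2D normalized cross-correlation and accepting the best match above a threshold; the threshold and image sizes are invented, and the paper's weighted mean of phase-difference images is omitted.

        import numpy as np

        def ncc(a, b):
            # 2D normalized cross-correlation at zero lag
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float((a * b).mean())

        def best_baseline(current, library, threshold=0.95):
            scores = [ncc(current, ref) for ref in library]
            i = int(np.argmax(scores))
            return (i, scores[i]) if scores[i] >= threshold else (None, scores[i])

        rng = np.random.default_rng(2)
        library = [rng.normal(size=(64, 64)) for _ in range(10)]
        current = library[3] + rng.normal(0, 0.05, (64, 64))  # near-match to #3
        print(best_baseline(current, library))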

  9. Monitoring environmental burden reduction from household waste prevention.

    PubMed

    Matsuda, Takeshi; Hirai, Yasuhiro; Asari, Misuzu; Yano, Junya; Miura, Takahiro; Ii, Ryota; Sakai, Shin-Ichi

    2018-01-01

    In this study, the amount of prevented household waste in Kyoto city was quantified using three methods. Subsequently, the greenhouse gas (GHG) emission reduction by waste prevention was calculated in order to monitor the impact of waste prevention. The methods of quantification were "relative change from baseline year (a)," "absolute change from potential waste generation (b)," and "absolute amount of activities (c)." Method (a) was popular for measuring waste prevention, but method (b) was the original approach to determine the absolute amount of waste prevention by estimating the potential waste generation. Method (c) also provided the absolute value utilizing the information of activities. Methods (b) and (c) enable the evaluation of the waste prevention activities with a similar baseline for recycling. Methods (b) and (c) gave significantly higher GHG reductions than method (a) because of the difference in baseline between them. Therefore, setting a baseline is very important for evaluating waste prevention. In practice, when focusing on the monitoring of a specific policy or campaign, method (a) is an appropriate option. On the other hand, when comparing the total impact of waste prevention to that of recycling, methods (b) and (c) should be applied. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Weight-Control Methods, 3-Year Weight Change, and Eating Behaviors: A Prospective Nationwide Study of Middle-Aged New Zealand Women.

    PubMed

    Leong, Sook Ling; Gray, Andrew; Haszard, Jillian; Horwath, Caroline

    2016-08-01

    The effectiveness of women's weight-control methods and the influences of dieting on eating behaviors remain unclear. Our aim was to determine the association of various weight-control methods at baseline with weight change to 3 years, and examine the association between baseline weight-control status (trying to lose weight, trying to prevent weight gain or no weight-control attempts) and changes in intuitive eating and binge eating at 3 years. A nationally representative sample of 1,601 New Zealand women (40 to 50 years) was recruited and completed a self-administered questionnaire at baseline regarding use of variety of weight-control methods. Information on demographic characteristics, weight, height, food habits, binge eating, and intuitive eating were collected at baseline and 3 years. Linear and logistic regression models examined associations between both weight status and weight-control methods at baseline and weight change to 3 years; and baseline weight-control status and change in intuitive eating from baseline to 3 years and binge eating at 3 years. χ(2) tests were used to cross-sectionally compare food habits across the weight status categories at both baseline and 3 years. Trying to lose weight and the use of weight-control methods at baseline were not associated with change in body weight to 3 years. There were a few differences in the frequency of consumption of high-energy-density foods between those trying to lose or maintain weight and those not attempting weight control. Trying to lose weight at baseline was associated with a 2.0-unit (95% CI 0.7 to 3.4, P=0.003) reduction in intuitive eating scores by 3 years (potential range=21 to 105), and 224% (odds ratio=3.24; 95% CI 1.69 to 6.20; P<0.001) higher odds of binge eating at 3 years. The apparent ineffectiveness of dieting and weight-control behaviors may reflect misconceptions about what constitutes healthy eating or energy-dense foods. Dieting may reduce women's ability to recognize hunger and satiety cues and place women at increased risk of binge eating. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  11. A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS

    NASA Technical Reports Server (NTRS)

    Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.

    1989-01-01

    In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes were found that would be enhanced by knowledge-based system methods of implementation. The processes most suitable for prototyping using rule-based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule-based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts, and knowledge-based system requirements. A quick-prototype knowledge-based system environment is researched and developed.

  12. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of its individual baseline to EPA (Fuel Studies and Standards Branch, Baseline Submission, U.S. EPA... Studies and Standards Branch, Baseline Petition, U.S. EPA, 2565 Plymouth Road, Ann Arbor, Michigan 48105..., used in the determination of a given fuel parameter; (iii) Identification of test method. If not per...

  13. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of its individual baseline to EPA (Fuel Studies and Standards Branch, Baseline Submission, U.S. EPA... Studies and Standards Branch, Baseline Petition, U.S. EPA, 2565 Plymouth Road, Ann Arbor, Michigan 48105..., used in the determination of a given fuel parameter; (iii) Identification of test method. If not per...

  14. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of its individual baseline to EPA (Fuel Studies and Standards Branch, Baseline Submission, U.S. EPA... Studies and Standards Branch, Baseline Petition, U.S. EPA, 2565 Plymouth Road, Ann Arbor, Michigan 48105..., used in the determination of a given fuel parameter; (iii) Identification of test method. If not per...

  15. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of its individual baseline to EPA (Fuel Studies and Standards Branch, Baseline Submission, U.S. EPA... Studies and Standards Branch, Baseline Petition, U.S. EPA, 2565 Plymouth Road, Ann Arbor, Michigan 48105..., used in the determination of a given fuel parameter; (iii) Identification of test method. If not per...

  16. Agricultural Baseline (BL0) scenario

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinckel, Chad M [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used.
    Date the data set was last modified: 02/12/2016.
    How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109).
    Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC.
    The quality assurance and quality control that have been applied:
    • Check for negative planted area, harvested area, production, yield, and cost values.
    • Check if harvested area exceeds planted area for annuals.
    • Check FIPS codes.
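
    A small pandas sketch of the three QA/QC checks listed above, run on toy rows with hypothetical column names; the real .txt layout and the exact FIPS validation rule are not reproduced here.

        import pandas as pd

        # toy rows standing in for the .txt output (FIPS code plus crop data)
        df = pd.DataFrame({
            "fips": [1001, 1003, 99999],
            "planted_area": [120.0, 80.0, -5.0],
            "harvested_area": [118.0, 95.0, 10.0],
            "production": [500.0, 300.0, 40.0],
            "yield": [4.2, 3.8, 4.0],
            "cost": [200.0, 150.0, 60.0],
        })

        numeric = ["planted_area", "harvested_area", "production", "yield", "cost"]
        neg = df[(df[numeric] < 0).any(axis=1)]               # negative values
        over = df[df["harvested_area"] > df["planted_area"]]  # annuals check
        bad_fips = df[~df["fips"].between(1001, 56045)]       # rough FIPS bounds
        print(len(neg), len(over), len(bad_fips))             # rows flagged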

  17. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    NASA Astrophysics Data System (ADS)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
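
    For reference, a plain (unmodified) Whittaker smoother, the baseline estimator named above, can be written in a few lines; the penalty weight and the test signal are illustrative, and the paper's modifications, zero-region detection, and simultaneous phase term are not included.

        import numpy as np
        from scipy import sparse
        from scipy.sparse.linalg import spsolve

        def whittaker(y, lam=1e5):
            # minimize ||y - z||^2 + lam * ||D2 z||^2 (D2 = 2nd differences)
            n = y.size
            D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))
            A = sparse.eye(n) + lam * (D.T @ D)
            return spsolve(A.tocsc(), y)

        x = np.linspace(0, 10, 500)
        spectrum = np.exp(-((x - 5) ** 2) / 0.01) + 0.1 * x  # peak + drift
        print(whittaker(spectrum)[:5])  # heavily smoothed trend estimate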

  18. Time-Domain Evaluation of Fractional Order Controllers’ Direct Discretization Methods

    NASA Astrophysics Data System (ADS)

    Ma, Chengbin; Hori, Yoichi

    Fractional Order Control (FOC), in which the controlled systems and/or controllers are described by fractional order differential equations, has been applied to various control problems. Though FOC's theoretical superiority is not difficult to appreciate, its realization remains somewhat problematic. Since fractional order systems are infinite-dimensional, a proper approximation by a finite difference equation is needed to realize the designed fractional order controllers. In this paper, the existing direct discretization methods are evaluated by their convergence and by time-domain comparison with the baseline case. The proposed sampling time scaling property is used to calculate the baseline case with full memory length. This novel discretization method is based on the classical trapezoidal rule but with scaled sampling time. Comparative studies show that good performance and a simple algorithm make the Short Memory Principle method the most practical choice. FOC research is still at an early stage, but its applications in modeling and its robustness against non-linearities reveal promising aspects. In parallel with the development of FOC theories, applying FOC to various control problems is also crucially important and a top-priority issue.
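
    A minimal sketch of a Short Memory Principle discretization, truncating the Grünwald-Letnikov expansion of a fractional derivative to a fixed memory length L; the order, step size, and memory length are illustrative values.

        import numpy as np

        def gl_fractional_diff(x, alpha, dt, L=200):
            # binomial weights w_j = (-1)^j * C(alpha, j), built recursively
            w = np.empty(L)
            w[0] = 1.0
            for j in range(1, L):
                w[j] = w[j - 1] * (j - 1 - alpha) / j
            y = np.zeros_like(x)
            for k in range(x.size):
                m = min(k + 1, L)                 # truncated memory window
                y[k] = np.dot(w[:m], x[k::-1][:m]) / dt ** alpha
            return y

        t = np.linspace(0, 2 * np.pi, 400)
        # half-order derivative of sin(t), keeping only the last L samples
        print(gl_fractional_diff(np.sin(t), alpha=0.5, dt=t[1] - t[0])[:3])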

  19. Flexible, multi-measurement guided wave damage detection under varying temperatures

    NASA Astrophysics Data System (ADS)

    Douglass, Alexander C. S.; Harley, Joel B.

    2018-04-01

    Temperature compensation in structural health monitoring helps identify damage in a structure by removing data variations due to environmental conditions, such as temperature. Stretch-based methods are among the most commonly used temperature compensation methods. To account for variations in temperature, stretch-based methods stretch signals in time to optimally match a measurement to a baseline. All of the data are then compared with the single baseline to determine the presence of damage. Yet, for these methods to be effective, the measurement and the baseline must satisfy the inherent assumptions of the temperature compensation method. In many scenarios, these assumptions are wrong, the methods generate error, and damage detection fails. To improve damage detection, a multi-measurement damage detection method is introduced. By using each measurement in the dataset as a baseline, error caused by imperfect temperature compensation is reduced. The multi-measurement method increases the detection effectiveness of our damage metric, or damage indicator, over time and reduces the presence of additional peaks caused by temperature that could be mistaken for damage. By using many baselines, the variance of the damage indicator is reduced and the effects from damage are amplified. Notably, the multi-measurement method improves damage detection over single-measurement methods. This is demonstrated through an increase in the maximum of our damage signature from 0.55 to 0.95 (where large values, up to a maximum of one, represent a statistically significant change in the data due to damage).
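
    A toy sketch of the stretch-based step: resample a measurement at a range of stretch factors and keep the factor whose stretched signal best correlates with the baseline. The stretch grid and the correlation criterion are simplifying assumptions, not the paper's exact formulation.

        import numpy as np

        def best_stretch(measurement, baseline,
                         factors=np.linspace(0.98, 1.02, 81)):
            n = measurement.size
            t = np.arange(n)
            best = (None, -np.inf)
            for a in factors:
                # evaluate measurement(t/a), i.e. a time-stretch by factor a
                stretched = np.interp(t, a * t, measurement)
                r = np.corrcoef(stretched, baseline)[0, 1]  # match quality
                if r > best[1]:
                    best = (a, r)
            return best

        rng = np.random.default_rng(3)
        base = rng.normal(size=2000)
        meas = np.interp(np.arange(2000), 1.004 * np.arange(2000), base)
        print(best_stretch(meas, base))  # recovers a ~ 1/1.004 = 0.996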

  20. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis.

    PubMed

    Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L

    2017-07-01

    To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.

  1. Reproducibility of Abdominal Aortic Aneurysm Diameter Measurement and Growth Evaluation on Axial and Multiplanar Computed Tomography Reformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dugas, Alexandre; Therasse, Eric; Kauffmann, Claude

    2012-08-15

    Purpose: To compare different methods measuring abdominal aortic aneurysm (AAA) maximal diameter (Dmax) and its progression on multidetector computed tomography (MDCT) scan. Materials and Methods: Forty AAA patients with two MDCT scans acquired at different times (baseline and follow-up) were included. Three observers measured AAA diameters by seven different methods: on axial images (anteroposterior, transverse, maximal, and short-axis views) and on multiplanar reformation (MPR) images (coronal, sagittal, and orthogonal views). Diameter measurement and progression were compared over time for the seven methods. Reproducibility of measurement methods was assessed by intraclass correlation coefficient (ICC) and Bland-Altman analysis. Results: Dmax, as measured on axial slices at baseline and follow-up (FU) MDCTs, was greater than that measured using the orthogonal method (p = 0.046 for baseline and 0.028 for FU), whereas Dmax measured with the orthogonal method was greater than that from all other measurement methods (p-value range: <0.0001-0.03) except the anteroposterior diameter (p = 0.18 baseline and 0.10 FU). The greatest interobserver ICCs were obtained for the orthogonal and transverse methods (0.972) at baseline and for the orthogonal and sagittal MPR images at FU (0.973 and 0.977). The interobserver ICC of the orthogonal method to document AAA progression was greater (ICC = 0.833) than for measurements taken on axial images (ICC = 0.662-0.780) and single-plane MPR images (0.772-0.817). Conclusion: AAA Dmax measured on MDCT axial slices overestimates aneurysm size. Diameter as measured by the orthogonal method is more reproducible, especially to document AAA progression.

  2. Higher serum bilirubin level as a protective factor for the development of diabetes in healthy Korean men: a 4 year retrospective longitudinal study.

    PubMed

    Jung, Chang Hee; Lee, Min Jung; Kang, Yu Mi; Hwang, Jenie Yoonoo; Jang, Jung Eun; Leem, Jaechan; Park, Joong-Yeol; Kim, Hong-Kyu; Lee, Woo Je

    2014-01-01

    Bilirubin, a natural product of heme catabolism by heme oxygenase, one of the key antioxidant enzymes, has been recognized as a substance with potent antioxidant and cytoprotective properties. Several studies have shown a significant negative relationship between serum bilirubin levels and the risk of metabolic disorders, including type 2 diabetes. However, longitudinal studies investigating the association of elevated serum bilirubin levels and type 2 diabetes are lacking. In the present study, we aimed to investigate the longitudinal effects of baseline serum bilirubin concentrations on the development of type 2 diabetes in healthy Korean men. This 4 year retrospective longitudinal observational study was conducted at the Asan Medical Center, Seoul, Republic of Korea. The study population consisted of 5960 men without type 2 diabetes who underwent routine health examinations in 2007 (baseline) and 2011 (follow-up). Baseline serum bilirubin concentrations were determined by the vanadate oxidation method. During the 4 year period, 409 incident cases of diabetes (6.9%) were identified. Incident type 2 diabetes decreased across the baseline bilirubin quartile categories (P for trend <0.001). In the multivariable-adjusted model, the relative risk (RR) for the development of type 2 diabetes was significantly lower in the highest (i.e., 1.30-2.00 mg/dl) than in the lowest bilirubin quartile category (i.e., ≤ 0.90 mg/dl), even after adjustment for confounding variables (RR=0.69, 95% confidence interval 0.48-0.99, P for trend = 0.041). The results indicate that serum total bilirubin level may provide additional information for predicting future development of type 2 diabetes in healthy subjects. © 2013.

  3. U.S. Geological Survey experience with the residual absolutes method

    USGS Publications Warehouse

    Worthington, E. William; Matzka, Jurgen

    2017-01-01

    The U.S. Geological Survey (USGS) Geomagnetism Program has developed and tested the residual method of absolutes, with the assistance of the Danish Technical University's (DTU) Geomagnetism Program. Three years of testing were performed at College Magnetic Observatory (CMO), Fairbanks, Alaska, to compare the residual method with the null method. Results show that the two methods compare very well with each other and both sets of baseline data were used to process the 2015 definitive data. The residual method will be implemented at the other USGS high-latitude geomagnetic observatories in the summer of 2017 and 2018.

  4. Contrast induced nephropathy in hypertensive patients after elective percutaneous coronary intervention

    NASA Astrophysics Data System (ADS)

    Aryfa Andra, Cut; Khairul, Andi; Aria Arina, Cut; Mukhtar, Zulfikri; Nyak Kaoy, Isfanuddin

    2018-03-01

    Contrast induced nephropathy (CIN) is the third leading cause of hospital-acquired renal failure and is associated with significant morbidity and mortality. We hypothesized that hypertension is an independent risk factor for the development of CIN in patients undergoing elective percutaneous coronary intervention (PCI). The case-control method was used, with 138 patients scheduled for elective PCI. We measured serum creatinine at baseline and 24 hours after the procedure. CIN was defined as a rise in serum creatinine of at least 44 μmol/l (0.5 mg/dl) or a 25% rise from baseline. All patients received low-osmolality nonionic contrast during PCI. Hypertension was defined as a self-reported history of treated or untreated diagnosed high blood pressure. One hundred three patients (74.6%) were male, and 35 patients (25.4%) were female. Among the 138 patients, 86 (62.3%) were hypertensive whereas 52 (37.7%) were nonhypertensive. There was no difference in baseline serum creatinine levels or the amount of contrast media between patients with and without CIN. CIN developed in 42 patients, of whom 39 (92.9%) were hypertensive compared to 3 (7.1%) without hypertension, with p < 0.05 (odds ratio 16.8, 95% CI 4.542-62.412). This study showed that hypertension was a risk factor for the development of CIN.

  5. The WISTAH hand study: A prospective cohort study of distal upper extremity musculoskeletal disorders

    PubMed Central

    2012-01-01

    Background Few prospective cohort studies of distal upper extremity musculoskeletal disorders have been performed. Past studies have provided somewhat conflicting evidence for occupational risk factors and have largely reported data without adjustments for many personal and psychosocial factors. Methods/design A multi-center prospective cohort study was initiated to quantify risk factors for distal upper extremity musculoskeletal disorders and potentially develop improved methods for analyzing jobs. Disorders to analyze included carpal tunnel syndrome, lateral epicondylalgia, medial epicondylalgia, trigger digit, deQuervain’s stenosing tenosynovitis and other tendinoses. Workers have thus far been enrolled from 17 different employment settings in 3 diverse US states and performed widely varying work. At baseline, workers undergo laptop-administered questionnaires, structured interviews, two standardized physical examinations and nerve conduction studies to ascertain demographics, medical history, psychosocial factors and current musculoskeletal disorders. All workers’ jobs are individually measured for physical factors and are videotaped. Workers are followed monthly for the development of musculoskeletal disorders. Repeat nerve conduction studies are performed for those with symptoms of tingling and numbness in the prior six months. Changes in jobs necessitate re-measuring and re-videotaping of job physical factors. Case definitions have been established. Point prevalence of carpal tunnel syndrome is a combination of paraesthesias in at least two median nerve-served digits plus an abnormal nerve conduction study at baseline. The lifetime cumulative incidence of carpal tunnel syndrome will also include those with a past history of carpal tunnel syndrome. Incident cases will exclude those with either a past history or prevalent cases at baseline. Statistical methods planned include survival analyses and logistic regression. Discussion A prospective cohort study of distal upper extremity musculoskeletal disorders is underway and has successfully enrolled over 1,000 workers to date. PMID:22672216

  6. Ensembles of novelty detection classifiers for structural health monitoring using guided waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dib, Gerges; Karpenko, Oleksii; Koricho, Ermias

    Guided wave structural health monitoring uses sparse sensor networks embedded in sophisticated structures for defect detection and characterization. The biggest challenge for those sensor networks is developing robust techniques for reliable damage detection under changing environmental and operating conditions. To address this challenge, we develop a novelty classifier for damage detection based on one-class support vector machines. We identify appropriate features for damage detection and introduce a feature aggregation method which quadratically increases the number of available training observations. We adopt a two-level voting scheme by using an ensemble of classifiers and predictions. Each classifier is trained on a different segment of the guided wave signal, and each classifier makes an ensemble of predictions based on a single observation. Using this approach, the classifier can be trained using a small number of baseline signals. We study the performance using Monte Carlo simulations of an analytical model and data from impact damage experiments on a glass fiber composite plate. We also demonstrate the classifier performance using two types of baseline signals: a fixed and a rolling baseline training set. The former requires prior knowledge of baseline signals from all environmental and operating conditions, while the latter does not and leverages the fact that environmental and operating conditions vary slowly over time and can be modeled as a Gaussian process.
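
    A compact sketch of the two-level voting idea: train one one-class SVM per signal segment on baseline data, then let the per-segment votes on a new signal flag damage. Segment counts, feature sizes, and SVM settings are invented, and the paper's feature aggregation step is omitted.

        import numpy as np
        from sklearn.svm import OneClassSVM

        rng = np.random.default_rng(4)
        segments = 8
        baselines = rng.normal(size=(30, segments, 16))  # 30 baseline signals

        # level 1: one novelty classifier per signal segment
        clfs = [OneClassSVM(nu=0.05, gamma="scale").fit(baselines[:, s, :])
                for s in range(segments)]

        def damage_vote(signal):
            # level 2: fraction of segment classifiers flagging an outlier
            votes = [clf.predict(signal[s][None, :])[0]
                     for s, clf in enumerate(clfs)]
            return np.mean(np.array(votes) == -1)

        healthy = rng.normal(size=(segments, 16))
        damaged = healthy + 0.8            # synthetic damage-induced shift
        print(damage_vote(healthy), damage_vote(damaged))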

  7. Clinical and Vitamin Response to a Short-Term Multi-Micronutrient Intervention in Brazilian Children and Teens: From Population Data to Interindividual Responses.

    PubMed

    Mathias, Mariana Giaretta; Coelho-Landell, Carolina de Almeida; Scott-Boyer, Marie-Pier; Lacroix, Sébastien; Morine, Melissa J; Salomão, Roberta Garcia; Toffano, Roseli Borges Donegá; Almada, Maria Olímpia Ribeiro do Vale; Camarneiro, Joyce Moraes; Hillesheim, Elaine; de Barros, Tamiris Trevisan; Camelo-Junior, José Simon; Campos Giménez, Esther; Redeuil, Karine; Goyon, Alexandre; Bertschy, Emmanuelle; Lévêques, Antoine; Oberson, Jean-Marie; Giménez, Catherine; Carayol, Jerome; Kussmann, Martin; Descombes, Patrick; Métairon, Slyviane; Draper, Colleen Fogarty; Conus, Nelly; Mottaz, Sara Colombo; Corsini, Giovanna Zambianchi; Myoshi, Stephanie Kazu Brandão; Muniz, Mariana Mendes; Hernandes, Lívia Cristina; Venâncio, Vinícius Paula; Antunes, Lusania Maria Greggi; da Silva, Rosana Queiroz; Laurito, Taís Fontellas; Rossi, Isabela Ribeiro; Ricci, Raquel; Jorge, Jéssica Ré; Fagá, Mayara Leite; Quinhoneiro, Driele Cristina Gomes; Reche, Mariana Chinarelli; Silva, Paula Vitória Sozza; Falquetti, Letícia Lima; da Cunha, Thaís Helena Alves; Deminice, Thalia Manfrin Martins; Tambellini, Tâmara Hambúrguer; de Souza, Gabriela Cristina Arces; de Oliveira, Mariana Moraes; Nogueira-Pileggi, Vicky; Matsumoto, Marina Takemoto; Priami, Corrado; Kaput, Jim; Monteiro, Jacqueline Pontes

    2018-03-01

    Micronutrients are present in small amounts in foods, act in concert, and require variable amounts of time to produce changes in health and risk for disease. These first principles are incorporated into an intervention study designed to develop new experimental strategies for setting target recommendations for food bioactives for populations and individuals. A 6-week multivitamin/mineral intervention is conducted in 9-13 year olds. Participants (136) are (i) their own control (n-of-1); (ii) monitored for compliance; (iii) measured for 36 circulating vitamin forms, 30 clinical, anthropometric, and food intake parameters at baseline, post intervention, and following a 6-week washout; and (iv) had their ancestry accounted for as a modifier of vitamin baseline levels or response. The same intervention is repeated the following year (135 participants). Most vitamins respond positively to the intervention, and many clinical parameters change in directions consistent with improved metabolic health. The baseline level of any metabolite predicts its own response to the intervention. Elastic net penalized regression models are identified, and significantly predict response to intervention on the basis of multiple vitamin/clinical baseline measures. The study design, computational methods, and results are a step toward developing recommendations for optimizing vitamin levels and health parameters for individuals. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
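
    A brief sketch of an elastic net predicting intervention response from many baseline measures, as in the modeling step above, on synthetic data with a sparse true signal; the dimensions only loosely mirror the 36 vitamin plus 30 clinical baseline measures.

        import numpy as np
        from sklearn.linear_model import ElasticNetCV

        rng = np.random.default_rng(5)
        X = rng.normal(size=(136, 66))    # stand-in baseline measures
        beta = np.zeros(66)
        beta[:5] = [1.2, -0.8, 0.5, 0.4, -0.3]   # few true predictors
        y = X @ beta + rng.normal(0, 0.5, 136)   # response to intervention

        model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5).fit(X, y)
        print(model.l1_ratio_, np.sum(model.coef_ != 0))  # sparse selection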

  8. Genetic Dissociation of Daily Sleep and Sleep Following Thermogenetic Sleep Deprivation in Drosophila

    PubMed Central

    Dubowy, Christine; Moravcevic, Katarina; Yue, Zhifeng; Wan, Joy Y.; Van Dongen, Hans P.A.; Sehgal, Amita

    2016-01-01

    Study Objectives: Sleep rebound—the increase in sleep that follows sleep deprivation—is a hallmark of homeostatic sleep regulation that is conserved across the animal kingdom. However, both the mechanisms that underlie sleep rebound and its relationship to habitual daily sleep remain unclear. To address this, we developed an efficient thermogenetic method of inducing sleep deprivation in Drosophila that produces a substantial rebound, and applied the newly developed method to assess sleep rebound in a screen of 1,741 mutated lines. We used data generated by this screen to identify lines with reduced sleep rebound following thermogenetic sleep deprivation, and to probe the relationship between habitual sleep amount and sleep following thermogenetic sleep deprivation in Drosophila. Methods: To develop a thermogenetic method of sleep deprivation suitable for screening, we thermogenetically stimulated different populations of wake-promoting neurons labeled by Gal4 drivers. Sleep rebound following thermogenetically-induced wakefulness varies across the different sets of wake-promoting neurons that were stimulated, from very little to quite substantial. Thermogenetic activation of neurons marked by the c584-Gal4 driver produces both strong sleep loss and a substantial rebound that is more consistent within genotypes than rebound following mechanical or caffeine-induced sleep deprivation. We therefore used this driver to induce sleep deprivation in a screen of 1,741 mutagenized lines generated by the Drosophila Gene Disruption Project. Flies were subjected to 9 h of sleep deprivation during the dark period and released from sleep deprivation 3 h before lights-on. Recovery was measured over the 15 h following sleep deprivation. Following identification of lines with reduced sleep rebound, we characterized baseline sleep and sleep depth before and after sleep deprivation for these hits. Results: We identified two lines that consistently exhibit a blunted increase in the duration and depth of sleep after thermogenetic sleep deprivation. Neither of the two genotypes has reduced total baseline sleep. Statistical analysis across all screened lines shows that genotype is a strong predictor of recovery sleep, independent from effects of genotype on baseline sleep. Conclusions: Our data show that rebound sleep following thermogenetic sleep deprivation can be genetically separated from sleep at baseline. This suggests that genetically controlled mechanisms of sleep regulation not manifest under undisturbed conditions contribute to sleep rebound following thermogenetic sleep deprivation. Citation: Dubowy C, Moravcevic K, Yue Z, Wan JY, Van Dongen HP, Sehgal A. Genetic dissociation of daily sleep and sleep following thermogenetic sleep deprivation in Drosophila. SLEEP 2016;39(5):1083–1095. PMID:26951392

  9. Genetic Dissociation of Daily Sleep and Sleep Following Thermogenetic Sleep Deprivation in Drosophila.

    PubMed

    Dubowy, Christine; Moravcevic, Katarina; Yue, Zhifeng; Wan, Joy Y; Van Dongen, Hans P A; Sehgal, Amita

    2016-05-01

    Sleep rebound-the increase in sleep that follows sleep deprivation-is a hallmark of homeostatic sleep regulation that is conserved across the animal kingdom. However, both the mechanisms that underlie sleep rebound and its relationship to habitual daily sleep remain unclear. To address this, we developed an efficient thermogenetic method of inducing sleep deprivation in Drosophila that produces a substantial rebound, and applied the newly developed method to assess sleep rebound in a screen of 1,741 mutated lines. We used data generated by this screen to identify lines with reduced sleep rebound following thermogenetic sleep deprivation, and to probe the relationship between habitual sleep amount and sleep following thermogenetic sleep deprivation in Drosophila. To develop a thermogenetic method of sleep deprivation suitable for screening, we thermogenetically stimulated different populations of wake-promoting neurons labeled by Gal4 drivers. Sleep rebound following thermogenetically-induced wakefulness varies across the different sets of wake-promoting neurons that were stimulated, from very little to quite substantial. Thermogenetic activation of neurons marked by the c584-Gal4 driver produces both strong sleep loss and a substantial rebound that is more consistent within genotypes than rebound following mechanical or caffeine-induced sleep deprivation. We therefore used this driver to induce sleep deprivation in a screen of 1,741 mutagenized lines generated by the Drosophila Gene Disruption Project. Flies were subjected to 9 h of sleep deprivation during the dark period and released from sleep deprivation 3 h before lights-on. Recovery was measured over the 15 h following sleep deprivation. Following identification of lines with reduced sleep rebound, we characterized baseline sleep and sleep depth before and after sleep deprivation for these hits. We identified two lines that consistently exhibit a blunted increase in the duration and depth of sleep after thermogenetic sleep deprivation. Neither of the two genotypes has reduced total baseline sleep. Statistical analysis across all screened lines shows that genotype is a strong predictor of recovery sleep, independent from effects of genotype on baseline sleep. Our data show that rebound sleep following thermogenetic sleep deprivation can be genetically separated from sleep at baseline. This suggests that genetically controlled mechanisms of sleep regulation not manifest under undisturbed conditions contribute to sleep rebound following thermogenetic sleep deprivation. © 2016 Associated Professional Sleep Societies, LLC.

  10. Low gravity synthesis of polymers with controlled molecular configuration

    NASA Technical Reports Server (NTRS)

    Heimbuch, A. H.; Parker, J. A.; Schindler, A.; Olf, H. G.

    1975-01-01

    Heterogeneous chemical systems have been studied for the synthesis of isotactic polypropylene in order to establish baseline parameters for the reaction process and to develop sensitive and accurate methods of analysis. These parameters and analytical methods may be used to make a comparison between the polypropylene obtained at one g with that of zero g (gravity). Baseline reaction parameters have been established for the slurry (liquid monomer in heptane/solid catalyst) polymerization of propylene to yield high purity, 98% isotactic polypropylene. Kinetic data for the slurry reaction showed that a sufficient quantity of polymer for complete characterization can be produced in a reaction time of 5 min; this time is compatible with that available on a sounding rocket for a zero-g simulation experiment. The preformed (activated) catalyst was found to be more reproducible in its activity than the in situ formed catalyst.

  11. [A correction method of baseline drift of discrete spectrum of NIR].

    PubMed

    Hu, Ai-Qin; Yuan, Hong-Fu; Song, Chun-Feng; Li, Xiao-Yu

    2014-10-01

    In the present paper, a new method for correcting baseline drift in discrete spectra is proposed, combining cubic spline interpolation and the first-order derivative. A fitting spectrum is constructed by cubic spline interpolation, using the data points of the discrete spectrum as interpolation nodes. The fitting spectrum is differentiable. The first-order derivative is applied to the fitting spectrum to calculate the derivative spectrum. Values at the same wavelengths as the original discrete spectrum are then taken from the derivative spectrum to constitute the first-derivative spectrum of the discrete spectrum, thereby correcting the baseline drift of the discrete spectrum. The effects of the new method were demonstrated by comparing the performance of multivariate models built using the original spectra, directly differentiated spectra, and spectra pretreated by the new method. The results show that negative effects on multivariate model performance caused by baseline drift of discrete spectra can be effectively eliminated by the new method.
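
    A short SciPy sketch of the method as described: fit a cubic spline through the discrete points, differentiate the fitted spectrum, and resample at the original wavelengths. The synthetic spectrum (one peak plus a linear drift) is illustrative only.

        import numpy as np
        from scipy.interpolate import CubicSpline

        wl = np.linspace(1100, 2500, 120)    # discrete NIR wavelengths, nm
        spec = np.exp(-((wl - 1700) / 60) ** 2) + 1e-4 * wl  # peak + drift

        fit = CubicSpline(wl, spec)   # differentiable fitting spectrum
        deriv = fit(wl, 1)            # 1st derivative at original wavelengths
        print(deriv[:3])              # linear drift collapses to a constant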

  12. [Baseline correction of spectrum for the inversion of chlorophyll-a concentration in the turbidity water].

    PubMed

    Wei, Yu-Chun; Wang, Guo-Xiang; Cheng, Chun-Mei; Zhang, Jing; Sun, Xiao-Peng

    2012-09-01

    Suspended particulate material is the main factor affecting the remote sensing inversion of chlorophyll-a concentration (Chla) in turbid water. According to the optical properties of suspended material in water, the present paper proposes a linear baseline correction method to weaken the suspended particle contribution to the spectrum measured above the turbid water surface. The linear baseline is defined as the straight line connecting the reflectance at 450 and 750 nm, and the correction subtracts this baseline from the reflectance spectrum. Analysis of in situ field data from Meiliangwan, Taihu Lake, in April 2011 and March 2010 shows that linear baseline correction of the spectrum can improve the inversion precision of Chla and produce better model diagnostics. For the March 2010 data, the RMSE of the band-ratio model built from the original spectra is 4.11 mg·m(-3), whereas that built from the baseline-corrected spectra is 3.58 mg·m(-3). Meanwhile, the residual distribution and homoscedasticity of the model built from baseline-corrected spectra are clearly improved. The model RMSE for April 2011 shows a similar result. The authors suggest using linear baseline correction as the spectral processing method to improve Chla inversion accuracy in turbid water without algal blooms.
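
    A few NumPy lines implement the correction as defined: subtract the straight line joining the reflectance values at 450 and 750 nm. The synthetic reflectance curve below is illustrative only.

        import numpy as np

        wl = np.arange(400, 901)   # wavelengths, nm
        refl = (0.02 + 1e-5 * (wl - 400)
                + 0.01 * np.exp(-((wl - 700) / 15) ** 2))  # toy spectrum

        i450, i750 = np.searchsorted(wl, [450, 750])
        slope = (refl[i750] - refl[i450]) / (wl[i750] - wl[i450])
        baseline = refl[i450] + slope * (wl - wl[i450])    # connecting line
        corrected = refl - baseline        # suppresses the particle signal
        print(corrected[i450], corrected[i750])  # both ~0 by construction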

  13. Implementation of Evidence-Based Employment Services in Specialty Mental Health

    PubMed Central

    Hamilton, Alison B; Cohen, Amy N; Glover, Dawn L; Whelan, Fiona; Chemerinski, Eran; McNagny, Kirk P; Mullins, Deborah; Reist, Christopher; Schubert, Max; Young, Alexander S

    2013-01-01

    Objective. Study a quality improvement approach for implementing evidence-based employment services at specialty mental health clinics. Data Sources/Study Setting. Semistructured interviews with clinicians and administrators before, during, and after implementation. Qualitative field notes, structured baseline and follow-up interviews with patients, semistructured interviews with patients after implementation, and administrative data. Study Design. Site-level controlled trial at four implementation and four control sites. Hybrid implementation–effectiveness study with mixed methods intervention evaluation design. Data Collection/Extraction Methods. Site visits, in-person and telephone interviews, patient surveys, patient self-assessment. A total of 801 patients completed baseline surveys and 53 clinicians and other clinical key stakeholders completed longitudinal qualitative interviews. Principal Findings. At baseline, sites varied in the availability, utilization, and quality of supported employment. Each site needed quality improvement for this service, though for differing reasons, with some needing development of the service itself and others needing increased service capacity. Improvements in knowledge, attitudes, beliefs, and referral behaviors were evident in mid- and postimplementation interviews, though some barriers persisted. Half of patients expressed an interest in working at baseline. Patients at implementation sites were 2.3 times more likely to receive employment services during the study year. Those who had a service visit were more likely to be employed at follow-up than those who did not. Conclusions. Studies of implementation and effectiveness require mixed methods to both enhance implementation in real time and provide context for interpretation of complex results. In this study, a quality improvement approach resulted in superior patient-level outcomes and improved clinician knowledge, attitudes, and behaviors, in the context of substantial variation among sites. PMID:24138608

  14. The Quality of Medication Use in Older Adults: Methods of a Longitudinal Study

    PubMed Central

    Roth, Mary T.; Moore, Charity G.; Ivey, Jena L.; Esserman, Denise A.; Campbell, William H.; Weinberger, Morris

    2009-01-01

    Background The quality of medication use in older adults is a recurring problem of substantial concern. Efforts to both measure and improve the quality of medication use often define quality too narrowly and fall short of addressing the complexity of an older adult's medication regimen. Objective In an effort to more comprehensively define the quality of medication use in older adults, we conducted a prospective cohort study to: 1) describe the quality of medication use in community-residing older adults at baseline, examining differences between Whites and African Americans; 2) examine the effect of race on medication-related problems; and 3) assess the change in the quality of medication use between Whites and African Americans over time. This paper presents the research design and methods of this longitudinal study. Methods We interviewed 100 White and 100 African-American community-residing older adults three times over one year (baseline, 6, and 12 months). We oversampled African Americans so that we could estimate racial differences in the quality of medication use. We collected information on the quality of medication use, relying on a clinical pharmacist's assessment of quality and the Assessing Care of Vulnerable Elders (ACOVE) quality indicators. We also collected data on demographic characteristics, health literacy, functional status, and participant-reported drug therapy concerns. Results Two hundred older adults were enrolled in the study and completed a baseline visit. Of the 200, 92% completed the 6-month visit (n=183) and 88% completed the 12-month visit (n=176). We present baseline demographic characteristics for the 200 older adults enrolled in the study. Conclusion This longitudinal study is an initial step toward developing more comprehensive, patient-centered measures and interventions to improve the quality of medication use in older adults. PMID:19028378

  15. Optimization of a reversible hood for protecting a pedestrian's head during car collisions.

    PubMed

    Huang, Sunan; Yang, Jikuang

    2010-07-01

    This study evaluated and optimized the performance of a reversible hood (RH) for preventing head injuries to adult pedestrians in car collisions. The FE model of a production car front was introduced and validated. The baseline RH was developed from the original hood in the validated car front model. To evaluate the protective performance of the baseline RH, FE models of an adult headform and a 50th percentile human head were used in parallel to impact the baseline RH. Based on this evaluation, the response surface method was applied to optimize the RH in terms of material stiffness, lifting speed, and lifted height. Finally, the headform model and the human head model were again used to evaluate the protective performance of the optimized RH. The fully lifted baseline RH markedly reduced the impact responses of both the headform model and the human head model compared with the retracted and still-lifting baseline RH. When the optimized RH was lifted, the HIC values of the headform model and the human head model were further reduced to well below 1000, so the risk of pedestrian head injury can be reduced to the level required by EEVC WG17. Copyright 2009 Elsevier Ltd. All rights reserved.
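    Since the abstract does not give the design-variable ranges or the surrogate form, the sketch below only illustrates the generic response surface workflow: fit a quadratic surrogate to a handful of (stiffness scale, lifting speed, lifted height) design points with simulated HIC values, then minimize it. All numbers are invented placeholders, not values from the study.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical DOE table: (stiffness scale, lifting speed m/s, lifted height m) -> HIC
X = np.array([[0.8, 0.5, 0.04], [0.8, 1.5, 0.08], [1.0, 1.0, 0.06],
              [1.0, 0.5, 0.08], [1.0, 1.5, 0.04], [1.2, 0.5, 0.06],
              [1.2, 1.0, 0.04], [1.2, 1.5, 0.08], [0.9, 0.75, 0.05],
              [1.1, 1.25, 0.07], [0.9, 1.25, 0.07], [1.1, 0.75, 0.05]])
hic = np.array([950.0, 700.0, 760.0, 720.0, 880.0, 800.0,
                940.0, 690.0, 820.0, 730.0, 750.0, 860.0])

def design(X):
    """Full quadratic basis in the three design variables."""
    s, v, h = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones_like(s), s, v, h,
                            s*v, s*h, v*h, s**2, v**2, h**2])

beta, *_ = np.linalg.lstsq(design(X), hic, rcond=None)  # fit the surface

surrogate = lambda x: float(design(np.atleast_2d(x)) @ beta)
res = minimize(surrogate, x0=X.mean(axis=0),
               bounds=[(0.8, 1.2), (0.5, 1.5), (0.04, 0.08)])
print("optimized design point:", res.x, "predicted HIC:", res.fun)
```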

  16. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    NASA Astrophysics Data System (ADS)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. Baseline drift is a widespread phenomenon, generated by fluctuation of laser energy, inhomogeneity of sample surfaces, and background noise, and it has aroused the interest of many researchers. Most of the prevalent algorithms need to preset some key parameters, such as a suitable spline function and the fitting order, and thus lack adaptability. Based on the characteristics of LIBS, namely the sparsity of spectral peaks and the low-pass filtered nature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The proposed technique uses a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is employed to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process to ensure convergence. To validate the proposed method, the concentrations of chromium (Cr), manganese (Mn), and nickel (Ni) contained in 23 certified high alloy steel samples are assessed using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypothesis, the proposed method achieves better accuracy in quantitative analysis than the compared methods and fully demonstrates its adaptive ability.
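    The paper's exact objective function is not reproduced in the abstract, but a widely used convex formulation in the same spirit, asymmetric least squares (Eilers-Boelens style), likewise combines a smoothness penalty with an asymmetric data penalty so that peaks are ignored while the baseline is tracked. The sketch below is that related technique, not the authors' algorithm.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline estimate.

    Minimizes sum(w_i (y_i - z_i)^2) + lam * ||D2 z||^2 with asymmetric
    weights: points above the baseline (likely spectral peaks) get weight
    p, points below get 1 - p, re-estimated each iteration.
    """
    n = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
        w = np.where(y > z, p, 1.0 - p)
    return z

# Toy LIBS-like spectrum: sparse peaks riding on a drifting baseline
x = np.linspace(0.0, 1.0, 2000)
drift = 0.5 + 0.3 * np.sin(2 * np.pi * x)
peaks = sum(a * np.exp(-((x - c) / 0.002) ** 2)
            for a, c in [(3.0, 0.2), (2.0, 0.5), (4.0, 0.8)])
spectrum = drift + peaks + np.random.default_rng(0).normal(0, 0.01, x.size)
corrected = spectrum - asls_baseline(spectrum)
```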

  17. Impact of a multidimensional infection control approach on central line-associated bloodstream infections rates in adult intensive care units of 8 cities of Turkey: findings of the International Nosocomial Infection Control Consortium (INICC)

    PubMed Central

    2013-01-01

    Background Central line-associated bloodstream infections (CLABs) have long been associated with excess length of stay, increased hospital costs, and attributable mortality. Different studies from developed countries have shown that practice bundles reduce the incidence of CLAB in intensive care units. However, the impact of the bundle strategy has not been systematically analyzed in the adult intensive care unit (ICU) setting in developing countries, such as Turkey. The aim of this study is to analyze the impact of the International Nosocomial Infection Control Consortium (INICC) multidimensional infection control approach on the rates of CLAB in 13 ICUs of 13 INICC member hospitals from 8 cities of Turkey. Methods We conducted an active, prospective before-after surveillance study to determine CLAB rates in a cohort of 4,017 adults hospitalized in ICUs. We applied the definitions of the CDC/NHSN and INICC surveillance methods. The study was divided into baseline and intervention periods. During baseline, active outcome surveillance of CLAB rates was performed. During intervention, the INICC multidimensional approach for CLAB reduction was implemented and included the following measures: 1- bundle of infection control interventions, 2- education, 3- outcome surveillance, 4- process surveillance, 5- feedback of CLAB rates, and 6- performance feedback on infection control practices. CLAB rates obtained during baseline were compared with CLAB rates obtained during intervention. Results During baseline, 3,129 central line (CL) days were recorded, and during intervention, we recorded 23,463 CL-days. We used random effects Poisson regression to account for clustering of CLAB rates within hospitals across time periods. The baseline CLAB rate was 22.7 per 1000 CL-days, which decreased during the intervention period to 12.0 CLABs per 1000 CL-days (IRR 0.613; 95% CI 0.43–0.87; P = 0.007). This amounted to a 39% reduction in the incidence rate of CLAB. Conclusions The implementation of the multidimensional infection control approach was associated with a significant reduction in the CLAB rates in adult ICUs of Turkey, and thus should be widely implemented. PMID:23641950
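    The headline numbers can be reproduced with simple rate arithmetic. In the sketch below the event counts are back-calculated from the reported rates and are therefore approximate; note that the crude rate ratio differs from the published IRR of 0.613, which comes from a random effects Poisson regression that accounts for clustering within hospitals.

```python
# CLAB rate arithmetic from the figures quoted above
baseline_cl_days = 3_129
intervention_cl_days = 23_463
baseline_rate = 22.7 / 1000          # CLABs per CL-day
intervention_rate = 12.0 / 1000

baseline_events = baseline_rate * baseline_cl_days              # ~71 CLABs
intervention_events = intervention_rate * intervention_cl_days  # ~282 CLABs

crude_irr = intervention_rate / baseline_rate  # ~0.53, unadjusted
adjusted_reduction = 1 - 0.613                 # ~39% from the reported IRR
print(f"crude IRR {crude_irr:.2f}; adjusted reduction {adjusted_reduction:.0%}")
```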

  18. Development of a Theoretically Based Treatment for Sentence Comprehension Deficits in Individuals with Aphasia

    ERIC Educational Resources Information Center

    Kiran, Swathi; Caplan, David; Sandberg, Chaleece; Levy, Joshua; Berardino, Alex; Ascenso, Elsa; Villard, Sarah; Tripodis, Yorghos

    2012-01-01

    Purpose: Two new treatments were piloted: one based on sentence-to-picture matching (SPM) and the other on object manipulation (OM); both train participants on the thematic roles of sentences using pictures or by manipulating objects. Method: Using a single-subject multiple-baseline design, sentence comprehension was trained on the affected sentence…

  19. Searching for Signs of Life in Ontario Universities: An Innovative Method for Evaluating Biodiversity Integration within University Curricula

    ERIC Educational Resources Information Center

    McCallum, Jenn; Elliott, Paul; McIntosh, Terese

    2017-01-01

    This study investigates the degree to which biodiversity concepts are included within university curricula in Ontario and provides a baseline for tracking this. A keyword search of undergraduate and graduate academic calendars from six Ontario universities was conducted. A list of 28 relevant keywords was developed, and university program…

  20. Reporting Multiple-Group Mean and Covariance Structure across Occasions with Structural Equation Modeling

    ERIC Educational Resources Information Center

    Okech, David

    2012-01-01

    Objectives: Using baseline and second wave data, the study evaluated the measurement and structural properties of parenting stress, personal mastery, and economic strain with N = 381 lower income parents who decided to join and those who did not join in a child development savings account program. Methods: Structural equation modeling mean and…

  1. The British Chinese Adoption Study: Orphanage Care, Adoption and Mid-Life Outcomes

    ERIC Educational Resources Information Center

    Rushton, Alan; Grant, Margaret; Feast, Julia; Simmonds, John

    2013-01-01

    Background: While studies of ex-orphanage care show adverse effects on development, the longer-term impact on mid-life psychosocial functioning and physical health has not been established. Methods: Orphanage records provided baseline data on a sample of 100 Hong Kong Chinese girls who were subsequently adopted into the UK. A mid-life follow-up…

  2. Treatment for Acquired Apraxia of Speech: Examination of Treatment Intensity and Practice Schedule

    ERIC Educational Resources Information Center

    Wambaugh, Julie L.; Nessler, Christina; Cameron, Rosalea; Mauszycki, Shannon C.

    2013-01-01

    Purpose: The authors designed this investigation to extend the development of a treatment for acquired apraxia of speech (AOS)--sound production treatment (SPT)--by examining the effects of 2 treatment intensities and 2 schedules of practice. Method: The authors used a multiple baseline design across participants and behaviors with 4 speakers with…

  3. Evaluating the Effects of Child Savings Accounts Program Participation on Parental Well-Being

    ERIC Educational Resources Information Center

    Okech, David

    2012-01-01

    Objectives: Using baseline and second wave data, the study evaluated the impact of child savings accounts participation on parenting stress, personal mastery, and economic strain with N = 381 lower income parents who decided to join and those who did not join in a child development savings account program. Methods: Structural equation modeling for…

  4. Performance-based plastic design of earthquake resistant reinforced concrete moment frames

    NASA Astrophysics Data System (ADS)

    Liao, Wen-Cheng

    The Performance-Based Plastic Design (PBPD) method has recently been developed to achieve enhanced performance of earthquake resistant structures. The design concept uses a pre-selected target drift and yield mechanism as performance criteria. The design base shear for the selected hazard level is determined by equating the work needed to push the structure monotonically up to the target drift to the corresponding energy demand of an equivalent SDOF oscillator. This study presents development of the PBPD approach as applied to reinforced concrete special moment frame (RC SMF) structures. RC structures present a special challenge because of their complex and degrading ("pinched") hysteretic behavior. To account for the degrading hysteretic behavior, the FEMA 440 C2 factor approach was used in the process of determining the design base shear. Four baseline RC SMFs (4-, 8-, 12-, and 20-story), as used in FEMA P695, were selected for this study. Those frames were redesigned by the PBPD approach. The baseline frames and the PBPD frames were subjected to extensive inelastic pushover and time-history analyses. The PBPD frames showed much improved response, meeting all desired performance objectives, including the intended yield mechanisms and the target drifts. By contrast, the baseline frames experienced large story drifts due to flexural yielding of the columns. The work-energy equation used to determine design base shear can also be used to estimate seismic demands; this is called the energy spectrum method. In this approach the skeleton force-displacement (capacity) curve of the structure is converted into an energy-displacement plot (Ec), which is superimposed over the corresponding energy demand plot (Ed) for the specified hazard level to determine the expected peak displacement demands. In summary, this study shows that the PBPD approach can be successfully applied to RC moment frame structures, and that the responses of the example moment frames were much improved over those of the corresponding baseline frames. In addition, the drift demands of all study frames as computed by the energy spectrum method were in excellent agreement with those obtained from detailed inelastic dynamic analyses.
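    The energy spectrum method lends itself to a compact numerical sketch: integrate the pushover (capacity) curve into an energy-displacement curve Ec and find where it meets the energy demand curve Ed. The pushover points and the constant demand below are invented placeholders, not values from the study.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

disp = np.linspace(0.0, 0.5, 200)        # roof displacement (m)
force = 2000.0 * np.tanh(disp / 0.05)    # toy skeleton (capacity) curve (kN)

Ec = cumulative_trapezoid(force, disp, initial=0.0)  # capacity energy (kN*m)
Ed = np.full_like(disp, 350.0)                       # hypothetical demand (kN*m)

# Expected peak displacement demand: first crossing of Ec over Ed
idx = int(np.argmax(Ec >= Ed))
print(f"estimated peak displacement demand ~ {disp[idx]:.3f} m")
```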

  5. Estimating current and future streamflow characteristics at ungaged sites, central and eastern Montana, with application to evaluating effects of climate change on fish populations

    USGS Publications Warehouse

    Sando, Roy; Chase, Katherine J.

    2017-03-23

    A common statistical procedure for estimating streamflow statistics at ungaged locations is to develop a relational model between streamflow and drainage basin characteristics at gaged locations using least squares regression analysis; however, least squares regression methods are parametric and make constraining assumptions about the data distribution. The random forest regression method provides an alternative nonparametric method for estimating streamflow characteristics at ungaged sites and requires that the data meet fewer statistical conditions than least squares regression methods. Random forest regression analysis was used to develop predictive models for 89 streamflow characteristics using Precipitation-Runoff Modeling System simulated streamflow data and drainage basin characteristics at 179 sites in central and eastern Montana. The predictive models were developed from streamflow data simulated for current (baseline, water years 1982–99) conditions and three future periods (water years 2021–38, 2046–63, and 2071–88) under three different climate-change scenarios. These predictive models were then used to predict streamflow characteristics for baseline conditions and the three future periods at 1,707 fish sampling sites in central and eastern Montana. The average root mean square error for all predictive models was about 50 percent. When streamflow predictions at 23 fish sampling sites were compared to nearby locations with simulated data, the mean relative percent difference was about 43 percent. When predictions were compared to streamflow data recorded at 21 U.S. Geological Survey streamflow-gaging stations outside of the calibration basins, the average mean absolute percent error was about 73 percent.
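    A minimal sketch of this modeling step is shown below, assuming a table of basin characteristics and one simulated streamflow characteristic per site; the feature names and random data are placeholders (the study fit 89 such models, one per characteristic).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# 179 simulated sites x 4 hypothetical basin characteristics,
# e.g. drainage area, mean elevation, mean annual precipitation, slope
X = rng.random((179, 4))
y = rng.random(179)   # one simulated streamflow characteristic

model = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
model.fit(X, y)
print("out-of-bag R^2:", model.oob_score_)   # internal skill estimate

# Predict the characteristic at ungaged sites (e.g. fish sampling sites)
X_ungaged = rng.random((1707, 4))
predictions = model.predict(X_ungaged)
```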

  6. Predictive value of different prostate-specific antigen-based markers in men with baseline total prostate-specific antigen <2.0 ng/mL.

    PubMed

    Fujizuka, Yuji; Ito, Kazuto; Oki, Ryo; Suzuki, Rie; Sekine, Yoshitaka; Koike, Hidekazu; Matsui, Hiroshi; Shibata, Yasuhiro; Suzuki, Kazuhiro

    2017-08-01

    To investigate the predictive value of various molecular forms of prostate-specific antigen in men with baseline prostate-specific antigen <2.0 ng/mL. The case cohort comprised 150 men with a baseline prostate-specific antigen level <2.0 ng/mL who developed prostate cancer within 10 years. The control cohort was 300 baseline prostate-specific antigen- and age-adjusted men who did not develop prostate cancer. Serum prostate-specific antigen, free prostate-specific antigen, and [-2] proenzyme prostate-specific antigen were measured at baseline and at the last screening visit. The predictive impact of baseline prostate-specific antigen- and [-2] proenzyme prostate-specific antigen-related indices on developing prostate cancer was investigated. The predictive impact on tumor aggressiveness of those indices at the last screening visit, and of their velocities from baseline to final screening, was also investigated. The baseline free to total prostate-specific antigen ratio was a significant predictor of prostate cancer development. The odds ratio was 6.08 in the lowest quintile of the baseline free to total prostate-specific antigen ratio. No serum indices at diagnosis were associated with tumor aggressiveness. The Prostate Health Index velocity and the [-2] proenzyme prostate-specific antigen/free prostate-specific antigen velocity increased significantly in patients in higher D'Amico risk groups and with higher Gleason scores. The free to total prostate-specific antigen ratio in men with low baseline prostate-specific antigen levels seems to predict the risk of developing prostate cancer, and it could be useful for a more effective individualized screening system. Longitudinal changes in [-2] proenzyme prostate-specific antigen-related indices seem to correlate with tumor aggressiveness, and they could be used as a prognostic tool before treatment and during active surveillance. © 2017 The Japanese Urological Association.

  7. [A novel school-based strategy for the prevention of HIV/AIDS, sexually transmitted disease (STDs), and teen pregnancies].

    PubMed

    Torres, Pilar; Walker, Dilys M; Gutiérez, Juan Pablo; Bertozzi, Stefano M

    2006-01-01

    To introduce the study design of an HIV/AIDS and unplanned pregnancy prevention program targeting high school students, and to present the results from the baseline survey. A school curriculum was developed to inform adolescent students about HIV/AIDS/STD prevention, including information on emergency contraception (EC). A randomized controlled study was conducted to simultaneously evaluate the effect of this intervention. The baseline survey collected data on contraception knowledge and attitudes regarding sexual behaviors. A total of 11,117 students from 40 schools participated in the baseline (52% female; the mean age of both males and females was 15.5 years). A total of 10% of the females and 24% of the males surveyed were sexually active at baseline, but only 39% of those sexually active reported using a condom at the time of their first sexual intercourse. Among the sexually active students surveyed, a third of the males and a fifth of the females reported at least one condom slip or breakage. Most of the students were aware of EC. The low proportion of students who report using condoms, together with their incorrect use, points to the need for HIV/AIDS and unplanned pregnancy prevention efforts. This novel approach offers adolescents EC as a backup method to the condom. The approach is feasible, as students know what EC is, and they appear willing to use this method.

  8. Maintenance of Pain in Children with Functional Abdominal Pain

    PubMed Central

    Czyzewski, Danita I.; Self, Mariella M.; Williams, Amy E.; Weidler, Erica M.; Blatz, Allison M.; Shulman, Robert J.

    2015-01-01

    Objectives A significant proportion of children with functional abdominal pain develop chronic pain. Identifying clinical characteristics predicting pain persistence is important in targeting interventions. We examined whether child anxiety and/or pain-stooling relations were related to maintenance of abdominal pain frequency and compared the predictive value of three methods for assessing pain-stooling relations (i.e., diary, parent report, child report). Methods Seventy-six children (7–10 years old at baseline) who presented for medical treatment of functional abdominal pain were followed up 18–24 months later. Baseline anxiety and abdominal pain-stooling relations based on pain and stooling diaries and child- and parent-questionnaires were examined in relation to the persistence of abdominal pain frequency. Results Children's baseline anxiety was not related to persistence of pain frequency. However, children who displayed irritable bowel syndrome (IBS) symptoms at baseline maintained pain frequency at follow-up, whereas in children in whom there was no relationship between pain and stooling, pain frequency decreased. Pain and stool diaries and parent report of pain-stooling relations were predictive of pain persistence, but child-report questionnaires were not. Conclusions The presence of IBS symptoms in school-age children with functional abdominal pain appears to predict persistence of abdominal pain over time, while anxiety does not. Prospective pain and stooling diaries and parent report of IBS symptoms were predictors of pain maintenance, but child report of symptoms was not. PMID:26301615

  9. Psychological Dysregulation During Adolescence Mediates the Association of Parent-Child Attachment in Childhood and Substance Use Disorder in Adulthood

    PubMed Central

    Zhai, Zu Wei; Kirisci, Levent; Tarter, Ralph E.; Ridenour, Ty A.

    2015-01-01

    Objective This prospective study tested the hypothesis that psychological dysregulation in mid-adolescence (age 16) mediates the association between parent-child attachment in late childhood (age 10-12) and development of substance use disorder (SUD) in adulthood (age 22). Method The Youth Attachment to Parents Scale (YAPS) was developed in a baseline sample of 10-12 year old boys and girls (N = 694) residing in western Pennsylvania. Psychological dysregulation was measured by the neurobehavior disinhibition trait. Substance use was assessed at ages 10-12, 12-14, 16, and 19. SUD was diagnosed at age 22 using the Structured Clinical Interview for DSM Disorders. The mediation by neurobehavior disinhibition of the association between parent-child attachment and SUD was tested separately for mothers and fathers while controlling for baseline substance use. Results Psychological dysregulation mediates the association between attachment to mothers and SUD, and partially mediates the association between attachment to fathers and SUD. Significant mediation effects remain after controlling for baseline substance use. Conclusion Optimal prevention of SUD should address both the psychological dysregulation predisposing to SUD and the quality of the parent-child relationship. PMID:24359508

  10. Composite transport wing technology development: Design development tests and advanced structural concepts

    NASA Technical Reports Server (NTRS)

    Griffin, Charles F.; Harvill, William E.

    1988-01-01

    Numerous design concepts, materials, and manufacturing methods were investigated for the covers and spars of a transport box wing. Cover panels and spar segments were fabricated and tested to verify the structural integrity of design concepts and fabrication techniques. Compression tests on stiffened panels demonstrated the ability of graphite/epoxy wing upper cover designs to achieve a 35 percent weight savings compared to the aluminum baseline. The impact damage tolerance of the designs and materials used for these panels limits the allowable compression strain and therefore the maximum achievable weight savings. Bending and shear tests on various spar designs verified an average weight savings of 37 percent compared to the aluminum baseline. Impact damage to spar webs did not significantly degrade structural performance. Predictions of spar web shear instability correlated well with measured performance. The structural integrity of spars manufactured by filament winding equalled or exceeded those fabricated by hand lay-up. The information obtained will be applied to the design, fabrication, and test of a full-scale section of a wing box. When completed, the tests on the technology integration box beam will demonstrate the structural integrity of an advanced composite wing design which is 25 percent lighter than the metal baseline.

  11. Machine protection system for rotating equipment and method

    DOEpatents

    Lakshminarasimha, Arkalgud N.; Rucigay, Richard J.; Ozgur, Dincer

    2003-01-01

    A machine protection system and method for rotating equipment introduces new alarming features and makes use of full proximity probe sensor information, including amplitude and phase. Baseline vibration amplitude and phase data are estimated and tracked according to operating modes of the rotating equipment. Baseline vibration amplitude and phase data can be determined using a rolling average and variance and stored in a unit circle, or tracked using short-term-average and long-term-average baselines. The sensed vibration amplitude and phase are compared with the baseline vibration amplitude and phase data. Operation of the rotating equipment can be controlled based on the vibration amplitude and phase.
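    A toy version of the amplitude-tracking idea is sketched below: maintain short-term and long-term rolling baselines of the sensed vibration amplitude and alarm when they diverge. Window lengths and the alarm threshold are hypothetical, and the unit-circle phase tracking described in the patent is omitted for brevity.

```python
import numpy as np

def rolling_stats(x, window):
    """Trailing-window rolling mean and variance."""
    kernel = np.ones(window) / window
    mean = np.convolve(x, kernel, mode="valid")
    mean_sq = np.convolve(x**2, kernel, mode="valid")
    return mean, mean_sq - mean**2

# Toy amplitude stream from a proximity probe
amplitude = np.abs(np.random.default_rng(1).normal(1.0, 0.05, 5000))

long_mean, long_var = rolling_stats(amplitude, window=1000)   # baseline
short_mean, _ = rolling_stats(amplitude, window=50)           # recent behavior

# Align the trailing ends of the two series and alarm on a k-sigma departure
n = min(long_mean.size, short_mean.size)
k = 3.0
alarm = np.abs(short_mean[-n:] - long_mean[-n:]) > k * np.sqrt(long_var[-n:])
print(f"alarm raised on {alarm.sum()} of {n} samples")
```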

  12. Comparison of composite rotor blade models: A coupled-beam analysis and an MSC/NASTRAN finite-element model

    NASA Technical Reports Server (NTRS)

    Hodges, Robert V.; Nixon, Mark W.; Rehfield, Lawrence W.

    1987-01-01

    A methodology was developed for the structural analysis of composite rotor blades. This coupled-beam analysis is relatively simple to use compared with alternative analysis techniques. The beam analysis was developed for thin-wall single-cell rotor structures and includes the effects of elastic coupling. This paper demonstrates the effectiveness of the new composite-beam analysis method through comparison of its results with those of an established baseline analysis technique. The baseline analysis is an MSC/NASTRAN finite-element model built up from anisotropic shell elements. Deformations are compared for three linear static load cases of centrifugal force at design rotor speed, applied torque, and lift for an ideal rotor in hover. A D-spar designed to twist under axial loading is the subject of the analysis. Results indicate the coupled-beam analysis is well within engineering accuracy.

  13. Indentation experiments and simulation of ovine bone using a viscoelastic-plastic damage model

    PubMed Central

    Zhao, Yang; Wu, Ziheng; Turner, Simon; MacLeay, Jennifer; Niebur, Glen L.; Ovaert, Timothy C.

    2015-01-01

    Indentation methods have been widely used to study bone at the micro- and nanoscales. It has been shown that bone exhibits viscoelastic behavior with permanent deformation during indentation. At the same time, damage due to microcracks is induced by the stresses beneath the indenter tip. In this work, a simplified viscoelastic-plastic damage model was developed to more closely simulate indentation creep data, and the effect of the model parameters on the indentation curve was investigated. Experimentally, baseline and 2-year postovariectomized (OVX-2) ovine (sheep) bone samples were prepared and indented. The damage model was then applied via finite element analysis to simulate the bone indentation data. The mechanical properties of yielding, viscosity, and the damage parameter were obtained from the simulations. The results suggest that damage develops more quickly in OVX-2 samples under the same indentation load conditions as the baseline data. PMID:26136623

  14. Acoustic startle response in rats predicts inter-individual variation in fear extinction.

    PubMed

    Russo, Amanda S; Parsons, Ryan G

    2017-03-01

    Although a large portion of the population is exposed to a traumatic event at some point, only a small percentage develops post-traumatic stress disorder (PTSD), suggesting the presence of predisposing factors. Abnormal acoustic startle response (ASR) has been shown to be associated with PTSD, implicating it as a potential predictor of the development of PTSD-like behavior. Since poor extinction and retention of extinction learning are characteristic of PTSD patients, it is of interest to determine whether abnormal ASR is predictive of the development of such deficits. To determine whether baseline ASR has utility in predicting the development of PTSD-like behavior, the relationship between baseline ASR and freezing behavior following Pavlovian fear conditioning was examined in a group of adult male Sprague-Dawley rats. Baseline ASR was assessed before exposure to a Pavlovian fear conditioning paradigm in which freezing behavior was measured during fear conditioning, extinction training, and extinction testing. Although there was no relationship between baseline ASR and fear memory following conditioning, rats with low baseline ASR showed significantly poorer retention of the extinction memory than rats with high baseline ASR. The results suggest that baseline ASR has value as a predictive index of the development of a PTSD-like phenotype. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Primal-dual convex optimization in large deformation diffeomorphic metric mapping: LDDMM meets robust regularizers

    NASA Astrophysics Data System (ADS)

    Hernandez, Monica

    2017-12-01

    This paper proposes a method for primal-dual convex optimization in variational large deformation diffeomorphic metric mapping problems formulated with robust regularizers and robust image similarity metrics. The method is based on the Chambolle and Pock primal-dual algorithm for solving general convex optimization problems. Diagonal preconditioning is used to ensure the convergence of the algorithm to the global minimum. We consider three robust regularizers likely to provide acceptable results in diffeomorphic registration: Huber, V-Huber, and total generalized variation. The Huber norm is used in the image similarity term. The primal-dual equations are derived for the stationary and the non-stationary parameterizations of diffeomorphisms. The resulting algorithms have been implemented to run on the GPU using CUDA. For the most memory-consuming methods, we have developed a multi-GPU implementation. The GPU implementations allowed us to perform an exhaustive evaluation study on the NIREP and LPBA40 databases. The experiments showed that, for all the considered regularizers, the proposed method converges to diffeomorphic solutions while better preserving discontinuities at the boundaries of objects compared to baseline diffeomorphic registration methods. In most cases, the evaluation showed a competitive performance for the robust regularizers, close to the performance of the baseline diffeomorphic registration methods.
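    The registration objective itself is too involved to reproduce from the abstract, but the underlying engine, the Chambolle-Pock primal-dual iteration, can be illustrated on the classic total-variation (ROF) denoising problem. This is a generic sketch of the algorithm, not the paper's LDDMM formulation, and it uses fixed step sizes rather than diagonal preconditioning.

```python
import numpy as np

def grad(u):
    """Forward differences with Neumann boundary."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[0, :] = px[0, :]; dx[1:-1, :] = px[1:-1, :] - px[:-2, :]; dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]; dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]; dy[:, -1] = -py[:, -2]
    return dx + dy

def chambolle_pock_tv(f, lam=8.0, n_iter=200):
    """min_u ||grad u||_1 + (lam/2) ||u - f||^2 via Chambolle-Pock."""
    tau = sigma = 1.0 / np.sqrt(8.0)    # tau * sigma * L^2 <= 1, L^2 = 8
    u = f.copy(); u_bar = f.copy()
    px = np.zeros_like(f); py = np.zeros_like(f)
    for _ in range(n_iter):
        gx, gy = grad(u_bar)                       # dual ascent step
        px += sigma * gx; py += sigma * gy
        norm = np.maximum(1.0, np.hypot(px, py))   # project onto unit ball
        px /= norm; py /= norm
        u_old = u                                  # primal descent step
        u = (u + tau * div(px, py) + tau * lam * f) / (1.0 + tau * lam)
        u_bar = 2.0 * u - u_old                    # over-relaxation
    return u

noisy = np.random.default_rng(0).normal(0.0, 0.1, (64, 64))
noisy[16:48, 16:48] += 1.0            # toy piecewise-constant image
denoised = chambolle_pock_tv(noisy)
```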

  16. Association between baseline cognitive impairment and postoperative delirium in elderly patients undergoing surgery for adult spinal deformity.

    PubMed

    Adogwa, Owoicho; Elsamadicy, Aladine A; Vuong, Victoria D; Fialkoff, Jared; Cheng, Joseph; Karikari, Isaac O; Bagley, Carlos A

    2018-01-01

    OBJECTIVE Postoperative delirium is common in elderly patients undergoing spine surgery and is associated with a longer and more costly hospital course, functional decline, postoperative institutionalization, and higher likelihood of death within 6 months of discharge. Preoperative cognitive impairment may be a risk factor for the development of postoperative delirium. The aim of this study was to investigate the relationship between baseline cognitive impairment and postoperative delirium in geriatric patients undergoing surgery for degenerative scoliosis. METHODS Elderly patients 65 years and older undergoing planned elective spinal surgery for correction of adult degenerative scoliosis were enrolled in this study. Preoperative cognition was assessed using the validated Saint Louis University Mental Status (SLUMS) examination. SLUMS comprises 11 questions, with a maximum score of 30 points. Mild cognitive impairment was defined as a SLUMS score between 21 and 26 points, while severe cognitive impairment was defined as a SLUMS score of ≤ 20 points. Normal cognition was defined as a SLUMS score of ≥ 27 points. Delirium was assessed daily using the Confusion Assessment Method (CAM) and rated as absent or present. The incidence of delirium was compared in patients with and without baseline cognitive impairment. RESULTS Twenty-two patients (18%) developed delirium postoperatively. Baseline demographics, including age, sex, comorbidities, and perioperative variables, were similar in patients with and without delirium. The length of in-hospital stay (mean 5.33 days vs 5.48 days) and 30-day hospital readmission rates (12.28% vs 12%) were similar between patients with and without delirium, respectively. Patients with preoperative cognitive impairment (i.e., a lower SLUMS score) had a higher incidence of postoperative delirium. One- and 2-year patient-reported outcome scores were similar in patients with and without delirium. CONCLUSIONS Cognitive impairment is a risk factor for the development of postoperative delirium. Postoperative delirium may be associated with decreased preoperative cognitive reserve. Cognitive assessments should be considered in the preoperative evaluation of elderly patients prior to surgery.

  17. Evolution of Geographic Atrophy in Participants Treated with Ranibizumab for Neovascular Age-related Macular Degeneration

    PubMed Central

    Thavikulwat, Alisa T.; Jacobs-El, Naima; Kim, Jane S.; Agrón, Elvira; Hasan, Jesia; Meyerle, Catherine B.; Valent, David; Cukras, Catherine A.; Wiley, Henry E.; Wong, Wai T.; Chew, Emily Y.

    2016-01-01

    Purpose To evaluate the risk factors, incidence, and rate of progression of geographic atrophy (GA) in eyes with neovascular age-related macular degeneration (nAMD) treated with ranibizumab. Design Post-hoc analysis of a prospective clinical study. Participants 69 participants with nAMD in at least one eye. Methods Participants were prospectively treated in the study eye with 0.5 mg intravitreal ranibizumab. Study eyes received 4 monthly injections followed by pro re nata injections until a fluid-free macula was achieved on optical coherence tomography. Risk factors assessed included baseline demographics, treatment, and ocular characteristics on imaging. Eyes were evaluated on fundus autofluorescence (FAF) for GA. The rate of GA area growth in study and fellow eyes was analyzed by linear regression of square-root transformed areas. Main Outcome Measures Development of new-onset GA and rate of GA area growth measured on ocular imaging, including FAF images of the study eyes. Results Sixty-nine participants (mean age 78.8±7.8 years) with an average of 40.0±13.6 months of follow-up were analyzed. Twenty-two of 69 study eyes (32%) were treatment naïve. During their first year in the study, participants received an average of 9.2±3.3 injections in the study eye. Of 63 study eyes with quality baseline images, 22 (35%) had pre-existing GA. Of the remaining 41 eyes, 7 (17%) developed new-onset GA during study follow-up. Those who developed new GA were older (all ≥79 years old) and had received fewer study injections on average (6.9 vs. 10.4 injections at 1 year) than those who did not develop new GA. Of the 12 treatment-naïve study eyes without GA at baseline, 1 (8.3%) developed new GA during the study. In 21 study eyes with quantifiable GA area, eyes with GA present at baseline (16/21) enlarged by 0.34±0.26 mm/year, compared to 0.19±0.12 mm/year in eyes developing new-onset GA (5/21). Conclusions While 17% of ranibizumab-treated study eyes without GA at baseline developed new GA, the role of ranibizumab in the development of GA is unclear. Further prospective longitudinal studies are required to identify the eyes most at risk of developing GA in the setting of anti-VEGF treatment. PMID:28630947

  18. Increase in serum albumin concentration is associated with prediabetes development and progression to overt diabetes independently of metabolic syndrome

    PubMed Central

    Jun, Ji Eun; Lee, Seung-Eun; Lee, You-Bin; Jee, Jae Hwan; Bae, Ji Cheol; Jin, Sang-Man; Hur, Kyu Yeon; Lee, Moon-Kyu

    2017-01-01

    Aim Serum albumin concentration is associated with both type 2 diabetes and metabolic syndrome (MetS). We sought to investigate whether baseline serum albumin and change in serum albumin could be independent risk factors for prediabetes in subjects without MetS. We further examined the effect of serum albumin on progression to overt diabetes in subjects who developed prediabetes. Methods Among 10,792 participants without diabetes and MetS who consecutively underwent yearly health check-ups over six years, 9,807 subjects without incident MetS were enrolled in this longitudinal retrospective study. The risk of developing prediabetes (impaired fasting glucose or hemoglobin A1c) was analyzed according to baseline and percent change in serum albumin concentration using Cox regression analysis. Serial changes in serum albumin concentration were measured from baseline to one year before prediabetes diagnosis, and then from the time of prediabetes diagnosis to progression to overt diabetes or final follow-up. Results A total of 4,398 incident cases of prediabetes developed during 35,807 person-years (median 3.8 years). The hazard ratio for incident prediabetes decreased as percent change in serum albumin concentration (in quartiles and per 1%) increased, in both crude and fully adjusted models. However, baseline serum albumin concentration itself was not associated with prediabetic risk. Serum albumin levels kept increasing until the end of follow-up in prediabetic subjects who returned to normal glycemic status, whereas these measures did not change in prediabetic subjects who developed type 2 diabetes. Serum albumin concentration measured at the end of follow-up was highest in the regression group, compared to the stationary (p = 0.014) or progression groups (p = 0.009). Conclusions An increase in serum albumin concentration might protect against early glycemic deterioration and progression to type 2 diabetes, even in subjects without MetS. PMID:28430803
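    For readers who want to reproduce the style of analysis, a minimal survival-model sketch using the lifelines package is given below. The data frame, column names, and values are synthetic placeholders; the study's actual covariates and adjustments are described above.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "albumin_pct_change": rng.normal(0.0, 2.0, n),    # % change, synthetic
    "baseline_albumin": rng.normal(4.4, 0.3, n),      # g/dL, synthetic
    "years": rng.exponential(4.0, n).clip(0.1, 6.0),  # follow-up time
    "prediabetes": rng.integers(0, 2, n),             # event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="prediabetes")
cph.print_summary()   # hazard ratios per unit of each covariate
```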

  19. Deoxyribonucleic acid telomere length shortening can predict the incidence of non-alcoholic fatty liver disease in patients with type 2 diabetes mellitus.

    PubMed

    Ping, Fan; Li, Zeng-Yi; Lv, Ke; Zhou, Mei-Cen; Dong, Ya-Xiu; Sun, Qi; Li, Yu-Xiu

    2017-03-01

    To investigate the effect of telomere shortening and other predictive factors on non-alcoholic fatty liver disease (NAFLD) in type 2 diabetes mellitus patients in a 6-year prospective cohort study. A total of 70 type 2 diabetes mellitus patients (mean age 57.8 ± 6.7 years) without NAFLD were included in the study, and 64 of them were successfully followed up 6 years later, excluding four cases with significant alcohol consumption. NAFLD was diagnosed by the hepatorenal ratio obtained by a quantitative ultrasound method using NIH image analysis software. The 39 individuals who developed NAFLD were allocated to group A, and the 21 individuals who did not develop NAFLD were allocated to group B. Fluorescent real-time quantitative polymerase chain reaction was used to measure telomere length. There was no significant difference between the two groups in baseline telomere length; however, at the end of the 6th year, telomere length had become shorter in group A compared with group B. There were significant differences between these two groups in baseline body mass index, waistline, systolic blood pressure, glycated hemoglobin, and fasting C-peptide level. In addition, the estimated indices of baseline insulin resistance were increased in group A. Fasting insulin level, body mass index, systolic blood pressure at baseline, and the shortening of telomere length were independent risk factors for NAFLD in type 2 diabetes mellitus patients. Telomere length became shorter in type 2 diabetes mellitus patients who developed NAFLD over the course of 6 years. Type 2 diabetes mellitus patients who developed NAFLD already had more serious insulin resistance, long before NAFLD onset, than those who did not. © 2016 The Authors. Journal of Diabetes Investigation published by Asian Association for the Study of Diabetes (AASD) and John Wiley & Sons Australia, Ltd.

  20. T2 relaxation time measurements are limited in monitoring progression, once advanced cartilage defects at the knee occur

    PubMed Central

    Jungmann, P.M.; Kraus, M.S.; Nardo, L.; Liebl, H.; Alizai, H.; Joseph, G.B.; Liu, F.; Lynch, J.; McCulloch, C.E.; Nevitt, M.C.; Link, T.M.

    2014-01-01

    Purpose To study the natural evolution of cartilage T2 relaxation times in knees with various extents of morphological cartilage abnormalities, assessed with 3T MRI from the Osteoarthritis Initiative. Materials and Methods Right knee MRIs of 245 individuals aged 45–60 years without radiographic OA were included. Cartilage was segmented and T2 maps were generated in five compartments (patella, medial and lateral femoral condyle, medial and lateral tibia) at baseline and two-year follow-up. We examined the association of T2 values and two-year change in T2 values with various Whole-Organ MR Imaging Scores (WORMS). Statistical analysis was performed with ANOVA and Student's t-tests. Results Higher baseline T2 was associated with more severe cartilage defects at baseline and subsequent cartilage loss (P<0.001). However, longitudinal T2 change was inversely associated with both baseline (P=0.038) and follow-up (P=0.002) severity of cartilage defects. Knees that developed new cartilage defects had smaller increases in T2 than subjects without defects (P=0.045). Individuals with higher baseline T2 showed smaller T2 increases over time (P<0.001). Conclusion The inverse correlation of longitudinal T2 changes with baseline T2 values and morphological cartilage abnormalities suggests that once morphological cartilage defects occur, T2 values may be limited for evaluating further cartilage degradation. PMID:24038491

  1. Child health promotion program in South Korea in collaboration with US National Aeronautics and Space Administration: Improvement in dietary and nutrition knowledge of young children

    PubMed Central

    Lim, Hyunjung; Kim, JiEun; Min, Jungwon; Carvajal, Nubia A.; Lloyd, Charles W.

    2016-01-01

    BACKGROUND/OBJECTIVES Childhood obesity has become a global epidemic. Development of effective and sustainable programs to promote healthy behaviors from a young age is important. This study developed and tested an intervention program designed to promote healthy eating and physical activity among young children in South Korea by adapting the US National Aeronautics and Space Administration (NASA) Mission X (MX) Program. SUBJECTS/METHODS The intervention program consisted of 4 weeks of fitness and 2 weeks of nutrition education. A sample of 104 subjects completed pre- and post-surveys on the Children's Nutrition Acknowledgement Test (NAT). Parents provided their children's characteristics, two 24-hour dietary records, and the Nutrition Quotient (NQ) at baseline and at a 6-week follow-up. Child weight status was assessed using Korean body mass index (BMI) percentiles. RESULTS At baseline, 16.4% of subjects (boys: 15.4%; girls: 19.2%) were overweight or obese (BMI ≥ 85th percentile). Fat consumption significantly decreased in normal-BMI children (from 48.6 ± 16.8 g at baseline to 41.9 ± 18.1 g after intervention, P < 0.05); the total NQ score significantly increased from 66.4 to 67.9 (P < 0.05); and the total NAT score significantly improved in normal-BMI children (74.3 at baseline to 81.9 after the program), underweight children (from 71.0 to 77.0), and overweight children (77.1 at baseline vs. 88.2 after intervention, P < 0.001). CONCLUSIONS The 6-week South Korean NASA MX project is feasible and shows favorable changes in eating behaviors and nutritional knowledge among young children. PMID:27698964

  2. Health-related quality of life among cognitively intact nursing home residents with and without cancer – a 6-year longitudinal study

    PubMed Central

    Drageset, Jorunn; Eide, Geir Egil; Corbett, Anne

    2017-01-01

    Background Limited information exists regarding the natural development of health-related quality of life (HRQOL) and its determinants among mentally intact nursing home (NH) residents. We aimed to examine HRQOL over time during a 6-year period among residents of NHs, who are not cognitively impaired, and to examine whether sense of coherence and a diagnosis of cancer influence HRQOL. Methods The study was prospective and included baseline assessment and 6-year follow-up. After baseline assessment of 227 cognitively intact NH residents (Clinical Dementia Rating score ≤ 0.5), we interviewed 52 living respondents a second time at the 5-year follow-up and 18 respondents a third time at the 6-year follow-up. We recorded data from the interviews using the Short Form-36 (SF-36) Health Survey and the Sense of Coherence Scale. To study different developments over time for residents without and with cancer, we tested interactions between cancer and time. Results The subscores of physical functioning and role limitation–physical domains declined with time (P < 0.001 and P = 0.02, respectively). Having a diagnosis of cancer at baseline was negatively correlated with general health (P = 0.002). Sense of coherence at baseline was positively correlated with all the SF-36 subscores from baseline to follow-up (P < 0.001). Conclusion The study indicates that the HRQOL changed over time during the 6 years of follow-up, and the sense of coherence appeared to be an important component of the HRQOL. Finally, our results showed that having a diagnosis of cancer was associated with decline in the general health subdimension. PMID:28490913

  3. The Effect of Ultralow-Dose Transdermal Estradiol on Urinary Incontinence in Postmenopausal Women

    PubMed Central

    Waetjen, L. Elaine; Brown, Jeanette S.; Vittinghoff, Eric; Ensrud, Kristine E.; Pinkerton, JoAnn; Wallace, Robert; Macer, Judith L.; Grady, Deborah

    2006-01-01

    OBJECTIVE To estimate the effect of 2 years of treatment with ultralow-dose transdermal estradiol (E2) on incontinence in postmenopausal women. METHODS Ultra Low Dose Transdermal estRogen Assessment (ULTRA) was a multicenter, randomized, double-blinded, placebo-controlled trial of unopposed ultralow-dose (0.014 mg/d) transdermal E2 for prevention of osteoporosis in 417 postmenopausal women aged 60 to 80 years. Frequency of incontinence episodes was assessed at baseline and after 4 months and 2 years of treatment using a self-reported questionnaire. We used an intention-to-treat analysis to compare change in incontinence frequency (improved: decreased 2 or more episodes per week; unchanged: increased or decreased no more than 1 episode per week; worsened: increased 2 or more episodes per week) between the E2 and placebo groups among women with and without at least weekly incontinence at baseline. RESULTS At baseline, the prevalence of at least weekly incontinence was similar between the E2 and placebo groups (43%). After 2 years, there was no difference between groups in the proportions of women with incontinence at baseline whose incontinence improved, worsened, or was unchanged. The odds ratio for worsening incontinence in the E2 compared with placebo group was 1.35 (95% confidence interval 0.75–2.42). In women without incontinence at baseline, the odds of developing at least weekly incontinence after 2 years in the E2 compared with placebo group were not significant (odds ratio 1.2, 95% confidence interval 0.7–2.2). CONCLUSION Two years of treatment with unopposed ultralow-dose transdermal E2 did not substantially change the frequency of incontinence symptoms or alter the risk of developing at least weekly incontinence. PMID:16260511

  4. Wide baseline stereo matching based on double topological relationship consistency

    NASA Astrophysics Data System (ADS)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. A novel scheme is presented called double topological relationship consistency (DCTR). The double topological configuration combines the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only establishes a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, thereby overcoming many problems of traditional methods that depend on powerful invariance to changes in scale, rotation, or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras were located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, possibly the most widely adopted method. With this method, we can obtain correspondences with high precision on wide-baseline matching problems. Finally, the effectiveness and reliability of this method are demonstrated in wide-baseline experiments on the image pairs.
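    The RANSAC epipolar-geometry step mentioned in the abstract is standard and can be sketched with OpenCV. Feature detection and matching below (SIFT plus a ratio test) stand in for the paper's topological-consistency matcher, which is not publicly available; the image paths are hypothetical.

```python
import cv2
import numpy as np

img1 = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)    # hypothetical pair
img2 = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Putative matches with Lowe's ratio test
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Fundamental matrix by RANSAC; mask flags the inlier correspondences
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
inliers1 = pts1[mask.ravel() == 1]
inliers2 = pts2[mask.ravel() == 1]
```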

  5. An automated baseline correction protocol for infrared spectra of atmospheric aerosols collected on polytetrafluoroethylene (Teflon) filters

    NASA Astrophysics Data System (ADS)

    Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi

    2016-06-01

    A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra, given the variability in both environmental mixture composition and PTFE baselines, remains. This study approaches the question by detailing a statistical protocol that allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environments (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of the PTFE background, the goal of smoothing spline interpolation is to learn the baseline structure in the background region in order to predict the baseline structure in the analyte region. We then validate the model by comparing smoothing-spline baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification, and (3) thermal optical reflectance (TOR) organic carbon (OC) and elemental carbon (EC) predictions. The discrepancy rate for a four-cluster solution is 10 %. For all functional groups but carboxylic COH the discrepancy is ≤ 10 %. Performance metrics obtained from TOR OC and EC predictions (R2 ≥ 0.94, bias ≤ 0.01 µg m-3, and error ≤ 0.04 µg m-3) are on a par with those obtained from uncorrected and PB-corrected spectra. The proposed protocol leads to visually and analytically similar estimates as those generated by the polynomial method. More importantly, the automated solution allows us and future users to evaluate its analytical reproducibility while minimizing reducible user bias. We anticipate the protocol will enable FT-IR researchers and data analysts to quickly and reliably analyze large amounts of data and connect them to a variety of available statistical learning methods to be applied to analyte absorbances isolated in atmospheric aerosol samples.
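    The core idea, fitting a smoothing spline on background subregions only and predicting the baseline under the analyte band, can be sketched as follows. The wavenumber windows, toy spectrum, and smoothing parameter are placeholders; the protocol selects the smoothing parameter from aerosol and blank-sample performance metrics rather than fixing it by hand.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
wn = np.linspace(4000.0, 400.0, 1800)              # wavenumbers, cm^-1
baseline_true = 0.2 + 1e-7 * (wn - 2200.0) ** 2    # slowly varying background
peak = 0.5 * np.exp(-((wn - 2920.0) / 30.0) ** 2)  # toy aliphatic C-H band
spectrum = baseline_true + peak + rng.normal(0.0, 0.005, wn.size)

# Background subregion: everything outside a window around the analyte band
analyte = (wn > 2800.0) & (wn < 3050.0)
bg = ~analyte

# UnivariateSpline needs increasing x, so reverse the descending grid
spline = UnivariateSpline(wn[bg][::-1], spectrum[bg][::-1], s=0.5)
baseline_est = spline(wn)     # predicted baseline, analyte region included
corrected = spectrum - baseline_est
```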

  6. Religiousness Among At-Risk Drinkers: Is It Prospectively Associated With the Development or Maintenance of an Alcohol-Use Disorder?*

    PubMed Central

    Borders, Tyrone F.; Curran, Geoffrey M.; Mattox, Rhonda; Booth, Brenda M.

    2010-01-01

    Objective: This study examined whether particular dimensions of religiousness are prospectively associated with the development or maintenance of an alcohol-use disorder (AUD) among at-risk drinkers or persons with a history of problem drinking. Method: A prospective cohort study was conducted among at-risk drinkers identified through a population-based telephone survey of adults residing in the southeastern United States. The cohort was stratified by baseline AUD status to determine how several dimensions of religiousness (organized religious attendance, religious self-ranking, religious influence on one's life, coping through prayer, and talking with a religious leader) were associated with the development and, separately, the maintenance or remission of an AUD over 6 months. Multiple logistic regression analyses were conducted to estimate the odds of developing versus not developing an AUD and maintaining versus remitting from an AUD while adjusting for measures of social support and other covariates. Results: Among persons without an AUD at baseline, more frequent organized religious attendance, adjusted odds ratio (ORadj) = 0.73, 95% CI [0.55, 0.96], and coping through prayer, ORadj = 0.63, 95% CI [0.45, 0.87], were associated with lower adjusted odds of developing an AUD. In contrast, among persons with an AUD at baseline, no dimension of religiousness was associated with the maintenance or remission of an AUD. Conclusions: The findings of this study suggest that religious attendance and coping through prayer may protect against the development of an AUD among at-risk drinkers. Further research is warranted to ascertain whether these or other religious activities and practices should be promoted among at-risk drinkers. PMID:20105423

  7. Word Sense Disambiguation in Bangla Language Using Supervised Methodology with Necessary Modifications

    NASA Astrophysics Data System (ADS)

    Pal, Alok Ranjan; Saha, Diganta; Dash, Niladri Sekhar; Pal, Antara

    2018-05-01

    An attempt is made in this paper to report how a supervised methodology has been adopted for the task of word sense disambiguation in Bangla, with necessary modifications. At the initial stage, the Naïve Bayes probabilistic model, adopted as a baseline method for sense classification, yields a moderate result of 81% accuracy when applied to a database of the 19 (nineteen) most frequently used ambiguous Bangla words. On an experimental basis, the baseline method is modified with two extensions: (a) inclusion of a lemmatization process in the system, and (b) bootstrapping of the operational process. As a result, the accuracy of the method improves to 84%, which is a positive signal for the whole process of disambiguation as it opens scope for further modification of the existing method for better results. The data sets used for this experiment include the Bangla POS tagged corpus obtained from the Indian Languages Corpora Initiative, and the Bangla WordNet, an online sense inventory developed at the Indian Statistical Institute, Kolkata. The paper also reports the challenges and pitfalls of the work that have been closely observed and addressed to achieve the expected level of accuracy.
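    The baseline classifier is a standard bag-of-context-words Naive Bayes model; a minimal sketch is shown below. The toy sentences and senses are invented English stand-ins, since the paper's Bangla data, lemmatizer, and bootstrapping loop are not reproduced here.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Context windows around an ambiguous target word, labeled by sense
contexts = [
    "deposited money in the bank account",
    "opened a savings bank near the market",
    "sat on the bank of the river fishing",
    "the river bank was muddy after rain",
]
senses = ["FINANCE", "FINANCE", "RIVER", "RIVER"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(contexts, senses)

# Disambiguate a new occurrence from its context words
print(clf.predict(["walked along the bank toward the river"]))
```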

  8. The Forest Ecosystem Study: background, rationale, implementation, baseline conditions, and silvicultural assessment.

    Treesearch

    Andrew B. Carey; David R. Thysell; Angus W. Brodie

    1999-01-01

    The Forest Ecosystem Study (FES) came about as an early response to the need for innovative silvicultural methods designed to stimulate development of late-successional attributes in managed forests—a need ensuing from the exceptional and longstanding controversies over old-growth forests and endangered species concerns in the Pacific Northwest. In 1991, scientists...

  9. Economic Value of Dispensing Home-Based Preoperative Chlorhexidine Bathing Cloths to Prevent Surgical Site Infection

    PubMed Central

    Bailey, Rachel R.; Stuckey, Dianna R.; Norman, Bryan A.; Duggan, Andrew P.; Bacon, Kristina M.; Connor, Diana L.; Lee, Ingi; Muder, Robert R.; Lee, Bruce Y.

    2012-01-01

    OBJECTIVE To estimate the economic value of dispensing preoperative home-based chlorhexidine bathing cloth kits to orthopedic patients to prevent surgical site infection (SSI). METHODS A stochastic decision-analytic computer simulation model was developed from the hospital’s perspective depicting the decision of whether to dispense the kits preoperatively to orthopedic patients. We varied patient age, cloth cost, SSI-attributable excess length of stay, cost per bed-day, patient compliance with the regimen, and cloth antimicrobial efficacy to determine which variables were the most significant drivers of the model’s outcomes. RESULTS When all other variables remained at baseline and cloth efficacy was at least 50%, patient compliance only had to be half of baseline (baseline mean, 15.3%; range, 8.23%–20.0%) for chlorhexidine cloths to remain the dominant strategy (ie, less costly and providing better health outcomes). When cloth efficacy fell to 10%, bathing compliance at 1.5 times the baseline level still afforded dominance of the preoperative bath. CONCLUSIONS The results of our study favor the routine distribution of bathing kits. Even with low patient compliance and cloth efficacy values, distribution of bathing kits is an economically beneficial strategy for the prevention of SSI. PMID:21515977
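
    The abstract describes a stochastic (Monte Carlo) decision-analytic model that varies cost, compliance, and efficacy parameters. The sketch below illustrates the general shape of such a simulation, assuming simple uniform parameter ranges and a two-strategy cost comparison; the ranges and cost formulas are illustrative, not the published model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_sims = 100_000

    # Hypothetical parameter ranges (the study's actual distributions are
    # not reproduced here).
    p_ssi_base = rng.uniform(0.01, 0.03, n_sims)      # baseline SSI risk
    efficacy   = rng.uniform(0.10, 0.60, n_sims)      # relative risk reduction
    compliance = rng.uniform(0.08, 0.20, n_sims)      # compliant fraction
    kit_cost   = rng.uniform(5.0, 15.0, n_sims)       # $ per patient
    ssi_cost   = rng.uniform(15_000, 40_000, n_sims)  # $ per SSI episode

    # Expected cost per patient under each strategy.
    cost_no_kit = p_ssi_base * ssi_cost
    p_ssi_kit   = p_ssi_base * (1 - efficacy * compliance)
    cost_kit    = kit_cost + p_ssi_kit * ssi_cost

    # Fraction of simulations in which dispensing kits is cost-saving (one
    # half of "dominance"; health outcomes are also better by construction).
    frac = (cost_kit < cost_no_kit).mean()
    print(f"kit dispensing cost-saving in {frac:.1%} of runs")
    ```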

  10. Smoking relapse-prevention intervention for cancer patients: Study design and baseline data from the surviving SmokeFree randomized controlled trial.

    PubMed

    Díaz, Diana B; Brandon, Thomas H; Sutton, Steven K; Meltzer, Lauren R; Hoehn, Hannah J; Meade, Cathy D; Jacobsen, Paul B; McCaffrey, Judith C; Haura, Eric B; Lin, Hui-Yi; Simmons, Vani N

    2016-09-01

    Continued smoking after a cancer diagnosis contributes to several negative health outcomes. Although many cancer patients attempt to quit smoking, high smoking relapse rates have been observed. This highlights the need for a targeted, evidence-based smoking-relapse prevention intervention. The design, method, and baseline characteristics of a randomized controlled trial assessing the efficacy of a self-help smoking-relapse prevention intervention are presented. Cancer patients who had recently quit smoking were randomized to one of two conditions. The Usual Care (UC) group received the institution's standard of care. The smoking relapse-prevention intervention (SRP) group received standard of care, plus 8 relapse-prevention booklets mailed over a 3-month period, and a targeted educational DVD developed specifically for cancer patients. Four hundred and fourteen participants were enrolled and completed a baseline survey. Primary outcomes will be self-reported smoking status at 6 and 12 months after baseline. Biochemical verification of smoking status was completed for a subsample. If found to be efficacious, this low-cost intervention could be easily disseminated, with significant potential for reducing the risk of negative cancer outcomes associated with continued smoking.

  11. Development and reliability of a Motivational Interviewing Scenarios Tool for Eating Disorders (MIST-ED) using a skills-based intervention among caregivers.

    PubMed

    Sepulveda, Ana R; Wise, Caroline; Zabala, Maria; Todd, Gill; Treasure, Janet

    2013-12-01

    The aims of this study were to develop an eating disorder scenarios tool to assess the motivational interviewing (MI) skills of caregivers, to evaluate the coding reliability of the instrument, and to test its sensitivity to change through a pre/post/follow-up design. The resulting Motivational Interview Scenarios Tool for Eating Disorders (MIST-ED) was administered to caregivers (n = 66) who were asked to provide oral and written responses before and after a skills-based intervention, and at a 3-month follow-up. Raters achieved excellent inter-rater reliability (intra-class correlations of 91.8% for MI-adherent and 86.1% for MI-non-adherent statements in written scenarios, and 89.2% and 85.3%, respectively, in oral scenarios). Following the intervention, MI-adherent statements increased (baseline = 9.4%, post = 61.5%, follow-up = 47.2%) and MI-non-adherent statements decreased (baseline = 90.6%, post = 38.5%, follow-up = 52.8%). This instrument can be used as a simple method to measure the acquisition of MI skills to improve coping, and both response methods are adequate. The tool shows good sensitivity to improved skills.

  12. Program prioritization to control chronic diseases in African-American faith-based communities.

    PubMed Central

    Hoyo, Cathrine; Reid, Laverne; Hatch, John; Sellers, Denethia B.; Ellison, Arlinda; Hackney, Tara; Porterfield, Deborah; Page, Joyce; Parrish, Theodore

    2004-01-01

    OBJECTIVE: In the last decade, African-American congregations have been inundated with requests to participate in health promotion activities; however, most are not equipped to effectively participate. We assessed the effect of providing congregation leaders with skills in identifying their own health needs and in planning and implementing their own interventions. METHODS: At baseline, 21 congregational leaders from South East Raleigh, NC were taught methods for developing needs assessments and for planning and implementing health promotion activities tailored to their congregations. After approximately four years, 14 of the 21 congregations were successfully recontacted. RESULTS: At baseline, the congregation leadership ranked diabetes as the ninth (out of 10) most urgent health concern in their communities. However, at follow-up, not only was diabetes identified as the most serious health concern, but most congregations had taken advantage of available community and congregational resources to prevent it. Larger congregations were more likely than smaller ones to take advantage of available resources. CONCLUSIONS: Larger African-American congregations are an effective vehicle by which health promotion messages can diffuse; however, the leadership must be provided with skills to assess health needs before selecting the programs most beneficial to their congregations. Mechanisms by which small congregation leaders can participate need development. PMID:15101672

  13. Toward the definition of a bipolar prodrome: Dimensional predictors of bipolar spectrum disorder in at-risk youth

    PubMed Central

    Hafeman, Danella M.; Merranko, John; Axelson, David; Goldstein, Benjamin I.; Goldstein, Tina; Monk, Kelly; Hickey, Mary Beth; Sakolsky, Dara; Diler, Rasim; Iyengar, Satish; Brent, David; Kupfer, David; Birmaher, Boris

    2016-01-01

    Objective We aimed to assess dimensional symptomatic predictors of new-onset bipolar spectrum disorder in youth at familial risk of bipolar disorder (“at-risk” youth). Method Offspring aged 6–18 of parents with bipolar-I/II disorder (n=391) and offspring of community controls (n=248) were recruited without regard to non-bipolar psychopathology. At baseline, 8.4% (33/391) of offspring of bipolar parents had bipolar spectrum; 14.7% (44/299) of offspring with follow-up developed new-onset bipolar spectrum (15 with bipolar-I/II) over eight years. Scales collected at baseline and follow-up were reduced using factor analyses; factors (both at baseline and visit proximal to conversion or last contact) were then assessed as predictors of new-onset bipolar spectrum. Results Relative to community control offspring, at-risk and bipolar offspring had higher baseline levels of anxiety/depression, inattention/disinhibition, externalizing, subsyndromal manic, and affective lability symptoms (p<.05). The strongest predictors of new-onset bipolar spectrum were: baseline anxiety/depression, baseline and proximal affective lability, and proximal subsyndromal manic symptoms (p<.05). While affective lability and anxiety/depression were elevated throughout follow-up in those who later developed bipolar spectrum, manic symptoms increased up to the point of conversion. A path analysis supported the hypothesized model that affective lability at baseline predicted new-onset bipolar spectrum, in part, through increased manic symptoms at the visit prior to conversion; earlier parental age of mood disorder onset also significantly increased risk of conversion (p<.001). While youth without anxiety/depression, affective lability, and mania (and with a parent with older age of mood disorder onset) had a 2% predicted chance of conversion to bipolar spectrum, those with all risk factors had a 49% predicted chance of conversion. Conclusions Dimensional measures of anxiety/depression, affective lability, and mania are important predictors of new-onset bipolar spectrum in this population of at-risk youth. These symptoms emerged from among numerous other candidates, underscoring the potential clinical and research utility of these findings. PMID:26892940

  14. Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling

    NASA Astrophysics Data System (ADS)

    Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji

    We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.
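
    A minimal sketch of the ensemble idea described above, assuming a single scalar traffic feature: several baseline models are built from time-periodically sampled (and therefore mostly normal) traffic, and max- or min-based detection trades sensitivity against false alarms. The feature, thresholds, and sampling scheme are illustrative stand-ins for the paper's setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def train_baseline(traffic, sample_period):
        """Build one baseline model (mean/std of a traffic feature, e.g. SYN
        rate) from time-periodically sampled packets. Sampling discards most
        packets, so rare anomalous bursts rarely contaminate the baseline."""
        sampled = traffic[::sample_period]
        return sampled.mean(), sampled.std()

    def anomaly_scores(audit, models):
        """Deviation of the audit traffic from each baseline, in SD units."""
        return np.array([(audit - mu) / sigma for mu, sigma in models])

    # Multiple baseline models from different sampling offsets form the
    # ensemble.
    normal_traffic = rng.poisson(100, 60_000).astype(float)
    models = [train_baseline(np.roll(normal_traffic, k), sample_period=100)
              for k in range(8)]

    audit_value = 160.0  # e.g. SYN packets/s in the window under audit
    scores = anomaly_scores(audit_value, models)

    # Max-based detection is the sensitive setting, min-based the
    # conservative one.
    threshold = 3.0
    print("max-based alarm:", scores.max() > threshold)
    print("min-based alarm:", scores.min() > threshold)
    ```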

  15. Asymmetries and Visual Field Summaries as Predictors of Glaucoma in the Ocular Hypertension Treatment Study

    PubMed Central

    Levine, Richard A.; Demirel, Shaban; Fan, Juanjuan; Keltner, John L.; Johnson, Chris A.; Kass, Michael A.

    2007-01-01

    Purpose To evaluate whether baseline visual field data and asymmetries between eyes predict the onset of primary open-angle glaucoma (POAG) in Ocular Hypertension Treatment Study (OHTS) participants. Methods A new index, mean prognosis (MP), was designed for optimal combination of visual field thresholds, to discriminate between eyes that developed POAG from eyes that did not. Baseline intraocular pressure (IOP) in fellow eyes was used to construct measures of IOP asymmetry. Age-adjusted baseline thresholds were used to develop indicators of visual field asymmetry and summary measures of visual field defects. Marginal multivariate failure time models were constructed that relate the new index MP, IOP asymmetry, and visual field asymmetry to POAG onset for OHTS participants. Results The marginal multivariate failure time analysis showed that the MP index is significantly related to POAG onset (P < 0.0001) and appears to be a more highly significant predictor of POAG onset than either mean deviation (MD; P = 0.17) or pattern standard deviation (PSD; P = 0.046). A 1-mm Hg increase in IOP asymmetry between fellow eyes is associated with a 17% increase in risk for development of POAG. When threshold asymmetry between eyes existed, the eye with lower thresholds was at a 37% greater risk of development of POAG, and this feature was more predictive of POAG onset than the visual field index MD, though not as strong a predictor as PSD. Conclusions The MP index, IOP asymmetry, and binocular test point asymmetry can assist in clinical evaluation of eyes at risk of development of POAG. PMID:16936102

  16. Agricultural Baseline (BL0) scenario of the 2016 Billion-Ton Report

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee, APAC] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000320373827)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues, building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014). Data generated are .txt output files by year, simulation identifier, and county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B), supplied by the University of Tennessee APAC. The quality assurance and quality control that have been applied: check for negative planted area, harvested area, production, yield, and cost values; check whether harvested area exceeds planted area for annuals; check FIPS codes.
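
    The QA/QC checks listed above translate directly into simple table validations. A sketch in pandas, with illustrative column names and a standard county-FIPS range standing in for the actual file layout:

    ```python
    import pandas as pd

    # Hypothetical slice of a POLYSYS output file; column names are
    # illustrative, not the real file schema.
    df = pd.DataFrame({
        "fips":       [1001, 1003, 99999],
        "planted_ac": [1200.0, 800.0, -5.0],
        "harvest_ac": [1100.0, 950.0, 10.0],
        "production": [50_000.0, 30_000.0, 400.0],
    })

    # QA/QC checks mirroring the ones listed for the baseline scenario.
    neg = df[(df[["planted_ac", "harvest_ac", "production"]] < 0).any(axis=1)]
    over = df[df["harvest_ac"] > df["planted_ac"]]       # annual crops only
    bad_fips = df[~df["fips"].between(1001, 56045)]      # US county FIPS range

    for name, rows in [("negative values", neg),
                       ("harvested > planted", over),
                       ("invalid FIPS", bad_fips)]:
        print(f"{name}: {len(rows)} record(s)")
    ```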

  17. Beam-energy-spread minimization using cell-timing optimization

    NASA Astrophysics Data System (ADS)

    Rose, C. R.; Ekdahl, C.; Schulze, M.

    2012-04-01

    Beam energy spread, and related beam motion, increase the difficulty in tuning for multipulse radiographic experiments at the dual-axis radiographic hydrodynamic test facility’s axis-II linear induction accelerator (LIA). In this article, we describe an optimization method to reduce the energy spread by adjusting the timing of the cell voltages (both unloaded and loaded), either advancing or retarding, such that the injector voltage and summed cell voltages in the LIA result in a flatter energy profile. We developed a nonlinear optimization routine which accepts as inputs the 74 cell-voltage, injector voltage, and beam current waveforms. It optimizes cell timing per user-selected groups of cells and outputs timing adjustments, one for each of the selected groups. To verify the theory, we acquired and present data for both unloaded and loaded cell-timing optimizations. For the unloaded cells, the preoptimization baseline energy spread was reduced by 34% and 31% for two shots as compared to baseline. For the loaded-cell case, the measured energy spread was reduced by 49% compared to baseline.
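
    Conceptually, the optimization adjusts per-group timing shifts so that the summed accelerating voltage is flat over the beam window. The sketch below mimics that with an idealized sloped-top cell pulse and a derivative-free optimizer; the waveform, window, and constants are stand-ins for the measured 74-cell data, not the facility's routine.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0, 100e-9, 1000)  # time base (s)

    def cell_waveform(shift):
        """Idealized single-cell voltage pulse with a tilted flat top
        (a stand-in for a measured cell-voltage waveform), advanced or
        retarded by `shift` seconds."""
        rise = 1 / (1 + np.exp(-(t - 20e-9 - shift) / 2e-9))
        fall = 1 / (1 + np.exp((t - 80e-9 - shift) / 2e-9))
        return 0.2e6 * rise * fall * (1 - 2e6 * (t - 50e-9))

    n_groups = 6
    baseline_shifts = np.zeros(n_groups)

    def energy_spread(shifts):
        """Relative spread of the summed voltage over the flat-top window
        when each cell group is advanced/retarded by its own shift."""
        total = sum(cell_waveform(s) for s in shifts)
        window = total[(t > 35e-9) & (t < 65e-9)]
        return window.std() / window.mean()

    res = minimize(energy_spread, baseline_shifts, method="Nelder-Mead")
    print("baseline spread: ", energy_spread(baseline_shifts))
    print("optimized spread:", res.fun, "with shifts (s):", res.x)
    ```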

  18. Change in Motor Function and Adverse Health Outcomes in Older African Americans

    PubMed Central

    Buchman, Aron S.; Wilson, Robert S.; Leurgans, Sue E.; Bennett, David A.; Barnes, Lisa L.

    2015-01-01

    Objective We tested whether declining motor function accelerates with age in older African Americans. Methods Eleven motor performances were assessed annually in 513 older African Americans. Results During 5 years of follow-up, linear mixed-effects models showed that motor function declined by about 0.03 units/yr (estimate, −0.026, p<0.001), about 4% more rapidly for each additional year of age at baseline. A proportional hazards model showed that both the baseline level of motor function and its rate of change were independent predictors of death and incident disability (all p's <0.001). These models showed that the additional annual motor decline in persons aged 85 years at baseline, versus 65 years, was associated with a 1.5-fold higher rate of death and a 3-fold higher rate of developing Katz disability. Conclusions The rate of motor function decline accelerates with increasing age, and the rate of decline predicts adverse health outcomes in older African Americans. PMID:26209439
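
    A linear mixed-effects model of the kind used here can be written in a few lines with statsmodels: a random intercept and slope per participant, plus a time-by-baseline-age interaction for the acceleration of decline. The data below are simulated to echo the reported effect sizes; nothing else about the cohort is reproduced.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical longitudinal data: annual composite motor scores.
    rng = np.random.default_rng(7)
    n, waves = 200, 6
    df = pd.DataFrame({
        "id": np.repeat(np.arange(n), waves),
        "year": np.tile(np.arange(waves), n),
        "base_age": np.repeat(rng.uniform(65, 90, n), waves),
    })
    # Decline of ~0.026 units/yr, accelerating with baseline age.
    slope = -0.026 * (1 + 0.04 * (df["base_age"] - 75))
    df["motor"] = 1.0 + slope * df["year"] + rng.normal(0, 0.1, len(df))

    # Random intercept and random slope on time for each participant; the
    # year x baseline-age interaction captures the acceleration with age.
    model = smf.mixedlm("motor ~ year * base_age", df, groups=df["id"],
                        re_formula="~year")
    print(model.fit().summary())
    ```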

  19. National facilities study. Volume 3: Mission and requirements model report

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The National Facility Study (NFS) was initiated in 1992 by Daniel S. Goldin, Administrator of NASA, as an initiative to develop a comprehensive and integrated long-term plan for future facilities. The resulting multi-agency NFS consisted of three Task Groups: Aeronautics, Space Operations, and Space Research and Development (R&D) Task Groups. A fourth group, the Engineering and Cost Analysis Task Group, was subsequently added to provide cross-cutting functions, such as assuring consistency in developing an inventory of space facilities. Space facilities decisions require an assessment of current and future needs. Therefore, the two task groups dealing with space developed a consistent model of future space mission programs, operations and R&D. The model is a middle ground baseline constructed for NFS analytical purposes with excursions to cover potential space program strategies. The model includes three major sectors: DOD, civilian government, and commercial space. The model spans the next 30 years because of the long lead times associated with facilities development and usage. This document, Volume 3 of the final NFS report, is organized along the following lines: Executive Summary -- provides a summary view of the 30-year mission forecast and requirements baseline, an overview of excursions from that baseline that were studied, and organization of the report; Introduction -- provides discussions of the methodology used in this analysis; Baseline Model -- provides the mission and requirements model baseline developed for Space Operations and Space R&D analyses; Excursions from the baseline -- reviews the details of variations or 'excursions' that were developed to test the future program projections captured in the baseline; and a Glossary of Acronyms.

  20. Matched Comparison Group Design Standards in Systematic Reviews of Early Childhood Interventions.

    PubMed

    Thomas, Jaime; Avellar, Sarah A; Deke, John; Gleason, Philip

    2017-06-01

    Systematic reviews assess the quality of research on program effectiveness to help decision makers faced with many intervention options. Study quality standards specify criteria that studies must meet, including accounting for baseline differences between intervention and comparison groups. We explore two issues related to systematic review standards: covariate choice and choice of estimation method. To help systematic reviews develop and refine quality standards, and to support researchers in using nonexperimental designs to estimate program effects, we address two questions: (1) How well do variables that systematic reviews typically require studies to account for explain variation in key child and family outcomes? (2) What methods should studies use to account for preexisting differences between intervention and comparison groups? We examined correlations between baseline characteristics and key outcomes using Early Childhood Longitudinal Study-Birth Cohort data to address Question 1. For Question 2, we used simulations to compare two methods, matching and regression adjustment, for accounting for preexisting differences between intervention and comparison groups. A broad range of potential baseline variables explained relatively little of the variation in child and family outcomes. This suggests the potential for bias even after accounting for these variables, highlighting the need for systematic reviews to provide appropriate cautions about interpreting the results of moderately rated, nonexperimental studies. Our simulations showed that regression adjustment can yield unbiased estimates if all relevant covariates are used, even when the model is misspecified and preexisting differences between the intervention and the comparison groups exist.
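
    The simulation logic behind Question 2 can be illustrated compactly: generate a confounded treatment assignment, then compare the naive difference in means with a regression-adjusted estimate. This toy version, with a single covariate and zero true effect, shows the adjustment removing the baseline-imbalance bias; it is a sketch, not the paper's simulation design.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def one_trial(n=2000, effect=0.0):
        """Nonexperimental comparison with a confounded baseline covariate."""
        x = rng.normal(0, 1, n)                        # baseline covariate
        treat = rng.binomial(1, 1 / (1 + np.exp(-x)))  # selection depends on x
        y = effect * treat + 0.8 * x + rng.normal(0, 1, n)

        # Naive difference in means: biased by the baseline imbalance.
        naive = y[treat == 1].mean() - y[treat == 0].mean()

        # Regression adjustment: OLS of y on treatment and the covariate.
        X = np.column_stack([np.ones(n), treat, x])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        return naive, beta[1]

    results = np.array([one_trial() for _ in range(500)])
    print("mean naive estimate (true effect = 0):    %.3f" % results[:, 0].mean())
    print("mean adjusted estimate (true effect = 0): %.3f" % results[:, 1].mean())
    ```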

  1. Adaptive Baseline Enhances EM-Based Policy Search: Validation in a View-Based Positioning Task of a Smartphone Balancer

    PubMed Central

    Wang, Jiexin; Uchibe, Eiji; Doya, Kenji

    2017-01-01

    EM-based policy search methods estimate a lower bound of the expected return from the histories of episodes and iteratively update the policy parameters using the maximum of a lower bound of expected return, which makes gradient calculation and learning rate tuning unnecessary. Previous algorithms such as Policy Learning by Weighting Exploration with the Returns, Fitness Expectation Maximization, and EM-based Policy Hyperparameter Exploration implemented mechanisms to discard useless low-return episodes, either implicitly or using a fixed baseline determined by the experimenter. In this paper, we propose an adaptive baseline method to discard worse samples from the reward history and examine different baselines, including the mean and multiples of SDs from the mean. The simulation results on the benchmark tasks of pendulum swing-up and cart-pole balancing, and on standing up and balancing of a two-wheeled smartphone robot, showed improved performance. We further implemented the adaptive baseline with the mean in our two-wheeled smartphone robot hardware to test its performance in the standing up and balancing task, and in a view-based approaching task. Our results showed that with the adaptive baseline, the method outperformed the previous algorithms and achieved faster and more precise behaviors at a higher success rate. PMID:28167910
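
    A minimal sketch of the adaptive-baseline idea, assuming a Gaussian search distribution and a toy quadratic objective: episodes whose return falls below mean + k*SD of the current batch are discarded, and the survivors update the policy by reward weighting, with no gradient or learning rate. The objective and constants are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def em_update(theta, sigma, n_episodes=100, k=0.0):
        """One reward-weighted (EM-style) update of a Gaussian policy over a
        parameter vector. Episodes with return below the adaptive baseline
        (mean + k*SD of the batch) are discarded; k = 0 keeps only
        above-average episodes."""
        samples = theta + sigma * rng.standard_normal((n_episodes, theta.size))
        returns = -np.sum((samples - 1.0) ** 2, axis=1)  # toy objective, max at 1

        baseline = returns.mean() + k * returns.std()    # adaptive baseline
        keep = returns > baseline
        w = returns[keep] - baseline                     # nonnegative weights

        # Reward-weighted mean: no gradient, no learning rate.
        return (w[:, None] * samples[keep]).sum(0) / w.sum()

    theta = np.zeros(4)
    for _ in range(50):
        theta = em_update(theta, sigma=0.3)
    print("final parameters (target is all ones):", np.round(theta, 3))
    ```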

  2. Air/Superfund national technical guidance study series, Volume 2. Estimation of baseline air emission at Superfund sites. Interim report(Final)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-01-01

    This volume is one in a series of manuals prepared for EPA to assist its Remedial Project Managers in assessing the air contaminant pathway and developing input data for risk assessment. The manual provides guidance on developing baseline-emission estimates from hazardous waste sites. Baseline-emission estimates (BEEs) are defined as emission rates estimated for a site in its undisturbed state. Specifically, the manual is intended to: present a protocol for selecting the appropriate level of effort to characterize baseline air emissions; assist site managers in designing an approach for BEEs; describe useful technologies for developing site-specific BEEs; and help site managers select the appropriate technologies for generating site-specific BEEs.

  3. Association of Fetal Heart Rate Baseline Change and Neonatal Outcomes.

    PubMed

    Yang, Michael; Stout, Molly J; López, Julia D; Colvin, Ryan; Macones, George A; Cahill, Alison G

    2017-07-01

    Objective The objective of this study was to describe the incidence of baseline change within the normal range during labor and its prediction of neonatal outcomes. Materials and Methods This was a prospective cohort of singleton, nonanomalous, term neonates with continuous electronic fetal monitoring and normal baseline fetal heart rate throughout the last 2 hours of labor. We determined baseline in 10-minute segments using Eunice Kennedy Shriver National Institute of Child Health and Human Development criteria. We evaluated baseline changes of ≥ 20 and ≥ 30 bpm for association with acidemia (umbilical cord arterial pH ≤ 7.10) and neonatal intensive care unit (NICU) admission. Finally, we performed a sensitivity analysis of normal neonates, excluding those with acidemia, NICU admission, or 5-minute Apgar < 4. Results Among all neonates (n = 3,021), 1,267 (41.9%) had a change ≥ 20 bpm; 272 (9.0%) had ≥ 30 bpm. Among normal neonates (n = 2,939), 1,221 (41.5%) had a change ≥ 20 bpm. Acidemia was not associated with baseline change of any direction or magnitude. NICU admission was associated with a decrease ≥ 20 bpm (adjusted odds ratio [aOR]: 2.93; 95% confidence interval [CI]: 1.19-7.21) or a change in any direction ≥ 20 bpm (aOR: 4.06; 95% CI: 1.46-11.29). For decrease ≥ 20 bpm, sensitivity and specificity were 40.0 and 81.7%; for any direction ≥ 20 bpm, 75.0 and 58.3%. Conclusion Changes of normal baseline are common in term labor and poorly predict morbidity, regardless of direction or magnitude.

  4. Associations of Adolescent Emotional and Loss of Control Eating with 1-year Changes in Disordered Eating, Weight and Adiposity

    PubMed Central

    Stojek, Monika M. K.; Tanofsky-Kraff, Marian; Shomaker, Lauren B.; Kelly, Nichole R.; Thompson, Katherine A.; Mehari, Rim D.; Marwitz, Shannon E.; Demidowich, Andrew P.; Galescu, Ovidiu A.; Brady, Sheila M.; Yanovski, Susan Z.; Yanovski, Jack A.

    2016-01-01

    Objective Adolescent emotional-eating, referring to eating in response to negative affective states, is frequently reported by those with loss of control (LOC) eating. Although LOC eating has been shown to predict exacerbated disordered eating and excess weight/adiposity gain, the extent to which emotional-eating, either alone or in combination with LOC, predicts adverse outcomes has not been determined. Thus, we examined associations of baseline emotional-eating with changes in disordered eating, BMI, and adiposity over 1 year, and to what degree the presence or absence of baseline LOC moderated these associations. Methods 189 non-treatment-seeking youth (15.4±1.4y; 66% female; 67% non-Hispanic White; 38% overweight [BMI ≥85th %ile]) completed the Emotional Eating Scale for Children/Adolescents and the Eating Disorder Examination interview at baseline and again at 1 year. Air displacement plethysmography assessed adiposity at both time points. Results Baseline emotional-eating alone was not significantly associated with the development of objective binge eating or with changes in disordered eating attitudes, BMI, or adiposity 1 year later. However, baseline emotional-eating interacted with the presence of baseline LOC in the prediction of 1-year outcomes. Among adolescents with LOC eating, greater baseline emotional-eating was related to increased disordered eating attitudes (p=.03), BMI (p=.04), and adiposity (p=.04) at 1 year, after correcting for the false discovery rate. Discussion Emotional-eating among youth also reporting LOC was associated with adverse outcomes over 1 year. Adolescents who report both behaviors may represent a subset of individuals at especially high risk for exacerbated disordered eating and excess weight gain. PMID:27753140

  5. Concepts for 20/30 GHz satcom systems for direct-to-user applications

    NASA Technical Reports Server (NTRS)

    Jorasch, R.; Davies, R.; Baker, M.

    1980-01-01

    A baseline technique is described for implementing a direct-to-user (DTU) satcom communications system at 20/30 GHz transmission frequency. The purpose of this application is to utilize the high-capacity frequency spectrum at Ka band for communications among thousands of small terminals located at or close to a customer's facility. The baseline DTU system utilizes a TDMA method of communications with QPSK modulation. Twenty-five coverage beams from a geosynchronous-orbit spacecraft provide full coverage of CONUS. Low-cost terminals are limited to less than 4.5 meters in diameter. The impact of rain attenuation on communications availability is examined. Other techniques, including satellite-switched antenna beams, are outlined, and critical Ka-band technology developments are identified.

  6. Automated brain computed tomographic densitometry of early ischemic changes in acute stroke

    PubMed Central

    Stoel, Berend C.; Marquering, Henk A.; Staring, Marius; Beenen, Ludo F.; Slump, Cornelis H.; Roos, Yvo B.; Majoie, Charles B.

    2015-01-01

    The Alberta Stroke Program Early CT Score (ASPECTS) scoring method is frequently used for quantifying early ischemic changes (EICs) in patients with acute ischemic stroke in clinical studies. Varying, and often limited, interobserver agreement has been reported, however. Therefore, our goal was to develop and evaluate an automated brain densitometric method. It divides CT scans of the brain into ASPECTS regions using atlas-based segmentation. EICs are quantified by comparing the brain density between contralateral sides. This method was optimized and validated using CT data from 10 and 63 patients, respectively. The automated method was validated against manual ASPECTS, stroke severity at baseline, and clinical outcome after 7 to 10 days (NIH Stroke Scale, NIHSS) and 3 months (modified Rankin Scale, mRS). Manual and automated ASPECTS showed similar and statistically significant correlations with baseline NIHSS (R=−0.399 and −0.277, respectively) and with follow-up mRS (R=−0.256 and −0.272), except for the follow-up NIHSS. Agreement between automated and consensus ASPECTS reading was similar to the interobserver agreement of manual ASPECTS (differences <1 point in 73% of cases). The automated ASPECTS method could, therefore, be used as a supplementary tool to assist manual scoring. PMID:26158082

  7. Oscillation Baselining and Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).

  8. Developing a climatological / hydrological baseline for climate change impact assessment in a remote mountain region - an example from Peru

    NASA Astrophysics Data System (ADS)

    Salzmann, N.; Huggel, C.; Calanca, P.; Diaz, A.; Jonas, T.; Konzelmann, T.; Lagos, P.; Rohrer, M.; Silverio, W.; Zappa, M.

    2009-04-01

    Changes in the availability of fresh water caused by climatic changes will become a major issue in the coming years and decades. In this context, regions presently depending on water from retreating mountain glaciers are particularly vulnerable. In many parts of the Andes, for example, people already suffer from the impacts of reduced glacier runoff. Therefore, the development and implementation of adequate adaptation measures is an urgent need. To better understand the impact of climate change on water resources in the Andean region, a new research program (PACC - Programa de Adaptación al Cambio Climático en el Perú) between Peru and Switzerland has recently been launched by SDC (Swiss Agency for Development and Cooperation). As a first step, a scientific baseline relative to climatology, hydrology, agriculture and natural disasters will be developed on a regional scale for the Departments of Cusco and Apurimac, in close cooperation with partners from universities and governmental institutions as well as NGOs in Peru. A reliable data baseline is a must for the development of adaptation measures that can effectively cope with the risks induced by climate change. The realization of this task in remote mountain regions, where observational data are generally sparse, however, is challenging. Temporal and spatial gaps must be filled using indirect methods such as re-analyses, remote sensing and interpolation techniques. For future scenarios, the use of climate model output along with statistical and dynamical downscaling is indicated. This contribution will present and discuss approaches and possible concepts to tackle the challenges in a Peruvian context. In addition, first experiences will be reported, particularly on cross-disciplinary issues that naturally emerge from the integrative perspective needed in climate change impact assessments and the development of adaptation strategies.

  9. An Analysis Method for Superconducting Resonator Parameter Extraction with Complex Baseline Removal

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe

    2014-01-01

    A new semi-empirical model is proposed for extracting the quality (Q) factors of arrays of superconducting microwave kinetic inductance detectors (MKIDs). The determination of the total internal and coupling Q factors enables the computation of the loss in the superconducting transmission lines. The method used allows the simultaneous analysis of multiple interacting discrete resonators with the presence of a complex spectral baseline arising from reflections in the system. The baseline removal allows an unbiased estimate of the device response as measured in a cryogenic instrumentation setting.
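
    In this setting, the measured transmission is a resonance dip multiplied by a slowly varying complex baseline, and the two are fitted jointly so the baseline does not bias the Q factors. The sketch below uses a standard resonator line shape and a linear complex baseline as stand-ins for the paper's semi-empirical model; all parameter values are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def s21_model(f, f0, Q, Qc, a_re, a_im, slope):
        """Resonator dip multiplied by a complex linear baseline (a stand-in
        for the reflection-induced spectral baseline)."""
        resonance = 1 - (Q / Qc) / (1 + 2j * Q * (f - f0) / f0)
        baseline = (a_re + 1j * a_im) * (1 + slope * (f - f.mean()))
        return baseline * resonance

    def residuals(p, f, data):
        model = s21_model(f, *p)
        return np.concatenate([(model - data).real, (model - data).imag])

    # Synthetic data: f0 = 4.5 GHz, Q = 50e3, Qc = 80e3, plus noise.
    f = np.linspace(4.4995e9, 4.5005e9, 400)
    true = (4.5e9, 5e4, 8e4, 0.9, 0.2, 1e-7)
    rng = np.random.default_rng(2)
    data = s21_model(f, *true) + 0.002 * (rng.standard_normal(400)
                                          + 1j * rng.standard_normal(400))

    p0 = (4.5e9, 3e4, 6e4, 1.0, 0.0, 0.0)
    fit = least_squares(residuals, p0, args=(f, data), x_scale="jac")
    f0, Q, Qc = fit.x[:3]
    Qi = 1 / (1 / Q - 1 / Qc)  # internal Q from total and coupling Q
    print(f"f0 = {f0/1e9:.6f} GHz, Q = {Q:.0f}, Qc = {Qc:.0f}, Qi = {Qi:.0f}")
    ```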

  10. Measuring continuous baseline covariate imbalances in clinical trial data

    PubMed Central

    Ciolino, Jody D.; Martin, Renee’ H.; Zhao, Wenle; Hill, Michael D.; Jauch, Edward C.; Palesch, Yuko Y.

    2014-01-01

    This paper presents and compares several methods of measuring continuous baseline covariate imbalance in clinical trial data. Simulations illustrate that though the t-test is an inappropriate method of assessing continuous baseline covariate imbalance, the test statistic itself is a robust measure for capturing imbalance in continuous covariate distributions. Guidelines for assessing the effects of imbalance on bias, type I error rate, and power of the hypothesis test for treatment effect on continuous outcomes are presented, and the benefit of covariate-adjusted analysis (ANCOVA) is also illustrated. PMID:21865270
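
    The distinction drawn here, between the t statistic as an imbalance measure and the t-test as an (inappropriate) hypothesis test, is easy to make concrete. A sketch computing the t statistic alongside the standardized difference for a simulated baseline covariate; the data are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)

    # Baseline covariate (e.g., age) in two randomized arms of unequal size.
    arm_a = rng.normal(62.0, 10.0, 180)
    arm_b = rng.normal(64.0, 10.0, 150)

    # The t statistic captures the size of the imbalance relative to its
    # sampling variability; the paper's point is that the *statistic* is a
    # useful measure even though the hypothesis *test* is inappropriate
    # (under randomization the null is true by design).
    t_stat, _ = stats.ttest_ind(arm_a, arm_b)

    # Standardized difference, a common alternative imbalance measure.
    pooled_sd = np.sqrt((arm_a.var(ddof=1) + arm_b.var(ddof=1)) / 2)
    std_diff = (arm_a.mean() - arm_b.mean()) / pooled_sd

    print(f"t statistic: {t_stat:.2f}, standardized difference: {std_diff:.2f}")
    ```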

  11. Automatic identification and normalization of dosage forms in drug monographs

    PubMed Central

    2012-01-01

    Background Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered in a wide variety of websites of different quality and credibility. Methods As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying drug's dosage form information in addition to drug name recognition. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results Our method represents a significant improvement compared with a baseline lookup approach, achieving overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431

  12. Use of a Respondent-Generated Personal Code for Matching Anonymous Adolescent Surveys in Longitudinal Studies.

    PubMed

    Ripper, Lisa; Ciaravino, Samantha; Jones, Kelley; Jaime, Maria Catrina D; Miller, Elizabeth

    2017-06-01

    Research on sensitive and private topics relies heavily on self-reported responses. Social desirability bias may reduce the accuracy and reliability of self-reported responses. Anonymous surveys appear to improve the likelihood of honest responses. A challenge with prospective research is maintaining anonymity while linking individual surveys over time. We have tested a secret code method in which participants create their own code based on eight questions whose answers are not expected to change. In an ongoing middle school trial, 95.7% of follow-up surveys are matched to a baseline survey after allowing up to two code variables to change. The percentage matched improves when up to four changes are allowed (99.7%). The use of a secret code as an anonymous identifier for linking baseline and follow-up surveys is feasible for use with adolescents. While developed for violence prevention research, this method may be useful in other sensitive health behavior research.
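
    The matching rule is essentially a nearest-neighbor search under a count-of-mismatches distance, with a tolerance of two (or, in the sensitivity analysis, four) changed elements. A sketch, with hypothetical code elements:

    ```python
    from typing import Sequence

    def code_distance(a: Sequence[str], b: Sequence[str]) -> int:
        """Number of mismatched elements between two respondent-generated
        codes (each code is the fixed-order tuple of answers to the eight
        questions)."""
        return sum(x != y for x, y in zip(a, b))

    def match(baseline: Sequence[str], followups: list,
              max_changes: int = 2):
        """Return the follow-up code closest to the baseline code, provided
        it differs in at most max_changes elements."""
        best = min(followups, key=lambda c: code_distance(baseline, c))
        return best if code_distance(baseline, best) <= max_changes else None

    baseline = ("J", "S", "03", "M", "B", "2", "R", "P")
    followups = [("J", "S", "03", "F", "B", "2", "R", "P"),  # one change
                 ("A", "K", "11", "F", "G", "7", "T", "Q")]
    print(match(baseline, followups))  # matches the first code (1 change <= 2)
    ```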

  13. Dietary habits among the JPHC study participants at baseline survey. Japan Public Health Center-based Prospective Study on Cancer and Cardiovascular Diseases.

    PubMed

    Tsugane, S; Sasaki, S; Kobayashi, M; Tsubono, Y; Sobue, T

    2001-10-01

    Dietary habit is closely associated with the development of cancer and cardiovascular diseases; however, little prospective evidence has been published for Japanese, whose dietary habits differ substantially from those of Western countries. Therefore, frequencies of food consumption, food preference, cooking method, and acceptance of dietary advice were investigated at baseline using two kinds of self-administered food frequency questionnaires. Dietary habits showed recognizable differences between urban and rural areas (Tokyo and Osaka vs. others) and between Okinawa and non-Okinawa. So-called westernized foods such as bread, beef, and coffee were consumed more frequently in urban areas such as Tokyo and Osaka and also in Okinawa. The frequencies of salted food intake, such as pickled vegetables and salted seafoods, were remarkably low in Okinawa. Cooking methods for meats, seafoods, and vegetables were also unique in Okinawa. No distinct geographical difference was shown in food preference or in modification of dietary habit by dietary advice.

  14. Hand-rearing, growth, and development of common loon (Gavia immer) chicks

    USGS Publications Warehouse

    Kenow, Kevin P.; Meier, Melissa S.; McColl, Laurie E.; Hines, Randy K.; Pichner, Jimmy; Johnson, Laura; Lyon, James E.; Scharold, Kellie Kroc; Meyer, Michael

    2014-01-01

    Common loon chicks were reared in captivity in association with studies to evaluate the effects of radiotransmitter implants and to assess the ecological risk of dietary methylmercury. Here we report on hatching and rearing methods used to successfully raise chicks to 105 days of age. We experienced a 91.5% hatch rate, and 89.6% of loon chicks survived to the end of the study at 105 days. Baseline information on observed rates of fish consumption, behavioral development, and growth patterns are provided. Husbandry techniques are provided that should prove valuable to wildlife rehabilitators caring for abandoned or injured loons, and biologists contemplating methods for restoring loons to areas within their former breeding range.

  15. Analysis of baseline, average, and longitudinally measured blood pressure data using linear mixed models.

    PubMed

    Hossain, Ahmed; Beyene, Joseph

    2014-01-01

    This article compares baseline, average, and longitudinal data analysis methods for identifying genetic variants in a genome-wide association study using the Genetic Analysis Workshop 18 data. We apply methods that include (a) linear mixed models with baseline measures, (b) random intercept linear mixed models with mean measures as the outcome, and (c) random intercept linear mixed models with longitudinal measurements. In the linear mixed models, covariates are included as fixed effects, whereas relatedness among individuals is incorporated as the variance-covariance structure of the random effect for the individuals. The overall strategy of applying linear mixed models to decorrelate the data is based on Aulchenko et al.'s GRAMMAR. By analyzing systolic and diastolic blood pressure, which are used separately as outcomes, we compare the 3 methods in identifying a known genetic variant on chromosome 3 that is associated with blood pressure, using simulated phenotype data. We also analyze the real phenotype data to illustrate the methods. We conclude that the linear mixed model with longitudinal measurements of diastolic blood pressure is the most accurate of the methods at identifying the known single-nucleotide polymorphism, but linear mixed models with baseline measures perform best with systolic blood pressure as the outcome.

  16. The Relationship Between Balance Measured With a Modified Bathroom Scale and Falls and Disability in Older Adults: A 6-Month Follow-Up Study

    PubMed Central

    2015-01-01

    Background There are indications that older adults who suffer from poor balance have an increased risk for adverse health outcomes, such as falls and disability. Monitoring the development of balance over time enables early detection of balance decline, which can identify older adults who could benefit from interventions aimed at prevention of these adverse outcomes. An innovative and easy-to-use device that can be used by older adults for home-based monitoring of balance is a modified bathroom scale. Objective The objective of this paper is to study the relationship between balance scores obtained with a modified bathroom scale and falls and disability in a sample of older adults. Methods For this 6-month follow-up study, participants were recruited via physiotherapists working in a nursing home, geriatricians, exercise classes, and at an event about health for older adults. Inclusion criteria were being aged 65 years or older, being able to stand on a bathroom scale independently, and able to provide informed consent. A total of 41 nursing home patients and 139 community-dwelling older adults stepped onto the modified bathroom scale three consecutive times at baseline to measure their balance. Their mean balance scores on a scale from 0 to 16 were calculated—higher scores indicated better balance. Questionnaires were used to study falls and disability at baseline and after 6 months of follow-up. The cross-sectional relationship between balance and falls and disability at baseline was studied using t tests and Spearman rank correlations. Univariate and multivariate logistic regression analyses were conducted to study the relationship between balance measured at baseline and falls and disability development after 6 months of follow-up. Results A total of 128 participants with complete datasets—25.8% (33/128) male—and a mean age of 75.33 years (SD 6.26) were included in the analyses of this study. Balance scores of participants who reported at baseline that they had fallen at least once in the past 6 months were lower compared to nonfallers—8.9 and 11.2, respectively (P<.001). The correlation between mean balance score and disability sum-score at baseline was -.51 (P<.001). No significant associations were found between balance at baseline and falls after 6 months of follow-up. Baseline balance scores were significantly associated with the development of disability after 6 months of follow-up in the univariate analysis—odds ratio (OR) 0.86 (95% CI 0.76-0.98)—but not in the multivariate analysis when correcting for age, gender, baseline disability, and falls at follow-up—OR 0.94 (95% CI 0.79-1.11). Conclusions There is a cross-sectional relationship between balance measured by a modified bathroom scale and falls and disability in older adults. Despite this cross-sectional relationship, longitudinal data showed that balance scores have no predictive value for falls and might only have limited predictive value for disability development after 6 months of follow-up. PMID:26018423

  17. Development and validation of a fast static headspace GC method for determination of residual solvents in permethrin.

    PubMed

    Tian, Jingzhi; Rustum, Abu

    2016-09-05

    A fast static headspace gas chromatography (HS-GC) method was developed to separate all residual solvents present in commercial active pharmaceutical ingredient (API) batches of permethrin. A total of six residual solvents, namely 2-methylpentane, 3-methylpentane, methylcyclopentane, n-hexane, cyclohexane, and toluene, were found in typical commercial batches of permethrin; three of them are not in the list of ICH solvents. All six residual solvents were baseline-separated in five minutes by the new method presented in this paper. The method was successfully validated per International Conference on Harmonisation (ICH) guidelines. The method was also evaluated for separating 26 solvents commonly used in the manufacturing of various APIs, key API intermediates, and pharmaceutical excipients. The results of the evaluation demonstrated that this method can also be used as a general method to determine residual solvents in various APIs, intermediates, and excipients that are used in pharmaceutical products.

  18. Evolving Learning Paradigms: Re-Setting Baselines and Collection Methods of Information and Communication Technology in Education Statistics

    ERIC Educational Resources Information Center

    Gibson, David; Broadley, Tania; Downie, Jill; Wallet, Peter

    2018-01-01

    The UNESCO Institute for Statistics (UIS) has been measuring ICT in education since 2009, but with such rapid change in technology and its use in education, it is important now to revise the collection mechanisms to focus on how technology is being used to enhance learning and teaching. Sustainable development goal (SDG) 4, for example, moves…

  19. Assessing the Treatment Effects in Apraxia of Speech: Introduction and Evaluation of the Modified Diadochokinesis Test

    ERIC Educational Resources Information Center

    Hurkmans, Joost; Jonkers, Roel; Boonstra, Anne M.; Stewart, Roy E.; Reinders-Messelink, Heleen A.

    2012-01-01

    Background: The number of reliable and valid instruments to measure the effects of therapy in apraxia of speech (AoS) is limited. Aims: To evaluate the newly developed Modified Diadochokinesis Test (MDT), which is a task to assess the effects of rate and rhythm therapies for AoS in a multiple baseline across behaviours design. Methods: The…

  20. Improved quality-by-design compliant methodology for method development in reversed-phase liquid chromatography.

    PubMed

    Debrus, Benjamin; Guillarme, Davy; Rudaz, Serge

    2013-10-01

    A complete strategy dedicated to quality-by-design (QbD) compliant method development using design of experiments (DOE), multiple linear regression response modelling, and Monte Carlo simulations for error propagation was evaluated for liquid chromatography (LC). The proposed approach includes four main steps: (i) the initial screening of column chemistry, mobile phase pH, and organic modifier; (ii) the selectivity optimization through changes in gradient time and mobile phase temperature; (iii) the adaptation of column geometry to reach sufficient resolution; and (iv) the robust resolution optimization and identification of the method design space. This procedure was employed to obtain a complex chromatographic separation of 15 widely prescribed basic antipsychotic drugs. To fully automate and expedite the QbD method development procedure, short columns packed with sub-2 μm particles were employed, together with a UHPLC system possessing column- and solvent-selection valves. Through this example, the possibilities of the proposed QbD method development workflow were demonstrated and the different steps of the automated strategy were critically discussed. A baseline separation of the mixture of antipsychotic drugs was achieved with an analysis time of less than 15 min, and the robustness of the method was demonstrated simultaneously with the method development phase.
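
    The design-space step combines the fitted response models with Monte Carlo error propagation: coefficient uncertainty is sampled, and the design space is the region of operating conditions where the probability of meeting the resolution criterion exceeds a threshold. A toy version with a hypothetical two-factor quadratic model for critical resolution (coefficients, covariance, and domain are all illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Hypothetical fitted model for the critical resolution Rs of the worst
    # separated peak pair as a function of gradient time tG (min) and mobile
    # phase temperature T (C): quadratic response surface from the DOE step.
    beta = np.array([1.2, 0.08, 0.015, -0.002, -0.0004, 0.0003])
    cov = np.diag([0.01, 1e-4, 1e-5, 1e-7, 1e-8, 1e-8])  # coefficient covariance

    def rs(tG, T, b):
        return (b[0] + b[1]*tG + b[2]*T
                + b[3]*tG**2 + b[4]*T**2 + b[5]*tG*T)

    # Monte Carlo error propagation: draw plausible models from the
    # coefficient distribution and compute P(Rs >= 1.5) over the domain.
    draws = rng.multivariate_normal(beta, cov, size=2000)
    tG_grid, T_grid = np.meshgrid(np.linspace(5, 30, 26),
                                  np.linspace(25, 50, 26))
    prob = np.zeros_like(tG_grid)
    for b in draws:
        prob += rs(tG_grid, T_grid, b) >= 1.5
    prob /= len(draws)

    # The design space is the region where success probability is high.
    print("fraction of domain with P(Rs >= 1.5) >= 0.95:",
          (prob >= 0.95).mean())
    ```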

  1. Intensive care nurses' knowledge of pressure ulcers: development of an assessment tool and effect of an educational program.

    PubMed

    Tweed, Carol; Tweed, Mike

    2008-07-01

    Critically ill patients are at high risk for pressure ulcers. Successful prevention of pressure ulcers requires that caregivers have adequate knowledge of this complication. To assess intensive care nurses' knowledge of pressure ulcers and the impact of an educational program on knowledge levels, a knowledge assessment test was developed. A cohort of registered nurses in a tertiary referral hospital in New Zealand had their knowledge assessed 3 times: before an educational program, within 2 weeks after the program, and 20 weeks later. Multivariate analysis was performed to determine whether attributes such as length of time since qualifying or level of intensive care unit experience were associated with test scores. The content and results of the assessment test were evaluated. Completion of the educational program resulted in improved levels of knowledge. Mean scores on the assessment test were 84% at baseline and 89% following the educational program. The mean baseline score did not differ significantly from the mean 20-week follow-up score of 85%. No association was detected between demographic data and test scores. Content validity and standard setting were verified using a variety of methods. Levels of knowledge for preventing and managing pressure ulcers were good initially and improved with the educational program, but soon returned to baseline.

  2. Application of ALOS and Envisat Data in Improving Multi-Temporal InSAR Methods for Monitoring Damavand Volcano and Landslide Deformation in the Center of Alborz Mountains, North Iran

    NASA Astrophysics Data System (ADS)

    Vajedian, S.; Motagh, M.; Nilfouroushan, F.

    2013-09-01

    The capacity of InSAR to detect slow deformation over terrain areas is limited by temporal and geometric decorrelation. Multitemporal InSAR techniques involving Persistent Scatterer InSAR (PS-InSAR) and Small Baseline Subset (SBAS) methods have recently been developed to compensate for the decorrelation problems. Geometric decorrelation in mountainous areas, especially for Envisat images, makes the phase unwrapping process difficult. To address this unwrapping problem, we first modified the phase filtering to make the wrapped phase image as smooth as possible. In addition, a modified unwrapping method has been developed to improve the unwrapping results. This method includes removing possible orbital and tropospheric effects: topographic correction is done within three-dimensional unwrapping, while orbital and tropospheric corrections are done after the unwrapping process. To evaluate the effectiveness of our improved method, we tested the proposed algorithm on Envisat and ALOS datasets and compared our results with recently developed PS software (StaMPS). In addition, we used GPS observations to evaluate the modified method. The results indicate that our method improves the estimated deformation significantly.

  3. 10 CFR 850.20 - Baseline beryllium inventory.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 850.20 Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the locations of beryllium operations and other locations of potential beryllium contamination, and identify the...

  4. Method for controlling powertrain pumps

    DOEpatents

    Sime, Karl Andrew; Spohn, Brian L; Demirovic, Besim; Martini, Ryan D; Miller, Jean Marie

    2013-10-22

    A method of controlling a pump supplying a fluid to a transmission includes sensing a requested power and an excess power for a powertrain. The requested power substantially meets the needs of the powertrain, while the excess power is not part of the requested power. The method includes sensing a triggering condition in response to the ability to convert the excess power into heat in the transmission, and determining that an operating temperature of the transmission is below a maximum. The method also includes determining a calibrated baseline command and a dissipation command for the pump. The calibrated baseline command is configured to supply the fluid based upon the requested power, and the dissipation command is configured to supply additional fluid and consume the excess power with the pump. The method operates the pump at a combined command, which is equal to the calibrated baseline command plus the dissipation command.
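
    The control law in the final sentence reduces to a sum of two commands, gated by the triggering condition and the temperature limit. A sketch, with illustrative scaling constants (the patent does not specify them):

    ```python
    def pump_command(requested_power_kw: float,
                     excess_power_kw: float,
                     trans_temp_c: float,
                     max_temp_c: float = 120.0) -> float:
        """Combined pump command: a calibrated baseline command sized for the
        requested power, plus a dissipation command that consumes excess
        power as heat in the transmission, applied only while the
        transmission stays below its maximum operating temperature. The
        scaling constants here are illustrative placeholders."""
        baseline_cmd = 0.05 * requested_power_kw       # calibrated baseline
        if excess_power_kw > 0 and trans_temp_c < max_temp_c:
            dissipation_cmd = 0.08 * excess_power_kw   # extra fluid flow
        else:
            dissipation_cmd = 0.0
        return baseline_cmd + dissipation_cmd

    print(pump_command(requested_power_kw=40.0, excess_power_kw=10.0,
                       trans_temp_c=90.0))  # baseline + dissipation
    ```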

  5. School Sun Protection Policies: Measure Development and Assessments in Two Regions of the United States

    PubMed Central

    Buller, David B.; French, Simone A.; Buller, Mary K.; Ashley, Jeff L.

    2012-01-01

    BACKGROUND In 2002, the US Centers for Disease Control and Prevention recommended that schools adopt policies that reduce exposure of children to ultraviolet radiation to prevent skin cancer. We report here the development of a school sun safety policy measure and baseline descriptive statistics from the assessment of written policies collected in 2005-2007 from public school districts that enrolled in a randomized trial evaluating a policy promotion program. METHODS Written policies were collected from 103 of 112 school districts in Colorado and Southern California prior to randomization. We developed methods for selecting policy headings/sections topics likely to contain sun safety policies for students and for assessing the presence, strength, and intent of policies. Trained coders assessed the content of each policy document. RESULTS Overall, 31% of districts had a policy addressing sun safety, most commonly, protective clothing, hats, sunscreen, and education at baseline. More California districts (51.9%) had these policies than Colorado districts (7.8%, p<.001). Policy scores were highest in districts with fewer Caucasian students (b=-0.02, p=.022) in Colorado (b=-0.02, p=.007) but not California (b=0.01, p=.299). CONCLUSION The protocol for assessing sun safety policy in board-approved written policy documents had several advantages over surveys of school officials. Sun protection policies were uncommon and limited in scope in 2005-2007. California has been more active at legislating school policy than Colorado. School district policies remain a largely untapped method for promoting the sun protection of children. PMID:23061553

  6. The Design of Time-Series Comparisons under Resource Constraints.

    ERIC Educational Resources Information Center

    Willemain, Thomas R.; Hartunian, Nelson S.

    1982-01-01

    Two methods for dividing an interrupted time-series study between baseline and experimental phases when study resources are limited are compared. In fixed designs, the baseline duration is predetermined. In flexible designs, the baseline duration is contingent on the remaining resources and the match of results to the prior expectations of the evaluator.…

  7. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, or their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated using a combination of levels of treatment effect, pretest-posttest correlation, and direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, depending on the pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when the pretest-posttest correlation is ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. The apparently greater power of ANOVA and CSA at certain imbalances is achieved at the cost of a biased estimate of the treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and directions of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
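
    The directional biases reported here are easy to reproduce: with a baseline imbalance favoring the treated arm and a positive pretest-posttest correlation rho, ANOVA is biased upward by rho times the imbalance, CSA downward by (1 − rho) times the imbalance, and ANCOVA is unbiased. A compact simulation with hypothetical parameter values, not the paper's 126 scenarios:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    def one_trial(n=100, effect=0.5, rho=0.5, imbalance=0.3):
        """Pre/post trial with a baseline imbalance favoring the treated
        group."""
        group = np.repeat([0, 1], n)
        pre = rng.normal(0, 1, 2 * n) + imbalance * group
        post = (effect * group + rho * pre
                + rng.normal(0, np.sqrt(1 - rho**2), 2 * n))

        anova = post[group == 1].mean() - post[group == 0].mean()
        change = post - pre
        csa = change[group == 1].mean() - change[group == 0].mean()

        # ANCOVA: post ~ group + pre (group coefficient).
        X = np.column_stack([np.ones(2 * n), group, pre])
        ancova = np.linalg.lstsq(X, post, rcond=None)[0][1]
        return anova, csa, ancova

    est = np.array([one_trial() for _ in range(2000)])
    for name, col in zip(["ANOVA", "CSA", "ANCOVA"], est.T):
        print(f"{name:7s} mean estimate = {col.mean():.3f} (true effect = 0.5)")
    ```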

  8. Evolution of strategic risks under future scenarios for improved utility master plans.

    PubMed

    Luís, Ana; Lickorish, Fiona; Pollard, Simon

    2016-01-01

    Integrated, long-term risk management in the water sector is poorly developed. Whilst scenario planning has been applied to singular issues (e.g. climate change), it often misses a link to risk management because the likelihood of long-term impacts is frequently unaccounted for in these analyses. Here we apply the morphological approach to scenario development for a case study utility, Empresa Portuguesa das Águas Livres (EPAL). A baseline portfolio of strategic risks threatening the achievement of EPAL's corporate objectives was evolved through the lens of three future scenarios, 'water scarcity', 'financial resource scarcity' and 'strong economic growth', built on drivers such as climate, demographic, economic, regulatory and technological changes and validated through a set of expert workshops. The results represent how the baseline set of risks might develop over a 30-year period, allowing threats and opportunities to be identified and enabling strategies for master plans to be devised. We believe this to be the first combined use of risk and futures methods applied to a portfolio of strategic risks in the water utility sector. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Mapping Forest Height in Gabon Using UAVSAR Multi-Baseline Polarimetric SAR Interferometry and Lidar Fusion

    NASA Astrophysics Data System (ADS)

    Simard, M.; Denbina, M. W.

    2017-12-01

    Using data collected by NASA's Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) and Land, Vegetation, and Ice Sensor (LVIS) lidar, we have estimated forest canopy height for a number of study areas in the country of Gabon using a new machine learning data fusion approach. Using multi-baseline polarimetric synthetic aperture radar interferometry (PolInSAR) data collected by UAVSAR, forest heights can be estimated using the random volume over ground model. In the case of multi-baseline UAVSAR data consisting of many repeat passes with spatially separated flight tracks, we can estimate different forest height values for each different image pair, or baseline. In order to choose the best forest height estimate for each pixel, the baselines must be selected or ranked, taking care to avoid baselines with unsuitable spatial separation, or severe temporal decorrelation effects. The current baseline selection algorithms in the literature use basic quality metrics derived from the PolInSAR data which are not necessarily indicative of the true height accuracy in all cases. We have developed a new data fusion technique which treats PolInSAR baseline selection as a supervised classification problem, where the classifier is trained using a sparse sampling of lidar data within the PolInSAR coverage area. The classifier uses a large variety of PolInSAR-derived features as input, including radar backscatter as well as features based on the PolInSAR coherence region shape and the PolInSAR complex coherences. The resulting data fusion method produces forest height estimates which are more accurate than a purely radar-based approach, while having a larger coverage area than the input lidar training data, combining some of the strengths of each sensor. The technique demonstrates the strong potential for forest canopy height and above-ground biomass mapping using fusion of PolInSAR with data from future spaceborne lidar missions such as the upcoming Global Ecosystems Dynamics Investigation (GEDI) lidar.
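
    The baseline-selection-as-classification idea can be sketched compactly. Below, synthetic arrays stand in for the real per-pixel UAVSAR features, per-baseline RVoG height estimates, and sparse LVIS training heights (all shapes, feature counts, and the choice of scikit-learn's random forest are assumptions for illustration, not the authors' implementation):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Placeholder inputs: for each pixel, quality features and an RVoG height
    # estimate from each of n_b baselines (coherence magnitude, backscatter,
    # coherence-region shape descriptors, ...).
    n_pixels, n_b, n_feat = 5000, 4, 6
    features = rng.normal(size=(n_pixels, n_b * n_feat))  # stacked per-baseline features
    heights = rng.uniform(0, 50, size=(n_pixels, n_b))    # per-baseline RVoG heights
    lidar = rng.uniform(0, 50, size=n_pixels)             # LVIS heights (training truth)

    # Label each lidar-covered pixel with the baseline closest to the lidar height.
    labels = np.abs(heights - lidar[:, None]).argmin(axis=1)
    train = rng.random(n_pixels) < 0.1                    # ~10% lidar coverage

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(features[train], labels[train])

    best = clf.predict(features)                          # chosen baseline per pixel
    fused_height = heights[np.arange(n_pixels), best]     # radar heights, lidar-guided
    ```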

  10. A Computational/Experimental Study of Two Optimized Supersonic Transport Designs and the Reference H Baseline

    NASA Technical Reports Server (NTRS)

    Cliff, Susan E.; Baker, Timothy J.; Hicks, Raymond M.; Reuther, James J.

    1999-01-01

    Two supersonic transport configurations designed by use of non-linear aerodynamic optimization methods are compared with a linearly designed baseline configuration. One optimized configuration, designated Ames 7-04, was designed at NASA Ames Research Center using an Euler flow solver, and the other, designated Boeing W27, was designed at Boeing using a full-potential method. The two optimized configurations and the baseline were tested in the NASA Langley Unitary Plan Supersonic Wind Tunnel to evaluate the non-linear design optimization methodologies. In addition, the experimental results are compared with computational predictions for each of the three configurations from the Euler flow solver, AIRPLANE. The computational and experimental results both indicate moderate to substantial performance gains for the optimized configurations over the baseline configuration. The computed performance changes with and without diverters and nacelles were in excellent agreement with experiment for all three models. Comparisons of the computational and experimental cruise drag increments for the optimized configurations relative to the baseline show excellent agreement for the model designed by the Euler method, but poorer comparisons were found for the configuration designed by the full-potential code.

  11. Evaluation of a solid matrix for collection and ambient storage of RNA from whole blood

    PubMed Central

    2014-01-01

    Background Whole blood gene expression-based molecular diagnostic tests are becoming increasingly available. Conventional tube-based methods for obtaining RNA from whole blood can be limited by phlebotomy, volume requirements, and RNA stability during transport and storage. A dried blood spot matrix for collecting high-quality RNA, called RNA Stabilizing Matrix (RSM), was evaluated against PAXgene® blood collection tubes. Methods Whole blood was collected from 25 individuals and subjected to 3 sample storage conditions: 18 hours at either room temperature (baseline arm) or 37°C, and 6 days at room temperature. RNA was extracted and assessed for integrity by Agilent Bioanalyzer, and gene expression was compared by RT-qPCR across 23 mRNAs comprising a clinical test for obstructive coronary artery disease. Results RSM produced RNA of relatively high integrity across the various tested conditions (mean RIN ± 95% CI: baseline arm, 6.92 ± 0.24; 37°C arm, 5.98 ± 0.48; 6-day arm, 6.72 ± 0.23). PAXgene samples showed comparable RNA integrity in both the baseline and 37°C arms (8.42 ± 0.17 and 7.92 ± 0.10, respectively); however, significant degradation was observed in the 6-day arm (3.19 ± 1.32). Gene expression scores on RSM were highly correlated between the baseline and the 37°C and 6-day study arms (median r = 0.96 and 0.95, respectively), as was the correlation to PAXgene tubes (median r = 0.95, p < 0.001). Conclusion RNA obtained from RSM shows little degradation and comparable RT-qPCR performance to PAXgene RNA for the 23 genes analyzed. Further development of this technology may provide a convenient method for collecting, shipping, and storing RNA for gene expression assays. PMID:24855452

  12. Multi-GNSS high-rate RTK, PPP and novel direct phase observation processing method: application to precise dynamic displacement detection

    NASA Astrophysics Data System (ADS)

    Paziewski, Jacek; Sieradzki, Rafal; Baryla, Radoslaw

    2018-03-01

    This paper provides the methodology and performance assessment of multi-GNSS signal processing for the detection of small-scale high-rate dynamic displacements. For this purpose, we used methods of relative (RTK) and absolute positioning (PPP), and a novel direct signal processing approach. The first two methods are recognized as providing accurate position information in many navigation and surveying applications. The latter is an innovative method for dynamic displacement determination with the use of GNSS phase signal processing. This method is based on the developed functional model with parametrized epoch-wise topocentric relative coordinates derived from filtered GNSS observations. Current regular kinematic PPP positioning, as well as medium/long-range RTK, may not offer coordinate estimates with subcentimeter precision. Thus, extended processing strategies of absolute and relative GNSS positioning have been developed and applied for displacement detection. The study also aimed to comparatively analyze the developed methods, the impact of combined GPS and BDS processing, and the dependence of the relative methods' results on baseline length. All the methods were implemented with in-house developed software allowing for high-rate precise GNSS positioning and signal processing. The phase and pseudorange observations collected at a rate of 50 Hz during the field test served as the experiment's data set. The displacements at the rover station were triggered in the horizontal plane using a device designed and constructed to ensure a periodic motion of the GNSS antenna with an amplitude of ~3 cm and a frequency of ~4.5 Hz. Finally, medium-range RTK, PPP, and the direct phase observation processing method demonstrated the capability of providing reliable and consistent results, with the precision of the determined dynamic displacements at the millimeter level. Specifically, the research shows that the standard deviation of the displacement residuals, obtained as the difference between a benchmark ultra-short-baseline RTK solution and the selected scenarios, ranged between 1.1 and 3.4 mm. At the same time, the differences in the mean amplitude of the oscillations derived from the established scenarios did not exceed 1.3 mm, whereas the frequency of the motion detected with the use of the Fourier transform was the same.
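
    Recovering the frequency and amplitude of such a periodic motion from epoch-wise coordinates is a standard spectral task. A small sketch on a synthetic series with the experiment's nominal parameters (50 Hz rate, ~3 cm amplitude, ~4.5 Hz motion); the window-corrected amplitude estimate and all noise levels are illustrative, not the paper's processing chain:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    fs = 50.0                                    # receiver sampling rate [Hz]
    t = np.arange(0, 60.0, 1.0 / fs)
    # Synthetic topocentric displacement: ~3 cm at ~4.5 Hz plus noise, standing
    # in for the epoch-wise coordinates estimated by RTK/PPP.
    disp = 0.03 * np.sin(2 * np.pi * 4.5 * t) + rng.normal(0, 0.002, t.size)

    disp -= disp.mean()                          # remove residual offset
    win = np.hanning(disp.size)
    spec = np.fft.rfft(disp * win)
    freqs = np.fft.rfftfreq(disp.size, 1.0 / fs)

    peak = np.argmax(np.abs(spec[1:])) + 1       # skip the DC bin
    amp = 2.0 * np.abs(spec[peak]) / win.sum()   # coherent-gain correction for Hann
    print(f"detected {freqs[peak]:.2f} Hz, amplitude {100 * amp:.2f} cm")
    ```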

  13. Dietary Sodium Consumption Predicts Future Blood Pressure and Incident Hypertension in the Japanese Normotensive General Population

    PubMed Central

    Takase, Hiroyuki; Sugiura, Tomonori; Kimura, Genjiro; Ohte, Nobuyuki; Dohi, Yasuaki

    2015-01-01

    Background Although there is a close relationship between dietary sodium and hypertension, the concept that persons with relatively high dietary sodium are at increased risk of developing hypertension compared with those with relatively low dietary sodium has not been studied intensively in a cohort. Methods and Results We conducted an observational study to investigate whether dietary sodium intake predicts future blood pressure and the onset of hypertension in the general population. Individual sodium intake was estimated by calculating 24-hour urinary sodium excretion from spot urine in 4523 normotensive participants who visited our hospital for a health checkup. After a baseline examination, they were followed for a median of 1143 days, with the end point being development of hypertension. During the follow-up period, hypertension developed in 1027 participants (22.7%). The risk of developing hypertension was higher in those with higher rather than lower sodium intake (hazard ratio 1.25, 95% CI 1.04 to 1.50). In multivariate Cox proportional hazards regression analysis, baseline sodium intake and the yearly change in sodium intake during the follow-up period (as continuous variables) correlated with the incidence of hypertension. Furthermore, both the yearly increase in sodium intake and baseline sodium intake showed significant correlations with the yearly increase in systolic blood pressure in multivariate regression analysis after adjustment for possible risk factors. Conclusions Both relatively high levels of dietary sodium intake and gradual increases in dietary sodium are associated with future increases in blood pressure and the incidence of hypertension in the Japanese general population. PMID:26224048
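
    Analyses of this kind typically rest on a Cox proportional hazards fit with baseline intake as a covariate. A hedged sketch of that pattern on simulated data; the lifelines package, the covariates, and every numeric value below are illustrative choices, not the authors' pipeline:

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(7)
    n = 1000
    # Hypothetical cohort: baseline sodium intake (g/day, as estimated from
    # spot urine) and age, with time to hypertension onset or censoring.
    sodium = rng.normal(9.0, 2.0, n).clip(3, 18)
    age = rng.uniform(30, 70, n)
    hazard = 0.0005 * np.exp(0.10 * (sodium - 9.0) + 0.03 * (age - 50))
    time = rng.exponential(1.0 / hazard)             # event times [days]
    follow_up = 1143                                 # median follow-up in the study
    df = pd.DataFrame({
        "sodium": sodium,
        "age": age,
        "time": np.minimum(time, follow_up),
        "event": (time < follow_up).astype(int),     # 1 = developed hypertension
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()      # hazard ratio per g/day of baseline sodium intake
    ```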

  14. [Spatial-temporal evolution characterization of land subsidence by multi-temporal InSAR method and GIS technology].

    PubMed

    Chen, Bei-Bei; Gong, Hui-Li; Li, Xiao-Juan; Lei, Kun-Chao; Duan, Guang-Yao; Xie, Jin-Rong

    2014-04-01

    Long-term over-exploitation of underground resources, together with year-by-year increases in static and dynamic loads, influences the occurrence and development of regional land subsidence. Using 29 Envisat ASAR scenes covering the plain area of Beijing, China, this study applied a multi-temporal InSAR method incorporating both persistent scatterer and small baseline approaches to obtain regional land subsidence monitoring information. Five typical subsidence areas with different underground space development and utilization conditions were selected; combining land-use classification, multi-spectral remote sensing imagery, and geological data with GIS spatial analysis methods, the time-series evolution characteristics of uneven subsidence were analyzed. The comprehensive analysis suggests that the complexity of underground space development and utilization affects the trend of uneven subsidence: the simpler the development and utilization conditions, the smaller the subsidence gradient and the weaker the uneven subsidence trend.

  15. The LEGACY Girls Study: Growth and development in the context of breast cancer family history

    PubMed Central

    John, Esther M.; Terry, Mary Beth; Keegan, Theresa H.M.; Bradbury, Angela R.; Knight, Julia A.; Chung, Wendy K.; Frost, Caren J.; Lilge, Lothar; Patrick-Miller, Linda; Schwartz, Lisa A.; Whittemore, Alice S.; Buys, Saundra S.; Daly, Mary B.; Andrulis, Irene L.

    2017-01-01

    Background Although the timing of pubertal milestones has been associated with breast cancer risk, few studies of girls’ development include girls at increased breast cancer risk due to their family history. Methods The LEGACY (Lessons in Epidemiology and Genetics of Adult Cancer from Youth) Girls Study was initiated in 2011 in the USA and Canada to assess the relation between early-life exposures and intermediate markers of breast cancer risk (e.g., pubertal development, breast tissue characteristics) and to investigate psychosocial well-being and health behaviors in the context of family history. We describe the methods used to establish and follow a cohort of 1,040 girls ages 6–13 years at baseline, half with a breast cancer family history, and the collection of questionnaire data (family history, early-life exposures, growth and development, psychosocial and behavioral), anthropometry, biospecimens, and breast tissue characteristics using optical spectroscopy. Results During this initial 5-year phase of the study, follow-up visits are conducted every six months for repeated data and biospecimen collection. Participation in baseline components was high (98% for urine, 97.5% for blood or saliva, and 98% for anthropometry). At enrollment, 77% of girls were pre-menarcheal and 49% were at breast Tanner stage T1. Conclusions This study design allows thorough examination of events affecting girls’ growth and development and how they differ across the spectrum of breast cancer risk. A better understanding of early-life breast cancer risk factors will be essential to enhance prevention across the lifespan for those with and without a family history of the disease. PMID:26829160

  16. Development of a Mandarin-English Bilingual Speech Recognition System for Real World Music Retrieval

    NASA Astrophysics Data System (ADS)

    Zhang, Qingqing; Pan, Jielin; Lin, Yang; Shao, Jian; Yan, Yonghong

    In recent decades, there has been a great deal of research into the problem of bilingual speech recognition: developing a recognizer that can handle inter- and intra-sentential language switching between two languages. This paper presents our recent work on the development of a grammar-constrained, Mandarin-English bilingual Speech Recognition System (MESRS) for real world music retrieval. Two of the main difficult issues in handling bilingual speech recognition systems for real world applications are tackled in this paper. One is to balance the performance and the complexity of the bilingual speech recognition system; the other is to effectively deal with matrix language accents in the embedded language. In order to process the intra-sentential language switching and reduce the amount of data required to robustly estimate statistical models, a compact single set of bilingual acoustic models derived by phone set merging and clustering is developed instead of using two separate monolingual models for each language. In our study, a novel Two-pass phone clustering method based on Confusion Matrix (TCM) is presented and compared with the log-likelihood measure method. Experiments show that TCM achieves better performance. Since potential system users' native language is Mandarin, which is regarded as the matrix language in our application, their pronunciations of English as the embedded language usually contain Mandarin accents. In order to deal with matrix language accents in the embedded language, different non-native adaptation approaches are investigated. Experiments show that the model retraining method outperforms other common adaptation methods such as Maximum A Posteriori (MAP). With the effective incorporation of approaches on phone clustering and non-native adaptation, the Phrase Error Rate (PER) of MESRS for English utterances was reduced by a relative 24.47% compared to the baseline monolingual English system, while the PER on Mandarin utterances was comparable to that of the baseline monolingual Mandarin system. The performance for bilingual utterances achieved a 22.37% relative PER reduction.
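
    The abstract does not fully specify the two-pass TCM procedure, but its core idea, merging phones whose recognition confusions are high, can be sketched with a single agglomerative pass (the phone set, confusion counts, and cluster count below are invented for illustration):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(3)
    # Hypothetical confusion counts from a first-pass recognizer:
    # conf[i, j] = how often phone i was recognized as phone j.
    phones = ["a", "e", "i", "o", "u", "b", "p", "d", "t"]
    n = len(phones)
    conf = rng.integers(0, 50, (n, n)) + np.diag(rng.integers(200, 400, n))

    # Confusions -> symmetric similarity -> distance.
    row = conf / conf.sum(axis=1, keepdims=True)   # P(recognized j | spoken i)
    sim = 0.5 * (row + row.T)
    dist = 1.0 - sim / sim.max()
    np.fill_diagonal(dist, 0.0)

    # Agglomerative clustering; merged clusters become shared bilingual phones.
    Z = linkage(squareform(dist, checks=False), method="average")
    merged = fcluster(Z, t=4, criterion="maxclust")
    for k in sorted(set(merged)):
        print(k, [p for p, c in zip(phones, merged) if c == k])
    ```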

  17. An improved method for the analysis of sennosides in Cassia angustifolia by high-performance liquid chromatography.

    PubMed

    Bala, S; Uniyal, G C; Dubey, T; Singh, S P

    2001-01-01

    A reversed-phase column liquid chromatographic method for the analysis of sennosides A and B present in leaf and pod extracts of Cassia angustifolia has been developed using a Symmetry C18 column and a linear binary gradient profile. The method can be utilised for the quantitative determination of other sennosides as a baseline resolution for most of the constituents was achieved. The method is economical in terms of the time taken and the amount of solvent used (25 mL) for each analysis. The validity of the method with respect to analysis was confirmed by comparing the UV spectra of each peak with those of reference compounds using a photodiode array detector.

  18. An exploratory baseline study of boy chorister vocal behaviour and development in an intensive professional context.

    PubMed

    Williams, Jenevora; Welch, Graham; Howard, David M

    2005-01-01

    Currently, there is no existing published empirical longitudinal data on the singing behaviours and development of choristers who perform in UK cathedrals and major chapels. Longitudinal group data is needed to provide a baseline against which individual chorister development can be mapped. The choristers perform to a professional standard on a daily basis, usually with linked rehearsals, whilst also following a full school curriculum. The impact of this intensive schedule in relation to current vocal behaviour, health and future development requires investigation. Furthermore, it is also necessary to understand the relationship between the requirements of chorister singing behaviour and adolescent voice change. The paper will report the initial findings of a new longitudinal chorister study, based in one of London's cathedrals. Singing and vocal behaviours are being profiled on a six-monthly basis using data from a specially designed acoustic and behavioural instrument. The information obtained will enable us to understand better the effects of such training and performance on underlying vocal behaviour and vocal health. The findings will also have implications for singing teachers and choral directors in relation to particular methods of vocal education and rehearsal.

  19. Prognosis of carotid dissecting aneurysms

    PubMed Central

    Larsson, Susanna C.; King, Alice; Madigan, Jeremy; Levi, Christopher; Norris, John W.

    2017-01-01

    Objective: To determine the natural history of dissecting aneurysm (DA) and whether DA is associated with an increased recurrent stroke risk and whether type of antithrombotic drugs (antiplatelets vs anticoagulants) modifies the persistence or development of DA. Methods: We included 264 patients with extracranial cervical artery dissection (CAD) from the Cervical Artery Dissection in Stroke Study (CADISS), a multicenter prospective study that compared antiplatelet with anticoagulation therapy. Logistic regression was used to estimate age- and sex-adjusted odds ratios. We conducted a systematic review of published studies assessing the natural history of DA and stroke risk in patients with non-surgically-treated extracranial CAD with DA. Results: In CADISS, DA was present in 24 of 264 patients at baseline. In 36 of 248 patients with follow-up neuroimaging at 3 months, 12 of the 24 baseline DAs persisted, and 24 new DA had developed. There was no association between treatment allocation (antiplatelets vs anticoagulants) and whether DA at baseline persisted at follow-up or whether new DA developed. During 12 months of follow-up, stroke occurred in 1 of 48 patients with DA and in 7 of 216 patients without DA (age- and sex-adjusted odds ratio 0.84; 95% confidence interval 0.10–7.31; p = 0.88). Published studies, mainly retrospective, showed a similarly low risk of stroke and no evidence of an increased stroke rate in patients with DA. Conclusions: The results of CADISS provide evidence suggesting that DAs may have benign prognosis and therefore medical treatment should be considered. PMID:28087823

  20. Trajectories of Change in Obesity and Symptoms of Depression: The CARDIA Study

    PubMed Central

    Epel, Elissa S.; Adler, Nancy E.; Kiefe, Catarina

    2010-01-01

    Objectives. We investigated whether, over time, baseline obesity is associated with change in depressive symptoms or if baseline symptoms of depression are associated with change in body mass index (BMI) and waist circumference. Methods. We used latent growth curve modeling to examine data from years 5, 10, 15, and 20 of the Coronary Artery Risk Development in Young Adults study (n = 4643). We assessed depressive symptomatology with the Center for Epidemiological Studies Depression scale. Results. Respondents who started out with higher levels of depressive symptoms experienced a faster rate of increase in BMI (for Whites only) and waist circumference (for Blacks and Whites) over time than did those who reported fewer symptoms of depression in year 5. Initial BMI and waist circumference did not influence the rate of change in symptoms of depression over time. Conclusions. Depressive symptomatology likely plays a role in the development of physical health problems, such as cardiovascular disease, through its association with increases in relative weight and abdominal obesity over time. PMID:20395582

  1. A pilot programme of clinical practice improvement for future consultant doctors.

    PubMed

    Oates, Kim; Vinters, Cathy; Cass-Verco, John; Fletcher, Mandy; Kaur, Narinder; Mherekumombe, Martha; Tang, Alice

    2017-04-01

    To provide junior doctors with tools to improve patient care in their workplace, a partnership was developed between the Clinical Excellence Commission (CEC) and the Royal Australasian College of Physicians (RACP) to help trainee consultants carry out clinical practice improvement (CPI) projects during clinical work. Based on a patient-care problem they wished to resolve, trainee consultants attended a 2-day face-to-face workshop to learn quality-improvement methods, describe their proposals and refine them using CPI methodology. They were provided with continuing supervision, participated in a mid-point review and were responsible for driving their projects. RESULTS: Examples of five projects are: reducing mislabelled specimens leaving an emergency department, from 82 in the baseline period to 18 following the intervention; creating a multidisciplinary team to reduce hypoglycaemic episodes on a diabetic ward, from 23 episodes at baseline to three episodes over the same time period after the intervention; establishing an acute paediatric review clinic that reduced avoidable admissions of pneumonia by 74 per cent; providing 100 per cent of patients in a palliative care unit with an effective pain-management plan; developing an education package to increase staff confidence in recognising and responding to anaphylaxis in children, producing an increase in confidence from 51 per cent at baseline to 100 per cent after the intervention. Involving a learned college such as the RACP in patient-care improvement, with educational input from a partner organisation, shows how junior staff can become effective leaders in improving patient care. © 2016 John Wiley & Sons Ltd.

  2. The WISTAH hand study: a prospective cohort study of distal upper extremity musculoskeletal disorders.

    PubMed

    Garg, Arun; Hegmann, Kurt T; Wertsch, Jacqueline J; Kapellusch, Jay; Thiese, Matthew S; Bloswick, Donald; Merryweather, Andrew; Sesek, Richard; Deckow-Schaefer, Gwen; Foster, James; Wood, Eric; Kendall, Richard; Sheng, Xiaoming; Holubkov, Richard

    2012-06-06

    Few prospective cohort studies of distal upper extremity musculoskeletal disorders have been performed. Past studies have provided somewhat conflicting evidence for occupational risk factors and have largely reported data without adjustments for many personal and psychosocial factors. A multi-center prospective cohort study was initiated to quantify risk factors for distal upper extremity musculoskeletal disorders and potentially develop improved methods for analyzing jobs. Disorders to analyze included carpal tunnel syndrome, lateral epicondylalgia, medial epicondylalgia, trigger digit, deQuervain's stenosing tenosynovitis and other tendinoses. Workers have thus far been enrolled from 17 different employment settings in 3 diverse US states and performed widely varying work. At baseline, workers undergo laptop-administered questionnaires, structured interviews, two standardized physical examinations and nerve conduction studies to ascertain demographics, medical history, psychosocial factors and current musculoskeletal disorders. All workers' jobs are individually measured for physical factors and are videotaped. Workers are followed monthly for the development of musculoskeletal disorders. Repeat nerve conduction studies are performed for those with symptoms of tingling and numbness in the prior six months. Changes in jobs necessitate re-measurement and re-videotaping of job physical factors. Case definitions have been established. Point prevalence of carpal tunnel syndrome is a combination of paraesthesias in at least two median nerve-served digits plus an abnormal nerve conduction study at baseline. The lifetime cumulative incidence of carpal tunnel syndrome will also include those with a past history of carpal tunnel syndrome. Incident cases will exclude those with either a past history or prevalent cases at baseline. Statistical methods planned include survival analyses and logistic regression. A prospective cohort study of distal upper extremity musculoskeletal disorders is underway and has successfully enrolled over 1,000 workers to date.

  3. Assessing historical fish community composition using surveys, historical collection data, and species distribution models.

    PubMed

    Labay, Ben; Cohen, Adam E; Sissel, Blake; Hendrickson, Dean A; Martin, F Douglas; Sarkar, Sahotra

    2011-01-01

    Accurate establishment of baseline conditions is critical to successful management and habitat restoration. We demonstrate the ability to robustly estimate historical fish community composition and assess the current status of the urbanized Barton Creek watershed in central Texas, U.S.A. Fish species were surveyed in 2008 and the resulting data compared to three sources of fish occurrence information: (i) historical records from a museum specimen database and literature searches; (ii) a nearly identical survey conducted 15 years earlier; and (iii) a modeled historical community constructed with species distribution models (SDMs). This holistic approach, and especially the application of SDMs, allowed us to discover that the fish community in Barton Creek was more diverse than the historical data and survey methods alone indicated. Sixteen native species with high modeled probability of occurrence within the watershed were not found in the 2008 survey, seven of these were not found in either survey or in any of the historical collection records. Our approach allowed us to more rigorously establish the true baseline for the pre-development fish fauna and then to more accurately assess trends and develop hypotheses regarding factors driving current fish community composition to better inform management decisions and future restoration efforts. Smaller, urbanized freshwater systems, like Barton Creek, typically have a relatively poor historical biodiversity inventory coupled with long histories of alteration, and thus there is a propensity for land managers and researchers to apply inaccurate baseline standards. Our methods provide a way around that limitation by using SDMs derived from larger and richer biodiversity databases of a broader geographic scope. Broadly applied, we propose that this technique has potential to overcome limitations of popular bioassessment metrics (e.g., IBI) to become a versatile and robust management tool for determining status of freshwater biotic communities.

  4. Increased Visceral Adipose Tissue Is an Independent Predictor for Future Development of Atherogenic Dyslipidemia.

    PubMed

    Hwang, You-Cheol; Fujimoto, Wilfred Y; Hayashi, Tomoshige; Kahn, Steven E; Leonetti, Donna L; Boyko, Edward J

    2016-02-01

    Atherogenic dyslipidemia is frequently observed in persons with a greater amount of visceral adipose tissue (VAT). However, it is still uncertain whether VAT is independently associated with the future development of atherogenic dyslipidemia. The aim of this study was to determine whether baseline and changes in VAT and subcutaneous adipose tissue (SAT) are associated with future development of atherogenic dyslipidemia independent of baseline lipid levels and standard anthropometric indices. Community-based prospective cohort study with 5 years of follow-up. A total of 452 Japanese Americans (240 men, 212 women), aged 34-75 years were assessed at baseline and after 5 years of follow-up. Abdominal fat areas were measured by computed tomography. Atherogenic dyslipidemia was defined as one or more abnormalities in high-density lipoprotein (HDL) cholesterol, triglycerides, or non-HDL cholesterol levels. Baseline VAT and change in VAT over 5 years were independently associated with log-transformed HDL cholesterol, log-transformed triglyceride, and non-HDL cholesterol after 5 years (standardized β = -0.126, 0.277, and 0.066 for baseline VAT, respectively, and -0.095, 0.223, and 0.090 for change in VAT, respectively). However, baseline and change in SAT were not associated with any future atherogenic lipid level. In multivariate logistic regression analysis, incremental change in VAT (odds ratio [95% confidence interval], 1.73 [1.20-2.48]; P = .003), triglycerides (4.01 [1.72-9.33]; P = .001), HDL cholesterol (0.32 [0.18-0.58]; P < .001), and non-HDL cholesterol (7.58 [4.43-12.95]; P < .001) were significantly associated with the future development of atherogenic dyslipidemia independent of age, sex, diastolic blood pressure, homeostasis model assessment insulin resistance, body mass index (BMI), change in BMI, SAT, and baseline atherogenic lipid levels. Baseline and change in VAT were independent predictors for future development of atherogenic dyslipidemia. However, BMI, waist circumference, and SAT were not associated with future development of atherogenic dyslipidemia.

  5. Space Station needs, attributes and architectural options. Volume 2, book 1, part 1: Mission requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The baseline mission model used to develop the space station mission-related requirements is described as well as the 90 civil missions that were evaluated, (including the 62 missions that formed the baseline model). Mission-related requirements for the space station baseline are defined and related to space station architectural development. Mission-related sensitivity analyses are discussed.

  6. Increase in relative skeletal muscle mass over time and its inverse association with metabolic syndrome development: a 7-year retrospective cohort study.

    PubMed

    Kim, Gyuri; Lee, Seung-Eun; Jun, Ji Eun; Lee, You-Bin; Ahn, Jiyeon; Bae, Ji Cheol; Jin, Sang-Man; Hur, Kyu Yeon; Jee, Jae Hwan; Lee, Moon-Kyu; Kim, Jae Hyeon

    2018-02-05

    Skeletal muscle mass was negatively associated with metabolic syndrome prevalence in previous cross-sectional studies. The aim of this study was to investigate the impact of baseline skeletal muscle mass and changes in skeletal muscle mass over time on the development of metabolic syndrome in a large population-based 7-year cohort study. A total of 14,830 and 11,639 individuals who underwent health examinations at the Health Promotion Center at Samsung Medical Center, Seoul, Korea were included in the analyses of baseline skeletal muscle mass and those changes from baseline over 1 year, respectively. Skeletal muscle mass was estimated by bioelectrical impedance analysis and was presented as a skeletal muscle mass index (SMI), a body weight-adjusted appendicular skeletal muscle mass value. Using Cox regression models, hazard ratio for developing metabolic syndrome associated with SMI values at baseline or changes of SMI over a year was analyzed. During 7 years of follow-up, 20.1% of subjects developed metabolic syndrome. Compared to the lowest sex-specific SMI tertile at baseline, the highest sex-specific SMI tertile showed a significant inverse association with metabolic syndrome risk (adjusted hazard ratio [AHR] = 0.61, 95% confidence interval [CI] 0.54-0.68). Furthermore, compared with SMI changes < 0% over a year, multivariate-AHRs for metabolic syndrome development were 0.87 (95% CI 0.78-0.97) for 0-1% changes and 0.67 (0.56-0.79) for > 1% changes in SMI over 1 year after additionally adjusting for baseline SMI and glycometabolic parameters. An increase in relative skeletal muscle mass over time has a potential preventive effect on developing metabolic syndrome, independently of baseline skeletal muscle mass and glycometabolic parameters.

  7. Conduct Disorder and Initiation of Substance Use: A Prospective Longitudinal Study

    ERIC Educational Resources Information Center

    Hopfer, Christian; Salomonsen-Sautel, Stacy; Mikulich-Gilbertson, Susan; Min, Sung-Joon; McQueen, Matt; Crowley, Thomas; Young, Susan; Corley, Robin; Sakai, Joseph; Thurstone, Christian; Hoffenberg, Analice; Hartman, Christie; Hewitt, John

    2013-01-01

    Objective: To examine the influence of conduct disorder (CD) on substance use initiation. Method: Community adolescents without CD (n = 1,165, mean baseline age = 14.6 years), with CD (n = 194, mean baseline age = 15.3 years), and youth with CD recruited from treatment (n = 268, mean baseline age = 15.7 years) were prospectively followed and…

  8. Health-promoting factors in medical students and students of science, technology, engineering, and mathematics: design and baseline results of a comparative longitudinal study

    PubMed Central

    2014-01-01

    Background The negative impact of medical school on students' general and mental health has often been reported. Compared to students of other subjects, or employed peers, medical students face an increased risk of developing depression, anxiety and burnout. While pathogenetic factors have been studied extensively, less is known about health-promoting factors for medical students' health. This longitudinal study aims to identify predictors for maintaining good general and mental health during medical education. We report here the design of the study and its baseline results. Methods We initiated a prospective longitudinal cohort study at the University of Lübeck, Germany. Two consecutive classes of students, entering the university in 2011 and 2012, were recruited. Participants will be assessed annually for the duration of their course. We use validated psychometric instruments covering health outcomes (general and mental health) and personality traits, as well as self-developed, pre-tested items covering leisure activities and sociodemographic data. Results At baseline, compared to students of STEM (science, technology, engineering, and mathematics) subjects (n = 531; 60.8% response rate), a larger proportion of medical students (n = 350; 93.0% response rate) showed good general health (90.9% vs. 79.7%) and a similar proportion was in good mental health (88.3% vs. 86.3%). Medical students scored significantly higher in the personality traits of extraversion, conscientiousness, openness to experience and agreeableness. Neuroticism proved to be a statistically significant negative predictor for mental health in the logistic regression analyses. Satisfaction with life as a dimension of study-related behaviour and experience predicted general health at baseline. Physical activity was a statistically significant predictor for general health in medical students. Conclusions Baseline data revealed that medical students reported better general and similar mental health compared to STEM students. The annual follow-up questionnaires, combined with qualitative approaches, should clarify whether these differences reflect a higher resilience, a tendency to neglect personal health problems - as has been described for physicians - before entering medical school, or both. The final results may aid decision-makers in developing health-promotion programmes for medical students. PMID:24996637

  9. Simulation of turbulent separated flows using a novel, evolution-based, eddy-viscosity formulation

    NASA Astrophysics Data System (ADS)

    Castellucci, Paul

    Currently, there exists a lack of confidence in the computational simulation of turbulent separated flows at large Reynolds numbers. The most accurate methods available are too computationally costly to use in engineering applications. Thus, inexpensive models, developed using the Reynolds-averaged Navier-Stokes (RANS) equations, are often extended beyond their applicability. Although these methods will often reproduce integrated quantities within engineering tolerances, such metrics are often insensitive to details within a separated wake, and therefore, poor indicators of simulation fidelity. Using concepts borrowed from large-eddy simulation (LES), a two-equation RANS model is modified to simulate the turbulent wake behind a circular cylinder. This modification involves the computation of one additional scalar field, adding very little to the overall computational cost. When properly inserted into the baseline RANS model, this modification mimics LES in the separated wake, yet reverts to the unmodified form at the cylinder surface. In this manner, superior predictive capability may be achieved without the additional cost of fine spatial resolution associated with LES near solid boundaries. Simulations using modified and baseline RANS models are benchmarked against both LES and experimental data for a circular cylinder wake at Reynolds number 3900. In addition, the computational tool used in this investigation is subject to verification via the Method of Manufactured Solutions. Post-processing of the resultant flow fields includes both mean value and triple-decomposition analysis. These results reveal substantial improvements using the modified system and appear to drive the baseline wake solution toward that of LES, as intended.

  10. Piloted Flight Simulator Developed for Icing Effects Training

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.

    2005-01-01

    In an effort to expand pilot training methods to avoid icing-related accidents, the NASA Glenn Research Center and Bihrle Applied Research Inc. have developed the Ice Contamination Effects Flight Training Device (ICEFTD). ICEFTD simulates the flight characteristics of the NASA Twin Otter Icing Research Aircraft in a no-ice baseline and in two ice configurations simulating ice-protection-system failures. Key features of the training device are the force feedback in the yoke, the instrument panel and out-the-window graphics, the instructor's workstation, and the portability of the unit.

  11. Developing Performance Cost Index Targets for ASHRAE Standard 90.1 Appendix G – Performance Rating Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Hart, Philip R.

    2016-02-16

    Appendix G, the Performance Rating Method in ASHRAE Standard 90.1, has been updated to make two significant changes for the 2016 edition, to be published in October of 2016. First, it allows Appendix G to be used as a third path for compliance with the standard in addition to rating beyond-code building performance. This prevents modelers from having to develop separate building models for code compliance and beyond-code programs. Using this new version of Appendix G to show compliance with the 2016 edition of the standard, the proposed building design needs to have a performance cost index (PCI) less than targets shown in a new table based on building type and climate zone. The second change is that the baseline design is now fixed at a stable level of performance set approximately equal to the 2004 code. Rather than changing the stringency of the baseline with each subsequent edition of the standard, compliance with new editions will simply require a reduced PCI (a PCI of zero is a net-zero building). Using this approach, buildings of any era can be rated using the same method. The intent is that any building energy code or beyond-code program can use this methodology and merely set the appropriate PCI target for its needs. This report discusses the process used to set performance criteria for compliance with ASHRAE Standard 90.1-2016 and suggests a method for demonstrating compliance with other codes and beyond-code programs.
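
    Compliance under the updated Appendix G reduces to comparing one ratio against a tabulated target. A minimal sketch; the PCI definition as proposed-over-baseline annual performance cost is a simplification, and the target value is a placeholder, since real targets come from the standard's table by building type and climate zone:

    ```python
    def performance_cost_index(proposed_cost, baseline_cost):
        """PCI as the ratio of proposed to (fixed, 2004-era) baseline building
        performance cost: 0 would be a net-zero building, 1 matches the
        baseline. Simplified for illustration."""
        return proposed_cost / baseline_cost

    # Hypothetical annual energy costs for a proposed office design.
    pci = performance_cost_index(proposed_cost=81_000, baseline_cost=135_000)
    pci_target = 0.64  # placeholder; tabulated by building type and climate zone

    verdict = "complies" if pci <= pci_target else "does not comply"
    print(f"PCI = {pci:.2f} vs target {pci_target:.2f} -> {verdict}")
    ```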

  12. Developing Performance Cost Index Targets for ASHRAE Standard 90.1 Appendix G – Performance Rating Method - Rev.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Hart, Philip R.

    2016-03-01

    Appendix G, the Performance Rating Method in ASHRAE Standard 90.1, has been updated to make two significant changes for the 2016 edition, to be published in October of 2016. First, it allows Appendix G to be used as a third path for compliance with the standard in addition to rating beyond-code building performance. This prevents modelers from having to develop separate building models for code compliance and beyond-code programs. Using this new version of Appendix G to show compliance with the 2016 edition of the standard, the proposed building design needs to have a performance cost index (PCI) less than targets shown in a new table based on building type and climate zone. The second change is that the baseline design is now fixed at a stable level of performance set approximately equal to the 2004 code. Rather than changing the stringency of the baseline with each subsequent edition of the standard, compliance with new editions will simply require a reduced PCI (a PCI of zero is a net-zero building). Using this approach, buildings of any era can be rated using the same method. The intent is that any building energy code or beyond-code program can use this methodology and merely set the appropriate PCI target for its needs. This report discusses the process used to set performance criteria for compliance with ASHRAE Standard 90.1-2016 and suggests a method for demonstrating compliance with other codes and beyond-code programs.

  13. Evaluating the predictive power of multivariate tensor-based morphometry in Alzheimer's disease progression via convex fused sparse group Lasso

    NASA Astrophysics Data System (ADS)

    Tsao, Sinchai; Gajawelli, Niharika; Zhou, Jiayu; Shi, Jie; Ye, Jieping; Wang, Yalin; Lepore, Natasha

    2014-03-01

    Prediction of Alzheimer's disease (AD) progression based on baseline measures allows us to understand disease progression and has implications in decisions concerning treatment strategy. To this end we combine a predictive multi-task machine learning method [1] with a novel MR-based multivariate morphometric surface map of the hippocampus [2] to predict future cognitive scores of patients. Previous work by Zhou et al. [1] has shown that a multi-task learning framework that performs prediction of all future time points (or tasks) simultaneously can be used to encode both sparsity and temporal smoothness. They showed that this can be used in predicting cognitive outcomes of Alzheimer's Disease Neuroimaging Initiative (ADNI) subjects based on FreeSurfer-based baseline MRI features, MMSE score, demographic information and ApoE status. Whilst volumetric information may hold generalized information on brain status, we hypothesized that hippocampus-specific information may be more useful in predictive modeling of AD. To this end, we applied the multivariate tensor-based morphometry (mTBM) parametric surface analysis method recently developed by Shi et al. [2] to extract features from the hippocampal surface. We show that by combining the power of the multi-task framework with the sensitivity of mTBM features of the hippocampus surface, we are able to significantly improve the prediction of ADAS cognitive scores 6, 12, 24, 36 and 48 months from baseline.
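
    The full convex fused sparse group Lasso requires a proximal solver, but a stripped-down convex relative, multi-task ridge regression with a temporal-smoothness penalty tying adjacent time points together, has a closed form via a Sylvester equation and conveys the multi-task idea (all dimensions, penalties, and data below are illustrative, not the ADNI setup):

    ```python
    import numpy as np
    from scipy.linalg import solve_sylvester

    rng = np.random.default_rng(0)
    n, p, T = 200, 50, 5             # subjects, baseline features, future visits
    X = rng.normal(size=(n, p))      # e.g. hippocampal surface features at baseline
    W_true = np.outer(rng.normal(size=p), np.linspace(1.0, 0.6, T))
    Y = X @ W_true + rng.normal(0, 0.5, (n, T))   # cognitive scores per visit

    lam_ridge, lam_smooth = 1.0, 10.0
    D = np.diff(np.eye(T), axis=0)   # (T-1) x T first-difference operator

    # Minimizing ||XW - Y||^2 + lam_ridge ||W||^2 + lam_smooth ||W D'||^2
    # gives normal equations (X'X + lam_ridge I) W + W (lam_smooth D'D) = X'Y,
    # a Sylvester equation solved directly:
    A = X.T @ X + lam_ridge * np.eye(p)
    B = lam_smooth * (D.T @ D)
    W_hat = solve_sylvester(A, B, X.T @ Y)

    r2 = 1 - ((Y - X @ W_hat) ** 2).sum(0) / ((Y - Y.mean(0)) ** 2).sum(0)
    print("per-visit R^2:", np.round(r2, 3))
    ```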

  14. On the possibility of producing definitive magnetic observatory data within less than one year

    NASA Astrophysics Data System (ADS)

    Mandić, Igor; Korte, Monika

    2017-04-01

    Geomagnetic observatory data are fundamental in geomagnetic field studies and are widely used in other applications. Often they are combined with satellite and ground survey data. Unfortunately, observatory definitive data are only available with a time lag ranging from several months up to more than a year. The reason for this lag is the annual production of the final calibration values, i.e. the baselines that are used to correct preliminary data from continuously recording magnetometers. In this paper, we show that the preparation of definitive geomagnetic data is possible within a calendar year, and we present an original method for prompt and automatic estimation of observatory baselines. The new baselines, obtained in a mostly automatic manner, are compared with the baselines reported on INTERMAGNET DVDs for the 2009-2011 period. The high quality of the baselines obtained by the proposed method indicates its suitability for data processing in fully automatic observatories once automated absolute instruments are deployed at remote sites.

  15. Risk Factors Analysis and Death Prediction in Some Life-Threatening Ailments Using Chi-Square Case-Based Reasoning (χ2 CBR) Model.

    PubMed

    Adeniyi, D A; Wei, Z; Yang, Y

    2018-01-30

    A wealth of data is available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model that suitably combines the Chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using a Chi-square case-based reasoning (χ2 CBR) model. The proposed predictive engine is capable of reducing runtime and speeding up execution through the use of a critical χ2 distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent item-based rule (FIBR) method. This FIBR method is used for selecting the best features for the proposed χ2 CBR model at the preprocessing stage of the predictive procedure. The implementation of the proposed risk calculator is achieved through the use of an in-house developed PHP program running on an XAMPP/Apache HTTP server. The process of data acquisition and case-base development is implemented using the MySQL application. Performance comparison between our system and the NBY, ED-KNN, ANN, SVM, Random Forest and traditional CBR techniques shows that the quality of predictions produced by our system outperforms the baseline methods studied. The result of our experiment shows that the precision rate and predictive quality of our system are in most cases equal to or greater than 70%. Our results also show that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, fast, accurate and efficient risk level prediction to both patients and physicians at any time, online and on a real-time basis.
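
    The retrieval step such a χ2 CBR system implies is easy to sketch: chi-square distances from a query to the stored cases, pruned with a critical chi-square value as the abstract describes (the feature count, degrees of freedom, significance level, and fallback rule are all assumptions):

    ```python
    import numpy as np
    from scipy.stats import chi2

    def chi_square_distance(x, y, eps=1e-12):
        """Chi-square distance between two nonnegative feature vectors."""
        return 0.5 * np.sum((x - y) ** 2 / (x + y + eps))

    def retrieve(case_base, outcomes, query, df, alpha=0.05):
        """Return outcomes of cases whose distance to the query falls below
        the critical chi-square value, pruning the search early."""
        critical = chi2.ppf(1 - alpha, df)      # cutoff from the chi2 table
        d = np.array([chi_square_distance(c, query) for c in case_base])
        keep = d < critical
        if not keep.any():                      # fall back to the nearest case
            keep = d == d.min()
        return outcomes[keep], d[keep]

    # Hypothetical case base: normalized risk-factor profiles, known outcomes.
    rng = np.random.default_rng(5)
    cases = rng.random((100, 8))
    outcomes = rng.integers(0, 2, 100)          # 1 = high-risk outcome
    matched, dists = retrieve(cases, outcomes, query=rng.random(8), df=7)
    print(f"{matched.mean():.0%} of {matched.size} matched cases were high-risk")
    ```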

  16. [Research on Identification and Determination of Pesticides in Apples Using Raman Spectroscopy].

    PubMed

    Zhai, Chen; Peng, Yan-kun; Li, Yong-yu; Dhakal, Sagar; Xu, Tian-feng; Guo, Lang-hua

    2015-08-01

    Raman spectroscopy combined with chemometric methods is considered an efficient approach for the identification and determination of pesticide residues in fruits and vegetables. In the present research, a rapid and nondestructive method based on a self-developed Raman system was proposed and tested for the identification and determination of deltamethrin and acetamiprid residues in apples. The Raman peaks at 574 and 843 cm(-1) can be used to identify deltamethrin and acetamiprid, respectively; these characteristic peaks remained visible at concentrations of 0.78 and 0.15 mg · kg(-1) in apple samples, respectively. Calibration models of pesticide content were developed with the partial least squares (PLS) algorithm under different spectral pretreatment methods (Savitzky-Golay smoothing, first derivative transformation, second derivative transformation, baseline correction, and standard normal variate transformation). Baseline correction by 8th-order polynomial fitting gave the best results. For deltamethrin, the correlation coefficient of prediction (Rp) between the PLS predictions and gas chromatography measurements was 0.94, and the root mean square error of prediction (RMSEP) was 0.55 mg · kg(-1); for acetamiprid, Rp and RMSEP were 0.85 and 0.12 mg · kg(-1), respectively. These results show that nondestructive Raman-based determination of pesticide residues in apples is feasible. Because the technique requires no pretreatment before spectral collection and causes no damage to the sample, it can be used by inspection agencies, fruit and vegetable processing enterprises, supermarkets, and vegetable markets. The results of this research are promising for the development of industrially feasible technology for rapid, nondestructive, real-time detection of different types of pesticides and their concentrations in apples, providing a rapid, nondestructive and environmentally friendly way to determine fruit and vegetable quality and safety.
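
    The winning pretreatment, polynomial baseline correction, is commonly implemented by iteratively fitting a polynomial and clipping the spectrum to it so that peaks stop pulling the fit upward. The paper specifies only the polynomial order, so the sketch below is one standard variant applied to a synthetic spectrum carrying the two marker bands:

    ```python
    import numpy as np

    def polynomial_baseline(shifts, intensity, order=8, n_iter=50):
        """Iterative polynomial baseline: fit, clip above the fit, refit."""
        xs = (shifts - shifts.mean()) / shifts.std()  # scale for conditioning
        y = intensity.astype(float).copy()
        for _ in range(n_iter):
            base = np.polyval(np.polyfit(xs, y, order), xs)
            y = np.minimum(y, base)        # suppress peaks in the next fit
        return base

    # Synthetic Raman-like spectrum: two pesticide marker bands on a curved
    # fluorescence background (positions follow the abstract; widths invented).
    x = np.linspace(200, 1800, 1600)                   # Raman shift [cm^-1]
    background = 1e-6 * (x - 1000) ** 2 + 0.2
    bands = 1.5 * np.exp(-0.5 * ((x - 574) / 6) ** 2) \
          + 1.0 * np.exp(-0.5 * ((x - 843) / 6) ** 2)
    spectrum = background + bands + np.random.default_rng(2).normal(0, 0.01, x.size)

    corrected = spectrum - polynomial_baseline(x, spectrum)
    print(f"residual background near 1200 cm^-1: {corrected[np.abs(x - 1200) < 5].mean():.3f}")
    ```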

  17. Ensemble framework based real-time respiratory motion prediction for adaptive radiotherapy applications.

    PubMed

    Tatinati, Sivanagaraja; Nazarpour, Kianoush; Tech Ang, Wei; Veluvolu, Kalyana C

    2016-08-01

    Successful treatment of tumors with motion-adaptive radiotherapy requires accurate prediction of respiratory motion, ideally with a prediction horizon larger than the latency of the radiotherapy system. Accurate prediction of respiratory motion is, however, a non-trivial task due to the presence of irregularities and intra-trace variabilities, such as baseline drift and temporal changes in the fundamental frequency pattern. In this paper, to enhance the accuracy of respiratory motion prediction, we propose a stacked regression ensemble framework that integrates heterogeneous respiratory motion prediction algorithms. We further address two crucial issues for developing a successful ensemble framework: (1) selection of appropriate prediction methods to ensemble (level-0 methods) among the best existing prediction methods; and (2) finding a suitable generalization approach that can successfully exploit the relative advantages of the chosen level-0 methods. The efficacy of the developed ensemble framework is assessed with real respiratory motion traces acquired from 31 patients undergoing treatment. Results show that the developed ensemble framework improves the prediction performance significantly compared to the best existing methods. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
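
    scikit-learn's StackingRegressor is a compact way to prototype such a stacked ensemble. The level-0 models, lag/horizon settings, and synthetic trace below are stand-ins for the heterogeneous predictors and patient traces the paper uses; note also that the stacker's internal cross-validation shuffles samples, which a careful time-series setup would replace with ordered splits:

    ```python
    import numpy as np
    from sklearn.ensemble import StackingRegressor
    from sklearn.linear_model import Ridge
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.svm import SVR

    rng = np.random.default_rng(4)
    fs, horizon, k = 25, 10, 50        # 25 Hz trace, 400 ms horizon, 2 s history
    t = np.arange(0, 240, 1 / fs)
    # Synthetic respiratory trace with baseline drift and frequency wander.
    trace = np.sin(2 * np.pi * (0.25 + 0.02 * np.sin(0.01 * t)) * t) \
          + 0.1 * t / t.max() + rng.normal(0, 0.05, t.size)

    # Embed: predict the sample `horizon` steps ahead from the last k samples.
    X = np.lib.stride_tricks.sliding_window_view(trace[:-horizon], k)
    y = trace[k - 1 + horizon:]
    split = int(0.8 * len(y))

    stack = StackingRegressor(
        estimators=[("lin", Ridge(alpha=1.0)),
                    ("knn", KNeighborsRegressor(n_neighbors=15)),
                    ("svr", SVR(C=1.0))],
        final_estimator=Ridge(alpha=0.1),   # the level-1 generalizer
    )
    stack.fit(X[:split], y[:split])
    rmse = np.sqrt(np.mean((stack.predict(X[split:]) - y[split:]) ** 2))
    print(f"stacked {1000 * horizon / fs:.0f} ms-ahead RMSE: {rmse:.3f}")
    ```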

  18. Flexible functional regression methods for estimating individualized treatment regimes.

    PubMed

    Ciarleglio, Adam; Petkova, Eva; Tarpey, Thaddeus; Ogden, R Todd

    2016-01-01

    A major focus of personalized medicine is on the development of individualized treatment rules. Good decision rules have the potential to significantly advance patient care and reduce the burden of a host of diseases. Statistical methods for developing such rules are progressing rapidly, but few methods have considered the use of pre-treatment functional data to guide decision-making. Furthermore, those methods that do allow for the incorporation of functional pre-treatment covariates typically make strong assumptions about the relationships between the functional covariates and the response of interest. We propose two approaches for using functional data to select an optimal treatment that address some of the shortcomings of previously developed methods. Specifically, we combine the flexibility of functional additive regression models with Q-learning or A-learning in order to obtain treatment decision rules. Properties of the corresponding estimators are discussed. Our approaches are evaluated in several realistic settings using synthetic data and are applied to real data arising from a clinical trial comparing two treatments for major depressive disorder in which baseline imaging data are available for subjects who are subsequently treated.
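
    In the single-stage case, Q-learning for treatment rules reduces to an outcome regression with treatment interactions followed by an argmax over treatments. A toy sketch with one scalar covariate standing in for the functional imaging predictors (all coefficients are invented):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(6)
    n = 500
    x = rng.normal(size=(n, 1))          # scalar stand-in for baseline imaging
    a = rng.integers(0, 2, n)            # randomized treatment, 0 or 1
    # Treatment 1 helps only for larger x (a qualitative interaction).
    y = 0.5 * x[:, 0] + a * (1.0 * x[:, 0] - 0.2) + rng.normal(0, 1, n)

    # Q-learning, single stage: fit Q(x, a) = E[Y | x, a] with an interaction,
    # then recommend the treatment with the larger predicted outcome.
    design = np.column_stack([x[:, 0], a, x[:, 0] * a])
    q = LinearRegression().fit(design, y)

    def recommend(x_new):
        q0 = q.predict([[x_new, 0, 0.0]])[0]
        q1 = q.predict([[x_new, 1, x_new]])[0]
        return int(q1 > q0)

    print([recommend(v) for v in (-1.0, 0.0, 0.5, 1.0)])  # rule flips near x = 0.2
    ```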

  19. Evaluation of Sleep by Detrended Fluctuation Analysis of the Heartbeat

    NASA Astrophysics Data System (ADS)

    Yazawa, Toru; Shimoda, Yukio; Hutapea, Albert M.

    2011-08-01

    Established methods already exist for investigating biological signals such as rhythmic heartbeats. We used detrended fluctuation analysis (DFA), originally developed by Peng et al. (1995) to check power-law characteristics, because the method can quantify heart condition numerically. In this article, we studied the heartbeat of sleeping subjects. Our purpose was to test whether DFA is useful for evaluating subjects' wellness both while awake and while asleep. The challenge is to assess sleep without a complex and expensive machine such as an electroencephalograph (EEG). We recorded heartbeats during sleep using a three-lead electrocardiograph, with one ground electrode and two active electrodes attached to the chest. For a good recording, a stable baseline must be maintained even when subjects move their body, so we needed a tool to ensure long-term steady recording. We therefore designed a new electric circuit that allowed us to perform heartbeat recording without any baseline drift, and were then able to detect 100% of heartbeat peaks over the entire period of sleep. Here, we present a case study as empirical evidence that DFA, through its scaling exponents, is a useful numerical method for quantifying sleep.
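
    DFA itself is short enough to sketch in full. The implementation below follows the standard Peng et al. recipe (integrate the series, detrend each window with a least-squares line, take the RMS fluctuation across scales), applied to a synthetic R-R interval series in place of the authors' recordings:

    ```python
    import numpy as np

    def dfa(x, scales):
        """Detrended fluctuation analysis: return F(n) for each window size n.
        The scaling exponent is the slope of log F(n) against log n."""
        y = np.cumsum(x - np.mean(x))          # integrated (profile) series
        F = []
        for n in scales:
            n_win = len(y) // n
            segs = y[: n_win * n].reshape(n_win, n)
            t = np.arange(n)
            res = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
            F.append(np.sqrt(np.mean(np.concatenate(res) ** 2)))
        return np.array(F)

    # Synthetic stand-in for R-R intervals from a sleeping subject.
    rng = np.random.default_rng(8)
    rr = 0.9 + 0.05 * rng.standard_normal(4096)       # ~0.9 s beat intervals
    scales = np.unique(np.logspace(2, 8, 12, base=2).astype(int))
    alpha = np.polyfit(np.log(scales), np.log(dfa(rr, scales)), 1)[0]
    print(f"scaling exponent alpha = {alpha:.2f}")    # ~0.5 for uncorrelated beats
    ```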

  20. Intravenous Solutions for Exploration Missions

    NASA Technical Reports Server (NTRS)

    Miller, Fletcher J.; Niederhaus, Charles; Barlow, Karen; Griffin, DeVon

    2007-01-01

    This paper describes the intravenous (IV) fluid requirements being developed for medical care during NASA's future exploration-class missions. Previous research on IV solution generation and mixing in space is summarized. The current exploration baseline mission profiles are introduced, potential medical conditions described and evaluated for fluidic needs, and operational issues assessed. We briefly introduce potential methods for generating IV fluids in microgravity. Conclusions on the recommended fluid volume requirements are presented.

  1. Creating a fuels baseline and establishing fire frequency relationships to develop a landscape management strategy at the Savannah River Site

    Treesearch

    Bernard R. Parresol; Dan Shea; Roger Ottmar

    2006-01-01

    The Savannah River Site is a Department of Energy Nuclear Defense Facility and a National Environmental Research Park located in the upper coastal plain of South Carolina. Prescribed burning is conducted on 15,000 to 20,000 ac annually. We modified standard forest inventory methods to incorporate a complete assessment of fuel components on 622 plots, assessing coarse...

  2. Modelling fluid accumulation in the neck using simple baseline fluid metrics: implications for sleep apnea.

    PubMed

    Vena, Daniel; Yadollahi, A; Bradley, T Douglas

    2014-01-01

    Obstructive sleep apnea (OSA) is a common respiratory disorder among adults. Recently we have shown that a sedentary lifestyle causes an increase in diurnal leg fluid volume (LFV), which can shift into the neck at night when lying down to sleep and increase OSA severity. The purpose of this work was to investigate various metrics that represent baseline fluid retention in the legs, to examine their correlation with neck fluid volume (NFV), and to develop a robust model for predicting fluid accumulation in the neck. In 13 healthy awake non-obese men, LFV and NFV were recorded continuously and simultaneously while standing for 5 minutes and then lying supine for 90 minutes. Simple regression was used to examine correlations between baseline LFV, baseline neck circumference (NC), and change in LFV and the outcome variables: change in NC (ΔNC) and in NFV (ΔNFV90) after lying supine for 90 minutes. An exhaustive grid search was implemented to find the combinations of input variables that best modeled the outcomes. We found strong positive correlations between baseline LFV (supine and standing) and ΔNFV90. Models developed for predicting ΔNFV90 included baseline standing LFV, and baseline NC combined with the change in LFV after lying supine for 90 minutes. These correlations and the developed models suggest that a greater baseline LFV might contribute to increased fluid accumulation in the neck. These results give more evidence that a sedentary lifestyle might play a role in the pathogenesis of OSA by increasing the baseline LFV. The best models for predicting ΔNC included baseline LFV and NC; they improved the accuracy of estimating ΔNC over individual predictors, suggesting that a combination of baseline fluid metrics is a good predictor of the change in NC while lying supine. Future work is aimed at adding baseline demographic features to improve model accuracy and eventually using the model as a screening tool to predict the severity of OSA prior to sleep.
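
    The exhaustive grid search described above can be sketched as follows: every subset of candidate baseline predictors is fit by ordinary least squares and scored by cross-validated R². The variable names and synthetic data are illustrative, not the study's.

    ```python
    # Exhaustive search over predictor subsets, scored by cross-validated R^2.
    import itertools
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n = 13                                   # the study had 13 subjects
    candidates = {
        "LFV_standing": rng.normal(3000, 300, n),   # mL, illustrative
        "LFV_supine": rng.normal(2900, 300, n),
        "NC_baseline": rng.normal(38, 2, n),        # cm
        "dLFV_90": rng.normal(-150, 40, n),
    }
    # synthetic outcome standing in for ΔNFV90
    y = (0.05 * candidates["LFV_standing"]
         - 2.0 * candidates["dLFV_90"] + rng.normal(0, 30, n))

    best = (None, -np.inf)
    names = list(candidates)
    for k in range(1, len(names) + 1):
        for subset in itertools.combinations(names, k):
            X = np.column_stack([candidates[v] for v in subset])
            score = cross_val_score(LinearRegression(), X, y,
                                    cv=3, scoring="r2").mean()
            if score > best[1]:
                best = (subset, score)
    print("best predictor set:", best[0], f"(CV R^2 = {best[1]:.2f})")
    ```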

  3. Identifying critically ill patients who benefit the most from nutrition therapy: the development and initial validation of a novel risk assessment tool

    PubMed Central

    2011-01-01

    Introduction To develop a scoring method for quantifying nutrition risk in the intensive care unit (ICU). Methods A prospective, observational study of patients expected to stay > 24 hours. We collected data for the key variables considered for inclusion in the score, which included: age, baseline APACHE II score, baseline SOFA score, number of comorbidities, days from hospital admission to ICU admission, body mass index (BMI) < 20, estimated percentage oral intake in the week prior, weight loss in the last 3 months, and serum interleukin-6 (IL-6), procalcitonin (PCT), and C-reactive protein (CRP) levels. Approximate quintiles of each variable were assigned points based on the strength of their association with 28-day mortality. Results A total of 597 patients were enrolled in this study. Based on statistical significance in the multivariable model, the final score used all candidate variables except BMI, CRP, PCT, estimated percentage oral intake, and weight loss. As the score increased, so did the mortality rate and duration of mechanical ventilation. Logistic regression demonstrated that nutritional adequacy modifies the association between the score and 28-day mortality (p = 0.01). Conclusions This scoring algorithm may be helpful in identifying critically ill patients most likely to benefit from aggressive nutrition therapy. PMID:22085763
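
    The quintile-to-points construction can be sketched as below; the variables, cut-offs, and point values are illustrative stand-ins, not the published score.

    ```python
    # Quintile-based point assignment for a composite risk score.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    df = pd.DataFrame({
        "age": rng.integers(20, 90, 500),       # illustrative variables
        "apache2": rng.integers(5, 40, 500),
        "il6": rng.lognormal(3, 1, 500),
    })

    def quintile_points(series, points=(0, 1, 2, 3, 4)):
        """Cut a variable into approximate quintiles and map them to points."""
        return pd.qcut(series, q=5, labels=points).astype(int)

    df["risk_score"] = sum(quintile_points(df[c])
                           for c in ["age", "apache2", "il6"])
    print(df["risk_score"].describe())
    ```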

  4. Regression to the Mean and Changes in Risk Behavior following Study Enrollment in a Cohort of US Women at Risk for HIV

    PubMed Central

    Hughes, James P.; Haley, Danielle F.; Frew, Paula M.; Golin, Carol E.; Adimora, Adaora A.; Kuo, Irene; Justman, Jessica; Soto-Torres, Lydia; Wang, Jing; Hodder, Sally

    2015-01-01

    Purpose Reductions in risk behaviors are common following enrollment in HIV prevention studies. We develop methods to quantify the proportion of change in risk behaviors that can be attributed to regression to the mean versus study participation and other factors. Methods A novel model that incorporates both regression to the mean and study participation effects is developed for binary measures. The model is used to estimate the proportion of change in the prevalence of “unprotected sex in the past 6 months” that can be attributed to study participation versus regression to the mean in a longitudinal cohort of women at risk for HIV infection who were recruited from ten US communities with high rates of HIV and poverty. HIV risk behaviors were evaluated using audio computer-assisted self-interviews at baseline and every 6 months for up to 12 months. Results The prevalence of “unprotected sex in the past 6 months” declined from 96% at baseline to 77% at 12 months. However, this change could be almost completely explained by regression to the mean. Conclusions Analyses that examine changes over time in cohorts selected for high or low risk behaviors should account for regression to the mean effects. PMID:25883065
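
    A small simulation makes the regression-to-the-mean mechanism concrete for a binary measure: subjects enrolled because the behavior was present at baseline show a lower prevalence at follow-up even though their underlying probabilities never change. (Enrollment purely on the baseline behavior is a simplifying assumption, not the study's actual criterion.)

    ```python
    # Regression to the mean for a binary risk behavior, with no true change.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 100_000
    true_prob = rng.beta(8, 2, size=n)        # stable per-subject probabilities
    baseline = rng.random(n) < true_prob      # behavior observed at baseline?
    enrolled = baseline                       # cohort selected on baseline risk
    followup = rng.random(n) < true_prob      # independent re-measurement

    print(f"baseline prevalence (enrolled): {baseline[enrolled].mean():.0%}")
    print(f"follow-up prevalence (enrolled): {followup[enrolled].mean():.0%}")
    # The drop is pure regression to the mean: true_prob never changed.
    ```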

  5. Can Signal Abnormalities Detected with MR Imaging in Knee Articular Cartilage Be Used to Predict Development of Morphologic Cartilage Defects? 48-Month Data from the Osteoarthritis Initiative

    PubMed Central

    Gersing, Alexandra S.; Mbapte Wamba, John; Nevitt, Michael C.; McCulloch, Charles E.; Link, Thomas M.

    2016-01-01

    Purpose To determine the incidence with which morphologic articular cartilage defects develop over 48 months in cartilage with signal abnormalities at baseline magnetic resonance (MR) imaging in comparison with the incidence in articular cartilage without signal abnormalities at baseline. Materials and Methods The institutional review boards of all participating centers approved this HIPAA-compliant study. Right knees of 90 subjects from the Osteoarthritis Initiative (mean age, 55 years ± 8 [standard deviation]; 51% women) with cartilage signal abnormalities but without morphologic cartilage defects at 3.0-T MR imaging and without radiographic osteoarthritis (Kellgren-Lawrence score, 0–1) were frequency matched for age, sex, Kellgren-Lawrence score, and body mass index with right knees in 90 subjects without any signal abnormalities or morphologic defects in the articular cartilage (mean age, 54 years ± 5; 51% women). Individual signal abnormalities (n = 126) on intermediate-weighted fast spin-echo MR images were categorized into four subgrades: subgrade A, hypointense; subgrade B, inhomogeneous; subgrade C, hyperintense; and subgrade D, hyperintense with swelling. The development of morphologic articular cartilage defects (Whole-Organ MR Imaging Score ≥2) at 48 months was analyzed on a compartment level and was compared between groups by using generalized estimating equation logistic regression models. Results Cartilage signal abnormalities were more frequent in the patellofemoral joint than in the tibiofemoral joint (59.5% vs 39.5%). Subgrade A was seen more frequently than were subgrades C and D (36% vs 22%). Incidence of morphologic cartilage defects at 48 months was 57% in cartilage with baseline signal abnormalities, while only 4% of compartments without baseline signal abnormalities developed morphologic defects at 48 months (all compartments combined and each compartment separately, P < .01). The development of morphologic defects was not significantly more likely in any of the subgrades (P = .98) and was significantly associated with progression of bone marrow abnormalities (P = .002). Conclusion Knee cartilage signal abnormalities detected with MR imaging are precursors of morphologic defects with osteoarthritis and may serve as imaging biomarkers with which to assess risk for cartilage degeneration. © RSNA, 2016 PMID:27135833

  6. Enhancement of lung sounds based on empirical mode decomposition and Fourier transform algorithm.

    PubMed

    Mondal, Ashok; Banerjee, Poulami; Somkuwar, Ajay

    2017-02-01

    There is always a heart sound (HS) signal interfering during the recording of lung sound (LS) signals. This obscures the features of LS signals and creates confusion about any pathological states of the lungs. In this work, a new method is proposed for the reduction of heart sound interference, based on the empirical mode decomposition (EMD) technique and a prediction algorithm. In this approach, the mixed signal is first split into several components in terms of intrinsic mode functions (IMFs). Thereafter, HS-included segments are localized and removed from them. The missing values of the gap thus produced are predicted by a new Fast Fourier Transform (FFT) based prediction algorithm, and the time-domain LS signal is reconstructed by taking an inverse FFT of the estimated missing values. The experiments have been conducted on simulated and recorded HS-corrupted LS signals at three different flow rates and various SNR levels. The performance of the proposed method is evaluated by qualitative and quantitative analysis of the results. The proposed method is found to be superior to the baseline method in both quantitative and qualitative terms across different SNR levels. Our method gives a cross correlation index (CCI) of 0.9488, a signal to deviation ratio (SDR) of 9.8262, and a normalized maximum amplitude error (NMAE) of 26.94 at an SNR of 0 dB. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Development and validation of a simple and robust method for arsenic speciation in human urine using HPLC/ICP-MS.

    PubMed

    Sen, Indranil; Zou, Wei; Alvaran, Josephine; Nguyen, Linda; Gajek, Ryszard; She, Jianwen

    2015-01-01

    In order to better distinguish the different toxic inorganic and organic forms of arsenic (As) exposure in individuals, we have developed and validated a simple and robust analytical method for determining the following six As species in human urine: arsenous (III) acid (As-III), arsenic (V) acid (As-V), monomethylarsonic acid, dimethylarsinic acid, arsenobetaine (AsB), and arsenocholine. In this method, human urine is diluted using a pH 5.8 buffer, separation is performed using an anion exchange column with isocratic HPLC, and detection is achieved using inductively coupled plasma-MS. The method uses a single mobile phase consisting of low concentrations of both phosphate buffer (5 mM) and ammonium nitrate salt (5 mM) at pH 9.0; this minimizes the column equilibration time and overcomes challenges with the separation between AsB and As-III. In addition, As-III oxidation is prevented by degassing the sample preparation buffer at pH 5.8, degassing the mobile phase online at pH 9.0, and using low temperature (-70 °C) and flip-cap airtight tubes for long-term storage of samples. The method was validated using externally provided reference samples. Results were in agreement with target values at varying concentrations and successfully passed external performance test criteria. Internal QC samples were prepared and repeatedly analyzed to assess the method's long-term precision, and further analyses were completed on anonymous donor urine to assess the quality of the method's baseline separation. Precision studies yielded absolute CV values of 3-14% and recoveries from 82 to 115% for the six As species, and analysis of anonymous donor urine confirmed the well-resolved baseline separation capabilities of the method for real participant samples.

  8. Baseline models of trace elements in major aquifers of the United States

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace-element concentrations in baseline samples from a survey of aquifers used as potable-water supplies in the United States are summarized using methods appropriate for data with multiple detection limits. The resulting statistical distribution models are used to develop summary statistics and estimate probabilities of exceeding water-quality standards. The models are based on data from the major aquifer studies of the USGS National Water Quality Assessment (NAWQA) Program. These data were produced with a nationally-consistent sampling and analytical framework specifically designed to determine the quality of the most important potable groundwater resources during the years 1991-2001. The analytical data for all elements surveyed contain values that were below several detection limits. Such datasets are referred to as multiply-censored data. To address this issue, a robust semi-parametric statistical method called regression on order statistics (ROS) is employed. Utilizing the 90th-95th percentile as an arbitrary range for the upper limits of expected baseline concentrations, the models show that baseline concentrations of dissolved Ba and Zn are below 500 µg/L. For the same percentile range, dissolved As, Cu and Mo concentrations are below 10 µg/L, and dissolved Ag, Be, Cd, Co, Cr, Ni, Pb, Sb and Se are below 1-5 µg/L. These models are also used to determine the probabilities that potable ground waters exceed drinking water standards. For dissolved Ba, Cr, Cu, Pb, Ni, Mo and Se, the likelihood of exceeding the US Environmental Protection Agency standards at the well-head is less than 1-1.5%. A notable exception is As, which has approximately a 7% chance of exceeding the maximum contaminant level (10 µg/L) at the well head.
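
    The core of ROS can be sketched for the simplified case of a single detection limit (the NAWQA data involve multiple limits, which Helsel-style ROS handles with a more elaborate plotting-position scheme): detected values supply a lognormal fit on normal scores, and censored observations are imputed from the fitted line.

    ```python
    # Simplified regression on order statistics (ROS), single detection limit.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    true = rng.lognormal(mean=0.5, sigma=1.0, size=200)    # µg/L, synthetic
    dl = 1.0                                               # detection limit
    detected = true >= dl
    n = len(true)

    # With one detection limit every censored value ranks below every
    # detection, so Blom-type plotting positions can be assigned directly.
    cens_idx = np.where(~detected)[0]
    det_idx = np.where(detected)[0][np.argsort(true[detected])]
    order = np.concatenate([cens_idx, det_idx])
    ranks = np.empty(n)
    ranks[order] = np.arange(1, n + 1)
    z = stats.norm.ppf((ranks - 0.375) / (n + 0.25))

    # Fit log(concentration) ~ normal score on detected values only,
    # then impute each censored value from the fitted lognormal line.
    res = stats.linregress(z[detected], np.log(true[detected]))
    conc = true.copy()
    conc[~detected] = np.exp(res.intercept + res.slope * z[~detected])

    print(f"ROS mean: {conc.mean():.2f}  true mean: {true.mean():.2f}")
    print(f"ROS 95th percentile: {np.quantile(conc, 0.95):.2f}")
    ```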

  9. Optical Coherence Tomography Reflective Drusen Substructures Predict Progression to Geographic Atrophy in Non-neovascular Age-related Macular Degeneration

    PubMed Central

    Veerappan, Malini; El-Hage Sleiman, Abdul-Karim M.; Tai, Vincent; Chiu, Stephanie J.; Winter, Katrina P.; Stinnett, Sandra S.; Hwang, Thomas S.; Hubbard, G. Baker; Michelson, Michelle; Gunther, Randall; Wong, Wai T.; Chew, Emily Y.; Toth, Cynthia A.

    2016-01-01

    Purpose Structural and compositional heterogeneity within drusen, composed of lipid, carbohydrates, and proteins, have been previously described. We sought to detect and define phenotypic patterns of drusen heterogeneity in the form of optical coherence tomography–reflective drusen substructures (ODS) and examine their associations with age-related macular degeneration (AMD)-related features and AMD progression. Design Retrospective analysis in a prospective study. Participants Patients with intermediate AMD (n = 349) enrolled in the multicenter Age-Related Eye Disease Study 2 (AREDS2) ancillary spectral domain optical coherence tomography (SD OCT) study. Methods Baseline SD OCT scans of 1 eye per patient were analyzed for presence of ODS. Cross-sectional and longitudinal associations of ODS presence with AMD-related features visible on SD OCT and color photographs, including drusen volume, geographic atrophy (GA), and preatrophic features, were evaluated for the entire macular region. Similar associations were also made locally within a 0.5-mm diameter region around individual ODS and corresponding control region without ODS in the same eye. Main Outcome Measures Preatrophy SD OCT changes and GA, central GA, and choroidal neovascularization (CNV) from color photographs. Results Four phenotypic subtypes of ODS were defined: low reflective cores, high reflective cores, conical debris, and split drusen. Of the 349 participants, there were 307 eligible eyes and 74 (24%) had at least 1 ODS. The ODS at baseline were associated with (1) greater macular drusen volume at baseline (P < 0.001), (2) development of preatrophic changes at year 2 (P = 0.001–0.01), and (3) development of macular GA (P = 0.005) and preatrophic changes at year 3 (P = 0.002–0.008), but not development of CNV. The ODS at baseline in a local region were associated with (1) presence of preatrophy changes at baseline (P = 0.02-0.03) and (2) development of preatrophy changes at years 2 and 3 within the region (P = 0.008-0.05). Conclusions Optical coherence tomography–reflective drusen substructures are optical coherence tomography–based biomarkers of progression to GA, but not to CNV, in eyes with intermediate AMD. Optical coherence tomography–reflective drusen substructures may be a clinical entity helpful in monitoring AMD progression and informing mechanisms in GA pathogenesis. PMID:27793356

  10. [Environmental geochemical baseline of heavy metals in soils of the Ili river basin and pollution evaluation].

    PubMed

    Zhao, Xin-Ru; Nasier, Telajin; Cheng, Yong-Yi; Zhan, Jiang-Yu; Yang, Jian-Hong

    2014-06-01

    Environmental geochemical baseline models of Cu, Zn, Pb, As and Hg were established by a standardized method in the chernozem, chestnut soil, sierozem and saline soil of the Ili river valley region, and the theoretical baseline values were calculated. Three approaches were used to compare soil pollution degrees: baseline factor pollution index evaluation, environmental background value evaluation, and heavy metal cleanliness evaluation. The baseline factor pollution index evaluation showed that As pollution was the most prominent among the four typical soil types within the river basin, with 7.14%, 9.76% and 7.50% of sampling points in chernozem, chestnut soil and sierozem, respectively, reaching heavy pollution. In chestnut soil, 7.32% of sampling points exceeded the permitted Pb pollution index. The variation extent of As and Pb was the largest, indicating large human disturbance. The environmental background value evaluation showed that As was the main pollution element, followed by Cu, Zn and Pb. The heavy metal cleanliness evaluation showed that Cu, Zn and Pb were better than cleanliness level 2 and Hg was of cleanliness level 1 in all four soil types; As showed moderate pollution in sierozem, and was of cleanliness level 2 or better in chernozem, chestnut soil and saline-alkali soil. Comparing the three evaluation systems, the baseline factor pollution index evaluation most comprehensively reflected the geochemical migration characteristics of the elements and the soil formation processes, and its pollution assessment could be specific to the sampling points. The environmental background value evaluation neglected the natural migration of heavy metals and the deposition process in the soil, since it was established on regional background values. The main purpose of the heavy metal cleanliness evaluation was to evaluate the safety degree of the soil environment.

  11. Assessment of dental caries predictors in 6-year-old school children - results from 5-year retrospective cohort study

    PubMed Central

    2012-01-01

    Background This was a retrospective cohort study undertaken to assess the rate and pattern of dental caries development in 6-year-old school children followed up for a period of 5 years, and to identify baseline risk factors associated with 5-year caries experience in Malaysian children. Methods This 5-year retrospective cohort study comprised primary school children initially aged 6 years in 2004. The caries experience of each child was recorded annually using World Health Organization criteria. The rates of dental caries were recorded as the prevalence and incidence density of carious lesions from baseline to the final examination. Risk assessment was done to assess the relative risk of caries after 5 years given baseline caries status. Simple and multiple logistic regression analyses were performed to identify significant independent risk factors for caries. Results The sample consisted of 1830 school children. All components of DMFT showed significant differences between the baseline and final examinations. The filled teeth (FT) component of the DMFT showed the greatest increase. Results revealed that the baseline caries level in the permanent dentition was a strong predictor of future caries after 5 years (RR=3.78, 95% CI=3.48-4.10, P<0.001). Logistic regression analysis showed a significant association between caries occurrence and residence (urban/rural) (OR=1.80, P<0.001); however, caries occurrence was not significantly associated with gender or ethnicity. The incidence density of caries-affected persons (IDp) from baseline to 5 years was 5.80 persons/100 person-years of observation, and the rate of new caries-affected teeth (IDt) over the same period was 0.76 teeth/100 teeth-years of observation. Conclusion The majority of 12-year-old school children (70%) were caries-free, and most of the caries were concentrated in only a small proportion (30%) of them. We found that the presence of caries in permanent teeth at the age of 6 years was a strong predictor of future caries development in this population. The strong predictive value of early permanent-tooth caries at age six for caries incidence at age 12, which can be assessed at almost no cost, questions the need for and cost-effectiveness of expensive technology-based commercial caries-prediction kits. PMID:23158416

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Dong; Heidelberger, Philip; Sugawara, Yutaka

    An apparatus and method for extending the scalability and improving the partitionability of networks that contain all-to-all links for transporting packet traffic from a source endpoint to a destination endpoint with low per-endpoint (per-server) cost and a small number of hops. An all-to-all wiring in the baseline topology is decomposed into smaller all-to-all components in which each smaller all-to-all connection is replaced with a star topology by using global switches. Stacking multiple copies of the star-topology baseline network creates a multi-planed switching topology for transporting packet traffic. A point-to-point unified stacking method using global switch wiring methods connects multiple planes of a baseline topology by using the global switches to create a large network size with a low number of hops, i.e., low network latency. A grouped unified stacking method increases the scalability (network size) of a stacked topology.

  13. Different antidiabetic regimens and the development of renal dysfunction in US Veterans with type 2 diabetes mellitus.

    PubMed

    Gosmanova, Elvira O; Canada, Robert B; Wan, Jim; Mangold, Therese A; Wall, Barry M

    2012-10-01

    The aim of this study was to evaluate the development of renal dysfunction in veterans with type 2 diabetes mellitus (T2DM) treated with different antidiabetic regimens, in a retrospective cohort study involving 1715 patients with T2DM and baseline serum creatinine (SCr) of 1.5 mg/dL or less. The development of renal dysfunction, defined as an increase of 0.5 mg/dL or more from baseline SCr during 4.8 years of follow-up, was compared between metformin monotherapy users (M), two combination-therapy groups (metformin + insulin, MI; metformin + sulfonylurea, MS) and sulfonylurea monotherapy users (S). Both the MI and MS groups had higher mean baseline hemoglobin A1C (HbA1C) (9.0% and 8.6%, respectively) and higher rates of baseline macroalbuminuria (17.3% and 12.1%, respectively) than the M and S groups (mean HbA1C 7.7% in both groups; proteinuria 5.1% in M and 7.4% in S). In unadjusted analysis, the development of renal dysfunction was more frequent in the MI and MS groups but not in the M group, as compared with sulfonylurea monotherapy (unadjusted HRs [95% confidence interval (CI)]: 2.1 [1.4-3.0], 1.4 [1.1-1.9], and 1.0 [0.6-1.7], respectively). However, differences in the development of renal dysfunction were not significant between the 4 groups after adjusting for baseline variables. Baseline macroalbuminuria was a strong predictor of SCr elevation of 0.5 mg/dL or more during follow-up (adjusted HR, 3.1 [1.9-4.7]). Unexpectedly, baseline use of renin-angiotensin-aldosterone system (RAAS) blockers was also associated with the development of renal dysfunction (adjusted HR, 1.9 [1.3-2.8]). In this retrospective cohort study involving predominantly male US veterans with T2DM, baseline macroalbuminuria and use of RAAS blockers were associated with an increased risk of development of renal dysfunction, whereas the different antidiabetic regimens were not.

  14. Large-scale linear rankSVM.

    PubMed

    Lee, Ching-Pei; Lin, Chih-Jen

    2014-04-01

    Linear rankSVM is one of the widely used methods for learning to rank. Although its performance may be inferior to nonlinear methods such as kernel rankSVM and gradient boosting decision trees, linear rankSVM is useful for quickly producing a baseline model. Furthermore, following its recent development for classification, linear rankSVM may give competitive performance for large and sparse data. A great deal of work has studied linear rankSVM, with a focus on computational efficiency when the number of preference pairs is large. In this letter, we systematically study existing works, discuss their advantages and disadvantages, and propose an efficient algorithm. We discuss different implementation issues and extensions with detailed experiments. Finally, we develop a robust linear rankSVM tool for public use.
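
    A quick way to produce such a baseline model is the classic pairwise reduction: preference pairs become a binary classification problem over feature differences, solvable with any linear SVM. The sketch below is this generic reduction, not the letter's algorithm; its explicit O(n²) pair expansion is exactly the cost that the efficient methods studied in the letter avoid.

    ```python
    # Pairwise reduction for a baseline linear rankSVM.
    import itertools
    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(6)
    X = rng.normal(size=(100, 5))                        # feature vectors
    scores = X @ np.array([1.0, -0.5, 0.3, 0.0, 2.0])    # latent relevance

    # build difference vectors for all preference pairs
    diffs, labels = [], []
    for i, j in itertools.combinations(range(len(X)), 2):
        diffs.append(X[i] - X[j])
        labels.append(1 if scores[i] > scores[j] else -1)

    model = LinearSVC(C=1.0, fit_intercept=False).fit(np.array(diffs), labels)
    w = model.coef_.ravel()                              # ranking weight vector
    print("ranking order of first 5 items:", np.argsort(-(X[:5] @ w)))
    ```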

  15. Uncertainty of future projections of species distributions in mountainous regions.

    PubMed

    Tang, Ying; Winkler, Julie A; Viña, Andrés; Liu, Jianguo; Zhang, Yuanbin; Zhang, Xiaofeng; Li, Xiaohong; Wang, Fang; Zhang, Jindong; Zhao, Zhiqiang

    2018-01-01

    Multiple factors introduce uncertainty into projections of species distributions under climate change. The uncertainty introduced by the choice of baseline climate information used to calibrate a species distribution model and to downscale global climate model (GCM) simulations to a finer spatial resolution is a particular concern for mountainous regions, as the spatial resolution of climate observing networks is often insufficient to detect the steep climatic gradients in these areas. Using the maximum entropy (MaxEnt) modeling framework together with occurrence data on 21 understory bamboo species distributed across the mountainous geographic range of the Giant Panda, we examined the differences in projected species distributions obtained from two contrasting sources of baseline climate information, one derived from spatial interpolation of coarse-scale station observations and the other derived from fine-spatial resolution satellite measurements. For each bamboo species, the MaxEnt model was calibrated separately for the two datasets and applied to 17 GCM simulations downscaled using the delta method. Greater differences in the projected spatial distributions of the bamboo species were observed for the models calibrated using the different baseline datasets than between the different downscaled GCM simulations for the same calibration. In terms of the projected future climatically-suitable area by species, quantification using a multi-factor analysis of variance suggested that the sum of the variance explained by the baseline climate dataset used for model calibration and the interaction between the baseline climate data and the GCM simulation via downscaling accounted for, on average, 40% of the total variation among the future projections. Our analyses illustrate that the combined use of gridded datasets developed from station observations and satellite measurements can help estimate the uncertainty introduced by the choice of baseline climate information to the projected changes in species distribution.
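
    The variance decomposition described above can be sketched with a two-factor ANOVA on synthetic projections; the factor names, replication by species, and effect sizes are illustrative.

    ```python
    # Two-factor ANOVA: share of variation attributable to the baseline
    # climate dataset, the GCM, and their interaction.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    rng = np.random.default_rng(7)
    rows = [(b, g, s) for b in ("station", "satellite")
            for g in range(17) for s in range(21)]
    df = pd.DataFrame(rows, columns=["baseline", "gcm", "species"])
    # synthetic projected suitable area: the baseline dataset matters most
    df["area"] = (np.where(df["baseline"] == "station", 100.0, 130.0)
                  + 0.8 * df["gcm"] + rng.normal(0, 8, len(df)))

    model = ols("area ~ C(baseline) * C(gcm)", data=df).fit()
    anova = sm.stats.anova_lm(model, typ=2)
    shares = (anova["sum_sq"] / anova["sum_sq"].sum()).round(2)
    print(shares)   # variance shares for baseline, GCM, interaction, residual
    ```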

  17. [Calculation on ecological security baseline based on the ecosystem services value and the food security].

    PubMed

    He, Ling; Jia, Qi-jian; Li, Chao; Xu, Hao

    2016-01-01

    The rapid development of the coastal economy in Hebei Province has caused a rapid transition of coastal land use structure, which threatens land ecological security. Calculating the ecosystem service value of land use and exploring the ecological security baseline can therefore provide a basis for regional ecological protection and rehabilitation. Taking Huanghua, a city in the southeast of Hebei Province, as an example, this study explored the joint point, joint path and joint method between ecological security and food security, and then calculated the ecological security baseline of Huanghua City based on the ecosystem service value and the food safety standard. The results showed that ecosystem service value per unit area, from maximum to minimum, followed the order: wetland, water, garden, cultivated land, meadow, other land, salt pans, saline and alkaline land, construction land. The contribution rates of the ecological function values, from high to low, were in the order: nutrient recycling, water conservation, entertainment and culture, material production, biodiversity maintenance, gas regulation, climate regulation and environmental purification. The security baseline of grain production was 0.21 kg · m⁻², the security baseline of grain output value was 0.41 yuan · m⁻², the baseline of ecosystem service value was 21.58 yuan · m⁻², and the total ecosystem service value in the research area was 4.244 billion yuan. By 2081, ecological security will reach the bottom line, and the human-dominated ecological system will be on the verge of collapse. According to ecological security status, Huanghua can be divided into 4 zones: an ecological core protection zone, an ecological buffer zone, an ecological restoration zone and a human activity core zone.

  18. Cyber and Traditional Bullying Victimization as a Risk Factor for Mental Health Problems and Suicidal Ideation in Adolescents

    PubMed Central

    Bannink, Rienke; Broeren, Suzanne; van de Looij – Jansen, Petra M.; de Waart, Frouwkje G.; Raat, Hein

    2014-01-01

    Purpose To examine whether traditional and cyber bullying victimization were associated with adolescents' mental health problems and suicidal ideation at two-year follow-up. Gender differences were explored to determine whether bullying affects boys and girls differently. Methods A two-year longitudinal study was conducted among first-year secondary school students (N = 3181). Traditional and cyber bullying victimization were assessed at baseline, whereas mental health status and suicidal ideation were assessed at baseline and follow-up by means of self-report questionnaires. Logistic regression analyses were conducted to assess associations between these variables while controlling for baseline problems. Additionally, we tested whether gender differences in mental health and suicidal ideation were present for the two types of bullying. Results There was a significant interaction between gender and traditional bullying victimization and between gender and cyber bullying victimization on mental health problems. Among boys, traditional and cyber bullying victimization were not related to mental health problems after controlling for baseline mental health. Among girls, both traditional and cyber bullying victimization were associated with mental health problems after controlling for baseline mental health. No significant interaction between gender and traditional or cyber bullying victimization on suicidal ideation was found. Traditional bullying victimization was associated with suicidal ideation, whereas cyber bullying victimization was not, after controlling for baseline suicidal ideation. Conclusions Traditional bullying victimization is associated with an increased risk of suicidal ideation, whereas both traditional and cyber bullying victimization are associated with an increased risk of mental health problems among girls. These findings stress the importance of programs aimed at reducing bullying behavior, especially because early-onset mental health problems may pose a risk for the development of psychiatric disorders in adulthood. PMID:24718563

  19. Receptivity to alcohol marketing predicts initiation of alcohol use

    PubMed Central

    Henriksen, Lisa; Feighery, Ellen C.; Schleicher, Nina C.; Fortmann, Stephen P.

    2008-01-01

    Purpose This longitudinal study examined the influence of alcohol advertising and promotions on the initiation of alcohol use. A measure of receptivity to alcohol marketing was developed from research about tobacco marketing. Recall and recognition of alcohol brand names were also examined. Methods Data were obtained from in-class surveys of 6th, 7th, and 8th graders at baseline and 12-month follow-up. Participants who were classified as never drinkers at baseline (n=1,080) comprised the analysis sample. Logistic regression models examined the association of advertising receptivity at baseline with any alcohol use and current drinking at follow-up, adjusting for multiple risk factors, including peer alcohol use, school performance, risk taking, and demographics. Results At baseline, 29% of never drinkers either owned or wanted to use an alcohol-branded promotional item (high receptivity), 12% of students named the brand of their favorite alcohol ad (moderate receptivity), and 59% were not receptive to alcohol marketing. Approximately 29% of adolescents reported any alcohol use at follow-up; 13% reported drinking on at least 1 or 2 days in the past month. Never drinkers who reported high receptivity to alcohol marketing at baseline were 77% more likely to initiate drinking by follow-up than those who were not receptive. Smaller increases in the odds of alcohol use at follow-up were associated with better recall and recognition of alcohol brand names at baseline. Conclusions Alcohol advertising and promotions are associated with the uptake of drinking. Prevention programs may reduce adolescents' receptivity to alcohol marketing by limiting their exposure to alcohol ads and promotions and by increasing their skepticism about the sponsors' marketing tactics. PMID:18155027

  20. Atmospheric phase characteristics of the ALMA long baseline

    NASA Astrophysics Data System (ADS)

    Matsushita, Satoki; Asaki, Yoshiharu; Fomalont, Edward B.; Barkats, Denis; Corder, Stuartt A.; Hills, Richard E.; Kawabe, Ryohei; Maud, Luke T.; Morita, Koh-Ichiro; Nikolic, Bojan; Tilanus, Remo P. J.; Vlahakis, Catherine

    2016-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) is the world's largest millimeter/submillimeter (mm/submm) interferometer. Along with science observations, ALMA has performed several long baseline campaigns in the last 6 years to characterize and optimize its long baseline capabilities. To achieve the full long baseline capability of ALMA, it is important to understand the characteristics of atmospheric phase fluctuation at long baselines, since this is believed to be the main cause of mm/submm image degradation. For the first time, we present detailed properties of atmospheric phase fluctuation at mm/submm wavelengths for baselines up to 15 km in length. Atmospheric phase fluctuation increases as a function of baseline length with a power-law slope close to 0.6, and many of the data display a shallower slope (0.2-0.3) at baseline lengths greater than about 1 km. Some of the data, on the other hand, show a single slope up to the maximum baseline length of around 15 km. The phase correction method based on water vapor radiometers (WVRs) works well, especially for cases with precipitable water vapor (PWV) greater than 1 mm, typically yielding a 50% or greater decrease in the degree of phase fluctuation. However, a significant amount of atmospheric phase fluctuation remains after the WVR phase correction: about 200 µm in rms excess path length (rms phase fluctuation in units of length), even at PWV less than 1 mm. This result suggests the existence of other, non-water-vapor sources of phase fluctuation, and emphasizes the need for additional phase correction methods, such as band-to-band and/or fast switching.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsuta, Y; Tohoku University Graduate School of Medicine, Sendai, Miyagi; Kadoya, N

    Purpose: In this study, we developed a system to calculate, in real time and without additional treatment planning system (TPS) calculation, a three-dimensional (3D) dose that reflects the dosimetric error caused by leaf miscalibration for head-and-neck and prostate volumetric modulated arc therapy (VMAT). Methods: An original Clarkson-dose-calculation-based system for computing the dosimetric error caused by leaf miscalibration was developed in MATLAB (MathWorks, Natick, MA). Our program first uses Clarkson dose calculation to compute point doses at the isocenter for the baseline VMAT plan and for a modified plan generated by inducing MLC errors that enlarge the aperture size by 1.0 mm. Second, the error-induced 3D dose is generated by transforming the TPS baseline 3D dose using the calculated point doses. Results: Mean computing time was less than 5 seconds. For seven head-and-neck and prostate plans, the 3D gamma passing rates (0.5%/2 mm, global) between our method and the TPS-calculated error-induced 3D dose were 97.6±0.6% and 98.0±0.4%. The percentage dose changes for the dose-volume-histogram parameter of mean dose on the target volume were 0.1±0.5% and 0.4±0.3%, and for generalized equivalent uniform dose on the target volume were −0.2±0.5% and 0.2±0.3%. Conclusion: The erroneous 3D dose calculated by our method is useful for checking the dosimetric error caused by leaf miscalibration before pretreatment patient QA dosimetry checks.
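
    A hedged sketch of the dose-transformation step: the TPS baseline 3D dose grid is rescaled by the ratio of the two independently calculated isocenter point doses (error plan over baseline). The Clarkson point-dose calculation itself, and any spatial weighting the authors' transform may apply, are omitted; the uniform scaling here is an assumption for illustration.

    ```python
    # Rescale a TPS 3D dose grid by an isocenter point-dose ratio.
    import numpy as np

    def error_induced_dose(tps_dose_3d, d_iso_baseline, d_iso_error):
        """Uniformly rescale the baseline 3D dose by the point-dose ratio."""
        return tps_dose_3d * (d_iso_error / d_iso_baseline)

    dose = np.random.default_rng(8).uniform(0.0, 2.0, size=(64, 64, 40))  # Gy
    scaled = error_induced_dose(dose, d_iso_baseline=2.00, d_iso_error=2.06)
    print(f"mean dose change: {scaled.mean() / dose.mean() - 1:+.1%}")
    ```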

  2. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    PubMed

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses have been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count, and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
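
    In spirit, the conditional analysis can be sketched as a negative binomial regression of the follow-up count on treatment with the baseline count as a covariate, here via a GLM in statsmodels with a fixed dispersion; the paper's exact conditional likelihood is not reproduced.

    ```python
    # Negative binomial regression of a follow-up count on treatment,
    # conditioning on the (log) baseline count.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(9)
    n = 400
    baseline = rng.poisson(5, size=n)                 # pre-randomization count
    treat = rng.integers(0, 2, size=n)                # randomized arm
    mu = np.exp(0.3 + 0.4 * np.log1p(baseline) - 0.5 * treat)
    y = rng.negative_binomial(n=2, p=2 / (2 + mu))    # follow-up counts

    X = sm.add_constant(np.column_stack([np.log1p(baseline), treat]))
    fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
    print(fit.summary().tables[1])   # x1: baseline effect, x2: treatment effect
    ```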

  3. DEM generation in cloudy-rainy mountainous area with multi-baseline SAR interferometry

    NASA Astrophysics Data System (ADS)

    Wu, Hong'an; Zhang, Yonghong; Jiang, Decai; Kang, Yonghui

    2018-03-01

    Conventional single-baseline InSAR is easily affected by atmospheric artifacts, making it difficult to generate a high-precision DEM. To solve this problem, a multi-baseline interferometric phase accumulation method with weights fixed by coherence is proposed in this paper to generate a higher-accuracy DEM. A mountainous area in Kunming, Yunnan Province, China, characterized by cloudy weather, rugged terrain and dense vegetation, is selected as the study area. The multi-baseline InSAR experiments are carried out using four ALOS-2 PALSAR-2 images. The generated DEM is evaluated against the Chinese Digital Products of Fundamental Geographic Information 1:50000 DEM. The results demonstrate that: 1) the proposed method can reduce atmospheric artifacts significantly; and 2) the accuracy of the InSAR DEM generated from six interferograms satisfies the standard of 1:50000 DEM Level Three and American DTED-1.
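
    The coherence-weighted accumulation idea can be sketched in a few lines: each interferogram's phase contributes in proportion to its coherence, so that low-quality, atmosphere-dominated pixels are down-weighted. Coregistration, unwrapping, baseline normalization, and phase-to-height conversion are all omitted, so this is a schematic of the weighting only.

    ```python
    # Coherence-weighted stacking of unwrapped interferometric phases.
    import numpy as np

    def weighted_phase_stack(phases, coherences):
        """phases, coherences: lists of 2-D arrays, one pair per interferogram."""
        num = np.zeros_like(phases[0])
        den = np.zeros_like(phases[0])
        for phi, gamma in zip(phases, coherences):
            num += gamma * phi
            den += gamma
        return num / np.maximum(den, 1e-6)   # coherence-weighted mean phase

    rng = np.random.default_rng(10)
    truth = rng.normal(size=(50, 50))                    # shared terrain phase
    phases = [truth + rng.normal(0, 0.8, truth.shape) for _ in range(6)]
    coherences = [rng.uniform(0.3, 0.9, truth.shape) for _ in range(6)]
    stacked = weighted_phase_stack(phases, coherences)
    print(f"residual rms: {np.sqrt(np.mean((stacked - truth) ** 2)):.2f}")
    ```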

  4. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-09-20

    These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety (NCS) validation. It uses the sensitivity profile data for an application as computed by MCNP6, along with covariance files for the nuclear data, to determine a baseline upper subcritical limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation, making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.

  5. Comparative Study of Drift Compensation Methods for Environmental Gas Sensors

    NASA Astrophysics Data System (ADS)

    Abidin, M. Z.; Asmat, Arnis; Hamidon, M. N.

    2018-02-01

    Most drift compensation attempts for environmental gas sensors emphasize only the "already-known" drift-causing parameters (e.g., ambient temperature, relative humidity). Less consideration is given to other parameters (e.g., baseline responses) that may be affected indirectly by the drift-causing variable (in this context, ambient temperature). In this study, the "indirect" drift-causing parameter (drifted baseline responses) is taken into consideration in compensating the sensor drift caused by ambient temperature variation, by means of a proposed drift compensation method (named the RT-method). The effectiveness of this method in compensating drift was analysed and compared, using the drift reduction percentage, with the common method that uses only the "already-known" drift-causing parameter (named the T-method). The analysis shows that the RT-method outperforms the T-method in drift reduction percentage, reducing drift by up to 64% versus only 45% for the T-method on the TGS2600 sensor. This demonstrates that including drifted baseline responses in drift compensation results in improved drift-compensation efficiency.

  6. Developing and Piloting a Baselining Tool for Education for Sustainable Development and Global Citizenship (ESDGC) in Welsh Higher Education

    ERIC Educational Resources Information Center

    Glover, Alison; Jones, Yvonne; Claricoates, Jane; Morgan, Jan; Peters, Carl

    2013-01-01

    Mainstreaming Education for Sustainable Development in higher education is vital if graduates are to possess the abilities, skills, and knowledge needed to tackle the sustainability issues of the future. In this article we explain the development and piloting of a baselining tool, the Education for Sustainable Development and Global Citizenship…

  7. A New Strategy for ECG Baseline Wander Elimination Using Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Shahbakhti, Mohammad; Bagheri, Hamed; Shekarchi, Babak; Mohammadi, Somayeh; Naji, Mohsen

    2016-06-01

    Electrocardiogram (ECG) signals can be affected by various artifacts and noises with biological and external sources. Baseline wander (BW) is a low-frequency artifact that may be caused by breathing, body movements and loose sensor contact. In this paper, a novel method based on empirical mode decomposition (EMD) for the removal of baseline noise from the ECG is presented. Compared to other EMD-based methods, the novelty of this research is to reach an optimized number of decomposition levels for ECG BW de-noising using the mean power frequency (MPF), while also reducing processing time. To evaluate the performance of the proposed method, a fifth-order Butterworth high-pass filter (BHPF) with a cut-off frequency of 0.5 Hz and a wavelet approach are applied for comparison. Three performance indices between the pure and filtered signals have been utilized to qualify the presented techniques: signal-to-noise ratio (SNR), mean square error (MSE) and correlation coefficient (CC). The results suggest that the EMD-based method outperforms the other filtering methods.
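
    A sketch of the MPF-guided EMD approach follows, using the PyEMD package (an assumption; any EMD implementation serves). IMFs whose mean power frequency falls below roughly 0.5 Hz are summed as the baseline estimate and subtracted; the paper's actual rule selects the number of decomposition levels via MPF, so this fixed threshold is a simplified stand-in.

    ```python
    # EMD-based baseline wander removal guided by mean power frequency (MPF).
    import numpy as np
    from PyEMD import EMD   # assumes the PyEMD package (pip install EMD-signal)

    fs = 360.0                                    # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    ecg = np.sin(2 * np.pi * 1.2 * t)             # toy "cardiac" component
    wander = 0.8 * np.sin(2 * np.pi * 0.25 * t)   # respiratory-band drift
    signal = ecg + wander

    imfs = EMD().emd(signal)                      # intrinsic mode functions

    def mean_power_frequency(x, fs):
        """MPF = sum(f * P(f)) / sum(P(f)) over the one-sided spectrum."""
        p = np.abs(np.fft.rfft(x)) ** 2
        f = np.fft.rfftfreq(len(x), 1 / fs)
        return np.sum(f * p) / np.sum(p)

    # treat IMFs whose MPF falls below ~0.5 Hz as the baseline estimate
    baseline = sum(imf for imf in imfs if mean_power_frequency(imf, fs) < 0.5)
    cleaned = signal - baseline
    print(f"wander rms before: {wander.std():.2f}, "
          f"residual after: {(cleaned - ecg).std():.2f}")
    ```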

  8. A Fast Radio Burst Search Method for VLBI Observation

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Tong, Fengxian; Zheng, Weimin; Zhang, Juan; Tong, Li

    2018-02-01

    We introduce a cross-spectrum-based fast radio burst (FRB) search method for very long baseline interferometry (VLBI) observation. This method optimizes the fringe-fitting scheme used in geodetic VLBI data post-processing, fully utilizing the cross-spectrum fringe phase information and therefore maximizing the power of single-pulse signals. Working with the cross-spectrum greatly reduces the effect of radio frequency interference compared with using the auto-power spectrum. Single-pulse detection confidence increases by cross-identifying detections from multiple baselines, and by combining the power of multiple baselines we may improve the detection sensitivity. Our method is similar to coherent beam forming, but without the computational expense of forming a great number of beams to cover the whole field of view of our telescopes. The data processing pipeline designed for this method is easy to implement and parallelize, and can be deployed in various kinds of VLBI observations. In particular, we point out that VGOS observations are very suitable for FRB searches.
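
    The multi-baseline combination step can be illustrated generically: per-baseline cross-power time series are standardized and summed, so a pulse common to all baselines grows roughly as the square root of the number of baselines relative to the noise. Fringe fitting, dedispersion, and the cross-identification logic are omitted; the data here are synthetic.

    ```python
    # Incoherent combination of per-baseline cross-power for pulse detection.
    import numpy as np

    rng = np.random.default_rng(12)
    n_baselines, n_t = 6, 2000
    power = rng.normal(0, 1, size=(n_baselines, n_t))   # noise-like cross-power
    power[:, 1234] += 4.0                               # a weak single pulse

    # standardize each baseline, then sum; S/N grows ~ sqrt(n_baselines)
    z = ((power - power.mean(axis=1, keepdims=True))
         / power.std(axis=1, keepdims=True))
    combined = z.sum(axis=0) / np.sqrt(n_baselines)
    peak = int(np.argmax(combined))
    print(f"detected sample {peak} at S/N {combined[peak]:.1f}")
    ```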

  9. "Together at school"--a school-based intervention program to promote socio-emotional skills and mental health in children: study protocol for a cluster randomized controlled trial.

    PubMed

    Björklund, Katja; Liski, Antti; Samposalo, Hanna; Lindblom, Jallu; Hella, Juho; Huhtinen, Heini; Ojala, Tiina; Alasuvanto, Paula; Koskinen, Hanna-Leena; Kiviruusu, Olli; Hemminki, Elina; Punamäki, Raija-Leena; Sund, Reijo; Solantaus, Tytti; Santalahti, Päivi

    2014-10-07

    Schools provide a natural context to promote children's mental health. However, there is a need for more evidence-based, high-quality school intervention programs, combined with accurate evaluation of both their general effectiveness and the effectiveness of specific intervention methods. The aim of this paper is to present a study protocol for a cluster randomized controlled trial evaluating the "Together at School" intervention program. The intervention program is designed to promote social-emotional skills and mental health by utilizing a whole-school approach, and focuses on the classroom curriculum, the work environment of school staff, and parent-teacher collaboration methods. The evaluation study examines the effects of the intervention on children's socio-emotional skills and mental health in a cluster randomized controlled trial design with 1) an intervention group and 2) an active control group. Altogether 79 primary schools participated at baseline. A multi-informant setting involves the children themselves, their parents, and teachers. The primary outcomes are measured using parent and teacher ratings of children's socio-emotional skills and psychological problems, measured by the Strengths and Difficulties Questionnaire and the Multisource Assessment of Social Competence Scale. Secondary outcomes for the children include emotional understanding, altruistic behavior, and executive functions (e.g. working memory, planning, and inhibition). Secondary outcomes for the teachers include ratings of e.g. school environment, teaching style and well-being. Secondary outcomes for both teachers and parents include e.g. emotional self-efficacy, child rearing practices, and teacher-parent collaboration. Data were collected at baseline (autumn 2013) and 6 months after baseline, and will also be collected 18 months after baseline from the same participants. This study protocol outlines a trial which aims to add to the current state of intervention programs by presenting and studying a contextually developed and carefully tested intervention program which is tailored to fit a national school system. Identification of effective intervention elements to promote children's mental health in the early school years is crucial for optimal later development. ClinicalTrials.gov register: NCT02178332.

  10. Statistical issues on the analysis of change in follow-up studies in dental research.

    PubMed

    Blance, Andrew; Tu, Yu-Kang; Baelum, Vibeke; Gilthorpe, Mark S

    2007-12-01

    To provide an overview of the problems in the design and associated analyses of follow-up studies in dental research, particularly addressing three issues: treatment-baseline interactions, statistical power, and nonrandomization. Our previous work has shown that many studies purport an interaction between change (from baseline) and baseline values, which is often based on inappropriate statistical analyses. A priori power calculations are essential for randomized controlled trials (RCTs), but in the pre-test/post-test RCT design it is not well known to dental researchers that the choice of statistical method affects power, and that power is affected by treatment-baseline interactions. A common (good) practice in the analysis of RCT data is to adjust for baseline outcome values using ANCOVA, thereby increasing statistical power. However, an important requirement for ANCOVA is that there be no interaction between the groups and the baseline outcome (i.e. effective randomization); the patient-selection process should not cause differences in mean baseline values across groups. This assumption is often violated in nonrandomized (observational) studies, and the use of ANCOVA is thus problematic, potentially giving biased estimates, invoking Lord's paradox and leading to difficulties in the interpretation of results. Baseline interaction issues can be overcome by statistical methods not widely practiced in dental research: Oldham's method and multilevel modelling; the latter is preferred for its greater flexibility in dealing with more than one follow-up occasion as well as additional covariates. To illustrate these three key issues, hypothetical examples are considered from the fields of periodontology, orthodontics, and oral implantology. Caution needs to be exercised when considering the design and analysis of follow-up studies. ANCOVA is generally inappropriate for nonrandomized studies, and causal inferences from observational data should be avoided.
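
    The ANCOVA model under discussion is easy to state concretely: regress the follow-up value on group with the baseline value as a covariate, and compare it with the change-score analysis. In the randomized setting simulated below, ANCOVA recovers the same treatment effect with a smaller standard error; with nonrandomized groups, this adjustment would instead be problematic, as the text notes. The outcome and effect sizes are illustrative.

    ```python
    # ANCOVA (baseline-adjusted) versus change-score analysis of a two-arm trial.
    import numpy as np
    import pandas as pd
    from statsmodels.formula.api import ols

    rng = np.random.default_rng(11)
    n = 120
    group = rng.integers(0, 2, size=n)              # randomized arms
    baseline = rng.normal(50, 8, size=n)            # baseline outcome value
    follow = baseline * 0.7 + 10 - 3.0 * group + rng.normal(0, 4, size=n)
    df = pd.DataFrame({"group": group, "baseline": baseline, "follow": follow})

    ancova = ols("follow ~ baseline + group", data=df).fit()
    change = ols("I(follow - baseline) ~ group", data=df).fit()
    print("ANCOVA effect:      ", round(ancova.params["group"], 2),
          "SE", round(ancova.bse["group"], 2))
    print("change-score effect:", round(change.params["group"], 2),
          "SE", round(change.bse["group"], 2))
    ```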

  11. Development and validation of an MEKC method for determination of nitrogen-containing drugs in pharmaceutical preparations.

    PubMed

    Buiarelli, Francesca; Coccioli, Franco; Jasionowska, Renata; Terracciano, Alessandro

    2008-09-01

    A fast and accurate micellar electrokinetic capillary chromatography (MEKC) method was developed for quality control of pharmaceutical preparations containing cold remedies such as acetaminophen, salicylamide, caffeine, phenylephrine, pseudoephedrine, norephedrine and chlorpheniramine. The method optimization was carried out on a Beckman P/ACE System MDQ instrument. Baseline separation of the seven analytes was achieved in an uncoated fused-silica capillary (internal diameter 50 µm) using a tris-borate background electrolyte (BGE; 20 mM, pH 8.5) containing 30 mM sodium dodecyl sulphate. On-line UV detection at 214 nm was performed, the applied voltage was 10 kV, and the operating temperature was 25 °C. After optimization of the experimental conditions, the proposed method was validated. The evaluated parameters were: precision of migration time and of corrected peak area ratio, linearity range, limit of detection, limit of quantification, accuracy (recovery), ruggedness and applicability. The method was then successfully applied to the analysis of three pharmaceutical preparations containing some of the analytes listed above.

  12. High-performance liquid chromatographic analysis of dextromethorphan, guaifenesin and benzoate in a cough syrup for stability testing.

    PubMed

    Galli, V; Barbas, C

    2004-09-10

    A method has been developed for the analysis of a cough syrup containing dextromethorphan, guaifenesin, benzoic acid, saccharin and other components. Forced degradation was also studied to demonstrate that the method could be employed during a stability study of the syrup. The final conditions were phosphate buffer (25 mM, pH 2.8) with triethylamine (TEA)-acetonitrile (75:25, v/v). Under these conditions, all the actives, excipients and degradation products were baseline-resolved in less than 14 min, and different wavelengths were used for the different analytes and related compounds.

  13. Long-Baseline Comparisons of the Brazilian National Time Scale to UTC (NIST) Using Near Real-Time and Postprocessed Solutions

    DTIC Science & Technology

    2007-11-01

    The Sistema Interamericano de Metrologia (SIM) is a regional metrology organization. The Brazilian national time scale was compared to UTC(NIST), the two time scales being separated by a long baseline of ~8600 km; the comparisons were made with measurement systems developed for SIM time comparisons. Near real-time and postprocessed measurement solutions are compared and summarized.

  14. News video story segmentation method using fusion of audio-visual features

    NASA Astrophysics Data System (ADS)

    Wen, Jun; Wu, Ling-da; Zeng, Pu; Luan, Xi-dao; Xie, Yu-xiang

    2007-11-01

    News story segmentation is an important aspect of news video analysis. This paper presents a method for news video story segmentation. Different from prior works, which are based on visual features, the proposed technique uses audio features as a baseline and fuses visual features with them to refine the results. First, it selects silence clips as audio-feature candidate points, and shot boundaries and anchor shots as two kinds of visual-feature candidate points. It then uses the audio-feature candidates as cues and develops a fusion method that refines them with the diverse visual candidates to obtain the story boundaries (see the sketch after this abstract). Experimental results show that this method has high efficiency and adaptability to different kinds of news video.
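
    A toy sketch of the fusion logic described, with invented timestamps and an invented agreement window: silence points are kept as story boundaries only when a visual candidate (shot boundary or anchor shot) lies nearby.

    # Sketch: refine audio (silence) candidates with visual candidates.
    silence_pts = [12.4, 63.0, 118.7, 240.2]     # audio candidates (s)
    shot_bounds = [12.6, 80.1, 119.0, 180.5]     # visual candidates (s)
    anchor_shots = [12.5, 118.9]                 # second visual cue (s)
    TOL = 1.0                                    # agreement window (s)

    def near(t, cues, tol=TOL):
        return any(abs(t - c) <= tol for c in cues)

    story_boundaries = [t for t in silence_pts
                        if near(t, shot_bounds) or near(t, anchor_shots)]
    print(story_boundaries)    # -> [12.4, 118.7]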

  15. Development of TPS flight test and operational instrumentation

    NASA Technical Reports Server (NTRS)

    Carnahan, K. R.; Hartman, G. J.; Neuner, G. J.

    1975-01-01

    Thermal and flow sensor instrumentation was developed for use as an integral part of the space shuttle orbiter reusable thermal protection system. The effort was performed in three tasks: a study to determine the optimum instruments and instrument installations for the space shuttle orbiter RSI and RCC TPS; tests and/or analysis to determine the instrument installations that minimize measurement errors; and analysis using data from the test program for comparison to analytical methods. A detailed review of existing state-of-the-art instrumentation in industry was performed to establish the baseline from which the research effort departed. From this information, detailed criteria for thermal protection system instrumentation were developed.

  16. Cooperative Strategies to Develop Effective Stroke and Heart Attack Awareness Messages in Rural American Indian Communities, 2009–2010

    PubMed Central

    Gohdes, Dorothy; Fogle, Crystelle C.; Tadios, Fawn; Doore, Velva; Bell, Doreen S.; Harwell, Todd S.; Helgerson, Steven D.

    2013-01-01

    Introduction National initiatives to improve the recognition of heart attack and stroke warning signs have encouraged symptomatic people to seek early treatment, but few have shown significant effects in rural American Indian (AI) communities. Methods During 2009 and 2010, the Montana Cardiovascular Health Program, in collaboration with 2 tribal health departments, developed and conducted culturally specific public awareness campaigns for signs and symptoms of heart attack and stroke via local media. Telephone surveys were conducted before and after each campaign to evaluate the effectiveness of the campaigns. Results Knowledge of 3 or more heart attack warning signs and symptoms increased significantly on 1 reservation from 35% at baseline to 47% postcampaign. On the second reservation, recognition of 2 or more stroke signs and symptoms increased from 62% at baseline to 75% postcampaign, and the level of awareness remained at 73% approximately 4 months after the high-intensity campaign advertisements ended. Intent to call 9-1-1 did not increase in the heart attack campaign but did improve in the stroke campaign for specific symptoms. Recall of media campaigns on both reservations increased significantly from baseline to postcampaign for both media outlets (ie, radio and newspaper). Conclusion Carefully designed, culturally specific campaigns may help eliminate disparities in the recognition of heart attack and stroke warning signs in AI communities. PMID:23680509

  17. Factors associated with developing a fear of falling in subjects with primary open-angle glaucoma.

    PubMed

    Adachi, Sayaka; Yuki, Kenya; Awano-Tanabe, Sachiko; Ono, Takeshi; Shiba, Daisuke; Murata, Hiroshi; Asaoka, Ryo; Tsubota, Kazuo

    2018-02-13

    To investigate the relationship between clinical risk factors, including visual field (VF) defects and visual acuity, and a fear of falling among patients with primary open-angle glaucoma (POAG). All participants answered the following question at a baseline ophthalmic examination: 'Are you afraid of falling?' The same question was then answered every 12 months for 3 years. A binocular integrated visual field was calculated by merging a patient's monocular Humphrey field analyzer VFs, using the 'best sensitivity' method. The means of total deviation values in the whole, superior peripheral, superior central, inferior central, and inferior peripheral VFs were calculated. The relationships of these mean VF measurements and various clinical factors with patients' baseline and future fear of falling were analyzed using multiple logistic regression. Among 392 POAG subjects, 342 patients (87.2%) responded to the fear-of-falling question at least twice in the 3-year study period. The optimal regression model for patients' baseline fear of falling included age, gender, mean of total deviation values in the inferior peripheral VF and number of previous falls, as did the optimal regression equation for future fear of falling. Defects in the inferior peripheral VF area are significantly related to the development of a fear of falling.
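
    A minimal illustration of the 'best sensitivity' merge: the binocular integrated VF takes the better (higher) sensitivity of the two eyes at each matched test location. The arrays below are invented dB values, not patient data.

    # Sketch: pointwise 'best sensitivity' merge of two monocular VFs.
    import numpy as np

    left_eye  = np.array([28.0, 25.0,  5.0, 30.0])   # matched test locations
    right_eye = np.array([27.0,  4.0, 26.0, 31.0])

    integrated = np.maximum(left_eye, right_eye)
    print(integrated)          # -> [28. 25. 26. 31.]
    print(integrated.mean())   # e.g. a regional mean fed into the regressions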

  18. Effects of the TRPV1 antagonist ABT-102 on body temperature in healthy volunteers: pharmacokinetic/pharmacodynamic analysis of three phase 1 trials

    PubMed Central

    Othman, Ahmed A; Nothaft, Wolfram; Awni, Walid M; Dutta, Sandeep

    2013-01-01

    Aim To characterize quantitatively the relationship between exposure to ABT-102, a potent and selective TRPV1 antagonist, and its effects on body temperature in humans using a population pharmacokinetic/pharmacodynamic modelling approach. Methods Serial pharmacokinetic and body temperature (oral or core) measurements from three double-blind, randomized, placebo-controlled studies [single dose (2, 6, 18, 30 and 40 mg, solution formulation), multiple dose (2, 4 and 8 mg twice daily for 7 days, solution formulation) and multiple dose (1, 2 and 4 mg twice daily for 7 days, solid dispersion formulation)] were analyzed. NONMEM was used for model development and the model building steps were guided by pre-specified diagnostic and statistical criteria. The final model was qualified using non-parametric bootstrap and visual predictive check. Results The developed body temperature model included additive components of baseline, circadian rhythm (cosine function of time) and ABT-102 effect (Emax function of plasma concentration) with tolerance development (decrease in ABT-102 Emax over time). Type of body temperature measurement (oral vs. core) was included as a fixed effect on baseline, amplitude of circadian rhythm and residual error. The model estimates (95% bootstrap confidence interval) were: baseline oral body temperature, 36.3 (36.3, 36.4)°C; baseline core body temperature, 37.0 (37.0, 37.1)°C; oral circadian amplitude, 0.25 (0.22, 0.28)°C; core circadian amplitude, 0.31 (0.28, 0.34)°C; circadian phase shift, 7.6 (7.3, 7.9) h; ABT-102 Emax, 2.2 (1.9, 2.7)°C; ABT-102 EC50, 20 (15, 28) ng ml−1; tolerance T50, 28 (20, 43) h. Conclusions At exposures predicted to exert analgesic activity in humans, the effect of ABT-102 on body temperature is estimated to be 0.6 to 0.8°C. This effect attenuates within 2 to 3 days of dosing. PMID:22966986
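
    A sketch of the reported model structure using the core-temperature estimates above: baseline plus a circadian cosine plus an Emax drug effect whose Emax declines with half-time T50. The exponential tolerance term and the cosine phase parameterization are assumptions for illustration; the paper's exact functional forms may differ.

    # Sketch: baseline + circadian + Emax(concentration) with tolerance.
    import numpy as np

    BASE, AMP, SHIFT = 37.0, 0.31, 7.6    # degC, degC, h (core estimates)
    EMAX, EC50, T50 = 2.2, 20.0, 28.0     # degC, ng/ml, h

    def body_temp(t_h, conc_ng_ml):
        circadian = AMP * np.cos(2 * np.pi * (t_h - SHIFT) / 24.0)
        emax_t = EMAX * 0.5 ** (t_h / T50)              # tolerance development
        drug = emax_t * conc_ng_ml / (EC50 + conc_ng_ml)  # Emax model
        return BASE + circadian + drug

    print(body_temp(12.0, 30.0))   # temperature 12 h into dosing at 30 ng/ml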

  19. Mindful "Vitality in Practice": an intervention to improve the work engagement and energy balance among workers; the development and design of the randomised controlled trial.

    PubMed

    van Berkel, Jantien; Proper, Karin I; Boot, Cécile R L; Bongers, Paulien M; van der Beek, Allard J

    2011-09-27

    Modern working life has become more mental and less physical in nature, contributing to impaired mental health and a disturbed energy balance. This may result in mental health problems and overweight. Both are significant threats to the health of workers and thus also a financial burden for society, including employers. Targeting work engagement and energy balance could prevent impaired mental health and overweight, respectively. The study population consists of highly educated workers in two Dutch research institutes. The intervention was systematically developed based on the Intervention Mapping (IM) protocol, involving workers and management in the process. The workers' needs were assessed by combining the results of interviews, focus group discussions and a questionnaire with the available literature. Suitable methods and strategies were selected, resulting in an intervention that includes eight weeks of customized mindfulness training, followed by eight sessions of e-coaching and supporting elements such as providing fruit and snack vegetables at the workplace, lunch walking routes, and a buddy system. The effects of the intervention will be evaluated in an RCT, with measurements at baseline, six months (T1) and 12 months (T2). In addition, the cost-effectiveness and process of the intervention will be evaluated. At baseline, the level of work engagement of the sample was "average". Of the study population, 60.1% did not engage in vigorous physical activity at all, and an average working day includes eight sedentary hours. For the Phase II RCT, there were no significant differences between the intervention and control groups at baseline, except for vigorous physical activity. The baseline characteristics of the study population were congruent with the results of the needs assessment. The IM protocol used for the systematic development of the intervention produced an appropriate intervention to test in the planned RCT. Netherlands Trial Register (NTR): NTR2199.

  20. SU-F-BRB-07: A Plan Comparison Tool to Ensure Robustness and Deliverability in Online-Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, P; Labby, Z; Bayliss, R A

    Purpose: To develop a plan comparison tool that will ensure robustness and deliverability through analysis of baseline and online-adaptive radiotherapy plans using similarity metrics. Methods: The ViewRay MRIdian treatment planning system allows export of a plan file that contains plan and delivery information. A software tool was developed to read and compare two plans, providing information and metrics to assess their similarity. In addition to performing direct comparisons (e.g. demographics, ROI volumes, number of segments, total beam-on time), the tool computes and presents histograms of derived metrics (e.g. step-and-shoot segment field sizes, segment average leaf gaps). Such metrics were investigated for their ability to predict whether an online-adapted plan is reasonably similar to a baseline plan for which deliverability has already been established. Results: In the realm of online-adaptive planning, comparing ROI volumes offers a sanity check to verify observations found during contouring. Beyond ROI analysis, it has been found that simply editing contours and re-optimizing to adapt treatment can produce a delivery that is substantially different from the baseline plan (e.g. number of segments increased by 31%), with no changes in optimization parameters and only minor changes in anatomy. Currently the tool can quickly identify large omissions or deviations from baseline expectations. As our online-adaptive patient population increases, we will continue to develop and refine quantitative acceptance criteria for adapted plans and relate them to historical delivery QA measurements. Conclusion: The plan comparison tool is in clinical use and reports a wide range of comparison metrics, illustrating key differences between two plans. This independent check is accomplished in seconds and can be performed in parallel to other tasks in the online-adaptive workflow. Current use prevents large planning or delivery errors from occurring, and ongoing refinements will lead to increased assurance of plan quality.
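
    A toy sketch of the kind of direct plan-to-plan comparison described; the dictionary fields and the 30% flagging threshold are illustrative only and do not reflect the MRIdian export format or the tool's actual acceptance criteria.

    # Sketch: flag large relative deviations between baseline and adapted plans.
    baseline_plan = {"segments": 54, "beam_on_s": 410.0}
    adapted_plan  = {"segments": 71, "beam_on_s": 455.0}

    def relative_change(a, b):
        return (b - a) / a

    for key in baseline_plan:
        change = relative_change(baseline_plan[key], adapted_plan[key])
        flag = "CHECK" if abs(change) > 0.30 else "ok"
        print(f"{key}: {change:+.0%} {flag}")   # segments: +31% CHECK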

  1. A Complete Procedure for Predicting and Improving the Performance of HAWT's

    NASA Astrophysics Data System (ADS)

    Al-Abadi, Ali; Ertunç, Özgür; Sittig, Florian; Delgado, Antonio

    2014-06-01

    A complete procedure for predicting and improving the performance of horizontal axis wind turbines (HAWTs) has been developed. The first process is predicting the power extracted by the turbine and the derived rotor torque, which should be identical to that of the drive unit. The BEM method and a developed post-stall treatment for resolving stall-regulated HAWTs are incorporated in the prediction. For that, a modified stall-regulated prediction model, which can predict HAWT performance over the operating range of oncoming wind velocity, is derived from existing models. The model involves radius and chord, which makes it more general for predicting the performance of different scales and rotor shapes of HAWTs. The second process is modifying the rotor shape by an optimization process, which can be applied to any existing HAWT, to improve its performance. A gradient-based optimization is used to adjust the chord and twist angle distributions of the rotor blade to increase the power extraction while keeping the drive torque constant, so that the same drive unit can be kept (a sketch of this constrained step follows the abstract). The final process is testing the modified turbine to predict its enhanced performance. The procedure is applied to the NREL phase-VI 10 kW turbine as a baseline. The study has proven the applicability of the developed model in predicting the performance of the baseline as well as the optimized turbine. In addition, the optimization method has shown that the power coefficient can be increased while keeping the same design rotational speed.
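
    A schematic of the constrained optimization step, using scipy's SLSQP solver. The bem_power and bem_torque functions below are simple placeholder surrogates standing in for a real BEM evaluation of the blade; only the structure (maximize power subject to an equal-torque constraint) follows the abstract.

    # Sketch: maximize power while holding rotor torque at the baseline value.
    import numpy as np
    from scipy.optimize import minimize

    def bem_power(x):              # x = [chord_scale, twist_offset_deg]
        c, t = x                   # surrogate; returns NEGATIVE power
        return -(1.0 - (c - 1.1) ** 2 - 0.02 * (t - 2.0) ** 2)

    def bem_torque(x):             # surrogate torque model
        c, t = x
        return 10.0 * c + 0.5 * t

    baseline_torque = bem_torque([1.0, 0.0])
    cons = {"type": "eq", "fun": lambda x: bem_torque(x) - baseline_torque}
    res = minimize(bem_power, x0=[1.0, 0.0], method="SLSQP", constraints=cons)
    print(res.x, -res.fun)   # optimized [chord_scale, twist_offset], power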

  2. Different hip and knee priority score systems: are they good for the same thing?

    PubMed

    Escobar, Antonio; Quintana, Jose Maria; Espallargues, Mireia; Allepuz, Alejandro; Ibañez, Berta

    2010-10-01

    The aim of the present study was to compare two priority tools used for joint replacement for patients on waiting lists, which were developed using two different methods. Two prioritization tools developed and validated by different methodologies were used on the same cohort of patients. The first, the IRYSS hip and knee priority score (IHKPS), developed by the RAND method, was applied while patients were on the waiting list. The other, the Catalonia hip-knee priority score (CHKPS), developed by conjoint analysis, was adapted and applied retrospectively. In addition, all patients completed the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pre-intervention. Correlation between the tools was studied by the Pearson correlation coefficient (r). Agreement was analysed by means of the intra-class correlation coefficient (ICC), the Kendall coefficient and Cohen's kappa. The relationships of IHKPS and CHKPS with baseline WOMAC scores were studied by the r coefficient. The sample consisted of 774 consecutive patients. The Pearson correlation coefficient between IHKPS and CHKPS was 0.79. The agreement study showed that the ICC was 0.74, the Kendall coefficient 0.86 and kappa 0.66. Finally, correlation between CHKPS and baseline WOMAC ranged from 0.43 to 0.64, and between IHKPS and WOMAC from 0.50 to 0.74. Results support the hypothesis that if the final objective of the prioritization tools is to organize and sort patients on the waiting list, then although they use different methodologies, the results are similar. © 2010 Blackwell Publishing Ltd.
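
    For illustration, a short sketch computing two of the named statistics (Pearson r and the Kendall coefficient) on simulated scores; the ICC and Cohen's kappa would need an additional package (e.g. pingouin, scikit-learn) or categorized scores, so they are omitted here. The simulated relationship is invented, not the study's data.

    # Sketch: correlation and rank agreement between two priority scores.
    import numpy as np
    from scipy.stats import pearsonr, kendalltau

    rng = np.random.default_rng(1)
    ihkps = rng.uniform(0, 100, 774)                  # simulated IHKPS
    chkps = 0.8 * ihkps + rng.normal(0, 12, 774)      # simulated CHKPS

    r, _ = pearsonr(ihkps, chkps)
    tau, _ = kendalltau(ihkps, chkps)
    print(f"Pearson r = {r:.2f}, Kendall tau = {tau:.2f}")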

  3. Urgency Urinary Incontinence in Women ≥ 50 years: Incidence, Remission and Predictors of Change

    PubMed Central

    Komesu, YM; Schrader, RM; Rogers, RG; Ketai, LH

    2011-01-01

    Objectives To estimate the 2-year incidence, remission and predictors of urgency urinary incontinence (UUI) in a community-based population of women ≥50. Methods We analyzed 2004–2006 data in the Health and Retirement Study. Subjects were women ≥ 50 with baseline and follow-up UUI information. UUI incidence and remission were calculated. Predictors of UUI progression and improvement were estimated controlling for age, ethnicity, body mass index (BMI), parity, psychiatric illness, medical co-morbidities, functional limitations and stress urinary incontinence (SUI). We evaluated whether baseline UUI status predicted follow-up status and used multivariable logistic regression to identify predictor variables. Results 8,581 women reported UUI status at baseline and follow-up. Of 7,244 women continent at baseline, 268 affirmed UUI at follow-up, for a 2-year incidence of 3.7%. Of 581 women with UUI at baseline, 150 were continent at follow-up, for a 2-year remission of 25.8%. Predictors of UUI development included increased age (7th and 10th decade compared to 6th decade; OR 1.5 and 7.2, CI 1.1–2.1 and 4.2–12.5, respectively), obesity (OR 1.6, CI 1.2–2.1), history of psychiatric illness (OR 1.6, CI 1.3–2.0), functional limitations (OR 6.2, CI 4.2–9.2) and SUI (OR 5.0, CI 3.0–8.3). Women who denied UUI at baseline were also likely to deny UUI at follow-up (OR 47.4, CI 22.9–98.1). Conclusions In this community-based population of women ≥ 50, UUI incidence was low and remission was high. Predictors of UUI included increased age, severe obesity, functional limitations, a positive psychiatric history and incontinence status at baseline. PMID:22453668

  4. An Improved Gaussian Mixture Model for Damage Propagation Monitoring of an Aircraft Wing Spar under Changing Structural Boundary Conditions.

    PubMed

    Qiu, Lei; Yuan, Shenfang; Mei, Hanfei; Fang, Fang

    2016-02-26

    Structural Health Monitoring (SHM) technology is considered to be a key technology to reduce maintenance costs while ensuring the operational safety of aircraft structures. It has gradually developed from theoretic and fundamental research to real-world engineering applications in recent decades. The problem of reliable damage monitoring under time-varying conditions is a main issue for the aerospace engineering applications of SHM technology. Among the existing SHM methods, the Guided Wave (GW) and piezoelectric sensor-based SHM technique is a promising method due to its high damage sensitivity and long monitoring range; nevertheless, the reliability problem must be addressed. Several methods, including environmental parameter compensation, baseline signal dependency reduction and data normalization, have been well studied, but limitations remain. This paper proposes a damage propagation monitoring method based on an improved Gaussian Mixture Model (GMM). It can be used on-line without any structural mechanical model or a priori knowledge of damage and time-varying conditions. With this method, a baseline GMM is constructed first, based on the GW features obtained under time-varying conditions when the structure under monitoring is in the healthy state. When a new GW feature is obtained during the on-line damage monitoring process, the GMM can be updated by an adaptive migration mechanism including dynamic learning and Gaussian components split-merge. The mixture probability distribution structure of the GMM and the number of Gaussian components can be optimized adaptively, yielding an on-line GMM. Finally, a best-match-based Kullback-Leibler (KL) divergence is studied to measure the migration degree between the baseline GMM and the on-line GMM, to reveal the weak cumulative changes of damage propagation mixed in with the time-varying influence (see the sketch after this abstract). A wing spar of an aircraft is used to validate the proposed method. The results indicate that crack propagation under changing structural boundary conditions can be monitored reliably. The method is not limited by the properties of the structure, and thus it is feasible to apply it to composite structures.
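
    A sketch of one plausible best-match KL computation between a baseline GMM and an on-line GMM: each baseline component is paired with its nearest on-line component using the closed-form KL divergence between Gaussians, and the weighted minima are summed. This follows the matching idea described, not necessarily the paper's exact formulation; the toy models are invented.

    # Sketch: best-match KL divergence between two Gaussian mixtures.
    import numpy as np

    def kl_gauss(m1, S1, m2, S2):
        # Closed-form KL( N(m1,S1) || N(m2,S2) ) for multivariate normals.
        k = len(m1)
        iS2 = np.linalg.inv(S2)
        d = m2 - m1
        return 0.5 * (np.trace(iS2 @ S1) + d @ iS2 @ d - k
                      + np.log(np.linalg.det(S2) / np.linalg.det(S1)))

    def best_match_kl(weights, means, covs, means2, covs2):
        # Weighted sum over baseline components of the minimum KL to any
        # on-line component.
        total = 0.0
        for w, m, S in zip(weights, means, covs):
            total += w * min(kl_gauss(m, S, m2, S2)
                             for m2, S2 in zip(means2, covs2))
        return total

    # Two toy 2-component, 2-D mixtures:
    w1 = [0.5, 0.5]
    mu1 = [np.zeros(2), np.ones(2)]
    C1 = [np.eye(2), np.eye(2)]
    mu2 = [np.array([0.2, 0.0]), np.array([1.3, 1.1])]
    C2 = [np.eye(2) * 1.2, np.eye(2)]
    print(best_match_kl(w1, mu1, C1, mu2, C2))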

  5. Projections of Global Mortality and Burden of Disease from 2002 to 2030

    PubMed Central

    Mathers, Colin D; Loncar, Dejan

    2006-01-01

    Background Global and regional projections of mortality and burden of disease by cause for the years 2000, 2010, and 2030 were published by Murray and Lopez in 1996 as part of the Global Burden of Disease project. These projections, which are based on 1990 data, continue to be widely quoted, although they are substantially outdated; in particular, they substantially underestimated the spread of HIV/AIDS. To address the widespread demand for information on likely future trends in global health, and thereby to support international health policy and priority setting, we have prepared new projections of mortality and burden of disease to 2030 starting from World Health Organization estimates of mortality and burden of disease for 2002. This paper describes the methods, assumptions, input data, and results. Methods and Findings Relatively simple models were used to project future health trends under three scenarios—baseline, optimistic, and pessimistic—based largely on projections of economic and social development, and using the historically observed relationships of these with cause-specific mortality rates. Data inputs have been updated to take account of the greater availability of death registration data and the latest available projections for HIV/AIDS, income, human capital, tobacco smoking, body mass index, and other inputs. In all three scenarios there is a dramatic shift in the distribution of deaths from younger to older ages and from communicable, maternal, perinatal, and nutritional causes to noncommunicable disease causes. The risk of death for children younger than 5 y is projected to fall by nearly 50% in the baseline scenario between 2002 and 2030. The proportion of deaths due to noncommunicable disease is projected to rise from 59% in 2002 to 69% in 2030. Global HIV/AIDS deaths are projected to rise from 2.8 million in 2002 to 6.5 million in 2030 under the baseline scenario, which assumes coverage with antiretroviral drugs reaches 80% by 2012. Under the optimistic scenario, which also assumes increased prevention activity, HIV/AIDS deaths are projected to drop to 3.7 million in 2030. Total tobacco-attributable deaths are projected to rise from 5.4 million in 2005 to 6.4 million in 2015 and 8.3 million in 2030 under our baseline scenario. Tobacco is projected to kill 50% more people in 2015 than HIV/AIDS, and to be responsible for 10% of all deaths globally. The three leading causes of burden of disease in 2030 are projected to include HIV/AIDS, unipolar depressive disorders, and ischaemic heart disease in the baseline and pessimistic scenarios. Road traffic accidents are the fourth leading cause in the baseline scenario, and the third leading cause ahead of ischaemic heart disease in the optimistic scenario. Under the baseline scenario, HIV/AIDS becomes the leading cause of burden of disease in middle- and low-income countries by 2015. Conclusions These projections represent a set of three visions of the future for population health, based on certain explicit assumptions. Despite the wide uncertainty ranges around future projections, they enable us to appreciate better the implications for health and health policy of currently observed trends, and the likely impact of fairly certain future trends, such as the ageing of the population, the continued spread of HIV/AIDS in many regions, and the continuation of the epidemiological transition in developing countries. The results depend strongly on the assumption that future mortality trends in poor countries will have a relationship to economic and social development similar to those that have occurred in the higher-income countries. PMID:17132052

  6. Baseline glucose level is an individual trait that is negatively associated with lifespan and increases due to adverse environmental conditions during development and adulthood.

    PubMed

    Montoya, Bibiana; Briga, Michael; Jimeno, Blanca; Moonen, Sander; Verhulst, Simon

    2018-05-01

    High baseline glucose levels are associated with pathologies and shorter lifespan in humans, but little is known about the causes and consequences of individual variation in glucose levels in other species. We tested to what extent baseline blood glucose level is a repeatable trait in adult zebra finches, and whether glucose levels were associated with age, manipulated environmental conditions during development (rearing brood size) and adulthood (foraging cost), and lifespan. We found that: (1) repeatability of glucose levels was 30%, both within and between years. (2) Having been reared in a large brood and living with higher foraging costs as an adult were independently associated with higher glucose levels. Furthermore, the finding that baseline glucose was low when ambient temperature was high and foraging costs were low indicates that glucose is regulated at a lower level when energy turnover is low. (3) Survival probability decreased with increasing baseline glucose. We conclude that baseline glucose is an individual trait negatively associated with survival, and that it increases due to adverse environmental conditions during development (rearing brood size) and adulthood (foraging cost). Blood glucose may therefore be part of the physiological processes linking environmental conditions to lifespan.

  7. Assessing covariate balance when using the generalized propensity score with quantitative or continuous exposures.

    PubMed

    Austin, Peter C

    2018-01-01

    Propensity score methods are increasingly being used to estimate the effects of treatments and exposures when using observational data. The propensity score was initially developed for use with binary exposures (e.g., active treatment vs. control). The generalized propensity score is an extension of the propensity score for use with quantitative exposures (e.g., dose or quantity of medication, income, years of education). A crucial component of any propensity score analysis is that of balance assessment. This entails assessing the degree to which conditioning on the propensity score (via matching, weighting, or stratification) has balanced measured baseline covariates between exposure groups. Methods for balance assessment have been well described and are frequently implemented when using the propensity score with binary exposures. However, there is a paucity of information on how to assess baseline covariate balance when using the generalized propensity score. We describe how methods based on the standardized difference can be adapted for use with quantitative exposures when using the generalized propensity score. We also describe a method based on assessing the correlation between the quantitative exposure and each covariate in the sample when weighted using generalized propensity score-based weights. We conducted a series of Monte Carlo simulations to evaluate the performance of these methods. We also compared two different methods of estimating the generalized propensity score: ordinary least squares regression and the covariate balancing propensity score method. We illustrate the application of these methods using data on patients hospitalized with a heart attack, with the quantitative exposure being creatinine level.
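
    A sketch of the weighted-correlation balance diagnostic on simulated data: the GPS is fit by least squares, stabilized weights are formed from normal densities, and each covariate's correlation with the exposure is compared before and after weighting. All data and the linear GPS model are illustrative assumptions.

    # Sketch: GPS-weighted correlation as a balance check for a quantitative exposure.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    n = 2000
    x = rng.normal(size=(n, 3))                       # baseline covariates
    dose = x @ np.array([0.5, -0.3, 0.2]) + rng.normal(size=n)

    # GPS model: linear regression of exposure on covariates.
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, dose, rcond=None)
    resid = dose - X @ beta
    gps = norm.pdf(dose, loc=X @ beta, scale=resid.std())
    w = norm.pdf(dose, loc=dose.mean(), scale=dose.std()) / gps   # stabilized

    def weighted_corr(a, b, w):
        am = np.average(a, weights=w); bm = np.average(b, weights=w)
        cov = np.average((a - am) * (b - bm), weights=w)
        return cov / np.sqrt(np.average((a - am) ** 2, weights=w)
                             * np.average((b - bm) ** 2, weights=w))

    for j in range(3):   # near-zero weighted correlations indicate balance
        print(f"covariate {j}: unweighted r = "
              f"{np.corrcoef(dose, x[:, j])[0, 1]:+.3f}, "
              f"weighted r = {weighted_corr(dose, x[:, j], w):+.3f}")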

  8. The Effect of Psychosocial Factors on Acute and Persistent Pain Following Childbirth

    DTIC Science & Technology

    2015-10-14

    Baseline measures of psychosocial variables were obtained during the last 8 weeks of pregnancy in a longitudinal study. Delivery and acute pain data were collected from the electronic medical record. Follow-up data were ... low income by the US Census Bureau's definition. Routine assessment of depression during pregnancy may identify those at risk of developing ...

  9. Multilingual Vocabularies in Automatic Speech Recognition

    DTIC Science & Technology

    2000-08-01

    ... the number of units in a monolingual inventory (a few thousand) is an obstacle to a full generalization of the inventories ... In this paper, we extend the method presented in [3] from the monolingual to the multilingual case, in the direction of language independence. In the monolingual experiments, we developed two types of unit sets ... 3.2.1 Monolingual experiments ... The baseline model for English ... the sound ji is not assimilated to the corresponding sound in Spanish, but it is left apart as a ...

  10. Estimating mean QALYs in trial-based cost-effectiveness analysis: the importance of controlling for baseline utility.

    PubMed

    Manca, Andrea; Hawkins, Neil; Sculpher, Mark J

    2005-05-01

    In trial-based cost-effectiveness analysis, baseline mean utility values are invariably imbalanced between treatment arms. A patient's baseline utility is likely to be highly correlated with their quality-adjusted life-years (QALYs) over the follow-up period, not least because it typically contributes to the QALY calculation. Therefore, imbalance in baseline utility needs to be accounted for in the estimation of mean differential QALYs, and failure to control for this imbalance can result in a misleading incremental cost-effectiveness ratio. This paper discusses the approaches that have been used in the cost-effectiveness literature to estimate absolute and differential mean QALYs alongside randomised trials, and illustrates the implications of baseline mean utility imbalance for QALY calculation. Using data from a recently conducted trial-based cost-effectiveness study and a micro-simulation exercise, the relative performance of alternative estimators is compared, showing that widely used methods to calculate differential QALYs provide incorrect results in the presence of baseline mean utility imbalance, regardless of whether these differences are formally statistically significant. It is demonstrated that multiple regression methods can be usefully applied to generate appropriate estimates of differential mean QALYs and an associated measure of sampling variability, while controlling for differences in baseline mean utility between treatment arms in the trial. Copyright 2004 John Wiley & Sons, Ltd.
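
    A minimal sketch of the regression estimator recommended above, on simulated data with deliberate baseline-utility imbalance (all variable names illustrative): the naive between-arm comparison is biased by the imbalance, while the baseline-adjusted regression recovers the treatment effect.

    # Sketch: differential mean QALYs, naive vs. baseline-utility adjusted.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 400
    treat = rng.integers(0, 2, n)
    base_util = rng.normal(0.70, 0.10, n) + 0.02 * treat  # baseline imbalance
    qaly = 0.8 * base_util + 0.05 * treat + rng.normal(0, 0.05, n)
    df = pd.DataFrame({"qaly": qaly, "treat": treat, "base_util": base_util})

    naive = smf.ols("qaly ~ treat", data=df).fit()
    adjusted = smf.ols("qaly ~ treat + base_util", data=df).fit()
    print(naive.params["treat"], adjusted.params["treat"])  # biased vs adjusted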

  11. Auto-antibodies and Autoimmune Disease during Treatment of Children with Chronic Hepatitis C

    PubMed Central

    Molleston, Jean P.; Mellman, William; Narkewicz, Michael R.; Balistreri, William F.; Gonzalez-Peralta, Regino P.; Jonas, Maureen M.; Lobritto, Steven J.; Mohan, Parvathi; Murray, Karen F.; Njoku, Dolores; Rosenthal, Philip; Barton, Bruce A.; Talor, Monica V.; Cheng, Irene; Schwarz, Kathleen B.; Haber, Barbara A.

    2012-01-01

    Objectives Auto-antibodies were studied in a well-characterized cohort of children with chronic hepatitis C (CHC) during treatment with PEG-IFN and ribavirin to assess their relationship to treatment and the development of autoimmune disease. Methods 114 children (5–17 years), previously screened for the presence of high-titer autoantibodies, were randomized to Peg-IFN with or without ribavirin. Anti-nuclear (ANA), anti-liver-kidney-microsomal (LKM), anti-thyroglobulin (TG), anti-thyroid peroxidase (TPO), insulin (IA2) and anti-glutamic acid decarboxylase (GAD) antibodies were measured after trial completion using frozen sera. Results At baseline, 19% had auto-antibodies: ANA (8%), LKM (4%), and GAD (4%). At 24 and 72 weeks (24 weeks after treatment completion), 23% and 26% had auto-antibodies (p=0.50, 0.48 compared to baseline). One child developed diabetes and two hypothyroidism during treatment; none developed autoimmune hepatitis. At 24 weeks, the incidences of flu-like symptoms, gastrointestinal symptoms, and headaches were 42%, 8% and 19% in those with auto-antibodies vs. 52%, 17%, and 26% in those without (p=0.18, 0.36, and 0.20, respectively). In children with negative HCV PCR at 24 weeks, there was no difference in the rates of early virologic response/sustained virologic response in those with auto-antibodies (76%/69%) vs. those without (58%/65%) (p=0.48). Conclusions Despite screening, we found autoantibodies commonly at baseline, during treatment for CHC, and after. The presence of antibodies did not correlate with viral response, side effects, or autoimmune hepatitis. Neither screening nor archived samples assayed for thyroid and diabetes-related antibodies identified the 3 subjects who developed overt autoimmune disease: diabetes (1) and hypothyroidism (2). PMID:23439301

  12. Gender differences among treatment-seeking adults with cannabis use disorder: Clinical profiles of women and men enrolled in the Achieving Cannabis Cessation – Evaluating N-acetylcysteine Treatment (ACCENT) study

    PubMed Central

    Sherman, Brian J.; McRae-Clark, Aimee L.; Baker, Nathaniel L.; Sonne, Susan C.; Killeen, Therese K.; Cloud, Kasie; Gray, Kevin M.

    2017-01-01

    Background and Objectives Recent evidence suggests that women may fare worse than men in cannabis trials with pharmacologic interventions. Identifying baseline clinical profiles of treatment-seeking cannabis-dependent adults could inform gender-specific treatment planning and development. Methods The current study compared baseline demographic, cannabis use, and psychiatric factors between women (n = 86) and men (n = 216) entering the Achieving Cannabis Cessation – Evaluating N-acetylcysteine Treatment (ACCENT) study, a multi-site, randomized controlled trial conducted within the National Drug Abuse Treatment Clinical Trials Network. Results Women reported greater withdrawal intensity (p = 0.001) and negative impact of withdrawal (p = 0.001), predominantly due to physiological and mood symptoms. Women were more likely to have lifetime panic disorder (p = 0.038) and current agoraphobia (p = 0.022), and reported more days of poor physical health (p = 0.006) and cannabis-related medical problems (p = 0.023). Women reporting chronic pain had greater mean pain scores than men with chronic pain (p = 0.006). Men and women did not differ on any measures of baseline cannabis use. Discussion and Conclusion Cannabis-dependent women may present for treatment with more severe and impairing withdrawal symptoms and psychiatric conditions compared to cannabis-dependent men. This might help explain recent evidence suggesting that women fare worse than men in cannabis treatment trials of pharmacologic interventions. Baseline clinical profiles of treatment-seeking adults can inform gender-specific treatment planning and development. Scientific Significance Cannabis-dependent women may benefit from integrated treatment focusing on co-occurring psychiatric disorders and targeted treatment of cannabis withdrawal syndrome. PMID:28152236

  13. Pre-radiographic MRI findings are associated with onset of knee symptoms: the most study

    PubMed Central

    Javaid, M. K.; Lynch, J. A.; Tolstykh, I.; Guermazi, A.; Roemer, F.; Aliabadi, P.; McCulloch, C.; Curtis, J.; Felson, D.; Lane, N. E.; Torner, J.; Nevitt, M.

    2010-01-01

    Summary Objective Magnetic resonance imaging (MRI) has greater sensitivity to detect osteoarthritis (OA) damage than radiographs but it is uncertain which MRI findings in early OA are clinically important. We examined MRI abnormalities detected in knees without radiographic OA and their association with incident knee symptoms. Method Participants from the Multicenter Osteoarthritis Study (MOST) without frequent knee symptoms (FKS) at baseline were eligible if they also lacked radiographic features of OA at baseline. At 15 months, knees that developed FKS were defined as cases while control knees were drawn from those that remained without FKS. Baseline MRIs were scored at each subregion for cartilage lesions (CARTs); osteophytes (OST); bone marrow lesions (BML) and cysts. We compared cases and controls using marginal logistic regression models, adjusting for age, gender, race, body mass index (BMI), previous injury and clinic site. Results 36 case knees and 128 control knees were analyzed. MRI damage was common in both cases and controls. The presence of a severe CART (P = 0.03), BML (P = 0.02) or OST (P = 0.02) in the whole knee joint was more common in cases while subchondral cysts did not differ significantly between cases and controls (P > 0.1). Case status at 15 months was predicted by baseline damage at only two locations; a BML in the lateral patella (P = 0.047) and at the tibial subspinous subregions (P = 0.01). Conclusion In knees without significant symptoms or radiographic features of OA, MRI lesions of OA in only a few specific locations preceded onset of clinical symptoms and suggest that changes in bone play a role in the early development of knee pain. Confirmation of these findings in other prospective studies of knee OA is warranted. PMID:19919856

  14. LONGITUDINAL PATTERNS OF CARDIORESPIRATORY FITNESS PREDICT THE DEVELOPMENT OF HYPERTENSION AMONG MEN AND WOMEN

    PubMed Central

    Sui, Xuemei; Sarzynski, Mark A.; Lee, Duck-chul; Lavie, Carl J.; Zhang, Jiajia; Kokkinos, Peter F.; Payne, Jonathan; Blair, Steven N.

    2016-01-01

    Background Most of the existing literature has linked either a baseline cardiorespiratory fitness, or change between baseline and one follow-up measurement of cardiorespiratory fitness, to hypertension. The purpose of the study is to assess the association between longitudinal patterns of cardiorespiratory fitness changes with time and incident hypertension in adult men and women. Patients and Methods Participants were aged from 20 to 82 years, free of hypertension during the first three examinations, and received at least four preventive medical examinations at the Cooper Clinic in Dallas, TX, during 1971 – 2006. They were classified into one of five groups based on all of the measured cardiorespiratory fitness values (in metabolic equivalents) during maximal treadmill tests. Logistic regression was used to compute odds ratios and 95% confidence intervals. Results Among a total of 4,932 participants (13% women), 1,954 developed hypertension. After controlling for baseline potential confounders, follow-up duration, and number of follow-up visits, odds ratios (95% confidence intervals) for hypertension were: 1.00 for decreasing group (referent), 0.64 (0.52–0.80) for increasing, 0.89 (0.70–1.12) for Bell-shape, 0.78 (0.62–0.98) for U-shape, and 0.83 (0.69–1.00) for inconsistent group. The general pattern of the association was consistent regardless of participants’ baseline cardiorespiratory fitness or body mass index levels. Conclusion An increasing pattern of cardiorespiratory fitness provides the lowest risk of hypertension in this middle-aged relatively healthy population. Identifying specific pattern(s) of cardiorespiratory fitness change may be important for determining associations with comorbidity such as hypertension. PMID:27986522

  15. [Various methods of overcoming secondary resistance to treatment developing in relation to adaptation to psychotropic drugs during long-term treatment (clinico-experimental study)].

    PubMed

    Avrutskiĭ, G Ia; Allikmets, L Kh; Neduva, A A; Zharkovskiĭ, A M; Beliakov, A V

    1984-01-01

    Clinical and experimental studies of the phenomenon of adaptation to neuroleptic agents, and of methods for overcoming it, were carried out. An experimental study of long-term administration of haloperidol revealed the formation of adaptation to the drug, which can be overcome by a zigzag-like sharp elevation of the dosage followed by a rapid reduction to the baseline level. The trial of this method under clinical conditions showed that it was expedient to make broad use of the experimental findings on the formation and prevention of secondary therapeutic resistance.

  16. Simulation-Based Valuation of Transactive Energy Systems

    DOE PAGES

    Huang, Qiuhua; McDermott, Tom; Tang, Yingying; ...

    2018-05-18

    Transactive Energy (TE) has been recognized as a promising technique for integrating responsive loads and distributed energy resources as well as advancing grid modernization. To help the industry better understand the value of TE and compare different TE schemes in a systematic and transparent manner, a comprehensive simulation-based TE valuation method is developed. The method has the following salient features: 1) it formally defines the valuation scenarios, use cases, baseline and valuation metrics; 2) an open-source simulation platform for transactive energy systems has been developed by integrating transmission, distribution and building simulators, and plugin TE and non-TE agents, through the Framework for Network Co-Simulation (FNCS); 3) transparency and flexibility of the valuation are enhanced through separation of simulation and valuation, and of base valuation metrics and final valuation metrics. Finally, a valuation example based on the Smart Grid Interoperability Panel (SGIP) Use Case 1 is provided to demonstrate the developed TE simulation program and the valuation method.

  17. Simulation-Based Valuation of Transactive Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; McDermott, Tom; Tang, Yingying

    Transactive Energy (TE) has been recognized as a promising technique for integrating responsive loads and distributed energy resources as well as advancing grid modernization. To help the industry better understand the value of TE and compare different TE schemes in a systematic and transparent manner, a comprehensive simulation-based TE valuation method is developed. The method has the following salient features: 1) it formally defines the valuation scenarios, use cases, baseline and valuation metrics; 2) an open-source simulation platform for transactive energy systems has been developed by integrating transmission, distribution and building simulators, and plugin TE and non-TE agents, through the Framework for Network Co-Simulation (FNCS); 3) transparency and flexibility of the valuation are enhanced through separation of simulation and valuation, and of base valuation metrics and final valuation metrics. Finally, a valuation example based on the Smart Grid Interoperability Panel (SGIP) Use Case 1 is provided to demonstrate the developed TE simulation program and the valuation method.

  18. Meat intake and meat preparation in relation to risk of postmenopausal breast cancer in the NIH-AARP diet and health study.

    PubMed

    Kabat, Geoffrey C; Cross, Amanda J; Park, Yikyung; Schatzkin, Arthur; Hollenbeck, Albert R; Rohan, Thomas E; Sinha, Rashmi

    2009-05-15

    A number of studies have reported that intake of red meat or meat cooked at high temperatures is associated with increased risk of breast cancer, but other studies have shown no association. We assessed the association between meat, meat-cooking methods, and meat-mutagen intake and postmenopausal breast cancer in the NIH-AARP Diet and Health Study cohort of 120,755 postmenopausal women who completed a food frequency questionnaire at baseline (1995-1996) as well as a detailed meat-cooking module within 6 months following baseline. During 8 years of follow-up, 3,818 cases of invasive breast cancer were identified in this cohort. Cox proportional hazards models were used to estimate hazard ratios (HR) and 95% confidence intervals (95% CI). After adjusting for covariates, intake of total meat, red meat, meat cooked at high temperatures, and meat mutagens showed no association with breast cancer risk. This large prospective study with detailed information on meat preparation methods provides no support for a role of meat mutagens in the development of postmenopausal breast cancer. (c) 2008 Wiley-Liss, Inc.

  19. Training Compliance Control Yields Improvements in Drawing as a Function of Beery Scores

    PubMed Central

    Snapp-Childs, Winona; Flatters, Ian; Fath, Aaron; Mon-Williams, Mark; Bingham, Geoffrey P.

    2014-01-01

    Many children have difficulty producing movements well enough to improve in sensori-motor learning. Previously, we developed a training method that supports active movement generation to allow improvement at a 3D tracing task requiring good compliance control. Here, we tested 7–8 year old children from several 2nd grade classrooms to determine whether 3D tracing performance could be predicted using the Beery VMI. We also examined whether 3D tracing training led to improvements in drawing. Baseline testing included the Beery, a drawing task on a tablet computer, and 3D tracing. We found that baseline performance in 3D tracing and drawing co-varied with the visual perception (VP) component of the Beery. Differences in 3D tracing between children scoring low versus high on the Beery VP replicated differences previously found between children with and without motor impairments, as did post-training performance that eliminated these differences. Drawing improved as a result of training in the 3D tracing task. The training method improved drawing and reduced differences predicted by Beery scores. PMID:24651280

  20. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    NASA Astrophysics Data System (ADS)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    Electrocardiogram (ECG) signal is a nonlinear and non-stationary weak signal which reflects whether the heart is functioning normally or abnormally. ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from ECG signal becomes a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. This review describes the recent developments of ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. The EMD technique is a promising but not perfect method for processing nonlinear and non-stationary signals such as ECG. Combining EMD with other algorithms is a good solution to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, the future work and challenges in ECG signal denoising based on the EMD technique are clarified.
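
    As an illustration of EMD-based baseline-wander correction, a sketch assuming the PyEMD package (pip install EMD-signal) and a synthetic stand-in for an ECG trace. Treating the slowest IMFs (the last rows, which include the residual trend) as the wandering baseline is a common heuristic, and the number dropped here (two) is arbitrary.

    # Sketch: remove baseline wander by discarding the slowest IMFs.
    import numpy as np
    from PyEMD import EMD

    fs = 360.0
    t = np.arange(0, 10, 1 / fs)
    ecg_like = np.sin(2 * np.pi * 1.2 * t)            # stand-in for an ECG
    wander = 0.5 * np.sin(2 * np.pi * 0.15 * t)       # slow baseline wander
    signal = ecg_like + wander

    imfs = EMD().emd(signal)           # IMFs ordered high to low frequency
    baseline = imfs[-2:].sum(axis=0)   # slowest IMFs + residual trend
    corrected = signal - baseline
    print(imfs.shape, np.abs(corrected - ecg_like).mean())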

  1. Association of Baseline Anterior Segment Parameters With the Development of Incident Gonioscopic Angle Closure.

    PubMed

    Nongpiur, Monisha E; Aboobakar, Inas F; Baskaran, Mani; Narayanaswamy, Arun; Sakata, Lisandro M; Wu, Renyi; Atalay, Eray; Friedman, David S; Aung, Tin

    2017-03-01

    Baseline anterior segment imaging parameters associated with incident gonioscopic angle closure, to our knowledge, are unknown. To identify baseline quantitative anterior segment optical coherence tomography parameters associated with the development of incident gonioscopic angle closure after 4 years among participants with gonioscopically open angles at baseline. Three hundred forty-two participants aged 50 years or older were recruited to participate in this prospective, community-based observational study. Participants underwent gonioscopy and anterior segment optical coherence tomography imaging at baseline and after 4 years. Custom image analysis software was used to quantify anterior chamber parameters from anterior segment optical coherence tomography images. Baseline anterior segment optical coherence tomography measurements were compared among participants with gonioscopically open vs closed angles at follow-up. Of the 342 participants, 187 (55%) were women and 297 (87%) were Chinese. The response rate was 62.4%. Forty-nine participants (14.3%) developed gonioscopic angle closure after 4 years. The mean (SD) age at baseline of the 49 participants was 62.9 (8.0) years, 15 (30.6%) were men, and 43 (87.8%) were Chinese. These participants had a smaller baseline angle opening distance at 750 µm (AOD750) (0.15 mm; 95% CI, 0.12-0.18), trabecular iris surface area at 750 µm (0.07 mm2; 95% CI, 0.05-0.08), anterior chamber area (3.0 mm2; 95% CI, 2.27-3.74), and anterior chamber volume (24.32 mm3; 95% CI, 18.20-30.44) (all P < .001). Baseline iris curvature (-0.08; 95% CI, -0.12 to -0.04) and lens vault (LV) measurements (-0.29 mm; 95% CI, -0.37 to -0.21) were larger among these participants (all P < .001). A model consisting of the LV and AOD750 measurements explained 38% of the variance in gonioscopic angle closure occurring at 4 years, with LV accounting for 28% of this variance. For every 0.1 mm increase in LV and 0.1 mm decrease in AOD750, the odds of developing gonioscopic angle closure were 1.29 (95% CI, 1.07-1.57) and 3.27 (95% CI, 1.87-5.69), respectively. In terms of per-SD change in LV and AOD750, this translates to odds ratios of 2.14 (95% CI, 1.22-3.77) and 5.53 (95% CI, 2.48-12.34), respectively. A baseline LV cut-off value of >0.56 mm had 64.6% sensitivity and 84.0% specificity for identifying participants who developed angle closure. These findings suggest that smaller AOD750 and larger LV measurements are associated with the development of incident gonioscopic angle closure after 4 years among participants with gonioscopically open angles at baseline.

  2. An absolute calibration method of an ethyl alcohol biosensor based on wavelength-modulated differential photothermal radiometry

    NASA Astrophysics Data System (ADS)

    Liu, Yi Jun; Mandelis, Andreas; Guo, Xinxin

    2015-11-01

    In this work, laser-based wavelength-modulated differential photothermal radiometry (WM-DPTR) is applied to develop a non-invasive in-vehicle alcohol biosensor. WM-DPTR features unprecedented ethanol-specificity and sensitivity by suppressing baseline variations through a differential measurement near the peak and baseline of the mid-infrared ethanol absorption spectrum. Biosensor signal calibration curves are obtained from WM-DPTR theory and from measurements in human blood serum and ethanol solutions diffused from skin. The results demonstrate that the WM-DPTR-based calibrated alcohol biosensor can achieve high precision and accuracy for the ethanol concentration range of 0-100 mg/dl. The high-performance alcohol biosensor can be incorporated into ignition interlocks that could be fitted as a universal accessory in vehicles in an effort to reduce incidents of drinking and driving.

  3. An absolute calibration method of an ethyl alcohol biosensor based on wavelength-modulated differential photothermal radiometry.

    PubMed

    Liu, Yi Jun; Mandelis, Andreas; Guo, Xinxin

    2015-11-01

    In this work, laser-based wavelength-modulated differential photothermal radiometry (WM-DPTR) is applied to develop a non-invasive in-vehicle alcohol biosensor. WM-DPTR features unprecedented ethanol-specificity and sensitivity by suppressing baseline variations through a differential measurement near the peak and baseline of the mid-infrared ethanol absorption spectrum. Biosensor signal calibration curves are obtained from WM-DPTR theory and from measurements in human blood serum and ethanol solutions diffused from skin. The results demonstrate that the WM-DPTR-based calibrated alcohol biosensor can achieve high precision and accuracy for the ethanol concentration range of 0-100 mg/dl. The high-performance alcohol biosensor can be incorporated into ignition interlocks that could be fitted as a universal accessory in vehicles in an effort to reduce incidents of drinking and driving.

  4. Plans for phase coherent long baseline interferometry for geophysical applications using the Anik-B communications satellite

    NASA Technical Reports Server (NTRS)

    Cannon, W. H.; Petrachenko, W. T.; Yen, J. L.; Galt, J. A.; Waltman, W. B.; Knoweles, S. H.; Popelar, J.

    1980-01-01

    A pilot project to establish an operational phase stable very long baseline interferometer (VLBI) for geophysical studies is described. Methods for implementation as well as practical applications are presented.

  5. Numerical simulation and optimal design of Segmented Planar Imaging Detector for Electro-Optical Reconnaissance

    NASA Astrophysics Data System (ADS)

    Chu, Qiuhui; Shen, Yijie; Yuan, Meng; Gong, Mali

    2017-12-01

    Segmented Planar Imaging Detector for Electro-Optical Reconnaissance (SPIDER) is a cutting-edge electro-optical imaging technology to realize miniaturization and complanation of imaging systems. In this paper, the principle of SPIDER has been numerically demonstrated based on the partially coherent light theory, and a novel concept of adjustable baseline pairing SPIDER system has further been proposed. Based on the results of simulation, it is verified that the imaging quality could be effectively improved by adjusting the Nyquist sampling density, optimizing the baseline pairing method and increasing the spectral channel of demultiplexer. Therefore, an adjustable baseline pairing algorithm is established for further enhancing the image quality, and the optimal design procedure in SPIDER for arbitrary targets is also summarized. The SPIDER system with adjustable baseline pairing method can broaden its application and reduce cost under the same imaging quality.

  6. Localization of an Underwater Control Network Based on Quasi-Stable Adjustment.

    PubMed

    Zhao, Jianhu; Chen, Xinhua; Zhang, Hongmei; Feng, Jie

    2018-03-23

    A common problem in the localization of underwater control networks is that the precision of the absolute coordinates of known points obtained by marine absolute measurement is poor, which seriously affects the precision of the whole network in traditional constraint adjustment. Therefore, considering that the precision of underwater baselines is good, we use them to carry out a quasi-stable adjustment that amends the known points before constraint adjustment, so that the points better fit the network shape. In addition, we add an unconstrained adjustment for quality control of the underwater baselines, which are the observations of the quasi-stable and constrained adjustments, to eliminate unqualified baselines and improve the accuracy of the two adjustments. Finally, the modified method is applied to a practical LBL (Long Baseline) experiment and obtains a mean point-location precision of 0.08 m, an improvement of 38% over the traditional method.

  7. Localization of an Underwater Control Network Based on Quasi-Stable Adjustment

    PubMed Central

    Chen, Xinhua; Zhang, Hongmei; Feng, Jie

    2018-01-01

    A common problem in the localization of underwater control networks is that the precision of the absolute coordinates of known points obtained by marine absolute measurement is poor, which seriously affects the precision of the whole network in traditional constraint adjustment. Therefore, considering that the precision of underwater baselines is good, we use them to carry out a quasi-stable adjustment that amends the known points before constraint adjustment, so that the points better fit the network shape. In addition, we add an unconstrained adjustment for quality control of the underwater baselines, which are the observations of the quasi-stable and constrained adjustments, to eliminate unqualified baselines and improve the accuracy of the two adjustments. Finally, the modified method is applied to a practical LBL (Long Baseline) experiment and obtains a mean point-location precision of 0.08 m, an improvement of 38% over the traditional method. PMID:29570627

  8. Developing Teaching Expertise in Dental Education

    ERIC Educational Resources Information Center

    Lyon, Lucinda J.

    2009-01-01

    This exploratory study was designed to develop a baseline model of expertise in dental education utilizing the Dreyfus and Dreyfus continuum of skill acquisition. The goal was the development of a baseline model of expertise, which will contribute to the body of knowledge about dental faculty skill acquisition and may enable dental schools to…

  9. [Effectiveness of integrated early childhood development intervention on nurturing care for children aged 0-35 months in rural China].

    PubMed

    Shi, H F; Zhang, J X; Wang, X L; Xu, Y Y; Dong, S L; Zhao, C X; Huang, X N; Zhao, Q; Chen, X F; Zhou, Y; O'Sullivan, Margo; Pouwels, Ron; Scherpbier, Robert W

    2018-02-02

    Objective: To explore whether the Integrated Early Childhood Development (IECD) program has effectively improved nurturing care for children aged 0-35 months in rural China. Methods: IECD has been implemented by the government of China with support from the United Nations Children's Fund (UNICEF) in four poverty-stricken rural counties since 2014. The interventions targeting the five key components of nurturing care (i.e. child and caregiver health, child nutrition, early learning support, child protection and social security) were delivered through the IECD program to children aged 0 to 35 months and their caregivers. A population-based intervention trial was designed to evaluate intervention effectiveness with data collected in 2013 (baseline) and 2016 (mid-term). The changes in nurturing care in the intervention and control groups were analyzed using a difference-in-differences (DID) model (sketched after this abstract); this approach provided adjustment for sociodemographic and other confounding factors. Results: The baseline and mid-term surveys enrolled 1 468 and 1 384 children in the intervention group, and 1 485 and 1 361 in the control group. After two years of implementation, the prevalence of caregiver depression in the intervention group showed a decrease of 9.1% (mid-term 34.8% (479/1 377) vs. baseline 43.9% (621/1 414)), whereas that in the control group showed a decrease of 1.6% (mid-term 34.3% (464/1 353) vs. baseline 35.9% (509/1 419)); with confounding adjusted in the difference-in-differences model, the decrease in caregiver depression prevalence in the intervention group was 7.0% greater than that in the control group (P = 0.008). The qualified rate of minimum meal frequency in the intervention group showed an increase of 10.4% (mid-term 69.0% (532/771) vs. baseline 58.6% (481/821)), whereas that in the control group showed an increase of 2.9% (mid-term 66.4% (469/706) vs. baseline 63.5% (508/800)); with confounding adjusted, the increase in the intervention group was 8.2% greater than that in the control group (P = 0.021). The proportion of violent discipline by caregivers in the intervention group showed a decrease of 6.2% (mid-term 49.1% (478/973) vs. baseline 55.3% (554/1 001)), whereas that in the control group showed an increase of 4.5% (mid-term 58.4% (560/959) vs. baseline 53.9% (558/1 036)); with confounding adjusted, the difference between the two groups was 11.0% (P = 0.001). The proportion of families with three or more children's books in the intervention group showed an increase of 12.7% (mid-term 42.7% (588/1 378) vs. baseline 30.0% (432/1 440)), whereas that in the control group showed an increase of 4.2% (mid-term 25.7% (349/1 357) vs. baseline 21.5% (298/1 388)); with confounding adjusted, the difference between the two groups was 6.1% (P = 0.007). Conclusions: The IECD intervention strategy implemented in rural China effectively improved the mental health of caregivers and optimized families' child feeding and early stimulation behaviors, while reducing violent discipline and other risk factors. IECD provides better nurturing care for the early development of children aged 0-35 months in rural China.
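
    The unadjusted difference-in-differences estimate behind these comparisons, written out in LaTeX for the minimum-meal-frequency figures above (the model-adjusted estimate reported in the text, 8.2%, additionally controls for confounders):

    \[
    \hat{\delta}_{\mathrm{DID}}
      = (\bar{Y}^{\mathrm{mid}}_{\mathrm{int}} - \bar{Y}^{\mathrm{base}}_{\mathrm{int}})
      - (\bar{Y}^{\mathrm{mid}}_{\mathrm{ctrl}} - \bar{Y}^{\mathrm{base}}_{\mathrm{ctrl}})
      = (69.0\% - 58.6\%) - (66.4\% - 63.5\%) = 7.5\ \text{percentage points.}
    \]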

  10. Quantitative PET Imaging in Drug Development: Estimation of Target Occupancy.

    PubMed

    Naganawa, Mika; Gallezot, Jean-Dominique; Rossano, Samantha; Carson, Richard E

    2017-12-11

    Positron emission tomography (PET), an imaging tool using radiolabeled tracers in humans and preclinical species, has been widely used in recent years in drug development, particularly in the central nervous system. One important goal of PET in drug development is assessing the occupancy of various molecular targets (e.g., receptors, transporters, enzymes) by exogenous drugs. The linear mathematical approaches currently used to determine occupancy from PET imaging experiments are presented. These algorithms use results from multiple regions with different target content in two scans: a baseline (pre-drug) scan and a post-drug scan. New mathematical approaches to estimating target occupancy, based on maximum likelihood, are presented. A major challenge in these methods is the proper definition of the covariance matrix of the regional binding measures, which must account for the different variances of the individual regional measures and their nonzero covariances, factors that have been ignored by conventional methods. The novel methods are compared to standard methods using simulation and real human occupancy data. The simulation data showed the expected reduction in variance and bias with the proper maximum likelihood methods when the assumptions of the estimation method matched those of the simulation. Between-method differences for data from human occupancy studies were less obvious, in part due to small dataset sizes. These maximum likelihood methods form the basis for the development of improved PET covariance models, in order to minimize bias and variance in PET occupancy studies.
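
    For orientation, one widely used linear approach of the kind this record refers to is the occupancy (Lassen) plot, in which the regional decrease in binding is regressed on the baseline measure. The sketch below uses invented regional values; it is a minimal illustration of the conventional step that the paper's maximum likelihood methods refine, not the authors' code.

        # A minimal sketch of the conventional linear occupancy estimate via the
        # Lassen (occupancy) plot: regress the regional decrease in total volume of
        # distribution (VT) on baseline VT; the slope estimates occupancy and the
        # x-intercept the nondisplaceable volume (VND). Values below are made up.
        import numpy as np

        vt_baseline = np.array([3.1, 4.5, 2.2, 5.8, 3.9, 6.4])   # hypothetical regional VT, pre-drug
        vt_postdrug = np.array([2.0, 2.6, 1.6, 3.2, 2.4, 3.5])   # hypothetical regional VT, post-drug

        delta = vt_baseline - vt_postdrug
        # Ordinary least squares line: delta = occ * vt_baseline - occ * VND.
        # This is the step the paper refines with proper covariance weighting.
        occ, intercept = np.polyfit(vt_baseline, delta, 1)
        vnd = -intercept / occ
        print(f"occupancy ~ {occ:.2f}, VND ~ {vnd:.2f}")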

  11. Intrafractional baseline drift during free breathing breast cancer radiation therapy.

    PubMed

    Jensen, Christer Andre; Acosta Roa, Ana María; Lund, Jo-Åsmund; Frengen, Jomar

    2017-06-01

    Intrafraction motion in breast cancer radiation therapy (BCRT) has not yet been thoroughly described in the literature. It has been observed that baseline drift occurs as part of the intrafraction motion. This study aims to measure baseline drift and its incidence in free-breathing BCRT patients using an in-house developed laser system that tracks the position of the sternum. Baseline drift was monitored in 20 right-sided breast cancer patients receiving free-breathing 3D-conformal RT with this laser system, which measures one-dimensional distance in the anterior-posterior (AP) direction. A total of 357 patient respiratory traces from treatment sessions were logged and analysed. Baseline drift was compared to the patient positioning error measured from in-field portal imaging. The mean overall baseline drift at the end of treatment sessions was -1.3 mm for the patient population. Relatively little baseline drift was observed during the first fraction; however, it was clearly detectable from the second fraction onward. Over 90% of the baseline drift occurred during the first 3 min of each treatment session. The baseline drift rate for the population was -0.5 ± 0.2 mm/min in the posterior direction during the first minute after localization. Only 4% of the treatment sessions had a baseline drift of 5 mm or larger at 5 min, all towards the posterior direction. Mean baseline drift in the posterior direction in free-breathing BCRT was observed in 18 of 20 patients over all treatment sessions. This study shows that there is a substantial baseline drift in free-breathing BCRT patients, largely absent during the first session but markedly present in the rest. Intrafraction motion due to baseline drift should be accounted for in margin calculations.
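
    A minimal sketch of how a drift figure like the ones above can be extracted from a one-dimensional AP-position trace: average out the breathing cycle with a running mean and compare the running baseline at the end of the session with its post-localization value. The trace, sampling rate and window length below are all invented.

        # Sketch of one way to quantify intrafraction baseline drift from a 1-D
        # AP-position trace. The trace below is synthetic (breathing + slow drift).
        import numpy as np

        fs = 20.0                                   # samples per second (assumed)
        t = np.arange(0, 300, 1 / fs)               # a 5-minute session
        trace_mm = 2.0 * np.sin(2 * np.pi * 0.25 * t) - 0.5 * (t / 60.0)  # drift of -0.5 mm/min

        window = int(30 * fs)                        # 30 s running mean removes breathing
        baseline = np.convolve(trace_mm, np.ones(window) / window, mode="valid")
        drift_mm = baseline[-1] - baseline[0]        # end-of-session vs. initial baseline
        print(f"baseline drift over session: {drift_mm:.1f} mm")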

  12. Classification of baseline toxicants for QSAR predictions to replace fish acute toxicity studies.

    PubMed

    Nendza, Monika; Müller, Martin; Wenzel, Andrea

    2017-03-22

    Fish acute toxicity studies are required for environmental hazard and risk assessment of chemicals by national and international legislation such as REACH, the regulations of plant protection products and biocidal products, and the GHS (globally harmonised system) for classification and labelling of chemicals. Alternative methods like QSARs (quantitative structure-activity relationships) can replace many ecotoxicity tests. However, complete substitution of in vivo animal tests by in silico methods may not be realistic. For the so-called baseline toxicants, it is possible to predict fish acute toxicity with sufficient accuracy from log Kow and, hence, valid QSARs can replace in vivo testing. In contrast, excess toxicants and chemicals not reliably classified as baseline toxicants require further in silico, in vitro or in vivo assessment. Thus, the critical task is to discriminate between baseline and excess toxicants. For fish acute toxicity, we derived a scheme based on structural alerts and physicochemical property thresholds to classify chemicals as either baseline toxicants (predictable by QSARs) or potential excess toxicants (not predictable by baseline QSARs). The step-wise approach identifies baseline toxicants (true negatives) in a precautionary way to avoid false negative predictions. Therefore, a certain fraction of false positives can be tolerated, i.e. baseline toxicants without specific effects that may be tested instead of predicted. Application of the classification scheme to a new heterogeneous dataset for diverse fish species yields 40% baseline toxicants, 24% excess toxicants and 36% compounds not classified. Thus, replacing about half of the fish acute toxicity tests with QSAR predictions is a realistic short-term goal. The long-term goals are to establish classification criteria for further groups of toxicants and to replace as many in vivo fish acute toxicity tests as possible with valid QSAR predictions.
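
    For concreteness, a baseline-toxicity QSAR of the classical Konemann form, log(1/LC50) = a * log Kow + b, can be evaluated as below. The slope and intercept shown are illustrative placeholders rather than the calibrated values behind the classification scheme, and the helper function is hypothetical.

        # Hedged sketch of a baseline-toxicity QSAR of the classical Konemann form,
        # log(1/LC50 [mol/L]) = a * log Kow + b; the coefficients below are
        # illustrative placeholders, not regulatory values.
        A, B = 0.87, 1.87  # illustrative slope/intercept for fish acute baseline toxicity

        def baseline_lc50_mg_per_l(log_kow: float, mol_weight_g_per_mol: float) -> float:
            """Predict a baseline LC50 in mg/L from log Kow and molecular weight."""
            log_inv_lc50 = A * log_kow + B          # log(1/LC50), LC50 in mol/L
            lc50_mol_per_l = 10 ** (-log_inv_lc50)
            return lc50_mol_per_l * mol_weight_g_per_mol * 1000.0  # convert to mg/L

        # Example: a hypothetical baseline toxicant with log Kow = 3 and MW = 150 g/mol
        print(baseline_lc50_mg_per_l(3.0, 150.0))   # ~5 mg/L with these coefficients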

  13. Carrier phase ambiguity resolution for the Global Positioning System applied to geodetic baselines up to 2000 km

    NASA Technical Reports Server (NTRS)

    Blewitt, Geoffrey

    1989-01-01

    A technique for resolving the ambiguities in the GPS carrier phase data (which are biased by an integer number of cycles) is described which can be applied to geodetic baselines up to 2000 km in length and can be used with dual-frequency P code receivers. The results of such application demonstrated that a factor of 3 improvement in baseline accuracy could be obtained, giving centimeter-level agreement with coordinates inferred by very-long-baseline interferometry in the western United States. It was found that a method using pseudorange data is more reliable than one using ionospheric constraints for baselines longer than 200 km. It is recommended that future GPS networks have a wide spectrum of baseline lengths (ranging from baselines shorter than 100 km to those longer than 1000 km) and that GPS receivers be used which can acquire dual-frequency P code data.
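
    A hedged sketch of the kind of pseudorange-aided ambiguity resolution described above, using the Melbourne-Wubbena combination of dual-frequency carrier phase and P-code observations to form a widelane ambiguity estimate; the observation values are invented, and the exact combination used in the study may differ.

        # Sketch of pseudorange-aided widelane ambiguity resolution via the
        # Melbourne-Wubbena combination, in the spirit of the dual-frequency
        # P-code approach described above; all observation values are invented.
        L1_FREQ, L2_FREQ = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies (Hz)
        C = 299_792_458.0                          # speed of light (m/s)

        def widelane_ambiguity(phi1_m, phi2_m, p1_m, p2_m):
            """Float widelane ambiguity in cycles.

            phi1_m/phi2_m: carrier phase on L1/L2 expressed in metres;
            p1_m/p2_m:     P-code pseudoranges on L1/L2 in metres.
            """
            f1, f2 = L1_FREQ, L2_FREQ
            lam_wl = C / (f1 - f2)                            # widelane wavelength, ~0.86 m
            phase_wl = (f1 * phi1_m - f2 * phi2_m) / (f1 - f2)
            code_nl = (f1 * p1_m + f2 * p2_m) / (f1 + f2)     # narrowlane pseudorange
            return (phase_wl - code_nl) / lam_wl

        # In practice the float estimate is averaged over many epochs before
        # being rounded to the nearest integer.
        print(round(widelane_ambiguity(20_000_000.1, 20_000_000.5, 20_000_001.2, 20_000_001.3)))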

  14. Baseline estimation in flame's spectra by using neural networks and robust statistics

    NASA Astrophysics Data System (ADS)

    Garces, Hugo; Arias, Luis; Rojas, Alejandro

    2014-09-01

    This work presents a baseline estimation method for flame spectra based on an artificial neural network, combining robust statistics with multivariate analysis to automatically identify the measured wavelengths that belong to the continuous (baseline) feature, so that the model can be adapted without requiring direct measurements of the target baseline for training. The main contributions of this paper are: analyzing a flame spectra database by computing Jolliffe statistics from Principal Component Analysis to detect wavelengths that are not correlated with most of the measured data and therefore correspond to baseline; systematically determining the optimal number of neurons in the hidden layers based on Akaike's Final Prediction Error; estimating the baseline over the full wavelength range of the sampled spectra; and training a neural network that generalizes the relation between measured and baseline spectra. The main application of this research is computing total radiation with baseline information, enabling diagnosis of the combustion process state for optimization in its early stages.
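
    The first step described above can be approximated as follows: use principal component analysis to flag wavelengths that are weakly explained by the leading components and are therefore candidates for the continuous baseline. This sketch substitutes a simple communality score for the paper's Jolliffe statistics and runs on random stand-in spectra.

        # Hedged sketch: flag wavelengths weakly correlated with the dominant
        # spectral structure (candidate baseline channels) using PCA. The spectra
        # are random stand-ins, and the communality score is a simplification of
        # the Jolliffe statistics used in the paper.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        spectra = rng.normal(size=(200, 512))                     # 200 spectra x 512 wavelengths
        spectra[:, 100:140] += rng.normal(size=(200, 1)) * 5.0    # a correlated emission band

        pca = PCA(n_components=5).fit(spectra)
        # Communality: variance of each wavelength captured by the leading components.
        communality = (pca.components_ ** 2 * pca.explained_variance_[:, None]).sum(axis=0)
        baseline_channels = np.argsort(communality)[:50]          # least-explained wavelengths
        print(baseline_channels[:10])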

  15. Evaluation of capillary electrophoresis for in-flight ionic contaminant monitoring of SSF potable water

    NASA Technical Reports Server (NTRS)

    Mudgett, Paul D.; Schultz, John R.; Sauer, Richard L.

    1992-01-01

    Until 1989, ion chromatography (IC) was the baseline technology selected for the Specific Ion Analyzer, an in-flight inorganic water quality monitor being designed for Space Station Freedom. Recent developments in capillary electrophoresis (CE) may offer significant savings in consumables, power consumption, and weight/volume allocation relative to IC technology. A thorough evaluation of CE's analytical capability, however, is necessary before one of the two techniques is chosen. Unfortunately, analytical methods currently available for inorganic CE are unproven for NASA's target list of anions and cations. Thus, CE electrolyte chemistry and methods to measure the target contaminants must first be identified and optimized. This paper reports the status of a study to evaluate CE's capability with regard to inorganic and carboxylate anions, alkali and alkaline earth cations, and transition metal cations. Preliminary results indicate that CE has impressive selectivity and trace sensitivity, although considerable methods development remains to be performed.

  16. The Flight Optimization System Weights Estimation Method

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.

    2017-01-01

    FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-and-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method, presenting its development process alongside the estimation procedure itself.

  17. Designing Studies That Would Address the Multilayered Nature of Health Care

    PubMed Central

    Pennell, Michael; Rhoda, Dale; Hade, Erinn M.; Paskett, Electra D.

    2010-01-01

    We review design and analytic methods available for multilevel interventions in cancer research with particular attention to study design, sample size requirements, and potential to provide statistical evidence for causal inference. The most appropriate methods will depend on the stage of development of the research and whether randomization is possible. Early on, fractional factorial designs may be used to screen intervention components, particularly when randomization of individuals is possible. Quasi-experimental designs, including time-series and multiple baseline designs, can be useful once the intervention is designed because they require few sites and can provide the preliminary evidence to plan efficacy studies. In efficacy and effectiveness studies, group-randomized trials are preferred when randomization is possible and regression discontinuity designs are preferred otherwise if assignment based on a quantitative score is possible. Quasi-experimental designs may be used, especially when combined with recent developments in analytic methods to reduce bias in effect estimates. PMID:20386057

  18. Traumatic brain injury and risk of dementia in older veterans

    PubMed Central

    Kaup, Allison; Kirby, Katharine A.; Byers, Amy L.; Diaz-Arrastia, Ramon; Yaffe, Kristine

    2014-01-01

    Objectives: Traumatic brain injury (TBI) is common in military personnel, and there is growing concern about the long-term effects of TBI on the brain; however, few studies have examined the association between TBI and risk of dementia in veterans. Methods: We performed a retrospective cohort study of 188,764 US veterans aged 55 years or older who had at least one inpatient or outpatient visit during both the baseline (2000–2003) and follow-up (2003–2012) periods and did not have a dementia diagnosis at baseline. TBI and dementia diagnoses were determined using ICD-9 codes in electronic medical records. Fine-Gray proportional hazards models were used to determine whether TBI was associated with greater risk of incident dementia, accounting for the competing risk of death and adjusting for demographics, medical comorbidities, and psychiatric disorders. Results: Veterans had a mean age of 68 years at baseline. During the 9-year follow-up period, 16% of those with TBI developed dementia compared with 10% of those without TBI (adjusted hazard ratio, 1.57; 95% confidence interval: 1.35–1.83). There was evidence of an additive association between TBI and other conditions on the risk of dementia. Conclusions: TBI in older veterans was associated with a 60% increase in the risk of developing dementia over 9 years, after accounting for competing risks and potential confounders. Our results suggest that TBI in older veterans may predispose toward development of symptomatic dementia and raise concern about the potential long-term consequences of TBI in younger veterans and civilians. PMID:24966406

  19. Analyzing industrial energy use through ordinary least squares regression models

    NASA Astrophysics Data System (ADS)

    Golden, Allyson Katherine

    Extensive research has been performed using regression analysis and calibrated simulations to create baseline energy consumption models for residential buildings and commercial institutions. However, few attempts have been made to discuss the applicability of these methodologies to establishing baseline energy consumption models for industrial manufacturing facilities, and in the few studies of industrial facilities, the presented linear change-point and degree-day regression analyses illustrate ideal cases. It follows that there is a need in the established literature to discuss these methodologies and determine their applicability to industrial manufacturing facilities. This thesis evaluates the effectiveness of simple inverse linear statistical regression models for establishing baseline energy consumption models for industrial manufacturing facilities. Ordinary least squares change-point and degree-day regression methods are used to create baseline energy consumption models for nine case studies of industrial manufacturing facilities located in the southeastern United States. The influence of ambient dry-bulb temperature and production on total facility energy consumption is observed; the energy consumption behavior of industrial manufacturing facilities is only sometimes sufficiently explained by temperature, production, or a combination of the two variables. This thesis also provides methods for generating baseline energy models that are straightforward and accessible to anyone in the industrial manufacturing community: they may be replicated by anyone who possesses basic spreadsheet software and general knowledge of the relationship between energy consumption and weather, production, or other influential variables. With the help of simple inverse linear regression models, industrial manufacturing facilities may better understand their energy consumption and production behavior and identify opportunities for energy and cost savings. This thesis also utilizes change-point and degree-day baseline energy models to disaggregate facility annual energy consumption into separate industrial end-user categories; the baseline energy model provides a suitable and economical alternative to sub-metering individual manufacturing equipment. One case study describes the conjoined use of baseline energy models and facility information gathered during a one-day onsite visit to perform an end-point energy analysis of an injection molding facility, conducted by the Alabama Industrial Assessment Center (AIAC). Applying baseline regression model results to the end-point energy analysis allowed the AIAC to better approximate the annual energy consumption of the facility's HVAC system.
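
    A minimal sketch of the change-point regression idea on fabricated monthly data: energy use is modeled as a base load plus a cooling slope that activates above a change-point temperature, and the three parameters are fitted by least squares. The values and starting guesses below are invented.

        # A minimal sketch of a three-parameter cooling change-point model of the kind
        # used for baseline energy consumption; the monthly data below are fabricated.
        import numpy as np
        from scipy.optimize import curve_fit

        def cooling_change_point(t_oa, base_load, slope, t_cp):
            """Energy = base load + slope * (outdoor temperature above change point)."""
            return base_load + slope * np.maximum(t_oa - t_cp, 0.0)

        # Hypothetical monthly mean outdoor dry-bulb temps (degF) and energy use (kWh)
        temps = np.array([35, 40, 48, 58, 67, 75, 80, 79, 71, 60, 47, 38], dtype=float)
        energy = np.array([410, 405, 415, 420, 480, 555, 600, 590, 520, 430, 412, 408], dtype=float)

        params, _ = curve_fit(cooling_change_point, temps, energy, p0=[400.0, 10.0, 60.0])
        print(dict(zip(["base_load", "slope", "change_point"], params.round(1))))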

  20. Effect of mixed mutans streptococci colonization on caries development.

    PubMed

    Seki, M; Yamashita, Y; Shibata, Y; Torigoe, H; Tsuda, H; Maeno, M

    2006-02-01

    To evaluate the clinical importance of mixed mutans streptococci colonization in predicting caries in preschool children, caries prevalence was examined twice, with a 6-month interval, in 410 preschool children aged 3-4 years at baseline. A commercial strip method was used to evaluate the mutans streptococci score in plaque collected from eight selected interdental spaces and in saliva. Mutans streptococci typing polymerase chain reaction (PCR) assays (Streptococcus sobrinus and Streptococcus mutans, including serotypes c, e, and f) were performed using colonies on the strips as template. Twenty variables were examined in a univariate analysis to predict caries development: questionnaire variables, results of clinical examination, mutans streptococci scores, and PCR detection of S. sobrinus and S. mutans (including serotypes c, e, and f). Sixteen variables showed statistically significant associations (P < 0.04) in the univariate analysis. However, when entered into a logistic regression, only five variables remained significant (P < 0.05): caries experience at baseline; mixed colonization of S. sobrinus and S. mutans including S. mutans serotypes; high plaque mutans streptococci score; habitual use of sweet drinks; and nonuse of fluoride toothpaste. Mixed mutans streptococci colonization is thus a novel measure correlated with caries development in the primary dentition of preschool children.

  1. The development of peripheral fatigue and short-term recovery during self-paced high-intensity exercise

    PubMed Central

    Froyd, Christian; Millet, Guillaume Y; Noakes, Timothy D

    2013-01-01

    The time course of muscular fatigue that develops during and after an intense bout of self-paced dynamic exercise was characterized by using different forms of electrical stimulation (ES) of the exercising muscles. Ten active subjects performed a time trial (TT) involving repetitive concentric extension/flexion of the right knee using a Biodex dynamometer. Neuromuscular function (NMF), including ES and a 5 s maximal isometric voluntary contraction (MVC), was assessed before the start of the TT and immediately (<5 s) after each 20% of the TT had been completed, as well as 1, 2, 4 and 8 min after TT termination. The TT time was 347 ± 98 s. MVCs were 52% of baseline values at TT termination. Torque responses from ES were reduced to 33–68% of baseline using different methods of stimulation, suggesting that the extent to which peripheral fatigue is documented during exercise depends upon NMF assessment methodology. The major changes in muscle function occurred within the first 40% of exercise. Significant recovery in skeletal muscle function occurs within the first 1–2 min after exercise, showing that previous studies may have underestimated the extent to which peripheral fatigue develops during exercise. PMID:23230235

  2. Baseline experimental investigation of an electrohydrodynamically assisted heat pipe

    NASA Technical Reports Server (NTRS)

    Duncan, A. B.

    1995-01-01

    The increases in power demand and associated thermal management requirements of future space programs, such as potential Lunar/Mars missions, will require enhancing the operating efficiencies of thermal management devices. Currently, the use of electrohydrodynamically (EHD) assisted thermal control devices is under consideration as a potential method of increasing thermal management system capacity. The objectives of the investigation described here included completing build-up of the EHD-Assisted Heat Pipe Test bed, developing test procedures for an experimental evaluation of the unassisted heat pipe, developing an analytical model capable of predicting the performance limits of the unassisted heat pipe, and obtaining experimental data defining the performance characteristics of the unassisted heat pipe. The information obtained in this study will be used to provide extensive comparisons with the EHD-assisted performance observations to be obtained during the continuing investigation of EHD-assisted heat transfer devices. Through comparisons of the baseline test bed data and the EHD-assisted test bed data, accurate insight into the performance-enhancing characteristics of EHD augmentation may be obtained. This may lead to optimization, development, and implementation of EHD technology for future space programs.

  3. A new, objective, quantitative scale for measuring local skin responses following topical actinic keratosis therapy with ingenol mebutate.

    PubMed

    Rosen, Robert; Marmur, Ellen; Anderson, Lawrence; Welburn, Peter; Katsamas, Janelle

    2014-12-01

    Local skin responses (LSRs) are the most common adverse effects of topical actinic keratosis (AK) therapy. There is currently no method available that allows objective characterization of LSRs. Here, the authors describe a new scale developed to quantitatively and objectively assess the six most common LSRs resulting from topical AK therapy with ingenol mebutate. The LSR grading scale was developed using a 0-4 numerical rating, with clinical descriptors and representative photographic images for each rating. Good inter-observer grading concordance was demonstrated in peer review during development of the tool. Data on the use of the scale are described from four phase III double-blind studies of ingenol mebutate (n = 1,005). LSRs peaked on days 4 (face/scalp) or 8 (trunk/extremities), with mean maximum composite LSR scores of 9.1 and 6.8, respectively, and a rapid return toward baseline by day 15 in most cases. Mean composite LSR score at day 57 was generally lower than at baseline. The LSR grading scale is an objective tool allowing practicing dermatologists to characterize and compare LSRs to existing and, potentially, future AK therapies.

  4. Neurocognitive predictors of financial capacity in traumatic brain injury.

    PubMed

    Martin, Roy C; Triebel, Kristen; Dreer, Laura E; Novack, Thomas A; Turner, Crystal; Marson, Daniel C

    2012-01-01

    To develop cognitive models of financial capacity (FC) in patients with traumatic brain injury (TBI). Longitudinal design. Inpatient brain injury rehabilitation unit. Twenty healthy controls and 24 adults with moderate-to-severe TBI were assessed at baseline (30 days postinjury) and 6 months postinjury. The FC instrument (FCI) and a neuropsychological test battery. Univariate correlation and multiple regression procedures were employed to develop cognitive models of FCI performance in the TBI group at baseline and 6-month follow-up. Three cognitive predictor models of FC were developed. At baseline, measures of mental arithmetic/working memory and immediate verbal memory predicted baseline FCI performance (R = 0.72). At 6-month follow-up, measures of executive function and mental arithmetic/working memory predicted 6-month FCI performance (R = 0.79), and a third model found that these 2 measures at baseline predicted 6-month FCI performance (R = 0.71). Multiple cognitive functions are associated with initial impairment and partial recovery of FC in moderate-to-severe TBI patients. In particular, arithmetic, working memory, and executive function skills appear critical to recovery of FC in TBI. The study results represent an initial step toward developing a neurocognitive model of FC in patients with TBI.

  5. Design and Phenomenology of the FEBSTAT Study

    PubMed Central

    Hesdorffer, Dale C; Shinnar, Shlomo; Lewis, Darrell V; Moshé, Solomon L; Nordli, Douglas R; Pellock, John M; MacFall, James; Shinnar, Ruth C; Masur, David; Frank, L Matthew; Epstein, Leon G; Litherland, Claire; Seinfeld, Syndi; Bello, Jacqueline A; Chan, Stephen; Bagiella, Emilia; Sun, Shumei

    2012-01-01

    Purpose Febrile status epilepticus (FSE) has been associated with hippocampal injury and subsequent hippocampal sclerosis (HS) and temporal lobe epilepsy. The FEBSTAT study was designed to prospectively examine the association between prolonged febrile seizures and development of HS and associated temporal lobe epilepsy, one of the most controversial issues in epilepsy. We report on the baseline phenomenology of the final cohorts as well as detailed aims and methodology. Methods The "Consequences of Prolonged Febrile Seizures in Childhood" (FEBSTAT) study is a prospective, multicenter study. Enrolled are children, aged 1 month to 6 years, presenting with a febrile seizure lasting 30 minutes or more based upon ambulance, emergency department, and hospital records, and parental interview. At baseline, procedures included an MRI and EEG done within 72 hours of FSE, and a detailed history and neurological examination. Baseline development and behavior are assessed at one month. The baseline assessment is repeated, with age-appropriate developmental testing, at one and five years after enrollment, as well as at the development of epilepsy and one year after that. Telephone calls every three months document further seizures. Two other groups of children are included: a 'control' group consisting of children with a first febrile seizure, ascertained at Columbia University, with almost identical baseline and one-year follow-up examinations, and a pilot cohort of FSE from Duke University. Key findings The FEBSTAT cohort consists of 199 children with a median age at baseline of 16.0 months (interquartile range (IQR) 12.0–24.0) and a median duration of FSE of 70.0 minutes (IQR 47.0–110.0). Seizures were continuous in 57.3% and behaviorally intermittent (without recovery in between) in 31.2%; most were partial (4; 2.0%) or secondarily generalized (65.8%), and almost all (98.0%) culminated in a generalized tonic-clonic seizure. Of the 199 children, 86.4% had normal development and 20% had prior febrile seizures. In one third of cases, FSE was unrecognized in the emergency department. The Duke pilot cohort consists of 23 children with a median age at FSE onset of 18.0 months (IQR 14.0–28.0) and median duration of FSE of 90.0 minutes (IQR 50.0–170.0). The Columbia control cohort consists of 159 children with a first febrile seizure who received almost the same work-up as the FEBSTAT cohort at baseline and at one year. They were followed by telephone every 4 months for a median of 42 months. Among the control cohort, 64.2% had a first simple FS, 26.4% had a first complex FS that was not FSE, and 9.4% had FSE. Among the 15 with FSE, the median age at onset was 14.0 months (IQR 12.0–20.0) and the median duration of FSE was 43.0 minutes (IQR 35.0–75.0). Significance The FEBSTAT study presents an opportunity to prospectively study the relationship between FSE and acute hippocampal damage, the development of MTS, epilepsy (particularly TLE), and impaired hippocampal function in a large cohort. It is hoped that this study may illuminate a major mystery in clinical epilepsy today, and permit the development of interventions designed to prevent the sequelae of FSE. PMID:22742587

  6. The Female Athlete Body (FAB) study: Rationale, design, and baseline characteristics.

    PubMed

    Stewart, Tiffany M; Pollard, Tarryn; Hildebrandt, Tom; Beyl, Robbie; Wesley, Nicole; Kilpela, Lisa Smith; Becker, Carolyn Black

    2017-09-01

    Eating disorders (EDs) are serious psychiatric illnesses marked by psychiatric comorbidity, medical complications, and functional impairment. Research indicates that female athletes are often at greater risk than non-athlete females for developing ED pathology. The Female Athlete Body (FAB) study is a three-site, randomized controlled trial (RCT) designed to assess the efficacy of a behavioral ED prevention program for female collegiate athletes when implemented by community providers. This paper describes the design, intervention, and participant baseline characteristics; future papers will discuss outcomes. Female collegiate athletes (N=481) aged 17-21 were randomized by site, team, and sport type to either FAB or a waitlist control group. FAB consisted of three sessions (1.3 h each) of a behavioral ED prevention program. Assessments were conducted at baseline (pre-intervention), post-intervention (3 weeks), and six-, 12-, and 18-month follow-ups. This study achieved 96% (N=481) of target recruitment (N=500). Few group differences emerged at baseline. Total sample analyses revealed moderately low baseline rates of ED symptoms and clinical cases. Health risks associated with EDs necessitate interventions for female athletes. The FAB study is the largest existing RCT for female athletes aimed at both reduction of ED risk factors and ED prevention. The methods presented and the population recruited for this study represent an ideal setting for assessing the effects of FAB on both of the aforementioned outcomes. We anticipate that findings of this study (reported in future papers) will make a significant contribution to the ED risk factor reduction and prevention literature. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, or their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated using a combination of levels of: treatment effect; pretest-posttest correlation; and direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when the pretest-posttest correlation is ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. The apparently greater power of ANOVA and CSA under certain imbalances is achieved at the expense of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and directions of baseline imbalance, ANCOVA remains the optimal statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.
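
    The central finding can be reproduced in miniature: simulate trials with a baseline imbalance and a pretest-posttest correlation below 1, then compare the change-score and ANCOVA estimates of the same treatment effect. The parameters below are arbitrary, not those of the cited simulation study.

        # Simulation sketch contrasting change-score analysis (CSA) with ANCOVA under
        # baseline imbalance; CSA is biased when groups differ at baseline and the
        # pretest-posttest correlation is below 1, while ANCOVA is not.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n, rho, true_effect, imbalance = 200, 0.5, 2.0, 1.0

        est_csa, est_ancova = [], []
        for _ in range(500):
            grp = np.repeat([0, 1], n)
            pre = rng.normal(imbalance * grp, 1.0)                 # baseline imbalance
            post = rho * pre + rng.normal(0, np.sqrt(1 - rho**2), 2 * n) + true_effect * grp
            df = pd.DataFrame({"grp": grp, "pre": pre, "post": post, "chg": post - pre})
            est_csa.append(smf.ols("chg ~ grp", df).fit().params["grp"])
            est_ancova.append(smf.ols("post ~ grp + pre", df).fit().params["grp"])

        print("CSA mean estimate:   ", np.mean(est_csa))     # biased: ~1.5, not 2.0
        print("ANCOVA mean estimate:", np.mean(est_ancova))  # unbiased: ~2.0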

  8. Biological baseline data Youngs Bay, Oregon, 1974

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMechan, K.J.; Higley, D.L.; Holton, R.L.

    1975-04-01

    This report presents biological baseline information gathered during the research project "Physical, Chemical and Biological Studies on Youngs Bay." Youngs Bay is a shallow embayment located on the south shore of the Columbia River, near Astoria, Oregon. Research on Youngs Bay was motivated by the proposed construction by Alumax Pacific Aluminum Corporation of an aluminum reduction plant at Warrenton, Oregon. The research was designed to provide biological baseline information on Youngs Bay in anticipation of potential harmful effects from plant effluents. The information collected concerns the kinds of animals found in the Youngs Bay area, and their distribution and seasonal patterns of abundance. In addition, information was collected on the feeding habits of selected fish species, and on the life history and behavioral characteristics of the most abundant benthic amphipod, Corophium salmonis. Sampling was conducted at approximately three-week intervals, using commonly accepted methods of animal collection. Relatively few stations were sampled for fish, because of the need to standardize conditions of capture. Data on fish capture are reported in terms of catch-per-unit effort by a particular sampling gear at a specific station. Methods used in sampling invertebrates were generally more quantitative, and allowed sampling at a greater variety of places, as well as a valid basis for the computation of densities. Checklists of invertebrate species and fish species were developed from these samples, and are referred to throughout the report. The invertebrate checklist is more specific taxonomically than are the tables reporting invertebrate densities, because the methods employed in identification were more precise than those used in counts. 9 refs., 27 figs., 25 tabs.

  9. From theory to application: using performance measures for contraceptive care in the Title X family planning program.

    PubMed

    Loyola Briceno, Ana Carolina; Kawatu, Jennifer; Saul, Katie; DeAngelis, Katie; Frederiksen, Brittni; Moskosky, Susan B; Gavin, Lorrie

    2017-09-01

    The objective was to describe a Performance Measure Learning Collaborative (PMLC) designed to help Title X family planning grantees use new clinical performance measures for contraceptive care. Twelve Title X grantee-service site teams participated in an 8-month PMLC from November 2015 to June 2016; baseline was assessed in October 2015. Each team documented their selected best practices and strategies to improve performance, and calculated the contraceptive care performance measures at baseline and for each of the subsequent 8 months. PMLC sites implemented a mix of best practices: (a) ensuring access to a broad range of methods (n=7 sites), (b) supporting women through client-centered counseling and reproductive life planning (n=8 sites), (c) developing systems for same-day provision of all methods (n=10 sites) and (d) utilizing diverse payment options to reduce cost as a barrier (n=4 sites). Ten sites (83%) observed an increase in the clinical performance measures focused on most and moderately effective methods (MME), with a median percent change of 6% for MME (from a median of 73% at baseline to 77% post-PMLC). Evidence suggests that the PMLC model is an approach that can be used to improve the quality of contraceptive care offered to clients in some settings. Further replication of the PMLC among other groups and beyond the Title X network will help strengthen the current model through lessons learned. Using the performance measures in the context of a learning collaborative may be a useful strategy for other programs (e.g., Federally Qualified Health Centers, Medicaid, private health plans) that provide contraceptive care. Expanded use of the measures may help increase access to contraceptive care to achieve national goals for family planning. Published by Elsevier Inc.

  10. Epicardial Adipose Tissue Contributes to the Development of Non-Calcified Coronary Plaque: A 5-Year Computed Tomography Follow-up Study

    PubMed Central

    Hwang, In-Chang; Choi, Su-Yeon

    2017-01-01

    Aim: Epicardial adipose tissue (EAT) has been suggested as a contributing factor for coronary atherosclerosis, based on previous cross-sectional studies and its pathophysiologic background. However, a causal relationship between EAT and the development of non-calcified coronary plaque (NCP) has not been investigated. Methods: A total of 122 asymptomatic individuals (age, 56.0 ± 7.6 years; male, 80.3%) without prior history of coronary artery disease (CAD) or metabolic syndrome and without NCP or obstructive CAD at baseline cardiac computed tomography (CT) were enrolled. Repeat cardiac CT was performed after an interval of more than 5 years. The epicardial fat volume index (EFVi; cm3/m2) was assessed in relation to the development of NCP on the follow-up CT, where the results were classified into "calcified plaque (CP)," "no plaque," and "NCP" groups. Results: On the follow-up CT, performed at a median interval of 65.4 months, we observed newly developed NCP in 24 (19.7%) participants. Baseline EFVi was significantly higher in the NCP group (79.9 ± 30.3 cm3/m2) than in the CP group (63.7 ± 22.7 cm3/m2; P = 0.019) and in the no plaque group (62.5 ± 24.7 cm3/m2; P = 0.021). Multivariable logistic regression analysis demonstrated that the presence of diabetes (OR, 9.081; 95% CI, 1.682–49.034; P = 0.010) and the 3rd tertile of EFVi (OR, 4.297; 95% CI, 1.040–17.757; P = 0.044 compared to the 1st tertile) were significant predictors for the development of NCP on follow-up CT. Conclusions: A greater amount of EAT at baseline CT independently predicts the development of NCP in asymptomatic individuals. PMID:27506880

  11. Detection of QT prolongation using a novel electrocardiographic analysis algorithm applying intelligent automation: prospective blinded evaluation using the Cardiac Safety Research Consortium electrocardiographic database.

    PubMed

    Green, Cynthia L; Kligfield, Paul; George, Samuel; Gussak, Ihor; Vajdic, Branislav; Sager, Philip; Krucoff, Mitchell W

    2012-03-01

    The Cardiac Safety Research Consortium (CSRC) provides both "learning" and blinded "testing" digital electrocardiographic (ECG) data sets from thorough QT (TQT) studies annotated for submission to the US Food and Drug Administration (FDA) to developers of ECG analysis technologies. This article reports the first results from a blinded testing data set that examines developer reanalysis of original sponsor-reported core laboratory data. A total of 11,925 anonymized ECGs including both moxifloxacin and placebo arms of a parallel-group TQT in 181 subjects were blindly analyzed using a novel ECG analysis algorithm applying intelligent automation. Developer-measured ECG intervals were submitted to CSRC for unblinding, temporal reconstruction of the TQT exposures, and statistical comparison to core laboratory findings previously submitted to FDA by the pharmaceutical sponsor. Primary comparisons included baseline-adjusted interval measurements, baseline- and placebo-adjusted moxifloxacin QTcF changes (ddQTcF), and associated variability measures. Developer and sponsor-reported baseline-adjusted data were similar with average differences <1 ms for all intervals. Both developer- and sponsor-reported data demonstrated assay sensitivity with similar ddQTcF changes. Average within-subject SD for triplicate QTcF measurements was significantly lower for developer- than sponsor-reported data (5.4 and 7.2 ms, respectively; P < .001). The virtually automated ECG algorithm used for this analysis produced similar yet less variable TQT results compared with the sponsor-reported study, without the use of a manual core laboratory. These findings indicate that CSRC ECG data sets can be useful for evaluating novel methods and algorithms for determining drug-induced QT/QTc prolongation. Although the results should not constitute endorsement of specific algorithms by either CSRC or FDA, the value of a public domain digital ECG warehouse to provide prospective, blinded comparisons of ECG technologies applied for QT/QTc measurement is illustrated. Copyright © 2012 Mosby, Inc. All rights reserved.

  12. Evaluation of the fibromyalgia impact questionnaire at baseline as a predictor for time to pain improvement in two clinical trials of pregabalin.

    PubMed

    Bushmakin, A G; Cappelleri, J C; Chandran, A B; Zlateva, G

    2013-01-01

    The Fibromyalgia Impact Questionnaire (FIQ) is a patient-reported outcome that evaluates the impact of fibromyalgia (FM) on daily life. This study evaluated the relationships between the functional status of FM patients, measured with the FIQ at baseline, and median time to a clinically relevant pain reduction. Data were derived from two randomised, placebo-controlled trials that evaluated pregabalin 300, 450 and 600 mg/day for the treatment of FM. The Kaplan-Meier (nonparametric) method was applied to estimate median times to 'transient' and 'stable' events. The transient event was defined as a ≥ 27.9% improvement on an 11-point daily pain diary scale (0 = no pain, 10 = worst possible pain), and the stable event was defined as the mean of the daily improvements ≥ 27.9% relative to baseline over the subsequent study duration starting on the day of the transient event. A parametric model using time-to-event analysis was developed for evaluating the relationship between baseline FIQ score and the median time to these events. Median time was longer among patients treated with placebo relative to pregabalin for the transient events (11-12 days vs. 5-7 days) and stable events (86 days vs. 13-29 days). A significant association was observed between baseline FIQ scores and median time to transient and stable events (p < 0.001). Median times to events were similar between the studies. For transient pain reduction events, median times ranged from 3.0 to 4.5 days for baseline FIQ scores of 10, and 9.1-9.6 days for FIQ scores of 100; for stable pain reduction events, the median time ranged from 11.0 to 13.0 days and from 27.0 to 28.5 days for baseline FIQ scores of 10 and 100 respectively. Time to a clinically relevant reduction in pain was significantly associated with FM severity at baseline as measured by the FIQ. Such an analysis can inform patient and physician expectations in clinical practice. © 2012 Blackwell Publishing Ltd.
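
    A hedged sketch of the nonparametric step described above: a Kaplan-Meier estimate of the median time to the (transient) pain-improvement event. The event times are fabricated, and the lifelines package is one common implementation choice, not necessarily the authors'.

        # Hedged Kaplan-Meier sketch: median time to a >= 27.9% pain improvement.
        # Event times are synthetic; some patients never reach the threshold and
        # are treated as censored.
        import numpy as np
        from lifelines import KaplanMeierFitter

        rng = np.random.default_rng(2)
        days_to_event = rng.exponential(scale=9.0, size=120).round() + 1  # fabricated
        observed = rng.random(120) < 0.85     # False = censored (threshold never reached)

        kmf = KaplanMeierFitter()
        kmf.fit(days_to_event, event_observed=observed)
        print("median days to transient improvement:", kmf.median_survival_time_)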

  13. Semiparametric temporal process regression of survival-out-of-hospital.

    PubMed

    Zhan, Tianyu; Schaubel, Douglas E

    2018-05-23

    The recurrent/terminal event data structure has undergone considerable methodological development in the last 10-15 years. An example of the data structure that has arisen with increasing frequency involves the recurrent event being hospitalization and the terminal event being death. We consider the response Survival-Out-of-Hospital, defined as a temporal process (indicator function) taking the value 1 when the subject is currently alive and not hospitalized, and 0 otherwise. Survival-Out-of-Hospital is a useful alternative strategy for the analysis of hospitalization/survival in the chronic disease setting, with the response variate representing a refinement to survival time through the incorporation of an objective quality-of-life component. The semiparametric model we consider assumes multiplicative covariate effects and leaves unspecified the baseline probability of being alive-and-out-of-hospital. Using zero-mean estimating equations, the proposed regression parameter estimator can be computed without estimating the unspecified baseline probability process, although baseline probabilities can subsequently be estimated for any time point within the support of the censoring distribution. We demonstrate that the regression parameter estimator is asymptotically normal, and that the baseline probability function estimator converges to a Gaussian process. Simulation studies are performed to show that our estimating procedures have satisfactory finite sample performances. The proposed methods are applied to the Dialysis Outcomes and Practice Patterns Study (DOPPS), an international end-stage renal disease study.

  14. Satisfaction with daily occupations amongst asylum seekers in Denmark.

    PubMed

    Morville, Anne-Le; Erlandsson, Lena-Karin; Danneskiold-Samsøe, Bente; Amris, Kirstine; Eklund, Mona

    2015-05-01

    The aim of this study was to describe asylum seekers' satisfaction with daily occupations and activity level while in a Danish asylum centre, and whether these changed over time. Another aim was to describe whether exposure to torture, self-rated health measures, and ADL ability were related to their satisfaction with daily occupations and activity level. A total of 43 asylum seekers at baseline and 17 at follow-up were included. The instruments used were the Satisfaction with Daily Occupations questionnaire, the Major Depression Inventory (MDI), the WHO-5 Well-Being Index, PainDetect, a questionnaire covering torture exposure and basic social information, and the Assessment of Motor and Process Skills (AMPS). The results showed a low level of satisfaction with daily occupations at both baseline and follow-up, with no statistically significant change in satisfaction or activity level between the two time points. At baseline, AMPS process skills were associated with education, worst pain, and activity level, and a relationship between AMPS process skills and satisfaction was also present. At follow-up, associations were found between WHO-5 scores and both satisfaction and activity level, and between MDI scores and activity level. Asylum seekers experience a low level of satisfaction with daily occupations, both at arrival and after 10 months in an asylum centre. There is a need for further research and development of occupation-focused rehabilitation methods for the asylum seeker population.

  15. Maintenance of Pain in Children With Functional Abdominal Pain.

    PubMed

    Czyzewski, Danita I; Self, Mariella M; Williams, Amy E; Weidler, Erica M; Blatz, Allison M; Shulman, Robert J

    2016-03-01

    A significant proportion of children with functional abdominal pain develop chronic pain. Identifying clinical characteristics that predict pain persistence is important in targeting interventions. We examined whether child anxiety and/or pain-stooling relations were related to maintenance of abdominal pain frequency, and compared the predictive value of 3 methods for assessing pain-stooling relations (i.e., diary, parent report, child report). Seventy-six children (7-10 years old at baseline) who presented for medical treatment of functional abdominal pain were followed up 18 to 24 months later. Baseline anxiety and abdominal pain-stooling relations, based on pain and stooling diaries and child and parent questionnaires, were examined in relation to the persistence of abdominal pain frequency. Children's baseline anxiety was not related to persistence of pain frequency. However, children who displayed irritable bowel syndrome (IBS) symptoms at baseline maintained pain frequency at follow-up, whereas in children in whom there was no relationship between pain and stooling, pain frequency decreased. Pain and stool diaries and parent report of pain-stooling relations were predictive of pain persistence, but child-report questionnaires were not. The presence of IBS symptoms in school-age children with functional abdominal pain appears to predict persistence of abdominal pain over time, whereas anxiety does not. Prospective pain and stooling diaries and parent report of IBS symptoms were predictors of pain maintenance, but child report of symptoms was not.

  16. Sensitivity to azoxystrobin in Didymella bryoniae isolates collected before and after field use of strobilurin fungicides.

    PubMed

    Keinath, Anthony P

    2009-10-01

    Isolates of Didymella bryoniae (Auersw.) Rehm, causal agent of gummy stem blight on cucurbits, developed insensitivity to azoxystrobin in the eastern United States 2 years after first commercial use in 1998. Baseline sensitivity of this fungus to azoxystrobin has never been reported. The objectives were to compare baseline sensitivities of D. bryoniae from South Carolina and other locations to sensitivities of isolates exposed to azoxystrobin for one or more seasons, and to compare sensitivity in vitro and in vivo. Sixty-one isolates of D. bryoniae collected before 1998 were sensitive. Median EC50 was 0.055 mg L⁻¹ azoxystrobin (range 0.005 to 0.81). Forty isolates collected after exposure during 1998 also were sensitive. Fifty-three of 64 isolates collected in South and North Carolina between 2000 and 2006 were insensitive to 10 mg L⁻¹ azoxystrobin. Sensitive and insensitive isolates were distinguished by disease severity on Cucumis melo L. seedlings treated with azoxystrobin (20 or 200 mg L⁻¹). An azoxystrobin baseline sensitivity distribution was established in vitro for isolates of D. bryoniae never exposed to strobilurins. Baseline values were comparable with those of other ascomycetes. Insensitive isolates were found in fields with a history of strobilurin applications. An in vivo method distinguished sensitive and insensitive isolates. Copyright 2009 Society of Chemical Industry.

  17. The Geobiosphere Emergy Baseline: A synthesis

    EPA Science Inventory

    Following the Eighth Biennial Emergy Conference (January, 2014), the need for revisiting the procedures and assumptions used to compute the Geobiosphere Emergy Baseline (GEB) emerged as a necessity to strengthen the method of Emergy Accounting and remove sources of ambiguity and ...

  18. Comparison of Baseline Wander Removal Techniques considering the Preservation of ST Changes in the Ischemic ECG: A Simulation Study

    PubMed Central

    Pilia, Nicolas; Schulze, Walther H. W.; Dössel, Olaf

    2017-01-01

    The most important ECG marker for the diagnosis of ischemia or infarction is a change in the ST segment. Baseline wander is a typical artifact that corrupts the recorded ECG and can hinder the correct diagnosis of such diseases. To find the filter best suited for removing baseline wander, the ground truth about the ST change prior to the corrupting artifact and the subsequent filtering process is needed. In order to create the desired reference, we used a large simulation study that allowed us to represent the ischemic heart at a multiscale level, from the cardiac myocyte to the surface ECG. We also created a realistic model of baseline wander to evaluate five filtering techniques commonly used in the literature. In the simulation study, we included a total of 5.5 million signals coming from 765 electrophysiological setups. We found that the best performing method was wavelet-based baseline cancellation. However, for medical applications, the Butterworth high-pass filter is the better choice because it is computationally cheap and almost as accurate. Even though all methods modify the ST segment to some extent, all of them proved better than leaving baseline wander unfiltered. PMID:28373893
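
    The Butterworth option recommended above for clinical use can be sketched in a few lines; the fourth order, the 0.5 Hz cutoff and the zero-phase forward-backward application are common choices for baseline wander removal, assumed here rather than taken from the paper.

        # Minimal sketch of Butterworth high-pass baseline wander removal with a
        # zero-phase forward-backward pass; cutoff and order are common choices,
        # not values from the paper, and the "ECG" is a toy sinusoid.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 500.0                      # sampling rate in Hz (assumed)
        b, a = butter(N=4, Wn=0.5, btype="highpass", fs=fs)

        t = np.arange(0, 10, 1 / fs)
        ecg = np.sin(2 * np.pi * 1.2 * t)            # toy "cardiac" component
        wander = 0.8 * np.sin(2 * np.pi * 0.15 * t)  # toy baseline wander
        clean = filtfilt(b, a, ecg + wander)         # filtfilt avoids phase distortion
        print(np.abs(clean - ecg).max())             # residual distortion of the toy ECG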

  19. Time-lapse joint AVO inversion using generalized linear method based on exact Zoeppritz equations

    NASA Astrophysics Data System (ADS)

    Zhi, Longxiao; Gu, Hanming

    2018-03-01

    The conventional method of time-lapse AVO (Amplitude Versus Offset) inversion is mainly based on approximate expressions of the Zoeppritz equations. Though the approximations are concise and convenient to use, they have clear limitations: they are valid only when the contrast in elastic parameters between the upper and lower media is small and the incidence angle is small, and the resulting density inversion is unstable. Therefore, we develop a method of time-lapse joint AVO inversion based on the exact Zoeppritz equations. In this method, we apply the exact Zoeppritz equations to calculate the PP-wave reflection coefficient, and in constructing the objective function for the inversion we use a Taylor series expansion to linearize the inversion problem. Through joint AVO inversion of the seismic data in the baseline and monitor surveys, we obtain the P-wave velocity, S-wave velocity and density in the baseline survey and their time-lapse changes simultaneously, and we can also estimate the change in oil saturation from the inversion results. Compared with time-lapse difference inversion, the joint inversion does not rely on the same restrictive assumptions, can estimate more parameters simultaneously, and therefore has broader applicability. Meanwhile, by using the generalized linear method, the inversion is easily implemented and its computational cost is low. We use a theoretical model to generate synthetic seismic records to test the method and to analyze the influence of random noise. The results demonstrate the validity and noise robustness of the method. We also apply the inversion to field data and demonstrate its feasibility in practice.
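
    For reference, the conventional small-contrast approximation that this method replaces can be written in the three-term Aki-Richards form and evaluated as below; the interface properties are invented, and the exact Zoeppritz solution of the paper is not reproduced here.

        # Sketch of the conventional Aki-Richards approximation to the Zoeppritz
        # PP reflection coefficient: the small-contrast, small-angle expression the
        # method above replaces with the exact equations. Interface values are invented.
        import numpy as np

        def aki_richards_rpp(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
            """Three-term Aki-Richards PP reflectivity at incidence angle theta."""
            theta = np.radians(theta_deg)
            vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
            dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
            k = (vs / vp) ** 2 * np.sin(theta) ** 2
            return (0.5 * (1 - 4 * k) * drho / rho
                    + dvp / (2 * vp * np.cos(theta) ** 2)
                    - 4 * k * dvs / vs)

        # Hypothetical shale-over-gas-sand interface
        print(aki_richards_rpp(3000, 1500, 2.4, 2800, 1600, 2.2, theta_deg=20))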

  20. Predicting Regional Pattern of Longitudinal β-Amyloid Accumulation by Baseline PET.

    PubMed

    Guo, Tengfei; Brendel, Matthias; Grimmer, Timo; Rominger, Axel; Yakushev, Igor

    2017-04-01

    Knowledge about spatial and temporal patterns of β-amyloid (Aβ) accumulation is essential for understanding Alzheimer disease (AD) and for the design of antiamyloid drug trials. Here, we tested whether the regional pattern of longitudinal Aβ accumulation can be predicted by baseline amyloid PET. Methods: Baseline and 2-y follow-up (18)F-florbetapir PET data from 58 patients with incipient and manifest dementia due to AD were analyzed. By determining how fast amyloid deposits in a given region relative to the whole-brain gray matter, a pseudotemporal accumulation rate for each region was calculated. The actual accumulation rate of (18)F-florbetapir was calculated from the follow-up data. Results: Pseudotemporal measurements from baseline PET data explained 87% (P < 0.001) of the variance in longitudinal accumulation rate across 62 regions. The method accurately predicted the top 10 fast and slow accumulating regions. Conclusion: Pseudotemporal analysis of baseline PET images is capable of predicting the regional pattern of longitudinal Aβ accumulation in AD at a group level. This approach may be useful in exploring spatial patterns of Aβ accumulation in other amyloid-associated disorders such as Lewy body disease and atypical forms of AD. In addition, the method allows identification of brain regions with a high accumulation rate of Aβ, which are of particular interest for antiamyloid clinical trials. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  1. Improved silicon carbide for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas J.

    1987-01-01

    This is the second annual technical report entitled, Improved Silicon Carbide for Advanced Heat Engines, and includes work performed during the period February 16, 1986 to February 15, 1987. The program is conducted for NASA under contract NAS3-24384. The objective is the development of high strength, high reliability silicon carbide parts with complex shapes suitable for use in advanced heat engines. The fabrication methods used are to be adaptable for mass production of such parts on an economically sound basis. Injection molding is the forming method selected. This objective is to be accomplished in a two-phase program: (1) to achieve a 20 percent improvement in strength and a 100 percent increase in Weibull modulus of the baseline material; and (2) to produce a complex shaped part, a gas turbine rotor, for example, with the improved mechanical properties attained in the first phase. Eight tasks are included in the first phase covering the characterization of the properties of a baseline material, the improvement of those properties and the fabrication of complex shaped parts. Activities during the first contract year concentrated on two of these areas: fabrication and characterization of the baseline material (Task 1) and improvement of material and processes (Task 7). Activities during the second contract year included an MOR bar matrix study to improve mechanical properties (Task 2), materials and process improvements (Task 7), and a Ford-funded task to mold a turbocharger rotor with an improved material (Task 8).

  2. The Tupange Project in Kenya: A Multifaceted Approach to Increasing Use of Long-Acting Reversible Contraceptives

    PubMed Central

    Muthamia, Michael; Owino, Kenneth; Nyachae, Paul; Kilonzo, Margaret; Kamau, Mercy; Otai, Jane; Kabue, Mark; Keyonzo, Nelson

    2016-01-01

    ABSTRACT Background: Long-acting reversible contraceptives (LARCs) are safe and highly effective, and they have higher continuation rates than short-acting methods. Because only a small percentage of sexually active women in Kenya use LARCs, the Tupange project implemented a multifaceted approach to increase uptake of LARCs, particularly among the urban poor. The project included on-site mentoring, whole-site orientation, commodity security, quality improvement, and multiple demand-promotion and service-provision strategies, in the context of wide method choice. We report on activities in Nairobi between July 2011 and December 2014, the project implementation period. Methods: We used a household longitudinal survey of women of reproductive age to measure changes in the contraceptive prevalence rate (CPR) and other family planning-related variables. At baseline in July 2010, 2,676 women were interviewed; about 50% were successfully tracked and interviewed at endline in December 2014. A baseline service delivery point (SDP) survey of 112 health facilities and 303 service providers was conducted in July 2011, and an endline SDP survey was conducted in December 2014 to measure facility-based interventions. The SDP baseline survey was conducted after the household survey, as facilities were selected based on where clients said they obtained services. Results: The project led to significant increases in use of implants and intrauterine devices (IUDs). Uptake of implants increased by 6.5 percentage points, from 2.4% at baseline to 8.9% by endline, and uptake of IUDs increased by 2.1 percentage points, from 2.2% to 4.3%. By the endline survey, 37.7% of clients using pills and injectables at baseline had switched to LARCs. Contraceptive use among the poorest and poor wealth quintiles increased by 20.5 and 21.5 percentage points, respectively, from baseline to endline. Various myths and misconceptions reported about family planning methods declined significantly between baseline and endline. Conclusion: Training, commodity security, multiple service delivery models, and demand promotion were the cornerstones of a successful approach to reach the urban poor in Nairobi with LARCs. PMID:27540124

  3. An integral projection model with YY-males and application to evaluating grass carp control

    USGS Publications Warehouse

    Erickson, Richard A.; Eager, Eric A.; Brey, Marybeth; Hansen, Michael J.; Kocovsky, Patrick

    2017-01-01

    Invasive fish species disrupt ecosystems and cause economic damage. Several methods have been discussed to control populations of invasive fish, including the release of YY-males: fish with two male (Y) chromosomes, in contrast to normal XY males. When YY-males mate, they produce only male (XY) offspring. This decreases the female proportion of the population and can, in theory, eradicate local populations by biasing the sex ratio. YY-males have been used as a population control tool for brook trout in montane streams and lakes in Idaho, USA, and the method has been discussed for grass carp in Lake Erie, North America. We developed an integral projection model for grass carp to evaluate the use of YY-males as a control method for populations in this lake. Using the YY-male control method alone, we found that large numbers of YY-males would need to be released annually to control the species; specifically, these numbers were of the same order of magnitude as the baseline adult population (e.g., 1,000 YY-males released annually for 20 years to control a baseline adult population of 2,500 grass carp). Such levels may not be reasonable or attainable for fisheries managers, given the impacts of YY-males on aquatic vegetation and other constraints of natural resource management.
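
    The mechanism is easy to see in a toy discrete-time model. This sketch illustrates only the sex-ratio bias, not the authors' integral projection model; all rates and starting numbers are hypothetical:

        # Toy sex-ratio model: stocked YY-males sire only XY sons,
        # shrinking the female fraction of recruits over time.
        def project(females, males, stock_yy, years,
                    survival=0.8, recruits_per_f=0.5):
            yy = 0.0
            for _ in range(years):
                yy = yy * survival + stock_yy      # stocked fish accumulate
                p_yy = yy / (males + yy)           # share of matings by YY sires
                births = females * recruits_per_f
                f_new = births * 0.5 * (1 - p_yy)  # YY sires -> no daughters
                m_new = births - f_new
                females = females * survival + f_new
                males = males * survival + m_new
            return females, males, yy

        # Baseline adult population of 2,500, stocking 1,000 YY-males/yr
        print(project(females=1250, males=1250, stock_yy=1000, years=20))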

  4. Threshold regression to accommodate a censored covariate.

    PubMed

    Qian, Jing; Chiou, Sy Han; Maye, Jacqueline E; Atem, Folefac; Johnson, Keith A; Betensky, Rebecca A

    2018-06-22

    In several common study designs, regression modeling is complicated by the presence of censored covariates. Examples of such covariates include maternal age of onset of dementia that may be right censored in an Alzheimer's amyloid imaging study of healthy subjects, metabolite measurements that are subject to limit of detection censoring in a case-control study of cardiovascular disease, and progressive biomarkers whose baseline values are of interest, but are measured post-baseline in longitudinal neuropsychological studies of Alzheimer's disease. We propose threshold regression approaches for linear regression models with a covariate that is subject to random censoring. Threshold regression methods allow for immediate testing of the significance of the effect of a censored covariate. In addition, they provide for unbiased estimation of the regression coefficient of the censored covariate. We derive the asymptotic properties of the resulting estimators under mild regularity conditions. Simulations demonstrate that the proposed estimators have good finite-sample performance, and often offer improved efficiency over existing methods. We also derive a principled method for selection of the threshold. We illustrate the approach in application to an Alzheimer's disease study that investigated brain amyloid levels in older individuals, as measured through positron emission tomography scans, as a function of maternal age of dementia onset, with adjustment for other covariates. We have developed an R package, censCov, for implementation of our method, available at CRAN. © 2018, The International Biometric Society.

  5. Simultaneous Determination of Ursodeoxycholic Acid and Chenodeoxycholic Acid in Pharmaceutical Dosage Form by HPLC-UV Detection.

    PubMed

    Khairy, Mostafa A; Mansour, Fotouh R

    2017-01-01

    A reversed-phase HPLC method was developed for the simultaneous determination of ursodeoxycholic acid (UDCA) and its epimer chenodeoxycholic acid (CDCA) in synthetic mixtures and in tablet dosage form. The proposed method uses a C18 column and a mobile phase consisting of an acetonitrile-phosphate buffer mixture (pH 2.3, 100 mM; 50 + 50, v/v) at a flow rate of 2.0 mL/min with UV detection at 210 nm. The method was validated according to the International Conference on Harmonization guidelines; linearity, range, accuracy, precision, robustness, and system suitability were studied. The LOD and LOQ were also calculated and found to be 1.23 and 3.73 μg/mL for UDCA and 0.83 and 2.52 μg/mL for CDCA, respectively. The method was adapted for UHPLC, with baseline separation achieved in <2.5 min. The assay results for Ursomix tablets by the developed method were statistically compared with those obtained by the reference method using t- and F-tests, and no significant differences were observed.
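
    For reference, LOD and LOQ figures of this kind are conventionally derived from the calibration curve as 3.3σ/S and 10σ/S (ICH Q2), with S the slope and σ the residual standard deviation. A short sketch with hypothetical calibration data (not the study's actual values):

        import numpy as np

        # Hypothetical UDCA calibration data (ug/mL vs. peak area)
        conc = np.array([5, 10, 20, 40, 80], dtype=float)
        area = np.array([51, 103, 198, 405, 798], dtype=float)

        slope, intercept = np.polyfit(conc, area, 1)
        resid = area - (slope * conc + intercept)
        sigma = np.sqrt(np.sum(resid**2) / (len(conc) - 2))  # residual SD

        print(f"LOD = {3.3 * sigma / slope:.2f} ug/mL, "
              f"LOQ = {10 * sigma / slope:.2f} ug/mL")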

  6. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground in engineering practice, but the industry standard is still to use deterministic safety-margin approaches for dimensioning components and qualitative methods for managing product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on predicting failure probabilities for mechanical components and on optimizing reliability through life-cycle cost analysis. This paper reviews the existing methods, harnesses their best features, and simplifies the process so that it is applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. It adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life-cycle cost optimization. The main focus is on mechanical failure modes, for which well-developed predictive methods exist; however, the same framework can be applied to any failure mode for which a predictive model can be developed.
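
    A minimal sketch of the core load-resistance Monte Carlo step described above; the distributions and parameters are illustrative, not taken from the paper:

        import numpy as np

        # Failure occurs when stress (load) exceeds strength (resistance).
        rng = np.random.default_rng(42)
        n = 1_000_000
        strength = rng.lognormal(mean=np.log(400), sigma=0.08, size=n)  # MPa
        stress = rng.lognormal(mean=np.log(300), sigma=0.15, size=n)    # MPa

        p_fail = np.mean(stress > strength)
        print(f"Estimated failure probability: {p_fail:.2e}")

    In a life-cycle cost setting, the estimated probability would then be multiplied by the cost of a failure event and traded off against the cost of a stronger design.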

  7. Improving the dictionary lookup approach for disease normalization using enhanced dictionary and query expansion

    PubMed Central

    Jonnagaddala, Jitendra; Jue, Toni Rose; Chang, Nai-Wen; Dai, Hong-Jie

    2016-01-01

    The rapidly increasing biomedical literature calls for automatic approaches to recognizing and normalizing disease mentions, in order to increase the precision and effectiveness of disease-based information retrieval. A variety of methods have been proposed to deal with the problem of disease named entity recognition and normalization. Among the proposed methods, conditional random fields (CRFs) and dictionary lookup are widely used for named entity recognition and normalization, respectively. We herein developed a CRF-based model for automated recognition of disease mentions and studied the effect of various techniques on improving the normalization results based on the dictionary lookup approach. The dataset from the BioCreative V CDR track was used to report the performance of the developed normalization methods and compare them with other existing dictionary-lookup-based normalization methods. The best configuration achieved an F-measure of 0.77 for disease normalization, outperforming the best dictionary-lookup-based baseline method studied in this work by an F-measure of 0.13. Database URL: https://github.com/TCRNBioinformatics/DiseaseExtract PMID:27504009
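
    A toy version of the dictionary lookup baseline with one query-expansion step (abbreviation expansion) illustrates the approach; the lexicon entries below are illustrative stand-ins, not the actual lexicon used in the CDR task:

        # Toy dictionary-lookup normalizer with simple query expansion.
        lexicon = {
            "hepatocellular carcinoma": "MESH:D006528",
            "liver cancer": "MESH:D006528",
            "hypertension": "MESH:D006973",
        }
        abbreviations = {"hcc": "hepatocellular carcinoma"}

        def normalize(mention):
            m = mention.lower().strip()
            m = abbreviations.get(m, m)   # query expansion: expand abbreviations
            return lexicon.get(m)         # exact lookup; None if out of lexicon

        print(normalize("HCC"))           # -> MESH:D006528
        print(normalize("Hypertension"))  # -> MESH:D006973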

  8. Study of metallic structural design concepts for an arrow wing supersonic cruise configuration

    NASA Technical Reports Server (NTRS)

    Turner, M. J.; Grande, D. L.

    1977-01-01

    A structural design study was made, to assess the relative merits of various metallic structural concepts and materials for an advanced supersonic aircraft cruising at Mach 2.7. Preliminary studies were made to ensure compliance of the configuration with general design criteria, integrate the propulsion system with the airframe, select structural concepts and materials, and define an efficient structural arrangement. An advanced computerized structural design system was used, in conjunction with a relatively large, complex finite element model, for detailed analysis and sizing of structural members to satisfy strength and flutter criteria. A baseline aircraft design was developed for assessment of current technology. Criteria, analysis methods, and results are presented. The effect on design methods of using the computerized structural design system was appraised, and recommendations are presented concerning further development of design tools, development of materials and structural concepts, and research on basic technology.

  9. Potential of spark ignition engine for increased fuel efficiency. Final report, January-October 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, T. Jr.; Cole, D.; Bolt, J.A.

    The objective of this study was to assess the potential of the spark ignition engine to deliver maximum fuel efficiency at the 1981 Statutory Emission Standards in the 1983-1984 timeframe and beyond that to 1990. Based on the results of an extensive literature search, manufacturers' known product plans, and the fuel economies of 1978 engines as a baseline, proposed methods of attaining fuel economy while complying with the future standards were ascertained. Methods of engine control optimization and engine design optimization, as well as methods of varying engine parameters, were considered. The potential improvements in fuel economy associated with these methods, singly and in combination, were determined and are expressed as percentage changes from the fuel economy of the baseline engines. A summary of the principal conclusions is presented, followed by a description of the engine baseline reference, analysis and projection of fuel economy improvements, and a preliminary assessment of the impact of fuel economy benefits on manufacturing cost.

  10. Very long baseline interferometry applied to polar motion, relativity, and geodesy. Ph. D. thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, C.

    1978-01-01

    The causes and effects of diurnal polar motion are described. An algorithm was developed for modeling the effects on very long baseline interferometry observables. A selection was made between two three-station networks for monitoring polar motion. The effects of scheduling and the number of sources observed on estimated baseline errors are discussed. New hardware and software techniques in very long baseline interferometry are described.

  11. Evaluation of Earth's Geobiosphere Emergy Baseline and the Emergy of Crustal Cycling

    NASA Astrophysics Data System (ADS)

    De Vilbiss, Chris

    This dissertation quantitatively analyzed the exergy supporting the nucleosynthesis of the heavy isotopes, Earth's geobiosphere, and its crustal cycling. Exergy is that portion of energy that is available to drive work. The exergy sources that drive the geobiosphere are sunlight, Earth's rotational kinetic energy and relic heat, and radionuclides in Earth's interior. These four exergy sources were used to compute the Earth's geobiosphere emergy baseline (GEB), expressed in a single unit, solar equivalent joules (seJ). The seJ of radionuclides was computed by determining the quantity of gravitational exergy dissipated in the production of both sunlight and heavy isotopes. This new method of computing solar equivalences was also applied to Earth's relic heat and rotational energy. The equivalent quantities of these four exergy sources were then added to express the GEB. This new baseline was compared with several other contemporary GEB methods. The new GEB is modeled as the support to Earth's crustal cycle and ultimately to the economical mineral deposits used in the US economy. Given the average annual cycling of crustal material and its average composition, specific emergies were calculated to express the average emergy per mass of particular crustal minerals. Chemical exergies of the minerals were used to develop transformities and specific emergies of minerals at heightened concentrations, i.e., minable concentrations. The effect of these new mineral emergy values was examined using the US economy as an example. The final result is an 83% reduction in the emergy of limestone, a 91% reduction in the aggregated emergy of all other minerals, and a 23% reduction in the emergy of the US economy. This dissertation explored three unique and innovative methods to compute the emergy of Earth's exergy sources and resources. First was a method for computing the emergy of radionuclides. Second was a method for evaluating Earth's relic heat and the dissipation of gravitational exergy using forward computation. Third was a more consistent method for computing the emergy value of crustal minerals based on their chemical exergy.

  12. Participation in the Analysis of the Far-Infrared/Submillmeter Interferometer

    NASA Technical Reports Server (NTRS)

    Lorenzini, Enrico C.

    2005-01-01

    We have contributed to the development of the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS) by analyzing various aspects of the tethers that connect the spacecraft of this space interferometer. We have focused our analysis on key topics as follows: (a) helping in the configuration selection; (b) computing the system eigenfrequencies as a function of baseline length; (c) developing techniques and conceptual designs of devices for damping the tether oscillations; (d) carrying out numerical simulations of the tethered formation to assess the effects of environmental perturbations on the baseline length variation; (e) developing control laws for reconfiguring the baseline length; (f) devising control laws for fast retargeting of the interferometer at moderate baseline lengths; (g) estimating the survivability to micrometeoroid impacts of a tether at L2; and (h) developing a conceptual design of a high-strength and survivable tether. The work was conducted for NASA Goddard Space Flight Center under Grant NNG04GQ21G with William Danchi as technical monitor.

  13. Development and validation of a stability-indicating RP-HPLC method for the estimation of related compounds of guaifenesin in pharmaceutical dosage forms

    PubMed Central

    Reddy, Sunil Pingili; Babu, K. Sudhakar; Kumar, Navneet; Sekhar, Y. V. V. Sasi

    2011-01-01

    Aim and background: A stability-indicating gradient reverse-phase liquid chromatographic (RP-LC) method was developed for the quantitative determination of related substances of guaifenesin in pharmaceutical formulations. Materials and methods: Baseline separation of guaifenesin and all impurities was achieved on a Waters Symmetry C18 column (150 mm × 4.6 mm, 5 μm particle size) with gradient elution. Mobile phase A contained a mixture of 0.02 M KH2PO4 (pH 3.2) and methanol in the ratio of 90:10 v/v, while mobile phase B contained 0.02 M KH2PO4 (pH 3.2) and methanol in the ratio of 10:90 v/v. The flow rate of the mobile phase was 0.8 ml/min, with a column temperature of 25°C and a detection wavelength of 273 nm. Results: Guaifenesin was subjected to the stress conditions of oxidative, acid, base, hydrolytic, thermal, and photolytic degradation. Conclusion: The developed method was validated as per ICH guidelines with respect to specificity, linearity, limit of detection and quantification, accuracy, precision, and robustness. PMID:23781462

  14. Development of an external ceramic insulation for the space shuttle orbiter. Part 2: Optimization

    NASA Technical Reports Server (NTRS)

    Tanzilli, R. A. (Editor)

    1973-01-01

    The basic insulation improvement study concentrated on evaluating variables that could yield significant near-term gains in the mechanical behavior and insulation effectiveness of the baseline system. The approaches undertaken included: evaluation of small-diameter fibers, optimization of binder and slurry characteristics, evaluation of techniques for controlling fiber orientation, optimization of the firing cycle, and evaluation of methods for improving insulation efficiency. A detailed discussion of these basic insulation improvement studies is presented.

  15. Machine Learning-based Individual Assessment of Cortical Atrophy Pattern in Alzheimer's Disease Spectrum: Development of the Classifier and Longitudinal Evaluation.

    PubMed

    Lee, Jin San; Kim, Changsoo; Shin, Jeong-Hyeon; Cho, Hanna; Shin, Dae-Seock; Kim, Nakyoung; Kim, Hee Jin; Kim, Yeshin; Lockhart, Samuel N; Na, Duk L; Seo, Sang Won; Seong, Joon-Kyung

    2018-03-07

    To develop a new method for measuring Alzheimer's disease (AD)-specific similarity of cortical atrophy patterns at the individual level, we employed a machine learning algorithm. A total of 869 cognitively normal (CN) individuals and 473 patients with probable AD dementia who underwent high-resolution 3T brain MRI were included. We propose a machine learning-based method for measuring the similarity of an individual subject's cortical atrophy pattern with that of a representative AD patient cohort. In addition, we validated this similarity measure in two longitudinal cohorts consisting of 79 patients with amnestic mild cognitive impairment (aMCI) and 27 patients with probable AD dementia. The surface-based morphometry classifier for discriminating AD from CN showed sensitivity and specificity values of 87.1% and 93.3%, respectively. In the longitudinal validation study, aMCI converters had higher atrophy similarity than non-converters at both baseline (p < 0.001) and first-year visits (p < 0.001). Similarly, AD patients with faster decline had higher atrophy similarity than slower decliners at baseline (p = 0.042), first-year (p = 0.028), and third-year visits (p = 0.027). The AD-specific atrophy similarity measure is a novel approach for the prediction of dementia risk and for the evaluation of AD trajectories at the individual subject level.
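
    The paper's similarity measure comes from a trained classifier; as a conceptual stand-in, one can score a subject's regional atrophy vector against a group-level AD template, for example by cosine similarity. All data below are simulated and purely illustrative:

        import numpy as np

        # Hypothetical group-level AD template: mean AD-vs-CN cortical
        # thickness change over 68 regions, plus one simulated subject.
        rng = np.random.default_rng(1)
        ad_template = rng.normal(-0.5, 0.2, size=68)
        subject = ad_template + rng.normal(0, 0.3, size=68)

        def atrophy_similarity(x, template):
            # Cosine similarity between a subject's atrophy pattern
            # and the AD template; higher = more AD-like.
            return float(x @ template
                         / (np.linalg.norm(x) * np.linalg.norm(template)))

        print(f"AD-likeness score: {atrophy_similarity(subject, ad_template):.2f}")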

  16. Remote sensing of land use and water quality relationships - Wisconsin shore, Lake Michigan

    NASA Technical Reports Server (NTRS)

    Haugen, R. K.; Marlar, T. L.

    1976-01-01

    This investigation assessed the utility of remote sensing techniques in the study of land use-water quality relationships in an east central Wisconsin test area. The following types of aerial imagery were evaluated: high altitude (60,000 ft) color, color infrared, multispectral black and white, and thermal; low altitude (less than 5000 ft) color infrared, multispectral black and white, thermal, and passive microwave. A non-imaging hand-held four-band radiometer was evaluated for utility in providing data on suspended sediment concentrations. Land use analysis included the development of mapping and quantification methods to obtain baseline data for comparison to water quality variables. Suspended sediment loads in streams, determined from water samples, were related to land use differences and soil types in three major watersheds. A multiple correlation coefficient R of 0.85 was obtained for the relationship between the 0.6-0.7 micrometer incident and reflected radiation data from the hand-held radiometer and concurrent ground measurements of suspended solids in streams. Applications of the methods and baseline data developed in this investigation include: mapping and quantification of land use; input to watershed runoff models; estimation of effects of land use changes on stream sedimentation; and remote sensing of suspended sediment content of streams. High altitude color infrared imagery was found to be the most acceptable remote sensing technique for the mapping and measurement of land use types.

  17. Serial Echocardiographic Characteristics, Novel Biomarkers and Cachexia Development in Patients with Stable Chronic Heart Failure.

    PubMed

    Gaggin, Hanna K; Belcher, Arianna M; Gandhi, Parul U; Ibrahim, Nasrien E; Januzzi, James L

    2016-12-01

    Little is known regarding objective predictors of cachexia affecting patients with heart failure (HF). We studied 108 stable chronic systolic HF patients with serial echocardiography and biomarker measurements over 10 months. Cachexia was defined as weight loss ≥5% from baseline or a final BMI <20 kg/m2; 18.5% of patients developed cachexia. While there were no significant differences in baseline or serial echocardiographic measures in those developing cachexia, we found significant differences in baseline amino-terminal pro-B-type natriuretic peptide (NT-proBNP), highly sensitive troponin I, sST2, and endothelin-1. Baseline log NT-proBNP (hazard ratio (HR) = 2.57, p = 0.004) and edema (HR = 3.36, p = 0.04) were predictive of cachexia in an adjusted analysis. When serial measurement of biomarkers was considered, only percent time with NT-proBNP ≥1000 pg/mL was predictive of cachexia. Thus, a close association exists between baseline and serial measurements of NT-proBNP and HF cachexia.

  18. Supply and demand: application of Lean Six Sigma methods to improve drug round efficiency and release nursing time.

    PubMed

    Kieran, Maríosa; Cleary, Mary; De Brún, Aoife; Igoe, Aileen

    2017-10-01

    To improve efficiency, reduce interruptions and reduce the time taken to complete oral drug rounds, Lean Six Sigma methods were applied using a pre- and post-intervention design. The setting was a 20-bed orthopaedic ward in a large teaching hospital in Ireland, with pharmacy, nursing and quality improvement staff participating. A multifaceted intervention was designed that included changes in processes related to drug trolley organization and drug supply planning. A communications campaign aimed at reducing interruptions during nurse-led drug rounds was also developed and implemented. Outcome measures were the average number of interruptions, the average drug round time and the variation in time taken to complete the drug round. At baseline, the oral drug round took an average of 125 min. Following application of Lean Six Sigma methods, the average drug round time decreased by 51 min. The average number of interruptions per drug round fell from 12 at baseline to 11 following the intervention, with a 75% reduction in drug supply interruptions. Lean Six Sigma methodology was successfully employed to reduce interruptions and the time taken to complete the oral drug round. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  19. Application of probabilistic fiber-tracking method of MR imaging to measure impact of cranial irradiation on structural brain connectivity in children treated for medulloblastoma

    NASA Astrophysics Data System (ADS)

    Duncan, Elizabeth C.; Reddick, Wilburn E.; Glass, John O.; Hyun, Jung Won; Ji, Qing; Li, Yimei; Gajjar, Amar

    2016-03-01

    We applied a modified probabilistic fiber-tracking method for the extraction of fiber pathways to quantify decreased white matter integrity as a surrogate of structural loss in connectivity due to cranial radiation therapy (CRT) as treatment for pediatric medulloblastoma. Thirty subjects were examined (n=8 average-risk, n=22 high-risk); the groups did not differ significantly in age at examination. The pathway analysis created a structural connectome focused on sub-networks within the central executive network (CEN) for comparison between baseline and post-CRT scans and between standard-dose and high-dose CRT. A pairwise comparison of connectivity between baseline and post-CRT scans showed that irradiation had a significant detrimental impact on white matter integrity (decreased fractional anisotropy (FA) and decreased axial diffusivity (AX)) in most of the CEN sub-networks. Group comparisons of the change in connectivity revealed that patients receiving high-dose CRT experienced significant AX decreases in all sub-networks, while patients receiving standard-dose CRT had relatively stable AX measures across time. This study of pediatric patients with medulloblastoma demonstrated the utility of this method for identifying specific sub-networks within the developing brain affected by CRT.

  20. Informal caregivers and detection of delirium in postacute care: a correlational study of the confusion assessment method (CAM), confusion assessment method-family assessment method (CAM-FAM) and DSM-IV criteria.

    PubMed

    Flanagan, Nina M; Spencer, Gale

    2016-09-01

    Delirium is a common, serious and potentially life-threatening syndrome affecting older adults. The syndrome continues to be under-recognised and under-treated by healthcare professionals across all care settings. Older adults who develop delirium have poorer outcomes, higher mortality and higher care costs. The purposes of this study were to correlate the confusion assessment method-family assessment method (CAM-FAM) and the confusion assessment method (CAM) in the detection of delirium in postacute care, to correlate the CAM-FAM with Diagnostic and Statistical Manual of Mental Disorders, Text Revision (DSM-IV-TR) criteria, to determine the prevalence of delirium in postacute care elders and to describe the relationship between level of cognitive impairment and delirium in the postacute care setting. Descriptive studies determined the strengths of the relationships among the CAM, CAM-FAM, Mini-Cog and DSM-IV-TR criteria in the detection of delirium in the postacute care setting. The prevalence of delirium in this study was 35%. The CAM-FAM correlated highly with the CAM and DSM-IV-TR criteria for detecting delirium in older adults in the postacute care setting. Persons with cognitive impairment were more likely to develop delirium, and family members recognised symptoms of delirium when asked. The CAM-FAM is a valid tool for the detection of delirium. Implications for practice: delirium is disturbing for patients and caregivers, and family members frequently want to provide information about their loved one. The use of the CAM-FAM and CAM can give a more definitive determination of baseline status, and frequent observations using both instruments may lead to better recognition of delirium and implementation of interventions to prevent lasting sequelae. © 2015 John Wiley & Sons Ltd.

  1. Blade pitch optimization methods for vertical-axis wind turbines

    NASA Astrophysics Data System (ADS)

    Kozak, Peter

    Vertical-axis wind turbines (VAWTs) offer an inherently simpler design than horizontal-axis machines, while their lower blade speed mitigates safety and noise concerns, potentially allowing installation closer to populated and ecologically sensitive areas. While VAWTs do offer significant operational advantages, development has been hampered by the difficulty of modeling the aerodynamics involved, further complicated by the rotating geometry. This thesis presents results from a simulation of a baseline VAWT computed using Star-CCM+, a commercial finite-volume method (FVM) code. VAWT aerodynamics are shown to be dominated at low tip-speed ratios by dynamic stall phenomena and at high tip-speed ratios by wake-blade interactions. Several optimization techniques were developed for the adjustment of blade pitch based on finite-volume simulations and streamtube models. The effectiveness of the optimization procedure is evaluated, and a basic architecture for a feedback control system is proposed. Implementation of variable blade pitch is shown to increase a baseline turbine's power output by 40%-100%, depending on the optimization technique, improving the turbine's competitiveness with a commercially available horizontal-axis turbine.

  2. Self port scanning tool: providing a more secure computing environment through the use of proactive port scanning

    NASA Technical Reports Server (NTRS)

    Kocher, Joshua E; Gilliam, David P.

    2005-01-01

    Secure computing is a necessity in the hostile environment that the internet has become. Protection from nefarious individuals and organizations requires a solution that is more a methodology than a one-time fix. One aspect of this methodology is knowing which network ports a computer has open to the world; these network ports are essentially the doorways from the internet into the computer. An assessment method that uses the nmap software to scan ports has been developed to aid system administrators (SAs) with analysis of open ports on their system(s). Additionally, baselines have been developed for several operating systems so that SAs can compare their open ports to a baseline for a given operating system. Further, the tool is deployed on a website where SAs and users can request a port scan of their computer; the results are then emailed to the requestor. This tool aids users, SAs, and security professionals by providing an overall picture of what services are running, which ports are open, potential trojan programs or backdoors, and which ports can be closed.
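
    The baseline-comparison idea can be sketched without nmap itself, using plain TCP connect probes and a set difference against a stored per-OS baseline. The host, port range, and baseline set here are hypothetical:

        import socket

        def scan(host, ports, timeout=0.5):
            # Probe each port with a TCP connect; 0 from connect_ex = open.
            open_ports = set()
            for port in ports:
                with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                    s.settimeout(timeout)
                    if s.connect_ex((host, port)) == 0:
                        open_ports.add(port)
            return open_ports

        baseline = {22, 80}  # hypothetical baseline for this operating system
        found = scan("127.0.0.1", range(1, 1025))
        print("Unexpected open ports:", sorted(found - baseline))
        print("Expected but closed:", sorted(baseline - found))

    Ports in the first set are candidates for closure or for investigation as possible trojans or backdoors, which is exactly the comparison the tool automates.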

  3. Aquatic toxicity of petroleum products and dispersant agents ...

    EPA Pesticide Factsheets

    The U.S. EPA Office of Research and Development has developed baseline data on the ecotoxicity of selected petroleum products and several chemical dispersants as part of its oil spills research program. Two diluted bitumens (dilbits) from the Alberta Tar Sands were tested for acute and chronic toxicity to standard freshwater and marine organisms, given their spill potential during shipment within the United States. Separately, two reference crude oils representing a range of characteristics, and their mixtures with four representative dispersants, were tested for acute and chronic toxicity to marine organisms in support of Subpart J of the U.S. National Contingency Plan. Water accommodated fractions (WAFs) of oil were prepared using traditional slow-stir methods, and toxicity tests generally followed U.S. EPA standard effluent testing guidelines. WAFs were characterized for petroleum hydrocarbons, including alkyl PAH homologs. The results of these studies will assist the U.S. EPA in assessing toxicity data for unconventional oils (dilbits) and in establishing baseline toxicity data for selected crude oils and dispersants in support of planning and response activities.

  4. Cold flow testing of the Space Shuttle Main Engine alternate turbopump development high pressure fuel turbine model

    NASA Technical Reports Server (NTRS)

    Gaddis, Stephen W.; Hudson, Susan T.; Johnson, P. D.

    1992-01-01

    NASA's Marshall Space Flight Center has established a cold airflow turbine test program to experimentally determine the performance of liquid rocket engine turbopump drive turbines. Testing of the SSME alternate turbopump development (ATD) fuel turbine was conducted for back-to-back comparisons with the baseline SSME fuel turbine results obtained in the first quarter of 1991. Turbine performance, Reynolds number effects, and turbine diagnostics, such as stage reactions and exit swirl angles, were investigated at the turbine design point and at off-design conditions. The test data showed that the ATD fuel turbine test article was approximately 1.4 percent higher in efficiency and flowed 5.3 percent more than the baseline fuel turbine test article. This paper describes the method and results used to validate the ATD fuel turbine aerodynamic design. The results are being used to determine the ATD high pressure fuel turbopump (HPFTP) turbine performance over its operating range, anchor the SSME ATD steady-state performance model, and validate various prediction and design analyses.

  5. Impact of Dose to the Bladder Trigone on Long-Term Urinary Function After High-Dose Intensity Modulated Radiation Therapy for Localized Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghadjar, Pirus; Zelefsky, Michael J.; Spratt, Daniel E.

    2014-02-01

    Purpose: To determine the potential association between genitourinary (GU) toxicity and planning dose–volume parameters for GU pelvic structures after high-dose intensity modulated radiation therapy in localized prostate cancer patients. Methods and Materials: A total of 268 patients who underwent intensity modulated radiation therapy to a prescribed dose of 86.4 Gy in 48 fractions during June 2004-December 2008 were evaluated with the International Prostate Symptom Score (IPSS) questionnaire. Dose–volume histograms of the whole bladder, bladder wall, urethra, and bladder trigone were analyzed. The primary endpoint for GU toxicity was an IPSS sum increase ≥10 points over baseline. Univariate and multivariate analyses were done by the Kaplan-Meier method and Cox proportional hazard models, respectively. Results: Median follow-up was 5 years (range, 3-7.7 years). Thirty-nine patients experienced an IPSS sum increase ≥10 during follow-up; 84% remained event free at 5 years. After univariate analysis, lower baseline IPSS sum (P=.006), the V90 of the trigone (P=.006), and the maximal dose to the trigone (P=.003) were significantly associated with an IPSS sum increase ≥10. After multivariate analysis, lower baseline IPSS sum (P=.009) and increased maximal dose to the trigone (P=.005) remained significantly associated. Seventy-two patients had both a lower baseline IPSS sum and a higher maximal dose to the trigone and were defined as high risk, and 68 patients had both a higher baseline IPSS sum and a lower maximal dose to the trigone and were defined as low risk for development of an IPSS sum increase ≥10. Twenty-one of 72 high-risk patients (29%) and 5 of 68 low-risk patients (7%) experienced an IPSS sum increase ≥10 (P=.001; odds ratio 5.19). Conclusions: Hot spots in the bladder trigone were significantly associated with relevant changes in IPSS during follow-up. Reduction of radiation dose to the lower bladder, and specifically the bladder trigone, appears to be associated with a reduction in late GU toxicity.

  6. Development of Metric for Measuring the Impact of RD&D Funding on GTO's Geothermal Exploration Goals (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenne, S.; Young, K. R.; Thorsteinsson, H.

    The Department of Energy's Geothermal Technologies Office (GTO) provides RD&D funding for geothermal exploration technologies with the goal of lowering the risks and costs of geothermal development and exploration. In 2012, NREL was tasked with developing a metric to measure the impacts of this RD&D funding on the cost and time required for exploration activities. The development of this metric included collecting cost and time data for exploration techniques, creating a baseline suite of exploration techniques to which future exploration cost and time improvements could be compared, and developing an online tool for graphically showing potential project impacts (all available at http://en.openei.org/wiki/Gateway:Geothermal). This conference paper describes the methodology used to define the baseline exploration suite of techniques, as well as the approach used to create the cost and time data set that populates the baseline. The resulting product, an online tool for measuring impact, and the aggregated cost and time data are available on the OpenEI website for public access (http://en.openei.org).

  7. Preventing Peer Violence Against Children: Methods and Baseline Data of a Cluster Randomized Controlled Trial in Pakistan

    PubMed Central

    McFarlane, Judith; Karmaliani, Rozina; Maqbool Ahmed Khuwaja, Hussain; Gulzar, Saleema; Somani, Rozina; Saeed Ali, Tazeen; Somani, Yasmeen H; Shehzad Bhamani, Shireen; Krone, Ryan D; Paulson, Rene M; Muhammad, Atta; Jewkes, Rachel

    2017-01-01

    ABSTRACT Background: Violence against and among children is a global public health problem that annually affects 50% of youth worldwide, with major impacts on child development, education, and health, including increased probability of major causes of morbidity and mortality in adulthood. It is also associated with later experience and perpetration of violence against women. The aim of this article is to describe the intervention, study design, methods, and baseline findings of a cluster randomized controlled trial underway in Pakistan to evaluate a school-based play intervention aimed at reducing peer violence and enhancing mental health. Methods: A cluster randomized controlled trial is being conducted with boys and girls in grade 6 in 40 schools in Hyderabad, Pakistan, over a period of 2 years. The Multidimensional Peer-Victimization and Peer Perpetration Scales and the Children's Depression Inventory 2 (CDI 2) are being used to measure the primary outcomes, while investigator-derived scales are being used to assess domestic violence within the family. This report details the intervention, field logistics, and the ethical and fidelity management issues involved in testing the program's impact on school-age youth in a volatile and politically unstable country. Baseline Results: A total of 1,752 school-age youth were enrolled and interviewed at baseline. Over the preceding 4 weeks, 94% of the boys and 85% of the girls reported 1 or more occurrences of victimization, and 85% of the boys and 66% of the girls reported 1 or more acts of perpetration. Boys reported more depression than girls, as well as higher negative mood and self-esteem scores and more interpersonal and emotional problems. Interpretation: Globally, the prevalence of youth violence perpetration and victimization is high and associated with poor physical and emotional health. Applying a randomized controlled design to evaluate a peer violence prevention program that is built on a firm infrastructure and ready for scale-up and sustainability will make an important contribution to identifying evidence-informed interventions that can reduce youth victimization and perpetration. PMID:28351880

  8. Application of electrochemical methods in corrosion and battery research

    NASA Astrophysics Data System (ADS)

    Sun, Zhaoli

    Various electrochemical methods have been applied in the development of corrosion protection methods for ammonia/water absorption heat pumps and in the evaluation of the stability of metallic materials in Li-ion battery electrolyte. Rare earth metal salts (REMSs) and organic inhibitors were evaluated for corrosion protection of mild steel in a baseline solution of 5 wt% NH3 + 0.2 wt% NaOH, to replace the conventionally used toxic chromate salt inhibitors. Cerium nitrate provided corrosion inhibition efficiency at least comparable to that of dichromate in the baseline solution at 100°C. The cerium (IV) oxide formed on mild steel through the cerating process exhibited increasing corrosion protection with prolonged exposure time in the hot baseline solution. The optimum cerating process was found to be cerating in a solution of 2.3 g/L CeCl3 + 4.4 wt% H2O2 + appropriate additives for 20 minutes at pH 2.2 and room temperature, with 30 minutes of solution aging prior to use, followed by sealing in 10% sodium (meta)silicate or sodium molybdate at 50°C for 30 minutes. Yttrium salts provided less corrosion protection for mild steel in the baseline solution than cerium salts. Glycerophosphate was found to be a promising chromate-free organic inhibitor for mild steel; however, its thermostability in hot ammonia/water solutions has not yet been confirmed. The stability of six metallic materials used in Li-ion batteries was evaluated in 1M lithium hexafluorophosphate (LiPF6) dissolved in a 1:1 volume mixture of ethylene carbonate and diethyl carbonate at 37°C in a dry-box. Aluminum is the most stable material, while copper is active at anodic potentials and susceptible to localized and galvanic corrosion. The higher the concentration of the alloying elements Al and/or V in a titanium alloy, the higher the stability of the alloy in the battery electrolyte. 90Pt-10Ir can cause decomposition of the electrolyte, resulting in a low stable potential window.

  9. Study protocol title: a prospective cohort study of low back pain

    PubMed Central

    2013-01-01

    Background Few prospective cohort studies of workplace low back pain (LBP) with quantified job physical exposure have been performed; there are few prospective epidemiological studies of occupational risk factors for LBP, and reported data generally include few adjustments for personal and psychosocial factors. Methods/design A multi-center prospective cohort study has been initiated to quantify risk factors for LBP and potentially develop improved methods for designing and analyzing jobs. Because of the subjectivity of LBP, six measures of LBP are captured: 1) any LBP, 2) LBP ≥ 5/10 pain rating, 3) LBP with medication use, 4) LBP with healthcare provider visits, 5) LBP necessitating modified work duties and 6) LBP with lost work time. Workers have thus far been enrolled from 30 different employment settings in 4 diverse US states and performed widely varying work. At baseline, workers undergo laptop-administered questionnaires, structured interviews, and two standardized physical examinations to ascertain demographics, medical history, psychosocial factors, hobbies and physical activities, and current musculoskeletal disorders. All workers' jobs are individually measured for physical factors and are videotaped. Workers are followed monthly for the development of low back pain. Changes in jobs necessitate re-measurement and re-videotaping of job physical factors. The lifetime cumulative incidence of low back pain will also include those with a past history of low back pain; incident cases will exclude prevalent cases at baseline. Statistical methods planned include survival analyses and logistic regression. Discussion Data analysis of this prospective cohort study of low back pain is underway; over 800 workers have been enrolled to date. PMID:23497211

  10. Emergent HIV-1 Drug Resistance Mutations Were Not Present at Low-Frequency at Baseline in Non-Nucleoside Reverse Transcriptase Inhibitor-Treated Subjects in the STaR Study

    PubMed Central

    Porter, Danielle P.; Daeumer, Martin; Thielen, Alexander; Chang, Silvia; Martin, Ross; Cohen, Cal; Miller, Michael D.; White, Kirsten L.

    2015-01-01

    At Week 96 of the Single-Tablet Regimen (STaR) study, more treatment-naïve subjects that received rilpivirine/emtricitabine/tenofovir DF (RPV/FTC/TDF) developed resistance mutations compared to those treated with efavirenz (EFV)/FTC/TDF by population sequencing. Furthermore, more RPV/FTC/TDF-treated subjects with baseline HIV-1 RNA >100,000 copies/mL developed resistance compared to subjects with baseline HIV-1 RNA ≤100,000 copies/mL. Here, deep sequencing was utilized to assess the presence of pre-existing low-frequency variants in subjects with and without resistance development in the STaR study. Deep sequencing (Illumina MiSeq) was performed on baseline and virologic failure samples for all subjects analyzed for resistance by population sequencing during the clinical study (n = 33), as well as baseline samples from control subjects with virologic response (n = 118). Primary NRTI or NNRTI drug resistance mutations present at low frequency (≥2% to 20%) were detected in 6.6% of baseline samples by deep sequencing, all of which occurred in control subjects. Deep sequencing results were generally consistent with population sequencing but detected additional primary NNRTI and NRTI resistance mutations at virologic failure in seven samples. HIV-1 drug resistance mutations emerging while on RPV/FTC/TDF or EFV/FTC/TDF treatment were not present at low frequency at baseline in the STaR study. PMID:26690199

  11. Emergent HIV-1 Drug Resistance Mutations Were Not Present at Low-Frequency at Baseline in Non-Nucleoside Reverse Transcriptase Inhibitor-Treated Subjects in the STaR Study.

    PubMed

    Porter, Danielle P; Daeumer, Martin; Thielen, Alexander; Chang, Silvia; Martin, Ross; Cohen, Cal; Miller, Michael D; White, Kirsten L

    2015-12-07

    At Week 96 of the Single-Tablet Regimen (STaR) study, more treatment-naïve subjects that received rilpivirine/emtricitabine/tenofovir DF (RPV/FTC/TDF) developed resistance mutations compared to those treated with efavirenz (EFV)/FTC/TDF by population sequencing. Furthermore, more RPV/FTC/TDF-treated subjects with baseline HIV-1 RNA >100,000 copies/mL developed resistance compared to subjects with baseline HIV-1 RNA ≤100,000 copies/mL. Here, deep sequencing was utilized to assess the presence of pre-existing low-frequency variants in subjects with and without resistance development in the STaR study. Deep sequencing (Illumina MiSeq) was performed on baseline and virologic failure samples for all subjects analyzed for resistance by population sequencing during the clinical study (n = 33), as well as baseline samples from control subjects with virologic response (n = 118). Primary NRTI or NNRTI drug resistance mutations present at low frequency (≥2% to 20%) were detected in 6.6% of baseline samples by deep sequencing, all of which occurred in control subjects. Deep sequencing results were generally consistent with population sequencing but detected additional primary NNRTI and NRTI resistance mutations at virologic failure in seven samples. HIV-1 drug resistance mutations emerging while on RPV/FTC/TDF or EFV/FTC/TDF treatment were not present at low frequency at baseline in the STaR study.

  12. Unlocking Sensitivity for Visibility-based Estimators of the 21 cm Reionization Power Spectrum

    NASA Astrophysics Data System (ADS)

    Zhang, Yunfan Gerry; Liu, Adrian; Parsons, Aaron R.

    2018-01-01

    Radio interferometers designed to measure the cosmological 21 cm power spectrum require high sensitivity. Several modern low-frequency interferometers feature drift-scan antennas placed on a regular grid to maximize the number of instantaneously coherent (redundant) measurements. However, even for such maximum-redundancy arrays, significant sensitivity comes through partial coherence between baselines. Current visibility-based power-spectrum pipelines, though shown to ease control of systematics, lack the ability to make use of this partial redundancy. We introduce a method to leverage partial redundancy in such power-spectrum pipelines for drift-scan arrays. Our method cross-multiplies baseline pairs at a time lag and quantifies the sensitivity contributions of each pair of baselines. Using the configurations and beams of the 128-element Donald C. Backer Precision Array for Probing the Epoch of Reionization (PAPER-128) and staged deployments of the Hydrogen Epoch of Reionization Array, we illustrate how our method applies to different arrays and predict the sensitivity improvements associated with pairing partially coherent baselines. As the number of antennas increases, we find partial redundancy to be of increasing importance in unlocking the full sensitivity of upcoming arrays.
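
    The cross-multiplication-at-a-lag idea can be illustrated with a toy signal model: two partially coherent drift-scan baselines see the same sky stream offset in time, so the equal-time cross power is decorrelated while the lagged product recovers it. The signal model, noise level, and lag below are illustrative only, not the PAPER or HERA pipeline:

        import numpy as np

        rng = np.random.default_rng(7)
        n_t, lag = 512, 16
        # Common complex "sky" stream seen by both baselines, offset by `lag`
        sky = rng.normal(size=n_t + lag) + 1j * rng.normal(size=n_t + lag)

        noise = lambda: 0.5 * (rng.normal(size=n_t) + 1j * rng.normal(size=n_t))
        vis_a = sky[:n_t] + noise()   # baseline A
        vis_b = sky[lag:] + noise()   # baseline B sees the sky `lag` samples early

        same_time = np.mean(vis_a * np.conj(vis_b)).real              # decorrelated
        lagged = np.mean(vis_a[lag:] * np.conj(vis_b[:-lag])).real    # re-aligned
        print(f"equal-time cross power: {same_time:.2f}, lagged: {lagged:.2f}")

    The lagged product recovers the coherent sky power (here about 2, the variance of the simulated sky), which is the sensitivity that equal-time pipelines leave on the table for partially redundant baseline pairs.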

  13. Detection of early changes in lung cell cytology by flow-systems analysis techniques. [Rats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steinkamp, J.A.; Wilson, J.S.; Svitra, Z.V.

    1980-03-01

    Ongoing experiments designed to develop automated flow-analysis methods for assaying damage to pulmonary lavage cells in experimental animals exposed by inhalation to environmental pollutants are summarized. Pulmonary macrophages were characterized by their ability to phagocytize fluorescent polystyrene latex spheres. Lung cells, consisting primarily of macrophages and leukocytes, were analyzed for fluorescence (phagocytosis of spheres) and size using flow cytometric methods. Studies also concentrated on combining phagocytosis with other cellular parameters (DNA content, cell viability, and β-glucuronidase activity). As baseline studies are completed in normal animals, experimental animals will be exposed to gaseous and particulate environmental pollutants.

  14. Smoking is a risk factor for development of adult T-cell leukemia/lymphoma in Japanese human T-cell leukemia virus type-1 carriers.

    PubMed

    Kondo, Hisayoshi; Soda, Midori; Sawada, Norie; Inoue, Manami; Imaizumi, Yoshitaka; Miyazaki, Yasushi; Iwanaga, Masako; Tanaka, Yasuhito; Mizokami, Masashi; Tsugane, Shoichiro

    2016-09-01

    Adult T-cell leukemia/lymphoma (ATLL) is an aggressive hematological malignancy caused by human T-cell leukemia virus type-1 (HTLV-1); no effective methods have yet been identified to prevent development of ATLL in HTLV-1 carriers. This study examined the association between cigarette smoking and development of ATLL in a cohort of 1,332 Japanese HTLV-1 carriers aged 40-69 years, free of ATLL at baseline, from two HTLV-1-endemic areas of Japan. Cox proportional hazards models adjusted for sex, geographic area, age at baseline, and alcohol drinking were used to estimate the effect of cigarette smoking on ATLL development. Between 1993 and 2012, 25 new ATLL cases were identified among these subjects. The overall crude incidence rate for ATLL was 1.08 per 1,000 person-years among HTLV-1 carriers and was higher among male carriers than among female carriers (2.21 vs. 0.74). The risk of ATLL development increased significantly with increasing numbers of cigarettes smoked per day (hazard ratio for every increment of 20 cigarettes, 2.03; 95% confidence interval (CI) 1.13-3.66 overall; 2.07 (95% CI 1.13-3.73) in male carriers). Cigarette smoking may influence ATLL development among HTLV-1 carriers in Japan.
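
    A hedged sketch of the kind of model reported, using the lifelines package on fabricated survival data with the published dose effect (HR ≈ 2 per 20 cigarettes/day) built into the simulation; none of these records are real study data:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(3)
        n = 500
        cigs = rng.choice([0, 10, 20, 40], size=n)
        male = rng.integers(0, 2, size=n)
        # Hazard rises with smoking dose (HR ~2.03 per 20 cigarettes/day)
        hazard = 0.002 * np.exp(np.log(2.03) * cigs / 20)
        time = rng.exponential(1 / hazard)
        event = time < 20                  # administrative censoring at 20 y
        df = pd.DataFrame({"time": np.minimum(time, 20),
                           "event": event.astype(int),
                           "cigs_per_day": cigs, "male": male})

        cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
        cph.print_summary()  # hazard ratios per covariate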

  15. Relationship of Baseline Hemoglobin Level with Serum Ferritin, Postphlebotomy Hemoglobin Changes, and Phlebotomy Requirements among HFE C282Y Homozygotes

    PubMed Central

    Mousavi, Seyed Ali; Mahmood, Faiza; Aandahl, Astrid; Knutsen, Teresa Risopatron; Llohn, Abid Hussain

    2015-01-01

    Objectives. We aimed to examine whether baseline hemoglobin levels in C282Y-homozygous patients are related to the degree of serum ferritin (SF) elevation and whether patients with different baseline hemoglobin have different phlebotomy requirements. Methods. A total of 196 patients (124 males and 72 females) who had undergone therapeutic phlebotomy and had SF and both pre- and posttreatment hemoglobin values were included in the study. Results. Bivariate correlation analysis suggested that baseline SF explains approximately 6 to 7% of the variation in baseline hemoglobin. The results also showed that males who had higher (≥150 g/L) baseline hemoglobin levels had a significantly greater reduction in their posttreatment hemoglobin despite requiring fewer phlebotomies to achieve iron depletion than those who had lower (<150 g/L) baseline hemoglobin, regardless of whether baseline SF was below or above 1000 µg/L. There were no significant differences between hemoglobin subgroups regarding baseline and treatment characteristics, except for transferrin saturation between male subgroups with SF above 1000 µg/L. Similar differences were observed when females with higher (≥138 g/L) baseline hemoglobin were compared with those with lower (<138 g/L) baseline hemoglobin. Conclusion. Dividing C282Y-homozygous patients into just two subgroups according to the degree of baseline SF elevation may obscure important subgroup variations. PMID:26380265

  16. Predicting Coronary Artery Aneurysms in Kawasaki Disease at a North American Center: An Assessment of Baseline z Scores.

    PubMed

    Son, Mary Beth F; Gauvreau, Kimberlee; Kim, Susan; Tang, Alexander; Dedeoglu, Fatma; Fulton, David R; Lo, Mindy S; Baker, Annette L; Sundel, Robert P; Newburger, Jane W

    2017-05-31

    Accurate risk prediction of coronary artery aneurysms (CAAs) in North American children with Kawasaki disease remains a clinical challenge. We sought to determine the predictive utility of baseline coronary dimensions adjusted for body surface area (z scores) for future CAAs in Kawasaki disease and explored the extent to which adding established Japanese risk scores to baseline coronary artery z scores improved discrimination for CAA development. We explored the relationships of CAA with baseline z scores; with the Kobayashi, Sano, Egami, and Harada risk scores; and with the combination of baseline z scores and risk scores. We defined CAA as a maximum z score (zMax) ≥2.5 of the left anterior descending or right coronary artery at 4 to 8 weeks of illness. Of 261 patients, 77 (29%) had a baseline zMax ≥2.0. CAAs occurred in 15 patients (6%) and were strongly associated with baseline zMax ≥2.0 versus <2.0 (12 [16%] versus 3 [2%], respectively; P<0.001). Baseline zMax ≥2.0 had a C statistic of 0.77, good sensitivity (80%), and excellent negative predictive value (98%). None of the risk scores alone had adequate discrimination. When high-risk status per the Japanese risk scores was added to models containing baseline zMax ≥2.0, none were significantly better than baseline zMax ≥2.0 alone. In a North American center, baseline zMax ≥2.0 in children with Kawasaki disease demonstrated high predictive utility for later development of CAA. Future studies should validate the utility of our findings. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
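
    The reported operating characteristics can be reproduced directly from the counts in the abstract (12 of 77 patients with baseline zMax ≥2.0 and 3 of 184 with zMax <2.0 developed CAA):

        # 2x2 table implied by the abstract's counts
        tp, fn = 12, 3            # CAA with / without baseline zMax >= 2.0
        fp = 77 - tp              # zMax >= 2.0 but no CAA
        tn = (261 - 77) - fn      # zMax < 2.0 and no CAA

        sensitivity = tp / (tp + fn)   # 12/15 = 80%
        specificity = tn / (tn + fp)
        npv = tn / (tn + fn)           # 181/184 ~ 98%
        print(f"sensitivity={sensitivity:.0%}, "
              f"specificity={specificity:.0%}, npv={npv:.0%}")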

  17. Racial Differences in the Relationship Between Alcohol Consumption in Early Adulthood and Occupational Attainment at Midlife

    PubMed Central

    Malone, Patrick S.; Kertesz, Stefan G.; Wang, Yang; Costanzo, Philip R.

    2009-01-01

    Objectives. We assessed the relationship between alcohol consumption in young adulthood (ages 18–30 years) and occupational success 15 years later among Blacks and Whites. Methods. We analyzed data from the Coronary Artery Risk Development in Young Adults Study on employment status and occupational prestige at year 15 from baseline. The primary predictor was weekly alcohol use at baseline, after stratification by race and adjustment for socioeconomic factors. Results. We detected racial differences in the relationship between alcohol use in early adulthood and employment status at midlife. Blacks who were very heavy drinkers at baseline were more than 4 times as likely as Blacks who were occasional drinkers to be unemployed at year 15 (odds ratio [OR] = 4.34; 95% confidence interval [CI] = 2.22, 8.47). We found no statistically significant relationship among Whites. Occupational prestige at midlife was negatively related to very heavy drinking, but after adjustment for marital status, active coping, life stress, and educational attainment, this relationship was statistically significant only among Blacks. Conclusions. Heavy drinking during young adulthood was negatively associated with labor market success at midlife, especially among Blacks. PMID:19834006

  18. Decreasing Postanesthesia Care Unit to Floor Transfer Times to Facilitate Short Stay Total Joint Replacements.

    PubMed

    Sibia, Udai S; Grover, Jennifer; Turcotte, Justin J; Seanger, Michelle L; England, Kimberly A; King, Jennifer L; King, Paul J

    2018-04-01

    We describe a process for studying and improving baseline postanesthesia care unit (PACU)-to-floor transfer times after total joint replacements. Quality improvement project using lean methodology. Phase I of the investigational process involved collection of baseline data. Phase II involved developing targeted solutions to improve throughput. Phase III measured project sustainability. Phase I investigations revealed that patients spent an additional 62 minutes waiting in the PACU after being designated ready for transfer. Five to 16 telephone calls were needed between the PACU and the unit to facilitate each patient transfer. The most common reason for delay was unavailability of the unit nurse who was attending to another patient (58%). Phase II interventions resulted in transfer times decreasing to 13 minutes (79% reduction, P < .001). Phase III recorded sustained transfer times at 30 minutes, a net 52% reduction (P < .001) from baseline. Lean methodology resulted in the immediate decrease of PACU-to-floor transfer times by 79%, with a 52% sustained improvement. Our methods can also be used to improve efficiencies of care at other institutions. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, L.H.

    In its beginning, the U.S. Department of Energy (DOE) Office of Environmental Management (EM) viewed private industry as lacking adequate technology know-how to meet demands of hazardous and radioactive waste problems at the DOE's laboratories and nuclear weapons production facilities. In November 1989, EM's Office of Technology Development (recently renamed the Office of Science and Technology) embarked on a bold program of developing and demonstrating "innovative" waste cleanup technologies that would be safer, faster, more effective, and less expensive than the "baseline" commercial methods. This program has engaged DOE sites, national laboratories, and universities to produce preferred solutions to the problems of handling and treating DOE wastes. More recently, much of this work has shifted to joint efforts with private industry partners to accelerate the use of newly developed technologies and to enhance existing commercial methods. To date, the total funding allocation to the Office of Science and Technology program has been about $2.8 billion. If the technology applications' projects of the EM Offices of Environmental Restoration and Waste Management are included, the total funding is closer to $4 billion. Yet, the environmental industry generally has not been very receptive to EM's innovative technology offerings. And, essentially the same can be said for DOE sites. According to the U.S. General Accounting Office in an August 1994 report, "Although DOE has spent a substantial amount to develop waste cleanup technologies, little new technology finds its way into the agency's cleanup actions". The DOE Baseline Environmental Management Report estimated cleanups of DOE's Cold War legacy of wastes to require the considerable cost of $226 billion over a period of 75 years.

  20. A review of calibrated blood oxygenation level-dependent (BOLD) methods for the measurement of task-induced changes in brain oxygen metabolism

    PubMed Central

    Blockley, Nicholas P.; Griffeth, Valerie E. M.; Simon, Aaron B.; Buxton, Richard B.

    2013-01-01

    The dynamics of the blood oxygenation level-dependent (BOLD) response are dependent on changes in cerebral blood flow, cerebral blood volume and the cerebral metabolic rate of oxygen consumption. Furthermore, the amplitude of the response is dependent on the baseline physiological state, defined by the haematocrit, oxygen extraction fraction and cerebral blood volume. As a result of this complex dependence, the accurate interpretation of BOLD data and robust intersubject comparisons when the baseline physiology is varied are difficult. The calibrated BOLD technique was developed to address these issues. However, the methodology is complex and its full promise has not yet been realised. In this review, the theoretical underpinnings of calibrated BOLD, and issues regarding this theory that are still to be resolved, are discussed. Important aspects of practical implementation are reviewed and reported applications of this methodology are presented. PMID:22945365
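
    For context, most calibrated BOLD methods build on the Davis model, which ties the measured BOLD signal change to fractional changes in cerebral blood flow and oxygen metabolism; the form and typical exponents below are standard literature values, not results from this review:

    ```latex
    \frac{\Delta S}{S_0} = M\left(1 - f^{\,\alpha-\beta}\, r^{\,\beta}\right),
    \qquad f = \frac{\mathrm{CBF}}{\mathrm{CBF}_0}, \quad
    r = \frac{\mathrm{CMRO}_2}{\mathrm{CMRO}_{2,0}}
    ```

    Here M is the calibration parameter that encodes the baseline physiological state discussed above, with commonly assumed exponents α ≈ 0.38 and β ≈ 1.5; estimating M (e.g., via a hypercapnia experiment) is the central practical difficulty of the technique.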

  1. An absolute calibration method of an ethyl alcohol biosensor based on wavelength-modulated differential photothermal radiometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yi Jun; Mandelis, Andreas, E-mail: mandelis@mie.utoronto.ca; Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5S 3G9

    In this work, laser-based wavelength-modulated differential photothermal radiometry (WM-DPTR) is applied to develop a non-invasive in-vehicle alcohol biosensor. WM-DPTR features unprecedented ethanol-specificity and sensitivity by suppressing baseline variations through a differential measurement near the peak and baseline of the mid-infrared ethanol absorption spectrum. Biosensor signal calibration curves are obtained from WM-DPTR theory and from measurements in human blood serum and ethanol solutions diffused from skin. The results demonstrate that the WM-DPTR-based calibrated alcohol biosensor can achieve high precision and accuracy for the ethanol concentration range of 0-100 mg/dl. The high-performance alcohol biosensor can be incorporated into ignition interlocks that could be fitted as a universal accessory in vehicles in an effort to reduce incidents of drinking and driving.

  2. Experiential acceptance, motivation for recovery, and treatment outcome in eating disorders

    PubMed Central

    Espel, Hallie M.; Goldstein, Stephanie P.; Manasse, Stephanie M.; Juarascio, Adrienne S.

    2016-01-01

    Purpose This study sought to test whether the relationship between experiential acceptance (EA) and treatment outcome among eating disorder (ED) patients was mediated by motivation. Methods Upon admission to a residential ED treatment facility, female patients completed measures of EA, motivation, and baseline ED symptom severity (covariate); symptom severity was reassessed at discharge. Results Higher levels of baseline EA predicted significantly greater symptom reduction during treatment. Moreover, results from bootstrapped mediation analyses indicated that the relationship between EA and treatment outcome was partially mediated by motivation: increased EA was associated with greater motivation to give up ED behaviors at the beginning of treatment, and this led to greater symptom reduction from admission to discharge. Conclusions Motivation appears to be one mechanism by which EA facilitates improved treatment outcomes in EDs. Further development of interventions that promote EA as a means for improving motivation and subsequent ED treatment response may be warranted. PMID:26511501
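
    The bootstrapped mediation analysis mentioned above can be sketched generically. The snippet below estimates the indirect (a·b) effect of experiential acceptance on symptom reduction through motivation, with a percentile bootstrap confidence interval; variable names are illustrative and the baseline-severity covariate is omitted for brevity:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def indirect_effect(x, m, y):
        """a*b indirect effect: x -> m (path a), m -> y adjusting for x (path b)."""
        a = np.polyfit(x, m, 1)[0]
        X = np.column_stack([np.ones_like(x), x, m])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return a * coef[2]

    def bootstrap_ci(x, m, y, n_boot=5000, alpha=0.05):
        n = len(x)
        stats = np.empty(n_boot)
        for i in range(n_boot):
            idx = rng.integers(0, n, n)
            stats[i] = indirect_effect(x[idx], m[idx], y[idx])
        return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    ```

    A bootstrap interval excluding zero would support partial mediation of the kind reported.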

  3. Electricity End Uses, Energy Efficiency, and Distributed Energy Resources Baseline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Lisa; Wei, Max; Morrow, William

    This report was developed by a team of analysts at Lawrence Berkeley National Laboratory, with Argonne National Laboratory contributing the transportation section, and is a DOE EPSA product and part of a series of “baseline” reports intended to inform the second installment of the Quadrennial Energy Review (QER 1.2). QER 1.2 provides a comprehensive review of the nation’s electricity system, covering the current state and key trends related to the electricity system, including generation, transmission, distribution, grid operations and planning, and end use. The baseline reports provide an overview of elements of the electricity system. This report focuses on end uses, electricity consumption, electric energy efficiency, distributed energy resources (DERs) (such as demand response, distributed generation, and distributed storage), and evaluation, measurement, and verification (EM&V) methods for energy efficiency and DERs.

  4. A method for estimating the contribution of seed potatoes, machinery and soil tare in field infestations with potato cyst nematodes on a national scale.

    PubMed

    Goeminne, M; Demeulemeester, K; Viaene, N

    2011-01-01

    In order to make a cost-benefit analysis for the management of the potato cyst nematodes Globodera rostochiensis and G. pallida, we developed a method to estimate the relative importance of three basic distribution channels of potato cyst nematodes: seed potatoes, machinery and soil tare. The baseline is determined by the area planted with potatoes, the area infested with potato cysts, the proportion of resistant potato cultivars and the distribution of cysts through the different channels. This quantification forms a basis for the evaluation of the effects of different control measures for potato cyst nematode on a national scale. The method can be useful as an example for application in other countries.

  5. Development of a general baseline toxicity QSAR model for the fish embryo acute toxicity test.

    PubMed

    Klüver, Nils; Vogs, Carolina; Altenburger, Rolf; Escher, Beate I; Scholz, Stefan

    2016-12-01

    Fish embryos have become a popular model in ecotoxicology and toxicology. The fish embryo acute toxicity test (FET) with the zebrafish embryo was recently adopted by the OECD as technical guideline TG 236, and a large database of concentrations causing 50% lethality (LC50) is available in the literature. Quantitative Structure-Activity Relationships (QSARs) of baseline toxicity (also called narcosis) are helpful to estimate the minimum toxicity of chemicals to be tested and to identify excess toxicity in existing data sets. Here, we analyzed an existing fish embryo toxicity database and established a QSAR for fish embryo LC50 using chemicals that were independently classified to act according to the non-specific mode of action of baseline toxicity. The octanol-water partition coefficient Kow is commonly applied to discriminate between non-polar and polar narcotics. Replacing the Kow by the liposome-water partition coefficient Klipw yielded a common QSAR for polar and non-polar baseline toxicants. This baseline toxicity QSAR was applied to compare the final mode of action (MOA) assignment of 132 chemicals. Further, we included the analysis of internal lethal concentrations (ILC50) and chemical activities (La50) as complementary approaches to evaluate the robustness of the FET baseline toxicity. The analysis of the FET dataset revealed that specifically acting and reactive chemicals converged towards the baseline toxicity QSAR with increasing hydrophobicity. The developed FET baseline toxicity QSAR can be used to identify specifically acting or reactive compounds by determination of the toxic ratio and, in combination with appropriate endpoints, to infer the MOA for chemicals. Copyright © 2016 Elsevier Ltd. All rights reserved.
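
    To illustrate the toxic-ratio screening described above: a baseline-toxicity QSAR of the form log(1/LC50) = s·log Klipw + c predicts the minimum (narcosis-level) toxicity, and excess toxicity is flagged when the observed LC50 falls well below that prediction. The slope, intercept, and threshold below are placeholders, not the fitted values from this paper:

    ```python
    # Hypothetical coefficients for log(1/LC50 [mol/L]) = S * log_Klipw + C
    S, C = 0.85, 1.2

    def baseline_lc50(log_klipw):
        """Predicted baseline (narcosis) LC50 in mol/L."""
        return 10 ** (-(S * log_klipw + C))

    def toxic_ratio(log_klipw, observed_lc50):
        """TR >> 1 (often TR > 10) suggests a specific or reactive MOA."""
        return baseline_lc50(log_klipw) / observed_lc50

    print(toxic_ratio(3.0, observed_lc50=1e-5))  # ~18: likely excess toxicity
    ```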

  6. Layover and shadow detection based on distributed spaceborne single-baseline InSAR

    NASA Astrophysics Data System (ADS)

    Huanxin, Zou; Bin, Cai; Changzhou, Fan; Yun, Ren

    2014-03-01

    Distributed spaceborne single-baseline InSAR is an effective technique for obtaining high-quality digital elevation models. Layover and shadow are ubiquitous phenomena in SAR images because of the geometric relations of SAR imaging. In the signal processing of single-baseline InSAR, the phase singularity of layover and shadow fields makes the phase difficult to filter and unwrap. This paper analyzes the geometric and signal models of the layover and shadow fields. Based on the interferometric signal autocorrelation matrix, it proposes a signal-number estimation method based on information theoretic criteria to distinguish layover and shadow from normal InSAR fields. The effectiveness and practicality of the proposed method are validated by simulation experiments and theoretical analysis.
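
    The abstract does not name the specific information-theoretic criterion; a common choice for estimating the number of signals from the eigenvalues of a sample covariance (autocorrelation) matrix is the Wax-Kailath MDL estimator, sketched here. Layover pixels superimpose several scatterers and so yield more than one detected signal, while shadow pixels yield essentially none:

    ```python
    import numpy as np

    def mdl_signal_count(eigvals, n_snapshots):
        """Wax-Kailath MDL estimate of the number of signals, given the
        (positive) eigenvalues of the sample covariance matrix."""
        lam = np.sort(np.asarray(eigvals))[::-1]
        p = len(lam)
        mdl = np.empty(p)
        for k in range(p):
            tail = lam[k:]                       # the p-k smallest eigenvalues
            geo = np.exp(np.mean(np.log(tail)))  # geometric mean
            ari = np.mean(tail)                  # arithmetic mean
            mdl[k] = (-n_snapshots * (p - k) * np.log(geo / ari)
                      + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
        return int(np.argmin(mdl))
    ```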

  7. Aircraft Engine On-Line Diagnostics Through Dual-Channel Sensor Measurements: Development of a Baseline System

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2008-01-01

    In this paper, a baseline system which utilizes dual-channel sensor measurements for aircraft engine on-line diagnostics is developed. This system is composed of a linear on-board engine model (LOBEM) and fault detection and isolation (FDI) logic. The LOBEM provides the analytical third channel against which the dual-channel measurements are compared. When the discrepancy among the triplex channels exceeds a tolerance level, the FDI logic determines the cause of the discrepancy. Through this approach, the baseline system achieves the following objectives: (1) anomaly detection, (2) component fault detection, and (3) sensor fault detection and isolation. The performance of the baseline system is evaluated in a simulation environment using faults in sensors and components.
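
    A minimal sketch of the triplex-comparison idea described above; the tolerance handling and isolation rules here are illustrative, since the paper's actual FDI logic is not given in the abstract:

    ```python
    def classify(a, b, model, tol):
        """Compare duplex sensor channels a, b against the analytical value
        from the on-board engine model (the synthesized third channel)."""
        if abs(a - b) <= tol and abs(a - model) <= tol and abs(b - model) <= tol:
            return "no fault"
        if abs(a - b) > tol:
            # channels disagree with each other: a sensor fault; use the
            # model channel to isolate which sensor is off
            return "sensor A fault" if abs(a - model) > abs(b - model) else "sensor B fault"
        # channels agree but both deviate from the model: consistent with a
        # component fault (or engine anomaly) rather than a sensor fault
        return "component fault"
    ```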

  8. Activation and Coagulation Biomarkers are Independent Predictors for the Development of Opportunistic Disease in Patients with HIV Infection

    PubMed Central

    Rodger, Alison J; Fox, Zoe; Lundgren, Jens D; Kuller, Lew; Boesecke, Christoph; Gey, Daniela; Skoutelis, Athanassios; Goetz, Matthew Bidwell; Phillips, Andrew N

    2010-01-01

    Background Activation and coagulation biomarkers were measured within the SMART trial. Their associations with opportunistic disease (OD) in HIV-positive patients were examined. Methods Inflammatory (high-sensitivity C-reactive protein [hsCRP], interleukin-6 [IL-6], amyloid-A, and amyloid-P) and coagulation (D-dimer and prothrombin-fragment 1+2) markers were determined. Conditional logistic regression analyses were used to assess associations between these biomarkers and risk of OD. Results The 91 patients who developed an OD were matched to 182 controls. Patients with hsCRP≥5 μg/mL at baseline had 3.5-fold higher odds of OD (95% CI: 1.5-8.1) versus those with hsCRP<1 μg/mL, Ptrend=0.003, and patients with IL-6≥3 pg/mL at baseline had 2.4-fold higher odds of OD (95% CI: 1.0-5.4) versus those with IL-6<1.5 pg/mL, Ptrend=0.02. No other baseline biomarkers predicted development of an OD. Latest hsCRP (OR: 7.6 (95% CI: 2.0-28.5) for those with hsCRP≥5 μg/mL versus hsCRP<1 μg/mL, Ptrend=0.002), latest amyloid-A (OR: 3.8 (95% CI: 1.1-13.4) for those with amyloid-A ≥6 mg/L versus amyloid-A <2 mg/L, Ptrend=0.03) and latest IL-6 (OR: 2.4 (95% CI: 0.7-8.8) for those with IL-6≥3 pg/mL versus IL-6<1.5 pg/mL, Ptrend=0.04) were also associated with developing an OD. Conclusions Higher IL-6 and hsCRP independently predicted development of OD. These biomarkers could provide additional prognostic information for predicting risk of OD. PMID:19678756

  9. Multi-project baselines for potential clean development mechanism projects in the electricity sector in South Africa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkler, H.; Spalding-Fecher, R.; Sathaye, J.

    2002-06-26

    The United Nations Framework Convention on Climate Change (UNFCCC) aims to reduce emissions of greenhouse gases (GHGs) in order to "prevent dangerous anthropogenic interference with the climate system" and promote sustainable development. The Kyoto Protocol, which was adopted in 1997 and appears likely to be ratified by 2002 despite the US withdrawing, aims to provide means to achieve this objective. The Clean Development Mechanism (CDM) is one of three "flexibility mechanisms" in the Protocol, the other two being Joint Implementation (JI) and Emissions Trading (ET). These mechanisms allow flexibility for Annex I Parties (industrialized countries) to achieve reductions by extra-territorial as well as domestic activities. The underlying concept is that trade and transfer of credits will allow emissions reductions at least cost. Since the atmosphere is a global, well-mixed system, it does not matter where greenhouse gas emissions are reduced. The CDM allows Annex I Parties to meet part of their emissions reductions targets by investing in developing countries. CDM projects must also meet the sustainable development objectives of the developing country. Further criteria are that Parties must participate voluntarily, that emissions reductions are "real, measurable and long-term", and that they are additional to those that would have occurred anyway. The last requirement makes it essential to define an accurate baseline. The remaining parts of section 1 outline the theory of baselines, emphasizing the balance needed between environmental integrity and reducing transaction costs. Section 2 develops an approach to a multi-project baseline for the South African electricity sector, comparing primarily to near-future capacity, but also considering recent plants. Five potential CDM projects are briefly characterized in section 3, and compared to the baseline in section 4. Section 5 concludes with a discussion of options and choices for South Africa regarding electricity sector baselines.

  10. Physiological and Performance Measures for Baseline Concussion Assessment.

    PubMed

    Dobney, Danielle M; Thomas, Scott G; Taha, Tim; Keightley, Michelle

    2017-05-17

    Baseline testing is a common strategy for concussion assessment and management. Research continues to evaluate novel measures for their potential to improve baseline testing methods. The primary objectives were to: (1) determine the feasibility of including physiological, neuromuscular, and mood measures as part of a baseline concussion testing protocol; (2) describe typical values in a varsity athlete sample; and (3) estimate the influence of concussion history on these baseline measures. Prospective observational study. University Athletic Therapy Clinic. 100 varsity athletes. Frequency and domain measures of heart rate variability (HRV), blood pressure (BP), grip strength, Profile of Mood States and the Sport Concussion Assessment Tool-2. Physiological, neuromuscular performance and mood measures were feasible at baseline. Participants with a history of two or more previous concussions displayed significantly higher diastolic blood pressure. Females reported higher total mood disturbance compared to males. Physiological and neuromuscular performance measures are safe and feasible as baseline concussion assessment outcomes. History of concussion may have an influence on diastolic blood pressure.

  11. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: BASELINE QUESTIONNAIRE (HOUSEHOLD) (UA-D-7.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Baseline Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the "Border" study. Household and individual data were combined in a single Baseline Questionnaire data file. Key...

  12. Nonlinear assessment of cerebral autoregulation from spontaneous blood pressure and cerebral blood flow fluctuations.

    PubMed

    Hu, Kun; Peng, C K; Czosnyka, Marek; Zhao, Peng; Novak, Vera

    2008-03-01

    Cerebral autoregulation (CA) is an important mechanism responsible for maintaining relatively constant blood flow to the brain when cerebral perfusion pressure varies. Its assessment in nonacute cases has relied on quantification of the relationship between noninvasive beat-to-beat blood pressure (BP) and blood flow velocity (BFV). To overcome the nonstationary nature of physiological signals such as BP and BFV, a computational method called multimodal pressure-flow (MMPF) analysis was recently developed to study the nonlinear BP-BFV relationship during the Valsalva maneuver (VM). The present study aimed to determine (i) whether this method can estimate autoregulation from spontaneous BP and BFV fluctuations during baseline rest conditions; (ii) whether there is any difference between the MMPF measures of autoregulation based on intra-arterial BP (ABP) and based on cerebral perfusion pressure (CPP); and (iii) whether the MMPF method provides a reproducible and reliable measure for noninvasive assessment of autoregulation. To achieve these aims, we analyzed data from existing databases including: (i) ABP and BFV of 12 healthy control, 10 hypertensive, and 10 stroke subjects during baseline resting conditions and during the Valsalva maneuver, and (ii) ABP, CPP, and BFV of 30 patients with traumatic brain injury (TBI) who were paralyzed, sedated, and ventilated. We showed that autoregulation in healthy control subjects can be characterized by specific phase shifts between BP and BFV oscillations during the Valsalva maneuver, and that the BP-BFV phase shifts were reduced in hypertensive and stroke subjects (P < 0.01), indicating impaired autoregulation. Similar results were found during the baseline condition from spontaneous BP and BFV oscillations. The BP-BFV phase shifts obtained during baseline and during VM were highly correlated (R > 0.8, P < 0.0001), showing no statistical difference (paired t-test P > 0.47). In TBI patients there were strong correlations between phases of ABP and CPP oscillations (R = 0.99, P < 0.0001) and, thus, between ABP-BFV and CPP-BFV phase shifts (P < 0.0001, R = 0.76). By repeating the MMPF 4 times on data of TBI subjects, each time on a selected cycle of spontaneous BP and BFV oscillations, we showed that MMPF had better reproducibility than the traditional autoregulation index. These results indicate that the MMPF method, based on instantaneous phase relationships between cerebral blood flow velocity and peripheral blood pressure, performs better than the traditional standard method and can reliably assess cerebral autoregulation dynamics from ambulatory blood pressure and cerebral blood flow during supine rest conditions.
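
    MMPF first isolates the dominant BP and BFV oscillations (via empirical mode decomposition) and then compares their instantaneous phases. Assuming that isolation step has already been done, the phase-shift computation can be sketched with the analytic signal:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def bp_bfv_phase_shift(bp, bfv):
        """Mean instantaneous phase shift (radians) between isolated BP and
        BFV oscillations (e.g., single EMD modes), BFV relative to BP."""
        bp = bp - np.mean(bp)
        bfv = bfv - np.mean(bfv)
        phase_bp = np.unwrap(np.angle(hilbert(bp)))
        phase_bfv = np.unwrap(np.angle(hilbert(bfv)))
        return float(np.mean(phase_bfv - phase_bp))
    ```

    Per the results above, a phase shift near zero would point toward impaired autoregulation, whereas healthy subjects show larger BP-BFV phase shifts.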

  13. Nonlinear Assessment of Cerebral Autoregulation from Spontaneous Blood Pressure and Cerebral Blood Flow Fluctuations

    PubMed Central

    Peng, C. K.; Czosnyka, Marek; Zhao, Peng

    2009-01-01

    Cerebral autoregulation (CA) is an important mechanism responsible for maintaining relatively constant blood flow to the brain when cerebral perfusion pressure varies. Its assessment in nonacute cases has relied on quantification of the relationship between noninvasive beat-to-beat blood pressure (BP) and blood flow velocity (BFV). To overcome the nonstationary nature of physiological signals such as BP and BFV, a computational method called multimodal pressure-flow (MMPF) analysis was recently developed to study the nonlinear BP–BFV relationship during the Valsalva maneuver (VM). The present study aimed to determine (i) whether this method can estimate autoregulation from spontaneous BP and BFV fluctuations during baseline rest conditions; (ii) whether there is any difference between the MMPF measures of autoregulation based on intra-arterial BP (ABP) and based on cerebral perfusion pressure (CPP); and (iii) whether the MMPF method provides a reproducible and reliable measure for noninvasive assessment of autoregulation. To achieve these aims, we analyzed data from existing databases including: (i) ABP and BFV of 12 healthy control, 10 hypertensive, and 10 stroke subjects during baseline resting conditions and during the Valsalva maneuver, and (ii) ABP, CPP, and BFV of 30 patients with traumatic brain injury (TBI) who were paralyzed, sedated, and ventilated. We showed that autoregulation in healthy control subjects can be characterized by specific phase shifts between BP and BFV oscillations during the Valsalva maneuver, and that the BP–BFV phase shifts were reduced in hypertensive and stroke subjects (P < 0.01), indicating impaired autoregulation. Similar results were found during the baseline condition from spontaneous BP and BFV oscillations. The BP–BFV phase shifts obtained during baseline and during VM were highly correlated (R > 0.8, P < 0.0001), showing no statistical difference (paired t-test P > 0.47). In TBI patients there were strong correlations between phases of ABP and CPP oscillations (R = 0.99, P < 0.0001) and, thus, between ABP–BFV and CPP–BFV phase shifts (P < 0.0001, R = 0.76). By repeating the MMPF 4 times on data of TBI subjects, each time on a selected cycle of spontaneous BP and BFV oscillations, we showed that MMPF had better reproducibility than the traditional autoregulation index. These results indicate that the MMPF method, based on instantaneous phase relationships between cerebral blood flow velocity and peripheral blood pressure, performs better than the traditional standard method and can reliably assess cerebral autoregulation dynamics from ambulatory blood pressure and cerebral blood flow during supine rest conditions. PMID:18080758

  14. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE PAGES

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    2017-08-01

    Variability in the physical characteristics of feedstock has a significant effect on the reactor’s reliability and operating cost. Most of the models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality, and of the preprocessing operations required to meet biomass specifications, on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates harvesting, collection, transportation, and storage costs while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing biomass quality and the costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. In the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. In the second phase, the baseline IBSAL model is extended so that the costs of meeting specifications and/or penalties for failing to meet them are considered. The IBSAL-SimMOpt method is designed to optimize the quality characteristics of biomass, the cost of activities intended to improve feedstock quality, and the penalty cost. A case study based on 1916 farms in Ontario, Canada is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.
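
    At the core of any simulation-based multi-objective optimizer of this kind is the bookkeeping of non-dominated (Pareto) solutions across the competing objectives (e.g., logistics cost versus quality-penalty cost); a generic sketch, not specific to IBSAL-SimMOpt:

    ```python
    import numpy as np

    def non_dominated(costs):
        """Boolean mask of Pareto-optimal rows of an (n_solutions x
        n_objectives) array in which every objective is minimized."""
        n = costs.shape[0]
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            dominated = (np.all(costs <= costs[i], axis=1)
                         & np.any(costs < costs[i], axis=1))
            if dominated.any():
                keep[i] = False
        return keep
    ```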

  15. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    Variability in the physical characteristics of feedstock has a significant effect on the reactor’s reliability and operating cost. Most of the models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality, and of the preprocessing operations required to meet biomass specifications, on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates harvesting, collection, transportation, and storage costs while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing biomass quality and the costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. In the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. In the second phase, the baseline IBSAL model is extended so that the costs of meeting specifications and/or penalties for failing to meet them are considered. The IBSAL-SimMOpt method is designed to optimize the quality characteristics of biomass, the cost of activities intended to improve feedstock quality, and the penalty cost. A case study based on 1916 farms in Ontario, Canada is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  16. Development of a sensitive and rapid method for rifampicin impurity analysis using supercritical fluid chromatography.

    PubMed

    Li, Wei; Wang, Jun; Yan, Zheng-Yu

    2015-10-10

    A novel, simple, fast and efficient supercritical fluid chromatography (SFC) method was developed and compared with an RPLC method for the separation and determination of impurities in rifampicin. The separation was performed using a packed diol column and a mobile phase B (modifier) consisting of methanol with 0.1% ammonium formate (w/v) and 2% water (v/v). Overall satisfactory resolutions and peak shapes for rifampicin quinone (RQ), rifampicin (RF), rifamycin SV (RSV), rifampicin N-oxide (RNO) and 3-formylrifamycin SV (3-FR) were obtained by optimization of the chromatography system. With gradient elution of the mobile phase, all of the impurities and the active ingredient were separated within 4 min. Taking full advantage of the features of SFC (such as its particular selectivity, non-sloping baseline in gradient elution, and absence of injection solvent effects), the method was successfully used for the determination of impurities in rifampicin, with more impurity peaks detected, better resolution achieved and much less analysis time needed compared with conventional reversed-phase liquid chromatography (RPLC) methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Constructing Benchmark Databases and Protocols for Medical Image Analysis: Diabetic Retinopathy

    PubMed Central

    Kauppi, Tomi; Kämäräinen, Joni-Kristian; Kalesnykiene, Valentina; Sorri, Iiris; Uusitalo, Hannu; Kälviäinen, Heikki

    2013-01-01

    We address the performance evaluation practices for developing medical image analysis methods, in particular, how to establish and share databases of medical images with verified ground truth and solid evaluation protocols. Such databases support the development of better algorithms, execution of profound method comparisons, and, consequently, technology transfer from research laboratories to clinical practice. For this purpose, we propose a framework consisting of reusable methods and tools for the laborious task of constructing a benchmark database. We provide a software tool for medical image annotation helping to collect class label, spatial span, and expert's confidence on lesions and a method to appropriately combine the manual segmentations from multiple experts. The tool and all necessary functionality for method evaluation are provided as public software packages. As a case study, we utilized the framework and tools to establish the DiaRetDB1 V2.1 database for benchmarking diabetic retinopathy detection algorithms. The database contains a set of retinal images, ground truth based on information from multiple experts, and a baseline algorithm for the detection of retinopathy lesions. PMID:23956787

  18. Elevated serum uric acid increases risks for developing high LDL cholesterol and hypertriglyceridemia: A five-year cohort study in Japan.

    PubMed

    Kuwabara, Masanari; Borghi, Claudio; Cicero, Arrigo F G; Hisatome, Ichiro; Niwa, Koichiro; Ohno, Minoru; Johnson, Richard J; Lanaspa, Miguel A

    2018-06-15

    High serum uric acid (SUA) is associated with dyslipidemia, but whether hyperuricemia predicts an increase in serum low-density lipoprotein (LDL) cholesterol is unknown. This study evaluated whether an elevated SUA predicts the development of high LDL cholesterol as well as hypertriglyceridemia. This is a retrospective 5-year cohort study of 6476 healthy Japanese adults (age, 45.7 ± 10.1 years; 2,243 men) who underwent health examinations in 2004 and were reevaluated in 2009 at St. Luke's International Hospital, Tokyo, Japan. Subjects were included if, at their baseline examination, they did not have hypertension, diabetes mellitus, dyslipidemia, or chronic kidney disease and were not on medication for hyperuricemia and/or gout. The analysis was adjusted for age, body mass index (BMI), smoking and drinking habits, baseline estimated glomerular filtration rate (eGFR), baseline SUA and SUA change over the 5 years. High baseline SUA was an independent risk factor for developing high LDL cholesterol both in men (OR: 1.159 per 1 mg/dL increase, 95% CI: 1.009-1.331) and women (OR: 1.215, 95% CI: 1.061-1.390). Other risk factors included a higher baseline LDL cholesterol, higher BMI, and higher baseline eGFR (the latter two in women only). An increase in SUA over the 5 years was also an independent risk factor for developing high LDL cholesterol and hypertriglyceridemia, but not for low high-density lipoprotein (HDL) cholesterol. This is the first study to report that an elevated SUA increases the risk for developing high LDL cholesterol, as well as hypertriglyceridemia. This may shed light on the role of SUA in cardiovascular disease. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Efficient Wide Baseline Structure from Motion

    NASA Astrophysics Data System (ADS)

    Michelini, Mario; Mayer, Helmut

    2016-06-01

    This paper presents a Structure from Motion approach for complex unorganized image sets. To achieve high accuracy and robustness, image triplets are employed and (an approximate) camera calibration is assumed to be known. The focus lies on a complete linking of images even in the case of large image distortions, e.g., caused by wide baselines, as well as weak baselines. A method for embedding image descriptors into Hamming space is proposed for fast image similarity ranking. The latter is employed to limit the number of pairs to be matched by a wide baseline method. An iterative graph-based approach is proposed, formulating image linking as the search for a terminal Steiner minimum tree in a line graph. Finally, additional links are determined and employed to improve the accuracy of the pose estimation. By this means, loops in long image sequences are implicitly closed. The potential of the proposed approach is demonstrated by results for several complex image sets, also in comparison with VisualSFM.
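
    The payoff of embedding descriptors into Hamming space is that similarity ranking reduces to XOR-and-popcount over packed bit vectors; a minimal sketch (array layout illustrative):

    ```python
    import numpy as np

    def hamming_rank(query, database):
        """Rank database images by Hamming distance to the query.  Both are
        binary descriptors packed into uint8 arrays: query has shape
        (n_bytes,), database has shape (n_images, n_bytes)."""
        diff = np.bitwise_xor(database, query)           # broadcast XOR
        dists = np.unpackbits(diff, axis=1).sum(axis=1)  # per-row popcount
        return np.argsort(dists)                         # most similar first
    ```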

  20. [Discussion of scattering in THz time domain spectrum tests].

    PubMed

    Yan, Fang; Zhang, Zhao-hui; Zhao, Xiao-yan; Su, Hai-xia; Li, Zhi; Zhang, Han

    2014-06-01

    Using THz-TDS to extract the absorption spectrum of a sample is an important branch of various THz applications. THz radiation scattering from sample particles leads to a pronounced baseline that increases with frequency in the absorption spectrum. This baseline degrades measurement accuracy by obscuring the height and pattern of the spectrum, so it should be removed to eliminate the effects of scattering. In the present paper, we investigate the causes of such baselines, review scatter-mitigation methods, and summarize directions for future research. To validate the correctness of these methods, we designed a series of experiments comparing the computational accuracy of molar concentration. The results indicate that the computational accuracy of molar concentration can be improved, which can serve as the basis of quantitative analysis in further research. Finally, drawing on comprehensive experimental results, we present further research directions on the removal of scattering effects from THz absorption spectra.
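
    One simple instance of the baseline-removal methods reviewed above is to fit and subtract a slowly varying function of frequency; a sketch with a low-order polynomial (in practice the absorption peaks should be masked or iteratively down-weighted during the fit so they do not bias the baseline):

    ```python
    import numpy as np

    def remove_scatter_baseline(freq, absorbance, order=2):
        """Fit a low-order polynomial baseline to an absorption spectrum and
        subtract it; a stand-in for the scatter-mitigation methods reviewed."""
        coeffs = np.polyfit(freq, absorbance, order)
        return absorbance - np.polyval(coeffs, freq)
    ```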

  1. Finding the optimal shape of the leading-and-trailing car of a high-speed train using design-by-morphing

    NASA Astrophysics Data System (ADS)

    Oh, Sahuck; Jiang, Chung-Hsiang; Jiang, Chiyu; Marcus, Philip S.

    2017-10-01

    We present a new, general design method, called design-by-morphing, for an object whose performance is determined by its shape due to hydrodynamic, aerodynamic, structural, or thermal requirements. To illustrate the method, we design a new leading-and-trailing car of a train by morphing existing, baseline leading-and-trailing cars to minimize the drag. In design-by-morphing, the morphing is done by representing the shapes with polygonal meshes and spectrally with a truncated series of spherical harmonics. The optimal design is found by computing the optimal weights of each of the baseline shapes so that the morphed shape has minimum drag. As a result of the optimization, we found that, with only two baseline trains that mimic current low-drag high-speed trains, the drag of the optimal train is reduced by 8.04% with respect to the baseline train with the smaller drag. When we repeat the optimization adding a third baseline train that under-performs compared to the other two baseline trains, the drag of the new optimal train is reduced by 13.46%. This finding shows that bad examples of design are as useful as good examples in determining an optimal design. We show that design-by-morphing can be extended to many engineering problems in which the performance of an object depends on its shape.
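
    The morphing step itself reduces to a weighted combination of the baseline shapes' spherical-harmonic coefficient vectors, over which the drag objective is then optimized; a minimal sketch under the simple convention that the weights are normalized to sum to one (array names illustrative):

    ```python
    import numpy as np

    def morph(baseline_coeffs, weights):
        """Morphed shape from an (n_baselines x n_coeffs) array of
        spherical-harmonic coefficients and one weight per baseline."""
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()  # normalize the weights
        return np.tensordot(w, np.asarray(baseline_coeffs), axes=1)
    ```

    An outer optimizer (e.g., any derivative-free minimizer) would then search over the weights to minimize the drag computed for the morphed shape.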

  2. Finding the optimal shape of the leading-and-trailing car of a high-speed train using design-by-morphing

    NASA Astrophysics Data System (ADS)

    Oh, Sahuck; Jiang, Chung-Hsiang; Jiang, Chiyu; Marcus, Philip S.

    2018-07-01

    We present a new, general design method, called design-by-morphing, for an object whose performance is determined by its shape due to hydrodynamic, aerodynamic, structural, or thermal requirements. To illustrate the method, we design a new leading-and-trailing car of a train by morphing existing, baseline leading-and-trailing cars to minimize the drag. In design-by-morphing, the morphing is done by representing the shapes with polygonal meshes and spectrally with a truncated series of spherical harmonics. The optimal design is found by computing the optimal weights of each of the baseline shapes so that the morphed shape has minimum drag. As a result of the optimization, we found that, with only two baseline trains that mimic current low-drag high-speed trains, the drag of the optimal train is reduced by 8.04% with respect to the baseline train with the smaller drag. When we repeat the optimization adding a third baseline train that under-performs compared to the other two baseline trains, the drag of the new optimal train is reduced by 13.46%. This finding shows that bad examples of design are as useful as good examples in determining an optimal design. We show that design-by-morphing can be extended to many engineering problems in which the performance of an object depends on its shape.

  3. The prognostic utility of baseline alpha-fetoprotein for hepatocellular carcinoma patients.

    PubMed

    Silva, Jack P; Gorman, Richard A; Berger, Nicholas G; Tsai, Susan; Christians, Kathleen K; Clarke, Callisia N; Mogal, Harveshp; Gamblin, T Clark

    2017-12-01

    Alpha-fetoprotein (AFP) has a valuable role in postoperative surveillance for hepatocellular carcinoma (HCC) recurrence. The utility of pretreatment or baseline AFP remains controversial. The present study hypothesized that elevated baseline AFP levels are associated with worse overall survival in HCC patients. Adult HCC patients were identified using the National Cancer Database (2004-2013). Patients were stratified according to baseline AFP measurements into the following groups: Negative (<20), Borderline (20-199), Elevated (200-1999), and Highly Elevated (>2000). The primary outcome was overall survival (OS), which was analyzed by the log-rank test and graphed using the Kaplan-Meier method. Multivariate regression modeling was used to determine hazard ratios (HRs) for OS. Of 41,107 patients identified, 15,809 (33.6%) were Negative. Median overall survival was highest in the Negative group, followed by Borderline, Elevated, and Highly Elevated (28.7 vs 18.9 vs 8.8 vs 3.2 months; P < 0.001). On multivariate analysis, overall survival hazard ratios for the Borderline, Elevated, and Highly Elevated groups were 1.18 (P = 0.267), 1.94 (P < 0.001), and 1.77 (P = 0.007), respectively (reference: Negative). Baseline AFP independently predicted overall survival in HCC patients regardless of treatment plan. A baseline AFP value is a simple and effective way to assist in estimating expected survival for HCC patients. © 2017 Wiley Periodicals, Inc.

  4. Methodological Issues Surrounding the Use of Baseline Health-Related Quality of Life Data to Inform Trial-Based Economic Evaluations of Interventions Within Emergency and Critical Care Settings: A Systematic Literature Review.

    PubMed

    Dritsaki, Melina; Achana, Felix; Mason, James; Petrou, Stavros

    2017-05-01

    Trial-based cost-utility analyses require health-related quality of life data that generate utility values in order to express health outcomes in terms of quality-adjusted life years (QALYs). Assessments of baseline health-related quality of life are problematic where trial participants are incapacitated or critically ill at the time of randomisation. This review aims to identify and critique methods for handling non-availability of baseline health-related quality of life data in trial-based cost-utility analyses within emergency and critical illness settings. A systematic literature review was conducted, following PRISMA guidelines, to identify trial-based cost-utility analyses of interventions within emergency and critical care settings. Databases searched included the National Institute for Health Research (NIHR) Journals Library (1991-July 2016), the Cochrane Library (all years), the National Health Service (NHS) Economic Evaluation Database (all years), and Ovid MEDLINE/Embase (without time restriction). Strategies employed to handle non-availability of baseline health-related quality of life data in final QALY estimations were identified and critiqued. A total of 4224 published reports were screened, 19 of which met the study inclusion criteria (mean trial size 1670): 14 (74%) from the UK, four (21%) from other European countries and one (5%) from India. Twelve studies (63%) were based in emergency departments and seven (37%) in intensive care units. Only one study was able to elicit patient-reported health-related quality of life at baseline. To overcome the lack of baseline data when estimating QALYs, eight studies (42%) assigned a fixed utility weight corresponding to either death, an unconscious health state or a country-specific norm to patients at baseline, four (21%) ignored baseline utilities, three (16%) applied values from another study, one (5%) generated utility values via retrospective recall and one (5%) elicited utilities from experts. A preliminary exploration of these methods shows that incremental QALY estimation is unlikely to be biased if balanced trial allocation is achieved and subsequent collection of health-related quality of life data occurs at the earliest possible opportunity following commencement of treatment, followed by an adequate number of follow-up assessments. Trial-based cost-utility analyses within emergency and critical illness settings have applied different methods for QALY estimation, employing disparate assumptions about the health-related quality of life of patients at baseline. Where baseline measurement is not practical, measurement at the earliest opportunity following commencement of treatment should minimise bias in QALY estimation.
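
    Because QALYs are the area under the utility-time curve, the sensitivity to the assumed baseline utility is easy to see in code; a minimal sketch using the trapezoidal rule (values illustrative):

    ```python
    import numpy as np

    def qalys(times_years, utilities):
        """QALYs as the area under the utility-time curve."""
        return np.trapz(utilities, times_years)

    # Same follow-up data under two different baseline assumptions:
    times, utils = [0.25, 1.0], [0.60, 0.75]
    print(qalys([0.0] + times, [0.0] + utils))   # baseline = death/unconscious
    print(qalys([0.0] + times, [0.85] + utils))  # baseline = population norm
    ```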

  5. Methodological criteria for the assessment of moderators in systematic reviews of randomised controlled trials: a consensus study

    PubMed Central

    2011-01-01

    Background Current methodological guidelines provide advice about the assessment of sub-group analysis within RCTs, but do not specify explicit criteria for assessment. Our objective was to provide researchers with a set of criteria that will facilitate the grading of evidence for moderators, in systematic reviews. Method We developed a set of criteria from methodological manuscripts (n = 18) using a snowballing technique and electronic database searches. Criteria were reviewed by an international Delphi panel (n = 21), comprising authors who have published methodological papers in this area, and researchers who have been active in the study of sub-group analysis in RCTs. We used the Research ANd Development/University of California Los Angeles appropriateness method to assess consensus on the quantitative data. Free responses were coded for consensus and disagreement. In a subsequent round, additional criteria were extracted from the Cochrane Reviewers' Handbook, and the process was repeated. Results The recommendations are that meta-analysts report both confirmatory and exploratory findings for sub-group analyses. Confirmatory findings must only come from studies in which a specific theory/evidence-based a-priori statement is made. Exploratory findings may be used to inform future/subsequent trials. However, for inclusion in the meta-analysis of moderators, the following additional criteria should be applied to each study: baseline factors should be measured prior to randomisation, measurement of baseline factors should be of adequate reliability and validity, and a specific test of the interaction between baseline factors and interventions must be presented. Conclusions There is consensus from a group of 21 international experts that methodological criteria to assess moderators within systematic reviews of RCTs are both timely and necessary. The consensus from the experts resulted in five criteria divided into two groups when synthesising evidence: confirmatory findings to support hypotheses about moderators and exploratory findings to inform future research. These recommendations are discussed in reference to previous recommendations for evaluating and reporting moderator studies. PMID:21281501

  6. Assessing Historical Fish Community Composition Using Surveys, Historical Collection Data, and Species Distribution Models

    PubMed Central

    Labay, Ben; Cohen, Adam E.; Sissel, Blake; Hendrickson, Dean A.; Martin, F. Douglas; Sarkar, Sahotra

    2011-01-01

    Accurate establishment of baseline conditions is critical to successful management and habitat restoration. We demonstrate the ability to robustly estimate historical fish community composition and assess the current status of the urbanized Barton Creek watershed in central Texas, U.S.A. Fish species were surveyed in 2008 and the resulting data compared to three sources of fish occurrence information: (i) historical records from a museum specimen database and literature searches; (ii) a nearly identical survey conducted 15 years earlier; and (iii) a modeled historical community constructed with species distribution models (SDMs). This holistic approach, and especially the application of SDMs, allowed us to discover that the fish community in Barton Creek was more diverse than the historical data and survey methods alone indicated. Sixteen native species with high modeled probability of occurrence within the watershed were not found in the 2008 survey, seven of these were not found in either survey or in any of the historical collection records. Our approach allowed us to more rigorously establish the true baseline for the pre-development fish fauna and then to more accurately assess trends and develop hypotheses regarding factors driving current fish community composition to better inform management decisions and future restoration efforts. Smaller, urbanized freshwater systems, like Barton Creek, typically have a relatively poor historical biodiversity inventory coupled with long histories of alteration, and thus there is a propensity for land managers and researchers to apply inaccurate baseline standards. Our methods provide a way around that limitation by using SDMs derived from larger and richer biodiversity databases of a broader geographic scope. Broadly applied, we propose that this technique has potential to overcome limitations of popular bioassessment metrics (e.g., IBI) to become a versatile and robust management tool for determining status of freshwater biotic communities. PMID:21966438

  7. Projections of the Current and Future Disease Burden of Hepatitis C Virus Infection in Malaysia

    PubMed Central

    McDonald, Scott A.; Dahlui, Maznah; Mohamed, Rosmawati; Naning, Herlianna; Shabaruddin, Fatiha Hana; Kamarulzaman, Adeeba

    2015-01-01

    Background The prevalence of hepatitis C virus (HCV) infection in Malaysia has been estimated at 2.5% of the adult population. Our objective, satisfying one of the directives of the WHO Framework for Global Action on Viral Hepatitis, was to forecast the HCV disease burden in Malaysia using modelling methods. Methods An age-structured multi-state Markov model was developed to simulate the natural history of HCV infection. We tested three historical incidence scenarios that would give rise to the estimated prevalence in 2009, and calculated the incidence of cirrhosis, end-stage liver disease, and death, and disability-adjusted life-years (DALYs) under each scenario, to the year 2039. In the baseline scenario, current antiviral treatment levels were extended from 2014 to the end of the simulation period. To estimate the disease burden averted under current sustained virological response rates and treatment levels, the baseline scenario was compared to a counterfactual scenario in which no past or future treatment is assumed. Results In the baseline scenario, the projected disease burden for the year 2039 is 94,900 DALYs/year (95% credible interval (CrI): 77,100 to 124,500), with 2,002 (95% CrI: 1340 to 3040) and 540 (95% CrI: 251 to 1,030) individuals predicted to develop decompensated cirrhosis and hepatocellular carcinoma, respectively, in that year. Although current treatment practice is estimated to avert a cumulative total of 2,200 deaths from DC or HCC, a cumulative total of 63,900 HCV-related deaths is projected by 2039. Conclusions The HCV-related disease burden is already high and is forecast to rise steeply over the coming decades under current levels of antiviral treatment. Increased governmental resources to improve HCV screening and treatment rates and to reduce transmission are essential to address the high projected HCV disease burden in Malaysia. PMID:26042425
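
    The age structure and calibrated transition probabilities of the study's model are not given in the abstract, but the general form of a multi-state Markov cohort step is compact; all transition probabilities below are illustrative placeholders:

    ```python
    import numpy as np

    states = ["chronic HCV", "cirrhosis", "decompensated", "HCC", "dead"]
    P = np.array([  # annual transition probabilities (rows sum to 1)
        [0.96, 0.03, 0.00, 0.00, 0.01],
        [0.00, 0.91, 0.04, 0.03, 0.02],
        [0.00, 0.00, 0.80, 0.00, 0.20],
        [0.00, 0.00, 0.00, 0.57, 0.43],
        [0.00, 0.00, 0.00, 0.00, 1.00],
    ])
    cohort = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # all start chronically infected
    for year in range(25):
        cohort = cohort @ P
    print(dict(zip(states, cohort.round(3))))
    ```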

  8. A new prior for bayesian anomaly detection: application to biosurveillance.

    PubMed

    Shen, Y; Cooper, G F

    2010-01-01

    Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived based on the baseline prior, which is estimated using historical data. The evaluation reported here used semi-synthetic data to evaluate the detection performance of the proposed Bayesian method and a control chart method, which is a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior to avoid seasonal effects. When using longer baseline periods, the Bayesian method performed as well as the control chart method. The time complexity of the Bayesian algorithm is linear in the number of observed events being monitored, due to a novel, closed-form derivation that is introduced in the paper. This paper introduces a novel prior probability for Bayesian outbreak detection that is expressive, easy to apply, computationally efficient, and performs as well as or better than a standard frequentist method.
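
    A toy version of the idea: the posterior probability of an outbreak for a daily count, with a semi-informative prior asserting only that an outbreak increases the count over baseline (all numbers illustrative, not the paper's prior):

    ```python
    import numpy as np
    from scipy.stats import poisson

    def outbreak_posterior(count, baseline_rate, prior_outbreak=0.01,
                           multipliers=np.linspace(1.1, 3.0, 20)):
        """P(outbreak | count) for a Poisson count, marginalizing a uniform
        grid prior over rate multipliers > 1 (the 'increase over baseline')."""
        like_null = poisson.pmf(count, baseline_rate)
        like_alt = np.mean(poisson.pmf(count, baseline_rate * multipliers))
        num = prior_outbreak * like_alt
        return num / (num + (1 - prior_outbreak) * like_null)

    # e.g., 42 respiratory chief complaints against a baseline of 30/day
    print(outbreak_posterior(42, baseline_rate=30.0))
    ```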

  9. Characterization of fission gas bubbles in irradiated U-10Mo fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, Andrew M.; Burkes, Douglas E.; MacFarlan, Paul J.

    2017-09-01

    Irradiated U-10Mo fuel samples were prepared with traditional mechanical potting and polishing methods within a hot cell. They were then removed and imaged with an SEM located outside of the hot cell. The images were then processed with basic imaging techniques from three separate software packages. The results were compared, and a baseline method for characterization of fission gas bubbles in the samples is proposed. It is hoped that, through adoption of or comparison to this baseline method, sample characterization can be somewhat standardized across the field of post-irradiation examination of metal fuels.
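
    A generic stand-in for the kind of basic image processing being compared above: threshold the SEM image and label connected components as bubbles. The threshold value, the assumption that bubbles image dark, and the pixel-to-micron scale are all workflow-specific choices, not details from this report:

    ```python
    import numpy as np
    from scipy import ndimage

    def bubble_stats(image, threshold):
        """Segment and label fission-gas bubbles in a grayscale SEM image."""
        mask = image < threshold          # assumes bubbles appear dark
        labels, n = ndimage.label(mask)   # connected-component labeling
        sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
        porosity = mask.mean()            # bubble area fraction
        return n, sizes, porosity
    ```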

  10. Study design, intervention, and baseline characteristics of a group randomized trial involving a faith-based healthy eating and physical activity intervention (Walk by Faith) to reduce weight and cancer risk among overweight and obese Appalachian adults☆,☆☆

    PubMed Central

    Baltic, Ryan D.; Weier, Rory C.; Katz, Mira L.; Kennedy, Stephenie K.; Lengerich, Eugene J.; Lesko, Samuel M.; Reese, David; Roberto, Karen A.; Schoenberg, Nancy E.; Young, Gregory S.; Dignan, Mark B.; Paskett, Electra D.

    2017-01-01

    Background Increased prevalence of overweight and obesity among Appalachian residents may contribute to increased cancer rates in this region. This manuscript describes the design, components, and participant baseline characteristics of a faith-based study to decrease overweight and obesity among Appalachian residents. Methods A group randomized study design was used to assign 13 churches to an intervention to reduce overweight and obesity (Walk by Faith) and 15 churches to a cancer screening intervention (Ribbons of Faith). Church members with a body mass index (BMI) ≥25 were recruited from these churches in Appalachian counties in five states to participate in the study. A standard protocol was used to measure participant characteristics at baseline. The same protocol will be followed to obtain measurements after completion of the active intervention phase (12 months) and the sustainability phase (24 months). Primary outcome is change in BMI from baseline to 12 months. Secondary outcomes include changes in blood pressure, waist-to-hip ratio, and fruit and vegetable consumption, as well as intervention sustainability. Results Church members (n = 664) from 28 churches enrolled in the study. At baseline 64.3% of the participants were obese (BMI ≥30), less than half (41.6%) reported regular exercise, and 85.5% reported consuming less than 5 servings of fruits and vegetables per day. Conclusions Church members recruited to participate in a faith-based study across the Appalachian region reported high rates of unhealthy behaviors. We have demonstrated the feasibility of developing and recruiting participants to a faith-based intervention aimed at improving diet and increasing exercise among underserved populations. PMID:26115879

  11. The Evolution of REM Sleep Behavior Disorder in Early Parkinson Disease

    PubMed Central

    Sixel-Döring, Friederike; Zimmermann, Johannes; Wegener, Andrea; Mollenhauer, Brit; Trenkwalder, Claudia

    2016-01-01

    Study Objectives: To investigate the development of REM sleep behavior disorder (RBD) and REM sleep behavioral events (RBE) not yet fulfilling diagnostic criteria for RBD as markers for neurodegeneration in a cohort of Parkinson disease (PD) patients between their de novo baseline assessment and two-year follow-up, in comparison to healthy controls (HC). Methods: Clinically confirmed PD patients and HC with video-supported polysomnography (vPSG) data at baseline were re-investigated after two years. Diagnostic scoring for RBE and RBD was performed in both groups and related to baseline findings. Results: One hundred thirteen PD patients and 102 HC were included in the study. Within two years, the overall occurrence of behaviors during REM sleep in PD patients increased from 50% to 63% (P = 0.02). RBD increased from 25% to 43% (P < 0.001). Eleven of 29 (38%) RBE-positive PD patients and 10/56 (18%) patients with normal REM sleep at baseline converted to RBD. In HC, the occurrence of any REM behavior increased from 17% to 20% (n.s.). RBD increased from 2% to 4% (n.s.). One of 15 (7%) RBE-positive HC and 1/85 (1%) HC with normal REM at baseline converted to RBD. Conclusions: RBD increased significantly in PD patients from the de novo state to two-year follow-up. We propose that RBE be named “prodromal RBD,” as it may follow a continuous evolution in PD, possibly similar to the spreading of Lewy bodies in PD patients. RBD itself was shown to be a robust and stable marker of early PD. Citation: Sixel-Döring F, Zimmermann J, Wegener A, Mollenhauer B, Trenkwalder C. The evolution of REM sleep behavior disorder in early Parkinson disease. SLEEP 2016;39(9):1737–1742. PMID:27306265

  12. Relapse among Cigarette Smokers: The CARDIA longitudinal study, 1985–2011

    PubMed Central

    Caraballo, Ralph S.; Kruger, Judy; Asman, Kat; Pederson, Linda; Widome, Rachel; Kiefe, Catarina I.; Hitsman, Brian; Jacobs, David R.

    2015-01-01

    Rationale There is little information about long-term relapse patterns for cigarette smokers. Objective To describe long-term prevalence of relapse and related smoking patterns by sex, race, age, and education level among a community-based cohort of young adults followed for 25 years. Methods We examined 25 years of data from Coronary Artery Risk Development in Young Adults (CARDIA), an ongoing study of a community-based cohort of 5115 men and women aged 18 to 30 years at baseline with periodic re-examinations. At each examination smoking, quitting, and relapse were queried. We examined prevalence of smoking relapse among 3603 participants who attended at least 6 of the 8 examinations. Results About 53% of 3603 participants never reported smoking on a regular basis. Among the remaining 1682 ever smokers, 52.8% of those who reported current smoking at baseline were still smoking by the end of the study, compared to 10.7% of those who initiated smoking by year 5. Among those classified as former smokers at baseline, 39% relapsed at least once; of these, 69.5% had quit again by the end of the study. Maximum education level attained, age at study baseline, and race were associated with failure to quit smoking by the end of the study and relapse among those who did quit. Maximum education level attained and age at study baseline were also associated with ability to successfully quit after a relapse. Conclusions Smoking relapse after quitting is common, especially in those with lower education level. Education was the strongest predictor of all three outcomes. Improvements in access to treatment and treatment options, especially for underserved populations, are needed to prevent relapse when smokers quit. PMID:24172753

  13. Validating the Novel Method of Measuring Cortisol Levels in Cetacean Skin by use of an ACTH Challenge in Bottlenose Dolphins

    DTIC Science & Technology

    2015-09-30

    e.g. blubber biopsies). This process has been shown to significantly raise both cortisol and aldosterone above baseline conditions and thus equals an... opening up a new avenue of research in physiological response studies following exposure to stressors. The current study will provide the validation... Physiology, 3: doi:10.1093/conphys/cov016 PUBLICATIONS Bechshøft TØ, Wright AJ, Teilmann J, Dietz R, Hansen M, Weisser JJ & Styrishave B. Developing a

  14. Implementation of EcoAIM (trademark) - A Multi-Objective Decision Support Tool for Ecosystem Services at Department of Defense Installations

    DTIC Science & Technology

    2014-09-26

    neotropical birds, as well as its high-quality habitat for resident birds in the Chesapeake Bay watershed, such as bald eagles and several species of... Duelli and Obrist 2003). Several papers have developed methods that relate measurable habitat features at various spatial scales to species richness... advised to first establish a baseline in which a species adheres to patterns of ideal habitat selection (Johnson 2007). In the case of the APG bird

  15. RISK FACTORS FOR FOUR-YEAR INCIDENT VISUAL IMPAIRMENT AND BLINDNESS: THE LOS ANGELES LATINO EYE STUDY

    PubMed Central

    Yonekawa, Yoshihiro; Varma, Rohit; Choudhury, Farzana; Torres, Mina; Azen, Stanley P.

    2016-01-01

    Purpose To identify independent risk factors for incident visual impairment (VI) and monocular blindness. Design Population-based prospective cohort study. Participants 4,658 Latinos aged 40 years and older in the Los Angeles Latino Eye Study (LALES). Methods A detailed history and comprehensive ophthalmological examination were performed at baseline and at the 4-year follow-up on 4,658 Latinos aged 40 years and older from Los Angeles, California. Incident VI was defined as best corrected visual acuity (BCVA) of <20/40 and >20/200 in the better-seeing eye at the 4-year follow-up examination in persons who had a BCVA of ≥20/40 in the better-seeing eye at baseline. Incident monocular blindness was defined as BCVA of ≤20/200 in one eye at follow-up in persons who had a BCVA >20/200 in both eyes at baseline. Socio-demographic and clinical risk factors identified at the baseline interview and examination and associated with incident VI and loss of vision were determined using multivariable regression. Odds ratios (ORs) were calculated for those variables that were independently associated with visual impairment and monocular blindness. Main Outcome Measures ORs for various risk factors for incident VI and monocular blindness. Results Independent risk factors for incident VI were older age (70–79 years OR=4.8, ≥80 years OR=17.9), being unemployed (OR=3.5), and having diabetes mellitus (OR=2.2). Independent risk factors for monocular blindness were being retired (OR=3.4) or widowed (OR=3.7), and having diabetes mellitus (OR=2.1) or any ocular disease (OR=5.6) at baseline. Persons with self-reported excellent/good vision were less likely to develop VI or monocular blindness (OR=0.4–0.5). Conclusion Our data highlight that older Latinos and Latinos with diabetes mellitus or self-reported eye diseases are at high risk of developing vision loss. Furthermore, being unemployed, widowed, or retired confers an independent risk of monocular blindness. Interventions that prevent, treat, and focus on the modifiable factors may reduce the burden of vision loss in this fastest-growing segment of the United States population. PMID:21788079

  16. Predictors of pneumonia on routine chest radiographs in patients with COPD: a post hoc analysis of two 1-year randomized controlled trials

    PubMed Central

    Rubin, David B; Ahmad, Harris A; O’Neal, Michael; Bennett, Sophie; Lettis, Sally; Galkin, Dmitry V; Crim, Courtney

    2018-01-01

    Background Patients with COPD are at risk for life-threatening pneumonia. Although anatomical abnormalities in the thorax may predispose to pneumonia, those abnormalities identified on routine chest X-rays (CXRs) in patients with COPD have not been studied to better understand pneumonia risk. Methods We conducted a post hoc exploratory analysis of data from two replicate year-long clinical trials assessing the impact of fluticasone furoate–vilanterol versus vilanterol alone on COPD exacerbations (GSK studies: HZC102871/NCT01009463 and HZC102970/NCT01017952). Abnormalities on baseline CXRs from 179 patients who developed pneumonia and 50 randomly selected patients who did not were identified by blinded consensus readings conducted by two radiologists. Positive and negative likelihood ratios and diagnostic odds ratios (ORs) were calculated to evaluate the markers for subsequent pneumonia development during the 1-year study period. Results Baseline characteristics distinguishing the pneumonia and non-pneumonia groups included a lower body mass index (24.9 vs 27.5 kg/m2, P=0.008), more severe airflow obstruction (mean post-bronchodilator forced expiratory volume in 1 second [FEV1]/forced vital capacity ratio: 42.3% vs 47.6%, P=0.003), and prior pneumonia (36% vs 20%, P=0.030). Baseline CXR findings with the highest diagnostic ORs were: elevated hemi-diaphragm (OR: 6.87; 95% CI: 0.90, 52.26), thick tracheal-esophageal stripe (OR: 4.39 [0.25, 78.22]), narrow cardiac silhouette (OR: 2.91 [0.85, 9.99]), calcified pleural plaque/mid-chest pleural thickening (OR: 2.82 [0.15, 53.76]), and large/prominent pulmonary artery shadow (OR: 1.94 [0.95, 3.97]). The presence of a narrow cardiac silhouette at baseline was associated with a statistically significant lower mean pre-bronchodilator FEV1 (P=0.040). There was also a trend for a lower mean pre-bronchodilator FEV1 in patients with a large/prominent pulmonary artery shadow at baseline (P=0.095). Conclusion Findings on routine CXR that relate to pathophysiological mechanisms of pneumonia could help determine pneumonia risk in patients with COPD. PMID:29386888

  17. Cognitive activities delay onset of memory decline in persons who develop dementia

    PubMed Central

    Hall, C B.; Lipton, R B.; Sliwinski, M; Katz, M J.; Derby, C A.; Verghese, J

    2009-01-01

    Background: Persons destined to develop dementia experience an accelerated rate of decline in cognitive ability, particularly in memory. Early life education and participation in cognitively stimulating leisure activities later in life are 2 factors thought to reflect cognitive reserve, which may delay the onset of the memory decline in the preclinical stages of dementia. Methods: We followed 488 initially cognitively intact community residing individuals with epidemiologic, clinical, and cognitive assessments every 12 to 18 months in the Bronx Aging Study. We assessed the influence of self-reported participation in cognitively stimulating leisure activities on the onset of accelerated memory decline as measured by the Buschke Selective Reminding Test in 101 individuals who developed incident dementia using a change point model. Results: Each additional self-reported day of cognitive activity at baseline delayed the onset of accelerated memory decline by 0.18 years. Higher baseline levels of cognitive activity were associated with more rapid memory decline after that onset. Inclusion of education did not significantly add to the fit of the model beyond the effect of cognitive activities. Conclusions: Our findings show that late life cognitive activities influence cognitive reserve independently of education. The effect of early life education on cognitive reserve may be mediated by cognitive activity later in life. Alternatively, early life education may be a determinant of cognitive reserve, and individuals with more education may choose to participate in cognitive activities without influencing reserve. Future studies should examine the efficacy of increasing participation in cognitive activities to prevent or delay dementia. GLOSSARY AD = Alzheimer disease; BL = baseline; CAS = Cognitive Activity Scale; CI = confidence interval; DSM = Diagnostic and Statistical Manual of Mental Disorders; dx = diagnosis; NIA = National Institute on Aging; SRT = Selective Reminding Test; WAIS VIQ = Wechsler Adult Intelligence Scale Verbal IQ. PMID:19652139

  18. Healthy Immigrant Families: Participatory Development and Baseline Characteristics of a Community-Based Physical Activity and Nutrition Intervention

    PubMed Central

    Wieland, Mark L.; Weis, Jennifer A.; Hanza, Marcelo M.K.; Meiers, Sonja J.; Patten, Christi A.; Clark, Matthew M.; Sloan, Jeff A.; Novotny, Paul J.; Njeru, Jane W.; Abbenyi, Adeline; Levine, James A.; Goodson, Miriam; Capetillo, Maria Graciela D. Porraz; Osman, Ahmed; Hared, Abdullah; Nigon, Julie A.; Sia, Irene G.

    2015-01-01

    Background US immigrants often have escalating cardiovascular risk. Barriers to optimal physical activity and diet have a significant role in this risk accumulation. Methods We developed a physical activity and nutrition intervention with immigrant and refugee families through a community-based participatory research approach. Work groups of community members and health scientists developed an intervention manual with 12 content modules that were based on social-learning theory. Family health promoters from the participating communities (Hispanic, Somali, Sudanese) were trained to deliver the intervention through 12 home visits during the first 6 months and up to 12 phone calls during the second 6 months. The intervention was tested through a randomized community-based trial with a delayed-intervention control group, with measurements at baseline, 6, 12, and 24 months. Primary measurements included accelerometer-based assessment of physical activity and 24-hour dietary recall. Secondary measures included biometrics and theory-based instruments. Results One hundred fifty-one individuals (81 adolescents, 70 adults; 44 families) were randomized. At baseline, mean (SD) time spent in moderate-to-vigorous physical activity was 64.7 (30.2) minutes/day for adolescents and 43.1 (35.4) minutes/day for adults. Moderate dietary quality was observed in both age groups. Biometric measures showed that 45.7% of adolescents and 80.0% of adults were overweight or obese. Moderate levels of self-efficacy and social support were reported for physical activity and nutrition. Discussion Processes and products from this program are relevant to other communities aiming to reduce cardiovascular risk and negative health behaviors among immigrants and refugees. Trial Registration This trial was registered at Clinicaltrials.gov (NCT01952808). PMID:26655431

  19. Initial poor quality of life and new onset of dyspepsia: results from a longitudinal 10‐year follow‐up study

    PubMed Central

    Ford, Alexander C; Forman, David; Bailey, Alastair G; Axon, Anthony T R; Moayyedi, Paul

    2007-01-01

    Background Numerous studies examining the prevalence and natural history of dyspepsia in the general population have been conducted. However, few have reported the effect of quality of life on the development of dyspepsia. A 10‐year longitudinal follow‐up study examining the effect of quality of life on subsequent dyspepsia was performed. Methods Individuals originally enrolled in a population‐screening programme for Helicobacter pylori were contacted through a validated postal dyspepsia questionnaire. Baseline demographic data, quality of life at original study entry, and dyspepsia and irritable bowel syndrome (IBS) symptom data were already on file. Consent to examine primary‐care records was sought, and data regarding non‐steroidal anti‐inflammatory drugs (NSAID) and aspirin use were obtained from these. Results Of 8407 individuals originally involved, 3912 (46.5%) provided symptom data at baseline and 10‐year follow‐up. Of 2550 (65%) individuals asymptomatic at study entry, 717 (28%) developed new‐onset dyspepsia at 10 years, an incidence of 2.8% per year. After multivariate logistic regression, lower quality of life at study entry (OR 2.63; 99% CI 1.86 to 3.71), higher body mass index (OR per unit 1.05; 99% CI 1.02 to 1.08), presence of IBS at study entry (OR 3.1; 99% CI 1.51 to 6.37) and use of NSAIDs and/or aspirin (OR 1.32; 99% CI 0.99 to 1.75) were significant risk factors for new‐onset dyspepsia. Conclusions The incidence of new‐onset dyspepsia was almost 3% per year. Low quality of life at baseline exerted a strong effect on the likelihood of developing dyspepsia at 10 years. PMID:16908511

  20. Hyperuricemia Is a Risk Factor for the Onset of Impaired Fasting Glucose in Men with a High Plasma Glucose Level: A Community-Based Study

    PubMed Central

    Miyake, Teruki; Kumagi, Teru; Furukawa, Shinya; Hirooka, Masashi; Kawasaki, Keitarou; Koizumi, Mitsuhito; Todo, Yasuhiko; Yamamoto, Shin; Abe, Masanori; Kitai, Kohichiro; Matsuura, Bunzo; Hiasa, Yoichi

    2014-01-01

    Background It is not clear whether elevated uric acid is a risk factor for the onset of impaired fasting glucose after stratifying by baseline fasting plasma glucose levels. We conducted a community-based retrospective longitudinal cohort study to clarify the relationship between uric acid levels and the onset of impaired fasting glucose, according to baseline fasting plasma glucose levels. Methods We enrolled 6,403 persons (3,194 men and 3,209 women), each of whom was 18–80 years old and had >2 annual check-ups during 2003–2010. After excluding persons who had fasting plasma glucose levels ≥6.11 mM and/or were currently taking anti-diabetic agents, the remaining 5,924 subjects were classified into quartiles according to baseline fasting plasma glucose levels. The onset of impaired fasting glucose was defined as fasting plasma glucose ≥6.11 mM during the observation period. Results In the quartile groups, 0.9%, 2.1%, 3.4%, and 20.2% of the men developed impaired fasting glucose, respectively, and 0.1%, 0.3%, 0.5%, and 5.6% of the women developed impaired fasting glucose, respectively (P trend <0.001). After adjusting for age, body mass index, systolic blood pressure, triacylglycerols, high density lipoprotein-cholesterol, creatinine, fatty liver, family history of diabetes, alcohol consumption, and current smoking, uric acid levels were positively associated with onset of impaired fasting glucose in men with highest-quartile fasting plasma glucose levels (adjusted hazard ratio, 1.003; 95% confidence interval, 1.0001–1.005, P = 0.041). Conclusions Among men with high fasting plasma glucose, hyperuricemia may be independently associated with an elevated risk of developing impaired fasting glucose. PMID:25237894

  1. ALMA Long Baseline Campaigns: Phase Characteristics of Atmosphere at Long Baselines in the Millimeter and Submillimeter Wavelengths

    NASA Astrophysics Data System (ADS)

    Matsushita, Satoki; Asaki, Yoshiharu; Fomalont, Edward B.; Morita, Koh-Ichiro; Barkats, Denis; Hills, Richard E.; Kawabe, Ryohei; Maud, Luke T.; Nikolic, Bojan; Tilanus, Remo P. J.; Vlahakis, Catherine; Whyborn, Nicholas D.

    2017-03-01

    We present millimeter- and submillimeter-wave phase characteristics measured between 2012 and 2014 during the Atacama Large Millimeter/submillimeter Array (ALMA) long baseline campaigns. This paper presents the first detailed investigation of the characteristics of phase fluctuation and phase correction methods obtained with baseline lengths up to ~15 km. The basic phase fluctuation characteristics can be expressed with the spatial structure function (SSF). Most of the SSFs show that the phase fluctuation increases as a function of baseline length, with a power-law slope of ~0.6. In many cases, we find that the slope becomes shallower (average of ~0.2-0.3) at baseline lengths longer than ~1 km, namely showing a turn-over in the SSF. These power-law slopes do not change with the amount of precipitable water vapor (PWV), but the fitted constants have a weak correlation with PWV, so that the phase fluctuation at a baseline length of 10 km also increases as a function of PWV. The phase correction method using water vapor radiometers (WVRs) works well, especially for the cases where PWV > 1 mm, reducing the degree of phase fluctuation by a factor of two in many cases. However, phase fluctuations still remain after the WVR phase correction, suggesting the existence of another turbulent constituent that causes the phase fluctuation. This is supported by occasional SSFs that do not exhibit any turn-over; these are only seen when the PWV is low (i.e., when the WVR phase correction works less effectively) or after WVR phase correction. This means that the phase fluctuation caused by this turbulent constituent is inherently smaller than that caused by water vapor. Since in these rare cases there is no turn-over in the SSF up to the maximum baseline length of ~15 km, this turbulent constituent must have a scale height of 10 km or more, and thus cannot be water vapor, whose scale height is around 1 km. Based on these characteristics, this large-scale-height turbulent constituent is likely to be water ice or a dry component. The excess path length fluctuation after the WVR phase correction at a baseline length of 10 km is large (≳200 μm), which is significant for high-frequency (>450 GHz or <700 μm) observations. These results suggest the need for an additional phase correction method, such as fast switching, to reduce the degree of phase fluctuation beyond the WVR phase correction. We simulated the fast switching phase correction method using observations of single quasars, and the results suggest that it works well, with shorter cycle times linearly improving the coherence.
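
    The SSF slope estimates discussed above can be illustrated with a short fit. The sketch below generates a synthetic structure function with a turnover at 1 km and recovers the two power-law slopes by log-log regression; the break location, constants, and data are assumptions, not ALMA values.

```python
import numpy as np

def ssf_slopes(baseline_m, rms_phase, break_m=1000.0):
    """Fit power-law slopes of an SSF (rms phase vs baseline length)
    below and above an assumed turnover, by log-log regression."""
    b, s = np.asarray(baseline_m, float), np.asarray(rms_phase, float)
    short, long_ = b <= break_m, b > break_m
    slope_short = np.polyfit(np.log10(b[short]), np.log10(s[short]), 1)[0]
    slope_long = np.polyfit(np.log10(b[long_]), np.log10(s[long_]), 1)[0]
    return slope_short, slope_long

# Synthetic SSF: slope 0.6 below 1 km, slope 0.2 above (continuous at 1 km).
b = np.logspace(1.5, 4.2, 40)                 # ~30 m to ~15 km
s = np.where(b <= 1e3, 0.05 * b ** 0.6, 0.05 * 1e3 ** 0.4 * b ** 0.2)
print(ssf_slopes(b, s))                       # -> approximately (0.6, 0.2)
```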

  2. Safety and dose modification for patients receiving niraparib.

    PubMed

    Berek, J S; Matulonis, U A; Peen, U; Ghatage, P; Mahner, S; Redondo, A; Lesoin, A; Colombo, N; Vergote, I; Rosengarten, O; Ledermann, J; Pineda, M; Ellard, S; Sehouli, J; Gonzalez-Martin, A; Berton-Rigaud, D; Madry, R; Reinthaller, A; Hazard, S; Guo, W; Mirza, M R

    2018-05-14

    Niraparib is a poly(ADP-ribose) polymerase (PARP) inhibitor approved in the United States and Europe for maintenance treatment of adult patients with recurrent epithelial ovarian, fallopian tube, or primary peritoneal cancer who are in complete or partial response to platinum-based chemotherapy. In the pivotal ENGOT-OV16/NOVA trial, the dose reduction rate due to treatment-emergent adverse events (TEAEs) was 68.9%, and the discontinuation rate due to TEAEs was 14.7%, including 3.3% due to thrombocytopenia. A retrospective analysis was performed to identify clinical parameters that predict dose reductions. All analyses were performed on the safety population, comprising all patients who received at least one dose of study drug. Patients were analyzed according to the study drug consumed (i.e., as treated). A predictive modeling method (decision trees) was used to identify important variables for predicting the likelihood of developing grade ≥3 thrombocytopenia within 30 days after the first dose of niraparib and to determine cutoff points for chosen variables. Following dose modification, 200 mg was the most commonly administered dose in the ENGOT-OV16/NOVA trial. Baseline platelet count and baseline body weight were identified as risk factors for increased incidence of grade ≥3 thrombocytopenia. Patients with a baseline body weight <77 kg or a baseline platelet count <150,000/μL in effect received an average daily dose approximating 200 mg (median = 207 mg) due to dose interruption and reduction. Progression-free survival in patients who were dose reduced to either 200 mg or 100 mg was consistent with that of patients who remained at the 300 mg starting dose. The analysis presented suggests that patients with a baseline body weight of <77 kg or baseline platelets of <150,000/μL may benefit from a starting dose of 200 mg per day. (ClinicalTrials.gov ID: NCT01847274).
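
    The predictive-modeling step can be sketched as follows. This toy example trains a depth-2 decision tree on synthetic baseline weight and platelet data (the outcome model is invented solely so the tree has something to find); it mimics the analysis style, not the trial's actual model or data. The 77 kg and 150,000/μL cutoffs quoted above are the trial's reported values.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 500
weight = rng.normal(70, 12, n)        # baseline body weight, kg (synthetic)
platelets = rng.normal(220, 60, n)    # baseline platelets, 10^3/uL (synthetic)

# Invented outcome model: risk rises with low weight and low platelets.
risk = 1.0 / (1.0 + np.exp(0.12 * (weight - 77) + 0.03 * (platelets - 150)))
grade3_tcp = rng.random(n) < risk     # grade >=3 thrombocytopenia, yes/no

X = np.column_stack([weight, platelets])
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, grade3_tcp)
# The printed rules expose data-driven cutoffs analogous to 77 kg / 150k.
print(export_text(tree, feature_names=["weight_kg", "platelets_k_per_uL"]))
```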

  3. Association of baseline level of physical activity and its temporal changes with incident hypertension and diabetes mellitus.

    PubMed

    Lee, Jong-Young; Ryu, Seungho; Sung, Ki-Chul

    2018-01-01

    Background The association of baseline physical activity and its temporal changes with incident hypertension or diabetes mellitus in initially non-hypertensive or non-diabetic subjects has rarely been studied. Methods Among individuals who underwent consecutive comprehensive health screenings, physical activity level was measured using a self-reported international physical activity questionnaire. First, subjects were classified into four categories: no regular physical activity with a sedentary lifestyle; minimal physical activity (<75 min/week); insufficient physical activity (≥75 min but <150 min/week); and sufficient physical activity (≥150 min/week). Second, subjects were sub-grouped based on temporal changes in physical activity level between baseline and consecutive follow-up: increase, no change, and decrease. Results In total, among 174,314 subjects (mean age 36.7 ± 6.9 years), 5,544 (3.18%) developed incident diabetes mellitus and 21,276 (12.2%) developed arterial hypertension. After multivariate adjustment, sufficient baseline physical activity was associated with a significantly lower risk of incident hypertension (hazard ratio 0.89; 95% confidence interval (CI) 0.81 to 0.97) and with a nonsignificant trend toward lower diabetes mellitus incidence (hazard ratio 0.87; 95% CI 0.69 to 1.04), relative to the no-regular-physical-activity group. Regardless of the baseline physical activity level, subjects with a temporal increase in physical activity showed significantly decreased risk of incident hypertension (hazard ratio 0.93; 95% CI 0.87 to 0.99) and diabetes mellitus (hazard ratio 0.83; 95% CI 0.74 to 0.92) compared with those with a temporal decrease in their physical activity level. Conclusion Both a sufficient baseline physical activity level and its temporal increase were associated with a lower risk of incident hypertension and diabetes mellitus in a large, relatively healthy cohort.

  4. Elevated serum uric acid level predicts rapid decline in kidney function

    PubMed Central

    Kuwabara, Masanari; Bjornstad, Petter; Hisatome, Ichiro; Niwa, Koichiro; Roncal-Jimenez, Carlos A; Andres-Hernando, Ana; Jensen, Thomas; Milagres, Tamara; Sato, Yuka; Garcia, Gabriela; Ohno, Minoru; Lanaspa, Miguel A; Johnson, Richard J

    2018-01-01

    Background While an elevated serum uric acid level (SUA) is a recognized risk factor for chronic kidney disease (CKD), it remains unclear whether change in SUA is independently associated with change in estimated glomerular filtration rate (eGFR) over time. Accordingly, we examined the longitudinal associations between change in SUA and change in eGFR over 5 years in a general Japanese population. Methods This was a large, single-center, retrospective 5-year cohort study at St. Luke's International Hospital, Tokyo, Japan, between 2004 and 2009. We included 13,070 subjects (30–85 years) whose data were available in both 2004 and 2009. Of those, we excluded 492 subjects with eGFR <60 mL/min/1.73m2 at baseline. In addition to examining the entire cohort (n=12,578), we stratified our analyses by baseline eGFR group: 60–90 mL/min/1.73m2, 90–120 mL/min/1.73m2, and ≥120 mL/min/1.73m2. Linear and logistic regression models were applied to examine the relationships between baseline and change in SUA, change in eGFR, and rapid eGFR decline (defined as the highest quantile of change in eGFR), adjusted for age, sex, body mass index, abdominal circumference, hypertension, dyslipidemia, and diabetes mellitus. Results After multivariable adjustment including baseline eGFR, a 1 mg/dL increase in baseline SUA was associated with greater odds of developing rapid eGFR decline (OR: 1.27, 95% CI: 1.17–1.38), and a 1 mg/dL increase in SUA over 5 years was associated with 3.77-fold greater odds of rapid eGFR decline (OR: 3.77, 95% CI: 3.35–4.26). Conclusions Elevated baseline SUA and increasing SUA over time were independent risk factors for rapid eGFR decline over 5 years. PMID:28285309

  5. Highly Reusable Space Transportation (HRST) Baseline Concepts and Analysis: Rocket/RBCC Options. Part 2; A Comparative Study

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon

    1997-01-01

    This study is an extension of a previous effort by the Principal Investigator to develop baseline data to support comparative analysis of Highly Reusable Space Transportation (HRST) concepts. The analyses presented herein develop baseline databases for two two-stage-to-orbit (TSTO) concepts: (1) Assisted horizontal take-off all-rocket (assisted HTOHL); and (2) Assisted vertical take-off rocket-based combined cycle (RBCC). The study objectives were to: (1) Provide configuration definitions and illustrations for assisted HTOHL and assisted RBCC; (2) Develop a rationalization approach and compare these concepts with the HRST reference; and (3) Analyze TSTO configurations which try to maintain SSTO benefits while reducing inert weight sensitivity.

  6. Climate change vulnerability of native and alien freshwater fishes of California: a systematic assessment approach.

    PubMed

    Moyle, Peter B; Kiernan, Joseph D; Crain, Patrick K; Quiñones, Rebecca M

    2013-01-01

    Freshwater fishes are highly vulnerable to human-caused climate change. Because quantitative data on status and trends are unavailable for most fish species, a systematic assessment approach that incorporates expert knowledge was developed to determine status and future vulnerability to climate change of freshwater fishes in California, USA. The method uses expert knowledge, supported by literature reviews of status and biology of the fishes, to score ten metrics for both (1) current status of each species (baseline vulnerability to extinction) and (2) likely future impacts of climate change (vulnerability to extinction). Baseline and climate change vulnerability scores were derived for 121 native and 43 alien fish species. The two scores were highly correlated and were concordant among different scorers. Native species had both greater baseline and greater climate change vulnerability than did alien species. Fifty percent of California's native fish fauna was assessed as having critical or high baseline vulnerability to extinction whereas all alien species were classified as being less or least vulnerable. For vulnerability to climate change, 82% of native species were classified as highly vulnerable, compared with only 19% for aliens. Predicted climate change effects on freshwater environments will dramatically change the fish fauna of California. Most native fishes will suffer population declines and become more restricted in their distributions; some will likely be driven to extinction. Fishes requiring cold water (<22°C) are particularly likely to go extinct. In contrast, most alien fishes will thrive, with some species increasing in abundance and range. However, a few alien species will likewise be negatively affected through loss of aquatic habitats during severe droughts and physiologically stressful conditions present in most waterways during summer. Our method has high utility for predicting vulnerability to climate change of diverse fish species. It should be useful for setting conservation priorities in many different regions.

  8. Minimally invasive estimation of ventricular dead space volume through use of Frank-Starling curves.

    PubMed

    Davidson, Shaun; Pretty, Chris; Pironet, Antoine; Desaive, Thomas; Janssen, Nathalie; Lambermont, Bernard; Morimont, Philippe; Chase, J Geoffrey

    2017-01-01

    This paper develops a means of more easily and less invasively estimating ventricular dead space volume (Vd), an important but difficult-to-measure physiological parameter. Vd represents a subject- and condition-dependent portion of measured ventricular volume that is not actively participating in ventricular function. It is employed in models based on the time-varying elastance concept, which see widespread use in haemodynamic studies, and may have direct diagnostic use. The proposed method involves linear extrapolation of a Frank-Starling curve (stroke volume vs end-diastolic volume) and its end-systolic equivalent (stroke volume vs end-systolic volume), developed across normal clinical procedures such as recruitment manoeuvres, to their points of intersection with the volume axis (where stroke volume is 0) to determine Vd. To demonstrate the broad applicability of the method, it was validated across a cohort of six sedated and anaesthetised male Pietrain pigs, encompassing a variety of cardiac states from healthy baseline behaviour to circulatory failure due to septic shock induced by endotoxin infusion. Linear extrapolation of the curves was supported by strong linear correlation coefficients, averaging R = 0.78 and R = 0.80 pre- and post-endotoxin infusion respectively, as well as good agreement between the two linearly extrapolated intercepts (Vd) for each subject (no more than 7.8% variation). Method validity was further supported by the physiologically reasonable Vd values produced, equivalent to 44.3-53.1% and 49.3-82.6% of baseline end-systolic volume before and after endotoxin infusion respectively. This method has the potential to allow Vd to be estimated without a particularly demanding, specialised protocol in an experimental environment. Further, due to the common use of both mechanical ventilation and recruitment manoeuvres in intensive care, this method, subject to the availability of multi-beat echocardiography, has the potential to allow for estimation of Vd in a clinical environment.
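
    The extrapolation itself is a one-line linear fit. Below is a minimal sketch with hypothetical volumes: each curve is fit as SV = m*V + c and extrapolated to SV = 0, so the volume-axis intercept -c/m estimates Vd, and agreement between the two intercepts mirrors the consistency check described above.

```python
import numpy as np

def dead_space_volume(volumes, stroke_volumes):
    """Fit SV = m*V + c and extrapolate to SV = 0; the volume-axis
    intercept -c/m is the dead space volume estimate Vd."""
    m, c = np.polyfit(volumes, stroke_volumes, 1)
    return -c / m

# Hypothetical points gathered across a recruitment manoeuvre (mL):
edv = np.array([120.0, 130.0, 140.0, 150.0])   # end-diastolic volume
esv = np.array([75.0, 80.0, 85.0, 90.0])       # end-systolic volume
sv = edv - esv                                 # stroke volume

vd_from_fs = dead_space_volume(edv, sv)        # Frank-Starling curve
vd_from_es = dead_space_volume(esv, sv)        # end-systolic equivalent
print(vd_from_fs, vd_from_es)                  # agreement supports Vd
```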

  9. New method for quantification of vuggy porosity from digital optical borehole images as applied to the karstic Pleistocene limestone of the Biscayne aquifer, southeastern Florida

    USGS Publications Warehouse

    Cunningham, K.J.; Carlson, J.I.; Hurley, N.F.

    2004-01-01

    Vuggy porosity consists of gas- or fluid-filled openings in the rock matrix that are large enough to be seen with the unaided eye. Well-connected vugs can form major conduits for flow of ground water, especially in carbonate rocks. This paper presents a new method for quantification of vuggy porosity calculated from digital borehole images collected from 47 test coreholes that penetrate the karstic Pleistocene limestone of the Biscayne aquifer, southeastern Florida. In essence, the method distinguishes vugs from background based on their grayscale color in the digital borehole images and calculates a percentage of vuggy porosity. Development of the method was complicated because environmental conditions created an uneven grayscale contrast in the borehole images that makes it difficult to distinguish vugs from background. The irregular contrast was produced by unbalanced illumination of the borehole wall, which was a result of eccentering of the borehole-image logging tool. Experimentation showed that a simple, single grayscale threshold would not realistically differentiate between the grayscale contrast of vugs and background. Therefore, an equation was developed for an effective subtraction of the changing grayscale contrast, due to uneven illumination, to produce a grayscale threshold that successfully identifies vugs. In the equation, a moving average calculated around the circumference of the borehole, representing the background grayscale intensity, is defined as a baseline from which to identify a grayscale threshold for vugs. A constant, derived empirically by calibration against vuggy porosity values from digital images of slabbed-core samples, is subtracted from the background baseline to obtain the vug grayscale threshold as a function of azimuth. The method should be effective in estimating vuggy porosity in any carbonate aquifer. © 2003 Published by Elsevier B.V.
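
    A rough sketch of the azimuthal moving-average baseline and threshold subtraction is shown below; the window length and constant k are illustrative stand-ins for the empirically calibrated values, and the unwrapped image layout (rows = depth, columns = azimuth) is an assumption.

```python
import numpy as np

def vuggy_porosity(image, k=25.0, window=31):
    """Percent vuggy porosity from an unwrapped borehole image
    (rows = depth, columns = azimuth). A wrap-around moving average
    along each row is the background-grayscale baseline; pixels darker
    than (baseline - k) count as vugs."""
    img = np.asarray(image, float)
    kernel = np.ones(window) / window

    def row_baseline(row):
        padded = np.concatenate([row[-window:], row, row[:window]])
        return np.convolve(padded, kernel, mode="same")[window:-window]

    baseline = np.apply_along_axis(row_baseline, 1, img)
    return 100.0 * (img < baseline - k).mean()

# Demo: two dark vugs on a background with an azimuthal illumination gradient.
az = np.linspace(0.0, 2.0 * np.pi, 180)
img = 160.0 + 30.0 * np.cos(az)[None, :] * np.ones((50, 1))
img[10:14, 20:26] -= 80.0
img[30:33, 120:130] -= 80.0
print(vuggy_porosity(img))   # roughly 0.6 percent
```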

  10. Predicting the Effectiveness of Work-Focused CBT for Common Mental Disorders: The Influence of Baseline Self-Efficacy, Depression and Anxiety.

    PubMed

    Brenninkmeijer, Veerle; Lagerveld, Suzanne E; Blonk, Roland W B; Schaufeli, Wilmar B; Wijngaards-de Meij, Leoniek D N V

    2018-02-15

    Purpose This study examined who benefits most from a cognitive behavioural therapy (CBT)-based intervention that aims to enhance return to work (RTW) among employees who are absent due to common mental disorders (CMDs) (e.g., depression, anxiety, or adjustment disorder). We investigated the influence of baseline work-related self-efficacy and mental health (depressive complaints and anxiety) on the treatment outcomes of two psychotherapeutic interventions. Methods Using a quasi-experimental design, 12-month follow-up data of 168 employees were collected. Participants either received work-focused cognitive behavioural therapy (W-CBT) that integrated work aspects early into the treatment (n = 89) or regular cognitive behavioural therapy (R-CBT) without a focus on work (n = 79). Results Compared with R-CBT, W-CBT resulted in faster partial RTW, irrespective of baseline self-efficacy. Among individuals with high self-efficacy, W-CBT also resulted in faster full RTW. The effectiveness of W-CBT on RTW did not depend on baseline depressive complaints or anxiety. The decline of mental health complaints did not differ between the two interventions, nor did it depend on baseline self-efficacy or mental health. Conclusions Considering the benefits of W-CBT for partial RTW, we recommend this intervention as a preferred method for employees with CMDs, irrespective of baseline self-efficacy, depression, and anxiety. For individuals with high baseline self-efficacy, this intervention also results in faster full RTW. For those with low self-efficacy, extra exercises or components may be needed to promote full RTW.

  11. Cardiac output by pulse contour analysis does not match the increase measured by rebreathing during human spaceflight.

    PubMed

    Hughson, Richard L; Peterson, Sean D; Yee, Nicholas J; Greaves, Danielle K

    2017-11-01

    Pulse contour analysis of the noninvasive finger arterial pressure waveform provides a convenient means to estimate cardiac output (Q̇). The method has been compared with standard methods under a range of conditions but never before during spaceflight. We compared pulse contour analysis with the Modelflow algorithm to estimates of Q̇ obtained by rebreathing during preflight baseline testing and during the final month of long-duration spaceflight in nine healthy male astronauts. By Modelflow analysis, stroke volume was greater in supine baseline than seated baseline or inflight. Heart rate was reduced in supine baseline so that there were no differences in Q̇ by Modelflow estimate between the supine (7.02 ± 1.31 l/min, means ± SD), seated (6.60 ± 1.95 l/min), or inflight (5.91 ± 1.15 l/min) conditions. In contrast, rebreathing estimates of Q̇ increased from seated baseline (4.76 ± 0.67 l/min) to inflight (7.00 ± 1.39 l/min, significant interaction effect of method and spaceflight, P < 0.001). Pulse contour analysis utilizes a three-element Windkessel model that incorporates parameters dependent on aortic pressure-area relationships that are assumed to represent the entire circulation. We propose that a large increase in vascular compliance in the splanchnic circulation invalidates the model under conditions of spaceflight. Future spaceflight research measuring cardiac function needs to consider this important limitation for assessing absolute values of Q̇ and stroke volume. NEW & NOTEWORTHY Noninvasive assessment of cardiac function during human spaceflight is an important tool to monitor astronaut health. This study demonstrated that pulse contour analysis of finger arterial blood pressure to estimate cardiac output failed to track the 46% increase measured by a rebreathing method. These results strongly suggest that alternative methods not dependent on pulse contour analysis are required to track cardiac function in spaceflight. Copyright © 2017 the American Physiological Society.
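
    To see why model assumptions matter here, the sketch below inverts a toy three-element Windkessel to estimate flow from a pressure waveform, in the spirit of pulse contour analysis. The parameter values are invented and this is not the Modelflow algorithm; the point is that stroke volume obtained by integrating the estimated flow is only as good as the assumed impedance, resistance, and compliance.

```python
import numpy as np

def windkessel_flow(pressure, dt, Zc=0.06, R=1.0, C=1.6):
    """Estimate flow from an arterial pressure waveform by inverting a
    three-element Windkessel (toy analogue of pulse contour analysis).
    Zc (mmHg*s/mL), R (mmHg*s/mL), C (mL/mmHg) are invented constants."""
    q = np.zeros_like(pressure)
    pc = pressure[0]                    # pressure across the compliance
    for i, p in enumerate(pressure):
        q[i] = (p - pc) / Zc            # flow through characteristic impedance
        pc += dt * (q[i] - pc / R) / C  # charge/discharge of the compliance
    return q

dt = 0.001
t = np.arange(0.0, 1.0, dt)
p = 80.0 + 20.0 * np.maximum(np.sin(2.0 * np.pi * t), 0.0)  # crude pulse, mmHg
sv = windkessel_flow(p, dt).sum() * dt   # "stroke volume" by integrating flow
# If true compliance changes (e.g., splanchnic pooling in flight) while the
# model's C stays fixed, this integral is biased -- the limitation above.
print(sv)
```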

  12. Marital Conflict, Allostatic Load, and the Development of Children's Fluid Cognitive Performance

    PubMed Central

    Hinnant, J. Benjamin; El-Sheikh, Mona; Keiley, Margaret; Buckhalt, Joseph A.

    2013-01-01

    Relations between marital conflict, children’s respiratory sinus arrhythmia (RSA), and fluid cognitive performance were examined over three years to assess allostatic processes. Participants were 251 children reporting on marital conflict, baseline RSA and RSA reactivity to a lab challenge were recorded, and fluid cognitive performance was measured using the Woodcock-Johnson III. A cross-lagged model showed that higher levels of marital conflict at age 8 predicted weaker RSA-R at age 9 for children with lower baseline RSA. A growth model showed that lower baseline RSA in conjunction with weaker RSA-R predicted the slowest development of fluid cognitive performance. Findings suggest that stress may affect development of physiological systems regulating attention, which are tied to the development of fluid cognitive performance. PMID:23534537

  13. Site systems engineering fiscal year 1999 multi-year work plan (MYWP) update for WBS 1.8.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GRYGIEL, M.L.

    1998-10-08

    Manage the Site Systems Engineering process to provide a traceable, integrated, requirements-driven, and technically defensible baseline. Through the Site Integration Group (SIG), Systems Engineering ensures integration of technical activities across all site projects. Systems Engineering's primary interfaces are with the RL Project Managers, the Project Direction Office, and the Project Major Subcontractors, as well as with the Site Planning organization. Systems Implementation: (1) Develops, maintains, and controls the site integrated technical baseline, ensures the Systems Engineering interfaces between projects are documented, and maintains the Site Environmental Management Specification. (2) Develops and uses dynamic simulation models for verification of the baseline and analysis of alternatives. (3) Performs and documents functional and requirements analyses. (4) Works with projects, technology management, and the SIG to identify and resolve technical issues. (5) Supports technical baseline information for the planning and budgeting of the Accelerated Cleanup Plan, Multi-Year Work Plans, and Project Baseline Summaries, as well as performance measure reporting. (6) Works with projects to ensure the quality of data in the technical baseline. (7) Develops, maintains, and implements the site configuration management system.

  14. Absolute versus Relative Difference Measures of Priming: Which Is Appropriate when Baseline Scores Change with Age?

    ERIC Educational Resources Information Center

    Murphy, Kristina; McKone, Elinor; Slee, Judith

    2006-01-01

    It is often of theoretical interest to know if implicit memory (repetition priming) develops across childhood under a given circumstance. Methodologically, however, it is difficult to determine whether development is present when baseline performance for unstudied items improves with age. Calculation of priming in absolute…

  15. Improved silicon carbide for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas J.; Mangels, J. A.

    1986-01-01

    The development of silicon carbide materials of high strength was initiated, and components of complex shape and high reliability were formed. The approach was to adapt a beta-SiC powder and binder system to the injection molding process and to develop procedures and process parameters capable of providing a sintered silicon carbide material with improved properties. The initial effort was to characterize the baseline precursor materials, develop mixing and injection molding procedures for fabricating test bars, and characterize the properties of the sintered materials. Parallel studies of various mixing, dewaxing, and sintering procedures were performed in order to distinguish process routes for improving material properties. A total of 276 modulus-of-rupture (MOR) bars of the baseline material were molded, and 122 bars were fully processed to a sintered density of approximately 95 percent. Fluid mixing techniques were developed which significantly reduced flaw size and improved the strength of the material. Initial MOR tests indicated that the strength of the fluid-mixed material exceeds the baseline property by more than 33 percent.

  16. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: BASELINE QUESTIONNAIRE (HOUSEHOLD) (UA-D-7.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Baseline Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the Border study. Household and individual data were combined in a single Baseline Questionnaire data file.

  17. Effect of vitamins C and E on biomarkers of oxidative stress in nonsmokers

    PubMed Central

    Block, Gladys; Jensen, Christopher D.; Morrow, Jason D.; Holland, Nina; Norkus, Edward P.; Milne, Ginger L.; Hudes, Mark; Dalvi, Tapashi B.; Crawford, Patricia B.; Fung, Ellen B.; Schumacher, Laurie; Harmatz, Paul

    2009-01-01

    Background Oxidative stress is elevated in obesity, and may be a major mechanism for obesity-related diseases. Methods Nonsmokers (n=396) were randomized to 1000 mg/day vitamin C, 800 IU/day vitamin E, or placebo, for two months. Treatment effect was examined in multiple regression analyses using an intention-to-treat approach. Results Vitamin C (p=0.001) and vitamin E (p=0.043) reduced plasma F2-isoprostanes. In the overall sample, changes from baseline were +6.8%, −10.6%, and −3.9% for placebo, vitamin C, and vitamin E groups, respectively. However, a significant interaction with baseline F2-isoprostane was found. When baseline F2-isoprostane was > 50 μg/mL, vitamin C reduced F2-isoprostane by 22% (p=0.01). Vitamin E reduced it by 9.8% (p=0.46). Below that cut-point, neither treatment produced further reductions. F2-isoprostane > 50 μg/mL was strongly associated with obesity, and was present in 42% of the sample. Change in malondialdehyde concentration was minimal. Discussion These findings suggest a role for vitamin C in reducing lipid peroxidation. Future research on effects of vitamins C or E on plasma F2-isoprostane should limit participants to those with baseline levels > 50 μg/mL. Further studies are needed to establish whether treatment with vitamins C or E in persons with concentrations above that cut-point could slow the development of cardiovascular disease. PMID:18455517

  18. Occupational Class Differences in Body Mass Index and Weight Gain in Japan and Finland

    PubMed Central

    Silventoinen, Karri; Tatsuse, Takashi; Martikainen, Pekka; Rahkonen, Ossi; Lahelma, Eero; Sekine, Michikazu; Lallukka, Tea

    2013-01-01

    Background Occupational class differences in body mass index (BMI) have been systematically reported in developed countries, but the studies have mainly focused on white populations consuming a Westernized diet. We compared occupational class differences in BMI and BMI change in Japan and Finland. Methods The baseline surveys were conducted during 1998–1999 among Japanese (n = 4080) and during 2000–2002 among Finnish (n = 8685) public-sector employees. Follow-up surveys were conducted among those still employed, in 2003 (n = 3213) and 2007 (n = 7086), respectively. Occupational class and various explanatory factors were surveyed in the baseline questionnaires. Linear regression models were used for data analysis. Results BMI was higher at baseline and BMI gain was more rapid in Finland than in Japan. In Finland, baseline BMI was lowest among men and women in the highest occupational class and progressively increased to the lowest occupational class; no gradient was found in Japan (country interaction effect, P = 0.020 for men and P < 0.0001 for women). Adjustment for confounding factors reflecting work conditions and health behavior increased the occupational class gradient among Finnish men and women, whereas factors related to social life had no effect. No statistically significant difference in BMI gain was found between occupational classes. Conclusions The occupational class gradient in BMI was strong among Finnish employees but absent among Japanese employees. This suggests that occupational class inequalities in obesity are not inevitable, even in high-income societies. PMID:24140817

  19. An advanced algorithm for deformation estimation in non-urban areas

    NASA Astrophysics Data System (ADS)

    Goel, Kanika; Adam, Nico

    2012-09-01

    This paper presents an advanced differential SAR interferometry stacking algorithm for high-resolution deformation monitoring in non-urban areas, with a focus on distributed scatterers (DSs). Techniques such as the Small Baseline Subset Algorithm (SBAS) have been proposed for processing DSs. SBAS makes use of small-baseline differential interferogram subsets. Singular value decomposition (SVD), i.e., L2-norm minimization, is applied to link independent subsets separated by large baselines. However, the interferograms used in SBAS are multilooked using a rectangular window to reduce phase noise caused, for instance, by temporal decorrelation, resulting in a loss of resolution and the superposition of topography and deformation signals from different objects. Moreover, these have to be individually phase unwrapped, which can be especially difficult in natural terrains. An improved deformation estimation technique is presented here which exploits high-resolution SAR data and is suitable for rural areas. The implemented method makes use of small-baseline differential interferograms and incorporates an object-adaptive spatial phase filtering and residual topography removal for accurate phase and coherence estimation, while preserving the high resolution provided by modern satellites. This is followed by retrieval of deformation via the SBAS approach, wherein the phase inversion is performed using an L1-norm minimization, which is more robust to the typical phase unwrapping errors encountered in non-urban areas. Meter-resolution TerraSAR-X data of an underground gas storage reservoir in Germany are used to demonstrate the effectiveness of this newly developed technique in rural areas.
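
    The inversion step can be sketched compactly. Below, a small-baseline design matrix links interferogram phases to per-epoch phases, and an L1-norm solution is obtained by iteratively reweighted least squares starting from the L2 (lstsq/SVD) solution; the pair list, phases, and injected unwrapping error are synthetic, and this is a generic illustration rather than the authors' implementation.

```python
import numpy as np

def build_design(pairs, n_epochs):
    """Row k encodes dphi_k = phi_j - phi_i for pair (i, j);
    epoch 0 is held fixed as the zero-phase reference."""
    A = np.zeros((len(pairs), n_epochs - 1))
    for k, (i, j) in enumerate(pairs):
        if j > 0:
            A[k, j - 1] += 1.0
        if i > 0:
            A[k, i - 1] -= 1.0
    return A

def invert_l1(A, dphi, iters=50, eps=1e-8):
    """L1-norm phase inversion by iteratively reweighted least squares,
    starting from the plain L2 (lstsq/SVD) solution used in SBAS."""
    x = np.linalg.lstsq(A, dphi, rcond=None)[0]
    for _ in range(iters):
        w = 1.0 / np.sqrt(np.maximum(np.abs(dphi - A @ x), eps))
        x = np.linalg.lstsq(A * w[:, None], w * dphi, rcond=None)[0]
    return x

true_phase = np.array([0.0, 0.4, 0.9, 1.1])          # radians, synthetic
pairs = [(0, 1), (1, 2), (2, 3), (0, 3), (0, 2)]
A = build_design(pairs, 4)
dphi = np.array([true_phase[j] - true_phase[i] for i, j in pairs])
dphi[4] += 2.5                                       # one unwrapping error
print(invert_l1(A, dphi))    # approximately [0.4, 0.9, 1.1] despite the error
```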

  20. Obesity Reduction Black Intervention Trial (ORBIT): Design and Baseline Characteristics

    PubMed Central

    Stolley, Melinda; Schiffer, Linda; Sharp, Lisa; Singh, Vicky; Van Horn, Linda; Dyer, Alan

    2008-01-01

    Background Obesity is associated with many chronic diseases, and weight loss can reduce the risk of developing these diseases. Obesity is highly prevalent among Black women, but weight loss treatment for Black women has been understudied until recently. The Obesity Reduction Black Intervention Trial (ORBIT) is a randomized controlled trial designed to assess the efficacy of a culturally proficient weight loss and weight loss maintenance program for Black women. This paper describes the design of the trial, the intervention, and baseline characteristics of the participants. Methods Two hundred thirteen obese Black women aged 30–65 years were randomized to the intervention group or a general health control group. The intervention consists of a 6-month weight loss program followed by a 1-year maintenance program. Weight, dietary intake, and energy expenditure are measured at baseline, 6 months, and 18 months. Results More than 40% of participants had a baseline body mass index (BMI) >40 kg/m2 (class III obesity). Intake of fat and saturated fat was higher and consumption of fruit, vegetables, and fiber was lower than currently recommended guidelines. Self-reported moderate to vigorous physical activity was high (median 85 min/day). However, objectively measured physical activity among a subgroup of participants was lower (median 15 min/day). Conclusions Weight loss among obese Black women has received inadequate attention in relation to the magnitude of the problem. Factors that contribute to successful weight loss and, more importantly, weight loss maintenance need to be identified. PMID:18774895

  1. Current Status of the Development of a Transportable and Compact VLBI System by NICT and GSI

    NASA Technical Reports Server (NTRS)

    Ishii, Atsutoshi; Ichikawa, Ryuichi; Takiguchi, Hiroshi; Takefuji, Kazuhiro; Ujihara, Hideki; Koyama, Yasuhiro; Kondo, Tetsuro; Kurihara, Shinobu; Miura, Yuji; Matsuzaka, Shigeru; hide

    2010-01-01

    MARBLE (Multiple Antenna Radio-interferometer for Baseline Length Evaluation) is under development by NICT and GSI. The main part of MARBLE is a transportable VLBI system with a compact antenna. The aim of this system is to provide precise baseline lengths of about 10 km for calibration baselines. The calibration baselines are used to check and validate surveying instruments such as GPS receivers and EDM (Electro-optical Distance Meters). It is necessary to examine the calibration baselines regularly to maintain the quality of the validation. The VLBI technique can examine and evaluate the calibration baselines. On the other hand, the following roles are expected of a compact VLBI antenna in the VLBI2010 project. In order to achieve the challenging measurement precision of VLBI2010, it is well known that the problem of thermal and gravitational deformation of the antenna must be addressed. One promising approach may be connected-element interferometry between a compact antenna and a VLBI2010 antenna. By repeatedly measuring the baseline between the small, stable antenna and the VLBI2010 antenna, the deformation of the primary antenna can be measured, and thermal and gravitational models of the primary antenna can be constructed. We built two prototypes of a transportable and compact VLBI system from 2007 to 2009, performed VLBI experiments using these prototypes, and obtained the baseline length between the two prototypes. The formal error of the measured baseline length was 2.7 mm. We expect that the baseline length error will be reduced by using a high-speed A/D sampler.

  2. Incidence of Type 2 Diabetes Using Proposed HbA1c Diagnostic Criteria in the European Prospective Investigation of Cancer–Norfolk Cohort

    PubMed Central

    Chamnan, Parinya; Simmons, Rebecca K.; Forouhi, Nita G.; Luben, Robert N.; Khaw, Kay-Tee; Wareham, Nicholas J.; Griffin, Simon J.

    2011-01-01

    OBJECTIVE To evaluate the incidence and relative risk of type 2 diabetes defined by the newly proposed HbA1c diagnostic criteria in groups categorized by different baseline HbA1c levels. RESEARCH DESIGN AND METHODS Using data from the European Prospective Investigation of Cancer (EPIC)-Norfolk cohort with repeat HbA1c measurements, we estimated the prevalence of known and previously undiagnosed diabetes at baseline (baseline HbA1c ≥6.5%) and the incidence of diabetes over 3 years. We also examined the incidence and corresponding odds ratios (ORs) by different levels of baseline HbA1c. Incident diabetes was defined clinically (self-report at follow-up, prescribed diabetes medication, or inclusion on a diabetes register) or biochemically (HbA1c ≥6.5% at the second health assessment), or both. RESULTS The overall prevalence of diabetes was 4.7%; 41% of prevalent cases were previously undiagnosed. Among 5,735 participants without diabetes at baseline (identified clinically or using HbA1c criteria, or both), 72 developed diabetes over 3 years (1.3% [95% CI 1.0–1.5]), of which 49% were identified using the HbA1c criteria. In 6% of the total population, the baseline HbA1c was 6.0–6.4%; 36% of incident cases arose in this group. The incidence of diabetes in this group was 15 times higher than in those with a baseline HbA1c of <5.0% (OR 15.5 [95% CI 7.2–33.3]). CONCLUSIONS The cumulative incidence of diabetes defined using a newly proposed HbA1c threshold in this middle-aged British cohort was 1.3% over 3 years. Targeting interventions to individuals with an HbA1c of 6.0–6.4% might represent a feasible preventive strategy, although complementary population-based preventive strategies are also needed to reduce the growing burden of diabetes. PMID:20622160
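    The incidence definition above is rule-based, so it can be made concrete in a few lines. Below is a minimal sketch of that classification logic; the function and argument names (including the two boolean clinical flags) are ours, not the study's.

```python
def incident_diabetes(baseline_hba1c, followup_hba1c,
                      known_at_baseline=False, clinical_at_followup=False):
    """Sketch of the study's incidence rule (HbA1c in %, NGSP units).

    The at-risk population excludes anyone with diabetes at baseline,
    identified clinically or by HbA1c >= 6.5%. An incident case is then
    defined clinically (self-report, medication, diabetes register) or
    biochemically (follow-up HbA1c >= 6.5%), or both.
    """
    if known_at_baseline or baseline_hba1c >= 6.5:
        return None  # prevalent at baseline; excluded from incidence
    return clinical_at_followup or followup_hba1c >= 6.5
```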

  3. Immunogenicity of Fractional-Dose Vaccine during a Yellow Fever Outbreak - Preliminary Report.

    PubMed

    Ahuka-Mundeke, Steve; Casey, Rebecca M; Harris, Jennifer B; Dixon, Meredith G; Nsele, Pierre M; Kizito, Gabriel M; Umutesi, Grace; Laven, Janeen; Paluku, Gilson; Gueye, Abdou S; Hyde, Terri B; Sheria, Guylain K M; Muyembe-Tanfum, Jean-Jacques; Staples, J Erin

    2018-02-14

    Background In 2016, the response to a yellow fever outbreak in Angola and the Democratic Republic of Congo led to a global shortage of yellow fever vaccine. As a result, a fractional dose of the 17DD yellow fever vaccine (containing one fifth [0.1 ml] of the standard dose) was offered to 7.6 million children 2 years of age or older and nonpregnant adults in a preemptive campaign in Kinshasa. The goal of this study was to assess the immune response to the fractional dose in a large-scale campaign. Methods We recruited participants in four age strata at six vaccination sites. We assessed neutralizing antibody titers against yellow fever virus in blood samples obtained before vaccination and 28 to 35 days after vaccination, using a plaque reduction neutralization test with a 50% cutoff (PRNT50). Participants with a PRNT50 titer of 10 or higher at baseline were considered to be seropositive. Those with a baseline titer of less than 10 who became seropositive at follow-up were classified as having undergone seroconversion. Participants who were seropositive at baseline and who had an increase in the titer by a factor of 4 or more at follow-up were classified as having an immune response. Results Among 716 participants who completed follow-up, 705 (98%; 95% confidence interval [CI], 97 to 99) were seropositive after vaccination. Among 493 participants who were seronegative at baseline, 482 (98%; 95% CI, 96 to 99) underwent seroconversion. Among 223 participants who were seropositive at baseline, 148 (66%; 95% CI, 60 to 72) had an immune response. Lower baseline titers were associated with a higher probability of having an immune response (P<0.001). Conclusions A fractional dose of the 17DD yellow fever vaccine was effective at inducing seroconversion in most of the participants who were seronegative at baseline. These findings support the use of fractional-dose vaccination for outbreak control. (Funded by the U.S. Agency for International Development and the Centers for Disease Control and Prevention.)
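    The serostatus definitions reduce to a small decision rule. The sketch below expresses it directly; the function name is ours, and titers are reciprocal dilutions as in the abstract.

```python
def serostatus_outcome(baseline_titer, followup_titer):
    """Apply the study's PRNT50-based definitions (illustrative sketch).

    A titer >= 10 means seropositive. Seronegative participants who become
    seropositive at follow-up have seroconverted; participants already
    seropositive at baseline show an immune response only if the titer
    rises by a factor of 4 or more.
    """
    if baseline_titer < 10:
        return "seroconversion" if followup_titer >= 10 else "no seroconversion"
    return ("immune response" if followup_titer >= 4 * baseline_titer
            else "no immune response")
```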

  4. HIGH-PRECISION ASTROMETRIC MILLIMETER VERY LONG BASELINE INTERFEROMETRY USING A NEW METHOD FOR ATMOSPHERIC CALIBRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rioja, M.; Dodson, R., E-mail: maria.rioja@icrar.org

    2011-04-15

    We describe a new method which achieves high-precision very long baseline interferometry (VLBI) astrometry in observations at millimeter (mm) wavelengths. It combines fast frequency-switching observations, to correct for the dominant non-dispersive tropospheric fluctuations, with slow source-switching observations, for the remaining ionospheric dispersive terms. We call this method source-frequency phase referencing. Provided that the switching cycles match the properties of the propagation media, one can recover the source astrometry. We present an analytic description of the two-step calibration strategy, along with an error analysis to characterize its performance. Also, we provide observational demonstrations of a successful application with observations using the Very Long Baseline Array at 86 GHz of the pairs of sources 3C274 and 3C273 and 1308+326 and 1308+328 under various conditions. We conclude that this method is widely applicable to mm-VLBI observations of many target sources, and unique in providing bona fide astrometrically registered images and high-precision relative astrometric measurements in mm-VLBI using existing and newly built instruments, including space VLBI.
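    The frequency-switching step exploits the fact that tropospheric phase errors are non-dispersive (they scale linearly with frequency), while ionospheric errors scale inversely with frequency. A sketch of the first calibration step, in our notation rather than the authors':

```latex
% Scale the phase solution from the low reference frequency \nu_l to the
% target frequency \nu_h and subtract it:
\phi^{\mathrm{corr}}_{\nu_h} = \phi_{\nu_h} - R\,\phi_{\nu_l},
\qquad R = \frac{\nu_h}{\nu_l}.
% Non-dispersive terms (troposphere, geometry) obey \phi \propto \nu and
% cancel exactly; dispersive ionospheric terms obey \phi \propto 1/\nu and
% leave a residual of order (R - 1/R)\,\phi^{\mathrm{ion}}_{\nu_l}, which is
% removed in the second step by slow switching to a nearby calibrator source.
```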

  5. Runway Scheduling Using Generalized Dynamic Programming

    NASA Technical Reports Server (NTRS)

    Montoya, Justin; Wood, Zachary; Rathinam, Sivakumar

    2011-01-01

    A generalized dynamic programming method for finding a set of Pareto optimal solutions for a runway scheduling problem is introduced. The algorithm generates a set of runway flight sequences that are optimal for both runway throughput and delay. Realistic time-based operational constraints are considered, including miles-in-trail separation, runway crossings, and wake vortex separation. The authors also model divergent runway takeoff operations to allow for reduced wake vortex separation. A model of Dallas/Fort Worth International Airport and three baseline heuristics are used to illustrate preliminary benefits of using the generalized dynamic programming method. Simulated traffic levels ranged from 10 to 30 aircraft, with each test case spanning 15 minutes. The optimal solution shows a 40-70 percent decrease in the expected delay per aircraft over the baseline schedulers. Computational results suggest that the algorithm is promising for real-time application, with an average computation time of 4.5 seconds. For even faster computation times, two heuristics are developed. Compared to the optimal, the heuristics are within 5% of the expected delay per aircraft and 1% of the expected number of runway operations per hour, and can be 100x faster.
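    The core idea of a Pareto dynamic program over runway sequences can be sketched compactly. The toy below schedules a single runway with one uniform separation time and keeps only non-dominated (completion time, total delay) labels per state; all names, and the simplification to a single separation value, are ours, not the paper's.

```python
def dominates(b, a):
    """True if label b is no worse than a in both objectives and differs."""
    return b[0] <= a[0] and b[1] <= a[1] and b != a

def add_label(labels, cand):
    """Keep the label set mutually non-dominated after inserting cand."""
    if cand in labels or any(dominates(c, cand) for c in labels):
        return labels
    return [c for c in labels if not dominates(cand, c)] + [cand]

def pareto_runway_schedule(eta, sep):
    """Pareto DP over single-runway sequences (toy sketch).

    eta[i] is aircraft i's earliest feasible runway time; sep is the
    required separation between consecutive operations. Each DP state
    (scheduled set, last aircraft) stores non-dominated labels of
    (completion time, total delay).
    """
    n = len(eta)
    states = {(frozenset(), None): [(0.0, 0.0)]}
    for _ in range(n):
        nxt = {}
        for (done, last), labels in states.items():
            for i in set(range(n)) - done:
                for t_last, delay in labels:
                    t = max(eta[i], t_last + (sep if last is not None else 0.0))
                    key = (done | {i}, i)
                    nxt[key] = add_label(nxt.get(key, []), (t, delay + t - eta[i]))
        states = nxt
    pareto = []
    for labels in states.values():   # every state now covers all aircraft
        for lab in labels:
            pareto = add_label(pareto, lab)
    return sorted(pareto)

# e.g. pareto_runway_schedule([0.0, 0.5, 1.0], sep=2.0)
```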

  6. Postangioplasty restenosis followed with magnetic resonance imaging in an atherosclerotic rabbit model.

    PubMed

    Hänni, Mari; Leppänen, Olli; Smedby, Orjan

    2012-01-01

    Rationale and Objectives. To test a quantitative, noninvasive method for assessing postangioplasty vessel wall changes in an animal model. Material and Methods. Six New Zealand white rabbits were subjected to atherosclerotic injury, including a cholesterol-enriched diet, deendothelialization, and percutaneous transluminal angioplasty (PTA) in the distal part of the abdominal aorta (four weeks after deendothelialization). The animals were examined with a 1.5T MRI scanner at three time points: baseline (six weeks after the start of the diet and two days after PTA), four weeks after PTA, and 10 weeks after PTA. An inflow angiography sequence (M2DI) and a proton-density-weighted sequence (PDW) were used to examine the aorta with axial slices. To identify the inner and outer vessel wall boundaries, a dynamic contour algorithm (Gradient Vector Flow Snakes) was applied to the images, followed by calculation of the vessel wall dimensions. The results were compared with histopathological analysis. Results. The wall thickness in the lesion was significantly higher than in the control region at 4 and 10 weeks, reflecting the induction of the experimentally created postangioplasty lesion. At baseline, no significant difference between the two regions was present. Conclusions. It is possible to follow the development of vessel wall changes after PTA with MRI in this rabbit model.
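    As an illustration of the contour step, the sketch below initializes a circular snake around the vessel and lets an active contour settle on the wall boundary. The paper used Gradient Vector Flow snakes; scikit-image ships only the classic Kass formulation, which stands in here, and all parameter values are invented.

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def wall_boundary(image, center, radius, n_points=200):
    """Fit a closed contour to one vessel wall boundary in an axial slice.

    center/radius initialize a circle around the aorta; the snake then
    relaxes onto the nearby intensity edge. Running it twice (smaller and
    larger radius) gives inner and outer boundaries, whose radial distance
    yields the wall thickness.
    """
    theta = np.linspace(0, 2 * np.pi, n_points)
    init = np.column_stack([center[0] + radius * np.sin(theta),
                            center[1] + radius * np.cos(theta)])
    smooth = gaussian(image, sigma=2, preserve_range=True)
    return active_contour(smooth, init, alpha=0.015, beta=10, gamma=0.001)
```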

  7. Optimization of the Separation of NDA-Derivatized Methylarginines by Capillary and Microchip Electrophoresis

    PubMed Central

    Linz, Thomas H.; Snyder, Christa M.; Lunte, Susan M.

    2013-01-01

    The methylated arginines (MAs) monomethylarginine (MMA), asymmetric dimethylarginine (ADMA), and symmetric dimethylarginine (SDMA) have been shown to be independent predictors of cardiovascular disease. This article describes progress regarding the development of an analytical method capable of rapidly analyzing MAs using capillary electrophoresis (CE) and microchip electrophoresis (MCE) with laser-induced fluorescence (LIF) detection. Several parameters including buffer composition and separation voltage were optimized to achieve an ideal separation. The analytes of interest were derivatized with naphthalene-2,3-dicarboxaldehyde (NDA) to produce fluorescent 1-cyanobenz[f]isoindole (CBI) derivatives and then subjected to CE analysis. Baseline resolution of SDMA, ADMA, MMA, and arginine was achieved in less than 8 min. The limits of detection for SDMA, ADMA, MMA, and arginine were determined to be 15, 20, 25, and 5 nM, respectively, which are well below the expected plasma concentrations. The CE separation method was then transferred to a glass MCE device with LIF detection. MAs were baseline resolved in 3 min on-chip using a 14 cm separation channel with detection limits of approximately 10 nM for each species. To the best of the authors’ knowledge, this is the first report of the separation of MAs by MCE. PMID:22357605

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLellan, G.W.

    This test plan describes the field demonstration of the sonic drilling system being conducted as a coordinated effort between the VOC-Arid ID (Integrated Demonstration) and the 200 West Area Carbon Tetrachloride ERA (Expedited Response Action) programs at Hanford. The purpose of this test is to evaluate the Water Development Corporation's drilling system, modify components as necessary, and determine compatible drilling applications for the sonic drilling method for use at facilities in the DOE complex. The sonic demonstration is being conducted as the first field test under the Cooperative Research and Development Agreement (CRADA), which involves the US Department of Energy, Pacific Northwest Laboratory, Westinghouse Hanford Company, and Water Development Corporation. The sonic drilling system will be used to drill a 45 degree vadose zone well, two vertical wells at the VOC-Arid ID site, and several test holes at the Drilling Technology Test Site north of the 200 Area fire station. Testing at other locations will depend on the performance of the drilling method. Performance of this technology will be compared to the baseline drilling method (cable-tool).

  9. Spectrometer Baseline Control Via Spatial Filtering

    NASA Technical Reports Server (NTRS)

    Burleigh, M. R.; Richey, C. R.; Rinehart, S. A.; Quijada, M. A.; Wollack, E. J.

    2016-01-01

    An absorptive half-moon aperture mask is experimentally explored as a broad-bandwidth means of eliminating spurious spectral features arising from reprocessed radiation in an infrared Fourier transform spectrometer. In the presence of the spatial filter, an order of magnitude improvement in the fidelity of the spectrometer baseline is observed. The method is readily accommodated within the context of commonly employed instrument configurations and leads to a factor of two reduction in optical throughput. A detailed discussion of the underlying mechanism and limitations of the method is provided.

  10. Development of realtime connected element interferometry at the Goldstone Deep Space Communications Complex

    NASA Technical Reports Server (NTRS)

    Edwards, C. D.

    1990-01-01

    Connected-element interferometry (CEI) has the potential to provide high-accuracy angular spacecraft tracking on short baselines by making use of the very precise phase delay observable. Within the Goldstone Deep Space Communications Complex (DSCC), one of three tracking complexes in the NASA Deep Space Network, baselines of up to 21 km in length are available. An analysis of data from a series of short-baseline phase-delay interferometry experiments is presented to demonstrate the potential tracking accuracy on these baselines. Repeated differential observations of pairs of angularly close extragalactic radio sources were made to simulate differential spacecraft-quasar measurements. Fiber-optic data links and a correlation processor are currently being developed and installed at Goldstone for a demonstration of real-time CEI in 1990.
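    A back-of-the-envelope calculation shows why the phase delay observable is so powerful even on a short baseline; the numerical values below are ours, chosen only for illustration:

```latex
% Angular error from a differential delay error on a baseline B:
\sigma_\theta \approx \frac{c\,\sigma_\tau}{B},
\qquad
\sigma_\tau = \frac{\sigma_\phi}{2\pi\nu}.
% For an illustrative phase error \sigma_\phi = 0.1 rad at \nu = 8.4 GHz
% (X band), \sigma_\tau \approx 1.9 ps, and on the longest Goldstone
% baseline, B = 21 km,
% \sigma_\theta \approx (3\times10^{8})(1.9\times10^{-12})/(2.1\times10^{4})
%                \approx 2.7\times10^{-8}\ \mathrm{rad} \approx 5.6\ \mathrm{mas}.
```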

  11. The Impact of Cardiorespiratory Fitness Levels on the Risk of Developing Atherogenic Dyslipidemia.

    PubMed

    Breneman, Charity B; Polinski, Kristen; Sarzynski, Mark A; Lavie, Carl J; Kokkinos, Peter F; Ahmed, Ali; Sui, Xuemei

    2016-10-01

    Low cardiorespiratory fitness has been established as a risk factor for cardiovascular-related morbidity. However, research about the impact of fitness on lipid abnormalities, including atherogenic dyslipidemia, has produced mixed results. The purpose of this investigation is to examine the influence of baseline fitness and changes in fitness on the development of atherogenic dyslipidemia. All participants completed at least 3 comprehensive medical examinations performed by a physician that included a maximal treadmill test between 1976 and 2006 at the Cooper Clinic in Dallas, Texas. Atherogenic dyslipidemia was defined as a triad of lipid abnormalities: low high-density-lipoprotein cholesterol ([HDL-C] <40 mg/dL), high triglycerides ([TGs] ≥200 mg/dL), and high low-density-lipoprotein cholesterol ([LDL-C] ≥160 mg/dL). A total of 193 participants developed atherogenic dyslipidemia during an average of 8.85 years of follow-up. High baseline fitness was protective against the development of atherogenic dyslipidemia in comparison with those with low fitness (odds ratio [OR] 0.57; 95% confidence interval [CI], 0.37-0.89); however, this relationship became nonsignificant after controlling for baseline HDL-C, LDL-C, and TG levels. Participants who maintained fitness over time had lower odds of developing atherogenic dyslipidemia than those with a reduction in fitness (OR 0.56; 95% CI, 0.34-0.91) after adjusting for baseline confounders and changes in known risk factors. High fitness at baseline and maintenance of fitness over time are protective against the development of atherogenic dyslipidemia. Copyright © 2016 Elsevier Inc. All rights reserved.
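    The outcome definition is a simple conjunction of three thresholds, so it can be stated directly in code; the function name is ours.

```python
def atherogenic_dyslipidemia(hdl_c, tg, ldl_c):
    """Study definition of atherogenic dyslipidemia: all three lipid
    abnormalities must be present simultaneously (values in mg/dL)."""
    return hdl_c < 40 and tg >= 200 and ldl_c >= 160
```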

  12. Development and validation of stability indicating the RP-HPLC method for the estimation of related compounds of guaifenesin in pharmaceutical dosage forms.

    PubMed

    Reddy, Sunil Pingili; Babu, K Sudhakar; Kumar, Navneet; Sekhar, Y V V Sasi

    2011-10-01

    A stability-indicating gradient reverse phase liquid chromatographic (RP-LC) method was developed for the quantitative determination of related substances of guaifenesin in pharmaceutical formulations. Baseline separation of guaifenesin and all impurities was achieved using a Waters Symmetry C18 column (150 mm × 4.6 mm, 5 μm particle size) and a gradient elution method. Mobile phase A contained a mixture of 0.02 M KH2PO4 (pH 3.2) and methanol in the ratio of 90:10 v/v, while mobile phase B contained 0.02 M KH2PO4 (pH 3.2) and methanol in the ratio of 10:90 v/v. The flow rate of the mobile phase was 0.8 ml/min, with a column temperature of 25°C and a detection wavelength of 273 nm. Guaifenesin was subjected to the stress conditions of oxidative, acid, base, hydrolytic, thermal, and photolytic degradation. The developed method was validated as per ICH guidelines with respect to specificity, linearity, limit of detection and quantification, accuracy, precision, and robustness.

  13. Separation of dietary omega-3 and omega-6 fatty acids in food by capillary electrophoresis.

    PubMed

    Soliman, Laiel C; Donkor, Kingsley K; Church, John S; Cinel, Bruno; Prema, Dipesh; Dugan, Michael E R

    2013-10-01

    A lower dietary omega-6/omega-3 (n-6/n-3) fatty acid ratio (<4) has been shown to be beneficial in preventing a number of chronic illnesses. Interest exists in developing more rapid and sensitive analytical methods for profiling fatty acid levels in foods. An aqueous CE method was developed for the simultaneous determination of 15 relevant n-3 and n-6 fatty acids. The effects of buffer pH and concentration, the type and concentration of organic modifier, and the additive on the separation were investigated in order to determine the best conditions for the analysis. Baseline separation of the 15 fatty acids was achieved using 40 mM borate buffer at pH 9.50 containing 50 mM SDS, 10 mM β-cyclodextrin, and 10% acetonitrile. The developed CE method has LODs of <5 mg/L and good linearity (R² > 0.980) for all fatty acids studied. The proposed method was successfully applied to the determination of n-3 and n-6 fatty acids in flax seed, Udo® oils, and a selection of grass-fed and grain-fed beef muscle samples. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Assessing oral health-related quality of life in general dental practice in Scotland: validation of the OHIP-14.

    PubMed

    Fernandes, Marcelo José; Ruta, Danny Adolph; Ogden, Graham Richard; Pitts, Nigel Berry; Ogston, Simon Alexander

    2006-02-01

    To validate the Oral Health Impact Profile (OHIP)-14 in a sample of patients attending general dental practice. Patients with pathology-free impacted wisdom teeth were recruited from six general dental practices in Tayside, Scotland, and followed for a year to assess the development of problems related to impaction. The OHIP-14 was completed at baseline and at 1-year follow-up, and analysed using three different scoring methods: a summary score, a weighted and standardized score, and the total number of problems reported. Instrument reliability was measured by assessing internal consistency and test-retest reliability. Construct validity was assessed using a number of variables. Linear regression was then used to model the relationship between OHIP-14 and all significantly correlated variables. Responsiveness was measured using the standardized response mean (SRM). Adjusted R²s and SRMs were calculated for each of the three scoring methods. Estimates for the differences between adjusted R²s and the differences between SRMs were obtained with 95% confidence intervals. A total of 278 and 169 patients completed the questionnaire at baseline and follow-up, respectively. Reliability - Cronbach's alpha coefficients ranged from 0.30 to 0.75. Alpha coefficients for all 14 items were 0.88 and 0.87 for baseline and follow-up, respectively. Test-retest coefficients ranged from 0.72 to 0.78. Validity - OHIP-14 scores were significantly correlated with number of teeth, education, main activity, the use of mouthwash, frequency of seeing a dentist, the reason for the last dental appointment, smoking, alcohol intake, pain and symptoms. Adjusted R²s ranged from 0.123 to 0.202, and there were no statistically significant differences between those for the three different scoring methods. Responsiveness - The SRMs ranged from 0.37 to 0.56, and there was a statistically significant difference between the summary scores method and the total number of problems method for symptomatic patients. The OHIP-14 is a valid and reliable measure of oral health-related quality of life in general dental practice and is responsive to third molar clinical change. The summary score method demonstrated performance as good as, or better than, the other methods studied.
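    For reference, the internal-consistency statistic reported above is straightforward to compute; a minimal sketch (ours, not the authors' analysis code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)
```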

  15. Comparing model-based adaptive LMS filters and a model-free hysteresis loop analysis method for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Zhou, Cong; Chase, J. Geoffrey; Rodgers, Geoffrey W.; Xu, Chao

    2017-02-01

    The model-free hysteresis loop analysis (HLA) method for structural health monitoring (SHM) has significant advantages over traditional model-based SHM methods, which require a suitable baseline model to represent the actual system response. This paper provides a unique validation against both an experimental reinforced concrete (RC) building and a calibrated numerical model to delineate the capability of the model-free HLA method and the adaptive least mean squares (LMS) model-based method in detecting, localizing, and quantifying damage that may not be visible or observable in the overall structural response. Results clearly show the model-free HLA method is capable of adapting to changes in how structures transfer load or demand across structural elements over time and across multiple events of different sizes. The adaptive LMS model-based method, however, spread smaller amounts of identified damage across time and stories when the baseline model was not well defined. Finally, the two algorithms are tested on a simpler steel structure with typical hysteretic behaviour to quantify the impact of mismatch between the baseline model used for identification and the actual response. The overall results highlight the need for model-based methods to have an appropriate model that can capture the observed response in order to yield accurate results, even in small events where the structure remains linear.
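    For readers unfamiliar with the model-based ingredient, the sketch below shows a generic adaptive LMS identification loop: the filter weights track the input-output map, and drift in the identified parameters between events can flag possible damage. This is a textbook LMS, not the paper's SHM pipeline; all names are ours.

```python
import numpy as np

def lms_identify(x, d, n_taps=4, mu=0.01):
    """Generic adaptive LMS system identification (sketch).

    x: input signal (e.g., ground excitation), d: measured response.
    Returns the final weight vector and the prediction-error history.
    """
    x = np.asarray(x, dtype=float)
    d = np.asarray(d, dtype=float)
    w = np.zeros(n_taps)
    err = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # most recent input samples first
        y = w @ u                          # filter output (predicted response)
        err[n] = d[n] - y                  # prediction error
        w += 2 * mu * err[n] * u           # steepest-descent weight update
    return w, err
```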

  16. Supersonic Aftbody Closure Wind-Tunnel Testing, Data Analysis, and Computational Results

    NASA Technical Reports Server (NTRS)

    Allen, Jerry; Martin, Grant; Kubiatko, Paul

    1999-01-01

    This paper reports on the model, test, and results from the Langley Supersonic Aftbody Closure wind tunnel test. This project is an experimental evaluation of the 1.5% Technology Concept Aircraft (TCA) aftbody closure model (Model 23) in the Langley Unitary Plan Wind Tunnel. The baseline TCA design is the result of a multidisciplinary, multipoint optimization process and was developed using linear design and analysis methods, supplemented with Euler and Navier-Stokes numerical methods. After a thorough design review, it was decided to use an upswept blade attached to the forebody as the mounting system; structural concerns dictated that a wingtip support system would not be feasible. Only the aftbody part of the model is metric. The metric break was chosen to be at the fuselage station where prior aft-sting supported models had been truncated. Model 23 is thus a modified version of Model 20: the wing strongback, flap parts, and nacelles from Model 20 were reused, whereas new aftbodies, a common forebody, and some new tails were fabricated. In summary, significant differences in longitudinal and directional stability and control characteristics between the ABF and ABB aftbody geometries were measured. Correcting the experimental data obtained for the TCA configuration with the flared aftbody to be representative of the baseline TCA closed aftbody will result in a significant reduction in longitudinal stability, a moderate reduction in stabilizer effectiveness and directional stability, and a moderate to significant reduction in rudder effectiveness. These reductions in the stability and control effectiveness levels of the baseline TCA closed aftbody are attributed to the reduction in carry-over area.

  17. Effect of abdominal visceral fat on the development of new erosive oesophagitis: a prospective cohort study.

    PubMed

    Nam, Su Youn; Kim, Young-Woo; Park, Bum Joon; Ryu, Kum Hei; Choi, Il Ju; Nam, Byung-Ho; Kim, Hyun Boem

    2017-04-01

    Although abdominal visceral fat has been associated with erosive oesophagitis in cross-sectional studies, there are no data that describe its longitudinal effects. We aimed to evaluate the longitudinal effects of abdominal visceral fat on the development of new erosive oesophagitis in patients who did not have erosive oesophagitis at baseline. This was a single-centre prospective study. A total of 1503 participants without erosive oesophagitis at baseline were followed up for 34 months, undergoing oesophagogastroduodenoscopy and computed tomography at both baseline and follow-up. The longitudinal effects of abdominal visceral fat on the development of new erosive oesophagitis were evaluated using odds ratios (ORs) and 95% confidence intervals (CIs). New oesophagitis developed in 83 patients. Compared with the first quartile, the third (OR=3.96, 95% CI: 1.54-10.18) and fourth (OR=4.67, 95% CI: 1.79-12.23) baseline visceral fat quartiles, the third (OR=3.03, 95% CI: 1.14-8.04) and fourth (OR=7.50, 95% CI: 2.92-19.25) follow-up visceral fat quartiles, and the fourth visceral fat change quartile (OR=2.76, 95% CI: 1.47-5.21) were associated with increased development of new erosive oesophagitis, with a P value for each trend of less than 0.001. New erosive oesophagitis was inversely related to follow-up Helicobacter pylori status and positively associated with the presence of a hiatal hernia and smoking during follow-up, but it was not associated with reflux symptoms, H. pylori status, presence of a hiatal hernia, or smoking at baseline. Higher visceral fat levels at baseline and follow-up, and greater changes in visceral fat, were linearly associated with the development of new erosive oesophagitis in this longitudinal study.

  18. Adaptive selection of diurnal minimum variation: a statistical strategy to obtain representative atmospheric CO2 data and its application to European elevated mountain stations

    NASA Astrophysics Data System (ADS)

    Yuan, Ye; Ries, Ludwig; Petermeier, Hannes; Steinbacher, Martin; Gómez-Peláez, Angel J.; Leuenberger, Markus C.; Schumacher, Marcus; Trickl, Thomas; Couret, Cedric; Meinhardt, Frank; Menzel, Annette

    2018-03-01

    Critical data selection is essential for determining representative baseline levels of atmospheric trace gases even at remote measurement sites. Different data selection techniques have been used around the world, which could potentially lead to reduced compatibility when comparing data from different stations. This paper presents a novel statistical data selection method named adaptive diurnal minimum variation selection (ADVS) based on CO2 diurnal patterns typically occurring at elevated mountain stations. Its capability and applicability were studied on records of atmospheric CO2 observations at six Global Atmosphere Watch stations in Europe, namely, Zugspitze-Schneefernerhaus (Germany), Sonnblick (Austria), Jungfraujoch (Switzerland), Izaña (Spain), Schauinsland (Germany), and Hohenpeissenberg (Germany). Three other frequently applied statistical data selection methods were included for comparison. Among the studied methods, our ADVS method resulted in a lower fraction of data selected as a baseline with lower maxima during winter and higher minima during summer in the selected data. The measured time series were analyzed for long-term trends and seasonality by a seasonal-trend decomposition technique. In contrast to unselected data, mean annual growth rates of all selected datasets were not significantly different among the sites, except for the data recorded at Schauinsland. However, clear differences were found in the annual amplitudes as well as the seasonal time structure. Based on a pairwise analysis of correlations between stations on the seasonal-trend decomposed components by statistical data selection, we conclude that the baseline identified by the ADVS method is a better representation of lower free tropospheric (LFT) conditions than baselines identified by the other methods.
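    To make the idea of diurnal-minimum selection concrete, here is a deliberately simplified sketch: for each day it keeps only the calmest few hours, mimicking the low-variation conditions around the diurnal minimum that ADVS targets. The published algorithm adapts its selection per station; the fixed window length and stability threshold below are our invention.

```python
import pandas as pd

def diurnal_minimum_selection(co2, window_hours=6, max_std=0.3):
    """Toy diurnal-minimum-variation selection (not the published ADVS).

    co2: hourly pandas Series with a DatetimeIndex. For each day, find the
    contiguous window of `window_hours` with the smallest within-window
    standard deviation and keep it as baseline only if that deviation is
    below `max_std` ppm.
    """
    kept = []
    for _, day in co2.groupby(co2.index.date):
        vals = day.dropna()
        if len(vals) < window_hours:
            continue
        stds = vals.rolling(window_hours).std()
        end = stds.idxmin()                 # timestamp ending the calmest window
        if stds.loc[end] <= max_std:        # stable enough to count as baseline
            kept.append(vals.loc[:end].iloc[-window_hours:])
    return pd.concat(kept) if kept else co2.iloc[:0]
```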

  19. Fast HPLC-DAD quantification of nine polyphenols in honey by using second-order calibration method based on trilinear decomposition algorithm.

    PubMed

    Zhang, Xiao-Hua; Wu, Hai-Long; Wang, Jian-Yao; Tu, De-Zhu; Kang, Chao; Zhao, Juan; Chen, Yao; Miu, Xiao-Xia; Yu, Ru-Qin

    2013-05-01

    This paper describes the use of second-order calibration in the development of an HPLC-DAD method to quantify nine polyphenols in five kinds of honey samples. The sample treatment procedure was simplified considerably relative to traditional approaches. Baseline drift was also overcome by treating the drift as additional factor(s), alongside the analytes of interest, in the mathematical model. The polyphenol contents obtained by the alternating trilinear decomposition (ATLD) method have been successfully used to distinguish different types of honey. The method shows good linearity (r>0.99), speed (t<7.60 min), and accuracy, and is highly promising as a routine strategy for the identification and quantification of polyphenols in complex matrices. Copyright © 2012 Elsevier Ltd. All rights reserved.
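    Second-order calibration rests on the trilinear model X[i,j,k] = Σ_r A[i,r]·B[j,r]·C[k,r] + E[i,j,k], with elution profiles in A, spectra in B, and relative concentrations in C. ATLD is one alternating algorithm for fitting this model; the sketch below uses standard PARAFAC from TensorLy as a stand-in, on synthetic data, with an extra factor available to absorb the baseline drift.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Synthetic stand-in tensor: 60 elution times x 40 wavelengths x 12 samples.
X = np.random.rand(60, 40, 12)

# Fit the trilinear model; rank = 9 analytes + 1 extra factor so that the
# drifting baseline can be absorbed as an "additional factor".
weights, (A, B, C) = parafac(tl.tensor(X), rank=10, n_iter_max=500)

# Column r of C holds each sample's relative amount of component r;
# regressing those scores against spiked standards gives the calibration
# line used for quantification (the "second-order advantage": quantification
# succeeds even in the presence of uncalibrated interferents).
```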

  20. Carotid intima-media thickness is associated with the progression of cognitive impairment in older adults.

    PubMed

    Moon, Jae Hoon; Lim, Soo; Han, Ji Won; Kim, Kyoung Min; Choi, Sung Hee; Park, Kyong Soo; Kim, Ki Woong; Jang, Hak Chul

    2015-04-01

    We investigated the association between cardiovascular risk factors, including carotid intima-media thickness (CIMT), and the future risk of mild cognitive impairment (MCI) and dementia in elderly subjects. We conducted a population-based prospective study as a part of the Korean Longitudinal Study on Health and Aging. Our study included 348 participants who were nondemented at baseline (mean age, 71.7±6.3 years) and underwent cognitive evaluation at the 5-year follow-up. Baseline cardiovascular risk factors were compared according to the development of MCI or dementia during the study period. At the baseline evaluation, 278 subjects were cognitively normal and 70 subjects had MCI. Diagnoses of cognitive function either remained unchanged or improved during the study period in 292 subjects (nonprogression group), whereas 56 subjects showed progression of cognitive impairment to MCI or dementia (progression group). The progression group exhibited a higher prevalence of hypertension and greater CIMT compared with the nonprogression group. Other baseline cardiovascular risk factors, including sex, body mass index, diabetes mellitus, insulin resistance, total cholesterol, waist-to-hip ratio, visceral fat, pulse wave velocity, and ankle-brachial index, were not significantly different between the 2 groups. The association between greater baseline CIMT and the progression of cognitive impairment was maintained after adjustment for conventional baseline risk factors of cognitive impairment. Greater baseline CIMT was also independently associated with the development of MCI in the subjects whose baseline cognitive function was normal. Greater baseline CIMT was independently associated with the risk of cognitive impairment, such as MCI and dementia, in elderly subjects. © 2015 American Heart Association, Inc.
