Sample records for model-based therapeutic correction

  1. Physiologically Based Pharmacokinetic Modeling of Therapeutic Proteins.

    PubMed

    Wong, Harvey; Chow, Timothy W

    2017-09-01

    Biologics, or therapeutic proteins, are becoming increasingly important as treatments for disease. The most common class of biologics is monoclonal antibodies (mAbs). Recently, there has been an increase in the use of physiologically based pharmacokinetic (PBPK) modeling in pharmaceutical drug development. We review PBPK models for therapeutic proteins with an emphasis on mAbs. Due to their size and similarity to endogenous antibodies, there are distinct differences between PBPK models for small molecules and those for mAbs. The high-level organization of a typical mAb PBPK model consists of a whole-body PBPK model with organ compartments interconnected by both blood and lymph flows. The whole-body model is coupled with tissue-level submodels that describe the key mechanisms governing mAb disposition: tissue efflux via the lymphatic system, elimination by catabolism, protection from catabolism via binding to the neonatal Fc receptor (FcRn), and nonlinear binding to specific pharmacological targets of interest. The use of PBPK modeling in the development of therapeutic proteins is still in its infancy; further application will help to define its developing role in drug discovery and development.
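
The compartment structure sketched above (pools linked by blood and lymph flow, with catabolic elimination in tissue) can be illustrated with a deliberately minimal two-compartment simulation. This is not a published mAb PBPK model: the rate constants, the single lumped tissue pool, and the forward-Euler integration are all illustrative assumptions.

```python
import numpy as np

# Minimal two-compartment sketch of mAb disposition (not a full PBPK model):
# plasma -> tissue via convective uptake, tissue -> plasma via lymph return,
# first-order catabolic elimination from tissue. All parameters are illustrative.
def simulate(dose=1.0, hours=500, dt=0.1,
             k_uptake=0.01, k_lymph=0.05, k_cat=0.02):
    n = int(hours / dt)
    plasma, tissue = dose, 0.0
    t_hist, p_hist = [], []
    for i in range(n):
        flux_in = k_uptake * plasma    # convective uptake into tissue
        flux_out = k_lymph * tissue    # lymphatic return to plasma
        elim = k_cat * tissue          # catabolism in tissue
        plasma += dt * (flux_out - flux_in)
        tissue += dt * (flux_in - flux_out - elim)
        t_hist.append(i * dt)
        p_hist.append(plasma)
    return np.array(t_hist), np.array(p_hist)

t, cp = simulate()
```

A real mAb PBPK model would replace the single tissue pool with organ-specific compartments and add FcRn binding and target-mediated disposition on top of this flow structure.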

  2. Integrating School-Based and Therapeutic Conflict Management Models at School.

    ERIC Educational Resources Information Center

    D'Oosterlinck, Franky; Broekaert, Eric

    2003-01-01

    Explores the possibility of integrating school-based and therapeutic conflict management models, comparing two management models: a school-based conflict management program, "Teaching Students To Be Peacemakers"; and a therapeutic conflict management program, "Life Space Crisis Intervention." The paper concludes that integration might be possible…

  3. Ratio-based vs. model-based methods to correct for urinary creatinine concentrations.

    PubMed

    Jain, Ram B

    2016-08-01

    Creatinine-corrected urinary analyte concentration is usually computed as the ratio of the observed analyte concentration to the observed urinary creatinine concentration (UCR). This ratio-based method is flawed because it implicitly assumes that hydration is the only factor that affects urinary creatinine concentrations. On the contrary, the literature shows that age, gender, race/ethnicity, and other factors also affect UCR. Consequently, an optimal correction should account for hydration as well as the other factors, such as age, gender, and race/ethnicity, that affect UCR. Model-based creatinine correction, in which observed UCRs are used as an independent variable in regression models, has been proposed. This study evaluated the performance of the ratio-based and model-based creatinine correction methods when the effects of gender, age, and race/ethnicity are evaluated one factor at a time for selected urinary analytes and metabolites. The ratio-based method was observed to lead to statistically significant pairwise differences, for example between males and females or between non-Hispanic whites (NHW) and non-Hispanic blacks (NHB), more often than the model-based method, although, depending on the analyte of interest, the reverse is also possible. The estimated ratios of geometric means (GM), for example male to female or NHW to NHB, were also compared for the two methods. When estimated UCRs were higher for the group in the numerator of the ratio (for example, males), these ratios were higher for the model-based method; when estimated UCRs were lower for the group in the numerator (for example, NHW), the ratios were higher for the ratio-based method. The model-based method is the method of choice if all factors that affect UCR are to be accounted for.
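
The contrast between ratio-based and model-based correction can be sketched on synthetic data: when a covariate such as sex also shifts UCR, dividing by creatinine leaves a spurious group difference, whereas a regression-based adjustment removes the UCR dependence. All numbers below are made-up illustrations, not measured values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic cohort: urinary creatinine (UCR) depends on sex, not just hydration.
sex = rng.integers(0, 2, n)                          # 0 = female, 1 = male
log_ucr = 4.5 + 0.4 * sex + rng.normal(0, 0.3, n)
log_analyte = 1.0 + 0.8 * log_ucr + rng.normal(0, 0.2, n)

# Ratio-based correction: analyte / creatinine (subtraction on the log scale).
ratio_corrected = log_analyte - log_ucr

# Model-based correction: regress log(analyte) on log(UCR) and keep residuals,
# so the creatinine adjustment is estimated from the data rather than fixed at 1.
X = np.column_stack([np.ones(n), log_ucr])
beta, *_ = np.linalg.lstsq(X, log_analyte, rcond=None)
model_corrected = log_analyte - X @ beta

def group_gap(values, groups):
    """Difference of group means (male minus female)."""
    return values[groups == 1].mean() - values[groups == 0].mean()

gap_ratio = group_gap(ratio_corrected, sex)
gap_model = group_gap(model_corrected, sex)
```

`gap_ratio` retains a sex effect inherited from UCR, while the regression residuals are uncorrelated with log UCR by construction, mirroring why the ratio method flags pairwise differences more often.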

  4. Integrating school-based and therapeutic conflict management models at schools.

    PubMed

    D'Oosterlinck, Franky; Broekaert, Eric

    2003-08-01

    Including children with emotional and behavioral needs in mainstream school systems leads to growing concern about the increasing number of violent and nonviolent conflicts. Schools must adapt to this evolution and adopt a more therapeutic dimension. This paper explores the possibility of integrating school-based and therapeutic conflict management models and compares two management models: a school-based conflict management program, Teaching Students To Be Peacemakers; and a therapeutic conflict management program, Life Space Crisis Intervention. The authors conclude that integration might be possible, but depends on establishing a positive school atmosphere, the central position of the teacher, and collaborative and social learning for pupils. Further implementation of integrated conflict management models can be considered but must be underpinned by appropriate scientific research.

  5. Tempest in a Therapeutic Community: Implementation and Evaluation Issues for Faith-Based Programming

    ERIC Educational Resources Information Center

    Scott, Diane L.; Crow, Matthew S.; Thompson, Carla J.

    2010-01-01

    The therapeutic community (TC) is an increasingly utilized intervention model in corrections settings. Rarely do these TCs include faith-based curriculum other than that included in Alcoholics Anonymous or Narcotics Anonymous programs as does the faith-based TC that serves as the basis for this article. Borrowing from the successful TC model, the…

  6. Model-based sensor-less wavefront aberration correction in optical coherence tomography.

    PubMed

    Verstraete, Hans R G W; Wahls, Sander; Kalkman, Jeroen; Verhaegen, Michel

    2015-12-15

    Several sensor-less wavefront aberration correction methods that correct nonlinear wavefront aberrations by maximizing the optical coherence tomography (OCT) signal are tested on an OCT setup. A conventional coordinate search method is compared to two model-based optimization methods. The first model-based method takes advantage of the well-known optimization algorithm (NEWUOA) and utilizes a quadratic model. The second model-based method (DONE) is new and utilizes a random multidimensional Fourier-basis expansion. The model-based algorithms achieve lower wavefront errors with up to ten times fewer measurements. Furthermore, the newly proposed DONE method outperforms the NEWUOA method significantly. The DONE algorithm is tested on OCT images and shows a significantly improved image quality.

  7. Physics Model-Based Scatter Correction in Multi-Source Interior Computed Tomography.

    PubMed

    Gong, Hao; Li, Bin; Jia, Xun; Cao, Guohua

    2018-02-01

    Multi-source interior computed tomography (CT) has a great potential to provide ultra-fast and organ-oriented imaging at low radiation dose. However, X-ray cross scattering from multiple simultaneously activated X-ray imaging chains compromises imaging quality. Previously, we published two hardware-based scatter correction methods for multi-source interior CT. Here, we propose a software-based scatter correction method, with the benefit of requiring no hardware modifications. The new method is based on a physics model and an iterative framework. The physics model was derived analytically and was used to calculate X-ray scattering signals in both the forward direction and cross directions in multi-source interior CT. The physics model was integrated into an iterative scatter correction framework to reduce scatter artifacts. The method was applied to phantom data from both Monte Carlo simulations and physical experiments designed to emulate image acquisition in a multi-source interior CT architecture recently proposed by our team. The proposed scatter correction method reduced scatter artifacts significantly, even with only one iteration. Within a few iterations, the reconstructed images converged quickly toward the "scatter-free" reference images. After applying the scatter correction method, the maximum CT number error at the regions of interest (ROIs) was reduced to 46 HU in the numerical phantom dataset and 48 HU in the physical phantom dataset, respectively, and the contrast-noise-ratio at those ROIs increased by up to 44.3% and up to 19.7%, respectively. The proposed physics model-based iterative scatter correction method could be useful for scatter correction in dual-source or multi-source CT.
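
The iterative framework described, predicting scatter from the current estimate with a physics model and subtracting it, can be mimicked in one dimension with a toy smoothing-kernel "physics model"; the kernel shape and the 30% scatter fraction are stand-in assumptions, not the paper's analytic model.

```python
import numpy as np

# Toy 1-D "projection": true primary signal plus smooth, low-amplitude scatter.
x = np.linspace(-1, 1, 201)
primary = 1.0 + 0.5 * np.cos(np.pi * x)

def scatter_model(signal):
    # Stand-in physics model: scatter is a broad smoothing of the signal.
    kernel = np.exp(-x ** 2 / 0.5)
    kernel /= kernel.sum()
    return 0.3 * np.convolve(signal, kernel, mode="same")

measured = primary + scatter_model(primary)

# Iterative correction: subtract the scatter predicted from the current estimate.
estimate = measured.copy()
for _ in range(5):
    estimate = measured - scatter_model(estimate)

err0 = np.max(np.abs(measured - primary))   # error before correction
err = np.max(np.abs(estimate - primary))    # error after 5 iterations
```

Because the scatter operator here has norm well below one, the fixed-point iteration contracts rapidly toward the primary signal, consistent with the fast convergence the abstract reports.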

  8. Caliber Corrected Markov Modeling (C2M2): Correcting Equilibrium Markov Models.

    PubMed

    Dixit, Purushottam D; Dill, Ken A

    2018-02-13

    Rate processes are often modeled using Markov State Models (MSMs). Suppose you know a prior MSM and then learn that your prediction of some particular observable rate is wrong. What is the best way to correct the whole MSM? For example, molecular dynamics simulations of protein folding may sample many microstates, possibly giving correct pathways through them while also giving the wrong overall folding rate when compared to experiment. Here, we describe Caliber Corrected Markov Modeling (C2M2), an approach based on the principle of maximum entropy for updating a Markov model by imposing state- and trajectory-based constraints. We show that such corrections are equivalent to asserting position-dependent diffusion coefficients in continuous-time continuous-space Markov processes modeled by a Smoluchowski equation. We derive the functional form of the diffusion coefficient explicitly in terms of the trajectory-based constraints. We illustrate with examples of 2D particle diffusion and an overdamped harmonic oscillator.
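
The core move, reweighting a prior Markov model by maximum entropy so that an observable matches a measured value, can be sketched with an exponential tilt of the transition matrix. This is a simplified caliber-style correction with a single state observable, not the full C2M2 machinery with trajectory constraints.

```python
import numpy as np

def stationary(P):
    """Left eigenvector of P with eigenvalue 1, normalized to a distribution."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

# Prior 3-state MSM and a per-state observable whose average we believe is wrong.
P = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])
w = np.array([0.0, 1.0, 2.0])   # observable value in each state
target = 1.2                    # measured average the model should reproduce

def tilt(P, lam):
    """Max-ent (exponential) reweighting: bias transitions into state j by exp(lam*w_j)."""
    Q = P * np.exp(lam * w)[None, :]
    return Q / Q.sum(axis=1, keepdims=True)

def avg_w(lam):
    return stationary(tilt(P, lam)) @ w

# Solve for the Lagrange multiplier by bisection (avg_w increases with lam here).
lo, hi = -10.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if avg_w(mid) < target:
        lo = mid
    else:
        hi = mid
P_corrected = tilt(P, 0.5 * (lo + hi))
```

At `lam = 0` the prior is untouched; as `|lam|` grows, transitions are biased smoothly, which is the minimal-perturbation character of a maximum-entropy update.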

  9. Model-Based Therapeutic Correction of Hypothalamic-Pituitary-Adrenal Axis Dysfunction

    PubMed Central

    Ben-Zvi, Amos; Vernon, Suzanne D.; Broderick, Gordon

    2009-01-01

    The hypothalamic-pituitary-adrenal (HPA) axis is a major system maintaining body homeostasis by regulating the neuroendocrine and sympathetic nervous systems as well as modulating immune function. Recent work has shown that the complex dynamics of this system accommodate several stable steady states, one of which corresponds to the hypocortisol state observed in patients with chronic fatigue syndrome (CFS). At present these dynamics are not formally considered in the development of treatment strategies. Here we use model-based predictive control (MPC) methodology to estimate robust treatment courses for displacing the HPA axis from an abnormal hypocortisol steady state back to a healthy cortisol level. This approach was applied to a recent model of HPA axis dynamics incorporating glucocorticoid receptor kinetics. A candidate treatment that displays robust properties in the face of significant biological variability and measurement uncertainty requires that cortisol be further suppressed for a short period until adrenocorticotropic hormone levels exceed 30% of baseline. Treatment may then be discontinued, and the HPA axis will naturally progress to a stable attractor defined by normal hormone levels. Suppression of biologically available cortisol may be achieved through the use of binding proteins such as CBG and certain metabolizing enzymes, thus offering possible avenues for deployment in a clinical setting. Treatment strategies can therefore be designed that maximally exploit system dynamics to provide a robust response to treatment and ensure a positive outcome over a wide range of conditions. Perhaps most importantly, a treatment course involving further reduction in cortisol, even transient, is quite counterintuitive and challenges the conventional strategy of supplementing cortisol levels, an approach based on steady-state reasoning. PMID:19165314
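
Model-based predictive control itself, re-solving a finite-horizon tracking problem at every step and applying only the first move, can be shown on a toy scalar linear system. The dynamics below are an arbitrary stand-in, not the HPA axis model with glucocorticoid receptor kinetics used in the paper, and the control-effort penalty is omitted to keep the sketch minimal.

```python
import numpy as np

# Receding-horizon (MPC-style) control of x[k+1] = a*x[k] + b*u[k]:
# drive the state from an abnormally low level to a target setpoint.
a, b = 0.9, 0.5
target, H = 1.0, 5   # setpoint and prediction horizon

def mpc_step(x0):
    # Predicted states are linear in the control sequence: x = base + G @ u.
    base = np.array([a ** (k + 1) * x0 for k in range(H)])
    G = np.array([[a ** (k - j) * b if j <= k else 0.0
                   for j in range(H)] for k in range(H)])
    # Plan a control sequence that tracks the target over the horizon,
    # then apply only the first move -- the receding-horizon principle.
    u, *_ = np.linalg.lstsq(G, target - base, rcond=None)
    return u[0]

x = 0.1   # abnormally low steady state (illustrative stand-in)
trajectory = [x]
for _ in range(10):
    x = a * x + b * mpc_step(x)
    trajectory.append(x)
```

Re-solving the plan at every step is what gives MPC its robustness to disturbances and model error, the property the paper exploits for biological variability.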

  10. CD-SEM real time bias correction using reference metrology based modeling

    NASA Astrophysics Data System (ADS)

    Ukraintsev, V.; Banke, W.; Zagorodnev, G.; Archie, C.; Rana, N.; Pavlovsky, V.; Smirnov, V.; Briginas, I.; Katnani, A.; Vaid, A.

    2018-03-01

    Accuracy of patterning impacts yield, IC performance, and technology time to market. It relies on optical proximity correction (OPC) models built using CD-SEM inputs and on intra-die critical dimension (CD) control based on CD-SEM. Sub-nanometer measurement uncertainty (MU) of CD-SEM is required for current technologies, yet reported design- and process-related bias variation of CD-SEM is in the range of several nanometers. Reference metrology and numerical modeling are used to correct SEM, but both methods are too slow for real-time bias correction. We report on real-time CD-SEM bias correction using empirical models based on reference metrology (RM) data. A significant amount of currently untapped information (sidewall angle, corner rounding, etc.) is obtainable from SEM waveforms. Using additional RM information provided for a specific technology (design rules, materials, processes), CD extraction algorithms can be pre-built and then used in real time for accurate CD extraction from regular CD-SEM images. The art and challenge of SEM modeling lies in finding a robust correlation between SEM waveform features and CD-SEM bias, and in minimizing the RM inputs needed to create a model that is accurate within the design and process space. The new approach was applied to improve CD-SEM accuracy of 45 nm GATE and 32 nm MET1 OPC 1D models. In both cases the MU of a state-of-the-art CD-SEM was improved by 3x and reduced to the nanometer level. A similar approach can be applied to 2D (end of line, contours, etc.) and 3D (sidewall angle, corner rounding, etc.) cases.
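
An empirical RM-based bias model of the kind described amounts to regressing reference-metrology bias on features extracted from the SEM waveform and subtracting the prediction in real time. The sketch below uses ordinary least squares on synthetic data; the three waveform features and all coefficient values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60

# Hypothetical waveform features per measurement site (synthetic stand-ins
# for quantities like edge slope, peak width, peak asymmetry).
features = rng.normal(0, 1, (n, 3))
true_coef = np.array([1.5, -0.8, 0.4])   # nm of bias per unit feature (made up)
bias_rm = features @ true_coef + 0.5 + rng.normal(0, 0.1, n)  # RM-measured bias

# Empirical bias model: ordinary least squares on reference-metrology data.
X = np.column_stack([np.ones(n), features])
coef, *_ = np.linalg.lstsq(X, bias_rm, rcond=None)

def corrected_cd(cd_sem, waveform_features):
    """Subtract the model-predicted bias from the raw CD-SEM reading (nm)."""
    return cd_sem - (coef[0] + waveform_features @ coef[1:])
```

Once fitted offline against RM data, applying the model is a single dot product per site, which is what makes real-time correction feasible.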

  11. Short-Run Prosocial Behavior in Response to Receiving Corrections and Affirmations in Three Therapeutic Communities

    ERIC Educational Resources Information Center

    Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh

    2013-01-01

    Therapeutic communities (TCs) have a strong record of maintaining high quality social climates in prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms, and peer correction of behavior contrary to TC norms, will lead to…

  12. Sandmeier model based topographic correction to lunar spectral profiler (SP) data from KAGUYA satellite.

    PubMed

    Chen, Sheng-Bo; Wang, Jing-Ran; Guo, Peng-Ju; Wang, Ming-Chang

    2014-09-01

    The Moon may be considered the frontier base for deep space exploration, and spectral analysis is one of the key techniques for determining the rock and mineral composition of the lunar surface. Because the lunar topographic relief is more pronounced than that of the Earth, topographic correction of lunar spectral data is necessary before the data are used to retrieve compositions. In the present paper, a lunar Sandmeier model is proposed that accounts for the radiance effects of both macro and ambient topographic relief, and a reflectance correction model is derived from it. The Spectral Profiler (SP) data from the KAGUYA satellite over the Sinus Iridum quadrangle are taken as an example, and digital elevation data from the Lunar Orbiter Laser Altimeter are used to calculate the slope, aspect, incidence and emergence angles, and terrain-viewing factor for the topographic correction. The lunar surface reflectance from the SP data was then corrected by the proposed model after the direct component of irradiance on a horizontal surface was derived. As a result, high spectral reflectance on sun-facing slopes is decreased and low spectral reflectance on slopes facing away from the sun is compensated. The statistical histogram of reflectance-corrected pixel counts presents a Gaussian distribution. The model is therefore robust for correcting the lunar topographic effect and estimating lunar surface reflectance.

  13. Roi-Orientated Sensor Correction Based on Virtual Steady Reimaging Model for Wide Swath High Resolution Optical Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Zhu, Y.; Jin, S.; Tian, Y.; Wang, M.

    2017-09-01

    Both ground and on-board processing systems require high-accuracy, high-speed processing of wide-swath high-resolution optical satellite imagery under emergency situations. This paper proposes an ROI-oriented sensor correction algorithm based on a virtual steady reimaging model for such imagery. Firstly, the imaging time and spatial window of the ROI are determined by a dynamic search method. Then, the dynamic ROI sensor correction model based on the virtual steady reimaging model is constructed. Finally, the corrected image corresponding to the ROI is generated from the coordinate mapping relationship established by the dynamic sensor correction model for the corrected image and the rigorous imaging model for the original image. Two experiments show that registration between panchromatic and multispectral images is well achieved and that image distortion caused by satellite jitter is also corrected efficiently.

  14. Model-based aberration correction in a closed-loop wavefront-sensor-less adaptive optics system.

    PubMed

    Song, H; Fraanje, R; Schitter, G; Kroese, H; Vdovin, G; Verhaegen, M

    2010-11-08

    In many scientific and medical applications, such as laser systems and microscopes, wavefront-sensor-less (WFSless) adaptive optics (AO) systems are used to improve the laser beam quality or the image resolution by correcting the wavefront aberration in the optical path. The lack of direct wavefront measurement in WFSless AO systems imposes a challenge to achieve efficient aberration correction. This paper presents an aberration correction approach for WFSless AO systems based on a model of the WFSless AO system and a small number of intensity measurements, where the model is identified from the input-output data of the WFSless AO system by black-box identification. This approach is validated in an experimental setup with 20 static aberrations having Kolmogorov spatial distributions. By correcting N=9 Zernike modes (N is the number of aberration modes), an intensity improvement from 49% of the maximum value to 89% has been achieved on average based on N+5=14 intensity measurements. With the worst initial intensity, an improvement from 17% of the maximum value to 86% has been achieved based on N+4=13 intensity measurements.
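
The benefit of a model over blind search can be seen in a toy version of model-based WFSless correction: if the log of the intensity metric is (near-)quadratic in the aberration modes, three probe measurements per mode locate the parabola vertex directly. The Gaussian metric and the mode-at-a-time probing below are simplifying assumptions, not the paper's identified black-box model or its N+5-measurement scheme.

```python
import numpy as np

N = 5                               # number of aberration modes to correct
rng = np.random.default_rng(3)
true_aberration = rng.uniform(-0.5, 0.5, N)

def intensity(correction):
    """Toy metric: focal intensity falls off with the squared residual aberration."""
    residual = true_aberration + correction
    return np.exp(-np.sum(residual ** 2))

# Model-based estimate: for each mode, probe the metric at -b, 0, +b and use the
# quadratic model of ln(intensity) to jump straight to the parabola vertex.
b = 0.1
correction = np.zeros(N)
for k in range(N):
    probe = np.zeros(N)
    probe[k] = b
    f_plus = np.log(intensity(correction + probe))
    f_minus = np.log(intensity(correction - probe))
    f_zero = np.log(intensity(correction))
    denom = f_plus - 2 * f_zero + f_minus        # curvature estimate
    correction[k] += -b * (f_plus - f_minus) / (2 * denom)
```

Because the toy log-metric is exactly quadratic and separable, one pass of 3 measurements per mode lands on the optimum; a search method with no model would need many more metric evaluations.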

  15. A three-dimensional model-based partial volume correction strategy for gated cardiac mouse PET imaging

    NASA Astrophysics Data System (ADS)

    Dumouchel, Tyler; Thorn, Stephanie; Kordos, Myra; DaSilva, Jean; Beanlands, Rob S. B.; deKemp, Robert A.

    2012-07-01

    Quantification in cardiac mouse positron emission tomography (PET) imaging is limited by the imaging spatial resolution. Spillover of left ventricle (LV) myocardial activity into adjacent organs results in partial volume (PV) losses, leading to underestimation of myocardial activity. A PV correction method was developed to restore the accuracy of the activity distribution for FDG mouse imaging. The PV correction model was based on convolving an LV image estimate with a 3D point spread function. The LV model was described regionally by a five-parameter profile comprising myocardial, background, and blood activities, separated into three compartments by the endocardial radius and the myocardial wall thickness. The PV correction was tested with digital simulations and a physical 3D mouse LV phantom. In vivo cardiac FDG mouse PET imaging was also performed; following imaging, the mice were sacrificed and the tracer biodistribution in the LV and liver tissue was measured using a gamma counter. The PV correction algorithm improved recovery from 50% to within 5% of the truth for the simulated and measured phantom data, and improved image uniformity by 5-13%. The algorithm improved the mean myocardial LV recovery from 0.56 (0.54) to 1.13 (1.10) without (with) scatter and attenuation corrections, and the mean image uniformity from 26% (26%) to 17% (16%) without (with) scatter and attenuation corrections applied. Scatter and attenuation corrections were not observed to significantly impact PV-corrected myocardial recovery or image uniformity. The image-based PV correction algorithm can increase the accuracy of PET image activity and improve the uniformity of the activity distribution in normal mice. The algorithm may be applied using different tracers, in transgenic models that affect myocardial uptake, or in different species, provided there is sufficient image quality and similar contrast between the myocardium and surrounding structures.
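
The geometry-driven part of such a correction can be illustrated in one radial dimension: a myocardial "wall" profile blurred by the scanner PSF loses peak activity, but with the wall geometry known, the true activity is recovered by fitting the measured profile to the blurred template. The dimensions, PSF width, and noise-free data below are illustrative assumptions, not the paper's five-parameter 3D model.

```python
import numpy as np

# 1-D radial sketch of PV correction: a myocardium "ring" profile blurred by a
# Gaussian PSF; knowing the geometry, recover activity by a linear template fit.
x = np.linspace(0, 20, 401)                      # radial position, mm
r_endo, wall = 6.0, 2.0                          # endocardial radius, wall thickness
template = ((x >= r_endo) & (x <= r_endo + wall)).astype(float)

def gauss_blur(signal, fwhm, dx):
    sigma = fwhm / 2.355 / dx                    # FWHM -> sigma, in samples
    k = np.arange(-50, 51)
    kern = np.exp(-0.5 * (k / sigma) ** 2)
    kern /= kern.sum()
    return np.convolve(signal, kern, mode="same")

true_activity = 3.0
dx = x[1] - x[0]
measured = gauss_blur(true_activity * template, fwhm=1.8, dx=dx)

peak_recovery = measured.max() / true_activity   # < 1: partial-volume loss

# Model-based correction: amplitude A that best matches measured = A * blur(template).
blurred_template = gauss_blur(template, fwhm=1.8, dx=dx)
A = (measured @ blurred_template) / (blurred_template @ blurred_template)
```

The peak value underestimates the true activity because the 1.8 mm PSF is comparable to the 2 mm wall, yet the template fit recovers the amplitude exactly in this noise-free sketch.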

  16. Benchmark model correction of monitoring system based on Dynamic Load Test of Bridge

    NASA Astrophysics Data System (ADS)

    Shi, Jing-xian; Fan, Jiang

    2018-03-01

    Structural health monitoring (SHM) aims at bridge safety and reliability assessment, which must be carried out on the basis of an accurate finite element simulation. A bridge finite element model simplifies the structural section form, support conditions, material properties, and boundary conditions on the basis of the design and construction drawings, and yields the calculation model and its results. However, a finite element model established purely from design and specification requirements cannot fully reflect the true state of the bridge, so the model must be modified to obtain a more accurate one. Taking the Da-guan river crossing of the Ma-Zhao highway in Yunnan province as the background, a dynamic load test was performed, and the impact coefficient of the theoretical bridge model was found to differ considerably from that measured in the test. The calculation model was therefore adjusted according to the measured conditions to reproduce the correct bridge frequencies; the revised impact coefficient shows that the modified finite element model is closer to the real state of the bridge and provides a basis for benchmark model correction.

  17. Atmospheric correction for inland water based on Gordon model

    NASA Astrophysics Data System (ADS)

    Li, Yunmei; Wang, Haijun; Huang, Jiazhu

    2008-04-01

    Remote sensing is widely used in water quality monitoring because it captures radiation information over a large area at the same time. However, more than 80% of the radiance detected by sensors at the top of the atmosphere is contributed by the atmosphere rather than directly by the water body: water radiance information is seriously confounded by atmospheric molecular and aerosol scattering and absorption, and a slight bias in the evaluation of atmospheric influence can induce a large error in water quality estimation. To invert water composition accurately, the water and atmospheric signals must first be separated. In this paper, we study atmospheric correction methods for inland waters such as Taihu Lake. A Landsat-5 TM image was corrected based on the Gordon atmospheric correction model, and two kinds of data were used to calculate Rayleigh scattering, aerosol scattering, and radiative transmission above Taihu Lake, with the influence of ozone and whitecaps also corrected: one was synchronous meteorological data, and the other was a synchronous MODIS image. Finally, remote sensing reflectance was retrieved from the TM image, and the performance of the two methods was analyzed using in situ measured water surface spectra. The results indicate that measured and estimated remote sensing reflectance were close for both methods; the method using synchronous meteorological data is more accurate than the method using the MODIS image, and its bias is close to the error criterion accepted for inland water quality inversion. This shows that the method is suitable for atmospheric correction of TM imagery over Taihu Lake.
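
The bookkeeping of a Gordon-type correction, removing Rayleigh and aerosol path radiance and dividing by diffuse transmittance before forming remote sensing reflectance, is simple to state in code. The radiance values below are arbitrary illustrative numbers (in consistent radiance units), not Taihu Lake measurements.

```python
# Gordon-style decomposition: top-of-atmosphere radiance = Rayleigh path
# radiance + aerosol path radiance + transmitted water-leaving radiance.
def water_leaving(L_toa, L_rayleigh, L_aerosol, t_diffuse):
    """Recover water-leaving radiance from the at-sensor signal."""
    return (L_toa - L_rayleigh - L_aerosol) / t_diffuse

def remote_sensing_reflectance(L_w, E_d):
    """Rrs = water-leaving radiance over downwelling irradiance (1/sr)."""
    return L_w / E_d

# Illustrative numbers only: most of the at-sensor signal is atmospheric path radiance.
L_w = water_leaving(L_toa=80.0, L_rayleigh=50.0, L_aerosol=20.0, t_diffuse=0.9)
Rrs = remote_sensing_reflectance(L_w, E_d=1500.0)
```

The example mirrors the abstract's point: the water-leaving term is a small residual of much larger atmospheric terms, which is why small errors in the Rayleigh or aerosol estimates propagate strongly into Rrs.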

  18. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics.

    PubMed

    Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin

    2016-09-02

    For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window leads to large amounts of dynamic aberrations varying with look angle. In this paper, deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by some low-order Lukosz modes. To optimize the dynamic correction, we can correct only the dominant Lukosz modes, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of a conformal window with a scanning rate of 10 degrees per second. A 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10^-5 with optimized correction and 1.427 × 10^-5 with un-optimized correction. We also demonstrated that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method.

  19. Ionospheric propagation correction modeling for satellite altimeters

    NASA Technical Reports Server (NTRS)

    Nesterczuk, G.

    1981-01-01

    The theoretical basis and available accuracy verifications were reviewed and compared for two ionospheric correction procedures: one based on a global ionospheric model driven by solar flux, and a technique in which the electron content measured for one path (using Faraday rotation measurements) is mapped into corrections for a hemisphere. For these two techniques, RMS errors for correcting satellite altimeter data (at 14 GHz) are estimated to be 12 cm and 3 cm, respectively. On the basis of global accuracy and reliability after implementation, the solar flux model is recommended.
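
The size of these corrections follows from the standard first-order ionospheric group delay, 40.3·TEC/f² meters (TEC in electrons per square meter, f in Hz), which is consistent with the centimeter-to-decimeter errors quoted above; the 50 TECU example value is an assumption, not a figure from the record.

```python
# First-order ionospheric group delay on a one-way path:
#   d = 40.3 * TEC / f**2   (d in meters, TEC in electrons/m^2, f in Hz)
def iono_delay_m(tec_el_per_m2, freq_hz):
    return 40.3 * tec_el_per_m2 / freq_hz ** 2

# Example: a moderate vertical TEC of 50 TECU (1 TECU = 1e16 el/m^2)
# at the 14 GHz altimeter frequency discussed above -- roughly a decimeter.
delay = iono_delay_m(50 * 1e16, 14e9)
```

The inverse-square frequency dependence also shows why lower-frequency instruments need much larger corrections than a 14 GHz altimeter.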

  20. Systematic Outcomes Research for Corrections-Based Treatment: Implications from the Criminal Justice Kentucky Treatment Outcome Study

    ERIC Educational Resources Information Center

    Staton-Tindall, Michele; McNees, Erin; Leukefeld, Carl G.; Walker, Robert; Thompson, LaDonna; Pangburn, Kevin; Oser, Carrie B.

    2009-01-01

    Over the last four years, the Kentucky correctional system has expanded corrections-based modified therapeutic community treatment from 6 programs to 24 programs. To examine the effectiveness of these programs, the state initiated a systematic treatment outcome study known as the Criminal Justice Kentucky Treatment Outcome Study (CJKTOS). The…

  21. Topographic correction realization based on the CBERS-02B image

    NASA Astrophysics Data System (ADS)

    Qin, Hui-ping; Yi, Wei-ning; Fang, Yong-hua

    2011-08-01

    The special topography of mountainous terrain distorts retrievals, so that identical surface types show different spectral signatures. To improve the accuracy of topographic surface characterization, many researchers have focused on topographic correction. Topographic correction methods can be statistical-empirical or physical models, among which methods based on digital elevation model (DEM) data are most popular. Restricted by spatial resolution, previous models mostly corrected topographic effects on Landsat TM imagery, whose 30-meter spatial resolution is easily matched by DEM data obtained from the internet or calculated from digital maps. Some researchers have also performed topographic correction on high-spatial-resolution images such as QuickBird and IKONOS, but there is little research on topographic correction of CBERS-02B imagery. In this study, mountainous terrain in Liaoning was taken as the test area, and the digital elevation model data were interpolated from the original 15-meter resolution to the 2.36-meter image resolution. The C correction, SCS+C correction, Minnaert correction, and Ekstrand-r correction were applied to correct the topographic effect, and the corrected results were compared. Scatter diagrams between image digital number and the cosine of the solar incidence angle with respect to the surface normal were produced, and the mean value, standard variance, slope of the scatter diagram, and separation factor were statistically calculated. The analysis shows that shadows are weaker in the corrected images than in the original images, the three-dimensional effect is removed, and the absolute slope of the fitted lines in the scatter diagrams is reduced. The Minnaert correction method gives the most effective result. These findings demonstrate that the established correction methods can be successfully adapted to CBERS-02B images. The DEM data can be
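
Two of the methods named above have compact closed forms: the C correction scales radiance by (cos θz + c)/(cos i + c), with c estimated from the band-wise regression of radiance on cos i, and the Minnaert correction uses (cos θz / cos i)^k. The sketch below implements both for a single band; the angle conventions (radians) and the synthetic values are assumptions for illustration.

```python
import numpy as np

def cos_i(slope, aspect, sza, saa):
    """Cosine of the solar incidence angle on a tilted surface (angles in radians:
    terrain slope and aspect, solar zenith and azimuth)."""
    return (np.cos(sza) * np.cos(slope)
            + np.sin(sza) * np.sin(slope) * np.cos(saa - aspect))

def c_correction(L, cos_inc, sza):
    """C correction: c = b/m from the per-band regression L = m*cos_i + b."""
    m, b = np.polyfit(cos_inc, L, 1)
    c = b / m
    return L * (np.cos(sza) + c) / (cos_inc + c)

def minnaert(L, cos_inc, sza, k):
    """Minnaert correction with empirically fitted constant k."""
    return L * (np.cos(sza) / cos_inc) ** k
```

On perfectly linear synthetic data the C correction flattens the radiance completely, which is the behavior the scatter-diagram slopes above are used to diagnose.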

  22. An Accurate Temperature Correction Model for Thermocouple Hygrometers

    PubMed Central

    Savage, Michael J.; Cass, Alfred; de Jager, James M.

    1982-01-01

    Numerous water relation studies have used thermocouple hygrometers routinely. However, the accurate temperature correction of hygrometer calibration curve slopes seems to have been largely neglected in both psychrometric and dewpoint techniques. In the case of thermocouple psychrometers, two temperature correction models are proposed, each based on measurement of the thermojunction radius and calculation of the theoretical voltage sensitivity to changes in water potential. The first model relies on calibration at a single temperature and the second at two temperatures. Both these models were more accurate than the temperature correction models currently in use for four psychrometers calibrated over a range of temperatures (15-38°C). The model based on calibration at two temperatures is superior to that based on only one calibration. The model proposed for dewpoint hygrometers is similar to that for psychrometers. It is based on the theoretical voltage sensitivity to changes in water potential. Comparison with empirical data from three dewpoint hygrometers calibrated at four different temperatures indicates that these instruments need only be calibrated at, e.g. 25°C, if the calibration slopes are corrected for temperature. PMID:16662241
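
The two-temperature calibration model reduces, in its simplest reading, to interpolating the calibration slope linearly between the two calibration temperatures and dividing the measured thermocouple voltage by the interpolated slope. The slope values and units below are illustrative assumptions, not the paper's measured sensitivities.

```python
# Two-temperature calibration of a thermocouple psychrometer (illustrative):
# the calibration slope (uV per MPa) is measured at T1 and T2 and interpolated
# linearly to the working temperature before converting voltage to potential.
def slope_at(T, T1, s1, T2, s2):
    """Calibration slope at temperature T, linear between (T1, s1) and (T2, s2)."""
    return s1 + (s2 - s1) * (T - T1) / (T2 - T1)

def water_potential(voltage_uv, T, T1=15.0, s1=4.2, T2=35.0, s2=5.8):
    """Water potential (MPa) from thermocouple output, with temperature-corrected slope."""
    return voltage_uv / slope_at(T, T1, s1, T2, s2)
```

This matches the paper's practical conclusion: a single calibration (e.g. at 25 °C) suffices only if the slope is subsequently corrected for the working temperature.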

  23. A model-based scatter artifacts correction for cone beam CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Wei; Zhu, Jun; Wang, Luyao

    2016-04-15

    Purpose: Due to the increased axial coverage of multislice computed tomography (CT) and the introduction of flat detectors, the size of x-ray illumination fields has grown dramatically, causing an increase in scatter radiation. For CT imaging, scatter is a significant issue that introduces shading artifacts, streaks, and reduced contrast and Hounsfield unit (HU) accuracy. The purpose of this work is to provide a fast and accurate scatter artifact correction algorithm for cone beam CT (CBCT) imaging. Methods: The method starts with an estimation of coarse scatter profiles for a set of CBCT data in either the image domain or the projection domain. A denoising algorithm designed specifically for Poisson signals is then applied to derive the final scatter distribution. Qualitative and quantitative evaluations were performed using thorax and abdomen phantoms with Monte Carlo (MC) simulations, experimental Catphan phantom data, and in vivo human data acquired for clinical image-guided radiation therapy. Scatter correction in both the projection domain and the image domain was conducted, and the influences of segmentation method, mismatched attenuation coefficients, and spectrum model, as well as parameter selection, were also investigated. Results: Results show that the proposed algorithm can significantly reduce scatter artifacts and recover the correct HU in either the projection domain or the image domain. For the MC thorax phantom study, four-component segmentation yields the best results, while the results of three-component segmentation are still acceptable. The parameters (iteration number K and weight β) affect the accuracy of the scatter correction, and the results improve as K and β increase. Variations in attenuation coefficient accuracy were found to only slightly impact the performance of the proposed processing. For the Catphan phantom data, the mean value over all pixels in the residual image is reduced from −21.8 to −0.2 HU and 0.7 HU for

  4. Dynamic Aberration Correction for Conformal Window of High-Speed Aircraft Using Optimized Model-Based Wavefront Sensorless Adaptive Optics

    PubMed Central

    Dong, Bing; Li, Yan; Han, Xin-li; Hu, Bin

    2016-01-01

    For high-speed aircraft, a conformal window is used to optimize aerodynamic performance. However, the local shape of the conformal window introduces large dynamic aberrations that vary with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is represented using Lukosz modes, and the low spatial frequency content of the image spectral density is used as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To speed up the dynamic correction, only the dominant Lukosz modes need be corrected, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of a conformal window at a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10⁻⁵ with optimized correction and 1.427 × 10⁻⁵ with un-optimized correction. We also demonstrate that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method. PMID:27598161

  5. Treatment model in children with speech disorders and its therapeutic efficiency.

    PubMed

    Barberena, Luciana; Keske-Soares, Márcia; Cervi, Taís; Brandão, Mariane

    2014-07-01

    Introduction: Speech articulation disorders affect the intelligibility of speech, and studies on therapeutic models show the effectiveness of treatment. Objective: To analyze the progress achieved by treatment with the ABAB-Withdrawal and Multiple Probes Model in children with different degrees of phonological disorder. Methods: The diagnosis of speech articulation disorder was established by speech and hearing evaluation and complementary tests. The subjects of this research were eight children, with a mean age of 5 years 5 months. The children were distributed into four groups according to the degree of phonological disorder, based on the percentage of correct consonants: severe, moderate to severe, mild to moderate, and mild. The phonological treatment applied was the ABAB-Withdrawal and Multiple Probes Model. Therapeutic progress through generalization was assessed by comparing the contrastive and distinctive-feature analyses at evaluation and reevaluation. Results: The following types of generalization were found: to items not used in treatment (other words), to another position in the word, within a sound class, to other sound classes, and to another syllable structure. Conclusion: The types of generalization observed show the expansion of production and proper use of therapy-trained targets in untrained contexts or environments. The analysis of generalization therefore proved to be an important criterion for measuring therapeutic efficacy.

  6. Correction tool for Active Shape Model based lumbar muscle segmentation.

    PubMed

    Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio

    2015-08-01

    In the clinical environment, the accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomic image segmentation, time-effective correction tools are commonly needed to improve segmentation results; such tools must provide fast correction with few interactions and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free-form deformation adapted to a 2D environment, which enables an intuitive and natural correction of 3D segmentation results. The method has been implemented in a software tool and evaluated on the task of lumbar muscle segmentation from magnetic resonance images. Experimental results show that full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result at an average Dice coefficient of 0.92±0.03.
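
The Dice coefficient reported above measures volume overlap between two segmentations. A minimal sketch of the metric itself, with masks represented as sets of voxel coordinates (an illustrative choice, not the paper's implementation):

```python
# Dice similarity coefficient between two binary segmentation masks,
# each given as a set of voxel coordinates.
def dice(mask_a, mask_b):
    """Dice = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    if not mask_a and not mask_b:
        return 1.0  # two empty masks agree trivially
    return 2.0 * len(mask_a & mask_b) / (len(mask_a) + len(mask_b))

auto = {(1, 1), (1, 2), (2, 1), (2, 2)}       # initial segmentation
corrected = {(1, 1), (1, 2), (2, 1), (3, 1)}  # after manual correction
print(dice(auto, corrected))  # → 0.75
```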

  7. Evidence-Based Nursing of the 3C Therapeutic Regimen for Type 1 Diabetes.

    PubMed

    Wu, Jianya; Zou, Ling

    2015-05-01

    The aim of this study was to explore the efficacy of the 3C therapeutic regimen for type 1 diabetes. Thirty-nine patients with type 1 diabetes, hospitalized from January 2013 to April 2014, were included to receive the 3C therapeutic regimen. Evidence-based nursing was performed during the treatment period and efficacy was assessed 6 days after therapy. Six days after administration of the 3C therapeutic regimen, fasting glucose levels in all 39 patients were controlled to 4.4-6.0 mmol/L and 2-h postprandial glucose levels to 4.4-7.8 mmol/L. Three patients had a glucose level <3.9 mmol/L, which was corrected by adjusting the dose of insulin infusion. No nursing-associated complication occurred, and all patients were satisfied with the nursing service. The efficacy of the 3C therapeutic regimen for type 1 diabetes is satisfactory, and evidence-based nursing can help to ensure efficacy and improve the quality of nursing service.

  8. A Semi-Empirical Topographic Correction Model for Multi-Source Satellite Images

    NASA Astrophysics Data System (ADS)

    Xiao, Sa; Tian, Xinpeng; Liu, Qiang; Wen, Jianguang; Ma, Yushuang; Song, Zhenwei

    2018-04-01

    Topographic correction of surface reflectance in rugged terrain is a prerequisite for quantitative remote sensing applications in mountainous areas. A physics-based radiative transfer model can be applied to correct the topographic effect and accurately retrieve slope-surface reflectance from high-quality satellite images such as Landsat 8 OLI. However, as more image data become available from a variety of sensors, the accurate sensor calibration parameters and atmospheric conditions required by physics-based topographic correction models are sometimes unavailable. This paper proposes a semi-empirical atmospheric and topographic correction model for multi-source satellite images that does not require accurate calibration parameters. Based on this model, topographically corrected surface reflectance can be derived directly from DN data; the model was tested and verified with image data from the Chinese HJ and GF satellites. The results show that, after correction, the correlation factor was reduced by almost 85% for the near-infrared bands and the overall classification accuracy increased by 14% for HJ. The reflectance difference between slopes facing toward and away from the sun was also reduced after correction.

  9. Comparing Explicit Exemplar-Based and Rule-Based Corrective Feedback: Introducing Analogy-Based Corrective Feedback

    ERIC Educational Resources Information Center

    Thomas, Kavita E.

    2018-01-01

    This study introduces an approach to providing corrective feedback to L2 learners termed analogy-based corrective feedback that is motivated by analogical learning theories and syntactic alignment in dialogue. Learners are presented with a structurally similar synonymous version of their output where the erroneous form is corrected, and they must…

  10. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    PubMed

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  11. Exemplar-based human action pose correction.

    PubMed

    Shen, Wei; Deng, Ke; Bai, Xiang; Leyvand, Tommer; Guo, Baining; Tu, Zhuowen

    2014-07-01

    The launch of Xbox Kinect has built a very successful computer vision product and made a big impact on the gaming industry. This sheds light on a wide variety of potential applications related to action recognition, in which accurate estimation of human poses from the depth image is universally a critical step. However, existing pose estimation systems exhibit failures when facing severe occlusion. In this paper, we propose an exemplar-based method to learn to correct the initially estimated poses. We learn an inhomogeneous systematic bias by leveraging the exemplar information within a specific human action domain. Furthermore, as an extension, we learn a conditional model by incorporating pose tags to further increase the accuracy of pose correction. In the experiments, significant improvements on both joint-based skeleton correction and tag prediction are observed over contemporary approaches, including what is delivered by the current Kinect system. Our experiments on facial landmark correction also illustrate that our algorithm can improve the accuracy of other detection/estimation systems.

  12. A sun-crown-sensor model and adapted C-correction logic for topographic correction of high resolution forest imagery

    NASA Astrophysics Data System (ADS)

    Fan, Yuanchao; Koukal, Tatjana; Weisberg, Peter J.

    2014-10-01

    Canopy shadowing mediated by topography is an important source of radiometric distortion in remote sensing images of rugged terrain. Topographic correction based on the sun-canopy-sensor (SCS) model is a significant improvement over correction based on the sun-terrain-sensor (STS) model for surfaces with high forest canopy cover, because the SCS model considers and preserves the geotropic nature of trees. The SCS model accounts for sub-pixel canopy shadowing effects and normalizes the sunlit canopy area within a pixel. However, it does not account for mutual shadowing between neighboring pixels, which is especially apparent in fine-resolution satellite images in which individual tree crowns are resolved. This paper proposes a new topographic correction model, the sun-crown-sensor (SCnS) model, based on high-resolution satellite imagery (IKONOS) and a high-precision LiDAR digital elevation model. An improvement on the C-correction logic with a radiance partitioning method to address the effects of diffuse irradiance is also introduced (SCnS + C). In addition, we incorporate a weighting variable, based on pixel shadow fraction, on the direct and diffuse radiance portions to enhance the retrieval of at-sensor radiance and reflectance of highly shadowed tree pixels, forming another variety of the SCnS model (SCnS + W). Model evaluation with IKONOS test data showed that the new SCnS model outperformed the STS and SCS models in quantifying the correlation between the terrain-regulated illumination factor and at-sensor radiance. Our adapted C-correction logic based on the sun-crown-sensor geometry and radiance partitioning better represented the general additive effects of diffuse radiation than C parameters derived from the STS or SCS models. The weighting factor Wt also significantly enhanced correction results by reducing within-class standard deviation and balancing the mean pixel radiance between sunlit and shaded slopes. We analyzed these improvements with model
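
For context, the classical C-correction that this record adapts scales observed reflectance by (cos θ_z + c) / (cos i + c), where θ_z is the solar zenith angle, i is the local sun-incidence angle, and c is an empirical parameter obtained by regressing radiance against cos i. A minimal sketch of that baseline formula (function and variable names are illustrative, not from the paper):

```python
import math

def c_correction(rho_t, cos_i, sun_zenith_deg, c):
    """Classical C-correction: rho_h = rho_t * (cos(theta_z) + c) / (cos_i + c).

    rho_t          -- observed reflectance on the slope
    cos_i          -- cosine of the local sun-incidence angle (illumination factor)
    sun_zenith_deg -- solar zenith angle in degrees
    c              -- empirical parameter from the radiance-vs-cos_i regression
    """
    cos_sz = math.cos(math.radians(sun_zenith_deg))
    return rho_t * (cos_sz + c) / (cos_i + c)

# A fully illuminated horizontal surface (cos_i == cos(theta_z)) is unchanged:
print(c_correction(0.20, math.cos(math.radians(30.0)), 30.0, 0.5))  # → 0.2
```

Shadowed slopes (cos_i below cos θ_z) are brightened and sun-facing slopes are darkened, which is the behavior the abstract describes as balancing radiance between sunlit and shaded slopes.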

  13. The robust corrective action priority-an improved approach for selecting competing corrective actions in FMEA based on principle of robust design

    NASA Astrophysics Data System (ADS)

    Sutrisno, Agung; Gunawan, Indra; Vanany, Iwan

    2017-11-01

    Despite being an integral part of risk-based quality improvement efforts, studies on improving the selection of corrective action priorities with the FMEA technique remain limited in the literature, and those that exist do not consider robustness and risk when selecting among competing improvement initiatives. This study proposes a theoretical model for selecting among competing risk-based corrective actions that considers both their robustness and their risk. We incorporate the principle of robust design in computing the preference score among corrective action candidates; alongside the cost and benefit of competing corrective actions, we also incorporate their risk and robustness. An example is provided to demonstrate the applicability of the proposed model.

  14. Improved calibration-based non-uniformity correction method for uncooled infrared camera

    NASA Astrophysics Data System (ADS)

    Liu, Chengwei; Sui, Xiubao

    2017-08-01

    With the latest improvements in microbolometer focal plane arrays (FPAs), uncooled infrared (IR) cameras are becoming the most widely used devices in thermography, especially in handheld devices. However, the influence of changing ambient conditions and the non-uniform response of the sensors makes it difficult to correct the non-uniformity of an uncooled infrared camera. In this paper, based on the infrared radiation characteristics of a TEC-less uncooled infrared camera, a novel model is proposed for calibration-based non-uniformity correction (NUC). In this model, we introduce the FPA temperature, together with the responses of the microbolometers under different ambient temperatures, to calculate the correction parameters. Based on the proposed model, the correction parameters can be worked out from calibration measurements under controlled ambient conditions with a uniform blackbody. All correction parameters are determined after the calibration process and then used to correct the non-uniformity of the infrared camera in real time. This paper presents the details of the compensation procedure and the performance of the proposed calibration-based non-uniformity correction method. The method was evaluated on realistic IR images obtained by a 384×288-pixel uncooled long wave infrared (LWIR) camera operated under changing ambient conditions. The results show that our method can exclude the influence of changing ambient conditions and ensure that the infrared camera has a stable performance.
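
For reference, the simplest calibration-based NUC is the two-point (gain/offset) correction, which this record extends with an FPA-temperature-dependent model. A minimal per-pixel sketch of the simpler two-point baseline (the paper's temperature-dependent parameters are not reproduced here):

```python
# Two-point (gain/offset) non-uniformity correction. Pixels are flattened to
# 1D lists for brevity. 'low' and 'high' are per-pixel responses to a uniform
# blackbody at a low and a high calibration temperature; each pixel is mapped
# so that both calibration points land on the array-mean response.
def two_point_nuc(raw, low, high):
    m_low = sum(low) / len(low)      # target response at the low point
    m_high = sum(high) / len(high)   # target response at the high point
    corrected = []
    for r, l, h in zip(raw, low, high):
        gain = (m_high - m_low) / (h - l)  # per-pixel gain
        offset = m_low - gain * l          # per-pixel offset
        corrected.append(gain * r + offset)
    return corrected

# Two pixels with different gains viewing the same uniform scene:
print(two_point_nuc([10.0, 20.0], [10.0, 20.0], [110.0, 220.0]))  # → [15.0, 15.0]
```

After correction, both pixels report the same value for a uniform scene, which is exactly what the calibration step is meant to guarantee at the two blackbody temperatures.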

  15. Enabling full-field physics-based optical proximity correction via dynamic model generation

    NASA Astrophysics Data System (ADS)

    Lam, Michael; Clifford, Chris; Raghunathan, Ananthan; Fenger, Germain; Adam, Kostas

    2017-07-01

    As extreme ultraviolet lithography comes closer to reality for high volume production, its peculiar modeling challenges related to both inter- and intrafield effects have necessitated building an optical proximity correction (OPC) infrastructure that operates with field position dependency. Previous state-of-the-art approaches to modeling field dependency used piecewise constant models, where static input models are assigned to specific x/y-positions within the field, and OPC and simulation assign the proper static model based on simulation-level placement. However, in the realm of 7 and 5 nm feature sizes, small discontinuities in OPC from piecewise constant model changes can cause unacceptable levels of edge placement error. The introduction of dynamic model generation (DMG) can be shown to effectively avoid these dislocations by providing unique mask and optical models per simulation region, allowing a near continuum of models through the field. DMG allows unique models for electromagnetic field, apodization, aberrations, etc. to vary through the entire field and provides a capability to precisely and accurately model systematic field signatures.

  16. Dilatation-dissipation corrections for advanced turbulence models

    NASA Technical Reports Server (NTRS)

    Wilcox, David C.

    1992-01-01

    This paper analyzes dilatation-dissipation based compressibility corrections for advanced turbulence models. Numerical computations verify that the dilatation-dissipation corrections devised by Sarkar and Zeman greatly improve the effect of Mach number on spreading rate predicted by both the k-omega and k-epsilon models. However, computations with the k-omega model also show that the Sarkar/Zeman terms cause an undesired reduction in skin friction for the compressible flat-plate boundary layer. A perturbation solution for the compressible wall layer shows that the Sarkar and Zeman terms reduce the effective von Karman constant in the law of the wall; this is the source of the inaccurate k-omega model skin-friction predictions for the flat-plate boundary layer. The perturbation solution also shows that the k-epsilon model has an inherent flaw for compressible boundary layers that is not compensated for by the dilatation-dissipation corrections. A compressibility modification for k-omega and k-epsilon models is proposed that is similar to those of Sarkar and Zeman. The new compressibility term permits accurate predictions for the compressible mixing layer, flat-plate boundary layer, and a shock-separated flow with the same values for all closure coefficients.

  17. Adeno-Associated Virus-Mediated Correction of a Canine Model of Glycogen Storage Disease Type Ia

    PubMed Central

    Weinstein, David A.; Correia, Catherine E.; Conlon, Thomas; Specht, Andrew; Verstegen, John; Onclin-Verstegen, Karine; Campbell-Thompson, Martha; Dhaliwal, Gurmeet; Mirian, Layla; Cossette, Holly; Falk, Darin J.; Germain, Sean; Clement, Nathalie; Porvasnik, Stacy; Fiske, Laurie; Struck, Maggie; Ramirez, Harvey E.; Jordan, Juan; Andrutis, Karl; Chou, Janice Y.; Byrne, Barry J.

    2010-01-01

    Glycogen storage disease type Ia (GSDIa; von Gierke disease; MIM 232200) is caused by a deficiency in glucose-6-phosphatase-α. Patients with GSDIa are unable to maintain glucose homeostasis and suffer from severe hypoglycemia, hepatomegaly, hyperlipidemia, hyperuricemia, and lactic acidosis. The canine model of GSDIa is naturally occurring and recapitulates almost all aspects of the human form of disease. We investigated the potential of recombinant adeno-associated virus (rAAV) vector-based therapy to treat the canine model of GSDIa. After delivery of a therapeutic rAAV2/8 vector to a 1-day-old GSDIa dog, improvement was noted as early as 2 weeks posttreatment. Correction was transient, however, and by 2 months posttreatment the rAAV2/8-treated dog could no longer sustain normal blood glucose levels after 1 hr of fasting. The same animal was then dosed with a therapeutic rAAV2/1 vector delivered via the portal vein. Two months after rAAV2/1 dosing, both blood glucose and lactate levels were normal at 4 hr postfasting. With more prolonged fasting, the dog still maintained near-normal glucose concentrations, but lactate levels were elevated by 9 hr, indicating that partial correction was achieved. Dietary glucose supplementation was discontinued starting 1 month after rAAV2/1 delivery and the dog continues to thrive with minimal laboratory abnormalities at 23 months of age (18 months after rAAV2/1 treatment). These results demonstrate that delivery of rAAV vectors can mediate significant correction of the GSDIa phenotype and that gene transfer may be a promising alternative therapy for this disease and other genetic diseases of the liver. PMID:20163245

  18. Preclinical modeling highlights the therapeutic potential of hematopoietic stem cell gene editing for correction of SCID-X1.

    PubMed

    Schiroli, Giulia; Ferrari, Samuele; Conway, Anthony; Jacob, Aurelien; Capo, Valentina; Albano, Luisa; Plati, Tiziana; Castiello, Maria C; Sanvito, Francesca; Gennery, Andrew R; Bovolenta, Chiara; Palchaudhuri, Rahul; Scadden, David T; Holmes, Michael C; Villa, Anna; Sitia, Giovanni; Lombardo, Angelo; Genovese, Pietro; Naldini, Luigi

    2017-10-11

    Targeted genome editing in hematopoietic stem/progenitor cells (HSPCs) is an attractive strategy for treating immunohematological diseases. However, the limited efficiency of homology-directed editing in primitive HSPCs constrains the yield of corrected cells and might affect the feasibility and safety of clinical translation. These concerns need to be addressed in stringent preclinical models and overcome by developing more efficient editing methods. We generated a humanized X-linked severe combined immunodeficiency (SCID-X1) mouse model and evaluated the efficacy and safety of hematopoietic reconstitution from limited input of functional HSPCs, establishing thresholds for full correction upon different types of conditioning. Unexpectedly, conditioning before HSPC infusion was required to protect the mice from lymphoma developing when transplanting small numbers of progenitors. We then designed a one-size-fits-all IL2RG (interleukin-2 receptor common γ-chain) gene correction strategy and, using the same reagents suitable for correction of human HSPC, validated the edited human gene in the disease model in vivo, providing evidence of targeted gene editing in mouse HSPCs and demonstrating the functionality of the IL2RG -edited lymphoid progeny. Finally, we optimized editing reagents and protocol for human HSPCs and attained the threshold of IL2RG editing in long-term repopulating cells predicted to safely rescue the disease, using clinically relevant HSPC sources and highly specific zinc finger nucleases or CRISPR (clustered regularly interspaced short palindromic repeats)/Cas9 (CRISPR-associated protein 9). Overall, our work establishes the rationale and guiding principles for clinical translation of SCID-X1 gene editing and provides a framework for developing gene correction for other diseases. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  19. Bias correction of satellite-based rainfall data

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Biswa; Solomatine, Dimitri

    2015-04-01

    Limited hydro-meteorological data availability in many catchments restricts the possibility of reliable hydrological analyses, especially for near-real-time predictions. However, the variety of satellite-based and meteorological model products for rainfall provides new opportunities. Often the accuracy of these rainfall products, when compared to rain gauge measurements, is not impressive. The systematic differences of these rainfall products from gauge observations can be partially compensated by a bias (error) correction. Many such methods correct the satellite-based rainfall data by comparing its mean value to the mean value of rain gauge data. Refined approaches may first identify a suitable time scale at which the different data products are better comparable and then apply a bias correction at that time scale. More elegant methods use quantile-to-quantile bias correction, which, however, assumes that the available (often limited) sample size is adequate for comparing the probability distributions of the different rainfall products. Analysis of rainfall data and understanding of the process of its generation reveal that the bias in different rainfall data varies in space and time; the time aspect is sometimes taken into account by considering seasonality. In this research we adopted a bias correction approach that takes into account the variation of rainfall in space and time. A clustering-based approach is employed in which every new data point (e.g., from the Tropical Rainfall Measuring Mission (TRMM)) is first assigned to a specific cluster of that data product; then, by identifying the corresponding cluster of gauge data, the bias correction specific to that cluster is applied. The presented approach considers the space-time variation of rainfall, and as a result the corrected data are more realistic. Keywords: bias correction, rainfall, TRMM, satellite rainfall
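
The quantile-to-quantile correction mentioned above maps each satellite value to the gauge value at the same empirical quantile. A minimal sketch over sorted samples (the abstract's cluster-specific refinement would apply this per cluster rather than globally; names are illustrative):

```python
import bisect

# Empirical quantile-to-quantile bias correction: map a satellite rainfall
# value to the gauge value at the same empirical quantile.
def quantile_map(value, sat_sample, gauge_sample):
    sat = sorted(sat_sample)
    gauge = sorted(gauge_sample)
    # empirical quantile of the value within the satellite sample
    q = bisect.bisect_left(sat, value) / max(len(sat) - 1, 1)
    # read the gauge sample at the same quantile (nearest-rank lookup)
    idx = min(round(q * (len(gauge) - 1)), len(gauge) - 1)
    return gauge[idx]

# Satellite underestimates by a factor of two relative to the gauge record:
sat = [0.0, 1.0, 2.0, 3.0, 4.0]
gauge = [0.0, 2.0, 4.0, 6.0, 8.0]
print(quantile_map(2.0, sat, gauge))  # → 4.0
```

A production implementation would interpolate between ranks and handle values outside the calibration range, but the rank-matching idea is the same.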

  20. Health care provider communication: an empirical model of therapeutic effectiveness.

    PubMed

    Chochinov, Harvey M; McClement, Susan E; Hack, Thomas F; McKeen, Nancy A; Rach, Amanda M; Gagnon, Pierre; Sinclair, Shane; Taylor-Brown, Jill

    2013-05-01

    Patients who are facing life-threatening and life-limiting cancer almost invariably experience psychological distress. Responding effectively requires therapeutic sensitivity and skill. In this study, we examined therapeutic effectiveness within the setting of cancer-related distress with the objective of understanding its constituent parts. Seventy-eight experienced psychosocial oncology clinicians from 24 health care centers across Canada were invited to participate in 3 focus groups each. In total, 29 focus groups were held over 2 years, during which clinicians articulated the therapeutic factors deemed most helpful in mitigating patient psychosocial distress. The content of each focus group was summarized into major themes and was reviewed with participants to confirm their accuracy. Upon completion of the focus groups, workshops were held in various centers, eliciting participant feedback on an empirical model of therapeutic effectiveness based on the qualitative analysis of focus group data. Three primary, interrelated therapeutic domains emerged from the data, forming a model of optimal therapeutic effectiveness: 1) personal growth and self-care (domain A), 2) therapeutic approaches (domain B), and 3) creation of a safe space (domain C). Areas of domain overlap were identified and labeled accordingly: domain AB, therapeutic humility; domain BC, therapeutic pacing; and domain AC, therapeutic presence. This empirical model provides detailed insights regarding the elements and pedagogy of effective communication and psychosocial care for patients who are experiencing cancer-related distress. Copyright © 2012 American Cancer Society.

  1. [Atmospheric correction of visible-infrared band FY-3A/MERSI data based on 6S model].

    PubMed

    Wu, Yong-Li; Luan, Qing; Tian, Guo-Zhen

    2011-06-01

    Based on observation data from the meteorological stations in Taiyuan City and its surrounding areas in Shanxi Province, the atmospheric parameters for the 6S model were supplied, and atmospheric correction of visible-infrared band (250-m resolution) FY-3A/MERSI data was conducted. After atmospheric correction, the dynamic range of the visible-infrared band FY-3A/MERSI data widened, reflectivity increased, the histogram peak was higher, and the distribution histogram was smoother. Meanwhile, the threshold value of the NDVI data reflecting vegetation condition increased, and its peak was higher, closer to the real data. Moreover, the color-composite image of the corrected data showed more abundant information, with increased brightness and enhanced contrast, and the information reflected was closer to reality.

  2. Carbon nanotubes (CNTs) based advanced dermal therapeutics: current trends and future potential.

    PubMed

    Kuche, Kaushik; Maheshwari, Rahul; Tambe, Vishakha; Mak, Kit-Kay; Jogi, Hardi; Raval, Nidhi; Pichika, Mallikarjuna Rao; Kumar Tekade, Rakesh

    2018-05-17

    The search for effective and non-invasive delivery modules to transport therapeutic molecules across the skin has led to the discovery of a number of nanocarriers (viz. liposomes, ethosomes, dendrimers, etc.) in the last few decades. However, the available literature suggests that these delivery modules face several issues, including poor stability, low encapsulation efficiency, and scale-up hurdles. Recently, carbon nanotubes (CNTs) emerged as a versatile tool to deliver therapeutics across the skin. Superior stability, high loading capacity, well-developed synthesis protocols, and ease of scale-up are some of the reasons for the growing interest in CNTs. CNTs have a unique physical architecture and a large surface area with a unique surface chemistry that can be tailored for varied biomedical applications. CNTs have thus been largely engaged in the development of transdermal systems such as tuneable hydrogels, programmable nonporous membranes, electroresponsive skin modalities, protein channel mimetic platforms, reverse iontophoresis, microneedles, and dermal buckypapers. In addition, CNTs have been employed in the development of RNA interference (RNAi) based therapeutics for correcting defective dermal genes. This review expounds the state-of-the-art synthesis methodologies, skin penetration mechanisms, drug liberation profiles, loading potential, characterization techniques, and transdermal applications, along with a summary of the patent/regulatory status and future scope of CNT-based skin therapeutics.

  3. Predictors of therapeutic engagement in prison-based drug treatment.

    PubMed

    Welsh, Wayne N; McGrain, Patrick N

    2008-08-01

    Few studies to date have examined predictors of therapeutic engagement (TE) or other indicators of responsiveness to prison drug treatment. Subjects were 347 inmates participating in a 12-month modified therapeutic community (TC) drug treatment program at a specialized treatment prison for convicted, drug-involved offenders. Data were obtained through correctional databases and the administration of the TCU Drug Screen II, the Resident Evaluation of Self and Treatment (REST), and the Counselor Rating of Client (CRC) form. Three main hypotheses were supported: (1) baseline motivation predicted therapeutic engagement net of other inmate characteristics; (2) critical dimensions of the treatment experience (e.g., peer support, counselor rapport) also predicted therapeutic engagement; and (3) dynamic predictors and programmatic characteristics became more important over time. Implications for research, theory and policy are discussed.

  4. Non-model-based correction of respiratory motion using beat-to-beat 3D spiral fat-selective imaging.

    PubMed

    Keegan, Jennifer; Gatehouse, Peter D; Yang, Guang-Zhong; Firmin, David N

    2007-09-01

    To demonstrate the feasibility of retrospective beat-to-beat correction of respiratory motion, without the need for a respiratory motion model. A high-resolution three-dimensional (3D) spiral black-blood scan of the right coronary artery (RCA) of six healthy volunteers was acquired over 160 cardiac cycles without respiratory gating. One spiral interleaf was acquired per cardiac cycle, prior to each of which a complete low-resolution fat-selective 3D spiral dataset was acquired. The respiratory motion (3D translation) on each cardiac cycle was determined by cross-correlating a region of interest (ROI) in the fat around the artery in the low-resolution datasets with that on a reference end-expiratory dataset. The measured translations were used to correct the raw data of the high-resolution spiral interleaves. Beat-to-beat correction provided consistently good results, with the image quality being better than that obtained with a fixed superior-inferior tracking factor of 0.6 and better than (N = 5) or equal to (N = 1) that achieved using a subject-specific retrospective 3D translation motion model. Non-model-based correction of respiratory motion using 3D spiral fat-selective imaging is feasible, and in this small group of volunteers produced better-quality images than a subject-specific retrospective 3D translation motion model. (c) 2007 Wiley-Liss, Inc.
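
The registration step above estimates a rigid translation by cross-correlating an ROI against a reference dataset. A 1D sketch of the underlying operation, using a plain (un-normalized) correlation score for brevity (the paper works with 3D ROIs around the artery; names are illustrative):

```python
# Find the integer shift that maximizes the correlation between a signal
# and a reference template, the basic operation behind registering each
# beat's low-resolution fat-selective dataset to the end-expiratory one.
def best_shift(signal, reference, max_shift):
    def score(shift):
        # correlation of the reference against the signal displaced by 'shift'
        return sum(signal[i + shift] * reference[i]
                   for i in range(len(reference))
                   if 0 <= i + shift < len(signal))
    return max(range(-max_shift, max_shift + 1), key=score)

reference = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0]  # feature at index 3
signal = [0.0, 0.0, 0.0, 0.0, 1.0, 2.0, 1.0]     # same feature shifted by +2
print(best_shift(signal, reference, 3))  # → 2
```

In the 3D case the same search runs over x, y, and z displacements, and the resulting translation is used to phase-correct the raw data of the corresponding high-resolution spiral interleaf.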

  5. Delivery of RNA interference therapeutics using polycation-based nanoparticles.

    PubMed

    Howard, Kenneth Alan

    2009-07-25

    RNAi-based therapies are dependent on extracellular and intracellular delivery of RNA molecules for enabling target interaction. Polycation-based nanoparticles (or polyplexes) formed by self-assembly with RNA can be used to modulate pharmacokinetics and intracellular trafficking to improve the therapeutic efficacy of RNAi-based therapeutics. This review describes the application of polyplexes for extracellular and intracellular delivery of synthetic RNA molecules. Focus is given to routes of administration and silencing effects in animal disease models. The inclusion of functional components into the nanoparticle for controlling cellular trafficking and RNA release is discussed. This work highlights the versatile nature of polycation-based nanoparticles to fulfil the delivery requirements for RNA molecules with flexibility in design to evolve alongside an expanding repertoire of RNAi-based drugs.

  6. CRISPR/Cas9-based genetic correction for recessive dystrophic epidermolysis bullosa

    PubMed Central

    Webber, Beau R; Osborn, Mark J; McElroy, Amber N; Twaroski, Kirk; Lonetree, Cara-lin; DeFeo, Anthony P; Xia, Lily; Eide, Cindy; Lees, Christopher J; McElmurry, Ron T; Riddle, Megan J; Kim, Chong Jai; Patel, Dharmeshkumar D; Blazar, Bruce R; Tolar, Jakub

    2016-01-01

    Recessive dystrophic epidermolysis bullosa (RDEB) is a severe disorder caused by mutations to the COL7A1 gene that deactivate production of a structural protein essential for skin integrity. Haematopoietic cell transplantation can ameliorate some of the symptoms; however, significant side effects from the allogeneic transplant procedure can occur and unresponsive areas of blistering persist. Therefore, we employed genome editing in patient-derived cells to create an autologous platform for multilineage engineering of therapeutic cell types. The clustered regularly interspaced palindromic repeats (CRISPR)/Cas9 system facilitated correction of an RDEB-causing COL7A1 mutation in primary fibroblasts that were then used to derive induced pluripotent stem cells (iPSCs). The resulting iPSCs were subsequently re-differentiated into keratinocytes, mesenchymal stem cells (MSCs) and haematopoietic progenitor cells using defined differentiation strategies. Gene-corrected keratinocytes exhibited characteristic epithelial morphology and expressed keratinocyte-specific genes and transcription factors. iPSC-derived MSCs exhibited a spindle morphology and expression of CD73, CD90 and CD105 with the ability to undergo adipogenic, chondrogenic and osteogenic differentiation in vitro in a manner indistinguishable from bone marrow-derived MSCs. Finally, we used a vascular induction strategy to generate potent definitive haematopoietic progenitors capable of multilineage differentiation in methylcellulose-based assays. In totality, we have shown that CRISPR/Cas9 is an adaptable gene-editing strategy that can be coupled with iPSC technology to produce multiple gene-corrected autologous cell types with therapeutic potential for RDEB. PMID:28250968

  7. Structurally Based Therapeutic Evaluation: A Therapeutic and Practical Approach to Teaching Medicinal Chemistry.

    ERIC Educational Resources Information Center

    Alsharif, Naser Z.; And Others

    1997-01-01

    Explains structurally based therapeutic evaluation of drugs, which uses seven therapeutic criteria in translating chemical and structural knowledge into therapeutic decision making in pharmaceutical care. In a Creighton University (Nebraska) medicinal chemistry course, students apply the approach to solve patient-related therapeutic problems in…

  8. Disease correction by AAV-mediated gene therapy in a new mouse model of mucopolysaccharidosis type IIID.

    PubMed

    Roca, Carles; Motas, Sandra; Marcó, Sara; Ribera, Albert; Sánchez, Víctor; Sánchez, Xavier; Bertolin, Joan; León, Xavier; Pérez, Jennifer; Garcia, Miguel; Villacampa, Pilar; Ruberte, Jesús; Pujol, Anna; Haurigot, Virginia; Bosch, Fatima

    2017-04-15

    Gene therapy is a promising therapeutic alternative for Lysosomal Storage Disorders (LSD), as it is not necessary to correct the genetic defect in all cells of an organ to achieve therapeutically significant levels of enzyme in body fluids, from which non-transduced cells can take up the protein, correcting their enzymatic deficiency. Animal models are instrumental in the development of new treatments for LSD. Here we report the generation of the first mouse model of the LSD Mucopolysaccharidosis Type IIID (MPSIIID), also known as Sanfilippo syndrome type D. This autosomal recessive, heparan sulphate storage disease is caused by deficiency in N-acetylglucosamine 6-sulfatase (GNS). Mice deficient in GNS showed lysosomal storage pathology and loss of lysosomal homeostasis in the CNS and peripheral tissues, chronic widespread neuroinflammation, reduced locomotor and exploratory activity, and shortened lifespan, a phenotype that closely resembled human MPSIIID. Moreover, treatment of the GNS-deficient animals with GNS-encoding adeno-associated viral (AAV) vectors of serotype 9 delivered to the cerebrospinal fluid completely corrected pathological storage, improved lysosomal functionality in the CNS and somatic tissues, resolved neuroinflammation, restored normal behaviour and extended the lifespan of treated mice. Hence, this work represents the first step towards the development of a treatment for MPSIIID. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. A systematic review of model-based economic evaluations of diagnostic and therapeutic strategies for lower extremity artery disease.

    PubMed

    Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L

    2014-01-01

    Lower extremity artery disease (LEAD) is a sign of widespread atherosclerosis also affecting the coronary, cerebral and renal arteries, and is associated with an increased risk of cardiovascular events. Many economic evaluations have been published for LEAD due to its clinical, social and economic importance. The aim of this systematic review was to assess the modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure and methodological quality of published models. The electronic databases MEDLINE and EMBASE were searched up to February 2013 via the OVID interface. The Cochrane Database of Systematic Reviews, the Health Technology Assessment database hosted by the National Institute for Health Research, and the National Health Service Economic Evaluation Database (NHSEED) were also searched. The methodological quality of the included studies was assessed using the Philips checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies; three models compared diagnostic tests and two models compared a combination of diagnostic and therapeutic options for LEAD. The results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on society.

  10. Gene therapy augments the efficacy of hematopoietic cell transplantation and fully corrects mucopolysaccharidosis type I phenotype in the mouse model

    PubMed Central

    Visigalli, Ilaria; Delai, Stefania; Politi, Letterio S.; Di Domenico, Carmela; Cerri, Federica; Mrak, Emanuela; D'Isa, Raffaele; Ungaro, Daniela; Stok, Merel; Sanvito, Francesca; Mariani, Elisabetta; Staszewsky, Lidia; Godi, Claudia; Russo, Ilaria; Cecere, Francesca; del Carro, Ubaldo; Rubinacci, Alessandro; Brambilla, Riccardo; Quattrini, Angelo; Di Natale, Paola; Ponder, Katherine; Naldini, Luigi

    2010-01-01

    Type I mucopolysaccharidosis (MPS I) is a lysosomal storage disorder caused by the deficiency of α-L-iduronidase, which results in glycosaminoglycan accumulation in tissues. Clinical manifestations include skeletal dysplasia, joint stiffness, visual and auditory defects, cardiac insufficiency, hepatosplenomegaly, and mental retardation (the last being present exclusively in the severe Hurler variant). The available treatments, enzyme-replacement therapy and hematopoietic stem cell (HSC) transplantation, can ameliorate most disease manifestations, but their outcome on skeletal and brain disease could be further improved. We demonstrate here that HSC gene therapy, based on lentiviral vectors, completely corrects disease manifestations in the mouse model. Of note, the therapeutic benefit provided by gene therapy on critical MPS I manifestations, such as neurologic and skeletal disease, greatly exceeds that exerted by HSC transplantation, the standard of care treatment for Hurler patients. Interestingly, therapeutic efficacy of HSC gene therapy is strictly dependent on the achievement of supranormal enzyme activity in the hematopoietic system of transplanted mice, which allows enzyme delivery to the brain and skeleton for disease correction. Overall, our data provide evidence of an efficacious treatment for MPS I Hurler patients, warranting future development toward clinical testing. PMID:20847202

  11. Scene-based nonuniformity correction and enhancement: pixel statistics and subpixel motion.

    PubMed

    Zhao, Wenyi; Zhang, Chao

    2008-07-01

    We propose a framework for scene-based nonuniformity correction (NUC) and nonuniformity correction and enhancement (NUCE), which focal-plane-array-like sensors require to obtain clean, enhanced-quality images. The core of the proposed framework is a novel registration-based nonuniformity correction super-resolution (NUCSR) method that is bootstrapped by statistical scene-based NUC methods. Based on a comprehensive imaging model and accurate parametric motion estimation, we are able to remove severe/structured nonuniformity and, in the presence of subpixel motion, simultaneously improve image resolution. One important feature of our NUCSR method is the adoption of a parametric motion model that allows us to (1) handle many practical scenarios where parametric motions are present and (2) carry out, in principle, perfect super-resolution by exploiting the available subpixel motions. Experiments with real data demonstrate the efficiency of the proposed NUCE framework and the effectiveness of the NUCSR method.

  12. Therapeutic Enactment: Integrating Individual and Group Counseling Models for Change

    ERIC Educational Resources Information Center

    Westwood, Marvin J.; Keats, Patrice A.; Wilensky, Patricia

    2003-01-01

    The purpose of this article is to introduce the reader to a group-based therapy model known as therapeutic enactment. A description of this multimodal change model is provided by outlining the relevant background information, key concepts related to specific change processes, and the differences in this model compared to earlier psychodrama…

  13. C60 Fullerene as Promising Therapeutic Agent for the Prevention and Correction of Skeletal Muscle Functioning at Ischemic Injury

    NASA Astrophysics Data System (ADS)

    Nozdrenko, D. M.; Zavodovskyi, D. O.; Matvienko, T. Yu.; Zay, S. Yu.; Bogutska, K. I.; Prylutskyy, Yu. I.; Ritter, U.; Scharff, P.

    2017-02-01

    The therapeutic effect of a pristine C60 fullerene aqueous colloid solution (C60FAS) on the functioning of the rat soleus muscle after ischemic injury was investigated as a function of the stage of pathogenesis of the muscular system and the method of C60FAS administration in vivo. Intravenous administration of C60FAS was found to be optimal for correcting the speed-related macroparameters of contraction after ischemic muscle damage. At the same time, intramuscular administration of C60FAS shows a pronounced protective effect in movements associated with the generation of maximum force responses or prolonged contractions, which increase the level of muscle fatigue. Analysis of the concentrations of the enzymes creatine phosphokinase and lactate dehydrogenase in the blood of the experimental animals directly indicates that C60FAS may be a promising therapeutic agent for the prevention and correction of ischemia-damaged skeletal muscle function.

  14. Clinically advancing and promising polymer-based therapeutics.

    PubMed

    Souery, Whitney N; Bishop, Corey J

    2018-02-01

    In this review article, we will examine the history of polymers and their evolution from provisional World War II materials to medical therapeutics. To provide a comprehensive look at the current state of polymer-based therapeutics, we will classify technologies according to targeted areas of interest, including central nervous system-based and intraocular-, gastrointestinal-, cardiovascular-, dermal-, reproductive-, skeletal-, and neoplastic-based systems. Within each of these areas, we will consider several examples of novel, clinically available polymer-based therapeutics; in addition, this review will also include a discussion of developing therapies, ranging from the in vivo to clinical trial stage, for each targeted area of treatment. Finally, we will emphasize areas of patient care in need of more effective, accessible, and targeted treatment approaches where polymer-based therapeutics may offer potential solutions. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  15. On the importance of appropriate precipitation gauge catch correction for hydrological modelling at mid to high latitudes

    NASA Astrophysics Data System (ADS)

    Stisen, S.; Højberg, A. L.; Troldborg, L.; Refsgaard, J. C.; Christensen, B. S. B.; Olsen, M.; Henriksen, H. J.

    2012-11-01

    Precipitation gauge catch correction is often given very little attention in hydrological modelling compared to model parameter calibration. This is critical because significant precipitation biases often make the calibration exercise pointless, especially when supposedly physically based models are in play. This study addresses the general importance of appropriate precipitation catch correction through a detailed modelling exercise. An existing precipitation gauge catch correction method addressing solid and liquid precipitation is applied, both as national mean monthly correction factors based on a historic 30-year record and as gridded daily correction factors based on local daily observations of wind speed and temperature. The two methods, named the historic mean monthly (HMM) and the time-space variable (TSV) correction, resulted in different winter precipitation rates for the period 1990-2010. The resulting precipitation datasets were evaluated through the comprehensive Danish National Water Resources model (DK-Model), revealing major differences in both model performance and optimised model parameter sets. Simulated stream discharge is improved significantly when introducing the TSV correction, whereas the simulated hydraulic heads and multi-annual water balances performed similarly, due to recalibration adjusting model parameters to compensate for input biases. The resulting optimised model parameters are much more physically plausible for the model based on the TSV correction of precipitation. A proxy-basin test in which calibrated DK-Model parameters were transferred to another region without site-specific calibration showed better performance for parameter values based on the TSV correction. Similarly, the performance of the TSV correction method was superior when considering two single years with a much drier and a much wetter winter, respectively, compared to the winters in the calibration period (differential split-sample tests). We conclude that the TSV correction is preferable to the HMM correction, and that appropriate precipitation gauge catch correction is essential for hydrological modelling at mid to high latitudes.
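The difference between the HMM and TSV schemes lies in how the catch-correction factor is obtained; the TSV approach derives it from daily local wind speed and temperature, with solid precipitation corrected far more aggressively than liquid. A toy sketch of such a dynamic correction (the coefficients and thresholds below are placeholders chosen only to show the structure, not the calibrated Danish values):

```python
def catch_correction_factor(wind_speed_ms, temperature_c):
    """Illustrative dynamic catch-correction factor for a precipitation gauge.

    Gauges undercatch true precipitation, especially snow in windy
    conditions. Real schemes use calibrated curves per gauge type; these
    coefficients are placeholders for illustration.
    """
    solid = 1.0 + 0.10 * wind_speed_ms    # snow: strong wind-driven undercatch
    liquid = 1.0 + 0.02 * wind_speed_ms   # rain: modest undercatch
    if temperature_c >= 2.0:
        return liquid
    if temperature_c <= -2.0:
        return solid
    # mixed phase: interpolate linearly between -2 C and +2 C
    w = (temperature_c + 2.0) / 4.0
    return w * liquid + (1.0 - w) * solid

def corrected_precip(gauge_mm, wind_speed_ms, temperature_c):
    """Scale the gauge reading by the catch-correction factor."""
    return gauge_mm * catch_correction_factor(wind_speed_ms, temperature_c)
```

The HMM variant would replace the wind/temperature arguments with a fixed factor per calendar month; the modelling exercise above shows how much that simplification can cost.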

  16. Reduction of wafer-edge overlay errors using advanced correction models, optimized for minimal metrology requirements

    NASA Astrophysics Data System (ADS)

    Kim, Min-Suk; Won, Hwa-Yeon; Jeong, Jong-Mun; Böcker, Paul; Vergaij-Huizer, Lydia; Kupers, Michiel; Jovanović, Milenko; Sochal, Inez; Ryan, Kevin; Sun, Kyu-Tae; Lim, Young-Wan; Byun, Jin-Moo; Kim, Gwang-Gon; Suh, Jung-Joon

    2016-03-01

    In order to optimize yield in DRAM semiconductor manufacturing for 2x nodes and beyond, the (processing induced) overlay fingerprint towards the edge of the wafer needs to be reduced. Traditionally, this is achieved by acquiring denser overlay metrology at the edge of the wafer, to feed field-by-field corrections. Although field-by-field corrections can be effective in reducing localized overlay errors, the requirement for dense metrology to determine the corrections can become a limiting factor due to a significant increase of metrology time and cost. In this study, a more cost-effective solution has been found in extending the regular correction model with an edge-specific component. This new overlay correction model can be driven by an optimized, sparser sampling especially at the wafer edge area, and also allows for a reduction of noise propagation. Lithography correction potential has been maximized, with significantly less metrology needs. Evaluations have been performed, demonstrating the benefit of edge models in terms of on-product overlay performance, as well as cell based overlay performance based on metrology-to-cell matching improvements. Performance can be increased compared to POR modeling and sampling, which can contribute to (overlay based) yield improvement. Based on advanced modeling including edge components, metrology requirements have been optimized, enabling integrated metrology which drives down overall metrology fab footprint and lithography cycle time.

  17. Lipid correction model of carbon stable isotopes for a cosmopolitan predator, spiny dogfish Squalus acanthias.

    PubMed

    Reum, J C P

    2011-12-01

    Three lipid correction models were evaluated for liver and white dorsal muscle from Squalus acanthias. For muscle, all three models performed well, based on the Akaike Information Criterion value corrected for small sample sizes (AICc), and predicted similar lipid corrections to δ13C that were up to 2.8‰ higher than those predicted using previously published models based on multispecies data. For liver, which possessed higher bulk C:N values than white muscle, all three models performed poorly, and lipid-corrected δ13C values were best approximated by simply adding 5.74‰ to bulk δ13C values. © 2011 The Author. Journal of Fish Biology © 2011 The Fisheries Society of the British Isles.
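The two tissue-specific corrections described above can be sketched as follows. The liver offset (+5.74‰) is the value reported in the abstract; for muscle, the paper's fitted coefficients are not given here, so the sketch falls back on the widely used Post et al. (2007) aquatic C:N normalisation as a stand-in:

```python
def lipid_correct_liver(d13c_bulk):
    """Liver: the study found a constant offset worked best (+5.74 permil)."""
    return d13c_bulk + 5.74

def lipid_correct_muscle(d13c_bulk, cn_ratio, a=-3.32, b=0.99):
    """Muscle: generic linear C:N-based normalisation,
    d13C' = d13C + a + b * C:N.

    Defaults are the Post et al. (2007) aquatic coefficients, NOT the
    dogfish-specific fits from this paper; treat them as placeholders.
    """
    return d13c_bulk + a + b * cn_ratio
```

The practical point of the paper is that such multispecies defaults can misestimate corrections for a lipid-rich predator by up to 2.8‰, which is why species- and tissue-specific fits matter.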

  18. Immunogenicity of therapeutic proteins: the use of animal models.

    PubMed

    Brinks, Vera; Jiskoot, Wim; Schellekens, Huub

    2011-10-01

    Immunogenicity of therapeutic proteins lowers patient well-being and drastically increases therapeutic costs. Preventing immunogenicity is an important issue to consider when developing novel therapeutic proteins and applying them in the clinic. Animal models are increasingly used to study immunogenicity of therapeutic proteins. They are employed as predictive tools to assess different aspects of immunogenicity during drug development and have become vital in studying the mechanisms underlying immunogenicity of therapeutic proteins. However, the use of animal models needs critical evaluation. Because of species differences, predictive value of such models is limited, and mechanistic studies can be restricted. This review addresses the suitability of animal models for immunogenicity prediction and summarizes the insights in immunogenicity that they have given so far.

  19. Non-stationary Bias Correction of Monthly CMIP5 Temperature Projections over China using a Residual-based Bagging Tree Model

    NASA Astrophysics Data System (ADS)

    Yang, T.; Lee, C.

    2017-12-01

    Biases in Global Circulation Models (GCMs) are crucial to account for when assessing future climate changes. Currently, most bias correction methodologies suffer from the assumption that model bias is stationary. This paper provides a non-stationary bias correction model, termed the Residual-based Bagging Tree (RBT) model, to reduce simulation biases and to quantify the contributions of single models. Specifically, the proposed model estimates the residuals between individual models and observations, and takes the differences between observations and the ensemble mean into consideration during the model training process. A case study is conducted for 10 major river basins in Mainland China during different seasons. Results show that the proposed model is capable of providing accurate and stable predictions while incorporating the non-stationarities into the modeling framework. Significant reductions in both bias and root mean squared error are achieved with the proposed RBT model, especially for the central and western parts of China. The proposed RBT model has consistently better performance in reducing biases when compared to the raw ensemble mean, the ensemble mean with simple additive bias correction, and the single best model for different seasons. Furthermore, the contribution of each single GCM in reducing the overall bias is quantified; single-model importance varies between 3.1% and 7.2%. For the future scenarios RCP 2.6, RCP 4.5, and RCP 8.5, the RBT model suggests temperature increases of 1.44 °C, 2.59 °C, and 4.71 °C by the end of the century, respectively, compared to the average temperature during 1970-1999.
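The core idea of the RBT scheme, learning the residual between observations and the ensemble mean with a bagged tree ensemble, can be illustrated with a minimal pure-Python sketch. One-split regression stumps stand in for full regression trees, and all names, features, and data are illustrative, not the paper's implementation:

```python
import random

def fit_stump(xs, ys):
    """Fit a one-split regression stump on a single scalar feature."""
    order = sorted(set(xs))
    if len(order) < 2:                      # degenerate bootstrap sample
        m = sum(ys) / len(ys)
        return lambda x: m
    best = None
    for thr in order[1:]:
        left = [y for x, y in zip(xs, ys) if x < thr]
        right = [y for x, y in zip(xs, ys) if x >= thr]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - lm) ** 2 for y in left)
               + sum((y - rm) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda x: lm if x < thr else rm

def fit_residual_bagging(xs, ensemble_mean, obs, n_trees=25, seed=0):
    """Bag stumps on the residuals (obs - ensemble mean); the corrected
    projection is then ensemble mean + averaged stump prediction."""
    rng = random.Random(seed)
    resid = [o - m for o, m in zip(obs, ensemble_mean)]
    n = len(xs)
    stumps = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap resample
        stumps.append(fit_stump([xs[i] for i in idx],
                                [resid[i] for i in idx]))
    return lambda x, mean: mean + sum(s(x) for s in stumps) / len(stumps)
```

Because the correction is learned as a function of a covariate (here a toy "month" index) rather than as a single constant, the scheme can represent biases that change with season or regime, which is what the non-stationarity claim amounts to.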

  20. Protein and peptide-based therapeutics in periodontal regeneration.

    PubMed

    Reynolds, Mark A; Aichelmann-Reidy, Mary E

    2012-09-01

    Protein and peptide-based therapeutics provide a unique strategy for controlling highly specific and complex biologic actions that cannot be accomplished by simple devices or chemical compounds. This article reviews some of the key characteristics and summarizes the clinical effectiveness of protein and peptide-based therapeutics targeting periodontal regeneration. A literature search was conducted of randomized clinical trials and systematic reviews evaluating protein and peptide-based therapeutics for the regeneration of periodontal tissues of at least 6 months duration. Data sources included PubMed and Embase electronic databases, hand-searched journals, and the ClinicalTrials.gov registry. Commercially marketed protein and peptide-based therapeutics for periodontal regeneration provide gains in clinical attachment level and bone formation that are comparable or superior to other regenerative approaches. Results from several clinical trials indicate that protein and peptide-based therapies can accelerate repair and regeneration when compared with other treatments and that improvements in clinical parameters continue beyond 12 months. Protein and peptide-based therapies also exhibit the capacity to increase the predictability of treatment outcomes. Clinical and histologic studies support the effectiveness of protein- and peptide-based therapeutics for periodontal regeneration. Emerging evidence suggests that the delivery devices/scaffolds play a critical role in determining the effectiveness of this class of therapeutics. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Therapeutic applications of CRISPR RNA-guided genome editing.

    PubMed

    Koo, Taeyoung; Kim, Jin-Soo

    2017-01-01

    The rapid development of programmable nuclease-based genome editing technologies has enabled targeted gene disruption and correction both in vitro and in vivo. This revolution opens up the possibility of precise genome editing at target genomic sites to modulate gene function in animals and plants. Among several programmable nucleases, the type II clustered regularly interspaced short palindromic repeats (CRISPR)-CRISPR-associated nuclease 9 (Cas9) system has progressed remarkably in recent years, leading to its widespread use in research, medicine and biotechnology. In particular, CRISPR-Cas9 shows highly efficient gene editing activity for therapeutic purposes in systems ranging from patient stem cells to animal models. However, the development of therapeutic approaches and delivery methods remains a great challenge for biomedical applications. Herein, we review therapeutic applications that use the CRISPR-Cas9 system and discuss the possibilities and challenges ahead. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  2. Interactive surface correction for 3D shape based segmentation

    NASA Astrophysics Data System (ADS)

    Schwarz, Tobias; Heimann, Tobias; Tetzlaff, Ralf; Rau, Anne-Mareike; Wolf, Ivo; Meinzer, Hans-Peter

    2008-03-01

    Statistical shape models have become a fast and robust method for segmentation of anatomical structures in medical image volumes. In clinical practice, however, pathological cases and image artifacts can lead to local deviations of the detected contour from the true object boundary. These deviations have to be corrected manually. We present an intuitively applicable solution for surface interaction based on Gaussian deformation kernels. The method is evaluated by two radiological experts on segmentations of the liver in contrast-enhanced CT images and of the left heart ventricle (LV) in MRI data. For both applications, five datasets are segmented automatically using deformable shape models, and the resulting surfaces are corrected manually. The interactive correction step improves the average surface distance against ground truth from 2.43 mm to 2.17 mm for the liver, and from 2.71 mm to 1.34 mm for the LV. We expect this method to raise the acceptance of automatic segmentation methods in clinical application.
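The Gaussian deformation kernel behind this kind of surface interaction can be sketched compactly: the picked surface point follows the user's drag exactly, and neighbouring vertices follow with a Gaussian falloff. Function names and the default kernel width are hypothetical, not taken from the paper:

```python
import math

def gaussian_surface_drag(vertices, handle, target, sigma=10.0):
    """Interactive surface correction with a Gaussian deformation kernel.

    vertices : list of (x, y, z) mesh vertex positions
    handle   : picked surface point (x, y, z)
    target   : position the user dragged the handle to
    sigma    : kernel width (same units as the coordinates, e.g. mm)

    Each vertex moves by the handle displacement scaled with a Gaussian
    weight of its distance to the handle, so the edit stays local.
    """
    disp = tuple(t - h for t, h in zip(target, handle))
    moved = []
    for v in vertices:
        d2 = sum((a - b) ** 2 for a, b in zip(v, handle))
        w = math.exp(-d2 / (2.0 * sigma ** 2))   # Gaussian kernel weight
        moved.append(tuple(a + w * d for a, d in zip(v, disp)))
    return moved
```

The kernel width controls the trade-off the paper's evaluation measures: a small sigma fixes a local contour error without disturbing correctly segmented regions, while a large sigma approaches a global translation.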

  3. Effect of an Ergonomics-Based Educational Intervention Based on Transtheoretical Model in Adopting Correct Body Posture Among Operating Room Nurses.

    PubMed

    Moazzami, Zeinab; Dehdari, Tahere; Taghdisi, Mohammad Hosein; Soltanian, Alireza

    2015-11-03

    One of the preventive strategies for chronic low back pain among operating room nurses is instruction in proper body mechanics and postural behavior, for which the use of the Transtheoretical Model (TTM) has been recommended. Eighty-two nurses who were in the contemplation and preparation stages for adopting correct body posture were randomly selected (control group = 40, intervention group = 42). TTM variables and body posture were measured at baseline and again 1 and 6 months after the intervention. A four-week ergonomics educational intervention based on TTM variables was designed and conducted for the nurses in the intervention group. Following the intervention, a higher proportion of nurses in the intervention group moved into the action stage (p < 0.05). Mean scores of self-efficacy, pros, experiential processes and correct body posture were also significantly higher in the intervention group (p < 0.05). No significant differences were found in the cons and behavioral processes, except for self-liberation, between the two groups (p > 0.05) after the intervention. The TTM provides a suitable framework for developing stage-based ergonomics interventions for postural behavior.

  4. Efficient color correction method for smartphone camera-based health monitoring application.

    PubMed

    Duc Dang; Chae Ho Cho; Daeik Kim; Oh Seok Kwon; Jo Woon Chong

    2017-07-01

    Smartphone health monitoring applications have recently been highlighted due to the rapid development of the hardware and software performance of smartphones. However, the color characteristics of images captured by different smartphone models are dissimilar to each other, and this difference may yield inconsistent health monitoring results when smartphone health monitoring applications monitor physiological information using their embedded cameras. In this paper, we investigate the differences in the color properties of images captured by different smartphone models and apply a color correction method to adjust the dissimilar color values obtained from different smartphone cameras. Experimental results show that images color-corrected using this method exhibit much smaller color intensity errors than images without correction. These results can be applied to enhance the consistency of smartphone camera-based health monitoring applications by reducing color intensity errors among the images obtained from different smartphones.
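A common way to implement such a correction is to fit a linear map from one camera's colour values to a reference. The sketch below fits a per-channel gain and offset by ordinary least squares from paired measurements of a colour target; production pipelines often fit a full 3x3 colour matrix instead. All names are illustrative, not the paper's method:

```python
def fit_channel_correction(measured, reference):
    """Least-squares fit of gain/offset for one colour channel so that
    gain * measured + offset approximates the reference values."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(reference) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(measured, reference))
    var = sum((x - mx) ** 2 for x in measured)
    gain = cov / var
    offset = my - gain * mx
    return gain, offset

def correct_channel(values, gain, offset):
    """Apply the fitted correction, clamped to the valid 8-bit range."""
    return [min(255.0, max(0.0, gain * v + offset)) for v in values]
```

Fitting one such pair per channel against a reference phone (or a calibrated colour chart) is enough to make intensity-based physiological measurements comparable across devices, which is the consistency goal the abstract describes.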

  5. Use of Therapeutic Drug Monitoring, Electronic Health Record Data, and Pharmacokinetic Modeling to Determine the Therapeutic Index of Phenytoin and Lamotrigine

    PubMed Central

    Ku, Lawrence C.; Wu, Huali; Greenberg, Rachel G.; Hill, Kevin D.; Gonzalez, Daniel; Hornik, Christoph P.; Berezny, Alysha; Guptill, Jeffrey T.; Jiang, Wenlei; Zheng, Nan; Cohen-Wolkowiez, Michael; Melloni, Chiara

    2016-01-01

    Background: Defining a drug's therapeutic index (TI) is important for patient safety and regulating the development of generic drugs. For many drugs, the TI is unknown. A systematic approach was developed to characterize the TI of a drug using therapeutic drug monitoring and electronic health record (EHR) data with pharmacokinetic (PK) modeling. This approach was first tested on phenytoin, which has a known TI, and then applied to lamotrigine, which lacks a defined TI. Methods: Retrospective EHR data from patients in a tertiary hospital were used to develop phenytoin and lamotrigine population PK models and to identify adverse events (anemia, thrombocytopenia, and leukopenia) and efficacy outcomes (seizure-free). Phenytoin and lamotrigine concentrations were simulated for each day with an adverse event or seizure. Relationships between simulated concentrations and adverse events and efficacy outcomes were used to calculate the TI for phenytoin and lamotrigine. Results: For phenytoin, 93 patients with 270 total and 174 free concentrations were identified. A de novo 1-compartment PK model with Michaelis-Menten kinetics described the data well. Simulated average total and free concentrations of 10-15 and 1.0-1.5 μg/mL were associated with both adverse events and efficacy in 50% of patients, resulting in a TI of 0.7-1.5. For lamotrigine, 45 patients with 53 concentrations were identified. A published 1-compartment model was adapted to characterize the PK data. No relationships between simulated lamotrigine concentrations and safety or efficacy endpoints were seen; therefore, the TI could not be calculated. Conclusions: This approach correctly determined the TI of phenytoin but was unable to determine the TI of lamotrigine due to a limited sample size. The use of therapeutic drug monitoring and EHR data to aid in narrow TI drug classification is promising, but it requires an adequate sample size and accurate characterization of concentration-response relationships.

  6. Use of Therapeutic Drug Monitoring, Electronic Health Record Data, and Pharmacokinetic Modeling to Determine the Therapeutic Index of Phenytoin and Lamotrigine.

    PubMed

    Ku, Lawrence C; Wu, Huali; Greenberg, Rachel G; Hill, Kevin D; Gonzalez, Daniel; Hornik, Christoph P; Berezny, Alysha; Guptill, Jeffrey T; Jiang, Wenlei; Zheng, Nan; Cohen-Wolkowiez, Michael; Melloni, Chiara

    2016-12-01

    Defining a drug's therapeutic index (TI) is important for patient safety and regulating the development of generic drugs. For many drugs, the TI is unknown. A systematic approach was developed to characterize the TI of a drug using therapeutic drug monitoring and electronic health record (EHR) data with pharmacokinetic (PK) modeling. This approach was first tested on phenytoin, which has a known TI, and then applied to lamotrigine, which lacks a defined TI. Retrospective EHR data from patients in a tertiary hospital were used to develop phenytoin and lamotrigine population PK models and to identify adverse events (anemia, thrombocytopenia, and leukopenia) and efficacy outcomes (seizure-free). Phenytoin and lamotrigine concentrations were simulated for each day with an adverse event or seizure. Relationships between simulated concentrations and adverse events and efficacy outcomes were used to calculate the TI for phenytoin and lamotrigine. For phenytoin, 93 patients with 270 total and 174 free concentrations were identified. A de novo 1-compartment PK model with Michaelis-Menten kinetics described the data well. Simulated average total and free concentrations of 10-15 and 1.0-1.5 mcg/mL were associated with both adverse events and efficacy in 50% of patients, resulting in a TI of 0.7-1.5. For lamotrigine, 45 patients with 53 concentrations were identified. A published 1-compartment model was adapted to characterize the PK data. No relationships between simulated lamotrigine concentrations and safety or efficacy endpoints were seen; therefore, the TI could not be calculated. This approach correctly determined the TI of phenytoin but was unable to determine the TI of lamotrigine due to a limited sample size. The use of therapeutic drug monitoring and EHR data to aid in narrow TI drug classification is promising, but it requires an adequate sample size and accurate characterization of concentration-response relationships.
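The phenytoin model structure described here, one compartment with Michaelis-Menten elimination, can be sketched with a simple Euler simulation of daily dosing. Parameter values below are illustrative placeholders, not the study's population estimates:

```python
def simulate_phenytoin(daily_dose_mg, v_d=50.0, vmax=20.0, km=5.0,
                       days=14, dt_h=0.1):
    """Euler simulation of a 1-compartment PK model with Michaelis-Menten
    elimination:

        dC/dt = -(Vmax / Vd) * C / (Km + C)

    with a once-daily bolus dose. Units: C in mg/L, Vd in L, Vmax in mg/h,
    Km in mg/L. Returns the trough concentration at the end of each day.
    """
    c = 0.0
    troughs = []
    steps_per_day = int(round(24.0 / dt_h))
    for _ in range(days):
        c += daily_dose_mg / v_d              # instantaneous bolus input
        for _ in range(steps_per_day):
            c -= dt_h * vmax * c / ((km + c) * v_d)
            c = max(c, 0.0)
        troughs.append(c)
    return troughs
```

Because elimination saturates, doubling the dose more than doubles the steady-state trough; this saturating (nonlinear) kinetics is exactly why phenytoin concentrations must be simulated from a fitted PK model rather than scaled linearly from dose, and why its TI is so narrow.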

  7. Comparison of analytical and numerical approaches for CT-based aberration correction in transcranial passive acoustic imaging

    NASA Astrophysics Data System (ADS)

    Jones, Ryan M.; Hynynen, Kullervo

    2016-01-01

    Computed tomography (CT)-based aberration corrections are employed in transcranial ultrasound for both therapy and imaging. In this study, analytical and numerical approaches for calculating aberration corrections based on CT data were compared, with a particular focus on their application to transcranial passive imaging. Two models were investigated: a three-dimensional full-wave numerical model (Connor and Hynynen 2004 IEEE Trans. Biomed. Eng. 51 1693-706) based on the Westervelt equation, and an analytical method (Clement and Hynynen 2002 Ultrasound Med. Biol. 28 617-24) similar to that currently employed by commercial brain therapy systems. Trans-skull time delay corrections calculated from each model were applied to data acquired by a sparse hemispherical (30 cm diameter) receiver array (128 piezoceramic discs: 2.5 mm diameter, 612 kHz center frequency) passively listening through ex vivo human skullcaps (n  =  4) to emissions from a narrow-band, fixed source emitter (1 mm diameter, 516 kHz center frequency). Measurements were taken at various locations within the cranial cavity by moving the source around the field using a three-axis positioning system. Images generated through passive beamforming using CT-based skull corrections were compared with those obtained through an invasive source-based approach, as well as images formed without skull corrections, using the main lobe volume, positional shift, peak sidelobe ratio, and image signal-to-noise ratio as metrics for image quality. For each CT-based model, corrections achieved by allowing for heterogeneous skull acoustical parameters in simulation outperformed the corresponding case where homogeneous parameters were assumed. Of the CT-based methods investigated, the full-wave model provided the best imaging results at the cost of computational complexity. These results highlight the importance of accurately modeling trans-skull propagation when calculating CT-based aberration corrections.
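The passive beamforming step that the skull corrections feed into can be sketched as narrowband delay-and-sum imaging. This is a structural illustration only: the geometry and signals are hypothetical, and the `skull_delays` argument stands in for the CT-model-derived time-of-flight corrections.

```python
import numpy as np

def passive_beamform(signals, fs, rx_pos, grid_pts, c0, skull_delays=None):
    """Narrowband delay-and-sum passive beamforming (illustrative sketch).
    signals: (n_rx, n_t) received waveforms; rx_pos: (n_rx, 3) in metres;
    grid_pts: (n_pts, 3) candidate source locations; c0: sound speed (m/s);
    skull_delays: optional (n_rx, n_pts) extra time-of-flight corrections (s),
    e.g. from a CT-based skull model; None gives water-path-only beamforming."""
    n_rx, n_t = signals.shape
    t = np.arange(n_t) / fs
    img = np.zeros(len(grid_pts))
    for k, p in enumerate(grid_pts):
        tof = np.linalg.norm(rx_pos - p, axis=1) / c0      # water-path delays
        if skull_delays is not None:
            tof = tof + skull_delays[:, k]                  # model-based correction
        # time-align each channel to the candidate point and sum coherently
        aligned = np.array([np.interp(t, t - tof[i], signals[i])
                            for i in range(n_rx)])
        img[k] = np.mean(aligned.sum(axis=0) ** 2)          # beamformed intensity
    return img
```

When the assumed delays match the true propagation, the channels add coherently and the image peaks at the source; delay errors (e.g. an uncorrected skull) blur and shift that peak, which is exactly what the paper's image-quality metrics measure.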

  8. Color correction with blind image restoration based on multiple images using a low-rank model

    NASA Astrophysics Data System (ADS)

    Li, Dong; Xie, Xudong; Lam, Kin-Man

    2014-03-01

    We present a method that can handle the color correction of multiple photographs with blind image restoration simultaneously and automatically. We prove that the local colors of a set of images of the same scene exhibit the low-rank property locally, both before and after a color-correction operation. This property allows us to correct all kinds of errors in an image under a low-rank matrix model without particular priors or assumptions. The possible errors may be caused by changes of viewpoint, large illumination variations, gross pixel corruptions, partial occlusions, etc. Furthermore, a new iterative soft-segmentation method is proposed for local color transfer using color influence maps. Because the correct color information and the spatial information of images can be recovered using the low-rank model, more precise color correction and many other image-restoration tasks, including image denoising, image deblurring, and gray-scale image colorization, can be performed simultaneously. Experiments have verified that our method achieves consistent and promising results on uncontrolled real photographs acquired from the Internet and that it outperforms current state-of-the-art methods.
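The low-rank-plus-sparse idea behind this record can be sketched with a toy alternating scheme: fit a low-rank approximation, then peel off large residuals as sparse gross errors. This is an illustrative stand-in, not the paper's actual solver.

```python
import numpy as np

def lowrank_color_recovery(D, rank=1, n_iter=10, tau=None):
    """Toy low-rank + sparse separation (illustrative, not the paper's method).
    D: (n_pixels, n_images) matrix whose columns are the same local region
    observed in different photographs. Alternate between (i) a rank-`rank`
    SVD fit L and (ii) keeping only large residuals as sparse errors S."""
    if tau is None:
        tau = 3.0 * np.std(D)
    S = np.zeros_like(D)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]        # low-rank fit
        R = D - L
        S = np.where(np.abs(R) > tau, R, 0.0)           # sparse gross errors
    return L, S
```

The recovered `L` plays the role of the consistent color structure shared across photographs, while `S` isolates gross pixel corruptions or occlusions.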

  9. How does bias correction of regional climate model precipitation affect modelled runoff?

    NASA Astrophysics Data System (ADS)

    Teng, J.; Potter, N. J.; Chiew, F. H. S.; Zhang, L.; Wang, B.; Vaze, J.; Evans, J. P.

    2015-02-01

    Many studies bias correct daily precipitation from climate models to match the observed precipitation statistics, and the bias corrected data are then used for various modelling applications. This paper presents a review of recent methods used to bias correct precipitation from regional climate models (RCMs). The paper then assesses four bias correction methods applied to the weather research and forecasting (WRF) model simulated precipitation, and the follow-on impact on modelled runoff for eight catchments in southeast Australia. Overall, the best results are produced by either quantile mapping or a newly proposed two-state gamma distribution mapping method. However, the differences between the methods are small in the modelling experiments here (and as reported in the literature), mainly due to the substantial corrections required and inconsistent errors over time (non-stationarity). The errors in bias corrected precipitation are typically amplified in modelled runoff. The tested methods cannot overcome limitations of the RCM in simulating precipitation sequence, which affects runoff generation. Results further show that whereas bias correction does not seem to alter change signals in precipitation means, it can introduce additional uncertainty to change signals in high precipitation amounts and, consequently, in runoff. Future climate change impact studies need to take this into account when deciding whether to use raw or bias corrected RCM results. Nevertheless, RCMs will continue to improve and will become increasingly useful for hydrological applications as the bias in RCM simulations reduces.
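The quantile-mapping approach this record evaluates can be sketched in a few lines. This is the empirical version; the paper's proposed method instead fits a two-state gamma distribution, which the sketch does not attempt.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping (a common bias-correction method).
    Each raw model value is replaced by the observed value found at the
    same empirical quantile of the historical distributions."""
    q = np.linspace(0.0, 1.0, 101)
    mq = np.quantile(model_hist, q)   # model climatology quantiles
    oq = np.quantile(obs_hist, q)     # observed climatology quantiles
    # value -> its quantile in the model climatology -> observed value there
    ranks = np.interp(model_future, mq, q)
    return np.interp(ranks, q, oq)
```

Note the limitation the paper stresses: a mapping calibrated on one period assumes the model's error structure is stationary, so corrected precipitation (and the runoff driven by it) still inherits errors in the simulated precipitation sequence.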

  10. Factors associated with therapeutic inertia in hypertension: validation of a predictive model.

    PubMed

    Redón, Josep; Coca, Antonio; Lázaro, Pablo; Aguilar, Ma Dolores; Cabañas, Mercedes; Gil, Natividad; Sánchez-Zamorano, Miguel Angel; Aranda, Pedro

    2010-08-01

    To study factors associated with therapeutic inertia in treating hypertension and to develop a predictive model to estimate the probability of therapeutic inertia in a given medical consultation, based on variables related to the consultation, patient, physician, clinical characteristics, and level of care. National, multicentre, observational, cross-sectional study in primary care and specialist (hospital) physicians who each completed a questionnaire on therapeutic inertia, provided professional data and collected clinical data on four patients. Therapeutic inertia was defined as a consultation in which treatment change was indicated (i.e., SBP ≥ 140 mmHg or DBP ≥ 90 mmHg in all patients; SBP ≥ 130 mmHg or DBP ≥ 80 mmHg in patients with diabetes or stroke) but did not occur. A predictive model was constructed and validated according to the factors associated with therapeutic inertia. Data were collected on 2595 patients and 13,792 visits. Therapeutic inertia occurred in 7546 (75%) of the 10,041 consultations in which treatment change was indicated. Factors associated with therapeutic inertia were primary care setting, male sex, older age, SBP and/or DBP values close to normal, treatment with more than one antihypertensive drug, treatment with an angiotensin II receptor blocker (ARB), and more than six visits/year. Physician characteristics did not weigh heavily in the association. The predictive model was valid internally and externally, with acceptable calibration, discrimination and reproducibility, and explained one-third of the variability in therapeutic inertia. Although therapeutic inertia is frequent in the management of hypertension, the factors explaining it are not completely clear. Whereas some aspects of the consultations were associated with therapeutic inertia, physician characteristics were not a decisive factor.
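The study's operational definition of therapeutic inertia is mechanical enough to state directly in code, using the blood-pressure thresholds given in the abstract:

```python
def treatment_change_indicated(sbp, dbp, diabetes_or_stroke=False):
    """Apply the study's thresholds: a change is indicated at >= 140/90 mmHg,
    or >= 130/80 mmHg for patients with diabetes or prior stroke."""
    s_lim, d_lim = (130, 80) if diabetes_or_stroke else (140, 90)
    return sbp >= s_lim or dbp >= d_lim

def therapeutic_inertia(sbp, dbp, treatment_changed, diabetes_or_stroke=False):
    """A consultation shows therapeutic inertia when a treatment change was
    indicated but treatment was not modified."""
    return (treatment_change_indicated(sbp, dbp, diabetes_or_stroke)
            and not treatment_changed)
```

Classifying each of the 13,792 recorded visits this way yields the denominator (change indicated) and numerator (change indicated but not made) behind the study's 75% inertia figure.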

  11. Therapeutic gene editing: delivery and regulatory perspectives.

    PubMed

    Shim, Gayong; Kim, Dongyoon; Park, Gyu Thae; Jin, Hyerim; Suh, Soo-Kyung; Oh, Yu-Kyoung

    2017-06-01

    Gene-editing technology is an emerging therapeutic modality for manipulating the eukaryotic genome by using target-sequence-specific engineered nucleases. Because of the exceptional advantages that gene-editing technology offers in facilitating the accurate correction of sequences in a genome, gene editing-based therapy is being aggressively developed as a next-generation therapeutic approach to treat a wide range of diseases. However, strategies for precise engineering and delivery of gene-editing nucleases, including zinc finger nucleases, transcription activator-like effector nuclease, and CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats-associated nuclease Cas9), present major obstacles to the development of gene-editing therapies, as with other gene-targeting therapeutics. Currently, viral and non-viral vectors are being studied for the delivery of these nucleases into cells in the form of DNA, mRNA, or proteins. Clinical trials are already ongoing, and in vivo studies are actively investigating the applicability of CRISPR/Cas9 techniques. However, the concept of correcting the genome poses major concerns from a regulatory perspective, especially in terms of safety. This review addresses current research trends and delivery strategies for gene editing-based therapeutics in non-clinical and clinical settings and considers the associated regulatory issues.

  12. Therapeutic gene editing: delivery and regulatory perspectives

    PubMed Central

    Shim, Gayong; Kim, Dongyoon; Park, Gyu Thae; Jin, Hyerim; Suh, Soo-Kyung; Oh, Yu-Kyoung

    2017-01-01

    Gene-editing technology is an emerging therapeutic modality for manipulating the eukaryotic genome by using target-sequence-specific engineered nucleases. Because of the exceptional advantages that gene-editing technology offers in facilitating the accurate correction of sequences in a genome, gene editing-based therapy is being aggressively developed as a next-generation therapeutic approach to treat a wide range of diseases. However, strategies for precise engineering and delivery of gene-editing nucleases, including zinc finger nucleases, transcription activator-like effector nuclease, and CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats-associated nuclease Cas9), present major obstacles to the development of gene-editing therapies, as with other gene-targeting therapeutics. Currently, viral and non-viral vectors are being studied for the delivery of these nucleases into cells in the form of DNA, mRNA, or proteins. Clinical trials are already ongoing, and in vivo studies are actively investigating the applicability of CRISPR/Cas9 techniques. However, the concept of correcting the genome poses major concerns from a regulatory perspective, especially in terms of safety. This review addresses current research trends and delivery strategies for gene editing-based therapeutics in non-clinical and clinical settings and considers the associated regulatory issues. PMID:28392568

  13. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2014-07-01

    One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of the SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two datasets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, which underwent careful identification of different environments and has deforestation features corrected by a new method of increasing pixel values of the DEM; and (2) a set of eighteen hydrological-topographic descriptors based on the corrected SRTM DEM. The hydrological-topographic description was generated by the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (a.s.l.) by the elevation of the nearest hydrologically connected drainage. The validation of the HAND dataset was done by in situ hydrological description of 110 km of walking trails also available in this dataset. The new SRTM DEM expands the applicability of SRTM data for landscape modelling, and the datasets of hydrological features based on topographic modelling are undoubtedly appropriate for ecological modelling and an important contribution to environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the polygons selected for deforestation correction are available at http://ppbio.inpa.gov.br/knb/metacat/naman.317.3/ppbio; the set of hydrological-topographic descriptors is available at

  Off-target model based OPC

    NASA Astrophysics Data System (ADS)

    Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III

    2005-11-01

    Model-based optical proximity correction (OPC) has become an indispensable tool for achieving wafer-pattern-to-design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models (models that represent non-nominal process states, such as those that occur with a dose or focus variation) to understand and manipulate the final pattern correction into a more process-robust configuration. The study first examines and validates the procedure for generating an off-target model, then examines the quality of the off-target model. Once the off-target model is proven, it is used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary results show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.
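The core idea of correcting against off-target models can be caricatured as a min-max choice: instead of zeroing the edge placement error (EPE) under the nominal model only, pick the correction that minimizes the worst-case EPE across nominal and off-target models. The linear "printing responses" below are entirely made-up toy functions, standing in for calibrated optical/resist model evaluations.

```python
def robust_edge_bias(models, biases):
    """Pick the mask edge bias minimizing the worst-case edge placement
    error over a set of process models (nominal plus off-target).
    `models` are toy callables bias -> printed EPE (nm)."""
    return min(biases, key=lambda b: max(abs(m(b)) for m in models))

# toy linear printing responses (hypothetical numbers, in nm)
nominal  = lambda b: 0.8 * b - 2.0   # nominal dose/focus
defocus  = lambda b: 1.0 * b - 8.0   # off-target: defocused
overdose = lambda b: 1.0 * b + 2.0   # off-target: high dose
biases = [i * 0.5 for i in range(-10, 21)]

b_nom = robust_edge_bias([nominal], biases)                     # nominal-only OPC
b_rob = robust_edge_bias([nominal, defocus, overdose], biases)  # process-robust OPC
```

The robust solution deliberately accepts a small nominal EPE in exchange for balancing the defocus and overdose extremes, which is the trade-off the study explores.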

  14. Possibility of Exosome-Based Therapeutics and Challenges in Production of Exosomes Eligible for Therapeutic Application.

    PubMed

    Yamashita, Takuma; Takahashi, Yuki; Takakura, Yoshinobu

    2018-01-01

    Exosomes are cell-derived vesicles with a diameter of 30-120 nm. Exosomes contain endogenous proteins and nucleic acids; delivery of these molecules to exosome-recipient cells causes biological effects. Exosomes derived from some types of cells, such as mesenchymal stem cells and dendritic cells, have therapeutic potential and may be biocompatible and efficient agents against various disorders such as organ injury. However, there are many challenges in the development of exosome-based therapeutics. In particular, producing exosomal formulations is the major barrier to therapeutic application because of their heterogeneity and low productivity. Development and optimization of production methods, including methods for isolation and storage of exosome formulations, are required for realizing exosome-based therapeutics. In addition, improvement of the therapeutic potential and delivery efficiency of exosomes is important for their therapeutic application. In this review, we summarize current knowledge about the therapeutic application of exosomes and discuss some challenges in their successful use.

  15. A model-based correction for outcome reporting bias in meta-analysis.

    PubMed

    Copas, John; Dwan, Kerry; Kirkham, Jamie; Williamson, Paula

    2014-04-01

    It is often suspected (or known) that outcomes published in medical trials are selectively reported. A systematic review for a particular outcome of interest can only include studies where that outcome was reported and so may omit, for example, a study that has considered several outcome measures but only reports those giving significant results. Using the methodology of the Outcome Reporting Bias (ORB) in Trials study (Kirkham and others, 2010, The impact of outcome reporting bias in randomised controlled trials on a cohort of systematic reviews, British Medical Journal 340, c365), we suggest a likelihood-based model for estimating the effect of ORB on confidence intervals and p-values in meta-analysis. Correcting for bias has the effect of moving estimated treatment effects toward the null, and hence gives more cautious assessments of significance. The bias can be very substantial, sometimes sufficient to completely overturn previous claims of significance. We re-analyze two contrasting examples, and derive a simple fixed effects approximation that can be used to give an initial estimate of the effect of ORB in practice.
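The direction of the correction (toward the null) can be illustrated with a crude fixed-effect sensitivity check. This is not the paper's likelihood model; it simply re-pools after adding hypothetical unreported null-result studies, to show how the pooled estimate shrinks.

```python
import numpy as np

def fixed_effect_meta(effects, variances):
    """Inverse-variance fixed-effect pooled estimate and its variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    est = np.sum(w * np.asarray(effects, dtype=float)) / np.sum(w)
    return est, 1.0 / np.sum(w)

def orb_sensitivity(effects, variances, n_missing):
    """Crude sensitivity analysis (illustrative, not the paper's model):
    re-pool after adding `n_missing` hypothetical unreported studies with
    zero effect and the average observed variance, to gauge how far the
    estimate could move toward the null."""
    avg_var = float(np.mean(variances))
    e = list(effects) + [0.0] * n_missing
    v = list(variances) + [avg_var] * n_missing
    return fixed_effect_meta(e, v)
```

Even this naive check shows how quickly apparent significance erodes when plausibly suppressed null outcomes are put back into the pool.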

  16. Mapping hydrological environments in central Amazonia: ground validation and surface model based on SRTM DEM data corrected for deforestation

    NASA Astrophysics Data System (ADS)

    Moulatlet, G. M.; Rennó, C. D.; Costa, F. R. C.; Emilio, T.; Schietti, J.

    2015-03-01

    One of the most important freely available digital elevation models (DEMs) for Amazonia is the one obtained by the Shuttle Radar Topography Mission (SRTM). However, since SRTM tends to represent the vegetation surface instead of the ground surface, the broad use of the SRTM DEM as a framework for terrain description in Amazonia is hampered by the presence of deforested areas. We present here two data sets: (1) a deforestation-corrected SRTM DEM for the interfluve between the Purus and Madeira rivers, in central Amazonia, which underwent careful identification of different environments and has deforestation features corrected by a new method of increasing pixel values of the DEM (Rennó, 2009); and (2) a set of 18 hydrological-topographic descriptors based on the corrected SRTM DEM. Deforestation features are related with the opening of an 800 km road in the central part of the interfluve and occupancy of its vicinity. We used topographic profiles from the pristine forest to the deforested feature to evaluate the recovery of the original canopy coverage by minimizing canopy height variation (corrections ranged from 1 to 38 m). The hydrological-topographic description was obtained by the Height Above the Nearest Drainage (HAND) algorithm, which normalizes the terrain elevation (above sea level) by the elevation of the nearest hydrologically connected drainage. The validation of the HAND data set was done by in situ hydrological description of 110 km of walking trails also available in this data set. The new SRTM DEM expands the applicability of SRTM data for landscape modelling; the data sets of hydrological features based on topographic modelling are undoubtedly appropriate for ecological modelling and an important contribution to environmental mapping of Amazonia. The deforestation-corrected SRTM DEM is available at http://ppbio.inpa.gov.br/knb/metacat/naman.318.3/ppbio; the
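The HAND normalization described in this record can be sketched on a toy grid: follow the steepest-descent path from each cell to a drainage cell, and subtract that drainage cell's elevation. This is deliberately simplified; real implementations work on pit-filled DEMs with proper D8 flow routing.

```python
import numpy as np

def hand(elev, drainage):
    """Height Above the Nearest Drainage (simplified sketch).
    Walk the steepest-descent path from every cell until a drainage cell is
    reached; HAND is the cell's elevation minus the elevation of that
    hydrologically connected drainage cell. Flats and pits would need
    preprocessing in a real DEM; this toy assumes descent always progresses."""
    rows, cols = elev.shape
    nbrs = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]
    out = np.full(elev.shape, np.nan)
    for r in range(rows):
        for c in range(cols):
            i, j = r, c
            for _ in range(rows * cols):          # bounded walk
                if drainage[i, j]:
                    out[r, c] = elev[r, c] - elev[i, j]
                    break
                steps = [(i + di, j + dj) for di, dj in nbrs
                         if 0 <= i + di < rows and 0 <= j + dj < cols]
                i, j = min(steps, key=lambda p: elev[p])   # steepest descent
    return out
```

Normalizing by the nearest connected drainage (rather than sea level) is what makes HAND a hydrologically meaningful descriptor: cells with equal HAND have similar relative water-table positions regardless of absolute altitude.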

  17. Effect of an Ergonomics-Based Educational Intervention Based on Transtheoretical Model in Adopting Correct Body Posture Among Operating Room Nurses

    PubMed Central

    Moazzami, Zeinab; Dehdari, Tahere; Taghdisi, Mohammad Hosein; Soltanian, Alireza

    2016-01-01

    Background: One of the preventive strategies for chronic low back pain among operating room nurses is instructing proper body mechanics and postural behavior, for which the use of the Transtheoretical Model (TTM) has been recommended. Methods: Eighty two nurses who were in the contemplation and preparation stages for adopting correct body posture were randomly selected (control group = 40, intervention group = 42). TTM variables and body posture were measured at baseline and again after 1 and 6 months after the intervention. A four-week ergonomics educational intervention based on TTM variables was designed and conducted for the nurses in the intervention group. Results: Following the intervention, a higher proportion of nurses in the intervention group moved into the action stage (p < 0.05). Mean scores of self-efficacy, pros, experiential processes and correct body posture were also significantly higher in the intervention group (p < 0.05). No significant differences were found in the cons and behavioral processes, except for self-liberation, between the two groups (p > 0.05) after the intervention. Conclusions: The TTM provides a suitable framework for developing stage-based ergonomics interventions for postural behavior. PMID:26925897

  18. Atmospheric correction for remote sensing image based on multi-spectral information

    NASA Astrophysics Data System (ADS)

    Wang, Yu; He, Hongyan; Tan, Wei; Qi, Wenwen

    2018-03-01

    The light collected by remote sensors in space must pass through the Earth's atmosphere. All satellite images are affected at some level by lightwave scattering and absorption from aerosols, water vapor and particulates in the atmosphere. To generate high-quality scientific data, atmospheric correction is required to remove atmospheric effects and to convert digital number (DN) values to surface reflectance (SR). Every optical satellite in orbit observes the Earth through the same atmosphere, but each satellite image is affected differently because atmospheric conditions are constantly changing. A detailed physics-based radiative transfer model such as 6SV requires substantial ancillary information about the atmospheric conditions at acquisition time. This paper investigates the simultaneous retrieval of atmospheric radiation parameters from the multi-spectral information itself, in order to improve estimates of surface reflectance through physics-based atmospheric correction. Ancillary information on aerosol optical depth (AOD) and total water vapor (TWV), derived from the multi-spectral information based on specific spectral properties, was used for the 6SV model. The experiments were carried out on images from Sentinel-2, which carries a Multispectral Instrument (MSI) recording in 13 spectral bands covering a wide range of wavelengths from 440 up to 2200 nm. The results suggest that per-pixel atmospheric correction through the 6SV model, integrating AOD and TWV derived from multi-spectral information, is better suited for accurate analysis of satellite images and quantitative remote sensing applications.
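The DN-to-SR chain can be sketched with the standard form of 6S/6SV output correction coefficients. The calibration gain/offset and the xa, xb, xc values below are purely illustrative placeholders, not real retrievals; in the paper's workflow xa, xb, xc would come from running 6SV with the AOD and TWV derived from the imagery.

```python
def dn_to_radiance(dn, gain, offset):
    """Convert digital numbers to at-sensor radiance (sensor calibration)."""
    return gain * dn + offset

def radiance_to_sr(radiance, xa, xb, xc):
    """Apply 6S/6SV-style atmospheric-correction coefficients to at-sensor
    radiance:  y = xa*L - xb ;  SR = y / (1 + xc*y).
    xa, xb, xc depend on AOD, water vapor, gases and geometry (assumed
    precomputed here; the values used in testing are illustrative only)."""
    y = xa * radiance - xb
    return y / (1.0 + xc * y)

L_toa = dn_to_radiance(1200, gain=0.05, offset=1.0)
sr = radiance_to_sr(L_toa, xa=0.003, xb=0.10, xc=0.15)
```

Because xa, xb, xc vary with the atmospheric state, deriving AOD and TWV per scene (or per pixel) rather than assuming climatological values is precisely where the paper claims its improvement.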

  19. Cerebellarlike corrective model inference engine for manipulation tasks.

    PubMed

    Luque, Niceto Rafael; Garrido, Jesús Alberto; Carrillo, Richard Rafael; Coenen, Olivier J-M D; Ros, Eduardo

    2011-10-01

    This paper presents how a simple cerebellumlike architecture can infer corrective models in the framework of a control task when manipulating objects that significantly affect the dynamics model of the system. The main motivation of this paper is to evaluate a simplified bio-mimetic approach in the framework of a manipulation task. More concretely, the paper focuses on how the model inference process takes place within a feedforward control loop based on the cerebellar structure and on how these internal models are built up by means of biologically plausible synaptic adaptation mechanisms. This kind of investigation may provide clues on how biology achieves accurate control of non-stiff-joint robots with low-power actuators, which involves controlling systems with high inertial components. This paper studies how a basic temporal-correlation kernel including long-term depression (LTD) and a constant long-term potentiation (LTP) at parallel fiber-Purkinje cell synapses can effectively infer corrective models. We evaluate how this spike-timing-dependent plasticity correlates sensorimotor activity arriving through the parallel fibers with teaching signals (dependent on error estimates) arriving through the climbing fibers from the inferior olive. This paper addresses the study of how these LTD and LTP components need to be well balanced with each other to achieve accurate learning. This is of interest for evaluating the relevant role of homeostatic mechanisms in biological systems where adaptation occurs in a distributed manner. Furthermore, we illustrate how the temporal-correlation kernel can also work in the presence of transmission delays in sensorimotor pathways. We use a cerebellumlike spiking neural network which stores the corrective models as well-structured weight patterns distributed among the parallel fiber-to-Purkinje cell connections.
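The balance of constant LTP against correlation-kernel LTD can be sketched as a toy weight-update rule. This is only a structural illustration of the mechanism described above (constant potentiation per parallel-fiber spike, exponential-kernel depression when a climbing-fiber error spike follows); all constants are arbitrary.

```python
import numpy as np

def update_pf_weights(w, pf_spike_times, cf_spike_times, ltp=0.002,
                      ltd_peak=0.01, window=0.1):
    """Toy parallel-fiber -> Purkinje-cell plasticity rule (illustrative).
    Each PF spike earns a constant LTP increment; each PF spike that
    precedes a climbing-fiber (error) spike within `window` seconds is
    depressed by an exponential temporal-correlation kernel."""
    w = np.array(w, dtype=float)
    for i, spikes in enumerate(pf_spike_times):      # one spike list per synapse
        for tp in spikes:
            w[i] += ltp                               # constant LTP
            for tc in cf_spike_times:
                dt = tc - tp
                if 0.0 <= dt <= window:
                    w[i] -= ltd_peak * np.exp(-dt / window)  # kernel LTD
    return np.clip(w, 0.0, None)                      # weights stay non-negative
```

Synapses whose activity reliably precedes error signals are weakened, while uncorrelated synapses drift upward under the constant LTP, which is the imbalance-driven learning the paper analyzes.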

  1. Antibody-Based Preventive and Therapeutic Strategies Against HIV.

    PubMed

    Fabra-Garcia, Amanda; Beltran, Carolina; Sanchez-Merino, Victor; Yuste, Eloisa

    2016-01-01

    Over the years, numerous studies have been carried out demonstrating the role of antibodies in HIV control, leading to the development of antibody-based therapeutic and prophylactic strategies. The objective of this review is to provide updated information on the role of antibodies in the prevention and control of HIV infection and the strategies against HIV that have been designed based on this information. Passive transfer of anti-HIV antibodies in animal models has proven the efficacy of certain antibodies in the prevention and treatment of infection. The ability of antibodies to control the virus was first attributed to their neutralizing capacity. However, we now know that there are other Fc-mediated antibody activities associated with virus protection. When it comes to better understanding protection against HIV, we ought to pay particular attention to mucosal immune responses. The evidence accumulated so far indicates that an effective vaccine against HIV should generate both mucosal IgAs and systemic IgGs. Due to the problematic induction of protective anti-HIV antibodies, several groups have developed alternative approaches based on antibody delivery via gene therapy vectors. Experiments in animal models with these vectors have shown impressive protection levels, and this strategy is now in clinical trials. Taking into account all the information included in this review, it seems evident that anti-HIV-1 antibodies play an important role in virus control and prevention. This review aims to give an overview of the strategies used and the advances in antibody-based preventive and therapeutic strategies against HIV-1.

  2. A Combined SRTM Digital Elevation Model for Zanjan State of Iran Based on the Corrective Surface Idea

    NASA Astrophysics Data System (ADS)

    Kiamehr, Ramin

    2016-04-01

    A one-arc-second, high-resolution version of the SRTM model was recently published for Iran in the US Geological Survey database. Digital elevation models (DEMs) are widely used by geoscientists across disciplines and applications. A DEM is essential input in the geoid computation procedure, e.g., to determine the topographic, downward continuation (DWC) and atmospheric corrections. It can also be used in road location and design in civil engineering and in hydrological analysis. However, a DEM is only a model of the elevation surface and is subject to errors, the most important of which can come from a bias in the height datum. Moreover, the accuracy of a DEM is usually published in a global sense, and it is important to estimate its accuracy in the area of interest before using it. One of the best ways to obtain a reasonable indication of the accuracy of a DEM is to compare its heights against precise national GPS/levelling data, by determining the root-mean-square (RMS) error of the fit between the DEM and levelling heights. The errors in the DEM can be approximated by different kinds of functions in order to fit the DEM to a set of GPS/levelling data using least-squares adjustment. In the current study, several models, ranging from a simple linear regression to a seven-parameter similarity transformation, are used in the fitting procedure. The seven-parameter model gives the best fit, with the minimum standard deviation, for all selected DEMs in the study area. Based on 35 precise GPS/levelling points, we obtain an RMS of 5.5 m for the seven-parameter fit of the SRTM DEM. A corrective surface model is generated from the transformation parameters and applied to the original SRTM model. The fit of the combined model is then estimated again with independent GPS/levelling data. The result shows a great improvement in the absolute accuracy of the model, with a standard deviation of 3.4 m.
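The corrective-surface idea can be sketched with the simplest member of the model family mentioned above: a least-squares planar fit to the DEM-minus-GPS/levelling residuals (the paper's best model is the seven-parameter similarity transformation; the data below are synthetic).

```python
import numpy as np

def fit_corrective_surface(lat, lon, dem_h, lev_h):
    """Fit a planar corrective surface a + b*lat + c*lon to the residuals
    between DEM heights and GPS/levelling heights by least squares."""
    A = np.column_stack([np.ones_like(lat), lat, lon])
    coef, *_ = np.linalg.lstsq(A, dem_h - lev_h, rcond=None)
    return coef

def correct_dem(lat, lon, dem_h, coef):
    """Subtract the modelled systematic error from the DEM heights."""
    A = np.column_stack([np.ones_like(lat), lat, lon])
    return dem_h - A @ coef

# synthetic experiment: 35 control points with a planar bias plus 1 m noise
rng = np.random.default_rng(2)
lat = rng.uniform(36.0, 37.0, 35); lon = rng.uniform(48.0, 49.0, 35)
true_h = rng.uniform(1500.0, 2200.0, 35)
bias = 4.0 + 2.0 * (lat - 36.0) - 1.5 * (lon - 48.0)
dem_h = true_h + bias + rng.normal(0.0, 1.0, 35)

coef = fit_corrective_surface(lat, lon, dem_h, true_h)
rms_before = np.sqrt(np.mean((dem_h - true_h) ** 2))
rms_after = np.sqrt(np.mean((correct_dem(lat, lon, dem_h, coef) - true_h) ** 2))
```

As in the study, removing the fitted systematic surface leaves only the unmodelled noise, so the RMS against the control points drops substantially.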

  3. Prioritizing therapeutic targets using patient-derived xenograft models

    PubMed Central

    Lodhia, K.A; Hadley, A; Haluska, P; Scott, C.L

    2015-01-01

    Effective systemic treatment of cancer relies on the delivery of agents with optimal therapeutic potential. The molecular age of medicine has provided genomic tools that can identify a large number of potential therapeutic targets in individual patients, heralding the promise of personalized treatment. However, determining which potential targets actually drive tumor growth and should be prioritized for therapy is challenging. Indeed, reliable molecular matches of target and therapeutic agent have been stringently validated in the clinic for only a small number of targets. Patient-derived xenografts (PDX) are tumor models developed in immunocompromised mice using tumor procured directly from the patient. As patient surrogates, PDX models represent a powerful tool for addressing individualized therapy. Challenges include humanizing the immune system of PDX models and ensuring high quality molecular annotation, in order to maximise insights for the clinic. Importantly, PDX can be sampled repeatedly and in parallel, to reveal clonal evolution, which may predict mechanisms of drug resistance and inform therapeutic strategy design. PMID:25783201

  4. A compact quantum correction model for symmetric double gate metal-oxide-semiconductor field-effect transistor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, Edward Namkyu; Shin, Yong Hyeon; Yun, Ilgu, E-mail: iyun@yonsei.ac.kr

    2014-11-07

    A compact quantum correction model for a symmetric double gate (DG) metal-oxide-semiconductor field-effect transistor (MOSFET) is investigated. The compact quantum correction model is built from two concepts: the threshold voltage shift (ΔV_TH^QM) and gate capacitance (C_g) degradation. First, ΔV_TH^QM induced by quantum mechanical (QM) effects is modeled. The C_g degradation is then modeled by introducing the inversion layer centroid. With ΔV_TH^QM and the C_g degradation, the QM effects are implemented in a previously reported classical model, and a comparison between the proposed quantum correction model and numerical simulation results is presented. Based on the results, the proposed quantum correction model can be applied in the compact modeling of DG MOSFETs.
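The structure of such a correction (shift the threshold voltage, scale the effective gate capacitance) can be sketched on a toy square-law current model. This is only an illustration of the two-correction form; the functional shapes and numbers are arbitrary, not the paper's calibrated expressions.

```python
def classical_id(vg, vth, k):
    """Toy square-law saturation drain current (classical model)."""
    vov = vg - vth
    return k * vov * vov if vov > 0 else 0.0

def quantum_corrected_id(vg, vth, k, dvth_qm, cg_factor):
    """Apply the two corrections the record describes, in simplified form:
    shift the threshold voltage by the QM-induced dVth, and scale the
    current by a gate-capacitance degradation factor (0 < cg_factor <= 1)
    reflecting the inversion-layer centroid moving off the interface.
    Values below are illustrative, not calibrated to any device."""
    return cg_factor * classical_id(vg, vth + dvth_qm, k)

i_cl = classical_id(1.0, 0.3, 1e-3)                       # classical
i_qm = quantum_corrected_id(1.0, 0.3, 1e-3, 0.05, 0.9)    # quantum-corrected
```

Both corrections reduce the predicted current at a given gate bias, which is the qualitative signature of QM confinement that the compact model is meant to capture.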

  5. Prospects for nucleic acid-based therapeutics against hepatitis C virus.

    PubMed

    Lee, Chang Ho; Kim, Ji Hyun; Lee, Seong-Wook

    2013-12-21

    In this review, we discuss recent advances in nucleic acid-based therapeutic technologies that target hepatitis C virus (HCV) infection. Because the HCV genome is present exclusively in RNA form during replication, various nucleic acid-based therapeutic approaches targeting the HCV genome, such as ribozymes, aptamers, siRNAs, and antisense oligonucleotides, have been suggested as potential tools against HCV. Nucleic acids are potentially immunogenic and typically require a delivery tool to be utilized as therapeutics. These limitations have hampered the clinical development of nucleic acid-based therapeutics. Despite these limitations, however, nucleic acid-based therapeutics have clinical value due to their great specificity, easy large-scale chemical synthesis, and pharmaceutical flexibility. Moreover, nucleic acid therapeutics are expected to broaden the range of targetable molecules essential for the HCV replication cycle, and therefore they may prove more effective than existing therapeutics, such as interferon-α and ribavirin combination therapy. This review focuses on the current status and future prospects of ribozymes, aptamers, siRNAs, and antisense oligonucleotides as therapeutic reagents against HCV.

  6. Bias-correction of CORDEX-MENA projections using the Distribution Based Scaling method

    NASA Astrophysics Data System (ADS)

    Bosshard, Thomas; Yang, Wei; Sjökvist, Elin; Arheimer, Berit; Graham, L. Phil

    2014-05-01

    Within the Regional Initiative for the Assessment of the Impact of Climate Change on Water Resources and Socio-Economic Vulnerability in the Arab Region (RICCAR) led by UN ESCWA, CORDEX RCM projections for the Middle East and North Africa (MENA) domain are used to drive hydrological impact models. Bias-correction of the newly available CORDEX-MENA projections is a central part of this project. In this study, the distribution based scaling (DBS) method has been applied to 6 regional climate model projections driven by 2 RCP emission scenarios. The DBS method uses a quantile mapping approach and features a conditional temperature correction dependent on the wet/dry state in the climate model data. The CORDEX-MENA domain is particularly challenging for bias-correction as it spans very diverse climates with pronounced dry and wet seasons. Results show that the regional climate models simulate temperatures that are too low and often have a displaced rainfall band compared to WATCH ERA-Interim forcing data in the reference period 1979-2008. DBS is able to correct the temperature biases as well as some aspects of the precipitation biases. Special focus is given to the analysis of the influence of the dry-frequency bias (i.e. climate models simulating too few rain days) on the bias-corrected projections and on the modification of the climate change signal by the DBS method.
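
The quantile-mapping idea at the core of DBS can be sketched in a few lines. This is a generic empirical quantile mapping for illustration only (DBS itself fits parametric distributions and conditions the temperature correction on the wet/dry state); the function name and Gaussian example data are invented:

```python
import numpy as np

def quantile_map(model_ref, obs_ref, model_sim):
    """Map each simulated value to the observed value at the same quantile
    of the reference-period model distribution (empirical quantile mapping)."""
    # Empirical quantile of each simulated value in the reference model CDF
    q = np.searchsorted(np.sort(model_ref), model_sim) / len(model_ref)
    # Read the observed distribution at those quantiles
    return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))

# Toy data: the model runs ~2 degrees too cold over the reference period
rng = np.random.default_rng(0)
obs = rng.normal(20.0, 3.0, 10_000)   # "observed" reference temperatures
mod = obs - 2.0                       # biased model output, same variability
corrected = quantile_map(mod, obs, mod)
print(round(float(corrected.mean() - obs.mean()), 3))  # residual bias, near 0
```

Because the mapping matches full distributions rather than just means, it also corrects variance and skewness biases, which is why quantile-based methods are favored for precipitation with its pronounced dry/wet seasonality.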

  7. BP artificial neural network based wave front correction for sensor-less free space optics communication

    NASA Astrophysics Data System (ADS)

    Li, Zhaokun; Zhao, Xiaohui

    2017-02-01

    The sensor-less adaptive optics (AO) is one of the most promising methods to compensate for strong wave front disturbance in free space optics communication (FSO). In this study, the back propagation (BP) artificial neural network is applied to the sensor-less AO system to design a distortion correction scheme. Compared with other model-based approaches, this method needs only one or a few online measurements to correct the wave front distortion, which enhances the real-time capability of the system and largely improves the Strehl ratio (SR). Necessary comparisons in numerical simulation with other model-based and model-free correction methods proposed in Refs. [6,8,9,10] are given to show the validity and advantage of the proposed method.

  8. Team-based Learning in Therapeutics Workshop Sessions

    PubMed Central

    Kelley, Katherine A.; Metzger, Anne H.; Bellebaum, Katherine L.; McAuley, James W.

    2009-01-01

    Objectives: To implement team-based learning in the workshop portion of a pathophysiology and therapeutics sequence of courses to promote integration of concepts across the pharmacy curriculum, provide a consistent problem-solving approach to patient care, and determine the impact on student perceptions of professionalism and teamwork. Design: Team-based learning was incorporated into the workshop portion of 3 of 6 pathophysiology and therapeutics courses. Assignments that promoted team-building and application of key concepts were created. Assessment: Readiness assurance tests were used to assess individual and team understanding of course materials. Students consistently scored 20% higher on team assessments compared with individual assessments. Mean professionalism and teamwork scores were significantly higher after implementation of team-based learning; however, this improvement was not considered educationally significant. Approximately 91% of students felt team-based learning improved understanding of course materials and 93% of students felt teamwork should continue in workshops. Conclusion: Team-based learning is an effective teaching method to ensure a consistent approach to problem-solving and curriculum integration in workshop sessions for a pathophysiology and therapeutics course sequence. PMID:19885069

  9. Team-based learning in therapeutics workshop sessions.

    PubMed

    Beatty, Stuart J; Kelley, Katherine A; Metzger, Anne H; Bellebaum, Katherine L; McAuley, James W

    2009-10-01

    To implement team-based learning in the workshop portion of a pathophysiology and therapeutics sequence of courses to promote integration of concepts across the pharmacy curriculum, provide a consistent problem-solving approach to patient care, and determine the impact on student perceptions of professionalism and teamwork. Team-based learning was incorporated into the workshop portion of 3 of 6 pathophysiology and therapeutics courses. Assignments that promoted team-building and application of key concepts were created. Readiness assurance tests were used to assess individual and team understanding of course materials. Students consistently scored 20% higher on team assessments compared with individual assessments. Mean professionalism and teamwork scores were significantly higher after implementation of team-based learning; however, this improvement was not considered educationally significant. Approximately 91% of students felt team-based learning improved understanding of course materials and 93% of students felt teamwork should continue in workshops. Team-based learning is an effective teaching method to ensure a consistent approach to problem-solving and curriculum integration in workshop sessions for a pathophysiology and therapeutics course sequence.

  10. Preclinical models used for immunogenicity prediction of therapeutic proteins.

    PubMed

    Brinks, Vera; Weinbuch, Daniel; Baker, Matthew; Dean, Yann; Stas, Philippe; Kostense, Stefan; Rup, Bonita; Jiskoot, Wim

    2013-07-01

    All therapeutic proteins are potentially immunogenic. Antibodies formed against these drugs can decrease efficacy, leading to drastically increased therapeutic costs and, in rare cases, to serious and sometimes life-threatening side effects. Many efforts are therefore undertaken to develop therapeutic proteins with minimal immunogenicity. For this, immunogenicity prediction of candidate drugs during early drug development is essential. Several in silico, in vitro and in vivo models are used to predict the immunogenicity of drug leads, to modify potentially immunogenic properties and to continue development of drug candidates with expected low immunogenicity. Despite the extensive use of these predictive models, their actual predictive value varies. Important reasons for this uncertainty are the limited knowledge of the immune mechanisms underlying immunogenicity of therapeutic proteins, the fact that different predictive models explore different components of the immune system, and the lack of an integrated clinical validation. In this review, we discuss the predictive models in use, summarize the aspects of immunogenicity that these models predict, and explore the merits and the limitations of each of the models.

  11. Antisense oligonucleotide-mediated correction of transcriptional dysregulation is correlated with behavioral benefits in the YAC128 mouse model of Huntington's disease.

    PubMed

    Stanek, Lisa M; Yang, Wendy; Angus, Stuart; Sardi, Pablo S; Hayden, Michael R; Hung, Gene H; Bennett, C Frank; Cheng, Seng H; Shihabuddin, Lamya S

    2013-01-01

    Huntington's disease (HD) is a neurological disorder caused by mutations in the huntingtin (HTT) gene, the product of which leads to selective and progressive neuronal cell death in the striatum and cortex. Transcriptional dysregulation has emerged as a core pathologic feature in the CNS of human and animal models of HD. It is still unclear whether perturbations in gene expression are a consequence of the disease or, importantly, contribute to the pathogenesis of HD. Here, we examined whether transcriptional dysregulation can be ameliorated with antisense oligonucleotides (ASOs) that reduce levels of mutant Htt and provide therapeutic benefit in the YAC128 mouse model of HD. Quantitative real-time PCR analysis was used to evaluate dysregulation of a subset of striatal genes in the YAC128 mouse model. Transcripts were then evaluated following ICV delivery of ASOs. Rota rod and Porsolt swim tests were used to evaluate phenotypic deficits in these mice following ASO treatment. Transcriptional dysregulation was detected in the YAC128 mouse model and appears to progress with age. ICV delivery of ASOs directed against mutant Htt resulted in a reduction in mutant Htt levels and an amelioration of behavioral deficits in the YAC128 mouse model. These improvements were correlated with improvements in the levels of several dysregulated striatal transcripts. The role of transcriptional dysregulation in the pathogenesis of Huntington's disease is not well understood; however, a wealth of evidence now strongly suggests that changes in transcriptional signatures are a prominent feature in the brains of both HD patients and animal models of the disease. Our study is the first to show that a therapeutic agent capable of improving an HD disease phenotype is concomitantly correlated with normalization of a subset of dysregulated striatal transcripts. Our data suggest that correction of these disease-altered transcripts may underlie, at least in part, the therapeutic efficacy

  12. HESS Opinions "Should we apply bias correction to global and regional climate model data?"

    NASA Astrophysics Data System (ADS)

    Ehret, U.; Zehe, E.; Wulfmeyer, V.; Warrach-Sagi, K.; Liebert, J.

    2012-04-01

    Despite considerable progress in recent years, output of both Global and Regional Circulation Models is still afflicted with biases to a degree that precludes its direct use, especially in climate change impact studies. This is well known, and to overcome this problem bias correction (BC), i.e. the correction of model output towards observations in a post-processing step for its subsequent application in climate change impact studies, has now become a standard procedure. In this paper we argue that bias correction, which has a considerable influence on the results of impact studies, is not a valid procedure in the way it is currently used: it impairs the advantages of Circulation Models, which are based on established physical laws, by altering spatiotemporal field consistency and relations among variables, and by violating conservation principles. Bias correction largely neglects feedback mechanisms, and it is unclear whether bias correction methods are time-invariant under climate change conditions. Applying bias correction increases agreement of Climate Model output with observations in hindcasts and hence narrows the uncertainty range of simulations and predictions without, however, providing a satisfactory physical justification. This is in most cases not transparent to the end user. We argue that this masks rather than reduces uncertainty, which may lead to avoidable forejudging of end users and decision makers. We present here a brief overview of state-of-the-art bias correction methods, discuss the related assumptions and implications, draw conclusions on the validity of bias correction and propose ways to cope with biased output of Circulation Models in the short term and how to reduce the bias in the long term. The most promising strategy for improved future Global and Regional Circulation Model simulations is the increase in model resolution to the convection-permitting scale in combination with ensemble predictions based on sophisticated approaches for

  13. Therapeutic correction of ApoER2 splicing in Alzheimer's disease mice using antisense oligonucleotides.

    PubMed

    Hinrich, Anthony J; Jodelka, Francine M; Chang, Jennifer L; Brutman, Daniella; Bruno, Angela M; Briggs, Clark A; James, Bryan D; Stutzmann, Grace E; Bennett, David A; Miller, Steven A; Rigo, Frank; Marr, Robert A; Hastings, Michelle L

    2016-04-01

    Apolipoprotein E receptor 2 (ApoER2) is an apolipoprotein E receptor involved in long-term potentiation, learning, and memory. Given its role in cognition and its association with the Alzheimer's disease (AD) risk gene, apoE, ApoER2 has been proposed to be involved in AD, though a role for the receptor in the disease is not clear. ApoER2 signaling requires amino acids encoded by alternatively spliced exon 19. Here, we report that the balance of ApoER2 exon 19 splicing is deregulated in postmortem brain tissue from AD patients and in a transgenic mouse model of AD. To test the role of deregulated ApoER2 splicing in AD, we designed an antisense oligonucleotide (ASO) that increases exon 19 splicing. Treatment of AD mice with a single dose of ASO corrected ApoER2 splicing for up to 6 months and improved synaptic function and learning and memory. These results reveal an association between ApoER2 isoform expression and AD, and provide preclinical evidence for the utility of ASOs as a therapeutic approach to mitigate Alzheimer's disease symptoms by improving ApoER2 exon 19 splicing. © 2016 The Authors. Published under the terms of the CC BY 4.0 license.

  14. Solar array model corrections from Mars Pathfinder lander data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewell, R.C.; Burger, D.R.

    1997-12-31

    The MESUR solar array power model initially assumed values for its input variables. After landing, early surface variables such as array tilt and azimuth, and early environmental variables such as array temperature, could be corrected. Correction of later environmental variables such as tau versus time, spectral shift, dust deposition, and UV darkening is dependent upon time, on-board science instruments, and the ability to separate the effects of the variables. Engineering estimates had to be made for additional shadow losses and Voc sensor temperature corrections. Some variations, such as tau versus time of day and spectral shift versus time of day, had not been expected. Additions needed to the model are the thermal mass of the lander petal and a correction between the Voc sensor and the temperature sensor. Conclusions are: the model works well; good battery predictions are difficult; inclusion of the Isc and Voc sensors was valuable; and the IMP and MAE science experiments greatly assisted the data analysis and model correction.

  15. Evaluation of Bias Correction Method for Satellite-Based Rainfall Data

    PubMed Central

    Bhatti, Haris Akram; Rientjes, Tom; Haile, Alemseged Tamiru; Habib, Emad; Verhoef, Wouter

    2016-01-01

    With the advances in remote sensing technology, satellite-based rainfall estimates are gaining attraction in the field of hydrology, particularly in rainfall-runoff modeling. Since the estimates are affected by errors, correction is required. In this study, we tested the high resolution National Oceanic and Atmospheric Administration’s (NOAA) Climate Prediction Centre (CPC) morphing technique (CMORPH) satellite rainfall product in the Gilgel Abbey catchment, Ethiopia. CMORPH data at 8 km-30 min resolution is aggregated to daily to match in-situ observations for the period 2003–2010. Study objectives are to assess bias of the satellite estimates, to identify the optimum window size for application of bias correction and to test the effectiveness of bias correction. Bias correction factors are calculated for moving window (MW) sizes and for sequential windows (SW’s) of 3, 5, 7, 9, …, 31 days with the aim to assess error distribution between the in-situ observations and CMORPH estimates. We tested forward, central and backward window (FW, CW and BW) schemes to assess the effect of time integration on accumulated rainfall. Accuracy of cumulative rainfall depth is assessed by Root Mean Squared Error (RMSE). To systematically correct all CMORPH estimates, station based bias factors are spatially interpolated to yield a bias factor map. Reliability of interpolation is assessed by cross validation. The uncorrected CMORPH rainfall images are multiplied by the interpolated bias map to result in bias corrected CMORPH estimates. Findings are evaluated by RMSE, correlation coefficient (r) and standard deviation (SD). Results showed existence of bias in the CMORPH rainfall. It is found that the 7 days SW approach performs best for bias correction of CMORPH rainfall. The outcome of this study showed the efficiency of our bias correction approach. PMID:27314363

  16. Evaluation of Bias Correction Method for Satellite-Based Rainfall Data.

    PubMed

    Bhatti, Haris Akram; Rientjes, Tom; Haile, Alemseged Tamiru; Habib, Emad; Verhoef, Wouter

    2016-06-15

    With the advances in remote sensing technology, satellite-based rainfall estimates are gaining attraction in the field of hydrology, particularly in rainfall-runoff modeling. Since the estimates are affected by errors, correction is required. In this study, we tested the high resolution National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Centre (CPC) morphing technique (CMORPH) satellite rainfall product in the Gilgel Abbey catchment, Ethiopia. CMORPH data at 8 km-30 min resolution is aggregated to daily to match in-situ observations for the period 2003-2010. Study objectives are to assess bias of the satellite estimates, to identify the optimum window size for application of bias correction and to test the effectiveness of bias correction. Bias correction factors are calculated for moving window (MW) sizes and for sequential windows (SW's) of 3, 5, 7, 9, …, 31 days with the aim to assess error distribution between the in-situ observations and CMORPH estimates. We tested forward, central and backward window (FW, CW and BW) schemes to assess the effect of time integration on accumulated rainfall. Accuracy of cumulative rainfall depth is assessed by Root Mean Squared Error (RMSE). To systematically correct all CMORPH estimates, station based bias factors are spatially interpolated to yield a bias factor map. Reliability of interpolation is assessed by cross validation. The uncorrected CMORPH rainfall images are multiplied by the interpolated bias map to result in bias corrected CMORPH estimates. Findings are evaluated by RMSE, correlation coefficient (r) and standard deviation (SD). Results showed existence of bias in the CMORPH rainfall. It is found that the 7 days SW approach performs best for bias correction of CMORPH rainfall. The outcome of this study showed the efficiency of our bias correction approach.
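
The multiplicative sequential-window (SW) correction described above — each window's factor is the ratio of gauge to satellite rainfall totals, applied to every day in that window — can be sketched as follows. The data are invented, and the study's full workflow additionally interpolates station factors to a bias-factor map:

```python
import numpy as np

def sw_bias_factors(gauge, sat, window=7, eps=0.1):
    """Multiplicative bias factors over non-overlapping sequential windows:
    factor = sum(gauge)/sum(satellite) per window, applied to each day in
    that window. `eps` guards against division by near-zero satellite totals."""
    n = len(gauge)
    factors = np.ones(n)
    for start in range(0, n, window):
        sl = slice(start, min(start + window, n))
        sat_sum = sat[sl].sum()
        if sat_sum > eps:
            factors[sl] = gauge[sl].sum() / sat_sum
    return factors

# Toy daily series (mm): the satellite systematically underestimates by 30%
gauge = np.array([0, 5, 12, 0, 3, 8, 1, 0, 0, 6, 9, 2, 0, 4], float)
sat = 0.7 * gauge
corrected = sat * sw_bias_factors(gauge, sat, window=7)
print(np.allclose(corrected, gauge))  # True
```

With a purely multiplicative bias the window totals are restored exactly; real CMORPH errors also include dry-frequency and timing mismatches, which is why the window length (7 days here) must be tuned as the study does.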

  17. How does bias correction of RCM precipitation affect modelled runoff?

    NASA Astrophysics Data System (ADS)

    Teng, J.; Potter, N. J.; Chiew, F. H. S.; Zhang, L.; Vaze, J.; Evans, J. P.

    2014-09-01

    Many studies bias correct daily precipitation from climate models to match the observed precipitation statistics, and the bias corrected data are then used for various modelling applications. This paper presents a review of recent methods used to bias correct precipitation from regional climate models (RCMs). The paper then assesses four bias correction methods applied to the weather research and forecasting (WRF) model simulated precipitation, and the follow-on impact on modelled runoff for eight catchments in southeast Australia. Overall, the best results are produced by either quantile mapping or a newly proposed two-state gamma distribution mapping method. However, the difference between the tested methods is small in the modelling experiments here (and as reported in the literature), mainly because of the substantial corrections required and inconsistent errors over time (non-stationarity). The errors remaining in bias corrected precipitation are typically amplified in modelled runoff. The tested methods cannot overcome the limitations of RCMs in simulating precipitation sequences, which affects runoff generation. Results further show that whereas bias correction does not seem to alter change signals in precipitation means, it can introduce additional uncertainty to change signals in high precipitation amounts and, consequently, in runoff. Future climate change impact studies need to take this into account when deciding whether to use raw or bias corrected RCM results. Nevertheless, RCMs will continue to improve and will become increasingly useful for hydrological applications as the bias in RCM simulations is reduced.

  18. CRISPR-Cas9 therapeutics in cancer: promising strategies and present challenges.

    PubMed

    Yi, Lang; Li, Jinming

    2016-12-01

    Cancer is characterized by multiple genetic and epigenetic alterations that drive malignant cell proliferation and confer chemoresistance. The ability to correct or ablate such mutations holds immense promise for combating cancer. Recently, because of its high efficiency and accuracy, the CRISPR-Cas9 genome editing technique has been widely used in cancer therapeutic explorations. Several studies have used CRISPR-Cas9 to directly target cancer cell genomic DNA in cellular and animal cancer models, showing therapeutic potential in expanding our anticancer protocols. Moreover, CRISPR-Cas9 can also be employed to fight oncogenic infections, explore anticancer drugs, and engineer immune cells and oncolytic viruses for cancer immunotherapeutic applications. Here, we summarize these preclinical CRISPR-Cas9-based therapeutic strategies against cancer, and discuss the challenges and improvements in translating therapeutic CRISPR-Cas9 into clinical use, which will facilitate better application of this technique in cancer research. Further, we propose potential directions for the CRISPR-Cas9 system in cancer therapy. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Nonuniformity correction for an infrared focal plane array based on diamond search block matching.

    PubMed

    Sheng-Hui, Rong; Hui-Xin, Zhou; Han-Lin, Qin; Rui, Lai; Kun, Qian

    2016-05-01

    In scene-based nonuniformity correction algorithms, artificial ghosting and image blurring severely degrade the correction quality. In this paper, an improved algorithm based on the diamond search block matching algorithm and an adaptive learning rate is proposed. First, accurate transform pairs between two adjacent frames are estimated by the diamond search block matching algorithm. Then, based on the error between the corresponding transform pairs, the gradient descent algorithm is applied to update the correction parameters. During gradient descent, the local standard deviation and a threshold are utilized to control the learning rate to avoid the accumulation of matching error. Finally, nonuniformity correction is realized by a linear model with the updated correction parameters. The performance of the proposed algorithm is thoroughly studied with four real infrared image sequences. Experimental results indicate that the proposed algorithm can reduce the nonuniformity with fewer ghosting artifacts in moving areas and can also overcome the problem of image blurring in static areas.
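
The diamond search block matching step — a large diamond search pattern (LDSP) iterated until the centre point wins, then one small-diamond (SDSP) refinement — can be sketched as below. This is a textbook diamond search on invented toy frames, not the paper's implementation:

```python
import numpy as np

def diamond_search(ref, cur, top, left, bsize=8, max_iter=32):
    """Diamond-search block matching: estimate the motion vector of the
    block at (top, left) of `cur` relative to `ref`."""
    LDSP = [(0, 0), (-2, 0), (2, 0), (0, -2), (0, 2),
            (-1, -1), (-1, 1), (1, -1), (1, 1)]   # large diamond pattern
    SDSP = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]  # small diamond pattern
    block = cur[top:top + bsize, left:left + bsize].astype(float)
    H, W = ref.shape

    def cost(y, x):  # sum of absolute differences, inf outside the frame
        if 0 <= y <= H - bsize and 0 <= x <= W - bsize:
            return np.abs(ref[y:y + bsize, x:x + bsize] - block).sum()
        return np.inf

    cy, cx = top, left
    for _ in range(max_iter):                       # coarse LDSP stage
        dy, dx = min(LDSP, key=lambda d: cost(cy + d[0], cx + d[1]))
        if (dy, dx) == (0, 0):                      # centre wins: refine
            break
        cy, cx = cy + dy, cx + dx
    dy, dx = min(SDSP, key=lambda d: cost(cy + d[0], cx + d[1]))
    return cy + dy - top, cx + dx - left            # motion vector (rows, cols)

# Toy frames: `cur` is `ref` shifted down-right by one pixel per axis,
# so the block content matches one pixel up-left in the reference frame.
rng = np.random.default_rng(0)
ref = rng.random((48, 48))
cur = np.roll(ref, shift=(1, 1), axis=(0, 1))
mv = diamond_search(ref, cur, top=16, left=16)
print(mv)  # → (-1, -1)
```

Each step only moves to the candidate with the lowest cost (the current position included), so the matching error is non-increasing; the resulting block pairs are the "transform pairs" fed to the gradient-descent parameter update.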

  20. Energy-based adaptive focusing of waves: application to noninvasive aberration correction of ultrasonic wavefields

    PubMed Central

    Herbert, Eric; Pernot, Mathieu; Montaldo, Gabriel; Fink, Mathias; Tanter, Mickael

    2009-01-01

    An aberration correction method based on the maximization of the wave intensity at the focus of an emitting array is presented. The potential of this new adaptive focusing technique is investigated for ultrasonic focusing in biological tissues. The acoustic intensity is maximized noninvasively through the direct measurement or indirect estimation of the beam energy at the focus for a series of spatially coded emissions. For ultrasonic waves, the acoustic energy at the desired focus can be indirectly estimated from the local displacements induced in tissues by the ultrasonic radiation force of the beam. Based on the measurement of these displacements, this method allows the precise estimation of the phase and amplitude aberrations and consequently the correction of aberrations along the beam travel path. The proof of concept is first performed experimentally using a large therapeutic array with strong electronic phase aberrations (up to 2π). Displacements induced by the ultrasonic radiation force at the desired focus are indirectly estimated using the time shift of backscattered echoes recorded on the array. The phase estimation is deduced accurately using a direct inversion algorithm which reduces the standard deviation of the phase distribution from σ = 1.89 before correction to σ = 0.53 following correction. The corrected beam focusing quality is verified using a needle hydrophone. The peak intensity obtained through the aberrator is found to be −7.69 dB below the reference intensity obtained without any aberration. Using the phase correction, a sharp focus is restored through the aberrator with a relative peak intensity of −0.89 dB. The technique is tested experimentally using a linear transmit/receive array through a real aberrating layer. The array is used to automatically correct its beam quality, as it both generates the radiation force with coded excitations and indirectly estimates the acoustic intensity at the focus with speckle tracking. This

  1. Comparison of Different Attitude Correction Models for ZY-3 Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Song, Wenping; Liu, Shijie; Tong, Xiaohua; Niu, Changling; Ye, Zhen; Zhang, Han; Jin, Yanmin

    2018-04-01

    ZY-3 satellite, launched in 2012, is the first civilian high resolution stereo mapping satellite of China. This paper analyzed the positioning errors of ZY-3 satellite imagery and compensated them to improve geo-positioning accuracy using different correction models, including attitude quaternion correction, attitude angle offset correction, and attitude angle linear correction. The experimental results revealed that systematic errors exist in the ZY-3 attitude observations and that the positioning accuracy can be improved after attitude correction with the aid of ground control. There is no significant difference between the results of the attitude quaternion correction method and the attitude angle correction method. However, the attitude angle offset correction model produced steadier improvement than the linear correction model when only limited ground control points are available for a single scene.
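
The offset and linear attitude-angle correction models compared above can be illustrated as two least-squares fits to a synthetic attitude-error series (all numbers invented). Note the linear model always fits the estimation data at least as well in-sample, so the finding that the offset model is steadier under limited ground control has to be judged on independent check points:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)                        # normalized scan time
err = 0.8 + 0.05 * t + rng.normal(0, 0.02, t.size)   # synthetic attitude-angle error

# Offset model: a single constant bias term
offset = err.mean()
res_offset = err - offset

# Linear model: bias plus drift, via ordinary least squares
A = np.vstack([np.ones_like(t), t]).T
(a, b), *_ = np.linalg.lstsq(A, err, rcond=None)
res_linear = err - (a + b * t)

print(res_offset.std() >= res_linear.std())  # True: linear is never worse in-sample
```

The extra drift parameter soaks up noise when few control points constrain it, which is one plausible reading of why the simpler offset model behaved more steadily per scene.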

  2. A simple modern correctness condition for a space-based high-performance multiprocessor

    NASA Technical Reports Server (NTRS)

    Probst, David K.; Li, Hon F.

    1992-01-01

    A number of U.S. national programs, including space-based detection of ballistic missile launches, envisage putting significant computing power into space. Given sufficient progress in low-power VLSI, multichip-module packaging and liquid-cooling technologies, we will see design of high-performance multiprocessors for individual satellites. In very high speed implementations, performance depends critically on tolerating large latencies in interprocessor communication; without latency tolerance, performance is limited by the vastly differing time scales in processor and data-memory modules, including interconnect times. The modern approach to tolerating remote-communication cost in scalable, shared-memory multiprocessors is to use a multithreaded architecture, and alter the semantics of shared memory slightly, at the price of forcing the programmer either to reason about program correctness in a relaxed consistency model or to agree to program in a constrained style. The literature on multiprocessor correctness conditions has become increasingly complex, and sometimes confusing, which may hinder its practical application. We propose a simple modern correctness condition for a high-performance, shared-memory multiprocessor; the correctness condition is based on a simple interface between the multiprocessor architecture and the parallel programming system.

  3. [Lateral chromatic aberrations correction for AOTF imaging spectrometer based on doublet prism].

    PubMed

    Zhao, Hui-Jie; Zhou, Peng-Wei; Zhang, Ying; Li, Chong-Chong

    2013-10-01

    A user-defined surface function method is proposed to model the acousto-optic interaction of an AOTF based on the wave-vector match principle. Assessment experiments show that this model can achieve accurate ray tracing of the AOTF diffracted beam. In addition, an AOTF imaging spectrometer presents large residual lateral color when traditional chromatic aberration correction methods are adopted. In order to reduce lateral chromatic aberrations, a method based on a doublet prism is proposed. The optical material and angle of the prism are optimized automatically using global optimization with the help of the user-defined AOTF surface. Simulation results show that the proposed method provides the AOTF imaging spectrometer with great convenience, reducing the lateral chromatic aberration to less than 0.0003 degrees, an improvement of one order of magnitude, with the spectral image shift effectively corrected.

  4. [Atmospheric correction of HJ-1 CCD data for water imagery based on dark object model].

    PubMed

    Zhou, Li-Guo; Ma, Wei-Chun; Gu, Wan-Hua; Huai, Hong-Yan

    2011-08-01

    The CCD multi-band data of HJ-1A has great potential for inland water quality monitoring, but accurate atmospheric correction is a prerequisite for its application. In this paper, a method based on dark pixels for retrieving water-leaving radiance is proposed. Besides Rayleigh scattering, aerosol scattering is important to atmospheric correction; inland lakes are generally case II waters, whose water-leaving radiance is not zero. Therefore, synchronous MODIS shortwave infrared data were used to obtain the aerosol parameters, and by virtue of the characteristic that aerosol scattering is relatively stable at 560 nm, the water-leaving radiance for each visible and near-infrared band was retrieved and normalized; the remotely sensed reflectance of the water was then computed. The results show that this atmospheric correction method, based on the imagery itself, is effective for the retrieval of water parameters from HJ-1A CCD data.

  5. Correction of β-thalassemia mutant by base editor in human embryos.

    PubMed

    Liang, Puping; Ding, Chenhui; Sun, Hongwei; Xie, Xiaowei; Xu, Yanwen; Zhang, Xiya; Sun, Ying; Xiong, Yuanyan; Ma, Wenbin; Liu, Yongxiang; Wang, Yali; Fang, Jianpei; Liu, Dan; Songyang, Zhou; Zhou, Canquan; Huang, Junjiu

    2017-11-01

    β-Thalassemia is a global health issue, caused by mutations in the HBB gene. Among these, the HBB -28 (A>G) mutation is one of the three most common in patients with β-thalassemia in China and Southeast Asia. Correcting this mutation in human embryos may prevent the disease being passed on to future generations and cure the anemia. Here we report the first study using the base editor (BE) system to correct a disease mutation in human embryos. First, we produced a 293T cell line with an exogenous HBB -28 (A>G) mutant fragment for gRNA and targeting efficiency evaluation. Then we collected primary skin fibroblast cells from a β-thalassemia patient with the HBB -28 (A>G) homozygous mutation. Data showed that the base editor could precisely correct the HBB -28 (A>G) mutation in the patient's primary cells. To model homozygous-mutation disease embryos, we constructed nuclear transfer embryos by fusing the lymphocyte or skin fibroblast cells with enucleated in vitro matured (IVM) oocytes. Notably, the gene correction efficiency was over 23.0% in these embryos with the base editor. Although these embryos were still mosaic, the percentage of repaired blastomeres was over 20.0%. In addition, we found that base editor variants with a narrowed deamination window could promote G-to-A conversion at the HBB -28 site precisely in human embryos. Collectively, this study demonstrates the feasibility of curing genetic disease in human somatic cells and embryos by the base editor system.

  6. Determination of the quenching correction factors for plastic scintillation detectors in therapeutic high-energy proton beams

    PubMed Central

    Wang, L L W; Perles, L A; Archambault, L; Sahoo, N; Mirkovic, D; Beddar, S

    2013-01-01

    Plastic scintillation detectors (PSDs) have many advantages over other detectors in small-field dosimetry due to their high spatial resolution, excellent water equivalence and instantaneous readout. However, in proton beams, PSDs undergo a quenching effect that significantly reduces the signal level when the detector is close to the Bragg peak, where the linear energy transfer (LET) of protons is very high. This study measures the quenching correction factor (QCF) for a PSD in clinical passive-scattering proton beams and investigates the feasibility of using PSDs for depth-dose measurements in proton beams. A polystyrene-based PSD (BCF-12, ϕ0.5 mm × 4 mm) was used to measure the depth-dose curves in a water phantom for monoenergetic unmodulated proton beams of nominal energies 100, 180 and 250 MeV. A Markus plane-parallel ion chamber was also used to obtain the dose distributions for the same proton beams. From these results, the QCF as a function of depth was derived for these proton beams. Next, the LET depth distributions for these beams were calculated with the MCNPX Monte Carlo code, based on experimentally validated nozzle models for these passive-scattering proton beams. The relationship between the QCF and the proton LET could then be expressed as an empirical formula. Finally, the empirical formula was applied to the PSD measurements to obtain corrected depth-dose curves, which were compared to the ion chamber measurements. A linear relationship between the QCF and LET, i.e., Birks' formula, was obtained for the proton beams studied, in agreement with the literature. The PSD measurements after quenching correction agree with the ion chamber measurements within 5%. PSDs are good dosimeters for proton beam measurements if the quenching effect is corrected appropriately. PMID:23128412

  7. Determination of the quenching correction factors for plastic scintillation detectors in therapeutic high-energy proton beams

    NASA Astrophysics Data System (ADS)

    Wang, L. L. W.; Perles, L. A.; Archambault, L.; Sahoo, N.; Mirkovic, D.; Beddar, S.

    2012-12-01

    Plastic scintillation detectors (PSDs) have many advantages over other detectors in small-field dosimetry due to their high spatial resolution, excellent water equivalence and instantaneous readout. However, in proton beams, PSDs undergo a quenching effect that significantly reduces the signal level when the detector is close to the Bragg peak, where the linear energy transfer (LET) of protons is very high. This study measures the quenching correction factor (QCF) for a PSD in clinical passive-scattering proton beams and investigates the feasibility of using PSDs for depth-dose measurements in proton beams. A polystyrene-based PSD (BCF-12, ϕ0.5 mm × 4 mm) was used to measure the depth-dose curves in a water phantom for monoenergetic unmodulated proton beams of nominal energies 100, 180 and 250 MeV. A Markus plane-parallel ion chamber was also used to obtain the dose distributions for the same proton beams. From these results, the QCF as a function of depth was derived for these proton beams. Next, the LET depth distributions for these beams were calculated with the MCNPX Monte Carlo code, based on experimentally validated nozzle models for these passive-scattering proton beams. The relationship between the QCF and the proton LET could then be expressed as an empirical formula. Finally, the empirical formula was applied to the PSD measurements to obtain corrected depth-dose curves, which were compared to the ion chamber measurements. A linear relationship between the QCF and LET, i.e., Birks' formula, was obtained for the proton beams studied, in agreement with the literature. The PSD measurements after quenching correction agree with ion chamber measurements within 5%. PSDs are good dosimeters for proton beam measurements if the quenching effect is corrected appropriately.
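
The linear QCF-LET relationship (Birks' formula) described in these records can be sketched as follows; the coefficient kB and the LET/signal values are hypothetical, chosen only to illustrate the shape of the correction.

```python
import numpy as np

# Sketch of a Birks-type quenching correction for a plastic scintillator in
# a proton beam: the measured light output underestimates dose near the Bragg
# peak, and the correction factor grows linearly with LET (values hypothetical).

def quenching_correction_factor(let, a=1.0, kb=0.02):
    """QCF(LET) = a + kB * LET, the linear (Birks) relationship."""
    return a + kb * let

def corrected_dose(scintillator_signal, let):
    """Multiply the quenched signal by the QCF to recover dose."""
    return scintillator_signal * quenching_correction_factor(let)

# Hypothetical depth profile: LET (keV/um) rises toward the Bragg peak.
let_depth  = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
psd_signal = np.array([1.00, 1.02, 1.05, 1.10, 1.15])  # quenched readings

dose = corrected_dose(psd_signal, let_depth)
```

The correction is largest where the LET is highest, i.e., near the Bragg peak, which is exactly where the raw scintillator signal is most suppressed.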

  8. A Zebrafish Heart Failure Model for Assessing Therapeutic Agents.

    PubMed

    Zhu, Xiao-Yu; Wu, Si-Qi; Guo, Sheng-Ya; Yang, Hua; Xia, Bo; Li, Ping; Li, Chun-Qi

    2018-03-20

    Heart failure is a leading cause of death, and the development of effective and safe therapeutic agents for heart failure has proven challenging. In this study, taking advantage of larval zebrafish, we developed a zebrafish heart failure model for drug screening and efficacy assessment. Zebrafish at 2 dpf (days postfertilization) were treated with verapamil at a concentration of 200 μM for 30 min, conditions determined to be optimal for model development. Tested drugs were administered into zebrafish either by direct soaking or by circulation microinjection. After treatment, zebrafish were randomly selected and subjected to visual observation and image acquisition, or videos were recorded under a Zebralab Blood Flow System. The therapeutic effects of drugs on zebrafish heart failure were quantified by calculating the efficiency of heart dilatation, venous congestion, cardiac output, and blood flow dynamics. All 8 human heart failure therapeutic drugs (LCZ696, digoxin, irbesartan, metoprolol, qiliqiangxin capsule, enalapril, shenmai injection, and hydrochlorothiazide) showed significant preventive and therapeutic effects on zebrafish heart failure (p < 0.05, p < 0.01, and p < 0.001). The larval zebrafish heart failure model developed and validated in this study could be used for in vivo heart failure studies and for rapid screening and efficacy assessment of preventive and therapeutic drugs.

  9. Magnetic Resonance-based Motion Correction for Quantitative PET in Simultaneous PET-MR Imaging.

    PubMed

    Rakvongthai, Yothin; El Fakhri, Georges

    2017-07-01

    Motion degrades the image quality and quantitation of PET images and is an obstacle to quantitative PET imaging. Simultaneous PET-MR offers a tool for correcting motion in PET images using anatomic information from MR imaging acquired concurrently. Motion correction can be performed by transforming a set of reconstructed PET images into the same frame, or by incorporating the transformation into the system model and reconstructing the motion-corrected image. Several phantom and patient studies have demonstrated that MR-based motion correction strategies hold great promise for quantitative PET imaging in simultaneous PET-MR. Copyright © 2017 Elsevier Inc. All rights reserved.
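
The first strategy mentioned in this record, transforming reconstructed frames into a common reference frame, can be sketched minimally; here the MR-derived motion is simplified to known integer translations, and all data are synthetic.

```python
import numpy as np

# Sketch of post-reconstruction motion correction: each PET frame is mapped
# back to a reference frame using MR-derived motion (here, pure integer
# translations for simplicity) and the aligned frames are averaged.

def correct_frames(frames, shifts):
    """Undo per-frame shifts (in pixels) and average the aligned frames."""
    aligned = [np.roll(f, -s, axis=0) for f, s in zip(frames, shifts)]
    return np.mean(aligned, axis=0)

# A hypothetical 1D "image" moved by known amounts in three frames.
reference = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0])
shifts = [0, 1, 2]  # MR-measured displacement of each frame
frames = [np.roll(reference, s, axis=0) for s in shifts]

recovered = correct_frames(frames, shifts)
```

In the alternative strategy, the same transformations would instead be folded into the system matrix during reconstruction rather than applied to already-reconstructed images.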

  10. Nano-based theranostics for chronic obstructive lung diseases: challenges and therapeutic potential.

    PubMed

    Vij, Neeraj

    2011-09-01

    The major challenges in the delivery and therapeutic efficacy of nano-delivery systems in chronic obstructive airway conditions are airway defense, severe inflammation and mucus hypersecretion. Chronic airway inflammation and mucus hypersecretion are hallmarks of chronic obstructive airway diseases, including asthma, COPD (chronic obstructive pulmonary disease) and CF (cystic fibrosis). Distinct etiologies drive inflammation and mucus hypersecretion in these diseases, which are further exacerbated by infection or components of cigarette smoke. Controlling chronic inflammation is at the root of treatments such as corticosteroids, antibiotics or other available drugs, which pose the challenge of sustained delivery of drugs to target cells or tissues. In spite of the wide application of nano-based drug delivery systems, very few have been tested to date. Targeted nanoparticle-mediated sustained drug delivery is required to control inflammatory cell chemotaxis, fibrosis, protease-mediated chronic emphysema and/or chronic lung obstruction in COPD. Moreover, targeted epithelial delivery is indispensable for correcting the underlying defects in CF, and targeted inflammatory cell delivery for controlling other chronic inflammatory lung diseases. We propose that the design and development of nano-based targeted theranostic vehicles with therapeutic, imaging and airway-defense-penetrating capability will be invaluable for treating chronic obstructive lung diseases. This paper discusses a novel nano-theranostic strategy that we are currently evaluating to treat the underlying cause of CF and COPD lung disease.

  11. Wavelet-based functional linear mixed models: an application to measurement error-corrected distributed lag models.

    PubMed

    Malloy, Elizabeth J; Morris, Jeffrey S; Adar, Sara D; Suh, Helen; Gold, Diane R; Coull, Brent A

    2010-07-01

    Frequently, exposure data are measured over time on a grid of discrete values that collectively define a functional observation. In many applications, researchers are interested in using these measurements as covariates to predict a scalar response in a regression setting, with interest focusing on the most biologically relevant time window of exposure. One example is in panel studies of the health effects of particulate matter (PM), where particle levels are measured over time. In such studies, there are many more values of the functional data than observations in the data set so that regularization of the corresponding functional regression coefficient is necessary for estimation. Additional issues in this setting are the possibility of exposure measurement error and the need to incorporate additional potential confounders, such as meteorological or co-pollutant measures, that themselves may have effects that vary over time. To accommodate all these features, we develop wavelet-based linear mixed distributed lag models that incorporate repeated measures of functional data as covariates into a linear mixed model. A Bayesian approach to model fitting uses wavelet shrinkage to regularize functional coefficients. We show that, as long as the exposure error induces fine-scale variability in the functional exposure profile and the distributed lag function representing the exposure effect varies smoothly in time, the model corrects for the exposure measurement error without further adjustment. Both these conditions are likely to hold in the environmental applications we consider. We examine properties of the method using simulations and apply the method to data from a study examining the association between PM, measured as hourly averages for 1-7 days, and markers of acute systemic inflammation. We use the method to fully control for the effects of confounding by other time-varying predictors, such as temperature and co-pollutants.
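
The wavelet shrinkage used here to regularize the functional coefficients can be illustrated with a minimal, self-contained Haar example; the authors' Bayesian machinery is far richer, and the signal, threshold and single-level transform below are all simplifying assumptions.

```python
import numpy as np

# Sketch of wavelet shrinkage, the regularization device applied to the
# functional (distributed lag) coefficients: transform to a Haar wavelet
# basis, soft-threshold the detail coefficients, and invert. One level of
# the transform only; the input length must be even.

def haar_forward(x):
    """One level of the orthonormal Haar transform: smooth and detail parts."""
    x = np.asarray(x, dtype=float)
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return s, d

def haar_inverse(s, d):
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def soft_threshold(c, lam):
    return np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)

def shrink(x, lam):
    s, d = haar_forward(x)
    return haar_inverse(s, soft_threshold(d, lam))

# A piecewise-smooth coefficient profile with fine-scale noise; shrinkage
# keeps the smooth part and suppresses the small detail coefficients.
signal = np.array([1.0, 1.1, 2.0, 2.1, 3.0, 2.9, 1.0, 1.1])
smoothed = shrink(signal, lam=0.2)
```

With the threshold set to zero the transform is perfectly invertible; with a positive threshold, small fine-scale fluctuations (here, the within-pair differences) are removed while the coarse structure survives.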

  12. Level 2 Therapeutic Model Site

    ERIC Educational Resources Information Center

    Spears, Brad; Sanchez, David; Bishop, Jane; Rogers, Sharon; DeJong, Judith A.

    2006-01-01

    L2, one of the original sites first funded under the Therapeutic Residential Model Initiative in 2001-2002, is operated as a peripheral dormitory. This dormitory cares for 185 boys and girls in grades 1-12 who attend local public schools. L2 presented an outstanding proposal which identified gaps in services and presented a reasonable budget to…

  13. Fc-Mediated Anomalous Biodistribution of Therapeutic Antibodies in Immunodeficient Mouse Models.

    PubMed

    Sharma, Sai Kiran; Chow, Andrew; Monette, Sebastien; Vivier, Delphine; Pourat, Jacob; Edwards, Kimberly J; Dilling, Thomas R; Abdel-Atti, Dalya; Zeglis, Brian M; Poirier, John T; Lewis, Jason S

    2018-04-01

    A critical benchmark in the development of antibody-based therapeutics is demonstration of efficacy in preclinical mouse models of human disease, many of which rely on immunodeficient mice. However, relatively little is known about how the biology of various immunodeficient strains impacts the in vivo fate of these drugs. Here we used immunoPET radiotracers prepared from humanized, chimeric, and murine mAbs against four therapeutic oncologic targets to interrogate their biodistribution in four different strains of immunodeficient mice bearing lung, prostate, and ovarian cancer xenografts. The immunodeficiency status of the mouse host as well as both the biological origin and glycosylation of the antibody contributed significantly to the anomalous biodistribution of therapeutic monoclonal antibodies in an Fc receptor-dependent manner. These findings may have important implications for the preclinical evaluation of Fc-containing therapeutics and highlight a clear need for biodistribution studies in the early stages of antibody drug development. Significance: Fc/FcγR-mediated immunobiology of the experimental host is a key determinant to preclinical in vivo tumor targeting and efficacy of therapeutic antibodies. Cancer Res; 78(7); 1820-32. ©2018 American Association for Cancer Research (AACR).

  14. Correcting Biases in a lower resolution global circulation model with data assimilation

    NASA Astrophysics Data System (ADS)

    Canter, Martin; Barth, Alexander

    2016-04-01

    With this work, we aim at developing a new method of bias correction using data assimilation, based on the stochastic forcing of a model. First, through a preliminary run, we estimate the bias of the model and its possible sources. Then, we establish a forcing term which is added directly inside the model's equations. We create an ensemble of runs and treat the forcing term as a control variable during the assimilation of observations. We then use this analysed forcing term to correct the bias of the model. Since the forcing is added inside the model, it acts as a source term, unlike external forcings such as wind. This procedure has been developed and successfully tested in a twin experiment on a Lorenz 95 model. It is currently being applied and tested on the NEMO LIM sea ice-ocean model, which is used in the PredAntar project. NEMO LIM is a global, low-resolution (2 degrees) coupled model (hydrodynamic model and sea ice model) with long time steps, allowing simulations over several decades. Due to its low resolution, the model is subject to bias in areas where strong currents are present. We aim at correcting this bias by using perturbed current fields from higher-resolution models and randomly generated perturbations. The random perturbations need to be constrained in order to respect the physical properties of the ocean and not create unwanted phenomena. To construct these random perturbations, we first create a random field with the Diva tool (Data-Interpolating Variational Analysis). Using a cost function, this tool penalizes abrupt variations in the field, while using a custom correlation length. It also decouples disconnected areas based on topography. Then, we filter the field to smooth it and remove small-scale variations. We use this field as a random stream function, and take its derivatives to get zonal and meridional velocity fields. We also constrain the stream function along the coasts in order not to have
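
The stream-function-to-velocity step described above can be sketched directly; differentiating a stream function guarantees a non-divergent velocity perturbation. The grid and field below are hypothetical stand-ins for the Diva-generated random field.

```python
import numpy as np

# Sketch of the perturbation construction: treat a smooth random field as a
# stream function psi and differentiate it to obtain non-divergent zonal and
# meridional velocity perturbations on a regular grid (grid is hypothetical).

def velocities_from_streamfunction(psi, dx, dy):
    """u = -dpsi/dy, v = dpsi/dx (finite differences on a regular grid)."""
    dpsi_dy, dpsi_dx = np.gradient(psi, dy, dx)  # gradients along axis 0, axis 1
    return -dpsi_dy, dpsi_dx

# Simple analytic check field psi = x * y, for which u = -x and v = y.
dx = dy = 1.0
y, x = np.mgrid[0:5, 0:6].astype(float)   # axis 0 is y, axis 1 is x
psi = x * y
u, v = velocities_from_streamfunction(psi, dx, dy)
```

Because the velocities come from a stream function, du/dx + dv/dy vanishes identically, which is the physical constraint the perturbations must respect.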

  15. Advanced corrections for InSAR using GPS and numerical weather models

    NASA Astrophysics Data System (ADS)

    Foster, J. H.; Cossu, F.; Amelung, F.; Businger, S.; Cherubini, T.

    2016-12-01

    The complex spatial and temporal changes in the atmospheric propagation delay of the radar signal remain the single biggest factor limiting the potential of Interferometric Synthetic Aperture Radar (InSAR) for hazard monitoring and mitigation. A new generation of InSAR systems is being built and launched, and optimizing the science and hazard applications of these systems requires advanced methodologies to mitigate tropospheric noise. We present preliminary results from an investigation into the application of GPS and numerical weather models for generating tropospheric correction fields. We use the Weather Research and Forecasting (WRF) model to generate a 900 m spatial resolution atmospheric model covering the Big Island of Hawaii and an even higher-resolution, 300 m grid over Mauna Loa and Kilauea volcanoes. By comparing a range of approaches, from the simplest, using reanalyses based on typically available meteorological observations, through to the "kitchen-sink" approach of assimilating all relevant data sets into our custom analyses, we examine the impact of the additional data sets on the atmospheric models and their effectiveness in correcting InSAR data. We focus particularly on the assimilation of information from the more than 60 GPS sites on the island. We ingest zenith tropospheric delay estimates from these sites directly into the WRF analyses, and also perform double-difference tomography using the phase residuals from the GPS processing to robustly incorporate information on atmospheric heterogeneity from the GPS data into the models. We assess our performance through comparisons of our atmospheric models with external observations not ingested into the model, and through the effectiveness of the derived phase screens in reducing InSAR variance. 
This work will produce best-practice recommendations for the use of weather models for InSAR correction, and inform efforts to design a global strategy for the NISAR mission, for both low-latency and definitive
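
The core conversion from a zenith tropospheric delay to an interferometric phase correction can be sketched as follows; the wavelength, incidence angle and delay values are hypothetical, and a flat-Earth 1/cos mapping function is assumed for simplicity.

```python
import math

# Sketch of turning zenith tropospheric delays (ZTD) into an InSAR phase
# correction: map the zenith delay to the slant direction with 1/cos(inc)
# and convert the delay difference between the two acquisitions to phase.

def slant_delay(ztd_m, incidence_rad):
    """Simple mapping function: slant delay = zenith delay / cos(incidence)."""
    return ztd_m / math.cos(incidence_rad)

def phase_correction(ztd_master, ztd_slave, incidence_rad, wavelength_m):
    """Two-way propagation: phi = (4*pi/lambda) * slant delay difference."""
    d = slant_delay(ztd_master - ztd_slave, incidence_rad)
    return 4.0 * math.pi * d / wavelength_m

# Hypothetical values: C-band (5.6 cm), 35 deg incidence, 2 cm ZTD change.
phi = phase_correction(2.40, 2.38, math.radians(35.0), 0.056)
```

Even a 2 cm change in zenith delay maps to several radians of phase at C-band, which is why tropospheric noise dominates the InSAR error budget.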

  16. Sensitivity of atmospheric correction to loading and model of the aerosol

    NASA Astrophysics Data System (ADS)

    Bassani, Cristiana; Braga, Federica; Bresciani, Mariano; Giardino, Claudia; Adamo, Maria; Ananasso, Cristina; Alberotanza, Luigi

    2013-04-01

    Physically based atmospheric correction requires knowledge of the atmospheric conditions at the time of remote data acquisition [Guanter et al., 2007; Gao et al., 2009; Kotchenova et al., 2009; Bassani et al., 2010]. The propagation of solar radiation in the atmospheric window of the visible and near-infrared spectral domain depends on aerosol scattering. The extinction of the solar beam is related to the aerosol loading, through the aerosol optical thickness at 550 nm (AOT) [Kaufman et al., 1997; Vermote et al., 1997; Kotchenova et al., 2008; Kokhanovsky et al., 2010], and also to the aerosol model. Recently, the atmospheric correction of hyperspectral data has been shown to be sensitive to the micro-physical and optical characteristics of the aerosol [Bassani et al., 2012]. Within the framework of the CLAM-PHYM (Coasts and Lake Assessment and Monitoring by PRISMA HYperspectral Mission) project, funded by the Italian Space Agency (ASI), the role of the aerosol model in the accuracy of the atmospheric correction of hyperspectral images acquired over water targets is investigated. In this work, the results of the atmospheric correction of HICO (Hyperspectral Imager for the Coastal Ocean) images acquired over the Northern Adriatic Sea in the Mediterranean are presented. The atmospheric correction has been performed by an algorithm specifically developed for the HICO sensor. The algorithm is based on the equation presented in [Vermote et al., 1997; Bassani et al., 2010], using the latest generation of the Second Simulation of a Satellite Signal in the Solar Spectrum (6S) radiative transfer code [Kotchenova et al., 2008; Vermote et al., 2009]. The sensitivity analysis of the atmospheric correction of HICO data is performed with respect to the aerosol optical and micro-physical properties used to define the aerosol model. In particular, a variable mixture of the four basic components (dust-like, oceanic, water-soluble, and soot) has been considered. The water reflectance
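
The standard TOA reflectance equation underlying 6S-style correction, and its inversion for surface reflectance, can be sketched directly; the atmospheric coefficients below are hypothetical, not outputs of 6S.

```python
# Sketch of the reflectance inversion used in 6S-style atmospheric
# correction: rho_toa = rho_atm + T * rho_s / (1 - S * rho_s), solved for
# the surface reflectance rho_s. Coefficient values are hypothetical.

def surface_reflectance(rho_toa, rho_atm, t_total, s_alb):
    """Invert the standard TOA reflectance equation for rho_s."""
    y = (rho_toa - rho_atm) / t_total
    return y / (1.0 + s_alb * y)

def toa_reflectance(rho_s, rho_atm, t_total, s_alb):
    """Forward model (atmospheric path term plus coupled surface term)."""
    return rho_atm + t_total * rho_s / (1.0 - s_alb * rho_s)

rho_atm, t_total, s_alb = 0.08, 0.75, 0.12   # hypothetical atmosphere
rho_s_true = 0.05                            # dark, water-like target
rho_toa = toa_reflectance(rho_s_true, rho_atm, t_total, s_alb)
rho_s = surface_reflectance(rho_toa, rho_atm, t_total, s_alb)
```

The aerosol model enters through rho_atm, T and the spherical albedo S, which is why the retrieved water reflectance is sensitive to the assumed aerosol micro-physics.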

  17. Discovery and design of carbohydrate-based therapeutics.

    PubMed

    Cipolla, Laura; Araújo, Ana C; Bini, Davide; Gabrielli, Luca; Russo, Laura; Shaikh, Nasrin

    2010-08-01

    Until now, the importance of carbohydrates has been underestimated compared with the two other major classes of biopolymers, oligonucleotides and proteins. Recent advances in glycobiology and glycochemistry have sparked strong interest in the study of this enormous family of biomolecules. Carbohydrates have been shown to be implicated in recognition processes, such as cell-cell adhesion, cell-extracellular matrix adhesion and cell-intruder recognition phenomena. In addition, carbohydrates are recognized as differentiation markers and as antigenic determinants. Owing to their relevant biological roles, carbohydrates are promising candidates for drug design and disease treatment. Moreover, the growing number of human disorders known as congenital disorders of glycosylation, which are being identified as resulting from abnormalities in glycan structures and protein glycosylation, strongly indicates that rapid development of glycobiology, glycochemistry and glycomedicine is highly desirable. This review gives an overview of the different approaches that have been used to date for the design of carbohydrate-based therapeutics, including the use of native synthetic carbohydrates, carbohydrate mimics designed on the basis of their native counterparts, carbohydrates as scaffolds and, finally, the design of glyco-fused therapeutics, one of the most recent approaches. The review covers mainly literature that has appeared since 2000, except for a few papers cited for historical reasons. The reader will gain an overview of the current strategies applied to the design of carbohydrate-based therapeutics; in particular, the advantages and disadvantages of the different approaches are highlighted. The topic is presented in a general, basic manner and will hopefully be a useful resource for all readers who are not familiar with it. In addition, in order to stress the potential of carbohydrates, several examples of marketed carbohydrate-based therapeutics are given.

  18. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking.

    PubMed

    Lee, Min Su; Ju, Hojin; Song, Jin Woo; Park, Chan Gook

    2015-11-06

    In this paper, we present a method for finding the enhanced heading and position of pedestrians by fusing the Zero velocity UPdaTe (ZUPT)-based pedestrian dead reckoning (PDR) and the kinematic constraints of the lower human body. ZUPT is a well-known algorithm for PDR and provides a sufficiently accurate position solution over short periods, but it cannot guarantee a stable and reliable heading because it suffers from magnetic disturbance in determining heading angles, which degrades the overall position accuracy as time passes. The basic idea of the proposed algorithm is integrating the left and right foot positions obtained by ZUPTs with the heading and position information from an IMU mounted on the waist. To integrate this information, a kinematic model of the lower human body, calculated using orientation sensors mounted on both thighs and calves, is adopted. We note that the positions of the left and right feet cannot be far apart because of the kinematic constraints of the body, so the kinematic model generates new measurements for the waist position. An Extended Kalman Filter (EKF) on the waist data estimates and corrects error states using these measurements together with magnetic heading measurements, which enhances the heading accuracy. The updated position information is fed into the foot-mounted sensors, and reupdate processes are performed to correct the position error of each foot. The proposed update-reupdate technique consequently ensures improved observability of error states and position accuracy. Moreover, the proposed method provides all the information about the lower human body, so that it can be applied more effectively to motion tracking. The effectiveness of the proposed algorithm is verified via experimental results, which show that a 1.25% Return Position Error (RPE) with respect to walking distance is achieved.
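
The zero-velocity update at the core of the PDR described above can be sketched in one dimension; the stance-detection thresholds and the tiny IMU trace are hypothetical, and a real implementation works in 3D with full strapdown navigation and an EKF.

```python
# Sketch of the zero-velocity update (ZUPT) idea: when the foot is detected
# as stationary (stance phase), the integrated velocity error is reset to
# zero, bounding the drift of the dead-reckoned position.

GRAVITY = 9.81

def is_stance(accel_mag, gyro_mag, acc_tol=0.3, gyro_tol=0.2):
    """Stationary if specific force ~ gravity and angular rate is small."""
    return abs(accel_mag - GRAVITY) < acc_tol and gyro_mag < gyro_tol

def integrate_with_zupt(accel, gyro, dt=0.01):
    """Integrate 1D velocity from the accelerometer, resetting at stance."""
    v, velocities = 0.0, []
    for a, w in zip(accel, gyro):
        v += (a - GRAVITY) * dt          # strapdown integration (1D sketch)
        if is_stance(a, w):
            v = 0.0                      # zero-velocity update
        velocities.append(v)
    return velocities

# Swing phase (high accel/gyro) followed by stance (accel ~ g, gyro ~ 0).
accel = [12.0, 12.0, 12.0, 9.82, 9.80]
gyro  = [1.0, 1.0, 1.0, 0.05, 0.02]
vel = integrate_with_zupt(accel, gyro)
```

In the paper's full system, this per-foot ZUPT solution is then fused with the waist IMU and the thigh/calf kinematic model to stabilize the heading.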

  19. Detection and correction of laser induced breakdown spectroscopy spectral background based on spline interpolation method

    NASA Astrophysics Data System (ADS)

    Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei

    2017-12-01

    Laser-induced breakdown spectroscopy (LIBS) is an analytical technique that has gained increasing attention because of its many applications. The production of a continuous background in LIBS is inevitable because of factors associated with laser energy, gate width, time delay, and the experimental environment. The continuous background significantly influences analysis of the spectrum. Researchers have proposed several background correction methods, such as polynomial fitting, Lorentz fitting and model-free methods; however, few of these have been applied in the field of LIBS, particularly in qualitative and quantitative analyses. This study proposes a method based on spline interpolation for detecting and estimating the continuous background spectrum, exploiting its smoothness. A background correction simulation indicated that the spline interpolation method achieved the largest signal-to-background ratio (SBR) after background correction, compared with polynomial fitting, Lorentz fitting and the model-free method. All of these background correction methods yield larger SBR values than before background correction (the SBR value before background correction is 10.0992, whereas the SBR values after background correction by spline interpolation, polynomial fitting, Lorentz fitting, and the model-free method are 26.9576, 24.6828, 18.9770, and 25.6273, respectively). After adding random noise with different signal-to-noise ratios to the spectrum, the spline interpolation method still achieves large SBR values, whereas polynomial fitting and the model-free method obtain low SBR values. All of the background correction methods improve the quantitative results for Cu relative to those acquired before background correction (the linear correlation coefficient before background correction is 0.9776; the linear correlation coefficients after background correction using spline interpolation, polynomial fitting, Lorentz
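
Spline-based baseline estimation of the kind described can be sketched on a synthetic spectrum; the anchor-point choice, the natural cubic spline and the toy line shape below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

# Sketch of spline-based background estimation: pick anchor points believed
# to be line-free, fit a natural cubic spline through them, and subtract the
# spline from the spectrum. The spectrum here is synthetic.

def natural_cubic_spline(xk, yk, x):
    """Evaluate the natural cubic spline through (xk, yk) at points x."""
    xk, yk = np.asarray(xk, float), np.asarray(yk, float)
    n = len(xk)
    h = np.diff(xk)
    # Solve for second derivatives M with natural ends (M[0] = M[-1] = 0).
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = h[i - 1], 2 * (h[i - 1] + h[i]), h[i]
        b[i] = 6 * ((yk[i + 1] - yk[i]) / h[i] - (yk[i] - yk[i - 1]) / h[i - 1])
    M = np.linalg.solve(A, b)
    # Piecewise-cubic evaluation on the segment containing each x.
    idx = np.clip(np.searchsorted(xk, x) - 1, 0, n - 2)
    d = x - xk[idx]
    hi = h[idx]
    return (yk[idx]
            + d * ((yk[idx + 1] - yk[idx]) / hi - hi * (2 * M[idx] + M[idx + 1]) / 6)
            + d ** 2 * M[idx] / 2
            + d ** 3 * (M[idx + 1] - M[idx]) / (6 * hi))

# Synthetic spectrum: smooth linear background plus one narrow emission line.
x = np.linspace(0.0, 10.0, 101)
spectrum = (2.0 + 0.3 * x) + 5.0 * np.exp(-((x - 5.0) ** 2) / 0.05)

anchor_idx = [0, 20, 40, 60, 80, 100]          # line-free sample points
baseline = natural_cubic_spline(x[anchor_idx], spectrum[anchor_idx], x)
corrected = spectrum - baseline
```

After subtraction, the emission line sits on a near-zero baseline, which is exactly what raises the signal-to-background ratio.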

  20. A Multidimensional B-Spline Correction for Accurate Modeling Sugar Puckering in QM/MM Simulations.

    PubMed

    Huang, Ming; Dissanayake, Thakshila; Kuechler, Erich; Radak, Brian K; Lee, Tai-Sung; Giese, Timothy J; York, Darrin M

    2017-09-12

    The computational efficiency of approximate quantum mechanical methods allows their use for the construction of multidimensional reaction free energy profiles. It has recently been demonstrated that quantum models based on the neglect of diatomic differential overlap (NDDO) approximation have difficulty modeling deoxyribose and ribose sugar ring puckers, limiting their predictive value in the study of RNA and DNA systems. A method has been introduced in our previous work to improve the description of the sugar puckering conformational landscape that uses a multidimensional B-spline correction map (BMAP correction) for systems involving intrinsically coupled torsion angles. This method greatly improved the adiabatic potential energy surface profiles of DNA and RNA sugar rings relative to high-level ab initio methods, even for highly problematic NDDO-based models. In the present work, a BMAP correction is developed, implemented, and tested in molecular dynamics simulations using the AM1/d-PhoT semiempirical Hamiltonian for biological phosphoryl transfer reactions. Results are presented for gas-phase adiabatic potential energy surfaces of RNA transesterification model reactions and condensed-phase QM/MM free energy surfaces for nonenzymatic and RNase A-catalyzed transesterification reactions. The results show that the BMAP correction is stable, efficient, and leads to improvement in both the potential energy and free energy profiles for the reactions studied, as compared with ab initio and experimental reference data. Exploration of the effect of the size of the quantum mechanical region indicates the best agreement with experimental reaction barriers occurs when the full CpA dinucleotide substrate is treated quantum mechanically with the sugar pucker correction.
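
The idea of a multidimensional B-spline correction map can be sketched with a two-angle example; the synthetic "energy difference" surface, grid and units below are hypothetical, and SciPy's RectBivariateSpline is used as a stand-in for the authors' BMAP machinery.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Sketch of a B-spline correction map (BMAP-style): tabulate the energy
# difference between a high-level and a semiempirical surface on a grid of
# two coupled torsion angles, fit a 2D cubic B-spline, and add the
# interpolated correction to the fast model's energy. Surfaces are synthetic.

theta1 = np.linspace(0.0, 360.0, 37)   # torsion grid in degrees (10 deg steps)
theta2 = np.linspace(0.0, 360.0, 37)
T1, T2 = np.meshgrid(theta1, theta2, indexing="ij")

# Synthetic high-level minus low-level energy difference (kcal/mol).
delta_e = 0.5 * np.cos(np.radians(T1)) * np.sin(np.radians(T2))

# Interpolating cubic B-spline map over the 2D grid.
bmap = RectBivariateSpline(theta1, theta2, delta_e, kx=3, ky=3)

def corrected_energy(e_semiempirical, t1, t2):
    """Add the spline-interpolated correction at torsions (t1, t2)."""
    return e_semiempirical + float(bmap.ev(t1, t2))

e_corr = corrected_energy(-10.0, 45.0, 120.0)
```

The correction and its derivatives are smooth, which is what makes such a map usable as an extra force term in molecular dynamics.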

  1. Beam hardening correction for interior tomography based on exponential formed model and radon inversion transform

    NASA Astrophysics Data System (ADS)

    Chen, Siyu; Zhang, Hanming; Li, Lei; Xi, Xiaoqi; Han, Yu; Yan, Bin

    2016-10-01

    X-ray computed tomography (CT) has been extensively applied in industrial non-destructive testing (NDT). However, in practical applications, the polychromaticity of the X-ray beam often causes beam hardening problems in image reconstruction. Beam hardening artifacts, which manifest as cupping, streaks and flares, not only degrade the image quality but also disturb subsequent analyses. Unfortunately, conventional CT scanning requires that the scanned object be completely covered by the field of view (FOV); state-of-the-art beam hardening correction methods only consider the ideal scanning configuration and often fail for interior tomography because of projection truncation. To address this problem, this paper proposes a beam hardening correction method for interior tomography based on an exponential formed model and the radon inversion transform. Experimental results show that, compared with conventional correction algorithms, the proposed approach achieves excellent performance in both beam hardening artifact reduction and truncation artifact suppression. The presented method therefore has significant theoretical and practical value for artifact correction in industrial CT.
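
The beam-hardening problem itself can be illustrated with a classic polynomial linearization, which is simpler than the exponential model this paper proposes; the spectrum-dependent numbers below are entirely hypothetical.

```python
import numpy as np

# Sketch of beam-hardening correction by polynomial linearization (a classic
# approach, not this paper's exponential model): a polychromatic projection
# grows sublinearly with material thickness, and a fitted polynomial maps it
# back to the linear monochromatic projection. Values are hypothetical.

thickness = np.linspace(0.0, 5.0, 51)              # cm of material
p_mono = 0.4 * thickness                            # ideal monochromatic projection
p_poly = 0.4 * thickness - 0.01 * thickness ** 2    # hardened (sublinear) projection

# Calibration: fit p_mono as a polynomial in p_poly, then correct measurements.
coeffs = np.polyfit(p_poly, p_mono, deg=3)

def correct_projection(p):
    return np.polyval(coeffs, p)

corrected = correct_projection(p_poly)
```

Without the correction, the sublinear projections reconstruct into the familiar cupping artifact; after linearization the projections are again proportional to path length.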

  2. Learning versus correct models: influence of model type on the learning of a free-weight squat lift.

    PubMed

    McCullagh, P; Meyer, K N

    1997-03-01

    It has been assumed that demonstrating the correct movement is the best way to impart task-relevant information. However, empirical verification with simple laboratory skills has shown that using a learning model (showing an individual in the process of acquiring the skill to be learned) may accelerate skill acquisition and increase retention more than using a correct model. The purpose of the present study was to compare the effectiveness of viewing correct versus learning models on the acquisition of a sport skill (free-weight squat lift). Forty female participants were assigned to four learning conditions: physical practice receiving feedback, learning model with model feedback, correct model with model feedback, and learning model without model feedback. Results indicated that viewing either a correct or learning model was equally effective in learning correct form in the squat lift.

  3. Fiducial marker-based correction for involuntary motion in weight-bearing C-arm CT scanning of knees. Part I. Numerical model-based optimization.

    PubMed

    Choi, Jang-Hwan; Fahrig, Rebecca; Keil, Andreas; Besier, Thor F; Pal, Saikat; McWalter, Emily J; Beaupré, Gary S; Maier, Andreas

    2013-09-01

    Human subjects in standing positions are apt to show much more involuntary motion than in supine positions. The authors aimed to simulate complicated, realistic lower-body movement using the four-dimensional (4D) digital extended cardiac-torso (XCAT) phantom. The authors also investigated fiducial marker-based motion compensation methods in two-dimensional (2D) and three-dimensional (3D) space. The level of involuntary movement-induced artifacts and the image quality improvement were investigated after applying each method. An optical tracking system with eight cameras and seven retroreflective markers enabled the authors to track involuntary motion of the lower body of nine healthy subjects holding a squat position at 60° of flexion. The XCAT-based knee model was developed using the 4D XCAT phantom and the optical tracking data acquired at 120 Hz. The authors divided the lower body in the XCAT into six parts and applied a unique affine transform to each so that the motion (6 degrees of freedom) could be synchronized with the optical markers' locations at each time frame. The control points of the XCAT were tessellated into triangles, and 248 projection images were created based on the intersections of each ray and monochromatic absorption. The tracking data sets with the largest motion (Subject 2) and the smallest motion (Subject 5) among the nine data sets were used to animate the XCAT knee model. The authors defined eight well-distributed skin control points around the knees as pseudo-fiducial markers, which functioned as a reference in motion correction. Motion compensation was done in the following ways: (1) simple projection shifting in 2D, (2) deformable projection warping in 2D, and (3) rigid body warping in 3D. Graphics-hardware-accelerated filtered backprojection was implemented and combined with the three correction methods in order to speed up the simulation process. Correction fidelity was evaluated as a function of the number of markers used (4-12) and marker distribution
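
The simplest of the three corrections listed, 2D projection shifting, can be sketched as follows; the marker coordinates and projection are synthetic, and a pure integer translation is assumed for clarity.

```python
import numpy as np

# Sketch of marker-based 2D projection shifting: estimate each projection's
# in-plane shift from tracked fiducial markers and translate the projection
# back before reconstruction. The marker data are hypothetical.

def estimate_shift(markers_ref, markers_moved):
    """Mean marker displacement (rows, cols) between reference and frame."""
    return np.mean(markers_moved - markers_ref, axis=0)

def shift_projection(projection, shift):
    """Undo an (approximately integer) pixel shift by translating back."""
    dr, dc = int(round(shift[0])), int(round(shift[1]))
    return np.roll(projection, (-dr, -dc), axis=(0, 1))

# Reference marker positions (pixels) and the same markers after motion.
ref = np.array([[10.0, 12.0], [20.0, 30.0], [40.0, 8.0]])
moved = ref + np.array([2.0, -3.0])             # rigid 2D translation

proj = np.zeros((64, 64))
proj[30, 30] = 1.0                               # a feature at (30, 30)
proj_moved = np.roll(proj, (2, -3), axis=(0, 1))

corrected = shift_projection(proj_moved, estimate_shift(ref, moved))
```

Deformable warping and 3D rigid-body warping generalize this idea by allowing the transform to vary across the projection or to act in object space.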

  4. Applications of lipid based formulation technologies in the delivery of biotechnology-based therapeutics.

    PubMed

    du Plessis, Lissinda H; Marais, Etienne B; Mohammed, Faruq; Kotzé, Awie F

    2014-01-01

    In recent decades, several new biotechnology-based therapeutics have been developed owing to progress in genetic engineering. A growing challenge facing pharmaceutical scientists is formulating these compounds into oral dosage forms with adequate bioavailability. An increasingly popular approach to formulating biotechnology-based therapeutics is the use of lipid-based formulation technologies. This review highlights the importance of lipid-based drug delivery systems in the formulation of oral biotechnology-based therapeutics, including peptides, proteins, DNA, siRNA and vaccines. The different production procedures used to achieve high encapsulation efficiencies of the bioactives are discussed, as well as the factors influencing the choice of excipient. Lipid-based colloidal drug delivery systems, including liposomes and solid lipid nanoparticles, are reviewed with a focus on recent advances and updates. We further describe microemulsions and self-emulsifying drug delivery systems and recent findings on bioactive delivery. We conclude the review with a few examples of novel lipid-based formulation technologies.

  5. A Flux-Corrected Transport Based Hydrodynamic Model for the Plasmasphere Refilling Problem following Geomagnetic Storms

    NASA Astrophysics Data System (ADS)

    Chatterjee, K.; Schunk, R. W.

    2017-12-01

    The refilling of the plasmasphere following a geomagnetic storm remains one of the longstanding problems in the area of ionosphere-magnetosphere coupling. Both diffusion and hydrodynamic approximations have been adopted for the modeling and solution of this problem. The diffusion approximation neglects the nonlinear inertial term in the momentum equation, so this approximation is not rigorously valid immediately after the storm. Over the last few years, we have developed a hydrodynamic refilling model using the flux-corrected transport method, a numerical method that is extremely well suited to handling nonlinear problems with shocks and discontinuities. The plasma transport equations are solved along 1D closed magnetic field lines that connect conjugate ionospheres, and the model currently includes three ion (H+, O+, He+) and two neutral (O, H) species. In this work, each ion species under consideration has been modeled as two separate streams emanating from the conjugate hemispheres, and the model correctly predicts supersonic ion speeds and the presence of high levels of helium during the early hours of refilling. The ultimate objective of this research is the development of a 3D model for the plasmasphere refilling problem; with additional development, the same methodology can potentially be applied to the study of other complex space plasma coupling problems in closed flux tube geometries. Index Terms: 2447 Modeling and forecasting [IONOSPHERE]; 2753 Numerical modeling [MAGNETOSPHERIC PHYSICS]; 7959 Models [SPACE WEATHER]
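
The core of the flux-corrected transport idea is easy to show in one dimension: take a monotone low-order update, then add back as much of the high-order antidiffusive flux as a limiter allows without creating new extrema. A minimal Boris-Book-style sketch for scalar linear advection (not the authors' multi-species solver; grid size, Courant number, and limiter choice are illustrative):

```python
import numpy as np

def fct_advect(u, nu, steps):
    """Advance u_t + c u_x = 0 on a periodic grid with Boris-Book FCT.
    nu = c*dt/dx is the Courant number, 0 < nu <= 1."""
    u = u.astype(float).copy()
    for _ in range(steps):
        f_low = nu * u                                    # donor-cell flux through i+1/2
        f_high = nu * (u + 0.5 * (1.0 - nu) * (np.roll(u, -1) - u))  # Lax-Wendroff flux
        u_td = u - (f_low - np.roll(f_low, 1))            # transported-diffused solution
        a = f_high - f_low                                # raw antidiffusive flux
        d = np.roll(u_td, -1) - u_td                      # slope across i+1/2
        s = np.sign(a)
        a_c = s * np.maximum(0.0, np.minimum.reduce([     # Boris-Book flux limiter
            np.abs(a),
            s * np.roll(d, -1),                           # slope across i+3/2
            s * np.roll(d, 1),                            # slope across i-1/2
        ]))
        u = u_td - (a_c - np.roll(a_c, 1))                # limited antidiffusion step
    return u

# square pulse advected one full revolution around the periodic domain
n = 100
u0 = np.zeros(n); u0[40:60] = 1.0
u1 = fct_advect(u0, nu=0.5, steps=2 * n)   # nu=0.5 -> 2n steps = one period
```

The limiter is what lets the scheme keep the sharp pulse edges (where pure Lax-Wendroff would oscillate) while staying conservative and positivity-preserving, the properties that matter for shocked refilling flows.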

  6. Impacts of correcting the inter-variable correlation of climate model outputs on hydrological modeling

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Li, Chao; Brissette, François P.; Chen, Hua; Wang, Mingna; Essou, Gilles R. C.

    2018-05-01

    Bias correction is usually implemented prior to using climate model outputs for impact studies. However, bias correction methods that are commonly used treat climate variables independently and often ignore inter-variable dependencies. The effects of ignoring such dependencies on impact studies need to be investigated. This study aims to assess the impacts of correcting the inter-variable correlation of climate model outputs on hydrological modeling. To this end, a joint bias correction (JBC) method which corrects the joint distribution of two variables as a whole is compared with an independent bias correction (IBC) method; this is considered in terms of correcting simulations of precipitation and temperature from 26 climate models for hydrological modeling over 12 watersheds located in various climate regimes. The results show that the simulated precipitation and temperature are considerably biased not only in the individual distributions, but also in their correlations, which in turn result in biased hydrological simulations. In addition to reducing the biases of the individual characteristics of precipitation and temperature, the JBC method can also reduce the bias in precipitation-temperature (P-T) correlations. In terms of hydrological modeling, the JBC method performs significantly better than the IBC method for 11 out of the 12 watersheds over the calibration period. For the validation period, the advantages of the JBC method are greatly reduced as the performance becomes dependent on the watershed, GCM and hydrological metric considered. For arid/tropical and snowfall-rainfall-mixed watersheds, JBC performs better than IBC. For snowfall- or rainfall-dominated watersheds, however, the two methods behave similarly, with IBC performing somewhat better than JBC. Overall, the results emphasize the advantages of correcting the P-T correlation when using climate model-simulated precipitation and temperature to assess the impact of climate change on watershed
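
The contrast between the two families of methods can be sketched compactly: an IBC-style correction maps each variable through its own quantiles, while a joint correction must additionally repair the P-T dependence, for instance by re-imposing an observed rank structure. A hedged illustration of the idea (empirical quantile mapping plus a Schaake-shuffle-style reordering; this is not the JBC algorithm of the paper):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_sim):
    """IBC-style marginal correction: send each simulated value through the
    model-historical CDF, then back through the observed quantile function."""
    ranks = np.searchsorted(np.sort(model_hist), model_sim, side="right")
    ranks = np.clip(ranks / len(model_hist), 0.0, 1.0)
    return np.quantile(obs_hist, ranks)

def rank_reorder(corrected, obs_template):
    """Joint-correction ingredient: reorder a corrected series so its rank
    structure matches an observed template; applied to P and T with paired
    templates, this restores the observed P-T rank correlation."""
    out = np.empty_like(corrected, dtype=float)
    out[np.argsort(obs_template)] = np.sort(corrected)
    return out

# demo: a "too dry" simulated climate pulled onto the observed distribution
rng = np.random.default_rng(1)
model_hist = rng.gamma(2.0, 3.0, 2000)   # simulated historical precipitation
obs_hist = rng.gamma(2.0, 5.0, 2000)     # observed historical precipitation
corrected = quantile_map(model_hist, obs_hist, model_hist)
```

Quantile mapping alone fixes each marginal distribution; the reordering step is the extra ingredient that a joint method supplies and an independent method omits.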

  7. Brain Based Instruction in Correctional Settings: Strategies for Teachers.

    ERIC Educational Resources Information Center

    Becktold, Toni Hill

    2001-01-01

    Brain-based learning strategies (learner choice, movement, small groups) may be inappropriate in corrections for security reasons. Problems encountered in correctional education (attention deficit disorder, learned helplessness) complicate the use of these strategies. Incorporating brain-based instruction in these settings requires creativity and…

  8. Linear model correction: A method for transferring a near-infrared multivariate calibration model without standard samples.

    PubMed

    Liu, Yan; Cai, Wensheng; Shao, Xueguang

    2016-12-05

    Calibration transfer is essential for practical applications of near infrared (NIR) spectroscopy because the measurements of the spectra may be performed on different instruments and the difference between the instruments must be corrected. For most calibration transfer methods, standard samples are necessary to construct the transfer model using the spectra of the samples measured on two instruments, referred to as the master and slave instruments, respectively. In this work, a method named linear model correction (LMC) is proposed for calibration transfer without standard samples. The method is based on the fact that, for samples with similar physical and chemical properties, the spectra measured on different instruments are linearly correlated. Consequently, the coefficients of the linear models constructed from spectra measured on different instruments are similar in profile. Therefore, by using a constrained optimization method, the coefficients of the master model can be transferred into those of the slave model with only a few spectra measured on the slave instrument. Two NIR datasets of corn and plant leaf samples measured with different instruments were used to test the performance of the method. The results show that, for both datasets, correct predictions can be obtained from the spectra using the transferred partial least squares (PLS) models. Because the method requires no standard samples, it may be more useful in practice. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. [Spectral scatter correction of coal samples based on quasi-linear local weighted method].

    PubMed

    Lei, Meng; Li, Ming; Ma, Xiao-Ping; Miao, Yan-Zi; Wang, Jian-Sheng

    2014-07-01

    The present paper puts forth a new spectral correction method based on quasi-linear expressions and a local weighted function. The first stage of the method is to select three quasi-linear expressions (quadratic, cubic, and growth-curve) to replace the original linear expression in the MSC method. The local weighted function is then constructed by introducing four kernel functions: the Gaussian, Epanechnikov, Biweight and Triweight kernels. After adding this function to the basic estimation equation, the dependency between the original and ideal spectra is described more accurately and meticulously at each wavelength point. Furthermore, two analytical models were established, based on PLS and on a PCA-BP neural network method respectively, which can be used to estimate the accuracy of the corrected spectra. Finally, the optimal correction mode was determined from the analytical results for different combinations of quasi-linear expression and local weighted function. Spectra of the same coal sample have different noise ratios when the sample is prepared at different particle sizes. To validate the effectiveness of the method, the experiment analyzed the correction results of three spectral data sets with particle sizes of 0.2, 1 and 3 mm. The results show that the proposed method can eliminate the scattering influence and can also enhance the information of spectral peaks. The method thus offers a more efficient way to significantly enhance the correlation between corrected spectra and coal qualities, and to substantially improve the accuracy and stability of the analytical model.
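
The flavour of the approach, replacing MSC's single global linear fit with a locally weighted quasi-linear one, can be sketched as follows. This is an illustrative additive-correction variant with a Gaussian kernel and a quadratic expression only; the paper's exact estimation equation and bandwidths may differ:

```python
import numpy as np

def local_weighted_correction(x, ref, half_width=25.0, degree=2):
    """At each wavelength i, fit x ~ poly(ref) by weighted least squares with
    a Gaussian kernel centred on i, then subtract the fitted scatter
    component so the corrected spectrum tracks the reference signal."""
    n = len(x)
    idx = np.arange(n)
    V = np.vander(ref, degree + 1)          # columns [ref^2, ref, 1]
    corrected = np.empty(n)
    for i in range(n):
        sw = np.exp(-0.25 * ((idx - i) / half_width) ** 2)  # sqrt of Gaussian kernel
        coef, *_ = np.linalg.lstsq(V * sw[:, None], x * sw, rcond=None)
        # remove the locally fitted scatter contribution at wavelength i
        corrected[i] = x[i] - (np.polyval(coef, ref[i]) - ref[i])
    return corrected

# demo: a spectrum distorted by additive + multiplicative scatter is
# pulled back onto the reference (e.g. mean) spectrum
ref = np.linspace(0.0, 1.0, 80)
x = 2.0 + 1.5 * ref
corrected = local_weighted_correction(x, ref)
```

Swapping the kernel (Epanechnikov, Biweight, Triweight) or the expression (cubic, growth curve) changes only the weight line and the design matrix `V`, which is exactly the combinatorial search the paper describes.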

  10. [Identification of novel therapeutically effective antibiotics using silkworm infection model].

    PubMed

    Hamamoto, Hiroshi; Urai, Makoto; Paudel, Atmika; Horie, Ryo; Murakami, Kazuhisa; Sekimizu, Kazuhisa

    2012-01-01

    Most antibiotics with antibacterial activity obtained by in vitro screening have properties inappropriate for medicines due to their toxicity and pharmacodynamics in animal bodies. Thus, evaluation of the therapeutic effects of these samples using animal models is essential at the crude-sample stage. Mammals are not suitable for therapeutic evaluation of a large number of samples due to high costs and ethical issues. We propose the use of silkworms (Bombyx mori) as model animals for screening therapeutically effective antibiotics. Silkworms are infected by various pathogenic bacteria and are effectively treated with ED50 values similar to those of clinically used antibiotics. Furthermore, the drug metabolism pathways, such as the cytochrome P450 and conjugation systems, are similar between silkworms and mammals. Silkworms have many advantages over other infection models, such as their 1) low cost, 2) few associated ethical problems, 3) adequate body size for easy handling, and 4) easier separation of organs and hemolymph. These features of the silkworm allow for efficient screening of therapeutically effective antibiotics. In this review, we discuss the advantages of the silkworm model in the early stages of drug development and the screening results of some antibiotics using the silkworm infection model.

  11. Genetic Correction of Human Induced Pluripotent Stem Cells from Patients with Spinal Muscular Atrophy

    PubMed Central

    Corti, Stefania; Nizzardo, Monica; Simone, Chiara; Falcone, Marianna; Nardini, Martina; Ronchi, Dario; Donadoni, Chiara; Salani, Sabrina; Riboldi, Giulietta; Magri, Francesca; Menozzi, Giorgia; Bonaglia, Clara; Rizzo, Federica; Bresolin, Nereo; Comi, Giacomo P.

    2016-01-01

    Spinal muscular atrophy (SMA) is among the most common genetic neurological diseases that cause infant mortality. Induced pluripotent stem cells (iPSCs) generated from skin fibroblasts from SMA patients and genetically corrected have been proposed to be useful for autologous cell therapy. We generated iPSCs from SMA patients (SMA-iPSCs) using nonviral, nonintegrating episomal vectors and used a targeted gene correction approach based on single-stranded oligonucleotides to convert the survival motor neuron 2 (SMN2) gene into an SMN1-like gene. Corrected iPSC lines contained no exogenous sequences. Motor neurons formed by differentiation of uncorrected SMA-iPSCs reproduced disease-specific features. These features were ameliorated in motor neurons derived from genetically corrected SMA-iPSCs. The altered gene splicing profile in SMA-iPSC motor neurons was rescued after genetic correction. The transplantation of corrected motor neurons derived from SMA-iPSCs into an SMA mouse model extended the life span of the animals and improved the disease phenotype. These results suggest that generating genetically corrected SMA-iPSCs and differentiating them into motor neurons may provide a source of motor neurons for therapeutic transplantation for SMA. PMID:23253609

  12. Increasing the Endoplasmic Reticulum Pool of the F508del Allele of the Cystic Fibrosis Transmembrane Conductance Regulator Leads to Greater Folding Correction by Small Molecule Therapeutics.

    PubMed

    Chung, W Joon; Goeckeler-Fried, Jennifer L; Havasi, Viktoria; Chiang, Annette; Rowe, Steven M; Plyler, Zackery E; Hong, Jeong S; Mazur, Marina; Piazza, Gary A; Keeton, Adam B; White, E Lucile; Rasmussen, Lynn; Weissman, Allan M; Denny, R Aldrin; Brodsky, Jeffrey L; Sorscher, Eric J

    2016-01-01

    Small molecules that correct the folding defects and enhance surface localization of the F508del mutation in the Cystic Fibrosis Transmembrane conductance Regulator (CFTR) comprise an important therapeutic strategy for cystic fibrosis lung disease. However, compounds that rescue the F508del mutant protein to wild type (WT) levels have not been identified. In this report, we consider obstacles to obtaining robust and therapeutically relevant levels of F508del CFTR. For example, markedly diminished steady state amounts of F508del CFTR compared to WT CFTR are present in recombinant bronchial epithelial cell lines, even when much higher levels of mutant transcript are present. In human primary airway cells, the paucity of Band B F508del is even more pronounced, although F508del and WT mRNA concentrations are comparable. Therefore, to augment levels of "repairable" F508del CFTR and identify small molecules that then correct this pool, we developed compound library screening protocols based on automated protein detection. First, cell-based imaging measurements were used to semi-quantitatively estimate distribution of F508del CFTR by high content analysis of two-dimensional images. We evaluated ~2,000 known bioactive compounds from the NIH Roadmap Molecular Libraries Small Molecule Repository in a pilot screen and identified agents that increase the F508del protein pool. Second, we analyzed ~10,000 compounds representing diverse chemical scaffolds for effects on total CFTR expression using a multi-plate fluorescence protocol and describe compounds that promote F508del maturation. Together, our findings demonstrate proof of principle that agents identified in this fashion can augment the level of endoplasmic reticulum (ER) resident "Band B" F508del CFTR suitable for pharmacologic correction. 
As further evidence in support of this strategy, PYR-41-a compound that inhibits the E1 ubiquitin activating enzyme-was shown to synergistically enhance F508del rescue by C18, a small

  13. Ground-Based Correction of Remote-Sensing Spectral Imagery

    NASA Technical Reports Server (NTRS)

    Alder-Golden, Steven M.; Rochford, Peter; Matthew, Michael; Berk, Alexander

    2007-01-01

    Software has been developed for an improved method of correcting for the atmospheric optical effects (primarily, effects of aerosols and water vapor) in spectral images of the surface of the Earth acquired by airborne and spaceborne remote-sensing instruments. In this method, the variables needed for the corrections are extracted from the readings of a radiometer located on the ground in the vicinity of the scene of interest. The software includes algorithms that analyze measurement data acquired from a shadow-band radiometer. These algorithms are based on a prior radiation transport software model, called MODTRAN, that has been developed through several versions up to what are now known as MODTRAN4 and MODTRAN5. These components have been integrated with a user-friendly Interactive Data Language (IDL) front end and an advanced version of MODTRAN4. Software tools for handling general data formats, performing a Langley-type calibration, and generating an output file of retrieved atmospheric parameters for use in another atmospheric-correction computer program known as FLAASH have also been incorporated into the present software. Concomitantly with the software described thus far, there has been developed a version of FLAASH that utilizes the retrieved atmospheric parameters to process spectral image data.

  14. Real-time Monitoring of Nanoparticle-based Therapeutics: A Review.

    PubMed

    Han, Qingqing; Niu, Meng; Wu, Qirun; Zhong, Hongshan

    2018-01-01

    With the development of nanomaterials, nanoparticle-based therapeutics have found increasing application in various fields, including clinical and basic medicine. Real-time monitoring of nanoparticle-based therapeutics is considered critical to both pharmacology and pharmacokinetics. In this review, we discuss the different methods of real-time monitoring of nanoparticle-based therapeutics across the different types of nanoparticle carriers, such as metal nanoparticles, inorganic nonmetallic nanoparticles, biodegradable polymer nanoparticles, and biological nanoparticles. In light of examples and analyses, we conclude that the analytical approaches for the four types of nanoparticle carriers rely on commonly used methods, and that no single method is necessary in every case. Under most circumstances, the appropriate real-time monitoring approach differs according to nanoparticle type, drugs, diseases, and surroundings. With developing technology and advancing research, there are increasingly many ways to track real-time changes in nanoparticles, and this has led to great progress in pharmacology and therapeutics. However, future studies are warranted to determine the accuracy, applicability, and practicability of the different technologies. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  15. Numerical model and analysis of an energy-based system using microwaves for vision correction

    NASA Astrophysics Data System (ADS)

    Pertaub, Radha; Ryan, Thomas P.

    2009-02-01

    A treatment system was developed utilizing a microwave-based procedure capable of treating myopia and offering a less invasive alternative to laser vision correction without cutting the eye. Microwave thermal treatment elevates the temperature of the paracentral stroma of the cornea to create a predictable refractive change while preserving the epithelium and deeper structures of the eye. A pattern of shrinkage outside of the optical zone may be sufficient to flatten the central cornea. A numerical model was set up to investigate both the electromagnetic field and the resultant transient temperature distribution. A finite element model of the eye was created and the axisymmetric distribution of temperature calculated to characterize the combination of controlled power deposition combined with surface cooling to spare the epithelium, yet shrink the cornea, in a circularly symmetric fashion. The model variables included microwave power levels and pulse width, cooling timing, dielectric material and thickness, and electrode configuration and gap. Results showed that power is totally contained within the cornea and no significant temperature rise was found outside the anterior cornea, due to the near-field design of the applicator and limited thermal conduction with the short on-time. Target isothermal regions were plotted as a result of common energy parameters along with a variety of electrode shapes and sizes, which were compared. Dose plots showed the relationship between energy and target isothermic regions.

  16. Optimal Model-Based Fault Estimation and Correction for Particle Accelerators and Industrial Plants Using Combined Support Vector Machines and First Principles Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sayyar-Rodsari, Bijan; Schweiger, Carl; /SLAC /Pavilion Technologies, Inc., Austin, TX

    2010-08-25

    Timely estimation of deviations from optimal performance in complex systems, and the ability to identify corrective measures in response to the estimated parameter deviations, have been the subject of extensive research over the past four decades. The implications in terms of lost revenue from costly industrial processes and the operation of large-scale public works projects, together with the volume of the published literature on this topic, clearly indicate the significance of the problem. Applications range from manufacturing industries (integrated circuits, automotive, etc.), to large-scale chemical plants, pharmaceutical production, power distribution grids, and avionics. In this project we investigated a new framework for building parsimonious models that are suited for diagnosis and fault estimation of complex technical systems. We used Support Vector Machines (SVMs) to model potentially time-varying parameters of a First-Principles (FP) description of the process. The combined SVM & FP model was built (i.e., model parameters were trained) using constrained optimization techniques. We used the trained models to estimate faults affecting simulated beam lifetime. In the case where a large number of process inputs are required for model-based fault estimation, the proposed framework performs an optimal nonlinear principal component analysis of the large-scale input space and creates a lower-dimension feature space in which fault estimation results can be effectively presented to the operation personnel. To fulfill the main technical objectives of the Phase I research, our Phase I efforts have focused on: (1) SVM Training in a Combined Model Structure - We developed the software for the constrained training of the SVMs in a combined model structure, and successfully modeled the parameters of a first-principles model for beam lifetime with support vectors. (2) Higher-order Fidelity of the Combined Model - We used constrained training to ensure that the output of the SVM (i
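
The combined-model structure, an empirical learner supplying a hard-to-model coefficient inside a first-principles law, can be illustrated in a few lines. Here kernel ridge regression stands in for the SVM, and the "first-principles" lifetime law tau = 1/(k0 + k(x)*I) is a made-up placeholder, not the SLAC beam-lifetime model:

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian RBF kernel matrix between two sets of row vectors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class CombinedModel:
    """FP law tau = 1/(k0 + k(x)*I), where the hard-to-model coefficient
    k(x) is learned from data (kernel ridge as a stand-in for the SVM)."""
    def __init__(self, k0, gamma=1.0, lam=1e-6):
        self.k0, self.gamma, self.lam = k0, gamma, lam
    def fit(self, X, I, tau):
        k_target = (1.0 / tau - self.k0) / I      # invert the FP law for k
        K = rbf_kernel(X, X, self.gamma)
        self.X = X
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), k_target)
        return self
    def predict_tau(self, X, I):
        k = rbf_kernel(X, self.X, self.gamma) @ self.alpha
        return 1.0 / (self.k0 + k * I)

# demo: recover a synthetic time-varying coefficient from lifetime data
rng = np.random.default_rng(3)
X = np.linspace(0.0, 3.0, 50)[:, None]          # operating conditions
k_true = 0.5 + 0.3 * np.sin(X[:, 0])            # "unknown" physics coefficient
current = np.full(50, 2.0)
tau = 1.0 / (0.1 + k_true * current)            # synthetic lifetimes
model = CombinedModel(k0=0.1, gamma=1.0, lam=1e-6).fit(X, current, tau)
```

The point of the structure is that the learned part only has to capture the residual physics, so far fewer training points are needed than for a fully black-box model.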

  17. SU-F-T-143: Implementation of a Correction-Based Output Model for a Compact Passively Scattered Proton Therapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferguson, S; Ahmad, S; Chen, Y

    2016-06-15

    Purpose: To commission and investigate the accuracy of an output (cGy/MU) prediction model for a compact passively scattered proton therapy system. Methods: A previously published output prediction model (Sahoo et al, Med Phys, 35, 5088-5097, 2008) was commissioned for our Mevion S250 proton therapy system. This is a correction-based model that multiplies correction factors (d/MU_wnc = ROF × SOBPF × RSF × SOBPOCF × OCR × FSF × ISF). These factors account for changes in output due to options (12 large, 5 deep, and 7 small), modulation width M, range R, off-center, off-axis, field-size, and off-isocenter effects. In this study, the model was modified to ROF × SOBPF × RSF × OCR × FSF × ISF-OCF × GACF by merging SOBPOCF and ISF for simplicity and introducing a gantry angle correction factor (GACF). To commission the model, over 1,000 output data points were taken at the time of system commissioning. The output was predicted by interpolation (1D for SOBPF, FSF, and GACF; 2D for RSF and OCR) with an inverse-square calculation (ISF-OCF). The outputs of 273 combinations of R and M covering all 24 options were measured to test the model. To minimize fluence perturbation, scattered dose from the range compensator and patient was not considered. The percent differences between the predicted (P) and measured (M) outputs were calculated to test the prediction accuracy ([P-M]/M × 100%). Results: A GACF was required because of up to 3.5% output variation dependence on the gantry angle. A 2D interpolation was required for OCR because the dose distribution was not radially symmetric, especially for the deep options. The average percent difference was -0.03 ± 0.98% (mean ± SD) and the differences of all the measurements fell within ±3%. Conclusion: It is concluded that the model can be used clinically for the compact passively scattered proton therapy system. However, great care should be taken when the field size is less than 5 × 5 cm², where a direct output measurement is required due to
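
The correction-factor formalism itself is straightforward once the commissioning tables exist: each factor is looked up by interpolation and the results are multiplied. A sketch with invented table values (the real tables come from the 1,000+ commissioning measurements, and machine-specific 2D factors like RSF/OCR are omitted here):

```python
import numpy as np

# hypothetical commissioning tables -- illustrative numbers, not machine data
sobp_widths   = np.array([2.0, 4.0, 6.0, 8.0, 10.0])    # modulation width M (cm)
sobp_factor   = np.array([1.10, 1.04, 1.00, 0.97, 0.95])
field_sizes   = np.array([5.0, 10.0, 15.0, 20.0])        # side length (cm)
fs_factor     = np.array([0.98, 1.00, 1.01, 1.02])
gantry_angles = np.array([0.0, 90.0, 180.0, 270.0, 360.0])
ga_factor     = np.array([1.000, 1.015, 1.035, 1.015, 1.000])

def predict_output(rof, modulation, field_size, gantry, ssd, ssd_ref=200.0):
    """Output (cGy/MU) = reference output x product of 1D-interpolated
    correction factors x inverse-square term, in the spirit of the
    correction-based formalism described above."""
    sobpf = np.interp(modulation, sobp_widths, sobp_factor)   # SOBPF
    fsf = np.interp(field_size, field_sizes, fs_factor)       # FSF
    gacf = np.interp(gantry % 360.0, gantry_angles, ga_factor)  # GACF
    isf = (ssd_ref / ssd) ** 2                                # inverse square
    return rof * sobpf * fsf * gacf * isf
```

At a tabulated reference condition every factor is 1.0, so the function returns the reference output unchanged; off-table conditions pick up the interpolated corrections multiplicatively.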

  18. CT-based attenuation and scatter correction compared with uniform attenuation correction in brain perfusion SPECT imaging for dementia

    NASA Astrophysics Data System (ADS)

    Gillen, Rebecca; Firbank, Michael J.; Lloyd, Jim; O'Brien, John T.

    2015-09-01

    This study investigated if the appearance and diagnostic accuracy of HMPAO brain perfusion SPECT images could be improved by using CT-based attenuation and scatter correction compared with the uniform attenuation correction method. A cohort of subjects who were clinically categorized as Alzheimer's Disease (n = 38), Dementia with Lewy Bodies (n = 29) or healthy normal controls (n = 30) underwent SPECT imaging with Tc-99m HMPAO and a separate CT scan. The SPECT images were processed using: (a) a correction map derived from the subject's CT scan, (b) the Chang uniform approximation for correction, or (c) no attenuation correction. Images were visually inspected. The ratios between key regions of interest known to be affected or spared in each condition were calculated for each correction method, and the differences between these ratios were evaluated. The images produced using the different corrections were noted to be visually different. However, ROI analysis found similar statistically significant differences between control and dementia groups and between AD and DLB groups regardless of the correction map used. We did not identify an improvement in diagnostic accuracy in images corrected using CT-based attenuation and scatter correction, compared with those corrected using a uniform correction map.
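
The uniform (Chang) method referenced here is simple enough to sketch: assume constant attenuation inside a body outline, compute for every pixel the mean attenuation over all projection directions, and boost the reconstructed image by its reciprocal. A brute-force ray-stepping version (pixel-sized steps, first-order correction only; not a clinical implementation):

```python
import numpy as np

def chang_correction_map(mask, mu, n_angles=64):
    """First-order Chang correction: for every pixel inside the object mask,
    average exp(-mu * d(theta)) over projection angles, where d(theta) is
    the distance to the object boundary; the correction is its reciprocal."""
    ny, nx = mask.shape
    corr = np.ones((ny, nx))
    angles = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
    for iy in range(ny):
        for ix in range(nx):
            if not mask[iy, ix]:
                continue
            atten = []
            for th in angles:
                d, x, y = 0.0, float(ix), float(iy)
                # step outwards in unit-pixel increments until we exit the object
                while 0 <= int(round(y)) < ny and 0 <= int(round(x)) < nx \
                        and mask[int(round(y)), int(round(x))]:
                    x += np.cos(th); y += np.sin(th); d += 1.0
                atten.append(np.exp(-mu * d))
            corr[iy, ix] = 1.0 / np.mean(atten)
    return corr

# demo: uniform disc phantom, mu in units of 1/pixel
yy, xx = np.mgrid[0:33, 0:33]
mask = (xx - 16) ** 2 + (yy - 16) ** 2 <= 100
corr = chang_correction_map(mask, mu=0.15, n_angles=32)
```

A CT-based map replaces the single `mu` with the measured attenuation along each ray, which is exactly the refinement the study compares against this uniform approximation.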

  19. Structurally flexible triethanolamine-core poly(amidoamine) dendrimers as effective nanovectors to deliver RNAi-based therapeutics.

    PubMed

    Liu, Xiaoxuan; Liu, Cheng; Catapano, Carlo V; Peng, Ling; Zhou, Jiehua; Rocchi, Palma

    2014-01-01

    RNAi-based nucleic acid molecules have attracted considerable attention as compelling therapeutics, provided that safe and competent delivery systems are available. Dendrimers are emerging as appealing nanocarriers for nucleic acid delivery thanks to their unique well-defined architecture and the resulting cooperativity and multivalency confined within a nanostructure. The present review offers a brief overview of the structurally flexible triethanolamine-core poly(amidoamine) (PAMAM) dendrimers developed in our group as nanovectors for the delivery of RNAi therapeutics. Their excellent activity in delivering different RNAi therapeutics in various disease models in vitro and in vivo will be highlighted here. © 2013.

  20. Corrected Four-Sphere Head Model for EEG Signals.

    PubMed

    Næss, Solveig; Chintaluri, Chaitanya; Ness, Torbjørn V; Dale, Anders M; Einevoll, Gaute T; Wójcik, Daniel K

    2017-01-01

    The EEG signal is generated by electrical brain cell activity, often described in terms of current dipoles. By applying EEG forward models we can compute the contribution from such dipoles to the electrical potential recorded by EEG electrodes. Forward models are key both for generating understanding and intuition about the neural origin of EEG signals as well as inverse modeling, i.e., the estimation of the underlying dipole sources from recorded EEG signals. Different models of varying complexity and biological detail are used in the field. One such analytical model is the four-sphere model which assumes a four-layered spherical head where the layers represent brain tissue, cerebrospinal fluid (CSF), skull, and scalp, respectively. While conceptually clear, the mathematical expression for the electric potentials in the four-sphere model is cumbersome, and we observed that the formulas presented in the literature contain errors. Here, we derive and present the correct analytical formulas with a detailed derivation. A useful application of the analytical four-sphere model is that it can serve as ground truth to test the accuracy of numerical schemes such as the Finite Element Method (FEM). We performed FEM simulations of the four-sphere head model and showed that they were consistent with the corrected analytical formulas. For future reference we provide scripts for computing EEG potentials with the four-sphere model, both by means of the correct analytical formulas and numerical FEM simulations.
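
The simplest ground-truth ingredient beneath all such head models is the potential of a current dipole in an unbounded homogeneous conductor; the four-sphere formulas add boundary corrections on top of it. A sketch of that textbook expression only (not of the four-sphere formulas themselves, which are given in the paper and its scripts):

```python
import numpy as np

def dipole_potential(r, r_dip, p, sigma):
    """Potential of a current dipole p (A*m) located at r_dip in an infinite
    homogeneous conductor of conductivity sigma (S/m):
    V(r) = p . (r - r_dip) / (4 * pi * sigma * |r - r_dip|^3)."""
    d = np.asarray(r, float) - np.asarray(r_dip, float)
    return np.dot(p, d) / (4.0 * np.pi * sigma * np.linalg.norm(d) ** 3)

# demo: a z-oriented dipole at the origin in brain-like conductivity
p = np.array([0.0, 0.0, 1e-8])                            # 10 nA*m dipole
v_near = dipole_potential([0.0, 0.0, 0.01], [0, 0, 0], p, 0.3)
v_far = dipole_potential([0.0, 0.0, 0.02], [0, 0, 0], p, 0.3)
```

On the dipole axis the potential falls off as 1/r^2, and it vanishes in the equatorial plane; any four-sphere or FEM implementation should reproduce these limits when all layer conductivities are set equal and the electrode is far from the boundaries.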

  1. Corrected Four-Sphere Head Model for EEG Signals

    PubMed Central

    Næss, Solveig; Chintaluri, Chaitanya; Ness, Torbjørn V.; Dale, Anders M.; Einevoll, Gaute T.; Wójcik, Daniel K.

    2017-01-01

    The EEG signal is generated by electrical brain cell activity, often described in terms of current dipoles. By applying EEG forward models we can compute the contribution from such dipoles to the electrical potential recorded by EEG electrodes. Forward models are key both for generating understanding and intuition about the neural origin of EEG signals as well as inverse modeling, i.e., the estimation of the underlying dipole sources from recorded EEG signals. Different models of varying complexity and biological detail are used in the field. One such analytical model is the four-sphere model which assumes a four-layered spherical head where the layers represent brain tissue, cerebrospinal fluid (CSF), skull, and scalp, respectively. While conceptually clear, the mathematical expression for the electric potentials in the four-sphere model is cumbersome, and we observed that the formulas presented in the literature contain errors. Here, we derive and present the correct analytical formulas with a detailed derivation. A useful application of the analytical four-sphere model is that it can serve as ground truth to test the accuracy of numerical schemes such as the Finite Element Method (FEM). We performed FEM simulations of the four-sphere head model and showed that they were consistent with the corrected analytical formulas. For future reference we provide scripts for computing EEG potentials with the four-sphere model, both by means of the correct analytical formulas and numerical FEM simulations. PMID:29093671

  2. Large-scale hydrological model river storage and discharge correction using a satellite altimetry-based discharge product

    NASA Astrophysics Data System (ADS)

    Emery, Charlotte Marie; Paris, Adrien; Biancamaria, Sylvain; Boone, Aaron; Calmant, Stéphane; Garambois, Pierre-André; Santos da Silva, Joecila

    2018-04-01

    Land surface models (LSMs) are widely used to study the continental part of the water cycle. However, even though their accuracy is increasing, inherent model uncertainties cannot be avoided. In the meantime, remotely sensed observations of continental water cycle variables such as soil moisture, lake and river elevations are becoming more frequent and accurate. Therefore, these two types of information can be combined, using data assimilation techniques, to reduce a model's uncertainties in its state variables and/or its input parameters. The objective of this study is to present a data assimilation platform that assimilates into the large-scale ISBA-CTRIP LSM a punctual river discharge product, derived from ENVISAT nadir altimeter water elevation measurements and rating curves, over the whole Amazon basin. To deal with the scale difference between the model and the observation, the study also presents an initial development of a localization treatment that limits the impact of each observation to areas close to it within the same hydrological network. This assimilation platform is based on the ensemble Kalman filter and can correct either the CTRIP river water storage or the discharge. The root mean square error (RMSE) compared to gauge discharges is reduced globally by up to 21% and, at Óbidos near the outlet, the RMSE is reduced by up to 52% compared to the ENVISAT-based discharge. Finally, it is shown that localization improves results along the main tributaries.
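
The analysis step at the heart of such a platform fits in a dozen lines for a single scalar observation. A stochastic ensemble Kalman filter sketch (state standing in for, e.g., river storages; the observation for one altimetry-derived discharge; no localization, which the paper adds on top):

```python
import numpy as np

def enkf_update(ens, obs, obs_err_std, h, rng):
    """Stochastic EnKF analysis step for one scalar observation.
    ens: (n_members, n_state) ensemble of model states,
    h:   observation operator mapping a state vector to predicted discharge."""
    n, _ = ens.shape
    hx = np.array([h(x) for x in ens])                  # predicted observations
    x_mean, hx_mean = ens.mean(0), hx.mean()
    P_xh = (ens - x_mean).T @ (hx - hx_mean) / (n - 1)  # state-obs covariance
    p_hh = np.var(hx, ddof=1) + obs_err_std ** 2        # innovation variance
    K = P_xh / p_hh                                     # Kalman gain
    perturbed = obs + rng.normal(0.0, obs_err_std, n)   # perturbed observations
    return ens + np.outer(perturbed - hx, K)            # updated ensemble

# demo: a 1-D state pulled toward an accurate discharge observation
rng = np.random.default_rng(4)
prior = rng.normal(0.0, 1.0, size=(500, 1))
posterior = enkf_update(prior, obs=2.0, obs_err_std=0.05, h=lambda x: x[0], rng=rng)
```

With an accurate observation the ensemble mean moves almost all the way to the observed value and the spread collapses toward the observation error, which is the behaviour the RMSE reductions above reflect; a localization scheme would simply taper `K` to zero outside the observation's hydrological neighbourhood.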

  3. Integrated model-based retargeting and optical proximity correction

    NASA Astrophysics Data System (ADS)

    Agarwal, Kanak B.; Banerjee, Shayak

    2011-04-01

    Conventional resolution enhancement techniques (RET) are becoming increasingly inadequate at addressing the challenges of subwavelength lithography. In particular, features show high sensitivity to process variation in low-k1 lithography. Process-variation-aware RETs such as process-window OPC (PWOPC) are becoming increasingly important to guarantee high lithographic yield, but such techniques suffer from high runtime impact. An alternative to PWOPC is retargeting, a rule-assisted modification of target layout shapes to improve their process window. However, rule-based retargeting is not a scalable technique, since rules cannot cover the entire search space of two-dimensional shape configurations, especially with technology scaling. In this paper, we propose to integrate the processes of retargeting and optical proximity correction (OPC). We utilize the normalized image log slope (NILS) metric, which is available at no extra computational cost during OPC. We use NILS to guide dynamic target modification between iterations of OPC. We utilize the NILS tagging capabilities of Calibre TCL scripting to identify fragments with low NILS. We then perform NILS binning to assign different magnitudes of retargeting to different NILS bins. NILS is determined both for width, to identify regions of pinching, and for space, to locate regions of potential bridging. We develop an integrated flow for 1x metal lines (M1) that exhibits fewer lithographic hotspots compared to a flow with just OPC and no retargeting. We also observe cases where hotspots that existed in the rule-based retargeting flow are fixed using our methodology. Finally, we demonstrate that such a retargeting methodology does not significantly alter design properties by electrically simulating a latch layout before and after retargeting. We observe less than 1% impact on latch Clk-Q and D-Q delays post-retargeting, which makes this methodology an attractive one for improving shape process windows.
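
    The NILS-binning step can be sketched as follows (the actual flow uses Calibre TCL; this Python stand-in, with invented bin thresholds and bias magnitudes, only illustrates the binning logic):

```python
# Hypothetical NILS-binned retargeting: fragments with lower NILS (worse
# image contrast) receive a larger target-edge move. Thresholds and
# magnitudes below are illustrative, not from the paper.
def retarget_bias(nils):
    """Map a fragment's NILS value to a retargeting magnitude in nm."""
    bins = [(1.0, 4.0), (1.5, 2.0), (2.0, 1.0)]  # (NILS threshold, bias nm)
    for threshold, nm in bins:
        if nils < threshold:
            return nm
    return 0.0  # NILS >= 2.0: target left unchanged

# Three hypothetical fragments tagged with their NILS values
fragments = {"f1": 0.8, "f2": 1.7, "f3": 2.5}
moves = {frag: retarget_bias(n) for frag, n in fragments.items()}
# f1 lands in the worst bin, f2 in the mildest, f3 is not retargeted
```

In the real flow the sign of each move would depend on whether the fragment was tagged for width (pinching) or space (bridging).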

  4. Validation of model-based brain shift correction in neurosurgery via intraoperative magnetic resonance imaging: preliminary results

    NASA Astrophysics Data System (ADS)

    Luo, Ma; Frisken, Sarah F.; Weis, Jared A.; Clements, Logan W.; Unadkat, Prashin; Thompson, Reid C.; Golby, Alexandra J.; Miga, Michael I.

    2017-03-01

    The quality of brain tumor resection surgery is dependent on the spatial agreement between the preoperative image and the intraoperative anatomy. However, brain shift compromises this alignment. Currently, the clinical standard for monitoring brain shift is intraoperative magnetic resonance (iMR) imaging. While iMR provides a better understanding of brain shift, its cost and encumbrance are considerations for medical centers. Hence, we are developing a model-based method that can serve as a complementary technology to address brain shift in standard resections, with resource-intensive cases as referrals for iMR facilities. Our strategy constructs a deformation `atlas' containing potential deformation solutions derived from a biomechanical model that accounts for variables such as cerebrospinal fluid drainage and mannitol effects. Volumetric deformation is estimated with an inverse approach that determines the optimal combination of `atlas' solutions that best matches measured surface deformation. Accordingly, the preoperative image is updated based on the computed deformation field. This study is the latest development to validate our methodology against iMR. Briefly, preoperative and intraoperative MR images of 2 patients were acquired. Homologous surface points were selected on the preoperative and intraoperative scans as measurements of surface deformation and used to drive the inverse problem. To assess model accuracy, subsurface shift of targets between the preoperative and intraoperative states was measured and compared to the model prediction. Considering subsurface shift above 3 mm, the proposed strategy provides an average shift correction of 59% across the 2 cases. While further improvements in both the model and the ability to validate with iMR are desired, the reported results are encouraging.
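
    The inverse step, fitting a combination of precomputed atlas solutions to measured surface deformation, can be sketched as a least-squares problem (the actual method uses a constrained optimization, which this sketch omits; all matrix sizes and data here are synthetic):

```python
import numpy as np

# Columns of M are the surface components of precomputed atlas deformation
# solutions; 'measured' is the observed surface point displacement vector.
rng = np.random.default_rng(0)
M = rng.normal(size=(30, 4))            # 30 surface measurements x 4 solutions
w_true = np.array([0.6, 0.3, 0.1, 0.0]) # synthetic ground-truth weights
measured = M @ w_true + rng.normal(0, 0.01, 30)  # add small measurement noise

# Solve for the combination weights in the least-squares sense; the same
# weights would then be applied to the volumetric deformation fields to
# update the preoperative image.
w, *_ = np.linalg.lstsq(M, measured, rcond=None)
```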

  5. Hydraulic correction method (HCM) to enhance the efficiency of SRTM DEM in flood modeling

    NASA Astrophysics Data System (ADS)

    Chen, Huili; Liang, Qiuhua; Liu, Yong; Xie, Shuguang

    2018-04-01

    The Digital Elevation Model (DEM) is one of the most important controlling factors determining the simulation accuracy of hydraulic models. However, currently available global topographic data are subject to limitations for 2-D hydraulic modeling, mainly due to vegetation bias, random errors and insufficient spatial resolution. A hydraulic correction method (HCM) for the SRTM DEM is proposed in this study to improve modeling accuracy. First, we employ the global vegetation-corrected DEM (i.e. the Bare-Earth DEM), developed from the SRTM DEM using both vegetation height data and the SRTM vegetation signal, to address vegetation bias. Then, a newly released DEM that removes both vegetation bias and random errors (i.e. the Multi-Error Removed DEM) is employed to overcome the limitation of height errors. Finally, an approach to correct the Multi-Error Removed DEM is presented to compensate for insufficient spatial resolution, ensuring flow connectivity of the river networks. The approach involves: (a) extracting river networks from the Multi-Error Removed DEM using an automated algorithm in ArcGIS; (b) correcting the location and layout of extracted streams with the aid of the Google Earth platform and remote sensing imagery; and (c) removing the positive biases of raised segments in the river networks based on bed slope to generate the hydraulically corrected DEM. The proposed HCM utilizes easily available data and tools to improve the flow connectivity of river networks without manual adjustment. To demonstrate the advantages of HCM, an extreme flood event in the Huifa River Basin (China) is simulated on the original DEM, the Bare-Earth DEM, the Multi-Error Removed DEM, and the hydraulically corrected DEM using an integrated hydrologic-hydraulic model. A comparative analysis is subsequently performed to assess the simulation accuracy and performance of the four DEMs, with favorable results obtained on the hydraulically corrected DEM.
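
    Step (c), removing positive biases of raised segments along a stream, can be sketched as enforcing non-increasing elevation downstream. This is a simplification of the slope-based correction described in the record; the profile values are illustrative:

```python
import numpy as np

def hydro_correct_profile(z):
    """Lower raised cells so that elevation never increases downstream.
    z: stream-cell elevations ordered upstream -> downstream. Each cell is
    capped at the running minimum of all cells upstream of it, a crude
    stand-in for the paper's bed-slope-based bias removal."""
    return np.minimum.accumulate(np.asarray(z, float))

profile = [105.0, 104.2, 104.9, 103.5, 103.8, 102.0]  # two raised segments
corrected = hydro_correct_profile(profile)
# the raised cells (104.9 and 103.8) are lowered to 104.2 and 103.5
```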

  6. Fiducial marker-based correction for involuntary motion in weight-bearing C-arm CT scanning of knees. Part I. Numerical model-based optimization

    PubMed Central

    Choi, Jang-Hwan; Fahrig, Rebecca; Keil, Andreas; Besier, Thor F.; Pal, Saikat; McWalter, Emily J.; Beaupré, Gary S.; Maier, Andreas

    2013-01-01

    Purpose: Human subjects in standing positions are apt to show much more involuntary motion than in supine positions. The authors aimed to simulate complicated, realistic lower-body movement using the four-dimensional (4D) digital extended cardiac-torso (XCAT) phantom. The authors also investigated fiducial marker-based motion compensation methods in two-dimensional (2D) and three-dimensional (3D) space. The level of involuntary movement-induced artifacts and the image quality improvement were investigated after applying each method. Methods: An optical tracking system with eight cameras and seven retroreflective markers enabled us to track involuntary motion of the lower body of nine healthy subjects holding a squat position at 60° of flexion. The XCAT-based knee model was developed using the 4D XCAT phantom and the optical tracking data acquired at 120 Hz. The authors divided the lower body in the XCAT into six parts and applied a unique affine transform to each so that the motion (6 degrees of freedom) could be synchronized with the optical markers’ locations at each time frame. The control points of the XCAT were tessellated into triangles and 248 projection images were created based on the intersection of each ray with the model, assuming monochromatic absorption. The tracking data sets with the largest motion (Subject 2) and the smallest motion (Subject 5) among the nine data sets were used to animate the XCAT knee model. The authors defined eight skin control points well distributed around the knees as pseudo-fiducial markers, which functioned as a reference in motion correction. Motion compensation was done in the following ways: (1) simple projection shifting in 2D, (2) deformable projection warping in 2D, and (3) rigid body warping in 3D. Graphics-hardware-accelerated filtered backprojection was implemented and combined with the three correction methods in order to speed up the simulation process. Correction fidelity was evaluated as a function of number of markers used (4–12) and
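
    Method (1), simple projection shifting in 2D, can be sketched as translating each projection by the displacement of the fiducial-marker centroid relative to its reference position (the marker coordinates below are illustrative):

```python
import numpy as np

def projection_shift(markers_ref, markers_obs):
    """2D motion compensation by simple projection shifting: return the
    (dx, dy) translation that moves the observed marker centroid back
    onto the reference centroid."""
    ref = np.asarray(markers_ref, float)
    obs = np.asarray(markers_obs, float)
    return ref.mean(axis=0) - obs.mean(axis=0)

ref = [[10, 10], [50, 12], [30, 40]]   # marker positions in a reference frame
obs = [[13, 9], [53, 11], [33, 39]]    # subject moved +3 px in x, -1 px in y
dx, dy = projection_shift(ref, obs)    # shift to apply to the projection
```

Methods (2) and (3) generalize this rigid translation to a deformable 2D warp and a rigid 3D transform, respectively.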

  7. Antibody-based therapeutics to watch in 2011

    PubMed Central

    2011-01-01

    This overview of 25 monoclonal antibody (mAb) and five Fc fusion protein therapeutics provides brief descriptions of the candidates, recently published clinical study results and on-going Phase 3 studies. In alphanumeric order, the 2011 therapeutic antibodies to watch list comprises AIN-457, bapineuzumab, brentuximab vedotin, briakinumab, dalotuzumab, epratuzumab, farletuzumab, girentuximab (WX-G250), naptumomab estafenatox, necitumumab, obinutuzumab, otelixizumab, pagibaximab, pertuzumab, ramucirumab, REGN88, reslizumab, solanezumab, T1h, teplizumab, trastuzumab emtansine, tremelimumab, vedolizumab, zalutumumab and zanolimumab. In alphanumeric order, the 2011 Fc fusion protein therapeutics to watch list comprises aflibercept, AMG-386, atacicept, Factor VIII-Fc and Factor IX-Fc. Commercially-sponsored mAb and Fc fusion therapeutics that have progressed only as far as Phase 2/3 or 3 were included. Candidates undergoing regulatory review or products that have been approved may also be in Phase 3 studies, but these were excluded. Due to the large body of primary literature about the candidates, only selected references are given and results from recent publications and articles that were relevant to Phase 3 studies are emphasized. Current as of September 2010, the information presented here will serve as a baseline against which future progress in the development of antibody-based therapeutics can be measured. PMID:21051951

  8. Modeling coherent errors in quantum error correction

    NASA Astrophysics Data System (ADS)

    Greenbaum, Daniel; Dutton, Zachary

    2018-01-01

    Analysis of quantum error correcting codes is typically done using a stochastic, Pauli channel error model for describing the noise on physical qubits. However, it was recently found that coherent errors (systematic rotations) on physical data qubits result in both physical and logical error rates that differ significantly from those predicted by a Pauli model. Here we examine the accuracy of the Pauli approximation for noise containing coherent errors (characterized by a rotation angle ɛ) under the repetition code. We derive an analytic expression for the logical error channel as a function of arbitrary code distance d and concatenation level n, in the small error limit. We find that coherent physical errors result in logical errors that are partially coherent and therefore non-Pauli. However, the coherent part of the logical error is negligible at fewer than ɛ^{-(d^n-1)} error correction cycles when the decoder is optimized for independent Pauli errors, thus providing a regime of validity for the Pauli approximation. Above this number of correction cycles, the persistent coherent logical error will cause logical failure more quickly than the Pauli model would predict, and this may need to be combated with coherent suppression methods at the physical level or larger codes.
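
    The core distinction, coherent errors accumulating in amplitude while Pauli errors accumulate in probability, can be illustrated for a single unprotected qubit (this sketch is not the repetition-code analysis of the paper; the values of ε and n are arbitrary):

```python
import numpy as np

eps, n = 0.02, 50  # small coherent X over-rotation angle, repeated n times

# Coherent: the n rotations compose into a single rotation by n*eps, so the
# failure probability grows quadratically in n for small angles.
p_coherent = np.sin(n * eps / 2) ** 2

# Pauli-twirled approximation: each step is an independent X flip with
# probability sin^2(eps/2); failure means an odd number of flips.
p_flip = np.sin(eps / 2) ** 2
p_pauli = (1 - (1 - 2 * p_flip) ** n) / 2

# p_coherent ~ (n*eps/2)^2 while p_pauli ~ n*(eps/2)^2, so for large n the
# coherent channel fails far more often than its Pauli approximation.
```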

  9. Interactive Design Strategy for a Multi-Functional PAMAM Dendrimer-Based Nano-Therapeutic Using Computational Models and Experimental Analysis

    PubMed Central

    Lee, Inhan; Williams, Christopher R.; Athey, Brian D.; Baker, James R.

    2010-01-01

    Molecular dynamics simulations, along with chemical analyses, were performed on the final multi-functional nano-therapeutic based on a poly(amidoamine) (PAMAM) dendrimer and on all intermediates in the process of generating it. The actual structures of the dendrimers were predicted based on potentiometric titration, gel permeation chromatography, and NMR. The chemical analyses determined the numbers of functional molecules, based on the actual structure of the dendrimer. Molecular dynamics simulations calculated the configurations of the intermediates and the radial distributions of functional molecules, based on their numbers. This interactive process between the simulation results and the chemical analyses provided a strategy for designing the next reaction steps and for gaining insight into the products at each chemical reaction step. PMID:20700476

  10. Quantile Mapping Bias correction for daily precipitation over Vietnam in a regional climate model

    NASA Astrophysics Data System (ADS)

    Trinh, L. T.; Matsumoto, J.; Ngo-Duc, T.

    2017-12-01

    In the past decades, Regional Climate Models (RCMs) have developed significantly, allowing climate simulations to be conducted at higher resolution. However, RCM output often contains biases when compared with observations. Therefore, statistical correction methods are commonly employed to reduce model biases. In this study, outputs of the Regional Climate Model (RegCM) version 4.3, driven by CNRM-CM5 global products, were evaluated with and without the Quantile Mapping (QM) bias correction method. The model domain covered the area from 90°E to 145°E and from 15°S to 40°N with a horizontal resolution of 25 km. The QM bias correction was implemented using the Vietnam Gridded precipitation dataset (VnGP) and the outputs of the RegCM historical run for the period 1986-1995, and then validated for the period 1996-2005. Based on statistics of spatial correlation and intensity distributions, the QM method showed a significant improvement in rainfall compared to the uncorrected output. Improvements in both time and space were observed in all seasons and all climatic sub-regions of Vietnam. Moreover, not only the rainfall amount but also extreme indices such as R10mm, R20mm, R50mm, CDD, CWD, R95pTOT and R99pTOT were much better after the correction. The results suggest that the QM correction method should be adopted for projections of future precipitation over Vietnam.
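
    An empirical quantile-mapping correction of this kind can be sketched as follows (synthetic data; the study itself used VnGP observations and RegCM output):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_values):
    """Empirical quantile-mapping bias correction: send each model value to
    the observed value at the same quantile of the historical period."""
    q = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model_hist, q)   # model climatology quantiles
    obs_q = np.quantile(obs_hist, q)       # observed climatology quantiles
    # Locate each value on the model CDF, then invert through the observed CDF
    pos = np.interp(model_values, model_q, q)
    return np.interp(pos, q, obs_q)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, 1000)            # pseudo-observed daily rainfall
model = obs * 0.5 + 1.0                    # model with a systematic bias
corrected = quantile_map(model, obs, model)
```

In practice the mapping is calibrated on one period (here it would be 1986-1995) and applied to another (1996-2005 or a future projection).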

  11. Magnetic resonance imaging for monitoring therapeutic response in a transgenic mouse model of Alzheimer's disease using voxel-based analysis of amyloid plaques.

    PubMed

    Kim, Jae-Hun; Ha, Tae Lin; Im, Geun Ho; Yang, Jehoon; Seo, Sang Won; Chung, Julius Juhyun; Chae, Sun Young; Lee, In Su; Lee, Jung Hee

    2014-03-05

    In this study, we have shown the potential of a voxel-based analysis for imaging amyloid plaques and its utility in monitoring therapeutic response in Alzheimer's disease (AD) mice, using manganese oxide nanoparticles conjugated with an antibody against the Aβ1-40 peptide (HMON-abAβ40). T1-weighted MR brain images of a drug-treated AD group (n=7), a nontreated AD group (n=7), and a wild-type group (n=7) were acquired using a 7.0 T MRI system before (D-1), 24 h after (D+1), and 72 h after (D+3) injection of the HMON-abAβ40 contrast agent. For treatment, DAPT was injected intramuscularly into the AD transgenic mice (50 mg/kg of body weight). For voxel-based analysis, the skull-stripped mouse brain images were spatially normalized, and voxel intensities were corrected to reduce intensity differences across scans from different mice. Statistical analysis showed higher normalized MR signal intensity in the frontal cortex and hippocampus of AD mice than in wild-type mice on D+1 and D+3 (P<0.01, uncorrected for multiple comparisons). After treatment of the AD mice, the normalized MR signal intensity in the frontal cortex and hippocampus decreased significantly in comparison with nontreated AD mice on D+1 and D+3 (P<0.01, uncorrected for multiple comparisons). These results were confirmed by histological analysis using thioflavin staining. This unique strategy allows us to detect brain regions subject to amyloid plaque deposition and has potential for human application in monitoring therapeutic response during drug development in AD.

  12. Statistical Downscaling and Bias Correction of Climate Model Outputs for Climate Change Impact Assessment in the U.S. Northeast

    NASA Technical Reports Server (NTRS)

    Ahmed, Kazi Farzan; Wang, Guiling; Silander, John; Wilson, Adam M.; Allen, Jenica M.; Horton, Radley; Anyah, Richard

    2013-01-01

    Statistical downscaling can be used to efficiently downscale a large number of General Circulation Model (GCM) outputs to a fine temporal and spatial scale. To facilitate regional impact assessments, this study statistically downscales (to 1/8° spatial resolution) and corrects the bias of daily maximum and minimum temperature and daily precipitation data from six GCMs and four Regional Climate Models (RCMs) for the northeast United States (US) using the Statistical Downscaling and Bias Correction (SDBC) approach. Based on these downscaled data from multiple models, five extreme indices were analyzed for the future climate to quantify future changes of climate extremes. For a subset of models and indices, results based on raw and bias-corrected model outputs for the present-day climate were compared with observations, which demonstrated that bias correction is important not only for GCM outputs, but also for RCM outputs. For the future climate, bias correction led to a higher level of agreement among the models in predicting the magnitude and capturing the spatial pattern of the extreme climate indices. We found that the incorporation of dynamical downscaling as an intermediate step does not lead to considerable differences in the results of statistical downscaling for the study domain.

  13. Multi-pose facial correction based on Gaussian process with combined kernel function

    NASA Astrophysics Data System (ADS)

    Shi, Shuyan; Ji, Ruirui; Zhang, Fan

    2018-04-01

    In order to improve the recognition rate across various poses, this paper proposes a facial correction method based on a Gaussian process, which builds a nonlinear regression model between the frontal and side faces using a combined kernel function. Face images with horizontal pose angles from -45° to +45° can be properly corrected to frontal faces. Finally, a Support Vector Machine is employed for face recognition. Experiments on the CAS-PEAL-R1 face database show that the Gaussian process can weaken the influence of pose changes and improve the accuracy of face recognition to a certain extent.

  14. Delay correction model for estimating bus emissions at signalized intersections based on vehicle specific power distributions.

    PubMed

    Song, Guohua; Zhou, Xixi; Yu, Lei

    2015-05-01

    The intersection is one of the biggest emission points for buses and also a high-exposure site for people. Several traffic performance indexes have been developed and are widely used for intersection evaluation. However, few studies have focused on the relationship between these indexes and emissions at intersections. This paper proposes a model that relates emissions to two commonly used measures of effectiveness (i.e. delay time and number of stops) by using bus activity data and emission data at intersections. First, with a large number of field instantaneous emission data and corresponding activity data collected by a Portable Emission Measurement System (PEMS), emission rates are derived for different vehicle specific power (VSP) bins. Then, 2002 sets of trajectory data, equivalent to about 140,000 sets of second-by-second activity data, are obtained from Global Positioning System (GPS)-equipped diesel buses in Beijing. The delay and the emission factors of each trajectory are estimated. Next, using baseline emission factors for two types of intersections, e.g. the Arterial @ Arterial intersection and the Arterial @ Collector intersection, delay correction factors are calculated for the two types of intersections at different congestion levels. Finally, delay correction models are established for adjusting emission factors for each type of intersection and different numbers of stops. A comparative analysis between estimated and field emission factors demonstrates that the delay correction model is reliable. Copyright © 2015 Elsevier B.V. All rights reserved.
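
    VSP binning of second-by-second activity data can be sketched as follows. The coefficients shown are the widely cited light-duty VSP form (Jiménez-Palacios), used here only for illustration; bus-specific coefficients and the paper's actual bin definitions would differ:

```python
def vsp(v, a, grade=0.0):
    """Vehicle specific power (kW/t). v: speed in m/s, a: acceleration in
    m/s^2, grade: road grade as a fraction. Light-duty coefficient set,
    illustrative only; heavy-duty buses use different coefficients."""
    return v * (1.1 * a + 9.81 * grade + 0.132) + 0.000302 * v**3

def vsp_bin(value, width=1.0):
    """Assign a VSP value to an integer-indexed bin of the given width."""
    return int(value // width)

# Example: a bus cruising at 10 m/s on flat ground with no acceleration
p = vsp(10.0, 0.0)   # rolling + aerodynamic terms only
b = vsp_bin(p)
```

Each second of GPS-derived activity is assigned a bin this way, and the PEMS-derived emission rate for that bin is summed along the trajectory to give its emission factor.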

  15. Insar Unwrapping Error Correction Based on Quasi-Accurate Detection of Gross Errors (quad)

    NASA Astrophysics Data System (ADS)

    Kang, Y.; Zhao, C. Y.; Zhang, Q.; Yang, C. S.

    2018-04-01

    Unwrapping error is a common error in InSAR processing, which can seriously degrade the accuracy of monitoring results. Based on a gross-error correction method, quasi-accurate detection (QUAD), a method for the automatic correction of unwrapping errors is established in this paper. The method identifies and corrects unwrapping errors by establishing a functional model between the true errors and the interferograms. The basic principle and processing steps are presented. The method is then compared with the L1-norm method on simulated data. Results show that both methods can effectively suppress unwrapping errors when the proportion of unwrapping errors is low, and that the two methods can complement each other when the proportion is relatively high. Finally, the method is tested on real SAR data for phase unwrapping error correction. Results show that the new method can successfully correct phase unwrapping errors in practical applications.
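
    Unwrapping errors appear as spurious integer multiples of 2π. A crude 1-D illustration of detecting and removing such a jump by gradient thresholding (the QUAD method itself builds a functional model between true errors and interferograms, which this sketch does not reproduce):

```python
import numpy as np

def correct_2pi_jumps(phase, threshold=np.pi):
    """Detect gradient outliers larger than `threshold` and subtract the
    nearest integer multiple of 2*pi from everything downstream of each,
    a crude stand-in for gross-error detection in unwrapped phase."""
    out = np.asarray(phase, float).copy()
    d = np.diff(out)
    jumps = np.round(d / (2 * np.pi)) * (np.abs(d) > threshold)
    out[1:] -= 2 * np.pi * np.cumsum(jumps)
    return out

true_phase = np.linspace(0, 1.0, 50)   # smooth deformation-like signal
bad = true_phase.copy()
bad[30:] += 2 * np.pi                  # inject one unwrapping error
fixed = correct_2pi_jumps(bad)
```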

  16. Correction to Hill (2005).

    PubMed

    Hill, Clara E

    2006-01-01

    Reports an error in "Therapist Techniques, Client Involvement, and the Therapeutic Relationship: Inextricably Intertwined in the Therapy Process" by Clara E. Hill (Psychotherapy: Theory, Research, Practice, Training, 2005 Win, Vol 42(4), 431-442). An author's name was incorrectly spelled in a reference. The correct reference is presented. (The following abstract of the original article appeared in record 2006-03309-003.) I propose that therapist techniques, client involvement, and the therapeutic relationship are inextricably intertwined and need to be considered together in any discussion of the therapy process. Furthermore, I present a pantheoretical model of how these three variables evolve over four stages of successful therapy: initial impression formation, beginning the therapy (involves the components of facilitating client exploration and developing case conceptualization and treatment strategies), the core work of therapy (involves the components of theory-relevant tasks and overcoming obstacles), and termination. Theoretical propositions as well as implications for training and research are presented. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  17. Entropy corrected holographic dark energy models in modified gravity

    NASA Astrophysics Data System (ADS)

    Jawad, Abdul; Azhar, Nadeem; Rani, Shamaila

    We consider the power law and the entropy corrected holographic dark energy (HDE) models with Hubble horizon in the dynamical Chern-Simons modified gravity. We explore various cosmological parameters and planes in this framework. The Hubble parameter lies within the consistent range at the present and later epochs for both entropy corrected models. The deceleration parameter explains the accelerated expansion of the universe. The equation of state (EoS) parameter corresponds to the quintessence and cold dark matter (ΛCDM) limits. The ωΛ-ωΛ′ plane approaches the ΛCDM limit and the freezing region in both entropy corrected models. The statefinder parameters are consistent with the ΛCDM limit and dark energy (DE) models. The generalized second law of thermodynamics remains valid for all values of the interaction parameter. It is interesting to mention here that our results for the Hubble parameter, the EoS parameter and the ωΛ-ωΛ′ plane are consistent with present observations such as Planck, WP, BAO, H0, SNLS and nine-year WMAP.

  18. Tailoring drug release rates in hydrogel-based therapeutic delivery applications using graphene oxide

    PubMed Central

    Zhi, Z. L.; Craster, R. V.

    2018-01-01

    Graphene oxide (GO) is increasingly used for controlling mass diffusion in hydrogel-based drug delivery applications. On the macro-scale, the density of GO in the hydrogel is a critical parameter for modulating drug release. Here, we investigate the diffusion of a peptide drug through a network of GO membranes and GO-embedded hydrogels, modelled as porous matrices resembling both laminated and ‘house of cards’ structures. Our experiments use a therapeutic peptide and show a tunable nonlinear dependence of the peptide concentration upon time. We establish models using numerical simulations with a diffusion equation accounting for the photo-thermal degradation of fluorophores and an effective percolation model to simulate the experimental data. The modelling yields an interpretation of the control of drug diffusion through GO membranes, which is extended to the diffusion of the peptide in GO-embedded agarose hydrogels. Varying the density of micron-sized GO flakes allows for fine control of the drug diffusion. We further show that both GO density and size influence the drug release rate. The ability to tune the density of hydrogel-like GO membranes to control drug release rates has exciting implications, offering guidelines for tailoring release profiles in hydrogel-based therapeutic delivery applications. PMID:29445040

  19. Topological quantum error correction in the Kitaev honeycomb model

    NASA Astrophysics Data System (ADS)

    Lee, Yi-Chan; Brell, Courtney G.; Flammia, Steven T.

    2017-08-01

    The Kitaev honeycomb model is an approximate topological quantum error correcting code in the same phase as the toric code, but requiring only a 2-body Hamiltonian. As a frustrated spin model, it is well outside the commuting models of topological quantum codes that are typically studied, but its exact solubility makes it more amenable to analysis of effects arising in this noncommutative setting than a generic topologically ordered Hamiltonian. Here we study quantum error correction in the honeycomb model using both analytic and numerical techniques. We first prove explicit exponential bounds on the approximate degeneracy, local indistinguishability, and correctability of the code space. These bounds are tighter than can be achieved using known general properties of topological phases. Our proofs are specialized to the honeycomb model, but some of the methods may nonetheless be of broader interest. Following this, we numerically study noise caused by thermalization processes in the perturbative regime close to the toric code renormalization group fixed point. The appearance of non-topological excitations in this setting has no significant effect on the error correction properties of the honeycomb model in the regimes we study. Although the behavior of this model is found to be qualitatively similar to that of the standard toric code in most regimes, we find numerical evidence of an interesting effect in the low-temperature, finite-size regime where a preferred lattice direction emerges and anyon diffusion is geometrically constrained. We expect this effect to yield an improvement in the scaling of the lifetime with system size as compared to the standard toric code.

  20. Simulation-based artifact correction (SBAC) for metrological computed tomography

    NASA Astrophysics Data System (ADS)

    Maier, Joscha; Leinweber, Carsten; Sawall, Stefan; Stoschus, Henning; Ballach, Frederic; Müller, Tobias; Hammer, Michael; Christoph, Ralf; Kachelrieß, Marc

    2017-06-01

    Computed tomography (CT) is a valuable tool for the metrological assessment of industrial components. However, the application of CT to the investigation of highly attenuating objects or multi-material components is often restricted by the presence of CT artifacts caused by beam hardening, x-ray scatter, off-focal radiation, partial volume effects or the cone-beam reconstruction itself. In order to overcome this limitation, this paper proposes an approach to calculate a correction term that compensates for the contribution of artifacts and thus enables an appropriate assessment of these components using CT. To this end, we make use of computer simulations of the CT measurement process. Based on an appropriate model of the object, e.g. an initial reconstruction or a CAD model, two simulations are carried out. One simulation considers all physical effects that cause artifacts, using dedicated analytic methods as well as Monte Carlo-based models. The other represents an ideal CT measurement, i.e. a measurement in parallel-beam geometry with a monochromatic, point-like x-ray source and no x-ray scattering. Thus, the difference between these simulations is an estimate of the present artifacts and can be used to correct the acquired projection data or the corresponding CT reconstruction, respectively. The performance of the proposed approach is evaluated using simulated as well as measured data of single- and multi-material components. Our approach yields CT reconstructions that are nearly free of artifacts and thereby clearly outperforms commonly used artifact reduction algorithms in terms of image quality. A comparison against tactile reference measurements demonstrates the ability of the proposed approach to increase the accuracy of the metrological assessment significantly.
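
    The correction term can be sketched in a few lines (the numbers are illustrative): the difference between the physics-complete simulation and the ideal simulation estimates the artifact contribution, which is then subtracted from the measurement.

```python
import numpy as np

# SBAC on projection data, schematically. Values are illustrative only.
measured  = np.array([5.0, 5.4, 6.1])  # acquired projections
sim_full  = np.array([5.1, 5.5, 6.2])  # simulation with all artifact physics
sim_ideal = np.array([4.9, 5.2, 5.8])  # ideal CT simulation (monochromatic,
                                       # parallel-beam, scatter-free)
corrected = measured - (sim_full - sim_ideal)
```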

  1. Therapeutic Targeting of Siglecs using Antibody- and Glycan-based Approaches

    PubMed Central

    Angata, Takashi; Nycholat, Corwin M.; Macauley, Matthew S.

    2015-01-01

    The sialic acid-binding immunoglobulin-like lectins (Siglecs) are a family of immunomodulatory receptors whose functions are regulated by their glycan ligands. Siglecs are attractive therapeutic targets because of their cell-type specific expression pattern, endocytic properties, high expression on certain lymphomas/leukemias, and ability to modulate receptor signaling. Siglec-targeting approaches with therapeutic potential encompass antibody- and glycan-based strategies. Several antibody-based therapies are in clinical trials and continue to be developed for the treatment of lymphoma/leukemia and autoimmune disease, while the therapeutic potential of glycan-based strategies for cargo-delivery and immunomodulation is a promising new approach. Here, we review these strategies with special emphasis on emerging approaches and disease areas that may benefit from targeting the Siglec family. PMID:26435210

  2. Modeling correction of severe urea cycle defects in the growing murine liver using a hybrid recombinant adeno-associated virus/piggyBac transposase gene delivery system.

    PubMed

    Cunningham, Sharon C; Siew, Susan M; Hallwirth, Claus V; Bolitho, Christine; Sasaki, Natsuki; Garg, Gagan; Michael, Iacovos P; Hetherington, Nicola A; Carpenter, Kevin; de Alencastro, Gustavo; Nagy, Andras; Alexander, Ian E

    2015-08-01

    Liver-targeted gene therapy based on recombinant adeno-associated viral vectors (rAAV) shows promising therapeutic efficacy in animal models and adult-focused clinical trials. This promise, however, is not directly translatable to the growing liver, where high rates of hepatocellular proliferation are accompanied by loss of episomal rAAV genomes and subsequently a loss in therapeutic efficacy. We have developed a hybrid rAAV/piggyBac transposon vector system combining the highly efficient liver-targeting properties of rAAV with stable piggyBac-mediated transposition of the transgene into the hepatocyte genome. Transposition efficiency was first tested using an enhanced green fluorescent protein expression cassette following delivery to newborn wild-type mice, with a 20-fold increase in stably gene-modified hepatocytes observed 4 weeks posttreatment compared to traditional rAAV gene delivery. We next modeled the therapeutic potential of the system in the context of severe urea cycle defects. A single treatment in the perinatal period was sufficient to confer robust and stable phenotype correction in the ornithine transcarbamylase-deficient Spf(ash) mouse and the neonatal lethal argininosuccinate synthetase knockout mouse. Finally, transposon integration patterns were analyzed, revealing 127,386 unique integration sites which conformed to previously published piggyBac data. Using a hybrid rAAV/piggyBac transposon vector system, we achieved stable therapeutic protection in two urea cycle defect mouse models; a clinically conceivable early application of this technology in the management of severe urea cycle defects could be as a bridging therapy while awaiting liver transplantation; further improvement of the system will result from the development of highly human liver-tropic capsids, the use of alternative strategies to achieve transient transposase expression, and engineered refinements in the safety profile of piggyBac transposase-mediated integration. 

  3. ITER Side Correction Coil Quench model and analysis

    NASA Astrophysics Data System (ADS)

    Nicollet, S.; Bessette, D.; Ciazynski, D.; Duchateau, J. L.; Gauthier, F.; Lacroix, B.

    2016-12-01

    Previous thermohydraulic studies performed for the ITER TF, CS and PF magnet systems have brought some important information on the detection and consequences of a quench as a function of the initial conditions (deposited energy, heated length). Even if the temperature margin of the Correction Coils is high, their behavior during a quench should also be studied, since a quench is likely to be triggered by potential anomalies in joints, a ground fault on the instrumentation wires, etc. A model has been developed with the SuperMagnet Code (Bagnasco et al., 2010) for a Side Correction Coil (SCC2) with four pancakes cooled in parallel, each of them represented by a Thea module (with the proper Cable In Conduit Conductor characteristics). All the other coils of the PF cooling loop, hydraulically connected in parallel (top/bottom correction coils and six Poloidal Field Coils), are modeled by Flower modules with equivalent hydraulic properties. The model and the analysis results are presented for five quench initiation cases with/without fast discharge: two quenches initiated by a heat input to the innermost turn of one pancake (cases 1 and 2), two quenches initiated at the innermost turns of four pancakes (cases 3 and 4), and a fifth case in which the quench is initiated at the middle turn of one pancake. The impact on the cooling circuit, e.g. the exceedance of the opening pressure of the quench relief valves, is detailed in the case of an undetected quench (i.e. no discharge of the magnet). Particular attention is also paid to a possible secondary quench detection system based on measured thermohydraulic signals (pressure, temperature and/or helium mass flow rate). The maximum cable temperature achieved in the case of a fast current discharge (primary detection by voltage) is compared to the design hot spot criterion of 150 K, which includes the contribution of helium and jacket.

  4. Spherical Nucleic Acids as Intracellular Agents for Nucleic Acid Based Therapeutics

    NASA Astrophysics Data System (ADS)

    Hao, Liangliang

    Recent functional discoveries concerning the noncoding sequences of the human genome and transcriptome could lead to revolutionary treatment modalities, because noncoding RNAs (ncRNAs) can be applied as therapeutic agents to manipulate disease-causing genes. To date, few nucleic acid-based therapeutics have been translated into the clinic due to challenges in delivering the oligonucleotide agents in an effective, cell-specific, and non-toxic fashion. Unmodified oligonucleotide agents are destroyed rapidly in biological fluids by enzymatic degradation and have difficulty crossing the plasma membrane without the aid of transfection reagents, which often cause inflammatory, cytotoxic, or immunogenic side effects. Spherical nucleic acids (SNAs), nanoparticles consisting of densely organized and highly oriented oligonucleotides, offer one possible solution to these problems in both the antisense and RNA interference (RNAi) pathways. The unique three-dimensional architecture of SNAs protects the bioactive oligonucleotides from nonspecific degradation during delivery and supports their targeting of class A scavenger receptors and endocytosis via a lipid-raft-dependent, caveolae-mediated pathway. Owing to their unique structure, SNAs are able to cross cell membranes and regulate target gene expression as a single entity, without triggering the cellular innate immune response. Herein, my thesis has focused on understanding the interactions between SNAs and cellular components and on developing SNA-based nanostructures with improved therapeutic capabilities. Specifically, I developed a novel SNA-based, nanoscale agent for delivery of therapeutic oligonucleotides to manipulate microRNAs (miRNAs), the endogenous post-transcriptional gene regulators. I investigated the role of SNAs targeting miRNAs in anti-cancer and anti-inflammation responses in cells and in in vivo murine disease models via systemic injection. Furthermore, I explored using different strategies to construct

  5. Transport Corrections in Nodal Diffusion Codes for HTR Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abderrafi M. Ougouag; Frederick N. Gleicher

    2010-08-01

    The cores and reflectors of High Temperature Reactors (HTRs) of the Next Generation Nuclear Plant (NGNP) type are dominantly diffusive media from the point of view of the behavior of the neutrons and their migration between the various structures of the reactor. This means that neutron diffusion theory is sufficient for modeling most features of such reactors and transport theory may not be needed for most applications. Of course, the above statement assumes the availability of homogenized diffusion theory data. The statement is true for most situations but not all. Two features of NGNP-type HTRs require that the diffusion theory-based solution be corrected for local transport effects. These two cases are the treatment of burnable poisons (BP) in the case of the prismatic block reactors and, for both pebble bed reactor (PBR) and prismatic block reactor (PMR) designs, that of control rods (CR) embedded in non-multiplying regions near the interface between fueled zones and said non-multiplying zones. The need for transport correction arises because diffusion theory-based solutions appear not to provide sufficient fidelity in these situations.

  6. Advanced Corrections for InSAR Using GPS and Numerical Weather Models

    NASA Astrophysics Data System (ADS)

    Cossu, F.; Foster, J. H.; Amelung, F.; Varugu, B. K.; Businger, S.; Cherubini, T.

    2017-12-01

    We present results from an investigation into the application of numerical weather models for generating tropospheric correction fields for Interferometric Synthetic Aperture Radar (InSAR). We apply the technique to data acquired from a UAVSAR campaign as well as from the CosmoSkyMed satellites. The complex spatial and temporal changes in the atmospheric propagation delay of the radar signal remain the single biggest factor limiting InSAR's potential for hazard monitoring and mitigation. A new generation of InSAR systems is being built and launched, and optimizing the science and hazard applications of these systems requires advanced methodologies to mitigate tropospheric noise. We use the Weather Research and Forecasting (WRF) model to generate a 900 m spatial resolution atmospheric model covering the Big Island of Hawaii and an even higher, 300 m resolution grid over the Mauna Loa and Kilauea volcanoes. By comparing a range of approaches, from the simplest, using reanalyses based on typically available meteorological observations, through to the "kitchen-sink" approach of assimilating all relevant data sets into our custom analyses, we examine the impact of the additional data sets on the atmospheric models and their effectiveness in correcting InSAR data. We focus particularly on the assimilation of information from the more than 60 GPS sites on the island. We ingest zenith tropospheric delay estimates from these sites directly into the WRF analyses, and also perform double-difference tomography using the phase residuals from the GPS processing to robustly incorporate heterogeneous information from the GPS data into the atmospheric models. We assess our performance through comparisons of our atmospheric models with external observations not ingested into the model, and through the effectiveness of the derived phase screens in reducing InSAR variance. Comparison of the InSAR data, our atmospheric analyses, and assessments of the active local and mesoscale

  7. Filtering method of star control points for geometric correction of remote sensing image based on RANSAC algorithm

    NASA Astrophysics Data System (ADS)

    Tan, Xiangli; Yang, Jungang; Deng, Xinpu

    2018-04-01

    In the process of geometric correction of remote sensing images, a large number of redundant control points can result in low correction accuracy. In order to solve this problem, a control-point filtering algorithm based on RANdom SAmple Consensus (RANSAC) is proposed. The basic idea of the RANSAC algorithm is to use the smallest possible data set to estimate the model parameters and then enlarge this set with consistent data points. In this paper, unlike traditional methods of geometric correction using Ground Control Points (GCPs), simulation experiments are carried out to correct remote sensing images using visible stars as control points. In addition, the accuracy of geometric correction without Star Control Point (SCP) optimization is also shown. The experimental results show that the SCP filtering method based on the RANSAC algorithm greatly improves the accuracy of remote sensing image correction.
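
    The RANSAC loop described in the abstract can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: it hypothesizes an affine mapping from a random minimal sample of three control-point pairs, counts inliers against a residual tolerance, and keeps the largest consensus set.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src -> dst (N x 2 arrays)."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])            # N x 3 design matrix
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)   # 3 x 2 affine coefficients
    return coef

def ransac_filter(src, dst, n_iter=500, tol=1.0, seed=0):
    """Return a boolean inlier mask over the control-point pairs."""
    rng = np.random.default_rng(seed)
    n = src.shape[0]
    best_mask = np.zeros(n, dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(n, size=3, replace=False)   # minimal set for an affine
        coef = fit_affine(src[idx], dst[idx])
        pred = np.hstack([src, np.ones((n, 1))]) @ coef
        mask = np.linalg.norm(pred - dst, axis=1) < tol
        if mask.sum() > best_mask.sum():             # keep largest consensus set
            best_mask = mask
    return best_mask
```

    The surviving inliers would then be used (after a final least-squares refit) as the filtered star control points for the geometric correction itself.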

  8. Manufacturing of Human Extracellular Vesicle-Based Therapeutics for Clinical Use

    PubMed Central

    Gimona, Mario; Pachler, Karin; Laner-Plamberger, Sandra; Schallmoser, Katharina; Rohde, Eva

    2017-01-01

    Extracellular vesicles (EVs) derived from stem and progenitor cells may have therapeutic effects comparable to their parental cells and are considered promising agents for the treatment of a variety of diseases. To this end, strategies must be designed to successfully translate EV research and to develop safe and efficacious therapies, whilst taking into account the applicable regulations. Here, we discuss the requirements for manufacturing, safety, and efficacy testing of EVs along their path from the laboratory to the patient. Development of EV-therapeutics is influenced by the source cell types and the target diseases. In this article, we express our view based on our experience in manufacturing biological therapeutics for routine use or clinical testing, and focus on strategies for advancing mesenchymal stromal cell (MSC)-derived EV-based therapies. We also discuss the rationale for testing MSC-EVs in selected diseases with an unmet clinical need such as critical size bone defects, epidermolysis bullosa and spinal cord injury. While the scientific community, pharmaceutical companies and clinicians are at the point of entering into clinical trials for testing the therapeutic potential of various EV-based products, the identification of the mode of action underlying the suggested potency in each therapeutic approach remains a major challenge to the translational path. PMID:28587212

  9. Manufacturing of Human Extracellular Vesicle-Based Therapeutics for Clinical Use.

    PubMed

    Gimona, Mario; Pachler, Karin; Laner-Plamberger, Sandra; Schallmoser, Katharina; Rohde, Eva

    2017-06-03

    Extracellular vesicles (EVs) derived from stem and progenitor cells may have therapeutic effects comparable to their parental cells and are considered promising agents for the treatment of a variety of diseases. To this end, strategies must be designed to successfully translate EV research and to develop safe and efficacious therapies, whilst taking into account the applicable regulations. Here, we discuss the requirements for manufacturing, safety, and efficacy testing of EVs along their path from the laboratory to the patient. Development of EV-therapeutics is influenced by the source cell types and the target diseases. In this article, we express our view based on our experience in manufacturing biological therapeutics for routine use or clinical testing, and focus on strategies for advancing mesenchymal stromal cell (MSC)-derived EV-based therapies. We also discuss the rationale for testing MSC-EVs in selected diseases with an unmet clinical need such as critical size bone defects, epidermolysis bullosa and spinal cord injury. While the scientific community, pharmaceutical companies and clinicians are at the point of entering into clinical trials for testing the therapeutic potential of various EV-based products, the identification of the mode of action underlying the suggested potency in each therapeutic approach remains a major challenge to the translational path.

  10. A reduced-order, single-bubble cavitation model with applications to therapeutic ultrasound

    PubMed Central

    Kreider, Wayne; Crum, Lawrence A.; Bailey, Michael R.; Sapozhnikov, Oleg A.

    2011-01-01

    Cavitation often occurs in therapeutic applications of medical ultrasound such as shock-wave lithotripsy (SWL) and high-intensity focused ultrasound (HIFU). Because cavitation bubbles can affect an intended treatment, it is important to understand the dynamics of bubbles in this context. The relevant context includes very high acoustic pressures and frequencies as well as elevated temperatures. Relative to much of the prior research on cavitation and bubble dynamics, such conditions are unique. To address the relevant physics, a reduced-order model of a single, spherical bubble is proposed that incorporates phase change at the liquid-gas interface as well as heat and mass transport in both phases. Based on the energy lost during the inertial collapse and rebound of a millimeter-sized bubble, experimental observations were used to tune and test model predictions. In addition, benchmarks from the published literature were used to assess various aspects of model performance. Benchmark comparisons demonstrate that the model captures the basic physics of phase change and diffusive transport, while it is quantitatively sensitive to specific model assumptions and implementation details. Given its performance and numerical stability, the model can be used to explore bubble behaviors across a broad parameter space relevant to therapeutic ultrasound. PMID:22088026

  11. A reduced-order, single-bubble cavitation model with applications to therapeutic ultrasound.

    PubMed

    Kreider, Wayne; Crum, Lawrence A; Bailey, Michael R; Sapozhnikov, Oleg A

    2011-11-01

    Cavitation often occurs in therapeutic applications of medical ultrasound such as shock-wave lithotripsy (SWL) and high-intensity focused ultrasound (HIFU). Because cavitation bubbles can affect an intended treatment, it is important to understand the dynamics of bubbles in this context. The relevant context includes very high acoustic pressures and frequencies as well as elevated temperatures. Relative to much of the prior research on cavitation and bubble dynamics, such conditions are unique. To address the relevant physics, a reduced-order model of a single, spherical bubble is proposed that incorporates phase change at the liquid-gas interface as well as heat and mass transport in both phases. Based on the energy lost during the inertial collapse and rebound of a millimeter-sized bubble, experimental observations were used to tune and test model predictions. In addition, benchmarks from the published literature were used to assess various aspects of model performance. Benchmark comparisons demonstrate that the model captures the basic physics of phase change and diffusive transport, while it is quantitatively sensitive to specific model assumptions and implementation details. Given its performance and numerical stability, the model can be used to explore bubble behaviors across a broad parameter space relevant to therapeutic ultrasound.
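
    The full reduced-order model, with interfacial phase change and two-phase heat and mass transport, is beyond a short snippet, but the inertial core that such models extend is the classical Rayleigh-Plesset equation. The following sketch is a hedged illustration only: the fluid constants are generic water-at-room-temperature values, the gas is treated polytropically, and none of the paper's phase-change physics is included.

```python
import numpy as np

# Illustrative parameters for water near 20 C (not the paper's values)
RHO, SIGMA, MU = 998.0, 0.0725, 1.0e-3    # density, surface tension, viscosity
GAMMA, P0, PV = 1.4, 101325.0, 2330.0     # polytropic index, ambient, vapor pressure

def rp_rhs(t, y, R0, p_drive):
    """Rayleigh-Plesset right-hand side; state y = (R, Rdot)."""
    R, Rdot = y
    pg0 = P0 + 2.0 * SIGMA / R0 - PV             # gas pressure at equilibrium
    p_gas = pg0 * (R0 / R) ** (3.0 * GAMMA)      # polytropic compression
    p_inf = P0 + p_drive(t)                      # far-field (acoustic) pressure
    Rddot = ((p_gas + PV - p_inf - 2.0 * SIGMA / R
              - 4.0 * MU * Rdot / R) / RHO - 1.5 * Rdot ** 2) / R
    return np.array([Rdot, Rddot])

def integrate(R_init, R0, p_drive, t_end=20e-6, dt=1e-9):
    """Fixed-step RK4 integration of the bubble radius."""
    y = np.array([R_init, 0.0])
    radii, t = [y[0]], 0.0
    while t < t_end:
        k1 = rp_rhs(t, y, R0, p_drive)
        k2 = rp_rhs(t + dt / 2, y + dt / 2 * k1, R0, p_drive)
        k3 = rp_rhs(t + dt / 2, y + dt / 2 * k2, R0, p_drive)
        k4 = rp_rhs(t + dt, y + dt * k3, R0, p_drive)
        y = y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
        radii.append(y[0])
    return np.array(radii)
```

    A bubble started at its equilibrium radius stays there, while a perturbed bubble executes damped radial oscillations; the reduced-order model of the paper adds evaporation/condensation and diffusive transport terms on top of this skeleton.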

  12. A systematic approach to identify therapeutic effects of natural products based on human metabolite information.

    PubMed

    Noh, Kyungrin; Yoo, Sunyong; Lee, Doheon

    2018-06-13

    Natural products have been widely investigated in the drug development field. Their traditional use as medicinal agents and their resemblance to our endogenous compounds show the possibility of new drug development. Many researchers have focused on identifying therapeutic effects of natural products, yet the resemblance between natural products and human metabolites has rarely been exploited. We propose a novel method which predicts therapeutic effects of natural products based on their similarity with human metabolites. In this study, we compare the structure, target and phenotype similarities between natural products and human metabolites to capture molecular and phenotypic properties of both compounds. With the generated similarity features, we train a support vector machine model to identify similar natural product and human metabolite pairs. The known functions of human metabolites are then mapped to the paired natural products to predict their therapeutic effects. With the three selected feature sets (structure, target and phenotype similarities), our trained model successfully paired similar natural products and human metabolites. When applied to natural product-derived drugs, we could successfully identify their indications with high specificity and sensitivity. We further validated the identified therapeutic effects of natural products with literature evidence. These results suggest that our model can match natural products to similar human metabolites and provide possible therapeutic effects of natural products. By utilizing similar human metabolite information, we expect to find new indications of natural products which could not be covered by previous in silico methods.
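
    The pairing step can be illustrated with a toy linear SVM over the three similarity features. Everything here is a hypothetical stand-in for the authors' model: the feature distributions are invented, and the trainer is a minimal Pegasos-style sub-gradient descent rather than whatever SVM package the study used.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=100, seed=0):
    """Minimal linear SVM via Pegasos-style sub-gradient descent.
    X: n x d similarity features; y: labels in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)                 # decaying step size
            if y[i] * (X[i] @ w + b) < 1:         # hinge-loss margin violated
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:
                w = (1 - eta * lam) * w
    return w, b

rng = np.random.default_rng(0)
# Hypothetical pairs: each row holds (structure, target, phenotype) similarities
pos = rng.uniform(0.6, 1.0, (200, 3))    # natural product resembles a metabolite
neg = rng.uniform(0.0, 0.4, (200, 3))    # no meaningful resemblance
X = np.vstack([pos, neg])
y = np.array([1] * 200 + [-1] * 200)
w, b = train_linear_svm(X, y)

# A high-similarity candidate pair is classified "similar"; the metabolite's
# known functions would then be mapped onto the product as candidate effects.
is_similar = (np.array([0.9, 0.8, 0.85]) @ w + b) > 0
```

    In the actual method the three similarity scores come from chemical structure, protein target, and phenotype comparisons; the synthetic uniform blobs above merely make the classification step concrete.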

  13. Segmentation-based retrospective shading correction in fluorescence microscopy E. coli images for quantitative analysis

    NASA Astrophysics Data System (ADS)

    Mai, Fei; Chang, Chunqi; Liu, Wenqing; Xu, Weichao; Hung, Yeung S.

    2009-10-01

    Due to inherent imperfections in the imaging process, fluorescence microscopy images often suffer from spurious intensity variations, usually referred to as intensity inhomogeneity, intensity nonuniformity, shading, or bias field. In this paper, a retrospective shading correction method for fluorescence microscopy Escherichia coli (E. coli) images is proposed based on segmentation results. Segmentation and shading correction are coupled together, so we iteratively correct the shading effects based on the segmentation result and refine the segmentation by segmenting the image after shading correction. A fluorescence microscopy E. coli image can be segmented (based on its intensity values) into two classes, the background and the cells, where the intensity variation within each class is close to zero if there is no shading. We make use of this characteristic to correct the shading in each iteration. Shading is mathematically modeled as a multiplicative component and an additive noise component. The additive component is removed by a denoising process, and the multiplicative component is estimated using a fast algorithm that minimizes the intra-class intensity variation. We tested our method on synthetic images and real fluorescence E. coli images. It works well not only for visual inspection, but also for numerical evaluation. The proposed method should be useful for further quantitative analysis, especially for protein expression value comparison.
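
    A hedged numpy sketch of the iterative scheme: threshold segmentation, a piecewise-constant "ideal" image built from class means, and a low-order polynomial surface as the multiplicative field. The denoising step for the additive component and the authors' fast minimization algorithm are omitted; the mean-value threshold and quadratic basis are simplifying assumptions.

```python
import numpy as np

def correct_shading(obs, n_iter=3):
    """Iterative segmentation-based multiplicative shading correction.
    Assumes a two-class image (background/cells) and a smooth bias field."""
    h, w = obs.shape
    yy, xx = np.mgrid[0:h, 0:w] / max(h, w)
    A = np.stack([np.ones_like(xx), xx, yy, xx**2, xx * yy, yy**2],
                 axis=-1).reshape(-1, 6)            # 2nd-order surface basis
    img = obs.astype(float)
    shading = np.ones((h, w))
    for _ in range(n_iter):
        thresh = img.mean()                          # crude two-class split
        cells = img > thresh
        ideal = np.where(cells, img[cells].mean(), img[~cells].mean())
        coef, *_ = np.linalg.lstsq(A, (img / ideal).ravel(), rcond=None)
        step = (A @ coef).reshape(h, w)              # smooth multiplicative field
        step /= step.mean()                          # normalize to mean 1
        shading *= step                              # accumulate the estimate
        img = obs / shading                          # re-correct, then re-segment
    return img, shading
```

    On a synthetic two-class image with a linear ramp bias, the intra-class variation of the corrected image collapses to nearly zero while the cell/background intensity ratio is preserved, which is exactly the behavior the abstract's minimization objective targets.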

  14. Library based x-ray scatter correction for dedicated cone beam breast CT

    PubMed Central

    Shi, Linxi; Karellas, Andrew; Zhu, Lei

    2016-01-01

    Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the geant4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in sagittal views.
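
    The lookup-and-subtract flow might be sketched as follows. The library entries, profile shapes, and the single diameter key are invented placeholders for the precomputed Monte Carlo scatter distributions; the spatial-translation step from the paper is not modeled.

```python
import numpy as np

# Hypothetical precomputed Monte Carlo scatter library:
# breast diameter (cm) -> mean scatter profile across the detector (arb. units)
det = np.linspace(-1.0, 1.0, 256)
scatter_library = {
    d: 0.05 * d * np.exp(-2.0 * det**2)      # smooth, size-dependent lobe
    for d in (8, 10, 12, 14, 16)
}

def correct_projection(measured, diameter_cm):
    """Subtract the library scatter estimate for the nearest breast size."""
    key = min(scatter_library, key=lambda d: abs(d - diameter_cm))
    corrected = measured - scatter_library[key]
    return np.clip(corrected, 0.0, None)     # keep projection values nonnegative
```

    Because the expensive MC simulation lives entirely in the precomputed table, the per-scan cost reduces to one nearest-key lookup and a subtraction per projection, which is what gives the method its claimed efficiency.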

  15. Validation of model-based deformation correction in image-guided liver surgery via tracked intraoperative ultrasound: preliminary method and results

    NASA Astrophysics Data System (ADS)

    Clements, Logan W.; Collins, Jarrod A.; Wu, Yifei; Simpson, Amber L.; Jarnagin, William R.; Miga, Michael I.

    2015-03-01

    Soft tissue deformation represents a significant error source in current surgical navigation systems used for open hepatic procedures. While numerous algorithms have been proposed to rectify the tissue deformation that is encountered during open liver surgery, clinical validation of the proposed methods has been limited to surface-based metrics, and sub-surface validation has largely been performed via phantom experiments. Tracked intraoperative ultrasound (iUS) provides a means to digitize sub-surface anatomical landmarks during clinical procedures. The proposed method involves the validation of a deformation correction algorithm for open hepatic image-guided surgery systems via sub-surface targets digitized with tracked iUS. Intraoperative surface digitizations were acquired via a laser range scanner and an optically tracked stylus for the purposes of computing the physical-to-image space registration within the guidance system and for use in retrospective deformation correction. Upon completion of surface digitization, the organ was interrogated with a tracked iUS transducer where the iUS images and corresponding tracked locations were recorded. After the procedure, the clinician reviewed the iUS images to delineate contours of anatomical target features for use in the validation procedure. Mean closest point distances between the feature contours delineated in the iUS images and the corresponding 3-D anatomical model generated from the preoperative tomograms were computed to quantify the extent to which the deformation correction algorithm improved registration accuracy. The preliminary results for two patients indicate that the deformation correction method resulted in a reduction in target error of approximately 50%.

  16. Concise Review: Developing Best‐Practice Models for the Therapeutic Use of Extracellular Vesicles

    PubMed Central

    Reiner, Agnes T.; Witwer, Kenneth W.; van Balkom, Bas W.M.; de Beer, Joel; Brodie, Chaya; Corteling, Randolph L.; Gabrielsson, Susanne; Gimona, Mario; Ibrahim, Ahmed G.; de Kleijn, Dominique; Lai, Charles P.; Lötvall, Jan; del Portillo, Hernando A.; Reischl, Ilona G.; Riazifar, Milad; Salomon, Carlos; Tahara, Hidetoshi; Toh, Wei Seong; Wauben, Marca H.M.; Yang, Vicky K.; Yang, Yijun; Yeo, Ronne Wee Yeh; Yin, Hang; Giebel, Bernd

    2017-01-01

    Abstract Growing interest in extracellular vesicles (EVs, including exosomes and microvesicles) as therapeutic entities, particularly in stem cell‐related approaches, has underlined the need for standardization and coordination of development efforts. Members of the International Society for Extracellular Vesicles and the Society for Clinical Research and Translation of Extracellular Vesicles Singapore convened a Workshop on this topic to discuss the opportunities and challenges associated with development of EV‐based therapeutics at the preclinical and clinical levels. This review outlines topic‐specific action items that, if addressed, will enhance the development of best‐practice models for EV therapies. Stem Cells Translational Medicine 2017;6:1730–1739 PMID:28714557

  17. Prior-based artifact correction (PBAC) in computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heußer, Thorsten, E-mail: thorsten.heusser@dkfz-heidelberg.de; Brehm, Marcus; Ritschl, Ludwig

    2014-02-15

    Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in the form of a planning CT of the same patient or in the form of a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data.

  18. Designing multifocal corneal models to correct presbyopia by laser ablation

    NASA Astrophysics Data System (ADS)

    Alarcón, Aixa; Anera, Rosario G.; Del Barco, Luis Jiménez; Jiménez, José R.

    2012-01-01

    Two multifocal corneal models and an aspheric model designed to correct presbyopia by corneal photoablation were evaluated. The design of each model was optimized to achieve the best possible visual quality for both near and distance vision. In addition, we evaluated the effect of miosis and pupil decentration on visual quality. The multifocal model with the central zone for near vision provides better results, since it requires less ablated corneal surface area, permits higher addition values, presents more stable visual quality with pupil-size variations, and produces lower high-order aberrations.

  19. Modeling boundary measurements of scattered light using the corrected diffusion approximation

    PubMed Central

    Lehtikangas, Ossi; Tarvainen, Tanja; Kim, Arnold D.

    2012-01-01

    We study the modeling and simulation of steady-state measurements of light scattered by a turbid medium taken at the boundary. In particular, we implement the recently introduced corrected diffusion approximation in two spatial dimensions to model these boundary measurements. This implementation uses expansions in plane wave solutions to compute boundary conditions and the additive boundary layer correction, and a finite element method to solve the diffusion equation. We show that this corrected diffusion approximation models boundary measurements substantially better than the standard diffusion approximation in comparison to numerical solutions of the radiative transport equation. PMID:22435102

  20. RNA interference-based therapeutics for inherited long QT syndrome.

    PubMed

    Li, Guoliang; Ma, Shuting; Sun, Chaofeng

    2015-08-01

    Inherited long QT syndrome (LQTS) is an electrical heart disorder that manifests with syncope, seizures, and increased risk of torsades de pointes and sudden cardiac death. Dominant-negative current suppression is a mechanism by which pathogenic proteins disrupt the function of ion channels in inherited LQTS. However, current approaches for the management of inherited LQTS are inadequate. RNA interference (RNAi) is a powerful technique that is able to suppress or silence the expression of mutant genes. RNAi may be harnessed to knock out mRNAs that code for toxic proteins, and has been increasingly recognized as a potential therapeutic intervention for a range of conditions. The present study reviews the literature for RNAi-based therapeutics in the treatment of inherited LQTS. Furthermore, this review discusses the combined use of RNAi with the emerging technology of induced pluripotent stem cells for the treatment of inherited LQTS. In addition, key challenges that must be overcome prior to RNAi-based therapies becoming clinically applicable are addressed. In summary, RNAi-based therapy is potentially a powerful therapeutic intervention, although a number of difficulties remain unresolved.

  1. RNA interference-based therapeutics for inherited long QT syndrome

    PubMed Central

    LI, GUOLIANG; MA, SHUTING; SUN, CHAOFENG

    2015-01-01

    Inherited long QT syndrome (LQTS) is an electrical heart disorder that manifests with syncope, seizures, and increased risk of torsades de pointes and sudden cardiac death. Dominant-negative current suppression is a mechanism by which pathogenic proteins disrupt the function of ion channels in inherited LQTS. However, current approaches for the management of inherited LQTS are inadequate. RNA interference (RNAi) is a powerful technique that is able to suppress or silence the expression of mutant genes. RNAi may be harnessed to knock out mRNAs that code for toxic proteins, and has been increasingly recognized as a potential therapeutic intervention for a range of conditions. The present study reviews the literature for RNAi-based therapeutics in the treatment of inherited LQTS. Furthermore, this review discusses the combined use of RNAi with the emerging technology of induced pluripotent stem cells for the treatment of inherited LQTS. In addition, key challenges that must be overcome prior to RNAi-based therapies becoming clinically applicable are addressed. In summary, RNAi-based therapy is potentially a powerful therapeutic intervention, although a number of difficulties remain unresolved. PMID:26622327

  2. Chromatic aberrations correction for imaging spectrometer based on acousto-optic tunable filter with two transducers.

    PubMed

    Zhao, Huijie; Wang, Ziye; Jia, Guorui; Zhang, Ying; Xu, Zefu

    2017-10-02

    The acousto-optic tunable filter (AOTF) with wide wavelength range and high spectral resolution has a long crystal and two transducers. The longer crystal length leads to a larger chromatic focal shift, and the double-transducer arrangement induces an angular mutation in the diffracted beam, which increase the difficulty of longitudinal and lateral chromatic aberration correction, respectively. In this study, the two chromatic aberrations are analyzed quantitatively based on an AOTF optical model, and a novel catadioptric dual-path configuration is proposed to correct both chromatic aberrations. The test results demonstrate the effectiveness of the optical configuration for this type of AOTF-based imaging spectrometer.

  3. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  4. Quantitative Microplate-Based Respirometry with Correction for Oxygen Diffusion

    PubMed Central

    2009-01-01

    Respirometry using modified cell culture microplates offers an increase in throughput and a decrease in the biological material required for each assay. Plate-based respirometers are susceptible to a range of diffusion phenomena: as O2 is consumed by the specimen, atmospheric O2 leaks into the measurement volume. Oxygen also dissolves in and diffuses passively through the polystyrene commonly used as a microplate material. Consequently, the walls of such respirometer chambers are not just permeable to O2 but also store substantial amounts of gas. O2 flux between the walls and the measurement volume biases the measured oxygen consumption rate, depending on the actual [O2] gradient. We describe a compartment model-based correction algorithm to deconvolute the biological oxygen consumption rate from the measured [O2]. We optimize the algorithm to work with the Seahorse XF24 extracellular flux analyzer. The correction algorithm is biologically validated using mouse cortical synaptosomes and liver mitochondria attached to XF24 V7 cell culture microplates, and by comparison to classical Clark electrode oxygraph measurements. The algorithm increases the useful range of oxygen consumption rates, the temporal resolution, and the durations of measurements. The algorithm is presented in a general format and is therefore applicable to other respirometer systems. PMID:19555051
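
    The correction idea can be sketched as a two-compartment model in which the well medium exchanges O2 with both the atmosphere and the plastic walls; given the measured [O2] trace and calibrated rate constants, the biological consumption rate is recovered by inverting the model. A minimal illustration (the rate constants and the constant-OCR scenario are hypothetical, not the paper's calibrated XF24 values):

```python
import numpy as np

def simulate(ocr, c_atm=200.0, k_a=0.05, k_w=0.08, dt=1.0, n=600):
    """Forward-simulate measured [O2] in a well with wall storage.
    ocr: biological O2 consumption rate (uM/s), constant here."""
    c = np.empty(n); u = np.empty(n)
    c[0] = c_atm; u[0] = c_atm          # start equilibrated with air
    for i in range(n - 1):
        dc = -ocr + k_a * (c_atm - c[i]) + k_w * (u[i] - c[i])
        du = k_w * (c[i] - u[i])        # wall compartment exchanges with medium
        c[i + 1] = c[i] + dt * dc
        u[i + 1] = u[i] + dt * du
    return c

def corrected_ocr(c_meas, c_atm=200.0, k_a=0.05, k_w=0.08, dt=1.0):
    """Deconvolute the biological OCR from measured [O2] using the same
    compartment model (k_a, k_w assumed known from calibration)."""
    u = np.empty_like(c_meas); u[0] = c_atm
    for i in range(len(c_meas) - 1):    # reconstruct the wall compartment
        u[i + 1] = u[i] + dt * k_w * (c_meas[i] - u[i])
    dc = np.gradient(c_meas, dt)
    return -dc + k_a * (c_atm - c_meas) + k_w * (u - c_meas)
```

    Simulating a well with a known OCR and running the correction recovers that OCR, even at quasi-steady state where the raw [O2] slope alone drastically underestimates consumption.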

  5. A high speed model-based approach for wavefront sensorless adaptive optics systems

    NASA Astrophysics Data System (ADS)

    Lianghua, Wen; Yang, Ping; Shuai, Wang; Wenjing, Liu; Shanqiu, Chen; Xu, Bing

    2018-02-01

    To improve the temporal-frequency response of wavefront sensorless adaptive optics (AO) systems, a fast general model-based aberration correction algorithm is presented. The fast general model-based approach exploits the approximately linear relation between the mean square of the aberration gradients and the second moment of the far-field intensity distribution. The presented method can effectively correct a modal aberration while applying only one probe disturbance to the deformable mirror (one correction per disturbance); the probe mode is reconstructed by singular value decomposition of the correlation matrix of the Zernike functions' gradients. Numerical simulations of AO correction under various random and dynamic aberrations are implemented. The simulation results indicate that the equivalent control bandwidth is 2-3 times that of the previous method, which performs one aberration correction after applying N disturbances to the deformable mirror (one correction per N disturbances).

  6. Addressing the mischaracterization of extreme rainfall in regional climate model simulations - A synoptic pattern based bias correction approach

    NASA Astrophysics Data System (ADS)

    Li, Jingwan; Sharma, Ashish; Evans, Jason; Johnson, Fiona

    2018-01-01

    Addressing systematic biases in regional climate model simulations of extreme rainfall is a necessary first step before assessing changes in future rainfall extremes. Commonly used bias correction methods are designed to match statistics of the overall simulated rainfall with observations. This assumes that a change in the mix of different types of extreme rainfall events (i.e. convective and non-convective) in a warmer climate is of little relevance to the estimation of overall change, an assumption that is not supported by empirical or physical evidence. This study proposes an alternative approach to account for potential changes in the mix of rainfall types, characterized here by synoptic weather patterns (SPs) classified using self-organizing maps. The objective of this study is to evaluate the added influence of SPs on the bias correction, which is achieved by comparing the corrected distribution of future extreme rainfall with that obtained using conventional quantile mapping. A comprehensive synthetic experiment is first defined to investigate the conditions under which the additional information of SPs makes a significant difference to the bias correction. Using over 600,000 synthetic cases, statistically significant differences are found to be present in 46% of cases. This is followed by a case study over the Sydney region using a high-resolution run of the Weather Research and Forecasting (WRF) regional climate model, which indicates a small change in the proportions of the SPs and a statistically significant change in the extreme rainfall over the region, although the differences between the changes obtained from the two bias correction methods are not statistically significant.
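
    The baseline method compared against, conventional quantile mapping, and the pattern-conditioned variant proposed here can both be sketched in a few lines (the 101-knot empirical transfer function and the label-conditioning scheme are illustrative choices, not the paper's exact configuration):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: build a transfer function so the
    historical model distribution matches observations, then apply
    the same mapping to future model output."""
    q = np.linspace(0, 1, 101)
    mq = np.quantile(model_hist, q)   # model quantiles (knots)
    oq = np.quantile(obs_hist, q)     # observed quantiles
    return np.interp(model_future, mq, oq)

def quantile_map_by_pattern(model_hist, obs_hist, labels_hist,
                            model_future, labels_future):
    """Synoptic-pattern-conditioned variant: fit one transfer
    function per weather-pattern label."""
    out = np.empty_like(model_future, dtype=float)
    for sp in np.unique(labels_future):
        m = labels_future == sp
        out[m] = quantile_map(model_hist[labels_hist == sp],
                              obs_hist[labels_hist == sp],
                              model_future[m])
    return out
```

    If the model's bias differs between patterns, the conditioned variant corrects each pattern with its own transfer function, which is exactly where the two methods diverge for future climates with a changed pattern mix.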

  7. Optimisation of reconstruction-reprojection-based motion correction for cardiac SPECT.

    PubMed

    Kangasmaa, Tuija S; Sohlberg, Antti O

    2014-07-01

    Cardiac motion is a challenging cause of image artefacts in myocardial perfusion SPECT. A wide range of motion correction methods have been developed over the years, and so far automatic algorithms based on the reconstruction-reprojection principle have proved to be the most effective. However, these methods have not been fully optimised in terms of their free parameters and implementation details. Two slightly different implementations of reconstruction-reprojection-based motion correction were optimised for effective, good-quality motion correction and then compared with each other. The first method (Method 1) was the traditional reconstruction-reprojection motion correction algorithm, where the motion correction is done in projection space, whereas the second algorithm (Method 2) performed motion correction in reconstruction space. The parameters that were optimised included the type of cost function (squared difference, normalised cross-correlation and mutual information) used to compare measured and reprojected projections, and the number of iterations needed. The methods were tested with motion-corrupted projection datasets, which were generated by adding three different types of motion (lateral shift, vertical shift and vertical creep) to motion-free cardiac perfusion SPECT studies. Method 2 performed slightly better overall than Method 1, but the difference between the two implementations was small. The execution time for Method 2 was much longer than for Method 1, which limits its clinical usefulness. The mutual information cost function clearly gave the best results for all three motion sets and both correction methods. Three iterations were sufficient for a good-quality correction using Method 1. The traditional reconstruction-reprojection-based method with three update iterations and a mutual information cost function is a good option for motion correction in clinical myocardial perfusion SPECT.
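
    The winning cost function, mutual information between the measured and reprojected projections, can be estimated from a joint intensity histogram; a minimal sketch (the 32-bin estimator is an illustrative choice):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two images, estimated from their
    joint intensity histogram; usable as a registration cost function
    comparing a measured projection with a reprojected one."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = h / h.sum()                          # joint probability
    px = p.sum(axis=1, keepdims=True)        # marginals
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0                               # avoid log(0)
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())
```

    Higher values indicate better alignment, so the motion estimate at each iteration is the shift that maximizes this score.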

  8. A model measuring therapeutic inertia and the associated factors among diabetes patients: A nationwide population-based study in Taiwan.

    PubMed

    Huang, Li-Ying; Shau, Wen-Yi; Yeh, Hseng-Long; Chen, Tsung-Tai; Hsieh, Jun Yi; Su, Syi; Lai, Mei-Shu

    2015-01-01

    This article presents an analysis conducted on the patterns related to therapeutic inertia with the aim of uncovering how variables at the patient level and the healthcare provider level influence the intensification of therapy when it is clinically indicated. A cohort study was conducted on 899,135 HbA1c results from 168,876 adult diabetes patients with poorly controlled HbA1c levels. HbA1c results were used to identify variations in the prescription of hypoglycemic drugs. Logistic regression and hierarchical linear models (HLMs) were used to determine how differences among healthcare providers and patient characteristics influence therapeutic inertia. We estimated that 38.5% of the patients in this study were subject to therapeutic inertia. The odds ratio of cardiologists choosing to intensify therapy was 0.708 times that of endocrinologists. Furthermore, patients in medical centers were shown to be 1.077 times more likely to be prescribed intensified treatment than patients in primary clinics. The HLMs presented results similar to those of the logistic model. Overall, we determined that 88.92% of the variation in the application of intensified treatment was at the within-physician level. Reducing therapeutic inertia will likely require educational initiatives aimed at ensuring adherence to clinical practice guidelines in the care of diabetes patients. © 2014, The American College of Clinical Pharmacology.

  9. A Technique for Real-Time Ionospheric Ranging Error Correction Based On Radar Dual-Frequency Detection

    NASA Astrophysics Data System (ADS)

    Lyu, Jiang-Tao; Zhou, Chen

    2017-12-01

    Ionospheric refraction is one of the principal error sources limiting the accuracy of radar systems for space target detection. High-accuracy measurement of the ionospheric electron density along the propagation path of the radar wave is the most important procedure for ionospheric refraction correction. Traditionally, ionospheric models or ionospheric detection instruments, such as ionosondes or GPS receivers, are employed for obtaining the electron density. However, neither method is capable of satisfying the correction accuracy requirements of advanced space target radar systems. In this study, we propose a novel technique for ionospheric refraction correction based on radar dual-frequency detection. Radar target range measurements at two adjacent frequencies are utilized for calculating the electron density integral exactly along the propagation path of the radar wave, which can generate an accurate ionospheric range correction. The implementation of radar dual-frequency detection is validated by a P-band radar located in midlatitude China. The experimental results show that this novel technique is more accurate than traditional ionospheric model correction. The technique proposed in this study is very promising for high-accuracy radar detection and tracking of objects in geospace.
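
    The principle can be sketched with the standard first-order ionospheric group delay, which adds 40.3·TEC/f² metres of excess range at frequency f; two ranges at adjacent frequencies then yield both the slant TEC and the corrected range (a sketch of the general dual-frequency relation, not the paper's full processing chain):

```python
K = 40.3  # standard first-order ionospheric constant (m * Hz^2 * m^2/electron)

def dual_freq_correct(r1, f1, r2, f2):
    """Solve r_i = r_true + K*TEC/f_i**2 for the true range (m) and the
    slant total electron content (electrons/m^2) from two measured ranges."""
    tec = (r1 - r2) / (K * (1.0 / f1**2 - 1.0 / f2**2))
    return r1 - K * tec / f1**2, tec
```

    For example, a 1000 km target observed at 400 and 450 MHz through a slant TEC of 1e17 el/m² picks up roughly 25 m and 20 m of excess range respectively; feeding the two biased ranges back through the solver recovers the true range.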

  10. The correction of time and temperature effects in MR-based 3D Fricke xylenol orange dosimetry.

    PubMed

    Welch, Mattea L; Jaffray, David A

    2017-04-21

    Previously developed MR-based three-dimensional (3D) Fricke-xylenol orange (FXG) dosimeters can provide end-to-end quality assurance and validation protocols for pre-clinical radiation platforms. FXG dosimeters quantify irradiation-induced oxidation of Fe2+ ions using pre- and post-irradiation MR imaging methods that detect changes in spin-lattice relaxation rates (R1 = 1/T1) caused by the oxidation of Fe2+. Chemical changes in MR-based FXG dosimeters that occur over time and with changes in temperature can decrease dosimetric accuracy if they are not properly characterized and corrected. This paper describes the characterization, development and utilization of an empirical model-based correction algorithm for time and temperature effects in the context of a pre-clinical irradiator and a 7 T pre-clinical MR imaging system. Time- and temperature-dependent changes of R1 values were characterized using variable-TR spin-echo imaging. R1-time and R1-temperature dependencies were fit using non-linear least squares fitting methods. Models were validated using leave-one-out cross-validation and resampling. Subsequently, a correction algorithm was developed that employed the previously fit empirical models to predict and reduce baseline R1 shifts that occurred in the presence of time and temperature changes. The correction algorithm was tested on R1-dose response curves and 3D dose distributions delivered using a small animal irradiator at 225 kVp. The correction algorithm reduced baseline R1 shifts from -2.8 × 10^-2 s^-1 to 1.5 × 10^-3 s^-1. In terms of absolute dosimetric performance as assessed with traceable standards, the correction algorithm reduced dose discrepancies from approximately 3% to approximately 0.5% (2.90 ± 2.08% to 0.20 ± 0.07%, and 2.68 ± 1.84% to 0.46 ± 0.37% for the 10 × 10 and 8 × 12 mm^2 fields

  11. Retinal image mosaicing using the radial distortion correction model

    NASA Astrophysics Data System (ADS)

    Lee, Sangyeol; Abràmoff, Michael D.; Reinhardt, Joseph M.

    2008-03-01

    Fundus camera imaging can be used to examine the retina to detect disorders. Similar to looking through a small keyhole into a large room, imaging the fundus with an ophthalmologic camera allows only a limited view at a time. Thus, the generation of a retinal montage from multiple images has the potential to increase diagnostic accuracy by providing a larger field of view. A method of mosaicing multiple retinal images using the radial distortion correction (RADIC) model is proposed in this paper. Our method determines the inter-image connectivity by detecting feature correspondences. The connectivity information is converted to a tree structure that describes the spatial relationships between the reference and target images for pairwise registration. The montage is generated by a cascading pairwise registration scheme starting from the anchor image and proceeding downward through the connectivity tree hierarchy. The RADIC model corrects the radial distortion that is due to the spherical-to-planar projection during retinal imaging. Therefore, after radial distortion correction, individual images can be properly mapped onto a montage space by a linear geometric transformation, e.g. an affine transform. Compared to most existing montaging methods, our method is unique in that only a single registration per image is required because of the distortion correction property of the RADIC model. As a final step, distance-weighted intensity blending is employed to correct the inter-image differences in illumination encountered when forming the montage. Visual inspection of the experimental results for three mosaicing cases shows that our method can produce satisfactory montages.
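
    The distortion correction step can be sketched with a common single-parameter radial model, r_d = r_u(1 + k·r_u²), inverted by fixed-point iteration; the paper's exact RADIC formulation may differ, so treat this as a generic illustration:

```python
import numpy as np

def distort(points, k, center):
    """Forward model: radially displace (N, 2) points away from the center."""
    p = points - center
    r_u = np.linalg.norm(p, axis=1, keepdims=True)
    return center + p * (1.0 + k * r_u**2)

def undistort(points, k, center):
    """Invert r_d = r_u * (1 + k*r_u**2) by fixed-point iteration,
    mapping distorted points back to undistorted coordinates."""
    p = points - center
    r_d = np.linalg.norm(p, axis=1, keepdims=True)
    r_u = r_d.copy()
    for _ in range(20):                  # converges for small k * r^2
        r_u = r_d / (1.0 + k * r_u**2)
    scale = np.divide(r_u, r_d, out=np.ones_like(r_d), where=r_d > 0)
    return center + p * scale
```

    After each image is undistorted this way, a plain affine transform suffices to place it on the montage, which is the property the method exploits.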

  12. Exploiting Nanotechnology for the Development of MicroRNA-Based Cancer Therapeutics.

    PubMed

    Tyagi, Nikhil; Arora, Sumit; Deshmukh, Sachin K; Singh, Seema; Marimuthu, Saravanakumar; Singh, Ajay P

    2016-01-01

    MicroRNAs (miRNAs/miRs) represent a novel class of small non-coding RNAs that post-transcriptionally regulate gene expression by base pairing with complementary sequences in the 3' untranslated region (UTR) of target mRNAs. Functional studies suggest that miRNAs control almost every biological process, and their aberrant expression leads to a disease state, such as cancer. Differential expression of miRNAs in cancerous versus normal cells has generated enormous interest in the development of miRNA-based cancer cell-targeted therapeutics. Depending on the miRNA's function and expression in cancer, two types of miRNA-based therapeutic strategies can be utilized that either restore or inhibit miRNA function through exogenous delivery of miRNA mimics or inhibitors (anti-miRs). However, the hydrophilic nature of miRNA mimics/anti-miRs, their sensitivity to nuclease degradation in serum, and their poor penetration and reduced uptake by tumor cells are the chief hurdles to accomplishing efficient in vivo delivery. To overcome these barriers, several nanotechnology-based systems are being developed and tested for delivery efficacy. This review summarizes the importance of miRNA-based therapeutics in cancer, the associated translational challenges, and novel nanotechnology-assisted delivery systems that hold potential for next-generation miRNA-based cancer therapeutics.

  13. Intercomparison Of Approaches For Modeling Second Order Ionospheric Corrections Using Gnss Measurements

    NASA Astrophysics Data System (ADS)

    Garcia Fernandez, M.; Butala, M.; Komjathy, A.; Desai, S. D.

    2012-12-01

    Correcting GNSS tracking data for second-order ionospheric effects has been shown to cause a southward shift in GNSS-based precise point positioning solutions of as much as 10 mm, depending on solar cycle conditions. The most commonly used approaches for modeling the higher-order ionospheric effect include (a) the use of global ionosphere maps (GIMs) to determine vertical total electron content (VTEC) and convert it to slant TEC (STEC) assuming a thin-shell ionosphere, and (b) using the dual-frequency measurements themselves to determine STEC. The latter approach benefits from not requiring ionospheric mapping functions between VTEC and STEC. However, it requires calibration with receiver and transmitter differential code biases (DCBs). We present results from comparisons of the two approaches. For the first approach, we also compare the use of VTEC observations from IONEX maps with climatological model-derived VTEC as provided by the International Reference Ionosphere (IRI2012). We consider various metrics to evaluate the relative performance of the different approaches, including station repeatability, GNSS-based reference frame recovery, and post-fit measurement residuals. Overall, the GIM-based approaches tend to provide lower noise in second-order ionosphere corrections and positioning solutions. The use of IONEX and IRI2012 VTEC provides similar results, especially in periods of low solar activity. The use of the IRI2012 model provides a convenient approach for operational scenarios by eliminating the dependence on routine updates of the GIMs, and it also serves as a useful source of VTEC when IONEX maps are not readily available.
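
    The VTEC-to-STEC conversion in approach (a) uses the standard single-layer (thin-shell) mapping function; a minimal sketch (the 450 km shell height is a common but illustrative choice):

```python
import math

R_E = 6371.0   # mean Earth radius, km
H_SH = 450.0   # assumed thin-shell height, km (a common choice)

def vtec_to_stec(vtec, elevation_deg, shell_km=H_SH):
    """Convert vertical TEC to slant TEC with the standard single-layer
    (thin-shell) mapping function used with global ionosphere maps."""
    e = math.radians(elevation_deg)
    # sine of the zenith angle at the ionospheric pierce point
    sin_zp = (R_E / (R_E + shell_km)) * math.cos(e)
    return vtec / math.sqrt(1.0 - sin_zp**2)
```

    At zenith the mapping factor is 1 and it grows toward low elevations, which is exactly the geometry dependence the dual-frequency approach (b) avoids.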

  14. Prediction of a Therapeutic Dose for Buagafuran, a Potent Anxiolytic Agent by Physiologically Based Pharmacokinetic/Pharmacodynamic Modeling Starting from Pharmacokinetics in Rats and Human.

    PubMed

    Yang, Fen; Wang, Baolian; Liu, Zhihao; Xia, Xuejun; Wang, Weijun; Yin, Dali; Sheng, Li; Li, Yan

    2017-01-01

    Physiologically based pharmacokinetic (PBPK)/pharmacodynamic (PD) models can contribute to animal-to-human extrapolation and therapeutic dose prediction. Buagafuran is a novel anxiolytic agent, and phase I clinical trials of buagafuran have been completed. In this paper, a potentially effective dose of buagafuran of 30 mg t.i.d. in humans was estimated based on the human brain concentration predicted by PBPK/PD modeling. The software GastroPlus™ was used to build the PBPK/PD model for buagafuran in rat, which related the brain tissue concentrations of buagafuran to the number of times animals entered the open arms in the elevated plus-maze pharmacological model. Buagafuran concentrations in human plasma were fitted and brain tissue concentrations were predicted using a human PBPK model in which the predicted plasma profiles were in good agreement with observations. The results provide supportive data for the rational use of buagafuran in the clinic.

  15. Superior therapeutic efficacy of nab-paclitaxel over cremophor-based paclitaxel in locally advanced and metastatic models of human pancreatic cancer.

    PubMed

    Rajeshkumar, N V; Yabuuchi, Shinichi; Pai, Shweta G; Tong, Zeen; Hou, Shihe; Bateman, Scott; Pierce, Daniel W; Heise, Carla; Von Hoff, Daniel D; Maitra, Anirban; Hidalgo, Manuel

    2016-08-09

    Albumin-bound paclitaxel (nab-paclitaxel, nab-PTX) plus gemcitabine (GEM) has demonstrated efficient antitumour activity and a statistically significant overall survival benefit in patients with metastatic pancreatic ductal adenocarcinoma (PDAC) compared with GEM monotherapy. This regimen is currently approved as a standard-of-care treatment option for patients with metastatic PDAC. It is unclear whether cremophor-based PTX combined with GEM provides a similar level of therapeutic efficacy in PDAC. We comprehensively explored the antitumour efficacy, effect on metastatic dissemination, tumour stroma and survival advantage following GEM, PTX and nab-PTX as monotherapy or in combination with GEM in a locally advanced and a highly metastatic orthotopic model of human PDAC. Nab-PTX treatment resulted in a significantly higher paclitaxel tumour-to-plasma ratio (1.98-fold), robust stromal depletion, antitumour efficacy (3.79-fold) and survival benefit compared with PTX treatment. PTX plus GEM showed no survival gain over GEM monotherapy. However, nab-PTX in combination with GEM decreased primary tumour burden and metastatic dissemination, and significantly increased the median survival of animals compared with either agent alone. These therapeutic effects were accompanied by depletion of the dense fibrotic tumour stroma and decreased proliferation of carcinoma cells. Notably, nab-PTX monotherapy was equivalent to nab-PTX plus GEM in providing a survival advantage to mice in a highly aggressive metastatic PDAC model, indicating that nab-PTX could potentially slow the progression of late-stage pancreatic cancer. Our data confirm that the therapeutic efficacies of PTX and nab-PTX vary widely, and the contention that these agents elicit similar antitumour responses was not supported.
The addition of PTX to GEM showed no survival advantage, suggesting that a clinical combination of PTX and GEM is unlikely to provide a significant survival advantage over GEM monotherapy and may not be a

  16. Agonist anti-GITR antibody significantly enhances the therapeutic efficacy of Listeria monocytogenes-based immunotherapy.

    PubMed

    Shrimali, Rajeev; Ahmad, Shamim; Berrong, Zuzana; Okoev, Grigori; Matevosyan, Adelaida; Razavi, Ghazaleh Shoja E; Petit, Robert; Gupta, Seema; Mkrtichyan, Mikayel; Khleif, Samir N

    2017-08-15

    We previously demonstrated that, in addition to generating an antigen-specific immune response, Listeria monocytogenes (Lm)-based immunotherapy significantly reduces the ratio of regulatory T cells (Tregs)/CD4+ and myeloid-derived suppressor cells (MDSCs) in the tumor microenvironment. Since Lm-based immunotherapy is able to inhibit the immune-suppressive environment, we hypothesized that combining this treatment with an agonist antibody to a co-stimulatory receptor, which would further boost the effector arm of immunity, would result in significant improvement in the anti-tumor efficacy of treatment. Here we tested the immune and therapeutic efficacy of Listeria-based immunotherapy in combination with an agonist antibody to glucocorticoid-induced tumor necrosis factor receptor-related protein (GITR) in the TC-1 mouse tumor model. We evaluated the potency of the combination on tumor growth and survival of treated animals and profiled the tumor microenvironment for effector and suppressor cell populations. We demonstrate that the combination of Listeria-based immunotherapy with an agonist antibody to GITR synergizes to improve the immune and therapeutic efficacy of treatment in a mouse tumor model. We show that this combinational treatment leads to significant inhibition of tumor growth, prolongs survival and leads to complete regression of established tumors in 60% of treated animals. We determined that this therapeutic benefit of combinational treatment is due to a significant increase in tumor-infiltrating effector CD4+ and CD8+ T cells along with a decrease in inhibitory cells. To our knowledge, this is the first study that exploits Lm-based immunotherapy combined with an agonist anti-GITR antibody as a potent treatment strategy that simultaneously targets both the effector and suppressor arms of the immune system, leading to significantly improved anti-tumor efficacy. We believe that our findings depicted in this manuscript provide a promising and translatable strategy that can enhance the overall

  17. Correction of Electron Density Profiles in the Low Ionosphere Based on the Data of Vertical Sounding with the IRI Model

    NASA Astrophysics Data System (ADS)

    Denisenko, P. F.; Maltseva, O. A.; Sotsky, V. V.

    2018-03-01

    A method is presented for correcting the daytime vertical profiles of electron plasma frequency in the low ionosphere, taken from the International Reference Ionosphere (IRI) model, in accordance with the measured virtual heights and absorption of radio signals (method A1) reflected from the bottom of the E-region during vertical sounding (VS). The method is based on replacing the IRI model profile with an analytical approximation whose parameters are determined from VS data and partially from the IRI model. The method is tested against the results of four joint ground-based and rocket experiments carried out in the 1970s at midlatitudes of the European part of Russia during launches of high-altitude geophysical rockets of the Vertical series. It is shown that considering both virtual reflection heights and absorption makes it possible to obtain electron density distributions that show the best agreement with the rocket measurements over most height ranges in the D- and E-regions. In addition, the obtained distributions account more adequately than the IRI model for the contributions of the D- and E-regions to the absorption of signals reflected above these regions.

  18. Hardware-efficient bosonic quantum error-correcting codes based on symmetry operators

    NASA Astrophysics Data System (ADS)

    Niu, Murphy Yuezhen; Chuang, Isaac L.; Shapiro, Jeffrey H.

    2018-03-01

    We establish a symmetry-operator framework for designing quantum error-correcting (QEC) codes based on fundamental properties of the underlying system dynamics. Based on this framework, we propose three hardware-efficient bosonic QEC codes that are suitable for χ^(2)-interaction-based quantum computation in multimode Fock bases: the χ^(2) parity-check code, the χ^(2) embedded error-correcting code, and the χ^(2) binomial code. All of these QEC codes detect photon-loss or photon-gain errors by means of photon-number parity measurements, and then correct them via χ^(2) Hamiltonian evolutions and linear-optics transformations. Our symmetry-operator framework provides a systematic procedure for finding QEC codes that are not stabilizer codes, and it enables convenient extension of a given encoding to higher-dimensional qudit bases. The χ^(2) binomial code is of special interest because, with m ≤ N identified from channel monitoring, it can correct m-photon-loss errors, or m-photon-gain errors, or (m-1)th-order dephasing errors using logical qudits that are encoded in O(N) photons. In comparison, other bosonic QEC codes require O(N^2) photons to correct the same degree of bosonic errors. Such improved photon efficiency underscores the additional error-correction power that can be provided by channel monitoring. We develop quantum Hamming bounds for photon-loss errors in the code subspaces associated with the χ^(2) parity-check code and the χ^(2) embedded error-correcting code, and we prove that these codes saturate their respective bounds. Our χ^(2) QEC codes exhibit hardware efficiency in that they address the principal error mechanisms and exploit the available physical interactions of the underlying hardware, thus reducing the physical resources required for implementing their encoding, decoding, and error-correction operations, and their universal encoded-basis gate sets.
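
    The error-detection mechanism common to these codes, a photon-loss event flipping the photon-number parity, can be illustrated numerically with the simplest bosonic binomial code word (a generic illustration, not the paper's χ^(2) codes):

```python
import numpy as np

def lowering_op(n_max):
    """Bosonic annihilation operator a on a Fock space truncated at n_max."""
    a = np.zeros((n_max + 1, n_max + 1))
    for n in range(1, n_max + 1):
        a[n - 1, n] = np.sqrt(n)
    return a

def photon_parity(state):
    """Expectation value of the photon-number parity operator (-1)^n."""
    signs = (-1.0) ** np.arange(len(state))
    return float(np.sum(signs * np.abs(state) ** 2))

# simplest binomial code word: |0_L> = (|0> + |4>) / sqrt(2), even parity
n_max = 4
zero_L = np.zeros(n_max + 1)
zero_L[0] = zero_L[4] = 1.0 / np.sqrt(2)

a = lowering_op(n_max)
after_loss = a @ zero_L
after_loss /= np.linalg.norm(after_loss)   # state after one photon loss
```

    A parity measurement returns +1 on the code word and -1 after a single loss, flagging the error without measuring the photon number itself and thereby preserving the encoded superposition.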

  19. Using an experimental model for the study of therapeutic touch.

    PubMed

    dos Santos, Daniella Soares; Marta, Ilda Estéfani Ribeiro; Cárnio, Evelin Capellari; de Quadros, Andreza Urba; Cunha, Thiago Mattar; de Carvalho, Emilia Campos

    2013-02-01

    The aim was to verify whether the paw edema model can be used in investigations of the effects of Therapeutic Touch on inflammation by measuring the variables pain, edema and neutrophil migration. This is a pilot, experimental study involving ten male mice of the same genetic strain, divided into an experimental and a control group and submitted to chemical induction of local inflammation in the right hind paw. The experimental group received daily administration of Therapeutic Touch for 15 minutes over three days. The data showed statistically significant differences in the nociceptive threshold and in the paw circumference of animals from the experimental group on the second day of the experiment. The experimental model involving animals can contribute to the study of the effects of Therapeutic Touch on inflammation; adjustments are suggested to the treatment duration, number of sessions and experiment duration.

  20. Mobile Image Based Color Correction Using Deblurring

    PubMed Central

    Wang, Yu; Xu, Chang; Boushey, Carol; Zhu, Fengqing; Delp, Edward J.

    2016-01-01

    Dietary intake, the process of determining what someone eats during the course of a day, provides valuable insights for mounting intervention programs for the prevention of many chronic diseases such as obesity and cancer. The goal of the Technology Assisted Dietary Assessment (TADA) System, developed at Purdue University, is to automatically identify and quantify foods and beverages consumed by utilizing food images acquired with a mobile device. Color correction serves as a critical step to ensure accurate food identification and volume estimation. We make use of a specifically designed color checkerboard (i.e. a fiducial marker) to calibrate the imaging system so that the variations in food appearance under different lighting conditions can be determined. In this paper, we propose an image quality enhancement technique combining image deblurring and color correction. The contribution consists of introducing an automatic camera-shake removal method using a saliency map and improving the polynomial color correction model using the LMS color space. PMID:28572697
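
    Polynomial color correction of the kind improved upon here can be sketched as a least-squares fit from observed fiducial-patch colors to their known reference values; this generic RGB version omits the paper's LMS-space refinement (the degree-2 basis is an illustrative choice):

```python
import numpy as np

def _poly_terms(colors, degree):
    """Per-channel polynomial basis: [1, r, g, b, r^2, g^2, b^2, ...]."""
    terms = [np.ones(len(colors))]
    for d in range(1, degree + 1):
        terms.extend(colors[:, c] ** d for c in range(3))
    return np.stack(terms, axis=1)

def fit_color_correction(observed, reference, degree=2):
    """Least-squares fit mapping observed (N, 3) checkerboard patch
    colors to their known (N, 3) reference values."""
    coefs, *_ = np.linalg.lstsq(_poly_terms(observed, degree),
                                reference, rcond=None)
    return coefs

def apply_color_correction(pixels, coefs, degree=2):
    """Apply the fitted transform to (N, 3) pixel colors."""
    return _poly_terms(pixels, degree) @ coefs
```

    Fitting on the marker's patches and applying the transform to the whole image normalizes the food's appearance across lighting conditions before identification and volume estimation.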

  1. Mobile image based color correction using deblurring

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Xu, Chang; Boushey, Carol; Zhu, Fengqing; Delp, Edward J.

    2015-03-01

    Dietary intake, the process of determining what someone eats during the course of a day, provides valuable insights for mounting intervention programs for the prevention of many chronic diseases such as obesity and cancer. The goal of the Technology Assisted Dietary Assessment (TADA) System, developed at Purdue University, is to automatically identify and quantify foods and beverages consumed by utilizing food images acquired with a mobile device. Color correction serves as a critical step to ensure accurate food identification and volume estimation. We make use of a specifically designed color checkerboard (i.e. a fiducial marker) to calibrate the imaging system so that the variations in food appearance under different lighting conditions can be determined. In this paper, we propose an image quality enhancement technique combining image deblurring and color correction. The contribution consists of introducing an automatic camera-shake removal method using a saliency map and improving the polynomial color correction model using the LMS color space.

  2. Genome Editing in Stem Cells for Disease Therapeutics.

    PubMed

    Song, Minjung; Ramakrishna, Suresh

    2018-04-01

    Programmable nucleases, including zinc finger nucleases, transcription activator-like effector nucleases, and the clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein system, have tremendous potential for biological and therapeutic applications as novel genome editing tools. These nucleases enable precise modification of a gene of interest by disruption, insertion, or correction. The application of genome editing technology to pluripotent stem cells or hematopoietic stem cells has the potential to remarkably advance the contribution of this technology to the life sciences. Specifically, disease models can be generated and effective therapeutics can be developed with great efficiency and speed. Here we review the characteristics and mechanisms of each programmable nuclease. In addition, we review the applications of these nucleases to stem cells for disease therapies and summarize key studies of interest.

  3. CANT1 lncRNA Triggers Efficient Therapeutic Efficacy by Correcting Aberrant lncing Cascade in Malignant Uveal Melanoma.

    PubMed

    Xing, Yue; Wen, Xuyang; Ding, Xia; Fan, Jiayan; Chai, Peiwei; Jia, Renbing; Ge, Shengfang; Qian, Guanxiang; Zhang, He; Fan, Xianqun

    2017-05-03

    Uveal melanoma (UM) is an intraocular malignant tumor with a high mortality rate. Recent studies have shown the functions of long non-coding RNAs (lncRNAs) in tumorigenesis; thus, targeting tumor-specific lncRNA abnormalities has become an attractive approach for developing therapeutics to treat uveal melanoma. In this study, we identified a novel nuclear CANT1 lncRNA (CASC15-New-Transcript 1) that acts as a necessary UM suppressor. CANT1 significantly reduced tumor metastatic capacity and tumor formation, both in cell culture and in animals harboring tumor xenografts. Intriguingly, XIST lncRNA serves as a potential target of CANT1, and JPX or FTX lncRNA subsequently serves as a contextual hinge to activate a novel CANT1-JPX/FTX-XIST long non-coding (lncing) pathway in UM. Moreover, CANT1 triggers the expression of JPX and FTX by directly binding to their promoters and promoting H3K4 methylation. These observations delineate a novel regulatory cascade in which lncRNAs, without intervening coding genes, directly modulate UM tumorigenesis, thereby specifying a novel "lncing-cascade renewal" anti-tumor therapeutic strategy based on correcting the aberrant lncing cascade in uveal melanoma. Copyright © 2017 The American Society of Gene and Cell Therapy. Published by Elsevier Inc. All rights reserved.

  4. The current state of therapeutic and T cell-based vaccines against human papillomaviruses

    PubMed Central

    Yang, Andrew; Farmer, Emily; Lin, John; Wu, T-C.; Hung, Chien-Fu

    2016-01-01

    Human papillomavirus (HPV) is known to be a necessary factor for many gynecologic malignancies and is also associated with a subset of head and neck malignancies. This knowledge has created the opportunity to control these HPV-associated cancers through vaccination. However, despite the availability of prophylactic HPV vaccines, HPV infections remain extremely common worldwide. In addition, while prophylactic HPV vaccines have been effective in preventing infection, they are ineffective at clearing pre-existing HPV infections. Thus, there is an urgent need for therapeutic and T cell-based vaccines to treat existing HPV infections and HPV-associated lesions and cancers. Unlike prophylactic vaccines, which generate neutralizing antibodies, therapeutic and T cell-based vaccines enhance cell-mediated immunity against HPV antigens. Our review will cover various therapeutic and T cell-based vaccines in development for the treatment of HPV-associated diseases. Furthermore, we review the strategies to enhance the efficacy of therapeutic vaccines and the latest clinical trials on therapeutic and T cell-based HPV vaccines. PMID:27932207

  5. Corrective Action Decision Document/Corrective Action Plan for Corrective Action Unit 97: Yucca Flat/Climax Mine Nevada National Security Site, Nevada, Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farnham, Irene

    This corrective action decision document (CADD)/corrective action plan (CAP) has been prepared for Corrective Action Unit (CAU) 97, Yucca Flat/Climax Mine, Nevada National Security Site (NNSS), Nevada. The Yucca Flat/Climax Mine CAU is located in the northeastern portion of the NNSS and comprises 720 corrective action sites. A total of 747 underground nuclear detonations took place within this CAU between 1957 and 1992 and resulted in the release of radionuclides (RNs) in the subsurface in the vicinity of the test cavities. The CADD portion describes the Yucca Flat/Climax Mine CAU data-collection and modeling activities completed during the corrective action investigation (CAI) stage, presents the corrective action objectives, and describes the actions recommended to meet the objectives. The CAP portion describes the corrective action implementation plan. The CAP presents CAU regulatory boundary objectives and initial use-restriction boundaries identified and negotiated by DOE and the Nevada Division of Environmental Protection (NDEP). The CAP also presents the model evaluation process designed to build confidence that the groundwater flow and contaminant transport modeling results can be used for the regulatory decisions required for CAU closure. The Underground Test Area (UGTA) strategy assumes that active remediation of subsurface RN contamination is not feasible with current technology. As a result, the corrective action is based on a combination of characterization and modeling studies, monitoring, and institutional controls. The strategy is implemented through a four-stage approach that comprises the following: (1) corrective action investigation plan (CAIP), (2) CAI, (3) CADD/CAP, and (4) closure report (CR) stages.

  6. The promises and pitfalls of RNA-interference-based therapeutics

    PubMed Central

    Castanotto, Daniela; Rossi, John J.

    2009-01-01

    The discovery that gene expression can be controlled by the Watson–Crick base-pairing of small RNAs with messenger RNAs containing complementary sequence — a process known as RNA interference — has markedly advanced our understanding of eukaryotic gene regulation and function. The ability of short RNA sequences to modulate gene expression has provided a powerful tool with which to study gene function and is set to revolutionize the treatment of disease. Remarkably, despite being just one decade from its discovery, the phenomenon is already being used therapeutically in human clinical trials, and biotechnology companies that focus on RNA-interference-based therapeutics are already publicly traded. PMID:19158789

  7. Correcting Satellite Image Derived Surface Model for Atmospheric Effects

    NASA Technical Reports Server (NTRS)

    Emery, William; Baldwin, Daniel

    1998-01-01

    This project was a continuation of the project entitled "Resolution Earth Surface Features from Repeat Moderate Resolution Satellite Imagery". In the previous study, a Bayesian Maximum Posterior Estimate (BMPE) algorithm was used to obtain a composite series of repeat imagery from the Advanced Very High Resolution Radiometer (AVHRR). The spatial resolution of the resulting composite was significantly greater than the 1 km resolution of the individual AVHRR images. The BMPE algorithm utilized a simple, no-atmosphere geometrical model for the short-wave radiation budget at the Earth's surface. A necessary assumption of the algorithm is that all non-geometrical parameters remain static over the compositing period. This assumption is of course violated by temporal variations in both the surface albedo and the atmospheric medium. The effect of the albedo variations is expected to be minimal, since the variations are on a fairly long time scale compared to the compositing period; however, the atmospheric variability occurs on a relatively short time scale and can be expected to cause significant errors in the surface reconstruction. The current project proposed to incorporate an atmospheric correction into the BMPE algorithm for the purpose of investigating the effects of a variable atmosphere on the surface reconstructions. Once the atmospheric effects were determined, the investigation could be extended to include corrections for various cloud effects, including short-wave radiation through thin cirrus clouds. The original proposal was written for a three year project, funded one year at a time. The first year of the project focused on developing an understanding of atmospheric corrections and choosing an appropriate correction model. Several models were considered and the list was narrowed to the two best suited. These were the 5S and 6S shortwave radiation models developed at NASA/GODDARD and tested extensively with data from the AVHRR instrument. Although the 6S model

  8. Bias-Corrected Estimation of Noncentrality Parameters of Covariance Structure Models

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2005-01-01

    A bias-corrected estimator of noncentrality parameters of covariance structure models is discussed. The approach represents an application of the bootstrap methodology for purposes of bias correction, and utilizes the relation between the average of resampled conventional noncentrality parameter estimates and their sample counterpart. The…
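    The relation sketched in this record is the standard bootstrap bias correction: the average of resample estimates approximates the expectation of the estimator, so subtracting it from twice the sample estimate removes first-order bias. A generic illustration follows; the function name and the toy variance statistic are ours, not the paper's, which targets noncentrality parameters of covariance structure models.

```python
import numpy as np

def bootstrap_bias_corrected(data, estimator, n_boot=1000, rng=None):
    """Bootstrap bias correction: 2*theta_hat - mean(resample estimates).

    The average of the resample estimates approximates E[theta_hat]; the
    returned quantity removes the first-order bias of the plug-in estimate.
    """
    rng = np.random.default_rng(rng)
    theta_hat = estimator(data)
    boot = np.array([
        estimator(rng.choice(data, size=len(data), replace=True))
        for _ in range(n_boot)
    ])
    return 2.0 * theta_hat - boot.mean()
```

    With the downward-biased plug-in variance (divisor n) as the toy statistic, the corrected estimate is pushed back up toward the unbiased value, illustrating the direction of the correction.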

  9. Optically buffered Jones-matrix-based multifunctional optical coherence tomography with polarization mode dispersion correction

    PubMed Central

    Hong, Young-Joo; Makita, Shuichi; Sugiyama, Satoshi; Yasuno, Yoshiaki

    2014-01-01

    Polarization mode dispersion (PMD) degrades the performance of Jones-matrix-based polarization-sensitive multifunctional optical coherence tomography (JM-OCT). The problem is especially acute for optically buffered JM-OCT, because the long fiber in the optical buffering module induces a large amount of PMD. This paper presents a method to correct the effect of PMD in JM-OCT. We first mathematically model the PMD in JM-OCT and then derive a method to correct it. The method is a combination of a simple hardware modification and subsequent software correction. The hardware modification is the introduction of two polarizers, which transform the PMD into a global complex modulation of the Jones matrix; the software correction then demodulates this global modulation. The method is validated with a point spread function measured experimentally with a mirror sample, as well as by in vivo measurement of a human retina. PMID:25657888

  10. Surface corrections for peridynamic models in elasticity and fracture

    NASA Astrophysics Data System (ADS)

    Le, Q. V.; Bobaru, F.

    2018-04-01

    Peridynamic models are derived by assuming that a material point is located in the bulk. Near a surface or boundary, material points do not have a full non-local neighborhood, which causes the effective material properties near the surface of a peridynamic model to differ slightly from those in the bulk. A number of methods/algorithms have been proposed recently for correcting this peridynamic surface effect. In this study, we investigate the efficacy and computational cost of peridynamic surface correction methods for elasticity and fracture. We provide practical suggestions for reducing the peridynamic surface effect.
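    As a concrete illustration of the surface effect, a simple volume-type correction rescales each point's response by the ratio of the full bulk neighborhood measure to the part of the neighborhood actually inside the body. The one-dimensional sketch below is illustrative only and is not one of the specific correction methods compared in the record.

```python
import numpy as np

def volume_correction_factors(x, length, horizon):
    """1D volume-type surface-correction factors for a bar on [0, length].

    Each point's factor is the full bulk neighborhood measure (2*horizon)
    divided by the portion of the neighborhood inside the body, so it equals
    1.0 in the bulk and exceeds 1.0 near the surfaces, where the non-local
    neighborhood is truncated.
    """
    x = np.asarray(x, dtype=float)
    lo = np.maximum(x - horizon, 0.0)
    hi = np.minimum(x + horizon, length)
    return (2.0 * horizon) / (hi - lo)
```

    A point at the very end of the bar sees only half its neighborhood, so its factor is 2; points farther than one horizon from either end get exactly 1.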

  11. Nonlinear model for offline correction of pulmonary waveform generators.

    PubMed

    Reynolds, Jeffrey S; Stemple, Kimberly J; Petsko, Raymond A; Ebeling, Thomas R; Frazer, David G

    2002-12-01

    Pulmonary waveform generators consisting of motor-driven piston pumps are frequently used to test respiratory-function equipment such as spirometers and peak expiratory flow (PEF) meters. Gas compression within these generators can produce significant distortion of the output flow-time profile. A nonlinear model of the generator was developed along with a method to compensate for gas compression when testing pulmonary function equipment. The model and correction procedure were tested on an Assess Full Range PEF meter and a Micro DiaryCard PEF meter. The tests were performed using the 26 American Thoracic Society standard flow-time waveforms as the target flow profiles. Without correction, the pump loaded with the higher resistance Assess meter resulted in ten waveforms having a mean square error (MSE) higher than 0.001 L²/s². Correction of the pump for these ten waveforms resulted in a mean decrease in MSE of 87.0%. When loaded with the Micro DiaryCard meter, the uncorrected pump outputs included six waveforms with MSE higher than 0.001 L²/s². Pump corrections for these six waveforms resulted in a mean decrease in MSE of 58.4%.
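    The acceptance criterion in the record is a mean square error threshold of 0.001 L²/s² between target and delivered flow profiles, with correction performance reported as a percent decrease in MSE. A small helper for that bookkeeping; the function names and sample values are illustrative, not taken from the paper.

```python
import numpy as np

def waveform_mse(target_flow, measured_flow):
    """Mean square error between target and delivered flow profiles (L^2/s^2)."""
    target_flow = np.asarray(target_flow, dtype=float)
    measured_flow = np.asarray(measured_flow, dtype=float)
    return np.mean((measured_flow - target_flow) ** 2)

def mse_reduction_percent(mse_before, mse_after):
    """Percent decrease in MSE after applying the gas-compression correction."""
    return 100.0 * (mse_before - mse_after) / mse_before
```

    For example, a constant 0.1 L/s flow error gives an MSE of 0.01 L²/s², well above the 0.001 L²/s² threshold, and shrinking the error to 0.02 L/s corresponds to a 96% MSE reduction.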

  12. Toxin-Based Therapeutic Approaches

    PubMed Central

    Shapira, Assaf; Benhar, Itai

    2010-01-01

    Protein toxins confer a defense against predation/grazing or a superior pathogenic competence upon the producing organism. Such toxins have been perfected through evolution in poisonous animals/plants and pathogenic bacteria. Over the past five decades, much effort has been invested in studying their mechanisms of action, the ways they contribute to pathogenicity, and the development of antidotes that neutralize their action. In parallel, many research groups have turned to exploring the pharmaceutical potential of such toxins when used to efficiently impair essential cellular processes and/or damage the integrity of their target cells. The following review summarizes major advances in the field of toxin-based therapeutics and offers a comprehensive description of the mode of action of each applied toxin. PMID:22069564

  13. Recent Advances on Inorganic Nanoparticle-Based Cancer Therapeutic Agents

    PubMed Central

    Wang, Fenglin; Li, Chengyao; Cheng, Jing; Yuan, Zhiqin

    2016-01-01

    Inorganic nanoparticles have been widely investigated as therapeutic agents for cancer treatments in biomedical fields due to their unique physical/chemical properties, versatile synthetic strategies, easy surface functionalization and excellent biocompatibility. This review focuses on the discussion of several types of inorganic nanoparticle-based cancer therapeutic agents, including gold nanoparticles, magnetic nanoparticles, upconversion nanoparticles and mesoporous silica nanoparticles. Several cancer therapy techniques are briefly introduced at the beginning. Emphasis is placed on how these inorganic nanoparticles can provide enhanced therapeutic efficacy in cancer treatment through site-specific accumulation, targeted drug delivery and stimulated drug release, with elaborations on several examples to highlight the respective strategies adopted. Finally, a brief summary and future challenges are included. PMID:27898016

  14. Recent Advances on Inorganic Nanoparticle-Based Cancer Therapeutic Agents.

    PubMed

    Wang, Fenglin; Li, Chengyao; Cheng, Jing; Yuan, Zhiqin

    2016-11-25

    Inorganic nanoparticles have been widely investigated as therapeutic agents for cancer treatments in biomedical fields due to their unique physical/chemical properties, versatile synthetic strategies, easy surface functionalization and excellent biocompatibility. This review focuses on the discussion of several types of inorganic nanoparticle-based cancer therapeutic agents, including gold nanoparticles, magnetic nanoparticles, upconversion nanoparticles and mesoporous silica nanoparticles. Several cancer therapy techniques are briefly introduced at the beginning. Emphasis is placed on how these inorganic nanoparticles can provide enhanced therapeutic efficacy in cancer treatment through site-specific accumulation, targeted drug delivery and stimulated drug release, with elaborations on several examples to highlight the respective strategies adopted. Finally, a brief summary and future challenges are included.

  15. An Overview on the Role of α -Synuclein in Experimental Models of Parkinson's Disease from Pathogenesis to Therapeutics.

    PubMed

    Javed, Hayate; Kamal, Mohammad Amjad; Ojha, Shreesh

    2016-01-01

    Parkinson's disease (PD) is a devastating and progressive movement disorder characterized by muscle rigidity, tremor, postural instability and slow physical movements. Biochemically, PD is characterized by a lack of dopamine production and action due to the loss of dopaminergic neurons, and neuropathologically by the presence of intracytoplasmic inclusions known as Lewy bodies, which mainly consist of the presynaptic neuronal protein α-synuclein (α-syn). It is believed that alteration in α-syn homeostasis leads to increased accumulation and aggregation of α-syn in Lewy bodies. Given the important role of α-syn from pathogenesis to therapeutics, recent research has focused on deciphering its critical role at an advanced level. As the major Lewy body protein with a key role in the pathogenesis of PD, α-syn is studied in several model systems, including immortalized cell lines (SH-SY5Y), primary neuronal cultures, yeast (Saccharomyces cerevisiae), Drosophila (fruit flies), nematodes (Caenorhabditis elegans) and rodents, to understand PD pathogenesis and treatment. To study the etiopathogenesis and develop novel therapeutic targets for α-syn aggregation, the majority of investigators rely on toxin (rotenone, 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine, 6-hydroxydopamine, paraquat)-induced animal models of PD as a tool for basic research, whereas cell- and tissue-based models are mostly utilized to elucidate the mechanistic and molecular pathways underlying α-syn-induced toxicity and therapeutic approaches in PD. Genetically modified mouse models based on α-syn expression are well suited to modeling familial PD, and toxin-induced models provide a suitable approach for sporadic PD. The purpose of this review is to provide a summary and critical review of the involvement of α-syn in various in vitro and in vivo models of PD based on the use of neurotoxins as well as genetic modifications.

  16. Selecting the correct cellular model for assessing the biological response of collagen-based biomaterials.

    PubMed

    Davidenko, Natalia; Hamaia, Samir; Bax, Daniel V; Malcor, Jean-Daniel; Schuster, Carlos F; Gullberg, Donald; Farndale, Richard W; Best, Serena M; Cameron, Ruth E

    2018-01-01

    …the cell adhesion results, showing receptor class- and species-specificities. The understanding of the physiologically relevant cell anchorage characteristics of bio-constructs may assist in the selection of (1) the optimum collagen source for cellular supports and (2) the correct cellular model for their biological assessment. This, in turn, may allow reliable prediction of the biological performance of bio-scaffolds in vivo for specific TE applications. Integrins play a vital role in cellular responses to environmental cues during early-stage cell-substrate interaction. We describe physiologically relevant cell anchorage to collagen substrates that present different affinity cell-recognition motifs, to provide experimental tools to assist in understanding integrin binding. Using different cell types and recombinant integrin α1-I-domains, we found that cellular response was highly dependent on collagen type, origin and EDC-crosslinking status, as well as on the integrin class and species of origin. This comprehensive study establishes selectivity amongst the four collagen-binding integrins and species-specific properties that together may influence the choice of cell type and receptor in different experimental settings. This work offers key guidance in selecting the correct cellular model for the biological testing of collagen-based biomaterials. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  17. Accuracy Improvement Capability of Advanced Projectile Based on Course Correction Fuze Concept

    PubMed Central

    Elsaadany, Ahmed; Wen-jun, Yi

    2014-01-01

    Improvement in terminal accuracy is an important objective for future artillery projectiles and is often associated with range extension. Various concepts and modifications, such as the course correction fuze, have been proposed to correct the range and drift of artillery projectiles. Course correction fuze concepts could provide an attractive and cost-effective solution for improving munitions accuracy. In this paper, trajectory correction is obtained using two kinds of course correction modules: one devoted to range correction (a drag ring brake) and one devoted to drift correction (a canard-based correction fuze). The course correction modules were characterized by aerodynamic computations and flight dynamic investigations in order to analyze the effects of the projectile aerodynamic parameters on deflection. The simulation results show that the impact accuracy of a conventional projectile can be improved using these course correction modules. The drag ring brake is found to be highly capable of range correction: deploying the drag brake early in the trajectory produces a large range correction, and the deployment time can be predefined depending on the required range correction. The canard-based correction fuze, on the other hand, has a greater effect on projectile drift by modifying its roll rate. In addition, canard extension induces a high-frequency incidence angle as the canards reciprocate with the roll motion. PMID:25097873

  18. Accuracy improvement capability of advanced projectile based on course correction fuze concept.

    PubMed

    Elsaadany, Ahmed; Wen-jun, Yi

    2014-01-01

    Improvement in terminal accuracy is an important objective for future artillery projectiles and is often associated with range extension. Various concepts and modifications, such as the course correction fuze, have been proposed to correct the range and drift of artillery projectiles. Course correction fuze concepts could provide an attractive and cost-effective solution for improving munitions accuracy. In this paper, trajectory correction is obtained using two kinds of course correction modules: one devoted to range correction (a drag ring brake) and one devoted to drift correction (a canard-based correction fuze). The course correction modules were characterized by aerodynamic computations and flight dynamic investigations in order to analyze the effects of the projectile aerodynamic parameters on deflection. The simulation results show that the impact accuracy of a conventional projectile can be improved using these course correction modules. The drag ring brake is found to be highly capable of range correction: deploying the drag brake early in the trajectory produces a large range correction, and the deployment time can be predefined depending on the required range correction. The canard-based correction fuze, on the other hand, has a greater effect on projectile drift by modifying its roll rate. In addition, canard extension induces a high-frequency incidence angle as the canards reciprocate with the roll motion.

  19. Evaluation of scoring models for identifying the need for therapeutic intervention of upper gastrointestinal bleeding: A new prediction score model for Japanese patients.

    PubMed

    Iino, Chikara; Mikami, Tatsuya; Igarashi, Takasato; Aihara, Tomoyuki; Ishii, Kentaro; Sakamoto, Jyuichi; Tono, Hiroshi; Fukuda, Shinsaku

    2016-11-01

    Multiple scoring systems have been developed to predict outcomes in patients with upper gastrointestinal bleeding. We determined how well these and a newly established scoring model predict the need for therapeutic intervention, excluding transfusion, in Japanese patients with upper gastrointestinal bleeding. We reviewed data from 212 consecutive patients with upper gastrointestinal bleeding. Patients requiring endoscopic intervention, operation, or interventional radiology were allocated to the therapeutic intervention group. First, we compared areas under the curve for the Glasgow-Blatchford, Clinical Rockall, and AIMS65 scores. Second, the scores and factors likely associated with upper gastrointestinal bleeding were analyzed with a logistic regression analysis to form a new scoring model. Third, the new model and the existing models were investigated to evaluate their usefulness. Therapeutic intervention was required in 109 patients (51.4%). The Glasgow-Blatchford score was superior to both the Clinical Rockall and AIMS65 scores for predicting the need for therapeutic intervention (area under the curve, 0.75 [95% confidence interval, 0.69-0.81] vs 0.53 [0.46-0.61] and 0.52 [0.44-0.60], respectively). Multivariate logistic regression analysis retained seven significant predictors in the model: systolic blood pressure <100 mmHg, syncope, hematemesis, hemoglobin <10 g/dL, blood urea nitrogen ≥22.4 mg/dL, estimated glomerular filtration rate ≤60 mL/min per 1.73 m², and antiplatelet medication. Based on these variables, we established a new scoring model with superior discrimination to those of existing scoring systems (area under the curve, 0.85 [0.80-0.90]). We developed a superior scoring model for identifying the need for therapeutic intervention in Japanese patients with upper gastrointestinal bleeding. © 2016 Japan Gastroenterological Endoscopy Society.
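    The abstract lists the seven retained predictors but not their point weights, so the published score cannot be reproduced from the record alone. A minimal sketch of how such a score could be tallied follows; the equal one-point weighting is a hypothetical simplification, not the authors' model.

```python
def gi_bleed_score(sbp_lt_100, syncope, hematemesis, hgb_lt_10,
                   bun_ge_22_4, egfr_le_60, antiplatelet):
    """Tally the seven predictors retained in the record's logistic model.

    Each argument is True/False for: systolic BP <100 mmHg, syncope,
    hematemesis, hemoglobin <10 g/dL, BUN >=22.4 mg/dL, eGFR <=60
    mL/min/1.73 m^2, and antiplatelet medication. One point per positive
    predictor (hypothetical weighting; the published model may weight
    predictors unequally).
    """
    return sum(int(bool(v)) for v in (sbp_lt_100, syncope, hematemesis,
                                      hgb_lt_10, bun_ge_22_4, egfr_le_60,
                                      antiplatelet))
```

    In practice such a tally would be dichotomized at a cut-off chosen from the ROC analysis to flag patients likely to need endoscopic or other intervention.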

  20. Therapeutic strategies based on modified U1 snRNAs and chaperones for Sanfilippo C splicing mutations.

    PubMed

    Matos, Liliana; Canals, Isaac; Dridi, Larbi; Choi, Yoo; Prata, Maria João; Jordan, Peter; Desviat, Lourdes R; Pérez, Belén; Pshezhetsky, Alexey V; Grinberg, Daniel; Alves, Sandra; Vilageliu, Lluïsa

    2014-12-10

    Mutations affecting RNA splicing represent more than 20% of the mutant alleles in Sanfilippo syndrome type C, a rare lysosomal storage disorder that causes severe neurodegeneration. Many of these mutations are localized in the conserved donor or acceptor splice sites, while few are found in the nearby nucleotides. In this study we tested several therapeutic approaches specifically designed for different splicing mutations, depending on how the mutations affect mRNA processing. For three mutations that affect the donor site (c.234+1G>A, c.633+1G>A and c.1542+4dupA), different modified U1 snRNAs recognizing the mutated donor sites were developed in an attempt to rescue the normal splicing process. For another mutation that affects an acceptor splice site (c.372-2A>G) and gives rise to a protein lacking four amino acids, a competitive inhibitor of the HGSNAT protein, glucosamine, was tested as a pharmacological chaperone to correct the aberrant folding and to restore normal trafficking of the protein to the lysosome. Partial correction of the c.234+1G>A mutation was achieved with a modified U1 snRNA that completely matches the splice donor site, suggesting that these molecules may have therapeutic potential for some splicing mutations. Furthermore, the importance of the splice-site sequence context is highlighted as a key factor in the success of this type of therapy. Additionally, glucosamine treatment resulted in an increase in enzymatic activity, indicating a partial recovery of correct folding. We have assayed two therapeutic strategies for different splicing mutations with promising results for future applications.

  1. Vascular input function correction of inflow enhancement for improved pharmacokinetic modeling of liver DCE-MRI.

    PubMed

    Ning, Jia; Schubert, Tilman; Johnson, Kevin M; Roldán-Alzate, Alejandro; Chen, Huijun; Yuan, Chun; Reeder, Scott B

    2018-06-01

    To propose a simple method to correct the vascular input function (VIF) for inflow effects and to test whether the proposed method provides more accurate VIFs for improved pharmacokinetic modeling. A spoiled gradient echo sequence-based inflow quantification and contrast agent concentration correction method was proposed. Simulations were conducted to illustrate the improvement in the accuracy of VIF estimation and pharmacokinetic fitting. Animal studies with dynamic contrast-enhanced MR scans were conducted before, 1 week after, and 2 weeks after portal vein embolization (PVE) was performed in the left portal circulation of pigs. The proposed method was applied to correct the VIFs for model fitting. Pharmacokinetic parameters fitted using corrected and uncorrected VIFs were compared between different lobes and visits. Simulation results demonstrated that the proposed method can improve the accuracy of VIF estimation and pharmacokinetic fitting. In the animal study results, pharmacokinetic fitting using corrected VIFs demonstrated changes in perfusion consistent with those expected after PVE, whereas the perfusion estimates derived from uncorrected VIFs showed no significant changes. The proposed correction method improves the accuracy of VIFs and therefore provides more precise pharmacokinetic fitting. This method may be promising for improving the reliability of perfusion quantification. Magn Reson Med 79:3093-3102, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
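    For context on why inflow biases the VIF: the conventional conversion from signal to contrast agent concentration in such studies rests on the standard spoiled gradient echo steady-state signal equation (assuming ideal spoiling; here \(M_0\) is the equilibrium magnetization, \(\alpha\) the flip angle, \(\mathrm{TR}\) the repetition time and \(T_1\) the longitudinal relaxation time):

```latex
S = M_0 \sin\alpha \,
    \frac{1 - e^{-\mathrm{TR}/T_1}}{1 - \cos\alpha \, e^{-\mathrm{TR}/T_1}}
```

    Spins flowing into the imaging volume have experienced fewer excitations than this steady-state expression assumes, so vascular signal is artificially elevated; correcting the VIF amounts to accounting for that elevated inflow signal before converting signal to concentration. This equation is standard background, not a formula quoted from the record.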

  2. Hydrological modeling as an evaluation tool of EURO-CORDEX climate projections and bias correction methods

    NASA Astrophysics Data System (ADS)

    Hakala, Kirsti; Addor, Nans; Seibert, Jan

    2017-04-01

    Streamflow stemming from Switzerland's mountainous landscape will be influenced by climate change, which will pose significant challenges to the water management and policy sector. In climate change impact research, the determination of future streamflow is impeded by different sources of uncertainty, which propagate through the model chain. In this research, we explicitly considered the following sources of uncertainty: (1) climate models, (2) downscaling of the climate projections to the catchment scale, (3) bias correction method and (4) parameterization of the hydrological model. We utilize climate projections at 0.11 degree (approximately 12.5 km) resolution from the EURO-CORDEX project, which are the most recent climate projections for the European domain. EURO-CORDEX comprises regional climate model (RCM) simulations, which have been downscaled from global climate models (GCMs) from the CMIP5 archive using both dynamical and statistical techniques. Uncertainties are explored by applying a modeling chain involving 14 GCM-RCMs to ten Swiss catchments. We utilize the rainfall-runoff model HBV Light, which has been widely used in operational hydrological forecasting. The Lindström measure, a combination of model efficiency and volume error, was used as an objective function to calibrate HBV Light. The ten best parameter sets are then obtained by calibration using the genetic algorithm and Powell optimization (GAP) method. The GAP optimization method is based on the evolution of parameter sets, selecting and recombining high-performing parameter sets with each other. Once HBV Light is calibrated, we perform a quantitative comparison of the biases inherited from the climate model simulations with those stemming from the hydrological model. The evaluation is conducted over two time periods: i) 1980-2009 to characterize the simulation realism under the current climate and ii) 2070-2099 to identify the magnitude of the projected change of
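    The Lindström measure named as the calibration objective is conventionally defined as Nash-Sutcliffe efficiency minus a weighted absolute relative volume error. A small sketch under that conventional definition (the function name is ours; w = 0.1 is the weight commonly used with HBV, and the record does not state which weight was applied):

```python
import numpy as np

def lindstrom_measure(sim, obs, w=0.1):
    """Lindström criterion: Nash-Sutcliffe efficiency minus w * |relative volume error|.

    Combines goodness of fit of the simulated hydrograph (NSE) with the
    overall water-balance error, so calibration cannot trade a systematic
    volume bias for a marginally better fit of the dynamics.
    """
    sim = np.asarray(sim, dtype=float)
    obs = np.asarray(obs, dtype=float)
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    rel_volume_error = abs(np.sum(sim) - np.sum(obs)) / np.sum(obs)
    return nse - w * rel_volume_error
```

    A perfect simulation scores exactly 1; a simulation that overestimates every flow by 10% is penalized both through the NSE term and through the volume-error term.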

  3. Application of Pressure-Based Wall Correction Methods to Two NASA Langley Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Iyer, V.; Everhart, J. L.

    2001-01-01

    This paper is a description and status report on the implementation and application of the WICS wall interference method to the National Transonic Facility (NTF) and the 14 x 22-ft subsonic wind tunnel at the NASA Langley Research Center. The method calculates free-air corrections to the measured parameters and aerodynamic coefficients for full span and semispan models when the tunnels are in the solid-wall configuration. From a data quality point of view, these corrections remove predictable bias errors in the measurement due to the presence of the tunnel walls. At the NTF, the method is operational in the off-line and on-line modes, with three tests already computed for wall corrections. At the 14 x 22-ft tunnel, initial implementation has been done based on a test on a full span wing. This facility is currently scheduled for an upgrade to its wall pressure measurement system. With the addition of new wall orifices and other instrumentation upgrades, a significant improvement in the wall correction accuracy is expected.

  4. Constraint based modeling of metabolism allows finding metabolic cancer hallmarks and identifying personalized therapeutic windows.

    PubMed

    Bordel, Sergio

    2018-04-13

    In order to choose optimal personalized anticancer treatments, transcriptomic data should be analyzed within the frame of biological networks. The best known human biological network (in terms of the interactions between its different components) is metabolism. Cancer cells have long been known to have specific metabolic features, and currently there is a growing interest in characterizing new cancer-specific metabolic hallmarks. This article presents a method to find personalized therapeutic windows using RNA-seq data and Genome Scale Metabolic Models. The method is implemented in the Python library pyTARG. Our predictions showed that the most selectively anticancer (affecting 27 out of 34 considered cancer cell lines and only 1 out of 6 healthy mesenchymal stem cell lines) single metabolic reactions are those involved in cholesterol biosynthesis. Excluding cholesterol biosynthesis, all the considered cell lines can be selectively affected by targeting different combinations (from 1 to 5 reactions) of only 18 metabolic reactions, which suggests that a small subset of drugs or siRNAs combined in patient-specific manners could be at the core of metabolism-based personalized treatments.

  5. Adaptive gamma correction-based expert system for nonuniform illumination face enhancement

    NASA Astrophysics Data System (ADS)

    Abdelhamid, Iratni; Mustapha, Aouache; Adel, Oulefki

    2018-03-01

    The image quality of a face recognition system suffers under severe lighting conditions. Thus, this study aims to develop an approach for nonuniform illumination adjustment based on an adaptive gamma correction (AdaptGC) filter that can solve the aforementioned issue. An approach for adaptive gain factor prediction was developed via neural network model-based cross-validation (NN-CV). To achieve this objective, a gamma correction function and its effects on face image quality with different gain values were examined first. Second, an orientation histogram (OH) algorithm was assessed as a facial feature descriptor. Subsequently, a density histogram module was developed for face label generation. During the NN-CV construction, the model was assessed to recognize the OH descriptor and predict the face label. The performance of the NN-CV model was evaluated by examining the statistical measures of root mean square error and coefficient of efficiency. Third, to evaluate the AdaptGC enhancement approach, an image quality metric was adopted using enhancement by entropy, contrast per pixel, second-derivative-like measure of enhancement, and sharpness, supported by visual inspection. The experimental results were examined using five face databases, namely, extended Yale-B; Carnegie Mellon University Pose, Illumination, and Expression (CMU-PIE); Mobio; FERET; and Oulu-CASIA-NIR-VIS. The final results prove that the AdaptGC filter, compared with state-of-the-art methods, is the best choice in terms of contrast and nonuniform illumination adjustment. In summary, the benefits attained prove that AdaptGC is driven by a profitable enhancement rate, which provides satisfying features for high-rate face recognition systems.
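    The core gamma-correction step this entry builds on can be sketched as follows. The log-mean rule for choosing the gain adaptively is a common heuristic included only as an illustrative stand-in; the paper itself predicts the gain with a neural network:

```python
import math

def gamma_correct(pixel, gamma):
    """Gamma-correct one 8-bit intensity: out = 255 * (in/255) ** gamma.
    gamma < 1 brightens dark regions; gamma > 1 compresses highlights."""
    return round(255 * (pixel / 255) ** gamma)

def adapt_gamma(mean_intensity):
    """Log-mean heuristic: pick the gamma that maps the image's mean
    intensity to mid-gray (128). Valid for means strictly between 0 and 255."""
    return math.log(0.5) / math.log(mean_intensity / 255)

# A dark image (mean intensity 64) yields gamma < 1 and is brightened.
g = adapt_gamma(64)
print(g < 1, gamma_correct(64, g) > 64)  # True True
```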

  6. A novel strategy for development of recombinant antitoxin therapeutics tested in a mouse botulism model.

    PubMed

    Mukherjee, Jean; Tremblay, Jacqueline M; Leysath, Clinton E; Ofori, Kwasi; Baldwin, Karen; Feng, Xiaochuan; Bedenice, Daniela; Webb, Robert P; Wright, Patrick M; Smith, Leonard A; Tzipori, Saul; Shoemaker, Charles B

    2012-01-01

    Antitoxins are needed that can be produced economically with improved safety and shelf life compared to conventional antisera-based therapeutics. Here we report a practical strategy for development of simple antitoxin therapeutics with substantial advantages over currently available treatments. The therapeutic strategy employs a single recombinant 'targeting agent' that binds a toxin at two unique sites and a 'clearing Ab' that binds two epitopes present on each targeting agent. Co-administration of the targeting agent and the clearing Ab results in decoration of the toxin with up to four Abs to promote accelerated clearance. The therapeutic strategy was applied to two Botulinum neurotoxin (BoNT) serotypes and protected mice from lethality in two different intoxication models with an efficacy equivalent to conventional antitoxin serum. Targeting agents were a single recombinant protein consisting of a heterodimer of two camelid anti-BoNT heavy-chain-only Ab V(H) (VHH) binding domains and two E-tag epitopes. The clearing mAb was an anti-E-tag mAb. Comparing the in vivo efficacy of treatments that employed neutralizing vs. non-neutralizing agents, or the presence vs. absence of clearing Ab, permitted unprecedented insight into the roles of toxin neutralization and clearance in antitoxin efficacy. Surprisingly, when a post-intoxication treatment model was used, a toxin-neutralizing heterodimer agent fully protected mice from intoxication even in the absence of clearing Ab. Thus a single, easy-to-produce recombinant protein was as efficacious as polyclonal antiserum in a clinically relevant mouse model of botulism. This strategy should have widespread application in antitoxin development and other therapies in which neutralization and/or accelerated clearance of a serum biomolecule can offer therapeutic benefit.

  7. Lens correction algorithm based on the see-saw diagram to correct Seidel aberrations employing aspheric surfaces

    NASA Astrophysics Data System (ADS)

    Rosete-Aguilar, Martha

    2000-06-01

    In this paper a lens correction algorithm based on the see-saw diagram developed by Burch is described. The see-saw diagram describes image correction in rotationally symmetric systems over a finite field of view by means of aspheric surfaces. The algorithm is applied to the design of some basic telescopic configurations, such as the classical Cassegrain telescope, the Dall-Kirkham telescope, the Pressman-Camichel telescope and the Ritchey-Chretien telescope, in order to show a physically visualizable concept of image correction for optical systems that employ aspheric surfaces. By using the see-saw method the student can visualize the different possible configurations of such telescopes as well as their performance, and will also be able to understand that it is not always possible to correct more primary aberrations by aspherizing more surfaces.

  8. Carrier-phase multipath corrections for GPS-based satellite attitude determination

    NASA Technical Reports Server (NTRS)

    Axelrad, A.; Reichert, P.

    2001-01-01

    This paper demonstrates the high degree of spatial repeatability of carrier-phase multipath errors in a spacecraft environment and describes a correction technique, termed the sky map method, which exploits this spatial correlation to correct measurements and improve the accuracy of GPS-based attitude solutions.
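    The sky map method exploits the fact that multipath errors repeat for a given line-of-sight direction, so residuals learned in a calibration pass can be subtracted from later measurements. A minimal sketch as a binned azimuth/elevation lookup (the 5-degree bin size and the sample data are illustrative assumptions, not from the paper):

```python
def bin_index(az, el, bin_deg=5):
    """Quantize a line-of-sight direction into a sky-map cell."""
    return (int(az // bin_deg), int(el // bin_deg))

def build_sky_map(calibration):
    """Average carrier-phase residuals per (azimuth, elevation) bin.
    `calibration` is a list of (az_deg, el_deg, residual) samples."""
    sums = {}
    for az, el, res in calibration:
        key = bin_index(az, el)
        s, n = sums.get(key, (0.0, 0))
        sums[key] = (s + res, n + 1)
    return {k: s / n for k, (s, n) in sums.items()}

def correct(az, el, measurement, sky_map):
    """Subtract the stored multipath residual for this direction;
    directions never seen in calibration are left unchanged."""
    return measurement - sky_map.get(bin_index(az, el), 0.0)

# Two calibration samples fall in the same 5-degree cell; their mean (0.03)
# is later subtracted from a measurement taken along the same direction.
sky = build_sky_map([(10, 40, 0.02), (11, 41, 0.04)])
print(correct(12, 42, 1.0, sky))  # 0.97
```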

  9. Repeat-aware modeling and correction of short read errors.

    PubMed

    Yang, Xiao; Aluru, Srinivas; Dorman, Karin S

    2011-02-15

    High-throughput short read sequencing is revolutionizing genomics and systems biology research by enabling cost-effective deep coverage sequencing of genomes and transcriptomes. Error detection and correction are crucial to many short read sequencing applications including de novo genome sequencing, genome resequencing, and digital gene expression analysis. Short read error detection is typically carried out by counting the observed frequencies of kmers in reads and validating those with frequencies exceeding a threshold. In the case of genomes with high repeat content, an erroneous kmer may be frequently observed if it has few nucleotide differences with valid kmers that occur multiple times in the genome. Error detection and correction have mostly been applied to genomes with low repeat content, and this remains a challenging problem for genomes with high repeat content. We develop a statistical model and a computational method for error detection and correction in the presence of genomic repeats. We propose a method to infer genomic frequencies of kmers from their observed frequencies by analyzing the misread relationships among observed kmers. We also propose a method to estimate the threshold useful for validating kmers whose estimated genomic frequency exceeds the threshold. We demonstrate that superior error detection is achieved using these methods. Furthermore, we break away from the common assumption of uniformly distributed errors within a read, and provide a framework to model position-dependent error occurrence frequencies common to many short read platforms. Lastly, we achieve better error correction in genomes with high repeat content. The software is implemented in C++ and is freely available under the GNU GPL3 license and the Boost Software License v1.0 at "http://aluru-sun.ece.iastate.edu/doku.php?id=redeem".
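    The baseline k-mer counting scheme that this paper improves on can be sketched as follows (k, the threshold, and the reads are illustrative; the paper's contribution is a statistical model that replaces the fixed threshold in repeat-rich genomes):

```python
from collections import Counter

def count_kmers(reads, k):
    """Count k-mer occurrences across a set of reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def flag_suspect_kmers(counts, threshold):
    """k-mers observed fewer than `threshold` times are flagged as likely
    sequencing errors; this naive rule misfires on repetitive genomes,
    which motivates the repeat-aware model."""
    return {kmer for kmer, c in counts.items() if c < threshold}

# The 4-mer "ACGA" appears once (a plausible single-base error),
# while the true 4-mers are observed two or three times.
counts = count_kmers(["ACGTACGT", "ACGTACGA"], k=4)
print(flag_suspect_kmers(counts, threshold=2))  # {'ACGA'}
```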

  10. Stem cells: The Next Therapeutic Frontier

    PubMed Central

    Humes, H. David

    2005-01-01

    Cell therapy is one of the most exciting fields in translational medicine. It stands at the intersection of a variety of rapidly developing scientific disciplines: stem cell biology, immunology, tissue engineering, molecular biology, biomaterials, transplantation biology, regenerative medicine, and clinical research. Cell-based therapy may develop into a new therapeutic platform to treat a vast array of clinical disorders. Blood transfusions and bone marrow transplantation are prime examples of the successful application of cell-based therapeutics; but recent advances in cellular and molecular biology have expanded the potential applications of this approach. Although recombinant genetic engineering to produce a variety of therapeutics such as human erythropoietin and insulin has proven successful, these treatments are unable to completely correct or reverse disease states, because most common disease processes are not due to the deficiency of a single protein but develop due to alterations in the complex interactions of a variety of cell components. In these complex situations, cell-based therapy may be a more successful strategy by providing a dynamic, interactive, and individualized therapeutic approach that responds to the pathophysiological condition of the patient. In this regard, cells may provide innovative methods for drug delivery of biologics, immunotherapy, and tissue regenerative or replacement engineering (1,2). The translation of this discipline to medical practice has tremendous potential, but in many applications technological issues need to be overcome. Since many cell-based indications are already being evaluated in the clinic, the field appears to be on the threshold of a number of successes. This review will focus on our group's use of human stem/progenitor cells in the treatment of acute and chronic renal failure as extensions to the current successful renal substitution processes of hemodialysis and hemofiltration. PMID:16555613

  11. Correction of Measured Taxicab Exhaust Emission Data Based on the CMEM Model

    NASA Astrophysics Data System (ADS)

    Li, Q.; Jia, T.

    2017-09-01

    Carbon dioxide emissions from urban road traffic mainly come from automobile exhaust. However, the carbon dioxide emissions recorded by measurement instruments are unreliable due to time delay error. In order to improve the reliability of the data, we propose a method to correct measured vehicle carbon dioxide emissions based on the CMEM model. Firstly, a synthetic time series of carbon dioxide emissions is simulated by the CMEM model from GPS velocity data. Then, taking the simulated data as the control group, the time delay error of the measured carbon dioxide emissions is estimated by asynchronous correlation analysis, and outliers are automatically identified and corrected using the principle of the DTW algorithm. Taking taxi trajectory data from Wuhan as an example, the results show that the correlation coefficient between the measured data and the control group can be improved from 0.52 to 0.59 by mitigating the systematic time delay error. Furthermore, by adjusting the outliers, which account for 4.73% of the total data, the correlation coefficient rises to 0.63, which suggests strong correlation. Low-carbon traffic has become a focus of local government; in response to calls for energy saving and emission reduction, the distribution of carbon emissions from motor vehicle exhaust was studied, and the corrected data can be used for further air quality analysis.
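    The asynchronous (lagged) correlation step for estimating the instrument's time delay against the simulated control series can be sketched as follows (the series and maximum lag are illustrative; the DTW-based outlier adjustment is not shown):

```python
def correlation(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def estimate_delay(measured, simulated, max_lag):
    """Shift the measured series back against the simulated control
    series and return the lag with the highest correlation."""
    best_lag, best_r = 0, float("-inf")
    for lag in range(max_lag + 1):
        n = len(simulated) - lag
        r = correlation(measured[lag:lag + n], simulated[:n])
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag

# The measured signal repeats the simulated one two samples late.
simulated = [0, 1, 2, 3, 2, 1, 0, 0, 0, 0]
measured = [0, 0, 0, 1, 2, 3, 2, 1, 0, 0]
print(estimate_delay(measured, simulated, max_lag=4))  # 2
```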

  12. The current state of therapeutic and T cell-based vaccines against human papillomaviruses.

    PubMed

    Yang, Andrew; Farmer, Emily; Lin, John; Wu, T-C; Hung, Chien-Fu

    2017-03-02

    Human papillomavirus (HPV) is known to be a necessary factor for many gynecologic malignancies and is also associated with a subset of head and neck malignancies. This knowledge has created the opportunity to control these HPV-associated cancers through vaccination. However, despite the availability of prophylactic HPV vaccines, HPV infections remain extremely common worldwide. In addition, while prophylactic HPV vaccines have been effective in preventing infection, they are ineffective at clearing pre-existing HPV infections. Thus, there is an urgent need for therapeutic and T cell-based vaccines to treat existing HPV infections and HPV-associated lesions and cancers. Unlike prophylactic vaccines, which generate neutralizing antibodies, therapeutic and T cell-based vaccines enhance cell-mediated immunity against HPV antigens. Our review will cover various therapeutic and T cell-based vaccines in development for the treatment of HPV-associated diseases. Furthermore, we review the strategies to enhance the efficacy of therapeutic vaccines and the latest clinical trials on therapeutic and T cell-based HPV vaccines. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Toward Exosome-Based Therapeutics: Isolation, Heterogeneity, and Fit-for-Purpose Potency

    PubMed Central

    Willis, Gareth R.; Kourembanas, Stella; Mitsialis, S. Alex

    2017-01-01

    Exosomes are defined as submicron (30–150 nm), lipid bilayer-enclosed extracellular vesicles (EVs), specifically generated by the late endosomal compartment through fusion of multivesicular bodies with the plasma membrane. Produced by almost all cells, exosomes were originally considered to represent just a mechanism for jettisoning unwanted cellular moieties. Although this may be a major function in most cells, evolution has recruited the endosomal membrane-sorting pathway to duties beyond mere garbage disposal, one of the most notable examples being its cooption by retroviruses for the generation of Trojan virions. It is, therefore, tempting to speculate that certain cell types have evolved an exosome subclass active in intracellular communication. We term this EV subclass “signalosomes” and define them as exosomes that are produced by the “signaling” cells upon specific physiological or environmental cues and harbor cargo capable of modulating the programming of recipient cells. Our recent studies have established that signalosomes released by mesenchymal stem/stromal cells (MSCs) represent the main vector of MSC immunomodulation and therapeutic action in animal models of lung disease. The efficacy of MSC-exosome treatments in a number of preclinical models of cardiovascular and pulmonary disease supports the promise of application of exosome-based therapeutics across a wide range of pathologies within the near future. However, the full realization of exosome therapeutic potential has been hampered by the absence of standardization in EV isolation, and procedures for purification of signalosomes from the main exosome population. This is mainly due to immature methodologies for exosome isolation and characterization and our incomplete understanding of the specific characteristics and molecular composition of signalosomes. In addition, difficulties in defining metrics for potency of exosome preparations and the challenges of industrial scale-up and

  14. Messenger RNA-based therapeutics for the treatment of apoptosis-associated diseases.

    PubMed

    Matsui, Akitsugu; Uchida, Satoshi; Ishii, Takehiko; Itaka, Keiji; Kataoka, Kazunori

    2015-10-28

    Gene therapy is a promising approach for treating diseases that are closely associated with excessive apoptosis, because the gene can effectively and sustainably introduce anti-apoptotic factors into cells. However, DNA delivery poses the risk of random genomic integration, leading to overexpression of the delivered gene and cancer development. Messenger RNA (mRNA) can evade integration events in target cells. We examined the use of mRNA-based therapeutics for introducing anti-apoptotic factors by using a mouse model of fulminant hepatitis. For introducing mRNA into the liver, a synthesised polymer-based carrier of polyplex nanomicelles was used for hydrodynamic intravenous injection. Using GFP as a reporter, we demonstrate that mRNA delivery induced efficient protein expression in almost 100% of liver cells, while plasmid DNA (pDNA) delivery provided a smaller percentage of GFP-positive cells. Analyses using Cy5-labelled mRNA and pDNA revealed that efficient expression by mRNA was attributed to a simple intracellular mechanism, without the need for nuclear entry. Consistent with this observation, Bcl-2 mRNA was more effective in reducing apoptosis in the liver of mice with fulminant hepatitis than Bcl-2 pDNA. Therefore, mRNA-based therapeutics combined with an effective delivery system such as polyplex nanomicelles is a promising treatment for intractable diseases associated with excessive apoptosis.

  15. Establishment of a cell-based wound healing assay for bio-relevant testing of wound therapeutics.

    PubMed

    Planz, Viktoria; Wang, Jing; Windbergs, Maike

    Predictive in vitro testing of novel wound therapeutics requires adequate cell-based bio-assays. Such assays represent an integral part of preclinical development as a pre-step before entering in vivo studies. Simple "scratch tests" based on defected skin cell monolayers exist, however these can solely be used for testing liquids, as cell monolayer destruction and excessive hydration limit their applicability for (semi-)solid systems like wound dressings. In this context, a cell-based wound healing assay is introduced for rapid and predictive testing of wound therapeutics independent of their physical state in a bio-relevant environment. A novel wound healing assay was established for bio-relevant and predictive testing of (semi-)solid wound therapeutics. The assay allows for physiologically relevant hydration of the tested wound therapeutics at the air-liquid interface and their removal without cell monolayer disruption. In a proof-of-concept study, the applicability and discriminative power could be demonstrated by examining unloaded and drug-loaded wound dressings with two different established wound healing actives (dexpanthenol and metyrapone) and their effect on skin cell behavior. The influence of the released drug on the cells' healing behavior could successfully be monitored over time. Wound size assessment after 96 h resulted in an eightfold smaller wound area for drug-treated models compared to those treated with unloaded fibers and non-treated wounds. This assay provides valuable first insights towards the establishment of a valid screening and evaluation tool for preclinical wound therapeutic development from liquid to (semi-)solid systems to improve predictability in a simple, yet standardized way. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Targeting therapeutics to the glomerulus with nanoparticles.

    PubMed

    Zuckerman, Jonathan E; Davis, Mark E

    2013-11-01

    Nanoparticles are an enabling technology for the creation of tissue-/cell-specific therapeutics that have been investigated extensively as targeted therapeutics for cancer. The kidney, specifically the glomerulus, is another accessible site for nanoparticle delivery that has been relatively overlooked as a target organ. Given the medical need for the development of more potent, kidney-targeted therapies, the use of nanoparticle-based therapeutics may be one such solution to this problem. Here, we review the literature on nanoparticle targeting of the glomerulus. Specifically, we provide a broad overview of nanoparticle-based therapeutics and how the unique structural characteristics of the glomerulus allow for selective, nanoparticle targeting of this area of the kidney. We then summarize literature examples of nanoparticle delivery to the glomerulus and elaborate on the appropriate nanoparticle design criteria for glomerular targeting. Finally, we discuss the behavior of nanoparticles in animal models of diseased glomeruli and review examples of nanoparticle therapeutic approaches that have shown promise in animal models of glomerulonephritic disease. Copyright © 2013 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  17. 19 CFR 142.50 - Line Release data base corrections or changes.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 2 2014-04-01 2014-04-01 false Line Release data base corrections or changes. 142...; DEPARTMENT OF THE TREASURY (CONTINUED) ENTRY PROCESS Line Release § 142.50 Line Release data base corrections... numbers or bond information on a Line Release Data Loading Sheet as soon as possible. Notification shall...

  18. 19 CFR 142.50 - Line Release data base corrections or changes.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Line Release data base corrections or changes. 142...; DEPARTMENT OF THE TREASURY (CONTINUED) ENTRY PROCESS Line Release § 142.50 Line Release data base corrections... numbers or bond information on a Line Release Data Loading Sheet as soon as possible. Notification shall...

  19. 19 CFR 142.50 - Line Release data base corrections or changes.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 2 2011-04-01 2011-04-01 false Line Release data base corrections or changes. 142...; DEPARTMENT OF THE TREASURY (CONTINUED) ENTRY PROCESS Line Release § 142.50 Line Release data base corrections... numbers or bond information on a Line Release Data Loading Sheet as soon as possible. Notification shall...

  20. 19 CFR 142.50 - Line Release data base corrections or changes.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 2 2013-04-01 2013-04-01 false Line Release data base corrections or changes. 142...; DEPARTMENT OF THE TREASURY (CONTINUED) ENTRY PROCESS Line Release § 142.50 Line Release data base corrections... numbers or bond information on a Line Release Data Loading Sheet as soon as possible. Notification shall...

  1. 19 CFR 142.50 - Line Release data base corrections or changes.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Line Release data base corrections or changes. 142...; DEPARTMENT OF THE TREASURY (CONTINUED) ENTRY PROCESS Line Release § 142.50 Line Release data base corrections... numbers or bond information on a Line Release Data Loading Sheet as soon as possible. Notification shall...

  2. Evaluation of NWP-based Satellite Precipitation Error Correction with Near-Real-Time Model Products and Flood-inducing Storms

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Anagnostou, E. N.; Schwartz, C. S.

    2017-12-01

    Satellite precipitation products tend to have significant biases over complex terrain. Our research investigates a statistical approach for satellite precipitation adjustment based solely on numerical weather simulations. This approach has been evaluated in two mid-latitude (Zhang et al. 2013*1, Zhang et al. 2016*2) and three tropical mountainous regions by using the WRF model to adjust two high-resolution satellite products: i) the National Oceanic and Atmospheric Administration (NOAA) Climate Prediction Center morphing technique (CMORPH) and ii) Global Satellite Mapping of Precipitation (GSMaP). Results show the adjustment effectively reduces the satellite underestimation of high rain rates, which provides a solid proof-of-concept for continuing research on NWP-based satellite correction. In this study we investigate the feasibility of using NCAR Real-time Ensemble Forecasts*3 for adjusting near-real-time satellite precipitation datasets over complex terrain areas in the Continental United States (CONUS) such as the Olympic Peninsula, California coastal mountain ranges, Rocky Mountains and Southern Appalachians. The research will focus on flood-inducing storms that occurred from May 2015 to December 2016 and four satellite precipitation products (CMORPH, GSMaP, PERSIANN-CCS and IMERG). The error correction performance evaluation will be based on comparisons against the gauge-adjusted Stage IV precipitation data. *1 Zhang, Xinxuan, et al. "Using NWP simulations in satellite rainfall estimation of heavy precipitation events over mountainous areas." Journal of Hydrometeorology 14.6 (2013): 1844-1858. *2 Zhang, Xinxuan, et al. "Hydrologic Evaluation of NWP-Adjusted CMORPH Estimates of Hurricane-Induced Precipitation in the Southern Appalachians." Journal of Hydrometeorology 17.4 (2016): 1087-1099. *3 Schwartz, Craig S., et al. "NCAR's experimental real-time convection-allowing ensemble prediction system." Weather and Forecasting 30.6 (2015): 1645-1654.

  3. The usefulness of "corrected" body mass index vs. self-reported body mass index: comparing the population distributions, sensitivity, specificity, and predictive utility of three correction equations using Canadian population-based data.

    PubMed

    Dutton, Daniel J; McLaren, Lindsay

    2014-05-06

    National data on body mass index (BMI), computed from self-reported height and weight, is readily available for many populations including the Canadian population. Because self-reported weight is found to be systematically under-reported, it has been proposed that the bias in self-reported BMI can be corrected using equations derived from data sets which include both self-reported and measured height and weight. Such correction equations have been developed and adopted. We aim to evaluate the usefulness (i.e., distributional similarity; sensitivity and specificity; and predictive utility vis-à-vis disease outcomes) of existing and new correction equations in population-based research. The Canadian Community Health Surveys from 2005 and 2008 include both measured and self-reported values of height and weight, which allows for construction and evaluation of correction equations. We focused on adults age 18-65, and compared three correction equations (two correcting weight only, and one correcting BMI) against self-reported and measured BMI. We first compared population distributions of BMI. Second, we compared the sensitivity and specificity of self-reported BMI and corrected BMI against measured BMI. Third, we compared the self-reported and corrected BMI in terms of association with health outcomes using logistic regression. All corrections outperformed self-report when estimating the full BMI distribution; the weight-only correction outperformed the BMI-only correction for females in the 23-28 kg/m2 BMI range. In terms of sensitivity/specificity, when estimating obesity prevalence, corrected values of BMI (from any equation) were superior to self-report. In terms of modelling BMI-disease outcome associations, findings were mixed, with no correction proving consistently superior to self-report. If researchers are interested in modelling the full population distribution of BMI, or estimating the prevalence of obesity in a population, then a correction of any kind
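    Sensitivity and specificity of self-reported or corrected BMI against measured BMI, the gold standard, can be computed as follows (the obesity cutoff of 30 kg/m2 is the standard threshold; the sample values are hypothetical):

```python
def sensitivity_specificity(measured_bmi, estimated_bmi, cutoff=30.0):
    """Treat measured BMI >= cutoff as true obesity status and score an
    estimated BMI series (self-reported or corrected) against it."""
    tp = fn = tn = fp = 0
    for m, e in zip(measured_bmi, estimated_bmi):
        if m >= cutoff:          # truly obese
            if e >= cutoff:
                tp += 1          # correctly classified obese
            else:
                fn += 1          # missed (typical of under-reported weight)
        else:                    # truly non-obese
            if e >= cutoff:
                fp += 1
            else:
                tn += 1
    return tp / (tp + fn), tn / (tn + fp)

# Self-report shifts every value down: one obese subject drops below
# the cutoff, so sensitivity suffers while specificity stays perfect.
sens, spec = sensitivity_specificity([32, 31, 28, 25], [31, 29, 27, 24])
print(sens, spec)  # 0.5 1.0
```

    A correction equation would aim to push the estimated values back up so that sensitivity approaches 1.0 without sacrificing specificity.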

  4. Multifactorial causal model of brain (dis)organization and therapeutic intervention: Application to Alzheimer's disease.

    PubMed

    Iturria-Medina, Yasser; Carbonell, Félix M; Sotero, Roberto C; Chouinard-Decorte, Francois; Evans, Alan C

    2017-05-15

    Generative models focused on multifactorial causal mechanisms in brain disorders are scarce and generally based on limited data. Despite the biological importance of the multiple interacting processes, their effects remain poorly characterized from an integrative analytic perspective. Here, we propose a spatiotemporal multifactorial causal model (MCM) of brain (dis)organization and therapeutic intervention that accounts for local causal interactions, effects propagation via physical brain networks, cognitive alterations, and identification of optimum therapeutic interventions. In this article, we focus on describing the model and applying it at the population-based level for studying late onset Alzheimer's disease (LOAD). By interrelating six different neuroimaging modalities and cognitive measurements, this model accurately predicts spatiotemporal alterations in brain amyloid-β (Aβ) burden, glucose metabolism, vascular flow, resting state functional activity, structural properties, and cognitive integrity. The results suggest that a vascular dysregulation may be the most likely initial pathologic event leading to LOAD. Nevertheless, they also suggest that LOAD is not caused by a unique dominant biological factor (e.g. vascular or Aβ) but by the complex interplay among multiple relevant direct interactions. Furthermore, using theoretical control analysis of the identified population-based multifactorial causal network, we show the crucial advantage of using combinatorial over single-target treatments, explain why one-target Aβ based therapies might fail to improve clinical outcomes, and propose an efficiency ranking of possible LOAD interventions. Although still requiring further validation at the individual level, this work presents the first analytic framework for dynamic multifactorial brain (dis)organization that may explain both the pathologic evolution of progressive neurological disorders and operationalize the influence of multiple interventional

  5. A new digitized reverse correction method for hypoid gears based on a one-dimensional probe

    NASA Astrophysics Data System (ADS)

    Li, Tianxing; Li, Jubo; Deng, Xiaozhong; Yang, Jianjun; Li, Genggeng; Ma, Wensuo

    2017-12-01

    In order to improve the tooth surface geometric accuracy and transmission quality of hypoid gears, a new digitized reverse correction method is proposed based on the measurement data from a one-dimensional probe. The minimization of tooth surface geometrical deviations is realized from the perspective of mathematical analysis and reverse engineering. Combining the analysis of complex tooth surface generation principles and the measurement mechanism of one-dimensional probes, the mathematical relationship between the theoretical designed tooth surface, the actual machined tooth surface and the deviation tooth surface is established, the mapping relation between machine-tool settings and tooth surface deviations is derived, and the essential connection between the accurate calculation of tooth surface deviations and the reverse correction method of machine-tool settings is revealed. Furthermore, a reverse correction model of machine-tool settings is built, a reverse correction strategy is planned, and the minimization of tooth surface deviations is achieved by means of the method of numerical iterative reverse solution. On this basis, a digitized reverse correction system for hypoid gears is developed by the organic combination of numerical control generation, accurate measurement, computer numerical processing, and digitized correction. Finally, the correctness and practicability of the digitized reverse correction method are proved through a reverse correction experiment. The experimental results show that the tooth surface geometric deviations meet the engineering requirements after two trial cuts and one correction.

  6. Protein-Based Therapeutic Killing for Cancer Therapies.

    PubMed

    Serna, Naroa; Sánchez-García, Laura; Unzueta, Ugutz; Díaz, Raquel; Vázquez, Esther; Mangues, Ramón; Villaverde, Antonio

    2018-03-01

    The treatment of some high-incidence human diseases is based on therapeutic cell killing. In cancer this is mainly achieved by chemical drugs that are systemically administered to reach effective toxic doses. As an innovative alternative, cytotoxic proteins identified in nature can be adapted as precise therapeutic agents. For example, individual toxins and venom components, proapoptotic factors, and antimicrobial peptides from bacteria, animals, plants, and humans have been engineered as highly potent drugs. In addition to the intrinsic cytotoxic activities of these constructs, their biological fabrication by DNA recombination allows the recruitment, in single pharmacological entities, of diverse functions of clinical interest such as specific cell-surface receptor binding, self-activation, and self-assembling as nanoparticulate materials, with wide applicability in cell-targeted oncotherapy and theragnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Intelligent model-based OPC

    NASA Astrophysics Data System (ADS)

    Huang, W. C.; Lai, C. M.; Luo, B.; Tsai, C. K.; Chih, M. H.; Lai, C. W.; Kuo, C. C.; Liu, R. G.; Lin, H. T.

    2006-03-01

    Optical proximity correction is the technique of pre-distorting mask layouts so that the printed patterns are as close to the desired shapes as possible. For model-based optical proximity correction, a lithographic model is needed to predict the edge position (contour) of patterns on the wafer after lithographic processing. Generally, segmentation of edges is performed prior to the correction. Pattern edges are dissected into several small segments with corresponding target points. During the correction, the edges are moved back and forth from the initial drawn position, assisted by the lithographic model, to finally settle on the proper positions. When the correction converges, the intensity predicted by the model at every target point hits the model-specific threshold value. Several iterations are required to achieve convergence, and the computation time increases with the number of required iterations. An artificial neural network is an information-processing paradigm inspired by biological nervous systems, such as the way the brain processes information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. A neural network can be a powerful data-modeling tool that is able to capture and represent complex input/output relationships. The network can accurately predict the behavior of a system via the learning procedure. A radial basis function network, a variant of the artificial neural network, is an efficient function approximator. In this paper, a radial basis function network was used to build a mapping from the segment characteristics to the edge shift from the drawn position. This network can provide a good initial guess for each segment on which OPC is carried out. The good initial guess reduces the required iterations. Consequently, cycle time can be shortened effectively. The optimization of the radial basis function network for this system was performed using a genetic algorithm.
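
    A Gaussian radial basis function regressor of the kind described can be sketched as follows. The segment features and the "edge shift" target below are synthetic stand-ins (the paper's actual segment characteristics are not specified here):

```python
import numpy as np

def rbf_design(X, centers, gamma):
    # Gaussian RBF design matrix: one basis function per center
    return np.exp(-gamma * ((X[:, None, :] - centers[None]) ** 2).sum(-1))

def rbf_fit(X, y, centers, gamma):
    w, *_ = np.linalg.lstsq(rbf_design(X, centers, gamma), y, rcond=None)
    return w

def rbf_predict(X, centers, gamma, w):
    return rbf_design(X, centers, gamma) @ w

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 2))              # synthetic segment features
y = 5.0 * X[:, 0] - 2.0 * X[:, 1] ** 2      # synthetic "edge shift"
centers = X[::10]                           # 20 centers picked from the data
w = rbf_fit(X, y, centers, gamma=10.0)
pred = rbf_predict(X, centers, gamma=10.0, w=w)
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(rmse)                                 # small training error
```

    In the OPC setting, the predicted shift would seed each segment's starting position so the model-based iteration converges in fewer passes.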

  8. Xenograft model for therapeutic drug testing in recurrent respiratory papillomatosis.

    PubMed

    Ahn, Julie; Bishop, Justin A; Akpeng, Belinda; Pai, Sara I; Best, Simon R A

    2015-02-01

    Identifying effective treatment for papillomatosis is limited by a lack of animal models, and there is currently no preclinical model for testing potential therapeutic agents. We hypothesized that xenografting of papilloma may facilitate in vivo drug testing to identify novel treatment options. A biopsy of fresh tracheal papilloma was xenografted into a NOD-scid-IL2Rgamma(null) (NSG) mouse. The xenograft began growing after 5 weeks and was serially passaged over multiple generations. Each generation showed a consistent log-growth pattern, and in all xenografts, the presence of the human papillomavirus (HPV) genome was confirmed by polymerase chain reaction (PCR). Histopathologic analysis demonstrated that the squamous architecture of the original papilloma was maintained in each generation. In vivo drug testing with bevacizumab (5 mg/kg i.p. twice weekly for 3 weeks) showed a dramatic therapeutic response compared to saline control. We report here the first successful case of serial xenografting of a tracheal papilloma in vivo with a therapeutic response observed with drug testing. In severely immunocompromised mice, the HPV genome and squamous differentiation of the papilloma can be maintained for multiple generations. This is a feasible approach to identify therapeutic agents in the treatment of recurrent respiratory papillomatosis. © The Author(s) 2014.

  9. Self-corrected chip-based dual-comb spectrometer.

    PubMed

    Hébert, Nicolas Bourbeau; Genest, Jérôme; Deschênes, Jean-Daniel; Bergeron, Hugo; Chen, George Y; Khurmi, Champak; Lancaster, David G

    2017-04-03

    We present a dual-comb spectrometer based on two passively mode-locked waveguide lasers integrated in a single Er-doped ZBLAN chip. This original design yields two free-running frequency combs having a high level of mutual stability. We developed in parallel a self-correction algorithm that compensates residual relative fluctuations and yields mode-resolved spectra without the help of any reference laser or control system. Fluctuations are extracted directly from the interferograms using the concept of ambiguity function, which leads to a significant simplification of the instrument that will greatly ease its widespread adoption and commercial deployment. Comparison with a correction algorithm relying on a single-frequency laser indicates discrepancies of only 50 attoseconds on optical timings. The capacities of this instrument are finally demonstrated with the acquisition of a high-resolution molecular spectrum covering 20 nm. This new chip-based multi-laser platform is ideal for the development of high-repetition-rate, compact and fieldable comb spectrometers in the near- and mid-infrared.

  10. Analysis of different models for atmospheric correction of meteosat infrared images. A new approach

    NASA Astrophysics Data System (ADS)

    Pérez, A. M.; Illera, P.; Casanova, J. L.

    A comparative study of several atmospheric correction models has been carried out. As primary data, atmospheric profiles of temperature and humidity obtained from radiosoundings on cloud-free days have been used. Special attention has been paid to the model used operationally in the European Space Operations Centre (ESOC) for sea temperature calculations. The atmospheric correction results are expressed in terms of the increase in the brightness temperature and the surface temperature. Differences of up to 1.4 degrees between the corrections obtained by the studied models have been observed. The radiances calculated by the models are also compared with those obtained directly from the satellite. The temperature corrections derived from the latter are greater than those from the former in practically every case. As a result, the operational calibration coefficients should first be recalculated if an atmospheric correction model is to be applied to the satellite data. Finally, a new simplified calculation scheme which may be introduced into any model is proposed.

  11. A Physical Model-based Correction for Charge Traps in the Hubble Space Telescope’s Wide Field Camera 3 Near-IR Detector and Its Applications to Transiting Exoplanets and Brown Dwarfs

    NASA Astrophysics Data System (ADS)

    Zhou, Yifan; Apai, Dániel; Lew, Ben W. P.; Schneider, Glenn

    2017-06-01

    The Hubble Space Telescope Wide Field Camera 3 (WFC3) near-IR channel is extensively used in time-resolved observations, especially for transiting exoplanet spectroscopy as well as brown dwarf and directly imaged exoplanet rotational phase mapping. The ramp effect is the dominant source of systematics in the WFC3 for time-resolved observations, which limits its photometric precision. Current mitigation strategies are based on empirical fits and require additional orbits to help the telescope reach a thermal equilibrium. We show that the ramp-effect profiles can be explained and corrected with high fidelity using charge trapping theories. We also present a model for this process that can be used to predict and to correct charge trap systematics. Our model is based on a very small number of parameters that are intrinsic to the detector. We find that these parameters are very stable between the different data sets, and we provide best-fit values. Our model is tested with more than 120 orbits (∼40 visits) of WFC3 observations and is shown to provide near photon-noise-limited corrections for observations made with both staring and scanning modes of transiting exoplanets as well as for staring-mode observations of brown dwarfs. After our model correction, the light curve of the first orbit in each visit has the same photometric precision as subsequent orbits, so data from the first orbit no longer need to be discarded. Near-IR arrays with the same physical characteristics (e.g., JWST/NIRCam) may also benefit from the extension of this model if similar systematic profiles are observed.
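
    The qualitative shape of such charge-trap systematics can be reproduced with a minimal trapping model. All parameters below (trap pool size, capture efficiency, release time constant) are invented for illustration and are not the paper's best-fit detector values:

```python
import numpy as np

def ramp_profile(flux, n_frames, n_traps=500.0, eta=0.002, tau=40.0):
    """Toy trap model: a pool of traps captures a fraction of incident charge
    per frame and re-releases it with time constant tau (in frames)."""
    trapped, measured = 0.0, []
    for _ in range(n_frames):
        capture = eta * flux * (1.0 - trapped / n_traps)  # fewer free traps -> less loss
        release = trapped / tau
        trapped += capture - release
        measured.append(flux - capture + release)
    return np.array(measured)

ramp = ramp_profile(flux=1000.0, n_frames=100)
print(ramp[0], ramp[-1])   # signal climbs toward equilibrium: the ramp shape
```

    Early frames lose charge to empty traps; as the traps fill, capture and release balance and the measured signal asymptotes, which is why first-orbit light curves ramp upward before the correction.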

  12. Comparison and Analysis of Geometric Correction Models of Spaceborne SAR

    PubMed Central

    Jiang, Weihao; Yu, Anxi; Dong, Zhen; Wang, Qingsong

    2016-01-01

    Following the development of synthetic aperture radar (SAR), SAR images have become increasingly common. Many researchers have conducted extensive studies on geolocation models, but little work has been conducted on the available models for the geometric correction of SAR images of different terrain. To address the terrain issue, four different models were compared and are described in this paper: a rigorous range-Doppler (RD) model, a rational polynomial coefficients (RPC) model, a revised polynomial (PM) model and an elevation derivation (EDM) model. The results of comparisons of the geolocation capabilities of the models show that a proper model for a SAR image of a specific terrain can be determined. A solution table was obtained to recommend a suitable model to users. Three TerraSAR-X images, two ALOS-PALSAR images and one Envisat-ASAR image were used for the experiment, including flat-terrain and mountain-terrain SAR images as well as two large-area images. Geolocation accuracies of the models for different terrain SAR images were computed and analyzed. The comparisons of the models show that the RD model was accurate but the least efficient; therefore, it is not the ideal model for real-time implementations. The RPC model is sufficiently accurate and efficient for the geometric correction of SAR images of flat terrain, with precision below 0.001 pixels. The EDM model is suitable for the geolocation of SAR images of mountainous terrain, and its precision can reach 0.007 pixels. Although the PM model does not produce results as precise as the other models, its efficiency is excellent and its potential should not be underestimated. With respect to the geometric correction of SAR images over large areas, the EDM model achieves accuracy below one pixel, whereas the RPC model consumes one third of the time of the EDM model. PMID:27347973

  13. Therapeutic NOTCH3 cysteine correction in CADASIL using exon skipping: in vitro proof of concept.

    PubMed

    Rutten, Julie W; Dauwerse, Hans G; Peters, Dorien J M; Goldfarb, Andrew; Venselaar, Hanka; Haffner, Christof; van Ommen, Gert-Jan B; Aartsma-Rus, Annemieke M; Lesnik Oberstein, Saskia A J

    2016-04-01

    Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy, or CADASIL, is a hereditary cerebral small vessel disease caused by characteristic cysteine altering missense mutations in the NOTCH3 gene. NOTCH3 mutations in CADASIL result in an uneven number of cysteine residues in one of the 34 epidermal growth factor like-repeat (EGFr) domains of the NOTCH3 protein. The consequence of an unpaired cysteine residue in an EGFr domain is an increased multimerization tendency of mutant NOTCH3, leading to toxic accumulation of the protein in the (cerebro)vasculature, and ultimately reduced cerebral blood flow, recurrent stroke and vascular dementia. There is no therapy to delay or alleviate symptoms in CADASIL. We hypothesized that exclusion of the mutant EGFr domain from NOTCH3 would abolish the detrimental effect of the unpaired cysteine and thus prevent toxic NOTCH3 accumulation and the negative cascade of events leading to CADASIL. To accomplish this NOTCH3 cysteine correction by EGFr domain exclusion, we used pre-mRNA antisense-mediated skipping of specific NOTCH3 exons. Selection of these exons was achieved using in silico studies and based on the criterion that skipping of a particular exon or exon pair would modulate the protein in such a way that the mutant EGFr domain is eliminated, without otherwise corrupting NOTCH3 structure and function. Remarkably, we found that this strategy closely mimics evolutionary events, where the elimination and fusion of NOTCH EGFr domains led to the generation of four functional NOTCH homologues. We modelled a selection of exon skip strategies using cDNA constructs and show that the skip proteins retain normal protein processing, can bind ligand and be activated by ligand. We then determined the technical feasibility of targeted NOTCH3 exon skipping, by designing antisense oligonucleotides targeting exons 2-3, 4-5 and 6, which together harbour the majority of distinct CADASIL-causing mutations

  14. Rotational distortion correction in endoscopic optical coherence tomography based on speckle decorrelation

    PubMed Central

    Uribe-Patarroyo, Néstor; Bouma, Brett E.

    2015-01-01

    We present a new technique for the correction of nonuniform rotation distortion in catheter-based optical coherence tomography (OCT), based on the statistics of speckle between A-lines using intensity-based dynamic light scattering. This technique does not rely on tissue features and can be performed on single frames of data, thereby enabling real-time image correction. We demonstrate its suitability in a gastrointestinal balloon-catheter OCT system, determining the actual rotational speed with high temporal resolution, and present corrected cross-sectional and en face views showing significant enhancement of image quality. PMID:26625040

  15. An alternative ionospheric correction model for global navigation satellite systems

    NASA Astrophysics Data System (ADS)

    Hoque, M. M.; Jakowski, N.

    2015-04-01

    The ionosphere is recognized as a major error source for single-frequency operations of global navigation satellite systems (GNSS). To enhance single-frequency operations the global positioning system (GPS) uses an ionospheric correction algorithm (ICA) driven by 8 coefficients broadcast in the navigation message every 24 h. Similarly, the global navigation satellite system Galileo uses the electron density NeQuick model for ionospheric correction. The Galileo satellite vehicles (SVs) transmit 3 ionospheric correction coefficients as driver parameters of the NeQuick model. In the present work, we propose an alternative ionospheric correction algorithm called the Neustrelitz TEC broadcast model NTCM-BC that is also applicable to global satellite navigation systems. Like the GPS ICA or Galileo NeQuick, the NTCM-BC can be optimized on a daily basis by utilizing GNSS data obtained on the previous day at monitor stations. To drive the NTCM-BC, 9 ionospheric correction coefficients need to be uploaded to the SVs for broadcasting in the navigation message. Our investigation using GPS data from about 200 worldwide ground stations shows that the 24-h-ahead prediction performance of the NTCM-BC is better than the GPS ICA and comparable to the Galileo NeQuick model. We have found that the 95th percentiles of the prediction error are about 16.1, 16.1 and 13.4 TECU for the GPS ICA, Galileo NeQuick and NTCM-BC, respectively, during a selected quiet ionospheric period, whereas the corresponding numbers are about 40.5, 28.2 and 26.5 TECU during a selected geomagnetically perturbed period. However, in terms of complexity the NTCM-BC is easier to handle than the Galileo NeQuick and in this respect comparable to the GPS ICA.
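
    The daily coefficient-fitting step can be sketched generically. The basis functions below (a diurnal harmonic and a latitude term) are invented for illustration and are not the NTCM-BC formulation; the point is only that a small coefficient set is refit each day to monitor-station TEC by least squares:

```python
import numpy as np

rng = np.random.default_rng(4)
lt = rng.uniform(0.0, 24.0, 400)      # local time of observations, hours
lat = rng.uniform(-80.0, 80.0, 400)   # geographic latitude, degrees

def basis(lt, lat):
    # hypothetical driver terms, one column per broadcast coefficient
    return np.column_stack([
        np.ones_like(lt),
        np.cos(2 * np.pi * lt / 24.0), np.sin(2 * np.pi * lt / 24.0),
        np.cos(np.radians(lat)),
    ])

coeffs_true = np.array([20.0, 8.0, 3.0, 5.0])
tec = basis(lt, lat) @ coeffs_true + rng.normal(0.0, 0.5, 400)  # TECU, noisy
coeffs, *_ = np.linalg.lstsq(basis(lt, lat), tec, rcond=None)   # daily refit
print(coeffs)   # recovered broadcast coefficients
```

    The recovered coefficients are what would be uploaded to the satellites for broadcast; next-day users evaluate the same basis with the broadcast values.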

  16. Correction of terrestrial LiDAR intensity channel using Oren-Nayar reflectance model: An application to lithological differentiation

    NASA Astrophysics Data System (ADS)

    Carrea, Dario; Abellan, Antonio; Humair, Florian; Matasci, Battista; Derron, Marc-Henri; Jaboyedoff, Michel

    2016-03-01

    Ground-based LiDAR has been traditionally used for surveying purposes via 3D point clouds. In addition to XYZ coordinates, an intensity value is also recorded by LiDAR devices. The intensity of the backscattered signal can be a significant source of information for various applications in geosciences. Previous attempts to account for the scattering of the laser signal usually model it as a perfect diffuse reflection. Nevertheless, experience on natural outcrops shows that rock surfaces do not behave as perfect diffuse reflectors. The geometry (or relief) of the scanned surfaces plays a major role in the recorded intensity values. Our study proposes a new terrestrial LiDAR intensity correction, which takes into consideration the range, the incidence angle and the geometry of the scanned surfaces. The proposed correction equation combines the classical radar equation for LiDAR with the bidirectional reflectance distribution function of the Oren-Nayar model. It is based on the idea that the surface geometry can be modelled by a relief of multiple micro-facets. This model is constrained by only one tuning parameter: the standard deviation of the slope angle distribution (σslope) of the micro-facets. Firstly, a series of tests was carried out in laboratory conditions on a 2 m2 board covered by black/white matte paper (a perfect diffuse reflector) and scanned at different ranges and incidence angles. Secondly, other tests were carried out on rock blocks of different lithologies and surface conditions. Those tests demonstrated that the non-perfect diffuse reflectance of rock surfaces can be practically handled by the proposed correction method. Finally, the intensity correction method was applied to a real case study, with two scans of the carbonate rock outcrop of the Dents-du-Midi (Swiss Alps), to improve lithological identification for geological mapping purposes. After correction, the intensity values are proportional to the intrinsic material reflectance.
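
    A correction of this form can be sketched as follows. This is an assumed monostatic formulation (incidence angle equal to viewing angle) using the standard Oren-Nayar A and B terms; the reference range and σslope value are invented, not the paper's calibrated constants:

```python
import numpy as np

def oren_nayar_factor(theta, sigma):
    """Angular scattering term for the monostatic case; sigma is the
    standard deviation of the micro-facet slope angles (radians)."""
    s2 = sigma ** 2
    A = 1.0 - 0.5 * s2 / (s2 + 0.33)
    B = 0.45 * s2 / (s2 + 0.09)
    return np.cos(theta) * (A + B * np.sin(theta) * np.tan(theta))

def correct_intensity(i_raw, rng_m, theta, sigma=0.3, ref_range=10.0):
    # undo the radar-equation 1/R^2 range loss and the angular term
    return i_raw * (rng_m / ref_range) ** 2 / oren_nayar_factor(theta, sigma)

# A surface of constant reflectance 0.8 "scanned" at two geometries should
# give the same corrected intensity (synthetic forward model below):
i1 = 0.8 * (10.0 / 12.0) ** 2 * oren_nayar_factor(0.2, 0.3)
i2 = 0.8 * (10.0 / 25.0) ** 2 * oren_nayar_factor(0.9, 0.3)
c1 = correct_intensity(i1, 12.0, 0.2)
c2 = correct_intensity(i2, 25.0, 0.9)
print(c1, c2)   # both recover the intrinsic reflectance
```
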

  17. A neural network-based method for spectral distortion correction in photon counting x-ray CT

    NASA Astrophysics Data System (ADS)

    Touch, Mengheng; Clark, Darin P.; Barber, William; Badea, Cristian T.

    2016-08-01

    Spectral CT using a photon counting x-ray detector (PCXD) shows great potential for measuring material composition based on energy dependent x-ray attenuation. Spectral CT is especially suited for imaging with K-edge contrast agents to address the otherwise limited contrast in soft tissues. We have developed a micro-CT system based on a PCXD. This system enables both 4 energy bins acquisition, as well as full-spectrum mode in which the energy thresholds of the PCXD are swept to sample the full energy spectrum for each detector element and projection angle. Measurements provided by the PCXD, however, are distorted due to undesirable physical effects in the detector and can be very noisy due to photon starvation in narrow energy bins. To address spectral distortions, we propose and demonstrate a novel artificial neural network (ANN)-based spectral distortion correction mechanism, which learns to undo the distortion in spectral CT, resulting in improved material decomposition accuracy. To address noise, post-reconstruction denoising based on bilateral filtration, which jointly enforces intensity gradient sparsity between spectral samples, is used to further improve the robustness of ANN training and material decomposition accuracy. Our ANN-based distortion correction method is calibrated using 3D-printed phantoms and a model of our spectral CT system. To enable realistic simulations and validation of our method, we first modeled the spectral distortions using experimental data acquired from 109Cd and 133Ba radioactive sources measured with our PCXD. Next, we trained an ANN to learn the relationship between the distorted spectral CT projections and the ideal, distortion-free projections in a calibration step. This required knowledge of the ground truth, distortion-free spectral CT projections, which were obtained by simulating a spectral CT scan of the digital version of a 3D-printed phantom. Once the training was completed, the trained ANN was used to perform
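
    The calibration idea can be shown in miniature. Here a linear response matrix learned by least squares stands in for the paper's trained ANN (which can also capture nonlinear distortions), and the distortion matrix D, modeling counts smeared into neighboring energy bins, is invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n_bins = 16
# hypothetical distortion: each bin leaks 10% of counts to each neighbor
D = 0.8 * np.eye(n_bins) + 0.1 * np.eye(n_bins, k=1) + 0.1 * np.eye(n_bins, k=-1)

ideal = rng.uniform(1.0, 10.0, size=(500, n_bins))     # calibration spectra
distorted = ideal @ D.T                                # what the PCXD records
M, *_ = np.linalg.lstsq(distorted, ideal, rcond=None)  # learned correction map

test_spec = rng.uniform(1.0, 10.0, size=n_bins)        # unseen spectrum
recovered = (test_spec @ D.T) @ M
print(np.max(np.abs(recovered - test_spec)))           # near-zero residual
```

    In the paper's pipeline the ground-truth spectra come from simulating the 3D-printed phantom, and the learned mapping is then applied to measured projections before material decomposition.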

  18. The Impact of CRISPR/Cas9 Technology on Cardiac Research: From Disease Modelling to Therapeutic Approaches

    PubMed Central

    Pramstaller, Peter P.; Hicks, Andrew A.; Rossini, Alessandra

    2017-01-01

    Genome-editing technology has emerged as a powerful method that enables the generation of genetically modified cells and organisms necessary to elucidate gene function and mechanisms of human diseases. The clustered regularly interspaced short palindromic repeats- (CRISPR-) associated 9 (Cas9) system has rapidly become one of the most popular approaches for genome editing in basic biomedical research over recent years because of its simplicity and adaptability. CRISPR/Cas9 genome editing has been used to correct DNA mutations ranging from a single base pair to large deletions in both in vitro and in vivo model systems. CRISPR/Cas9 has been used to increase the understanding of many aspects of cardiovascular disorders, including lipid metabolism, electrophysiology and genetic inheritance. The CRISPR/Cas9 technology has proven effective in creating gene knockouts (KO) or knockins in human cells and is particularly useful for editing induced pluripotent stem cells (iPSCs). Despite this progress, some biological, technical, and ethical issues are limiting the therapeutic potential of genome editing in cardiovascular diseases. This review will focus on various applications of CRISPR/Cas9 genome editing in the cardiovascular field, for both disease research and the prospect of in vivo genome-editing therapies in the future. PMID:29434642

  19. A Novel Strategy for Development of Recombinant Antitoxin Therapeutics Tested in a Mouse Botulism Model

    PubMed Central

    Leysath, Clinton E.; Ofori, Kwasi; Baldwin, Karen; Feng, Xiaochuan; Bedenice, Daniela; Webb, Robert P.; Wright, Patrick M.; Smith, Leonard A.; Tzipori, Saul; Shoemaker, Charles B.

    2012-01-01

    Antitoxins are needed that can be produced economically with improved safety and shelf life compared to conventional antisera-based therapeutics. Here we report a practical strategy for development of simple antitoxin therapeutics with substantial advantages over currently available treatments. The therapeutic strategy employs a single recombinant ‘targeting agent’ that binds a toxin at two unique sites and a ‘clearing Ab’ that binds two epitopes present on each targeting agent. Co-administration of the targeting agent and the clearing Ab results in decoration of the toxin with up to four Abs to promote accelerated clearance. The therapeutic strategy was applied to two Botulinum neurotoxin (BoNT) serotypes and protected mice from lethality in two different intoxication models with an efficacy equivalent to conventional antitoxin serum. Targeting agents were a single recombinant protein consisting of a heterodimer of two camelid anti-BoNT heavy-chain-only Ab VH (VHH) binding domains and two E-tag epitopes. The clearing mAb was an anti-E-tag mAb. Comparing the in vivo efficacy of treatments that employed neutralizing vs. non-neutralizing agents, or the presence vs. absence of clearing Ab, permitted unprecedented insight into the roles of toxin neutralization and clearance in antitoxin efficacy. Surprisingly, when a post-intoxication treatment model was used, a toxin-neutralizing heterodimer agent fully protected mice from intoxication even in the absence of clearing Ab. Thus a single, easy-to-produce recombinant protein was as efficacious as polyclonal antiserum in a clinically-relevant mouse model of botulism. This strategy should have widespread application in antitoxin development and other therapies in which neutralization and/or accelerated clearance of a serum biomolecule can offer therapeutic benefit. PMID:22238680

  20. Turning and Radius Deviation Correction for a Hexapod Walking Robot Based on an Ant-Inspired Sensory Strategy

    PubMed Central

    Zhu, Yaguang; Guo, Tong; Liu, Qiong; Zhu, Qianwei; Zhao, Xiangmo; Jin, Bo

    2017-01-01

    In order to find a common approach to planning the turning of a bio-inspired hexapod robot, a locomotion strategy for turning and deviation correction of a hexapod walking robot is proposed, based on the biological behavior and sensory strategy of ants. A series of experiments using ants was carried out in which the gait and movement form of the ants were studied. Taking the results of the ant experiments as inspiration and imitating the behavior of ants during turning, an extended turning algorithm based on arbitrary gait was proposed. Furthermore, after observation of the radius adjustment of ants during turning, a radius correction algorithm based on the arbitrary gait of the hexapod robot was developed. The radius correction surface function was generated by fitting the correction data, which made it possible for the robot to move in an outdoor environment without a positioning system or environment model. The proposed algorithm was verified on the hexapod robot experimental platform. Turning and radius correction experiments with several gaits were carried out. The results indicated that the robot could follow the ideal radius and maintain stability, and that the proposed ant-inspired turning strategy can easily make free turns with an arbitrary gait. PMID:29168742
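
    The surface-fitting step can be sketched generically. The input variables (gait duty factor, target radius) and the synthetic correction data below are assumptions for illustration, not the paper's measured quantities:

```python
import numpy as np

rng = np.random.default_rng(5)
duty = rng.uniform(0.5, 0.9, 60)              # hypothetical gait duty factor
radius = rng.uniform(0.3, 2.0, 60)            # target turning radius, metres
corr = 0.1 + 0.3 * duty - 0.05 * radius       # synthetic correction data

# fit a low-order polynomial correction surface by least squares
A = np.column_stack([np.ones_like(duty), duty, radius,
                     duty * radius, duty ** 2, radius ** 2])
c, *_ = np.linalg.lstsq(A, corr, rcond=None)

def correction_surface(d, r):
    # evaluate the fitted surface online, with no positioning system needed
    return np.array([1.0, d, r, d * r, d ** 2, r ** 2]) @ c

print(correction_surface(0.7, 1.0))
```

    Once fitted offline, the surface is cheap to evaluate on the robot at every turn command, which is what removes the need for a positioning system or environment model outdoors.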

  2. Library based x-ray scatter correction for dedicated cone beam breast CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Linxi; Zhu, Lei, E-mail: leizhu@gatech.edu

    Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the GEANT4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of the general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in
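
    The lookup-and-subtract step can be sketched in miniature. The diameters, map shapes, and the constant stand-in scatter maps below are all invented (a real library would hold Monte Carlo scatter distributions per breast size):

```python
import numpy as np

# Precomputed library: one scatter map per modeled breast diameter (stand-ins)
diameters_cm = np.array([8.0, 10.0, 12.0, 14.0, 16.0])
library = {d: np.full((4, 4), 0.02 * d) for d in diameters_cm}

def correct_projection(projection, estimated_diameter):
    # pick the library entry nearest the diameter estimated from the
    # first-pass reconstruction, then subtract it from the projection
    key = diameters_cm[np.argmin(np.abs(diameters_cm - estimated_diameter))]
    return projection - library[key]

measured = np.full((4, 4), 1.0) + library[12.0]   # primary signal + scatter
corrected = correct_projection(measured, estimated_diameter=12.7)
print(np.allclose(corrected, 1.0))                # scatter removed
```

    Because the expensive Monte Carlo runs happen offline, the clinical-time cost is just a nearest-neighbor lookup and a subtraction per projection.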

  3. [Preventive and therapeutic effect of genetic vaccine based on recombinant alpha virus against mouse mastocytoma P815].

    PubMed

    Ni, Bing; Yang, Ri-gao; Li, Yan-qiu; Wu, Yu-zhang

    2004-01-01

    To explore the immunological effect of a genetic vaccine based on alpha-virus and to seek out better forms of gene vaccines. The expression plasmid P1A/pSMART2a and the packaging plasmid helper were cotransfected into mammalian 293 cells by the calcium phosphate precipitation method, and a high level of recombinant alpha-virus P1A/SFV was prepared. Following identification of rSFV and its expression, BALB/c mice were inoculated with rSFV, and the production of antigen-specific antibody and the cytotoxic effect of CTLs were determined. In the preventive and therapeutic experiments, the percentages of tumor-free and surviving mice immunized with rSFV were observed. The recombinant SFV could be expressed correctly in cultured cells. After inoculation into mice, rSFV primed a stronger CTL response than that in control mice. When the ratio of E/T cells was 100:1, the (51)Cr release rate reached 75%. No antibody could be detected in mice from any group. The immunological effect of P1A/SFV was the best among all groups in both the preventive and therapeutic experiments within the experimental period. On the 60th day of the preventive experiment, the percentage of tumor-free animals in the P1A/SFV group reached 60%, whereas it was only 20% in the P1A/pCI-neo group. On the 60th day of the therapeutic experiment, the survival rate of mice in the P1A/SFV group reached 50%, but only 10% of mice survived in all control groups. Compared with common gene vaccines, the genetic vaccine based on recombinant SFV has the best immunological effect, which provides new strategies for clinical genetic therapy of tumors.

  4. Research on respiratory motion correction method based on liver contrast-enhanced ultrasound images of single mode

    NASA Astrophysics Data System (ADS)

    Zhang, Ji; Li, Tao; Zheng, Shiqiang; Li, Yiyong

    2015-03-01

    To reduce the effects of respiratory motion in quantitative analysis based on liver contrast-enhanced ultrasound (CEUS) image sequences of single mode. The image gating method and an iterative registration method using a model image were adopted to register liver CEUS image sequences of single mode. The feasibility of the proposed respiratory motion correction method was explored preliminarily using 10 hepatocellular carcinoma CEUS cases. The positions of the lesions in the time series of 2D ultrasound images after correction were visually evaluated. Before and after correction, the quality of the weighted sum of transit time (WSTT) parametric images was also compared, in terms of accuracy and spatial resolution. For the corrected and uncorrected sequences, the mean deviation values (mDVs) of time-intensity curve (TIC) fitting derived from the CEUS sequences were measured. After correction, the positions of the lesions in the time series of 2D ultrasound images were almost invariant, whereas the lesions in the uncorrected images all shifted noticeably. The quality of the WSTT parametric maps derived from the liver CEUS image sequences was markedly improved. Moreover, the mDVs of TIC fitting derived from the CEUS sequences after correction decreased by an average of 48.48 ± 42.15. The proposed correction method could improve the accuracy of quantitative analysis based on liver CEUS image sequences of single mode, which should help enhance the efficiency of differential diagnosis of liver tumors.

  5. Tls Field Data Based Intensity Correction for Forest Environments

    NASA Astrophysics Data System (ADS)

    Heinzel, J.; Huber, M. O.

    2016-06-01

    Terrestrial laser scanning (TLS) is increasingly used for forestry applications. Besides the three-dimensional point coordinates, the 'intensity' of the reflected signal plays an important role in forestry and vegetation studies. The value of the signal intensity stems from the laser wavelength, which lies in the near infrared (NIR) for most scanners; the NIR is highly indicative of various vegetation characteristics. However, the intensity as recorded by most terrestrial scanners is distorted by both external and scanner-specific factors. Since details about system-internal alteration of the signal are often unknown to the user, model-driven approaches are impractical. On the other hand, existing data-driven calibration procedures require laborious acquisition of separate reference datasets or areas of homogeneous reflection characteristics from the field data. In order to fill this gap, the present study introduces an approach to correct unwanted intensity variations directly from the point cloud of the field data. The focus is on the variation over range and on sensor-specific distortions. Instead of an absolute calibration of the values, a relative correction within the dataset is sufficient for most forestry applications. Finally, a method similar to time-series detrending is presented, with the only precondition being a relatively equal distribution of forest objects and materials over range. Our test data cover 50 terrestrial scans captured with a FARO Focus 3D S120 scanner using a laser wavelength of 905 nm. Practical tests demonstrate that our correction method removes range- and scanner-based alterations of the intensity.
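
    A minimal sketch of such a relative, detrending-style correction over range, under the stated precondition that materials are roughly evenly distributed over range. The inverse-square intensity simulation and the polynomial degree are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
ranges = rng.uniform(1.0, 30.0, 5000)       # point ranges in metres
true_refl = rng.uniform(0.4, 0.6, 5000)     # underlying material reflectance
intensity = true_refl / ranges**2           # range-distorted recorded intensity

# Fit the range trend in log-log space directly from the field data,
# then divide it out: a relative correction, not an absolute calibration.
coeffs = np.polyfit(np.log(ranges), np.log(intensity), deg=2)
trend = np.exp(np.polyval(coeffs, np.log(ranges)))
corrected = intensity / trend               # ~constant over range, up to reflectance
```

    After division, the corrected values track the material reflectance and are essentially uncorrelated with range, which is all that relative forestry analyses require.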

  6. Screening Magnetic Resonance Imaging-Based Prediction Model for Assessing Immediate Therapeutic Response to Magnetic Resonance Imaging-Guided High-Intensity Focused Ultrasound Ablation of Uterine Fibroids.

    PubMed

    Kim, Young-sun; Lim, Hyo Keun; Park, Min Jung; Rhim, Hyunchul; Jung, Sin-Ho; Sohn, Insuk; Kim, Tae-Joong; Keserci, Bilgin

    2016-01-01

    The aim of this study was to fit and validate screening magnetic resonance imaging (MRI)-based prediction models for assessing immediate therapeutic responses of uterine fibroids to MRI-guided high-intensity focused ultrasound (MR-HIFU) ablation. Informed consent from all subjects was obtained for our institutional review board-approved study. A total of 240 symptomatic uterine fibroids (mean diameter, 6.9 cm) in 152 women (mean age, 43.3 years) treated with MR-HIFU ablation were retrospectively analyzed (160 fibroids for training, 80 fibroids for validation). Screening MRI parameters (subcutaneous fat thickness [mm], x1; relative peak enhancement [%] in semiquantitative perfusion MRI, x2; T2 signal intensity ratio of fibroid to skeletal muscle, x3) were used to fit prediction models with regard to ablation efficiency (nonperfused volume/treatment cell volume, y1) and ablation quality (grade 1-5, poor to excellent, y2), respectively, using the generalized estimating equation method. Cutoff values for achievement of treatment intent (efficiency >1.0; quality grade 4/5) were determined based on receiver operating characteristic curve analysis. Prediction performances were validated by calculating positive and negative predictive values. Generalized estimating equation analyses yielded models of y1 = 2.2637 - 0.0415x1 - 0.0011x2 - 0.0772x3 and y2 = 6.8148 - 0.1070x1 - 0.0050x2 - 0.2163x3. Cutoff values were 1.312 for ablation efficiency (area under the curve, 0.7236; sensitivity, 0.6882; specificity, 0.6866) and 4.019 for ablation quality (0.8794; 0.7156; 0.9020). Positive and negative predictive values were 0.917 and 0.500 for ablation efficiency and 0.978 and 0.600 for ablation quality, respectively. Screening MRI-based prediction models for assessing immediate therapeutic responses of uterine fibroids to MR-HIFU ablation were fitted and validated, which may reduce the risk of unsuccessful treatment.
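
    The two fitted models and their cutoffs are reported explicitly above, so they can be evaluated directly; the example input values below are made up for illustration:

```python
def predict_responses(x1, x2, x3):
    """Evaluate the fitted screening-MRI models from the abstract.
    x1: subcutaneous fat thickness (mm); x2: relative peak enhancement (%);
    x3: T2 signal intensity ratio of fibroid to skeletal muscle."""
    y1 = 2.2637 - 0.0415 * x1 - 0.0011 * x2 - 0.0772 * x3  # ablation efficiency
    y2 = 6.8148 - 0.1070 * x1 - 0.0050 * x2 - 0.2163 * x3  # ablation quality grade
    return y1, y2

# Hypothetical patient: 10 mm fat, 100% peak enhancement, T2 ratio of 2.
y1, y2 = predict_responses(x1=10.0, x2=100.0, x3=2.0)
good_efficiency = y1 >= 1.312   # ROC-derived cutoff for efficiency > 1.0
good_quality = y2 >= 4.019      # ROC-derived cutoff for quality grade 4/5
```

    Note the negative coefficients: thicker subcutaneous fat, stronger perfusion, and higher T2 signal all predict a poorer immediate response, consistent with the screening intent of the models.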

  7. [Therapeutic strategy for different types of epicanthus].

    PubMed

    Gaofeng, Li; Jun, Tan; Zihan, Wu; Wei, Ding; Huawei, Ouyang; Fan, Zhang; Mingcan, Luo

    2015-11-01

    To explore a reasonable therapeutic strategy for different types of epicanthus. Patients with epicanthus were classified according to shape, extent, and inner canthal distance, and treated with appropriately chosen methods. A modified asymmetric Z-plasty with a two-curve method was used for lower eyelid type epicanthus, inner canthus type epicanthus, and severe upper eyelid type epicanthus. Moderate upper eyelid epicanthus underwent the '-' shape method. Mild upper eyelid epicanthus needed no corrective surgery in two situations: when nasal augmentation or double-eyelid formation had been performed and the inner canthal distance was normal. All other mild cases underwent the '-' shape method. A total of 66 cases underwent classification and the appropriate treatment. All wounds healed well. During the 3- to 12-month follow-up period, all epicanthus were corrected completely with natural contours and inconspicuous scars. All patients were satisfied with the results. Classification of epicanthus based on shape, extent, and inner canthal distance, with correction by the appropriate method, is a reasonable therapeutic strategy.

  8. A Physical Model-based Correction for Charge Traps in the Hubble Space Telescope’s Wide Field Camera 3 Near-IR Detector and Its Applications to Transiting Exoplanets and Brown Dwarfs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Yifan; Apai, Dániel; Schneider, Glenn

    The Hubble Space Telescope Wide Field Camera 3 (WFC3) near-IR channel is extensively used in time-resolved observations, especially for transiting exoplanet spectroscopy as well as brown dwarf and directly imaged exoplanet rotational phase mapping. The ramp effect is the dominant source of systematics in the WFC3 for time-resolved observations, which limits its photometric precision. Current mitigation strategies are based on empirical fits and require additional orbits to help the telescope reach a thermal equilibrium. We show that the ramp-effect profiles can be explained and corrected with high fidelity using charge trapping theories. We also present a model for this process that can be used to predict and to correct charge trap systematics. Our model is based on a very small number of parameters that are intrinsic to the detector. We find that these parameters are very stable between the different data sets, and we provide best-fit values. Our model is tested with more than 120 orbits (∼40 visits) of WFC3 observations and is proved to be able to provide near photon-noise-limited corrections for observations made with both staring and scanning modes of transiting exoplanets as well as for staring-mode observations of brown dwarfs. After our model correction, the light curve of the first orbit in each visit has the same photometric precision as subsequent orbits, so data from the first orbit no longer need to be discarded. Near-IR arrays with the same physical characteristics (e.g., JWST/NIRCam) may also benefit from the extension of this model if similar systematic profiles are observed.
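
    The qualitative behavior that charge-trapping theories predict (trapped charge builds toward saturation, so the per-frame charge loss shrinks and the measured flux ramps upward within an orbit) can be illustrated with a toy model. The functional form, parameter names, and values below are illustrative assumptions, not the paper's fitted detector parameters:

```python
import numpy as np

def ramp_profile(t, n_traps=200.0, trap_rate=0.02):
    """Toy charge-trap ramp: as traps fill, fewer photoelectrons are lost
    per frame, so the measured (normalized) flux climbs toward unity."""
    trapped = n_traps * (1.0 - np.exp(-trap_rate * t))   # filled traps at time t
    lost_per_frame = trap_rate * (n_traps - trapped)     # capture slows as traps fill
    return 1.0 - lost_per_frame / 100.0                  # normalized measured flux

t = np.arange(0.0, 300.0, 10.0)      # frame times within an orbit (arbitrary units)
flux = ramp_profile(t)
ramp_correction = flux[-1] / flux    # divide out the ramp shape
```

    Because the ramp shape follows from detector-intrinsic parameters rather than per-visit fits, the first orbit can be corrected rather than discarded, which is the practical payoff described in the abstract.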

  9. Practical considerations in the development of hemoglobin-based oxygen therapeutics.

    PubMed

    Kim, Hae Won; Estep, Timothy N

    2012-09-01

    The development of hemoglobin based oxygen therapeutics (HBOCs) requires consideration of a number of factors. While the enabling technology derives from fundamental research on protein biochemistry and biological interactions, translation of these research insights into usable medical therapeutics demands the application of considerable technical expertise and consideration and reconciliation of a myriad of manufacturing, medical, and regulatory requirements. The HBOC development challenge is further exacerbated by the extremely high intravenous doses required for many of the indications contemplated for these products, which in turn implies an extremely high level of purity is required. This communication discusses several of the important product configuration and developmental considerations that impact the translation of fundamental research discoveries on HBOCs into usable medical therapeutics.

  10. Inventory of Novel Animal Models Addressing Etiology of Preeclampsia in the Development of New Therapeutic/Intervention Opportunities.

    PubMed

    Erlandsson, Lena; Nääv, Åsa; Hennessy, Annemarie; Vaiman, Daniel; Gram, Magnus; Åkerström, Bo; Hansson, Stefan R

    2016-03-01

    Preeclampsia is a pregnancy-related disease afflicting 3-7% of pregnancies worldwide and leads to maternal and infant morbidity and mortality. The disease is of placental origin and is commonly described as a disease of two stages. A variety of preeclampsia animal models have been proposed, but all of them have limitations in fully recapitulating the human disease. Based on the research question at hand, different or multiple models might be suitable. Multiple animal models in combination with in vitro or ex vivo studies on human placenta together offer a synergistic platform to further our understanding of the etiology of preeclampsia and potential therapeutic interventions. The described animal models of preeclampsia divide into four categories: (i) spontaneous, (ii) surgically induced, (iii) pharmacologically/substance induced, and (iv) transgenic. This review aims at providing an inventory of novel models addressing the etiology of the disease and/or therapeutic/intervention opportunities. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Experimental demonstration of passive acoustic imaging in the human skull cavity using CT-based aberration corrections.

    PubMed

    Jones, Ryan M; O'Reilly, Meaghan A; Hynynen, Kullervo

    2015-07-01

    Experimentally verify a previously described technique for performing passive acoustic imaging through an intact human skull using noninvasive, computed tomography (CT)-based aberration corrections [Jones et al., Phys. Med. Biol. 58, 4981-5005 (2013)]. A sparse hemispherical receiver array (30 cm diameter) consisting of 128 piezoceramic discs (2.5 mm diameter, 612 kHz center frequency) was used to passively listen through ex vivo human skullcaps (n = 4) to acoustic emissions from a narrow-band fixed source (1 mm diameter, 516 kHz center frequency) and from ultrasound-stimulated (5 cycle bursts, 1 Hz pulse repetition frequency, estimated in situ peak negative pressure 0.11-0.33 MPa, 306 kHz driving frequency) Definity™ microbubbles flowing through a thin-walled tube phantom. Initial in vivo feasibility testing of the method was performed. The performance of the method was assessed through comparisons to images generated without skull corrections, with invasive source-based corrections, and with water-path control images. For source locations at least 25 mm from the inner skull surface, the modified reconstruction algorithm successfully restored a single focus within the skull cavity at a location within 1.25 mm of the true position of the narrow-band source. The results obtained from imaging single bubbles are in good agreement with numerical simulations of point source emitters and the authors' previous experimental measurements using source-based skull corrections [O'Reilly et al., IEEE Trans. Biomed. Eng. 61, 1285-1294 (2014)]. In a rat model, microbubble activity was mapped through an intact human skull at pressure levels below and above the threshold for focused ultrasound-induced blood-brain barrier opening. During bursts that led to coherent bubble activity, the location of maximum intensity in images generated with CT-based skull corrections was found to deviate by less than 1 mm, on average, from the position obtained using source-based corrections.

  12. Spherical aberration correction with an in-lens N-fold symmetric line currents model.

    PubMed

    Hoque, Shahedul; Ito, Hiroyuki; Nishi, Ryuji

    2018-04-01

    In our previous works, we have proposed N-SYLC (N-fold symmetric line currents) models for aberration correction. In this paper, we propose "in-lens N-SYLC" model, where N-SYLC overlaps rotationally symmetric lens. Such overlap is possible because N-SYLC is free of magnetic materials. We analytically prove that, if certain parameters of the model are optimized, an in-lens 3-SYLC (N = 3) doublet can correct 3rd order spherical aberration. By computer simulation, we show that the required excitation current for correction is less than 0.25 AT for beam energy 5 keV, and the beam size after correction is smaller than 1 nm at the corrector image plane for initial slope less than 4 mrad. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Establishment and correction of an Echelle cross-prism spectrogram reduction model

    NASA Astrophysics Data System (ADS)

    Zhang, Rui; Bayanheshig; Li, Xiaotian; Cui, Jicheng

    2017-11-01

    The accuracy of an echelle cross-prism spectrometer depends on the degree to which the spectrum reduction model matches the actual state of the spectrometer. However, adjustment errors can change the actual state of the spectrometer and result in a reduction model that no longer matches, producing an inaccurate wavelength calibration. Therefore, calibration of the spectrogram reduction model is important for the analysis of any echelle cross-prism spectrometer. In this study, the spectrogram reduction model of an echelle cross-prism spectrometer was established. The image-position laws of a spectrometer varying with the system parameters were simulated to assess the influence of changes in prism refractive index, focal length, and other parameters on the calculation results. The model was divided into different wavebands. An iterative method, the least-squares principle, and element lamps with known characteristic wavelengths were used to calibrate the spectral model in the different wavebands and obtain the actual values of the system parameters. After correction, the deviation between the actual x- and y-coordinates and the coordinates calculated by the model is less than one pixel. The model corrected by this method thus reflects the system parameters in the current spectrometer state and can assist in accurate wavelength extraction. Repeated model correction can also guide instrument installation and adjustment, reducing the difficulty of alignment.

  14. Validation of the Two-Layer Model for Correcting Clear Sky Reflectance Near Clouds

    NASA Technical Reports Server (NTRS)

    Wen, Guoyong; Marshak, Alexander; Evans, K. Frank; Vamal, Tamas

    2014-01-01

    A two-layer model was developed in our earlier studies to estimate the clear-sky reflectance enhancement near clouds. This simple model accounts for the radiative interaction between boundary layer clouds and the molecular layer above, the major contribution to the reflectance enhancement near clouds at short wavelengths. We use LES/SHDOM-simulated 3D radiation fields to validate the two-layer model for the reflectance enhancement at 0.47 micrometers. We find: (a) the simple model captures the viewing angle dependence of the reflectance enhancement near clouds, suggesting the physics of this model is correct; and (b) the magnitude of the two-layer modeled enhancement agrees reasonably well with the "truth", with some expected underestimation. We further extend our model to include cloud-surface interaction using the Poisson model for broken clouds. We found that including cloud-surface interaction improves the correction, though it can introduce some overcorrection for large cloud albedo, large cloud optical depth, large cloud fraction, and large cloud aspect ratio. This overcorrection can be reduced by excluding scenes (10 km x 10 km) with large cloud fraction, for which the Poisson model is not designed. Further research is underway to account for the contribution of cloud-aerosol radiative interaction to the enhancement.

  15. Development of an Inhalational Bacillus anthracis Exposure Therapeutic Model in Cynomolgus Macaques

    PubMed Central

    Comer, Jason E.; Stark, Gregory V.; Ray, Bryan D.; Tordoff, Kevin P.; Knostman, Katherine A. B.; Meister, Gabriel T.

    2012-01-01

    Appropriate animal models are required to test medical countermeasures to bioterrorist threats. To that end, we characterized a nonhuman primate (NHP) inhalational anthrax therapeutic model for use in testing anthrax therapeutic medical countermeasures according to the U.S. Food and Drug Administration Animal Rule. A clinical profile was recorded for each NHP exposed to a lethal dose of Bacillus anthracis Ames spores. Specific diagnostic parameters were detected relatively early in disease progression, i.e., by blood culture (∼37 h postchallenge) and the presence of circulating protective antigen (PA) detected by electrochemiluminescence (ECL) ∼38 h postchallenge, whereas nonspecific clinical signs of disease, i.e., changes in body temperature, hematologic parameters (ca. 52 to 66 h), and clinical observations, were delayed. To determine whether the presentation of antigenemia (PA in the blood) was an appropriate trigger for therapeutic intervention, a monoclonal antibody specific for PA was administered to 12 additional animals after the circulating levels of PA were detected by ECL. Seventy-five percent of the monoclonal antibody-treated animals survived compared to 17% of the untreated controls, suggesting that intervention at the onset of antigenemia is an appropriate treatment trigger for this model. Moreover, the onset of antigenemia correlated with bacteremia, and NHPs were treated in a therapeutic manner. Interestingly, brain lesions were observed by histopathology in the treated nonsurviving animals, whereas this observation was absent from 90% of the nonsurviving untreated animals. Our results support the use of the cynomolgus macaque as an appropriate therapeutic animal model for assessing the efficacy of medical countermeasures developed against anthrax when administered after a confirmation of infection. PMID:22956657

  16. TU-G-210-02: TRANS-FUSIMO - An Integrative Approach to Model-Based Treatment Planning of Liver FUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preusser, T.

    Modeling can play a vital role in predicting, optimizing and analyzing the results of therapeutic ultrasound treatments. Simulating the propagating acoustic beam in various targeted regions of the body allows for the prediction of the resulting power deposition and temperature profiles. In this session we will apply various modeling approaches to breast, abdominal organ and brain treatments. Of particular interest is the effectiveness of procedures for correcting for phase aberrations caused by intervening irregular tissues, such as the skull in transcranial applications or inhomogeneous breast tissues. Also described are methods to compensate for motion in targeted abdominal organs such as the liver or kidney. Douglas Christensen – Modeling for Breast and Brain HIFU Treatment Planning. Tobias Preusser – TRANS-FUSIMO – An Integrative Approach to Model-Based Treatment Planning of Liver FUS. Learning Objectives: Understand the role of acoustic beam modeling for predicting the effectiveness of therapeutic ultrasound treatments. Apply acoustic modeling to specific breast, liver, kidney and transcranial anatomies. Determine how to obtain appropriate acoustic modeling parameters from clinical images. Understand the separate role of absorption and scattering in energy delivery to tissues. See how organ motion can be compensated for in ultrasound therapies. Compare simulated data with clinical temperature measurements in transcranial applications. Supported by NIH R01 HL172787 and R01 EB013433 (DC); EU Seventh Framework Programme (FP7/2007-2013) under 270186 (FUSIMO) and 611889 (TRANS-FUSIMO) (TP); and P01 CA159992, GE, FUSF and InSightec (UV)

  17. Towards quantitative PET/MRI: a review of MR-based attenuation correction techniques.

    PubMed

    Hofmann, Matthias; Pichler, Bernd; Schölkopf, Bernhard; Beyer, Thomas

    2009-03-01

    Positron emission tomography (PET) is a fully quantitative technology for imaging metabolic pathways and dynamic processes in vivo. Attenuation correction of raw PET data is a prerequisite for quantification and is typically based on separate transmission measurements. In PET/CT, however, attenuation correction is performed routinely based on the available CT transmission data. Recently, combined PET/magnetic resonance (MR) has been proposed as a viable alternative to PET/CT. Current concepts of PET/MRI do not include CT-like transmission sources and, therefore, alternative methods of PET attenuation correction must be found. This article reviews existing approaches to MR-based attenuation correction (MR-AC). Most groups have proposed MR-AC algorithms for brain PET studies and more recently also for torso PET/MR imaging. Most MR-AC strategies require the use of complementary MR and transmission images, or morphology templates generated from transmission images. We review and discuss these algorithms and point out challenges for using MR-AC in clinical routine. MR-AC is work in progress, with potentially promising results from a template-based approach applicable to both brain and torso imaging. While efforts are ongoing in making clinically viable MR-AC fully automatic, further studies are required to realize the potential benefits of MR-based motion compensation and partial volume correction of the PET data.

  18. Staircase-scene-based nonuniformity correction in aerial point target detection systems.

    PubMed

    Huo, Lijun; Zhou, Dabiao; Wang, Dejiang; Liu, Rang; He, Bin

    2016-09-01

    Focal-plane arrays (FPAs) are often interfered by heavy fixed-pattern noise, which severely degrades the detection rate and increases the false alarms in airborne point target detection systems. Thus, high-precision nonuniformity correction is an essential preprocessing step. In this paper, a new nonuniformity correction method is proposed based on a staircase scene. This correction method can compensate for the nonlinear response of the detector and calibrate the entire optical system with computational efficiency and implementation simplicity. Then, a proof-of-concept point target detection system is established with a long-wave Sofradir FPA. Finally, the local standard deviation of the corrected image and the signal-to-clutter ratio of the Airy disk of a Boeing B738 are measured to evaluate the performance of the proposed nonuniformity correction method. Our experimental results demonstrate that the proposed correction method achieves high-quality corrections.
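
    One way to read the staircase-scene idea: each known radiance step gives every pixel a (measured value, scene level) pair, from which a per-pixel response correction can be fitted. The sketch below assumes a linear per-pixel response for brevity; a higher-degree fit at more staircase steps would additionally absorb the nonlinear detector response the paper compensates for. All array sizes and noise parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
levels = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # staircase radiance steps
gain = rng.uniform(0.8, 1.2, size=(8, 8))           # per-pixel gain nonuniformity
offset = rng.uniform(-2.0, 2.0, size=(8, 8))        # per-pixel offset nonuniformity
frames = gain[None] * levels[:, None, None] + offset[None]  # measured staircase frames

# Fit, per pixel, a mapping from measured values back to the scene levels.
corrected = np.empty_like(frames)
for i in range(8):
    for j in range(8):
        a, b = np.polyfit(frames[:, i, j], levels, deg=1)
        corrected[:, i, j] = a * frames[:, i, j] + b
```

    After correction, every pixel reports the same value at each staircase level, i.e., the fixed-pattern noise is removed before point-target detection.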

  19. Description of Exemplar Cases in the Intensive Mental Health Program: Illustrations of Application of the Therapeutic Model

    ERIC Educational Resources Information Center

    Nelson, Timothy D.; Mashunkashey, Joanna O.; Mitchell, Montserrat C.; Benson, Eric R.; Vernberg, Eric M.; Roberts, Michael C.

    2008-01-01

    We describe cases from the clinical records in the Intensive Mental Health Program to illustrate the diverse presenting problems, intervention strategies, therapeutic process, and outcomes for children receiving services in this school-based, community-oriented treatment model. Cases reflect varying degrees of treatment response and potential…

  20. Gene correction in patient-specific iPSCs for therapy development and disease modeling

    PubMed Central

    Jang, Yoon-Young

    2018-01-01

    The discovery that mature cells can be reprogrammed to become pluripotent and the development of engineered endonucleases for enhancing genome editing are two of the most exciting and impactful technology advances in modern medicine and science. Human pluripotent stem cells have the potential to establish new model systems for studying human developmental biology and disease mechanisms. Gene correction in patient-specific iPSCs can also provide a novel source for autologous cell therapy. Although historically challenging, precise genome editing in human iPSCs is becoming more feasible with the development of new genome-editing tools, including ZFNs, TALENs, and CRISPR. iPSCs derived from patients of a variety of diseases have been edited to correct disease-associated mutations and to generate isogenic cell lines. After directed differentiation, many of the corrected iPSCs showed restored functionality and demonstrated their potential in cell replacement therapy. Genome-wide analyses of gene-corrected iPSCs have collectively demonstrated a high fidelity of the engineered endonucleases. Remaining challenges in clinical translation of these technologies include maintaining genome integrity of the iPSC clones and the differentiated cells. Given the rapid advances in genome-editing technologies, gene correction is no longer the bottleneck in developing iPSC-based gene and cell therapies; generating functional and transplantable cell types from iPSCs remains the biggest challenge needing to be addressed by the research field. PMID:27256364

  1. The functional therapeutic chemical classification system.

    PubMed

    Croset, Samuel; Overington, John P; Rebholz-Schuhmann, Dietrich

    2014-03-15

    Drug repositioning is the discovery of new indications for compounds that have already been approved and used in a clinical setting. Recently, some computational approaches have been suggested to unveil new opportunities in a systematic fashion, for instance by taking into consideration gene expression signatures or chemical features. We present here a novel method based on knowledge integration using semantic technologies to capture the functional role of approved chemical compounds. In order to computationally generate repositioning hypotheses, we used the Web Ontology Language to formally define the semantics of over 20 000 terms with axioms to correctly denote various modes of action (MoA). Based on an integration of public data, we have automatically assigned over a thousand approved drugs into these MoA categories. The resulting new resource is called the Functional Therapeutic Chemical Classification System and was further evaluated against the content of the traditional Anatomical Therapeutic Chemical Classification System. We illustrate how the new classification can be used to generate drug repurposing hypotheses, using Alzheimer's disease as a use case. https://www.ebi.ac.uk/chembl/ftc; https://github.com/loopasam/ftc. croset@ebi.ac.uk. Supplementary data are available at Bioinformatics online.

  2. Real-time distortion correction for visual inspection systems based on FPGA

    NASA Astrophysics Data System (ADS)

    Liang, Danhua; Zhang, Zhaoxia; Chen, Xiaodong; Yu, Daoyin

    2008-03-01

    Visual inspection is a new technology based on computer vision research that focuses on measuring an object's geometry and location. It can be widely used in online measurement and other real-time measurement processes. Because of the shortcomings of traditional visual inspection, a new visual detection mode, all-digital intelligent acquisition and transmission, is presented. The image processing, including filtering, image compression, binarization, edge detection and distortion correction, can be completed in a programmable device (FPGA). As a wide-field-angle lens is adopted in the system, the output images exhibit serious distortion. Limited by the computing speed of the computer, software can only correct the distortion of static images, not dynamic images. To meet the real-time requirement, we design a distortion correction system based on an FPGA. In this hardware distortion correction method, the spatial correction data are first calculated in software, then converted into hardware storage addresses and stored in a hardware look-up table, from which the data are read out to correct the gray levels. The major benefit of using an FPGA is that the same circuit can be used for other circularly symmetric wide-angle lenses without modification.
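
    The precompute-then-lookup scheme described above can be sketched in software as follows; the radial distortion model, coefficient `k1`, and nearest-neighbor lookup are illustrative assumptions standing in for the system's actual calibration data. On the FPGA, only the `correct` step (one table read per output pixel) runs in real time:

```python
import numpy as np

def build_lut(h, w, k1=1e-6):
    """Precompute source coordinates for a circularly symmetric radial
    distortion; on an FPGA this table would live in memory, indexed per pixel.
    k1 is an illustrative radial-distortion coefficient, not a calibrated value."""
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r2 = (xx - cx) ** 2 + (yy - cy) ** 2
    scale = 1.0 + k1 * r2                              # radial scaling factor
    src_x = np.clip(np.rint(cx + (xx - cx) * scale), 0, w - 1).astype(int)
    src_y = np.clip(np.rint(cy + (yy - cy) * scale), 0, h - 1).astype(int)
    return src_y, src_x

def correct(image, lut):
    src_y, src_x = lut
    return image[src_y, src_x]   # one table lookup per output pixel

img = np.arange(64 * 64, dtype=float).reshape(64, 64)
out = correct(img, build_lut(64, 64))
```

    Swapping the lens only means regenerating the table in software; the lookup circuit itself is unchanged, which is the reuse benefit the abstract highlights.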

  3. Correction of oral contrast artifacts in CT-based attenuation correction of PET images using an automated segmentation algorithm.

    PubMed

    Ahmadian, Alireza; Ay, Mohammad R; Bidgoli, Javad H; Sarkar, Saeed; Zaidi, Habib

    2008-10-01

    Oral contrast is usually administered in most X-ray computed tomography (CT) examinations of the abdomen and the pelvis, as it allows more accurate identification of the bowel and facilitates the interpretation of abdominal and pelvic CT studies. However, the misclassification of contrast medium as high-density bone in CT-based attenuation correction (CTAC) is known to generate artifacts in the attenuation map (μ-map), thus resulting in overcorrection for attenuation of positron emission tomography (PET) images. In this study, we developed an automated algorithm for segmentation and classification of regions containing oral contrast medium to correct for artifacts in CT-attenuation-corrected PET images using the segmented contrast correction (SCC) algorithm. The proposed algorithm consists of two steps: first, high-CT-number object segmentation using combined region- and boundary-based segmentation, and second, object classification into bone and contrast agent using a knowledge-based nonlinear fuzzy classifier. Thereafter, the CT numbers of pixels belonging to regions classified as contrast medium are substituted with their equivalent effective bone CT numbers using the SCC algorithm. The generated CT images are then down-sampled, followed by Gaussian smoothing to match the resolution of the PET images. A piecewise calibration curve is then used to convert CT pixel values to linear attenuation coefficients at 511 keV. Visual assessment of the segmented regions performed by an experienced radiologist confirmed the accuracy of the segmentation and classification algorithms for delineation of contrast-enhanced regions in clinical CT images. Quantitative analysis of the generated μ-maps of 21 clinical CT colonoscopy datasets showed an overestimation ranging between 24.4% and 37.3% in the 3D-classified regions, depending on their volume and the concentration of contrast medium. Two PET/CT studies known to be problematic demonstrated the applicability of the technique.
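
    The final conversion step, a piecewise calibration curve from CT numbers to 511 keV linear attenuation coefficients, is commonly implemented as a bilinear mapping. The sketch below is a generic illustration of that idea; the bone-branch slope is an assumed value, not the calibration curve used in this study:

```python
import numpy as np

MU_WATER_511 = 0.096  # cm^-1, approximate linear attenuation of water at 511 keV

def hu_to_mu511(hu):
    """Piecewise ('bilinear') conversion from CT numbers (HU) to linear
    attenuation at 511 keV: water-like scaling from air (-1000 HU) to water
    (0 HU), and a shallower assumed bone slope above 0 HU."""
    hu = np.asarray(hu, dtype=float)
    mu = np.where(
        hu <= 0,
        MU_WATER_511 * (hu + 1000.0) / 1000.0,   # air-to-water branch
        MU_WATER_511 + hu * 4.0e-5,              # soft-tissue/bone branch
    )
    return np.clip(mu, 0.0, None)

mu = hu_to_mu511([-1000.0, 0.0, 1000.0])   # air, water, dense bone
```

    The two branches are needed precisely because bone and iodinated/barium contrast attenuate differently at 511 keV despite similar CT numbers, which is why misclassified contrast inflates the μ-map.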

  4. Adaptive cyclic physiologic noise modeling and correction in functional MRI.

    PubMed

    Beall, Erik B

    2010-03-30

    Physiologic noise in BOLD-weighted MRI data is known to be a significant source of variance, reducing statistical power and specificity in fMRI and functional connectivity (fcMRI) analyses. We show a dramatic improvement on current noise correction methods in both fMRI and fcMRI data that avoids overfitting. The traditional noise model is a Fourier series expansion synchronized to the periodicity of concurrently measured breathing and cardiac cycles. Correction using this model removes variance matching the periodicity of the physiologic cycles. This framework allows easy modeling of noise; however, using a large number of regressors comes at the cost of removing variance unrelated to physiologic noise, such as variance due to the signal of functional interest (overfitting the data). Our hypothesis is that a small set of fits describes all of the significantly coupled physiologic noise. If this is true, we can replace the large number of regressors used in the traditional model with a smaller number of fitted regressors, and thereby account for the noise sources while removing less of the variance of interest. We describe these extensions and demonstrate that we can preserve variance in the data unrelated to physiologic noise while removing physiologic noise equivalently, yielding data with a higher effective SNR than current correction techniques. Our results demonstrate a significant improvement in the sensitivity of fMRI (up to a 17% increase in activation volume compared with higher-order traditional noise correction) and functional connectivity analyses. Copyright (c) 2010 Elsevier B.V. All rights reserved.
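The traditional model described above — Fourier-series regressors locked to the measured cardiac or respiratory phase, then regressed out of each voxel time series — can be sketched as follows. This is an illustrative reimplementation of the general RETROICOR-style approach, not the authors' code; the function names and the order-2 default are assumptions.

```python
import math

def fourier_regressors(phases, order=2):
    """Build Fourier-series regressors: sin/cos of harmonics of the
    measured physiologic phase at each imaging time point."""
    cols = []
    for m in range(1, order + 1):
        cols.append([math.sin(m * p) for p in phases])
        cols.append([math.cos(m * p) for p in phases])
    return cols  # list of columns, each of length len(phases)

def regress_out(signal, cols):
    """Remove the span of `cols` (plus a constant) from `signal` by ordinary
    least squares, solving the normal equations by Gaussian elimination."""
    n = len(signal)
    X = [[1.0] * n] + cols
    k = len(X)
    # normal equations A b = c, with A = X X^T and c = X y
    A = [[sum(X[i][t] * X[j][t] for t in range(n)) for j in range(k)] for i in range(k)]
    c = [sum(X[i][t] * signal[t] for t in range(n)) for i in range(k)]
    for i in range(k):                      # forward elimination with pivoting
        piv = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        c[i], c[piv] = c[piv], c[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for j in range(i, k):
                A[r][j] -= f * A[i][j]
            c[r] -= f * c[i]
    b = [0.0] * k
    for i in range(k - 1, -1, -1):          # back substitution
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    fit = [sum(b[i] * X[i][t] for i in range(k)) for t in range(n)]
    return [y - f for y, f in zip(signal, fit)]
```

The paper's refinement amounts to shrinking the column set `cols` to a few fitted regressors, so less incidental variance is removed per voxel.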

  5. Statistical reconstruction for cone-beam CT with a post-artifact-correction noise model: application to high-quality head imaging

    NASA Astrophysics Data System (ADS)

    Dang, H.; Stayman, J. W.; Sisniega, A.; Xu, J.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2015-08-01

    Non-contrast CT reliably detects fresh blood in the brain and is the current front-line imaging modality for intracranial hemorrhage such as that occurring in acute traumatic brain injury (contrast ~40-80 HU, size  >  1 mm). We are developing flat-panel detector (FPD) cone-beam CT (CBCT) to facilitate such diagnosis in a low-cost, mobile platform suitable for point-of-care deployment. Such a system may offer benefits in the ICU, urgent care/concussion clinic, ambulance, and sports and military theatres. However, current FPD-CBCT systems face significant challenges that confound low-contrast, soft-tissue imaging. Artifact correction can overcome major sources of bias in FPD-CBCT but imparts noise amplification in filtered backprojection (FBP). Model-based reconstruction improves soft-tissue image quality compared to FBP by leveraging a high-fidelity forward model and image regularization. In this work, we develop a novel penalized weighted least-squares (PWLS) image reconstruction method with a noise model that accurately describes the noise characteristics associated with the two dominant artifact corrections (scatter and beam hardening) in CBCT and uses modified weights to compensate for the noise amplification imparted by each correction. Experiments included real data acquired on an FPD-CBCT test bench and an anthropomorphic head phantom emulating intra-parenchymal hemorrhage. The proposed PWLS method demonstrated superior noise-resolution tradeoffs in comparison to FBP and to PWLS with conventional weights (viz., at matched 0.50 mm spatial resolution, CNR = 11.9 compared to CNR = 5.6 and CNR = 9.9, respectively) and substantially reduced image noise, especially in challenging regions such as the skull base. The results support the hypothesis that, with high-fidelity artifact correction and statistical reconstruction using an accurate post-artifact-correction noise model, FPD-CBCT can achieve image quality allowing reliable detection of intracranial hemorrhage.
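The modified-weights idea can be illustrated with a small sketch: for an approximately Poisson measurement, scatter subtraction and beam hardening each inflate the variance of the corrected line integral, and PWLS weights each datum by the inverse of that post-correction variance. This is a simplified, hypothetical rendering of the concept, not the authors' noise model.

```python
def pwls_weight(y, scatter, bh_gain):
    """Statistical weight for one corrected measurement in PWLS reconstruction.

    y       : measured detector counts (approximately Poisson, so var(y) ~ y)
    scatter : estimated scatter counts subtracted from y
    bh_gain : local slope f'(l) of the beam-hardening mapping l' = f(l)

    After scatter subtraction the primary signal is p = y - scatter, so the
    line integral l = log(i0 / p) has variance var(l) ~ var(y) / p^2 = y / p^2.
    The beam-hardening map scales that variance by bh_gain^2.
    """
    p = y - scatter
    var_l = y / (p * p)            # noise amplification from scatter subtraction
    var_l *= bh_gain * bh_gain     # further amplification from beam hardening
    return 1.0 / var_l
```

With no corrections (scatter = 0, bh_gain = 1) this reduces to the conventional Poisson weight w = y; as the subtracted scatter grows, the weight drops, de-emphasizing the noisier corrected measurements.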

  6. Forward and correctional OFDM-based visible light positioning

    NASA Astrophysics Data System (ADS)

    Li, Wei; Huang, Zhitong; Zhao, Runmei; He, Peixuan; Ji, Yuefeng

    2017-09-01

    Visible light positioning (VLP) has attracted much attention in both academia and industry due to the extensive deployment of light-emitting diodes (LEDs) as next-generation green lighting. Generally, the coverage of a single LED lamp is limited, so LED arrays are often used to achieve uniform illumination within large-scale indoor environments. However, in such dense LED deployment scenarios, the superposition of the light signals becomes an important challenge for accurate VLP. To solve this problem, we propose a forward and correctional orthogonal frequency division multiplexing (OFDM)-based VLP (FCO-VLP) scheme with low complexity in the generation and processing of signals. In the first, forward procedure of FCO-VLP, an initial position is obtained by the trilateration method based on OFDM subcarriers. The positioning accuracy is further improved in the second, correctional procedure based on a database of reference points. As demonstrated in our experiments, our approach achieves an improved average positioning error of 4.65 cm, enhancing positioning accuracy by 24.2% compared with the trilateration method.
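The trilateration step of the forward procedure can be sketched as a linearized least-squares solve over the LED-to-receiver ranges recovered from the OFDM subcarriers. The function below is an illustrative 2-D version under that assumption, not the authors' implementation.

```python
def trilaterate(anchors, dists):
    """Estimate a 2-D position from >= 3 anchor (LED) positions and ranges.

    Subtracting the first circle equation from the others linearizes the
    problem:  2(xi - x1) x + 2(yi - y1) y = d1^2 - di^2 + xi^2 - x1^2 + yi^2 - y1^2,
    which is then solved in the least-squares sense (exact for 3 anchors).
    """
    (x1, y1), d1 = anchors[0], dists[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        rows.append((2.0 * (xi - x1), 2.0 * (yi - y1)))
        rhs.append(d1 ** 2 - di ** 2 + xi ** 2 - x1 ** 2 + yi ** 2 - y1 ** 2)
    # normal equations for the 2-unknown least-squares problem
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

The correctional procedure would then snap or adjust this initial estimate using the reference-point database.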

  7. Music-based therapeutic interventions for people with dementia.

    PubMed

    van der Steen, Jenny T; van Soest-Poortvliet, Mirjam C; van der Wouden, Johannes C; Bruinsma, Manon S; Scholten, Rob Jpm; Vink, Annemiek C

    2017-05-02

    Dementia is a clinical syndrome, with a number of different causes, characterised by deterioration in cognitive, behavioural, social and emotional functions. Pharmacological interventions are available but have limited effect on many of the syndrome's features. Less research has been directed towards non-pharmacological treatments. In this review, we examined the evidence for effects of music-based interventions as a treatment. To assess the effects of music-based therapeutic interventions for people with dementia on emotional well-being including quality of life, mood disturbance or negative affect, behavioural problems, social behaviour, and cognition, at the end of therapy and four or more weeks after the end of treatment. We searched ALOIS, the Specialized Register of the Cochrane Dementia and Cognitive Improvement Group (CDCIG), on 14 April 2010 using the terms: music therapy, music, singing, sing, auditory stimulation. Additional searches were carried out on 3 July 2015 in the major healthcare databases MEDLINE, Embase, PsycINFO, CINAHL and LILACS, and in trial registers and grey literature sources. On 12 April 2016, we searched the major databases for new studies for future evaluation. We included randomised controlled trials of music-based therapeutic interventions (at least five sessions) for people with dementia that measured any of our outcomes of interest. Control groups received either usual care or other activities. Two reviewers worked independently to screen the retrieved studies against the inclusion criteria and then to extract data and assess the methodological quality of the included studies. If necessary, we contacted trial authors to ask for additional data, including relevant subscales, or other missing information. We pooled data using random-effects models. We included 17 studies. Sixteen studies with a total of 620 participants contributed data to meta-analyses. Participants in the studies had dementia of varying degrees of severity.

  8. Quantitative Evaluation of 2 Scatter-Correction Techniques for 18F-FDG Brain PET/MRI in Regard to MR-Based Attenuation Correction.

    PubMed

    Teuho, Jarmo; Saunavaara, Virva; Tolvanen, Tuula; Tuokkola, Terhi; Karlsson, Antti; Tuisku, Jouni; Teräs, Mika

    2017-10-01

    In PET, corrections for photon scatter and attenuation are essential for visual and quantitative consistency. MR attenuation correction (MRAC) is generally conducted by image segmentation and assignment of discrete attenuation coefficients, which offer limited accuracy compared with CT attenuation correction. Potential inaccuracies in MRAC may affect scatter correction, because the attenuation image (μ-map) is used in single scatter simulation (SSS) to calculate the scatter estimate. We assessed the impact of MRAC on scatter correction using 2 scatter-correction techniques and 3 μ-maps for MRAC. Methods: The tail-fitted SSS (TF-SSS) and a Monte Carlo-based single scatter simulation (MC-SSS) algorithm implementation on the Philips Ingenuity TF PET/MR were used with 1 CT-based and 2 MR-based μ-maps. Data from 7 subjects were used in the clinical evaluation, and a phantom study using an anatomic brain phantom was conducted. Scatter-correction sinograms were evaluated for each scatter-correction method and μ-map. Absolute image quantification was investigated with the phantom data. Quantitative assessment of PET images was performed by volume-of-interest and ratio-image analysis. Results: MRAC did not result in large differences in scatter-algorithm performance, especially with TF-SSS. Scatter sinograms and scatter fractions did not reveal large differences regardless of the μ-map used. TF-SSS showed slightly higher absolute quantification. The differences in volume-of-interest analysis between TF-SSS and MC-SSS were 3% at maximum in the phantom and 4% in the patient study. Both algorithms showed excellent correlation with each other, with no visual differences between PET images. MC-SSS showed a slight dependency on the μ-map used, with a difference of 2% on average and 4% at maximum when a μ-map without bone was used. Conclusion: The effect of different MR-based μ-maps on the performance of scatter correction was minimal in non-time-of-flight 18F-FDG PET.

  9. Preclinical studies for induced pluripotent stem cell-based therapeutics.

    PubMed

    Harding, John; Mirochnitchenko, Oleg

    2014-02-21

    Induced pluripotent stem cells (iPSCs) and their differentiated derivatives can potentially be applied to cell-based therapy for human diseases. The properties of iPSCs are being studied intensively both to understand the basic biology of pluripotency and cellular differentiation and to solve problems associated with therapeutic applications. Examples of specific preclinical applications summarized briefly in this minireview include the use of iPSCs to treat diseases of the liver, nervous system, eye, and heart and metabolic conditions such as diabetes. Early stage studies illustrate the potential of iPSC-derived cells and have identified several challenges that must be addressed before moving to clinical trials. These include rigorous quality control and efficient production of required cell populations, improvement of cell survival and engraftment, and development of technologies to monitor transplanted cell behavior for extended periods of time. Problems related to immune rejection, genetic instability, and tumorigenicity must be solved. Testing the efficacy of iPSC-based therapies requires further improvement of animal models precisely recapitulating human disease conditions.

  10. Use of the Ames Check Standard Model for the Validation of Wall Interference Corrections

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Amaya, M.; Flach, R.

    2018-01-01

    The new check standard model of the NASA Ames 11-ft Transonic Wind Tunnel was chosen for a future validation of the facility's wall interference correction system. The chosen validation approach takes advantage of the fact that test conditions experienced by a large model in the slotted part of the tunnel's test section will change significantly if a subset of the slots is temporarily sealed. Therefore, the model's aerodynamic coefficients have to be recorded, corrected, and compared for two different test section configurations in order to perform the validation. Test section configurations with highly accurate Mach number and dynamic pressure calibrations were selected for the validation. First, the model is tested with all test section slots in open configuration while keeping the model's center of rotation on the tunnel centerline. In the next step, slots on the test section floor are sealed and the model is moved to a new center of rotation that is 33 inches below the tunnel centerline. Then, the original angle of attack sweeps are repeated. Afterwards, wall interference corrections are applied to both test data sets and response surface models of the resulting aerodynamic coefficients in interference-free flow are generated. Finally, the response surface models are used to predict the aerodynamic coefficients for a family of angles of attack while keeping dynamic pressure, Mach number, and Reynolds number constant. The validation is considered successful if the corrected aerodynamic coefficients obtained from the related response surface model pair show good agreement. Residual differences between the corrected coefficient sets will be analyzed as well because they are an indicator of the overall accuracy of the facility's wall interference correction process.
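The response-surface step can be illustrated with a minimal quadratic fit of an aerodynamic coefficient against angle of attack; fitting one such surface per test-section configuration lets the corrected coefficients be compared at matched angles. This is a sketch assuming a simple one-variable quadratic model, not the facility's actual multivariate response surfaces.

```python
def fit_quadratic(alpha, coeff):
    """Least-squares response surface c(a) = b0 + b1*a + b2*a^2, solved via
    the 3x3 normal equations with Gaussian elimination.

    alpha : angles of attack (deg)
    coeff : measured aerodynamic coefficient at each angle
    """
    s = [sum(a ** k for a in alpha) for k in range(5)]            # power sums
    t = [sum(c * a ** k for a, c in zip(alpha, coeff)) for k in range(3)]
    A = [[s[i + j] for j in range(3)] for i in range(3)]
    b = t[:]
    for i in range(3):                       # forward elimination with pivoting
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for j in range(i, 3):
                A[r][j] -= f * A[i][j]
            b[r] -= f * b[i]
    x = [0.0] * 3
    for i in range(2, -1, -1):               # back substitution
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, 3))) / A[i][i]
    return x  # [b0, b1, b2]
```

Evaluating two such fits (slots open vs. floor slots sealed, both after wall-interference correction) at the same family of angles gives the residual differences used to judge the correction system.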

  11. Quantum error-correction failure distributions: Comparison of coherent and stochastic error models

    NASA Astrophysics Data System (ADS)

    Barnes, Jeff P.; Trout, Colin J.; Lucarelli, Dennis; Clader, B. D.

    2017-06-01

    We compare failure distributions of quantum error correction circuits for stochastic errors and coherent errors. We utilize a fully coherent simulation of fault-tolerant quantum error correcting circuits for the distance-3 (d = 3) Steane and surface codes. We find that the output distributions are markedly different for the two error models, showing that no simple mapping between them exists. Coherent errors create very broad, heavy-tailed failure distributions. This suggests that they are susceptible to outlier events and that mean statistics, such as pseudothreshold estimates, may not provide the key figure of merit. This provides further statistical insight into why coherent errors can be so harmful for quantum error correction. These output probability distributions may also provide a useful metric when optimizing quantum error correcting codes and decoding procedures for purely coherent errors.

  12. High-dimensional inference with the generalized Hopfield model: principal component analysis and corrections.

    PubMed

    Cocco, S; Monasson, R; Sessak, V

    2011-05-01

    We consider the problem of inferring the interactions between a set of N binary variables from the knowledge of their frequencies and pairwise correlations. The inference framework is based on the Hopfield model, a special case of the Ising model where the interaction matrix is defined through a set of patterns in the variable space, and is of rank much smaller than N. We show that maximum likelihood inference is deeply related to principal component analysis when the amplitude of the pattern components ξ is negligible compared to √N. Using techniques from statistical mechanics, we calculate the corrections to the patterns to the first order in ξ/√N. We stress the need to generalize the Hopfield model and include both attractive and repulsive patterns in order to correctly infer networks with sparse and strong interactions. We present a simple geometrical criterion to decide how many attractive and repulsive patterns should be considered as a function of the sampling noise. We moreover discuss how many sampled configurations are required for a good inference, as a function of the system size N and of the amplitude ξ. The inference approach is illustrated on synthetic and biological data.

  13. A larval zebrafish model of bipolar disorder as a screening platform for neuro-therapeutics.

    PubMed

    Ellis, Lee David; Soanes, Kelly Howard

    2012-08-01

    Modelling neurological diseases has proven extraordinarily difficult due to the phenotypic complexity of each disorder. The zebrafish has become a useful model system with which to study abnormal neurological and behavioural activity and holds promise as a model of human disease. While most of the disease modelling using zebrafish has made use of adults, larvae hold tremendous promise for the high-throughput screening of potential therapeutics. The further development of larval disease models will strengthen their ability to contribute to the drug screening process. Here we have used zebrafish larvae to model the symptoms of bipolar disorder by treating larvae with sub-convulsive concentrations of the GABA antagonist pentylenetetrazol (PTZ). A number of therapeutics that act on different targets, in addition to those that have been used to treat bipolar disorder, were tested against this model to assess its predictive value. Carbamazepine, valproic acid, baclofen and honokiol, were found to oppose various aspects of the PTZ-induced changes in activity. Lidocaine and haloperidol exacerbated the PTZ-induced activity changes and sulpiride had no effect. By comparing the degree of phenotypic rescue with the mechanism of action of each therapeutic we have shown that the low-concentration PTZ model can produce a number of intermediate phenotypes that model symptoms of bipolar disorder, may be useful in modelling other disease states, and will help predict the efficacy of novel therapeutics. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.

  14. Therapeutic Potency of Nanoformulations of siRNAs and shRNAs in Animal Models of Cancers.

    PubMed

    Karim, Md Emranul; Tha, Kyi Kyi; Othman, Iekhsan; Borhan Uddin, Mohammad; Chowdhury, Ezharul Hoque

    2018-05-26

    RNA interference (RNAi) has brought revolutionary transformations to cancer management over the past two decades. RNAi-based therapeutics, including siRNAs and shRNAs, have immense scope to silence the expression of mutant cancer genes specifically in a therapeutic context. Although tremendous progress has been made in establishing catalytic RNA as a new class of biologics for cancer management, many extracellular and intracellular barriers still pose a long-standing challenge on the way to clinical approval. A series of chemically suitable, safe and effective viral and non-viral carriers have emerged to overcome physiological barriers and ensure targeted delivery of RNAi. Newly developed carriers, delivery techniques and gene-editing technologies have strengthened current treatment protocols against cancer. This review chronicles siRNA development and the challenges of translating RNAi therapeutics from the laboratory to the bedside, focusing on recent advances in siRNA delivery vehicles and their limitations. Furthermore, it presents an overview of animal-model studies of siRNA- and shRNA-based cancer gene therapy over the past 15 years, highlighting the roles of genes in multiple cancers, pharmacokinetic parameters and critical evaluation. The review concludes with future directions for the development of catalytic RNA vehicles and design strategies to make RNAi-based cancer gene therapy more promising in surmounting cancer gene delivery challenges.

  15. Atmospheric Correction Algorithm for Hyperspectral Imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. J. Pollina

    1999-09-01

    In December 1997, the US Department of Energy (DOE) established a Center of Excellence (Hyperspectral-Multispectral Algorithm Research Center, HyMARC) for promoting the research and development of algorithms to exploit spectral imagery. This center is located at the DOE Remote Sensing Laboratory in Las Vegas, Nevada, and is operated for the DOE by Bechtel Nevada. This paper presents the results to date of a research project begun at the center during 1998 to investigate the correction of hyperspectral data for atmospheric aerosols. Results of a project conducted by the Rochester Institute of Technology to define, implement, and test procedures for absolute calibration and correction of hyperspectral data to absolute units of high-spectral-resolution imagery will be presented. Hybrid techniques for atmospheric correction using image or spectral scene data coupled through radiative propagation models will be specifically addressed. Results of this effort to analyze HYDICE sensor data will be included. Preliminary results based on studying the performance of standard routines, such as Atmospheric Pre-corrected Differential Absorption and Nonlinear Least Squares Spectral Fit, in retrieving reflectance spectra show overall reflectance retrieval errors of approximately one to two reflectance units in the 0.4- to 2.5-micron-wavelength region (outside of the absorption features). These results are based on HYDICE sensor data collected from the Southern Great Plains Atmospheric Radiation Measurement site during overflights conducted in July of 1997. Results of an upgrade made to the model-based atmospheric correction techniques, which take advantage of updates made to the moderate-resolution atmospheric transmittance model (MODTRAN 4.0) software, will also be presented. Data will be shown to demonstrate how reflectance retrieval in the shorter wavelengths of the blue-green region is improved because of enhanced modeling of multiple-scattering effects.

  16. A Model-Based Approach for Microvasculature Structure Distortion Correction in Two-Photon Fluorescence Microscopy Images

    PubMed Central

    Dao, Lam; Glancy, Brian; Lucotte, Bertrand; Chang, Lin-Ching; Balaban, Robert S; Hsu, Li-Yueh

    2015-01-01

    This paper investigates a post-processing approach to correct spatial distortion in two-photon fluorescence microscopy images for vascular network reconstruction. It is aimed at in vivo imaging in large field-of-view, deep-tissue studies of vascular structures. Based on simple geometric modeling of the object of interest, a distortion function is estimated directly from the image volume by deconvolution analysis. This distortion function is then applied to subvolumes of the image stack to adaptively adjust for spatially varying distortion and to reduce image blurring through blind deconvolution. The proposed technique was first evaluated in phantom imaging of fluorescent microspheres comparable in size to the underlying capillary vascular structures. The effectiveness of restoring the three-dimensional spherical geometry of the microspheres using the estimated distortion function was compared with an empirically measured point-spread function. Next, the proposed approach was applied to in vivo vascular imaging of mouse skeletal muscle to reduce the image distortion of the capillary structures. We show that the proposed method effectively improves image quality and reduces the spatially varying distortion that occurs in large field-of-view, deep-tissue vascular datasets. The proposed method will aid qualitative interpretation and quantitative analysis of vascular structures in fluorescence microscopy images. PMID:26224257

  17. Disease modeling and cell based therapy with iPSC: future therapeutic option with fast and safe application.

    PubMed

    Kim, Changsung

    2014-03-01

    Induced pluripotent stem cell (iPSC) technology offers great hope for treating human diseases long regarded as untreatable, and further enables personalized medicine without the ethical issues and immunological rejection that embryonic stem cell (hES) treatment has faced. It is generally agreed that iPSC knowledge is best harnessed through disease modeling that mimics human pathological development, rather than through trials using conventional rodent models and cell lines. We can now routinely generate iPSCs from patient-specific cell sources, such as skin fibroblasts, hair follicle cells, blood samples, and even urine, which contains small numbers of epithelial cells. iPSCs are both similar and dissimilar to hES cells: they can regenerate tissue, and even a full organism, as ES cells do, but their therapeutic advantage lies in regenerated tissue and lineage-specific differentiation. Depending on the lineage and cell type, iPSCs can be generated either with or without tissue memory (DNA rearrangements/epigenetic marks). This makes iPSCs an even better choice for disease modeling as well as cell-based therapy; tissue-memory-containing iPSCs derived from mature leukocytes, for example, could be beneficial for treating cancer and infectious disease. In this review, the benefits of iPSCs for translational approaches are presented.

  18. IRT Models for Ability-Based Guessing

    ERIC Educational Resources Information Center

    Martin, Ernesto San; del Pino, Guido; De Boeck, Paul

    2006-01-01

    An ability-based guessing model is formulated and applied to several data sets regarding educational tests in language and in mathematics. The formulation of the model is such that the probability of a correct guess does not only depend on the item but also on the ability of the individual, weighted with a general discrimination parameter. By so…

  19. Cone-beam CT of traumatic brain injury using statistical reconstruction with a post-artifact-correction noise model

    NASA Astrophysics Data System (ADS)

    Dang, H.; Stayman, J. W.; Sisniega, A.; Xu, J.; Zbijewski, W.; Yorkston, J.; Aygun, N.; Koliatsos, V.; Siewerdsen, J. H.

    2015-03-01

    Traumatic brain injury (TBI) is a major cause of death and disability. The current front-line imaging modality for TBI detection is CT, which reliably detects intracranial hemorrhage (fresh blood contrast 30-50 HU, size down to 1 mm) in non-contrast-enhanced exams. Compared to CT, flat-panel detector (FPD) cone-beam CT (CBCT) systems offer lower cost, greater portability, and a smaller footprint suitable for point-of-care deployment. We are developing FPD-CBCT to facilitate TBI detection at the point of care, such as in emergent, ambulance, sports, and military applications. However, current FPD-CBCT systems generally face challenges in low-contrast, soft-tissue imaging. Model-based reconstruction can improve image quality in soft-tissue imaging compared to conventional filtered back-projection (FBP) by leveraging a high-fidelity forward model and sophisticated regularization. In FPD-CBCT TBI imaging, the measurement noise characteristics undergo substantial change following artifact correction, resulting in non-negligible noise amplification. In this work, we extend penalized weighted least-squares (PWLS) image reconstruction to include the two dominant artifact corrections (scatter and beam hardening) in FPD-CBCT TBI imaging by correctly modeling the variance change following each correction. Experiments were performed on a CBCT test bench using an anthropomorphic phantom emulating intra-parenchymal hemorrhage in acute TBI, and the proposed method demonstrated an improvement in blood-brain contrast-to-noise ratio (CNR = 14.2) compared to FBP (CNR = 9.6) and PWLS with conventional weights (CNR = 11.6) at fixed spatial resolution (1 mm edge-spread width at the target contrast). The results support the hypothesis that FPD-CBCT can fulfill the image quality requirements for reliable TBI detection, using high-fidelity artifact correction and statistical reconstruction with an accurate post-artifact-correction noise model.

  20. Nanoparticle-based targeted therapeutics in head-and-neck cancer.

    PubMed

    Wu, Ting-Ting; Zhou, Shui-Hong

    2015-01-01

    Head-and-neck cancer is a major form of cancer worldwide. Treatment consists of surgery, radiation therapy and chemotherapy, but these have not improved survival rates over the past few decades. Versatile nanoparticles with selective tumor targeting are considered to have the potential to improve these poor outcomes. Nanoparticle-based targeted therapeutics have extended into many areas, including gene silencing, chemotherapeutic drug delivery, radiosensitization and photothermal therapy, and have shown much promise. In this review, we discuss recent advances in nanoparticle-mediated targeted therapeutics for head-and-neck cancer, with an emphasis on targeting points, and outline future perspectives.

  1. Development of anti-migraine therapeutics using the capsaicin-induced dermal blood flow model.

    PubMed

    Buntinx, Linde; Vermeersch, Steve; de Hoon, Jan

    2015-11-01

    The efficacy of calcitonin gene-related peptide (receptor) (CGRP-(R)) blocking therapeutics in the treatment of acute migraine headache provided proof of concept for the involvement of CGRP in the pathophysiology of this disorder. One of the major hurdles for the development of any class of drugs, including CGRP-blocking therapeutics, is the early clinical development process, during which toxic and inefficacious compounds need to be eliminated as early as possible in order to focus on the most promising molecules. At this stage, human models providing proof of target engagement, combined with safety and tolerability studies, are extremely valuable in identifying those therapeutics that achieve the highest target engagement at the lowest exposure. They guide go/no-go decision making, establish confidence in the candidate molecule by de-risking toxicity and safety issues, and thereby speed up early clinical development. In this review the focus is on the so-called 'capsaicin model' as a typical example of a target-engagement biomarker used as a human model for the development of CGRP-blocking therapeutics. By applying capsaicin to the skin, TRPV1 channels are activated and a CGRP-mediated increase in dermal blood flow can be quantified with laser Doppler perfusion imaging. Effective CGRP-blocking therapeutics, in turn, block this response. The translation of this biomarker model from animals to humans is discussed, as well as the limitations of the assay in predicting the efficacy of anti-migraine drugs. © 2015 The British Pharmacological Society.

  2. Research and implementation of the algorithm for unwrapped and distortion correction basing on CORDIC for panoramic image

    NASA Astrophysics Data System (ADS)

    Zhang, Zhenhai; Li, Kejie; Wu, Xiaobing; Zhang, Shujiang

    2008-03-01

    An unwrapping and distortion-correction algorithm based on the Coordinate Rotation Digital Computer (CORDIC) and bilinear interpolation is presented in this paper, with the purpose of processing dynamic panoramic annular images. An original annular panoramic image captured by a panoramic annular lens (PAL) can be unwrapped and corrected to a conventional rectangular image without distortion, which is much closer to human vision. The algorithm for panoramic image processing is modeled in VHDL and implemented in an FPGA. The experimental results show that the proposed unwrapping and distortion-correction algorithm has low computational complexity and that the architecture for dynamic panoramic image processing has low hardware cost and power consumption. The proposed algorithm is thus effective.
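The unwrapping itself is a polar-to-rectangular resampling with bilinear interpolation; in the paper's FPGA design the sine/cosine pairs are produced by CORDIC, whereas the sketch below simply calls the math library. All names and parameters here are illustrative, not taken from the paper.

```python
import math

def unwrap_panorama(img, cx, cy, r_in, r_out, width, height):
    """Unwrap an annular panoramic image into a rectangular one.

    Each output pixel (u, v) maps back to polar coordinates: u spans the
    angle 0..2*pi and v spans the radius r_in..r_out; the source sample is
    fetched with bilinear interpolation.
    """
    out = [[0.0] * width for _ in range(height)]
    for v in range(height):
        r = r_in + (r_out - r_in) * v / (height - 1)
        for u in range(width):
            theta = 2.0 * math.pi * u / width
            x = cx + r * math.cos(theta)   # CORDIC computes these in hardware
            y = cy + r * math.sin(theta)
            x0, y0 = int(math.floor(x)), int(math.floor(y))
            fx, fy = x - x0, y - y0
            if 0 <= x0 and x0 + 1 < len(img[0]) and 0 <= y0 and y0 + 1 < len(img):
                out[v][u] = ((1 - fx) * (1 - fy) * img[y0][x0]
                             + fx * (1 - fy) * img[y0][x0 + 1]
                             + (1 - fx) * fy * img[y0 + 1][x0]
                             + fx * fy * img[y0 + 1][x0 + 1])
    return out
```

A hardware version replaces the per-pixel trigonometry with iterative CORDIC rotations and fixed-point arithmetic, which is what keeps the FPGA cost low.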

  3. Study of a quadratic redshift-based correction in f(R) gravity with Baryonic matter

    NASA Astrophysics Data System (ADS)

    Masoudi, Mozhgan; Saffari, Reza

    2015-08-01

    This paper considers second-order redshift-based corrections to the derivative of the modified gravitational action, f(R), to explain the late-time acceleration indicated by Type Ia supernovae (SNeIa) without invoking dark components. We obtain the cosmological dynamical parameters of the universe for these redshift-dependent corrections. Next, we use the recent SNeIa Union2 data, the shift parameter of the cosmic microwave background radiation, baryon acoustic oscillations from the Sloan Digital Sky Survey (SDSS), and a combined analysis of these observations to put constraints on the parameters of the selected F(z) model. Interestingly, the well-known age problem of three old objects can be alleviated in this model for the combined observations. Finally, the reference action is reconstructed in terms of its Taylor expansion. We also show that the reconstructed action passes the solar-system tests and the stability test for the cosmological solution.

  4. WE-DE-207B-12: Scatter Correction for Dedicated Cone Beam Breast CT Based On a Forward Projection Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, L; Zhu, L; Vedantham, S

    2016-06-15

    Purpose: The image quality of dedicated cone-beam breast CT (CBBCT) is fundamentally limited by substantial x-ray scatter contamination, resulting in cupping artifacts and contrast loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose to suppress x-ray scatter in CBBCT images using a deterministic forward projection model. Method: We first use the 1st-pass FDK-reconstructed CBBCT images to segment fibroglandular and adipose tissue. Attenuation coefficients are assigned to the two tissues based on the x-ray spectrum used for image acquisition, and the segmented volume is forward projected to simulate scatter-free primary projections. We estimate the scatter by subtracting the simulated primary projection from the measured projection, and the resultant scatter map is further refined by a Fourier-domain fitting algorithm after discarding untrusted scatter information. The final scatter estimate is subtracted from the measured projection for effective scatter correction. In our implementation, the proposed scatter correction takes 0.5 seconds for each projection. The method was evaluated using the overall image spatial non-uniformity (SNU) metric and the contrast-to-noise ratio (CNR) with 5 clinical datasets of BI-RADS 4/5 subjects. Results: For the 5 clinical datasets, our method reduced the SNU from 7.79% to 1.68% in the coronal view and from 6.71% to 3.20% in the sagittal view. The average CNR improved by a factor of 1.38 in the coronal view and 1.26 in the sagittal view. Conclusion: The proposed scatter correction approach requires no additional scans or prior images and uses a deterministic model for efficient calculation. Evaluation with clinical datasets demonstrates the feasibility and stability of the method. These features are attractive for clinical CBBCT and make our method distinct from other approaches.
Supported partly by NIH R21EB019597, R21
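
    A minimal sketch of the subtract-and-refine step, assuming hypothetical projection arrays and substituting a Gaussian low-pass filter for the paper's Fourier-domain fitting (both exploit the fact that scatter varies slowly across the detector):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_scatter(measured, simulated_primary, sigma=20.0):
    """One-view scatter correction sketch.

    Scatter is estimated as the measured projection minus the simulated
    scatter-free primary, negatives are discarded as untrusted, and the
    map is smoothed before subtraction (a stand-in for the paper's
    Fourier-domain refinement).
    """
    scatter = measured - simulated_primary
    scatter = np.clip(scatter, 0, None)        # discard untrusted negatives
    scatter = gaussian_filter(scatter, sigma)  # low-frequency refinement
    return measured - scatter
```
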

  5. Generalized algebraic scene-based nonuniformity correction algorithm.

    PubMed

    Ratliff, Bradley M; Hayat, Majeed M; Tyo, J Scott

    2005-02-01

    A generalization of a recently developed algebraic scene-based nonuniformity correction algorithm for focal plane array (FPA) sensors is presented. The new technique uses pairs of image frames exhibiting arbitrary one- or two-dimensional translational motion to compute compensator quantities that are then used to remove nonuniformity in the bias of the FPA response. Unlike its predecessor, the generalization does not require the use of either a blackbody calibration target or a shutter. The algorithm has a low computational overhead, lending itself to real-time hardware implementation. The high-quality correction ability of this technique is demonstrated through application to real IR data from both cooled and uncooled infrared FPAs. A theoretical and experimental error analysis is performed to study the accuracy of the bias compensator estimates in the presence of two main sources of error.
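
    The algebraic idea can be illustrated for the simplest case of a known one-pixel vertical shift between two frames; the sketch below is our own simplification, not the paper's general one- or two-dimensional algorithm, and recovers the bias pattern up to a per-column constant:

```python
import numpy as np

def estimate_bias_column(frame1, frame2):
    """Bias nonuniformity estimate from a one-pixel vertical shift.

    If frame2 views the scene shifted down by one pixel, detector
    (i+1, j) in frame2 sees what detector (i, j) saw in frame1, so the
    readout difference isolates b[i+1, j] - b[i, j].  Summing the
    differences down each column recovers the bias up to an unknown
    per-column constant (fixed to zero in the first row here).
    """
    diff = frame2[1:, :] - frame1[:-1, :]  # b[i+1, j] - b[i, j]
    bias = np.zeros_like(frame1)
    bias[1:, :] = np.cumsum(diff, axis=0)  # integrate the differences
    return bias
```
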

  6. Relativistic Corrections to the Bohr Model of the Atom

    ERIC Educational Resources Information Center

    Kraft, David W.

    1974-01-01

    Presents a simple means for extending the Bohr model to include relativistic corrections using a derivation similar to that for the non-relativistic case, except that the relativistic expressions for mass and kinetic energy are employed. (Author/GS)

  7. Enhanced identification and biological validation of differential gene expression via Illumina whole-genome expression arrays through the use of the model-based background correction methodology

    PubMed Central

    Ding, Liang-Hao; Xie, Yang; Park, Seongmi; Xiao, Guanghua; Story, Michael D.

    2008-01-01

    Despite the tremendous growth of microarray usage in scientific studies, there is a lack of standards for background correction methodologies, especially in single-color microarray platforms. Traditional background subtraction methods often generate negative signals and thus cause large amounts of data loss. Hence, some researchers prefer to avoid background corrections, which typically result in the underestimation of differential expression. Here, by utilizing nonspecific negative control features integrated into Illumina whole genome expression arrays, we have developed a method of model-based background correction for BeadArrays (MBCB). We compared the MBCB with a method adapted from the Affymetrix robust multi-array analysis algorithm and with no background subtraction, using a mouse acute myeloid leukemia (AML) dataset. We demonstrated that differential expression ratios obtained by using the MBCB had the best correlation with quantitative RT–PCR. MBCB also achieved better sensitivity in detecting differentially expressed genes with biological significance. For example, we demonstrated that the differential regulation of Tnfr2, Ikk, and NF-kappaB in the death receptor pathway in the AML samples could only be detected by using data after MBCB implementation. We conclude that MBCB is a robust background correction method that will lead to more precise determination of gene expression and better biological interpretation of Illumina BeadArray data. PMID:18450815

  8. Lowered threshold energy for femtosecond laser induced optical breakdown in a water based eye model by aberration correction with adaptive optics.

    PubMed

    Hansen, Anja; Géneaux, Romain; Günther, Axel; Krüger, Alexander; Ripken, Tammo

    2013-06-01

    In femtosecond laser ophthalmic surgery, tissue dissection is achieved by photodisruption based on laser-induced optical breakdown. In order to minimize collateral damage to the eye, laser surgery systems should be optimized towards the lowest possible energy threshold for photodisruption. However, optical aberrations of the eye and the laser system distort the irradiance distribution from an ideal profile, which causes a rise in breakdown threshold energy even if great care is taken to minimize the aberrations of the system during design and alignment. In this study we used a water chamber with an achromatic focusing lens and a scattering sample as an eye model and determined the breakdown threshold in single-pulse plasma transmission loss measurements. Due to aberrations, the precise lower limit for breakdown threshold irradiance in water is still unknown. Here we show that the threshold energy can be substantially reduced when using adaptive optics to improve the irradiance distribution by spatial beam shaping. We found that for initial aberrations with a root-mean-square wavefront error of only one third of the wavelength, the threshold energy can still be reduced by a factor of three if the aberrations are corrected to the diffraction limit by adaptive optics. The transmitted pulse energy is reduced by 17% at twice the threshold. Furthermore, the gas bubble motions after breakdown for pulse trains at 5 kilohertz repetition rate show a more transverse direction in the corrected case compared to the more spherical distribution without correction. Our results demonstrate how both applied and transmitted pulse energy could be reduced during ophthalmic surgery when correcting for aberrations. As a consequence, the risk of retinal damage by transmitted energy and the extent of collateral damage to the focal volume could be minimized accordingly when using adaptive optics in fs-laser surgery.

  9. Lowered threshold energy for femtosecond laser induced optical breakdown in a water based eye model by aberration correction with adaptive optics

    PubMed Central

    Hansen, Anja; Géneaux, Romain; Günther, Axel; Krüger, Alexander; Ripken, Tammo

    2013-01-01

    In femtosecond laser ophthalmic surgery, tissue dissection is achieved by photodisruption based on laser-induced optical breakdown. In order to minimize collateral damage to the eye, laser surgery systems should be optimized towards the lowest possible energy threshold for photodisruption. However, optical aberrations of the eye and the laser system distort the irradiance distribution from an ideal profile, which causes a rise in breakdown threshold energy even if great care is taken to minimize the aberrations of the system during design and alignment. In this study we used a water chamber with an achromatic focusing lens and a scattering sample as an eye model and determined the breakdown threshold in single-pulse plasma transmission loss measurements. Due to aberrations, the precise lower limit for breakdown threshold irradiance in water is still unknown. Here we show that the threshold energy can be substantially reduced when using adaptive optics to improve the irradiance distribution by spatial beam shaping. We found that for initial aberrations with a root-mean-square wavefront error of only one third of the wavelength, the threshold energy can still be reduced by a factor of three if the aberrations are corrected to the diffraction limit by adaptive optics. The transmitted pulse energy is reduced by 17% at twice the threshold. Furthermore, the gas bubble motions after breakdown for pulse trains at 5 kilohertz repetition rate show a more transverse direction in the corrected case compared to the more spherical distribution without correction. Our results demonstrate how both applied and transmitted pulse energy could be reduced during ophthalmic surgery when correcting for aberrations. As a consequence, the risk of retinal damage by transmitted energy and the extent of collateral damage to the focal volume could be minimized accordingly when using adaptive optics in fs-laser surgery. PMID:23761849

  10. Simple liquid models with corrected dielectric constants

    PubMed Central

    Fennell, Christopher J.; Li, Libo; Dill, Ken A.

    2012-01-01

    Molecular simulations often use explicit-solvent models. Sometimes explicit-solvent models can give inaccurate values for basic liquid properties, such as the density, heat capacity, and permittivity, as well as inaccurate values for molecular transfer free energies. Such errors have motivated the development of more complex solvents, such as polarizable models. We describe an alternative here. We give new fixed-charge models of solvents for molecular simulations – water, carbon tetrachloride, chloroform and dichloromethane. Normally, such solvent models are parameterized to agree with experimental values of the neat liquid density and enthalpy of vaporization. Here, in addition to those properties, our parameters are chosen to give the correct dielectric constant. We find that these new parameterizations also happen to give better values for other properties, such as the self-diffusion coefficient. We believe that parameterizing fixed-charge solvent models to fit experimental dielectric constants may provide better and more efficient ways to treat solvents in computer simulations. PMID:22397577

  11. Evaluation of a non-Arrhenius model for therapeutic monoclonal antibody aggregation.

    PubMed

    Kayser, Veysel; Chennamsetty, Naresh; Voynov, Vladimir; Helk, Bernhard; Forrer, Kurt; Trout, Bernhardt L

    2011-07-01

    Understanding antibody aggregation is of great significance for the pharmaceutical industry. We studied the aggregation of five different therapeutic monoclonal antibodies (mAbs) with size-exclusion chromatography-high-performance liquid chromatography (SEC-HPLC), fluorescence spectroscopy, electron microscopy, and light scattering methods at various temperatures with the aim of gaining insight into the aggregation process and developing models of it. In particular, we find that the kinetics can be described by a second-order model and are non-Arrhenius. Thus, we develop a non-Arrhenius model to connect accelerated aggregation experiments at high temperature to long-term storage experiments at low temperature. We evaluate our model by predicting mAb aggregation and comparing it with long-term behavior. Our results suggest that the number of monomers and mAb conformations within aggregates vary with the size and age of the aggregates, and that only certain sizes of aggregates are populated in the solution. We also propose a kinetic model based on conformational changes of proteins and monomer peak loss kinetics from SEC-HPLC. This model could be employed for a detailed analysis of mAb aggregation kinetics. Copyright © 2011 Wiley-Liss, Inc. and the American Pharmacists Association
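
    The second-order kinetics and the non-Arrhenius temperature dependence mentioned above can be sketched as follows; the quadratic 1/T rate law and all coefficient values are our own illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def monomer_fraction(t, k, m0=1.0):
    """Second-order monomer loss dM/dt = -k*M**2, which integrates to
    M(t) = M0 / (1 + k*M0*t)."""
    return m0 / (1.0 + k * m0 * t)

def rate_non_arrhenius(T, a, b, c):
    """Illustrative non-Arrhenius rate law ln k = a + b/T + c/T**2; the
    quadratic term supplies the curvature that a plain Arrhenius fit
    (c = 0) cannot capture.  Coefficients are placeholders."""
    return np.exp(a + b / T + c / T ** 2)
```

    Fitting a, b, and c to accelerated high-temperature data and evaluating the rate at the storage temperature would then give a long-term prediction in the spirit of the abstract.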

  12. Bandwidth correction for LED chromaticity based on Levenberg-Marquardt algorithm

    NASA Astrophysics Data System (ADS)

    Huang, Chan; Jin, Shiqun; Xia, Guo

    2017-10-01

    Light emitting diodes (LEDs) are widely employed in industrial applications and scientific research. With a spectrometer, the chromaticity of an LED can be measured; however, a chromaticity shift will occur due to the broadening effects of the spectrometer. In this paper, an approach to bandwidth correction of LED chromaticity based on the Levenberg-Marquardt algorithm is put forward. We compare the chromaticity of simulated LED spectra after bandwidth correction by the proposed method and by the differential-operator method. The experimental results show that the proposed approach achieves excellent bandwidth-correction performance, demonstrating its effectiveness. The method has also been tested on measured blue LED spectra.

  13. Quantification of hepatic steatosis with T1-independent, T2-corrected MR imaging with spectral modeling of fat: blinded comparison with MR spectroscopy.

    PubMed

    Meisamy, Sina; Hines, Catherine D G; Hamilton, Gavin; Sirlin, Claude B; McKenzie, Charles A; Yu, Huanzhou; Brittain, Jean H; Reeder, Scott B

    2011-03-01

    To prospectively compare an investigational version of a complex-based, chemical shift-based fat fraction magnetic resonance (MR) imaging method with MR spectroscopy for the quantification of hepatic steatosis. This study was approved by the institutional review board and was HIPAA compliant. Written informed consent was obtained before all studies. Fifty-five patients (31 women, 24 men; age range, 24-71 years) were prospectively imaged at 1.5 T with quantitative MR imaging and single-voxel MR spectroscopy, each within a single breath hold. The effects of T2 correction, spectral modeling of fat, and magnitude fitting for eddy current correction on fat quantification with MR imaging were investigated by reconstructing fat fraction images from the same source data with different combinations of error correction. Single-voxel T2-corrected MR spectroscopy was used to measure fat fraction and served as the reference standard. All MR spectroscopy data were postprocessed at a separate institution by an MR physicist who was blinded to the MR imaging results. Fat fractions measured with MR imaging and MR spectroscopy were compared statistically to determine the correlation (r²) and, as measures of agreement, the slope and intercept; to determine whether MR imaging can help quantify fat; and to examine the importance of T2 correction, spectral modeling of fat, and eddy current correction. Two-sided t tests (significance level, P = .05) were used to determine whether estimated slopes and intercepts were significantly different from 1.0 and 0.0, respectively. Sensitivity and specificity for the classification of clinically significant steatosis were evaluated. Overall, there was excellent correlation between MR imaging and MR spectroscopy for all reconstruction combinations. 
However, agreement was only achieved when T2 correction, spectral modeling of fat, and magnitude fitting for eddy current correction were used (r²

  14. Experimental demonstration of passive acoustic imaging in the human skull cavity using CT-based aberration corrections

    PubMed Central

    Jones, Ryan M.; O’Reilly, Meaghan A.; Hynynen, Kullervo

    2015-01-01

    Purpose: Experimentally verify a previously described technique for performing passive acoustic imaging through an intact human skull using noninvasive, computed tomography (CT)-based aberration corrections [Jones et al., Phys. Med. Biol. 58, 4981–5005 (2013)]. Methods: A sparse hemispherical receiver array (30 cm diameter) consisting of 128 piezoceramic discs (2.5 mm diameter, 612 kHz center frequency) was used to passively listen through ex vivo human skullcaps (n = 4) to acoustic emissions from a narrow-band fixed source (1 mm diameter, 516 kHz center frequency) and from ultrasound-stimulated (5 cycle bursts, 1 Hz pulse repetition frequency, estimated in situ peak negative pressure 0.11–0.33 MPa, 306 kHz driving frequency) Definity™ microbubbles flowing through a thin-walled tube phantom. Initial in vivo feasibility testing of the method was performed. The performance of the method was assessed through comparisons to images generated without skull corrections, with invasive source-based corrections, and with water-path control images. Results: For source locations at least 25 mm from the inner skull surface, the modified reconstruction algorithm successfully restored a single focus within the skull cavity at a location within 1.25 mm from the true position of the narrow-band source. The results obtained from imaging single bubbles are in good agreement with numerical simulations of point source emitters and the authors' previous experimental measurements using source-based skull corrections [O'Reilly et al., IEEE Trans. Biomed. Eng. 61, 1285–1294 (2014)]. In a rat model, microbubble activity was mapped through an intact human skull at pressure levels below and above the threshold for focused ultrasound-induced blood–brain barrier opening. During bursts that led to coherent bubble activity, the location of maximum intensity in images generated with CT-based skull corrections was found to deviate by less than 1 mm, on average, from the position

  15. An efficient algorithm for automatic phase correction of NMR spectra based on entropy minimization

    NASA Astrophysics Data System (ADS)

    Chen, Li; Weng, Zhiqiang; Goh, LaiYoong; Garland, Marc

    2002-09-01

    A new algorithm for automatic phase correction of NMR spectra based on entropy minimization is proposed. The optimal zero-order and first-order phase corrections for an NMR spectrum are determined by minimizing entropy. The objective function is constructed using a Shannon-type information entropy measure, with the entropy computed from the normalized first derivative of the NMR spectral data. The algorithm has been successfully applied to experimental ¹H NMR spectra, and the results of automatic phase correction are found to be comparable to, or perhaps better than, manual phase correction. The advantages of this automatic phase correction algorithm include its simple mathematical basis and its straightforward, reproducible, and efficient optimization procedure. The algorithm is implemented in the Matlab program ACME (Automated phase Correction based on Minimization of Entropy).
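
    A sketch of the ACME idea in Python (the original is a Matlab program; the grid seeding and simplex optimizer below are our own choices, not the paper's exact procedure):

```python
import numpy as np
from scipy.optimize import minimize

def phase_correct(spec, ph0=0.0, ph1=0.0):
    """Apply zero-order (ph0) and first-order (ph1) phase, in radians."""
    n = spec.size
    return spec * np.exp(1j * (ph0 + ph1 * np.arange(n) / n))

def entropy_of_phase(params, spec):
    """ACME-style objective: Shannon entropy of the normalized first
    derivative of the phased spectrum's real part."""
    real = phase_correct(spec, params[0], params[1]).real
    deriv = np.abs(np.diff(real))
    p = deriv / (deriv.sum() + 1e-12)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def acme(spec):
    """Entropy-minimizing phase correction: coarse grid over the
    zero-order phase, then a simplex refinement of both orders."""
    p0_grid = np.linspace(-np.pi, np.pi, 25)
    p0_best = min(p0_grid, key=lambda p0: entropy_of_phase((p0, 0.0), spec))
    res = minimize(entropy_of_phase, x0=[p0_best, 0.0], args=(spec,),
                   method="Nelder-Mead")
    return phase_correct(spec, res.x[0], res.x[1])
```
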

  16. A rank-based approach for correcting systematic biases in spatial disaggregation of coarse-scale climate simulations

    NASA Astrophysics Data System (ADS)

    Nahar, Jannatun; Johnson, Fiona; Sharma, Ashish

    2017-07-01

    Use of General Circulation Model (GCM) precipitation and evapotranspiration sequences for hydrologic modelling can result in unrealistic simulations due to the coarse scales at which GCMs operate and the systematic biases they contain. The Bias Correction Spatial Disaggregation (BCSD) method is a popular statistical downscaling and bias correction method developed to address this issue. The advantage of BCSD is its ability to reduce biases in the distribution of precipitation totals at the GCM scale and then introduce more realistic variability at finer scales than simpler spatial interpolation schemes. Although BCSD corrects biases at the GCM scale before disaggregation, biases are re-introduced at finer spatial scales by the assumptions made in the spatial disaggregation process. Our study focuses on this limitation of BCSD and proposes a rank-based approach that aims to reduce the spatial disaggregation bias, especially for low and high precipitation extremes. BCSD requires the specification of a multiplicative bias correction anomaly field that represents the ratio of the fine-scale precipitation to the disaggregated precipitation. It is shown that there is significant temporal variation in the anomalies, which is masked when a mean anomaly field is used. This can be improved by modelling the anomalies in rank space. Results from the application of the rank-BCSD procedure improve the match between the distributions of observed and downscaled precipitation at the fine scale compared to the original BCSD approach. Further improvements in the distribution are identified when a scaling correction to preserve mass in the disaggregation process is implemented. An assessment of the approach using a single GCM over Australia shows clear advantages, especially in the simulation of particularly low and high downscaled precipitation amounts.
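
    Working in rank space can be illustrated with standard quantile mapping, which replaces each model value with the observed value at the same empirical quantile; this is a simplified stand-in for the paper's rank-BCSD anomaly modeling, not its actual procedure:

```python
import numpy as np

def quantile_map(model, obs, values):
    """Rank-space bias correction sketch.

    Each value is assigned its empirical quantile (rank) within the
    model climatology, then mapped to the observed value at that same
    quantile, so the corrected series inherits the observed
    distribution, including its tails.
    """
    ranks = np.searchsorted(np.sort(model), values, side="right") / model.size
    ranks = np.clip(ranks, 0.0, 1.0)
    return np.quantile(obs, ranks)
```
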

  17. An improved simulation of the 2015 El Niño event by optimally correcting the initial conditions and model parameters in an intermediate coupled model

    NASA Astrophysics Data System (ADS)

    Zhang, Rong-Hua; Tao, Ling-Jiang; Gao, Chuan

    2017-09-01

    Large uncertainties exist in real-time predictions of the 2015 El Niño event, which have systematic intensity biases that are strongly model-dependent. It is critically important to characterize those model biases so they can be reduced appropriately. In this study, the conditional nonlinear optimal perturbation (CNOP)-based approach was applied to an intermediate coupled model (ICM) equipped with a four-dimensional variational data assimilation technique. The CNOP-based approach was used to quantify prediction errors that can be attributed to initial conditions (ICs) and model parameters (MPs). Two key MPs were considered in the ICM: one represents the intensity of the thermocline effect, and the other represents the relative coupling intensity between the ocean and atmosphere. Two experiments were performed to illustrate the effects of error corrections, one with a standard simulation and another with an optimized simulation in which errors in the ICs and MPs derived from the CNOP-based approach were optimally corrected. The results indicate that simulations of the 2015 El Niño event can be effectively improved by using CNOP-derived error corrections. In particular, the El Niño intensity in late 2015 was adequately captured when simulations were started from early 2015. Quantitatively, the Niño3.4 SST index simulated in December 2015 increased to 2.8 °C in the optimized simulation, compared with only 1.5 °C in the standard simulation. The feasibility and effectiveness of using the CNOP-based technique to improve ENSO simulations are demonstrated in the context of the 2015 El Niño event. The limitations and further applications are also discussed.

  18. Therapeutic Effects of Extinction Learning as a Model of Exposure Therapy in Rats.

    PubMed

    Fucich, Elizabeth A; Paredes, Denisse; Morilak, David A

    2016-12-01

    Current treatments for stress-related psychiatric disorders, such as depression and posttraumatic stress disorder (PTSD), are inadequate. Cognitive behavioral psychotherapies, including exposure therapy, are an alternative to pharmacotherapy, but the neurobiological mechanisms are unknown. Preclinical models demonstrating therapeutic effects of behavioral interventions are required to investigate such mechanisms. Exposure therapy bears similarity to extinction learning. Thus, we investigated the therapeutic effects of extinction learning as a behavioral intervention to model exposure therapy in rats, testing its effectiveness in reversing chronic stress-induced deficits in cognitive flexibility and coping behavior that resemble dimensions of depression and PTSD. Rats were fear-conditioned by pairing a tone with footshock, and then exposed to chronic unpredictable stress (CUS) that induces deficits in cognitive set-shifting and active coping behavior. They then received an extinction learning session as a therapeutic intervention by repeated exposure to the tone with no shock. Effects on cognitive flexibility and coping behavior were assessed 24 h later on the attentional set-shifting test or shock-probe defensive burying test, respectively. Extinction reversed the CUS-induced deficits in cognitive flexibility and coping behavior, and increased phosphorylation of ribosomal protein S6 in the medial prefrontal cortex (mPFC) of stress-compromised rats, suggesting a role for activity-dependent protein synthesis in the therapeutic effect. Inhibiting protein synthesis by microinjecting anisomycin into mPFC blocked the therapeutic effect of extinction on cognitive flexibility. These results demonstrate the utility of extinction as a model by which to study mechanisms underlying exposure therapy, and suggest these mechanisms involve protein synthesis in the mPFC, the further study of which may identify novel therapeutic targets.

  19. Therapeutic Effects of Extinction Learning as a Model of Exposure Therapy in Rats

    PubMed Central

    Fucich, Elizabeth A; Paredes, Denisse; Morilak, David A

    2016-01-01

    Current treatments for stress-related psychiatric disorders, such as depression and posttraumatic stress disorder (PTSD), are inadequate. Cognitive behavioral psychotherapies, including exposure therapy, are an alternative to pharmacotherapy, but the neurobiological mechanisms are unknown. Preclinical models demonstrating therapeutic effects of behavioral interventions are required to investigate such mechanisms. Exposure therapy bears similarity to extinction learning. Thus, we investigated the therapeutic effects of extinction learning as a behavioral intervention to model exposure therapy in rats, testing its effectiveness in reversing chronic stress-induced deficits in cognitive flexibility and coping behavior that resemble dimensions of depression and PTSD. Rats were fear-conditioned by pairing a tone with footshock, and then exposed to chronic unpredictable stress (CUS) that induces deficits in cognitive set-shifting and active coping behavior. They then received an extinction learning session as a therapeutic intervention by repeated exposure to the tone with no shock. Effects on cognitive flexibility and coping behavior were assessed 24 h later on the attentional set-shifting test or shock-probe defensive burying test, respectively. Extinction reversed the CUS-induced deficits in cognitive flexibility and coping behavior, and increased phosphorylation of ribosomal protein S6 in the medial prefrontal cortex (mPFC) of stress-compromised rats, suggesting a role for activity-dependent protein synthesis in the therapeutic effect. Inhibiting protein synthesis by microinjecting anisomycin into mPFC blocked the therapeutic effect of extinction on cognitive flexibility. These results demonstrate the utility of extinction as a model by which to study mechanisms underlying exposure therapy, and suggest these mechanisms involve protein synthesis in the mPFC, the further study of which may identify novel therapeutic targets. PMID:27417516

  20. Human Disease Models in Drosophila melanogaster and the Role of the Fly in Therapeutic Drug Discovery

    PubMed Central

    Pandey, Udai Bhan

    2011-01-01

    The common fruit fly, Drosophila melanogaster, is a well studied and highly tractable genetic model organism for understanding molecular mechanisms of human diseases. Many basic biological, physiological, and neurological properties are conserved between mammals and D. melanogaster, and nearly 75% of human disease-causing genes are believed to have a functional homolog in the fly. In the discovery process for therapeutics, traditional approaches employ high-throughput screening for small molecules that is based primarily on in vitro cell culture, enzymatic assays, or receptor binding assays. The majority of positive hits identified through these types of in vitro screens, unfortunately, are found to be ineffective and/or toxic in subsequent validation experiments in whole-animal models. New tools and platforms are needed in the discovery arena to overcome these limitations. The incorporation of D. melanogaster into the therapeutic discovery process holds tremendous promise for an enhanced rate of discovery of higher quality leads. D. melanogaster models of human diseases provide several unique features such as powerful genetics, highly conserved disease pathways, and very low comparative costs. The fly can effectively be used for low- to high-throughput drug screens as well as in target discovery. Here, we review the basic biology of the fly and discuss models of human diseases and opportunities for therapeutic discovery for central nervous system disorders, inflammatory disorders, cardiovascular disease, cancer, and diabetes. We also provide information and resources for those interested in pursuing fly models of human disease, as well as those interested in using D. melanogaster in the drug discovery process. PMID:21415126

  1. Improvement of forecast skill for severe weather by merging radar-based extrapolation and storm-scale NWP corrected forecast

    NASA Astrophysics Data System (ADS)

    Wang, Gaili; Wong, Wai-Kin; Hong, Yang; Liu, Liping; Dong, Jili; Xue, Ming

    2015-03-01

    The primary objective of this study is to improve deterministic high-resolution forecasts of rainfall from severe storms by merging a radar-based extrapolation scheme with a storm-scale Numerical Weather Prediction (NWP) model. The effectiveness of the Multi-scale Tracking and Forecasting Radar Echoes (MTaRE) model was compared with that of a storm-scale NWP model, the Advanced Regional Prediction System (ARPS), for forecasting a violent tornado event that developed over parts of western and much of central Oklahoma on May 24, 2011. Bias corrections were then performed to improve the accuracy of the ARPS forecasts. Finally, the corrected ARPS forecast and the radar-based extrapolation were optimally merged using a hyperbolic tangent weight scheme. The comparison of forecast skill between MTaRE and ARPS at a high spatial resolution of 0.01° × 0.01° and a high temporal resolution of 5 min showed that MTaRE outperformed ARPS in terms of index of agreement and mean absolute error (MAE). MTaRE had a better Critical Success Index (CSI) for lead times under 20 min and was comparable to ARPS for 20- to 50-min lead times, while ARPS had a better CSI for lead times beyond 50 min. Bias correction significantly improved the ARPS forecasts in terms of MAE and index of agreement, although the CSI of the corrected ARPS forecasts was similar to that of the uncorrected forecasts. Moreover, optimally merging the two forecasts with the hyperbolic tangent weight scheme further improved forecast accuracy and stability.
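
    The hyperbolic tangent weighting can be sketched as follows; the crossover lead time and scale are illustrative values, not those calibrated in the study:

```python
import numpy as np

def merge_forecasts(extrap, nwp, lead_min, t0=30.0, scale=10.0):
    """Blend a radar extrapolation with a (bias-corrected) NWP forecast
    using a hyperbolic tangent weight: extrapolation dominates at short
    lead times and NWP at long ones, with a smooth handover around the
    crossover lead time t0 (in minutes).
    """
    w = 0.5 * (1.0 - np.tanh((lead_min - t0) / scale))  # weight on extrapolation
    return w * extrap + (1.0 - w) * nwp
```
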

  2. Photometric correction for an optical CCD-based system based on the sparsity of an eight-neighborhood gray gradient.

    PubMed

    Zhang, Yuzhong; Zhang, Yan

    2016-07-01

    In an optical measurement and analysis system based on a CCD, due to the existence of optical vignetting and natural vignetting, photometric distortion, in which the intensity falls off away from the image center, severely affects subsequent processing and measuring precision. To deal with this problem, a simple and straightforward method for photometric distortion correction is presented in this paper. The method introduces a simple polynomial fitting model of the photometric distortion function and employs a particle swarm optimization algorithm to estimate the model parameters by minimizing the eight-neighborhood gray gradient. Compared with conventional calibration methods, this method can obtain the profile of the photometric distortion from only a single common image captured by the optical CCD-based system, with no need for a uniform-luminance area source as a standard reference or for prior knowledge of the relevant optical and geometric parameters. To illustrate the applicability of the method, numerical simulations and photometric distortions with different lens parameters are evaluated. Moreover, the application example of temperature field correction for casting billets also demonstrates the effectiveness of the method. The experimental results show that the proposed method achieves a maximum absolute error of 0.0765 for vignetting estimation and a relative error of 3.86% between vignetting estimates from different background images.
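
    A common way to model and divide out such a radial intensity fall-off is an even-order polynomial in the normalized radius; the sketch below assumes known coefficients, whereas the paper estimates its polynomial parameters with particle swarm optimization against the eight-neighborhood gray-gradient criterion:

```python
import numpy as np

def vignetting_profile(r, a2, a4, a6):
    """Even-order polynomial fall-off model V(r) = 1 + a2*r^2 + a4*r^4 + a6*r^6
    over the normalized radius r in [0, 1]."""
    return 1.0 + a2 * r ** 2 + a4 * r ** 4 + a6 * r ** 6

def correct_vignetting(img, a2, a4, a6):
    """Divide out the modeled intensity fall-off from the image center."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - (h - 1) / 2.0, xx - (w - 1) / 2.0)
    r = r / r.max()  # normalize radius to [0, 1]
    return img / vignetting_profile(r, a2, a4, a6)
```
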

  3. The L0 Regularized Mumford-Shah Model for Bias Correction and Segmentation of Medical Images.

    PubMed

    Duan, Yuping; Chang, Huibin; Huang, Weimin; Zhou, Jiayin; Lu, Zhongkang; Wu, Chunlin

    2015-11-01

    We propose a new variant of the Mumford-Shah model for simultaneous bias correction and segmentation of images with intensity inhomogeneity. First, based on the model of images with intensity inhomogeneity, we introduce an L0 gradient regularizer to model the true intensity and a smooth regularizer to model the bias field. In addition, we derive a new data fidelity term using local intensity properties to allow the bias field to be influenced by its neighborhood. Second, we use a two-stage segmentation method, where a fast alternating direction method is implemented in the first stage to recover the true intensity and bias field, and a simple thresholding is used in the second stage for segmentation. Unlike most existing methods for simultaneous bias correction and segmentation, we estimate the bias field and true intensity without fixing either the number of regions or their values in advance. Our method has been validated on medical images of various modalities with intensity inhomogeneity. Compared with state-of-the-art approaches and well-known brain software tools, our model is fast, accurate, and robust to initialization.

  4. Imaging enabled platforms for development of therapeutics

    NASA Astrophysics Data System (ADS)

    Celli, Jonathan; Rizvi, Imran; Blanden, Adam R.; Evans, Conor L.; Abu-Yousif, Adnan O.; Spring, Bryan Q.; Muzikansky, Alona; Pogue, Brian W.; Finkelstein, Dianne M.; Hasan, Tayyaba

    2011-03-01

    Advances in imaging and spectroscopic technologies have enabled the optimization of many therapeutic modalities in cancer and noncancer pathologies either by earlier disease detection or by allowing therapy monitoring. Amongst the therapeutic options benefiting from developments in imaging technologies, photodynamic therapy (PDT) is exceptional. PDT is a photochemistry-based therapeutic approach where a light-sensitive molecule (photosensitizer) is activated with light of appropriate energy (wavelength) to produce reactive molecular species such as free radicals and singlet oxygen. These molecular entities then react with biological targets such as DNA, membranes and other cellular components to impair their function and lead to eventual cell and tissue death. Development of PDT-based imaging also provides a platform for rapid screening of new therapeutics in novel in vitro models prior to expensive and labor-intensive animal studies. In this study we demonstrate how an imaging platform can be used to design a novel combination treatment strategy for multifocal ovarian cancer. Using an in vitro 3D model for micrometastatic ovarian cancer in conjunction with quantitative imaging, we examine dose and scheduling strategies for PDT in combination with carboplatin, a chemotherapeutic agent presently in clinical use for management of this deadly form of cancer.

  5. Predicting Social Anxiety Treatment Outcome Based on Therapeutic Email Conversations.

    PubMed

    Hoogendoorn, Mark; Berger, Thomas; Schulz, Ava; Stolz, Timo; Szolovits, Peter

    2017-09-01

    Predicting therapeutic outcome in the mental health domain is of utmost importance to enable therapists to provide the most effective treatment to a patient. Information from the writings of a patient can potentially be a valuable source of information, especially now that more and more treatments involve computer-based exercises or electronic conversations between patient and therapist. In this paper, we study predictive modeling using writings of patients under treatment for social anxiety disorder. We extract a wealth of information from the text written by patients, including their usage of words, the topics they talk about, the sentiment of the messages, and the style of writing. In addition, we study trends over time with respect to those measures. We then apply machine learning algorithms to generate the predictive models. Based on a dataset of 69 patients, we are able to show that we can predict therapy outcome with an area under the curve of 0.83 halfway through the therapy and with a precision of 0.78 when using the full data (i.e., the entire treatment period). Due to the limited number of participants, it is hard to generalize the results, but they do show the great potential of this type of information.

  6. Quantum corrections to quasi-periodic solution of Sine-Gordon model and periodic solution of phi4 model

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, G.; Leble, S.

    2014-03-01

    Analytical forms of the quantum corrections to the quasi-periodic solution of the sine-Gordon model and the periodic solution of the phi4 model are obtained through zeta-function regularisation, taking into account all remaining variables of a d-dimensional theory. The qualitative dependence of the quantum corrections on the parameters of the classical systems is also evaluated for a much broader class of potentials u(x) = b^2 f(bx) + C, with b and C arbitrary real constants.

  7. Active vibration control with model correction on a flexible laboratory grid structure

    NASA Technical Reports Server (NTRS)

    Schamel, George C., II; Haftka, Raphael T.

    1991-01-01

    This paper presents experimental and computational comparisons of three active damping control laws applied to a complex laboratory structure. Two reduced structural models were used, with one model being corrected on the basis of measured mode shapes and frequencies. Three control laws were investigated: a time-invariant linear quadratic regulator with state estimation and two direct rate feedback control laws. Experimental results for all designs were obtained with digital implementation. It was found that model correction improved the agreement between analytical and experimental results. The best agreement was obtained with the simplest direct rate feedback control.

  8. Therapeutic potential of gel-based injectables for vocal fold regeneration

    PubMed Central

    Bartlett, Rebecca S.; Thibeault, Susan L.; Prestwich, Glenn D.

    2012-01-01

    Vocal folds are anatomically and biomechanically unique, thus complicating the design and implementation of tissue engineering strategies for repair and regeneration. Integration of an enhanced understanding of tissue biomechanics, wound healing dynamics and innovative gel-based therapeutics has generated enthusiasm for the notion that an efficacious treatment for vocal fold scarring could be clinically attainable within several years. Fibroblast phenotype and gene expression are mediated by the three-dimensional mechanical and chemical microenvironment at an injury site. Thus, therapeutic approaches need to coordinate spatial and temporal aspects of the wound healing response in an injured vocal tissue to achieve an optimal clinical outcome. Successful gel-based injectables for vocal fold scarring will require a keen understanding of how the native inflammatory response sets into motion the later extracellular matrix remodeling, which in turn will determine the ultimate biomechanical properties of the tissue. We present an overview of the challenges associated with this translation as well as the proposed gel-based injectable solutions. PMID:22456756

  9. Parton distribution functions with QED corrections in the valon model

    NASA Astrophysics Data System (ADS)

    Mottaghizadeh, Marzieh; Taghavi Shahri, Fatemeh; Eslami, Parvin

    2017-10-01

    The parton distribution functions (PDFs) with QED corrections are obtained by solving the QCD ⊗ QED DGLAP evolution equations in the framework of the "valon" model at the next-to-leading-order QCD and leading-order QED approximations. Our results for the PDFs with QED corrections in this phenomenological model are in good agreement with the recent CT14QED global fit code [Phys. Rev. D 93, 114015 (2016), 10.1103/PhysRevD.93.114015] and the APFEL (NNPDF2.3QED) program [Comput. Phys. Commun. 185, 1647 (2014), 10.1016/j.cpc.2014.03.007] over the wide ranges x = [10^-5, 1] and Q^2 = [0.283, 10^8] GeV^2. In addition, we propose a new method for studying the symmetry breaking of the sea quark distribution functions inside the proton.

  10. Finding of Correction Factor and Dimensional Error in Bio-AM Model by FDM Technique

    NASA Astrophysics Data System (ADS)

    Manmadhachary, Aiamunoori; Ravi Kumar, Yennam; Krishnanand, Lanka

    2018-06-01

    Additive Manufacturing (AM) is a rapid manufacturing process whose input data can come from various sources such as 3-Dimensional (3D) Computer Aided Design (CAD), Computed Tomography (CT), Magnetic Resonance Imaging (MRI) and 3D scanner data. From CT/MRI data, Biomedical Additive Manufacturing (Bio-AM) models can be manufactured. A Bio-AM model provides a better basis for preplanning oral and maxillofacial surgery. However, manufacturing an accurate Bio-AM model remains an unsolved problem. The current paper quantifies the error between the Standard Triangle Language (STL) model and the Bio-AM model of a dry mandible and determines a correction factor for Bio-AM models produced by the Fused Deposition Modelling (FDM) technique. In the present work, CT images of a dry mandible are acquired with a CT scanner and converted into a 3D CAD model in the form of an STL model. The data are then sent to an FDM machine for fabrication of the Bio-AM model. The difference between the Bio-AM and STL model dimensions is taken as the dimensional error, and the ratio of the STL to Bio-AM model dimensions as the correction factor. This correction factor helps to fabricate AM models with accurate dimensions of the patient's anatomy. Such dimensionally true Bio-AM models increase the safety and accuracy of preplanning in oral and maxillofacial surgery. The correction factor for the Dimension SST 768 FDM machine is 1.003, and the dimensional error is limited to 0.3%.
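
The correction factor and dimensional error described above reduce to simple ratios; the sketch below uses hypothetical dimensions (only the final machine values, 1.003 and 0.3%, come from the record):

```python
def correction_factor(stl_dim, bioam_dim):
    """Ratio of STL (design) to printed Bio-AM dimension; multiplying a
    design dimension by this factor pre-compensates the FDM deviation."""
    return stl_dim / bioam_dim

def dimensional_error_pct(stl_dim, bioam_dim):
    """Relative deviation of the printed dimension from the STL dimension."""
    return abs(bioam_dim - stl_dim) / stl_dim * 100.0

# Hypothetical (not measured) dimensions in mm:
stl, printed = 100.3, 100.0
cf = correction_factor(stl, printed)       # ≈ 1.003, as reported for the machine
err = dimensional_error_pct(stl, printed)  # ≈ 0.3 %
```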

  11. Finding of Correction Factor and Dimensional Error in Bio-AM Model by FDM Technique

    NASA Astrophysics Data System (ADS)

    Manmadhachary, Aiamunoori; Ravi Kumar, Yennam; Krishnanand, Lanka

    2016-06-01

    Additive Manufacturing (AM) is a rapid manufacturing process whose input data can come from various sources such as 3-Dimensional (3D) Computer Aided Design (CAD), Computed Tomography (CT), Magnetic Resonance Imaging (MRI) and 3D scanner data. From CT/MRI data, Biomedical Additive Manufacturing (Bio-AM) models can be manufactured. A Bio-AM model provides a better basis for preplanning oral and maxillofacial surgery. However, manufacturing an accurate Bio-AM model remains an unsolved problem. The current paper quantifies the error between the Standard Triangle Language (STL) model and the Bio-AM model of a dry mandible and determines a correction factor for Bio-AM models produced by the Fused Deposition Modelling (FDM) technique. In the present work, CT images of a dry mandible are acquired with a CT scanner and converted into a 3D CAD model in the form of an STL model. The data are then sent to an FDM machine for fabrication of the Bio-AM model. The difference between the Bio-AM and STL model dimensions is taken as the dimensional error, and the ratio of the STL to Bio-AM model dimensions as the correction factor. This correction factor helps to fabricate AM models with accurate dimensions of the patient's anatomy. Such dimensionally true Bio-AM models increase the safety and accuracy of preplanning in oral and maxillofacial surgery. The correction factor for the Dimension SST 768 FDM machine is 1.003, and the dimensional error is limited to 0.3%.

  12. An Evaluation of Information Criteria Use for Correct Cross-Classified Random Effects Model Selection

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Murphy, Daniel L.

    2013-01-01

    The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannan and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…

  13. N-acetylcysteine Amide Augments the Therapeutic Effect of Neural Stem Cell-Based Antiglioma Oncolytic Virotherapy

    PubMed Central

    Kim, Chung Kwon; Ahmed, Atique U; Auffinger, Brenda; Ulasov, Ilya V; Tobias, Alex L; Moon, Kyung-Sub; Lesniak, Maciej S

    2013-01-01

    Current research has evaluated the intrinsic tumor-tropic properties of stem cell carriers for targeted anticancer therapy. Our laboratory has extensively studied, in the preclinical setting, the role of neural stem cells (NSCs) as delivery vehicles for CRAd-S-pk7, a gliomatropic oncolytic adenovirus (OV). However, the toxicity of therapeutic payloads such as oncolytic adenoviruses toward the cell carriers has significantly limited this targeted delivery approach. Following this rationale, in this study we assessed the ability of a novel antioxidant thiol, N-acetylcysteine amide (NACA), to prevent OV-mediated toxicity toward NSC carriers in an orthotopic glioma xenograft mouse model. Our results show that the combination of NACA and CRAd-S-pk7 not only increases the viability of these cell carriers by preventing reactive oxygen species (ROS)-induced apoptosis of NSCs, but also improves the production of viral progeny in HB1.F3.CD NSCs. In an intracranial xenograft mouse model, the combination treatment of NACA and NSCs loaded with CRAd-S-pk7 showed enhanced CRAd-S-pk7 production and distribution in malignant tissues, which improves the therapeutic efficacy of NSC-based targeted antiglioma oncolytic virotherapy. These data demonstrate that the combination of NACA and NSCs loaded with CRAd-S-pk7 may be a desirable strategy to improve the therapeutic efficacy of antiglioma oncolytic virotherapy. PMID:23883863

  14. Review and standardization of cell phone exposure calculations using the SAM phantom and anatomically correct head models.

    PubMed

    Beard, Brian B; Kainz, Wolfgang

    2004-10-13

    We reviewed articles using computational RF dosimetry to compare the Specific Anthropomorphic Mannequin (SAM) to anatomically correct models of the human head. Published conclusions based on such comparisons have varied widely. We looked for reasons that might cause apparently similar comparisons to produce dissimilar results. We also looked at the information needed to adequately compare the results of computational RF dosimetry studies. We concluded studies were not comparable because of differences in definitions, models, and methodology. Therefore we propose a protocol, developed by an IEEE standards group, as an initial step in alleviating this problem. The protocol calls for a benchmark validation study comparing the SAM phantom to two anatomically correct models of the human head. It also establishes common definitions and reporting requirements that will increase the comparability of all computational RF dosimetry studies of the human head.

  15. Review and standardization of cell phone exposure calculations using the SAM phantom and anatomically correct head models

    PubMed Central

    Beard, Brian B; Kainz, Wolfgang

    2004-01-01

    We reviewed articles using computational RF dosimetry to compare the Specific Anthropomorphic Mannequin (SAM) to anatomically correct models of the human head. Published conclusions based on such comparisons have varied widely. We looked for reasons that might cause apparently similar comparisons to produce dissimilar results. We also looked at the information needed to adequately compare the results of computational RF dosimetry studies. We concluded studies were not comparable because of differences in definitions, models, and methodology. Therefore we propose a protocol, developed by an IEEE standards group, as an initial step in alleviating this problem. The protocol calls for a benchmark validation study comparing the SAM phantom to two anatomically correct models of the human head. It also establishes common definitions and reporting requirements that will increase the comparability of all computational RF dosimetry studies of the human head. PMID:15482601

  16. Scene-based nonuniformity correction for airborne point target detection systems.

    PubMed

    Zhou, Dabiao; Wang, Dejiang; Huo, Lijun; Liu, Rang; Jia, Ping

    2017-06-26

    Images acquired by airborne infrared search and track (IRST) systems are often characterized by nonuniform noise. In this paper, a scene-based nonuniformity correction method for infrared focal-plane arrays (FPAs) is proposed, based on the constant statistics of the ratios of the radiation received by adjacent pixels. The gain of each pixel is computed recursively from the ratios between adjacent pixels, which are estimated through a median operation. Then, an elaborate mathematical model describing the error propagation, derived from the random noise and the recursive calculation procedure, is established. The proposed method retains the characteristics of traditional scene-based methods: it calibrates the whole electro-optics chain and compensates for temporal drifts, though, like them, it does not preserve the radiometric accuracy of the system. Moreover, the method is robust, since the frame number is the only variable, and is suitable for real-time applications owing to its low computational complexity and simplicity of implementation. The experimental results, on different scenes from a proof-of-concept point target detection system with a long-wave Sofradir FPA, demonstrate the compelling performance of the proposed method.
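
A 1-D, column-wise simplification of the gain-estimation idea: under the constant-statistics assumption, the temporal median of the ratio of adjacent pixels approximates the ratio of their gains, and the gains follow by chaining those ratios recursively. This sketch is an assumed simplification, not the paper's per-pixel algorithm:

```python
import numpy as np

def estimate_column_gains(frames):
    """Estimate per-column gains of an FPA from a stack of frames (T, H, W).
    The temporal/spatial median of the ratio of adjacent columns estimates
    the ratio of their gains (the scene's adjacent-pixel ratio has median 1
    under the constant-statistics assumption); the gains are then obtained
    recursively by a cumulative product, with the first column fixed to 1,
    and finally normalized to unit mean."""
    frames = np.asarray(frames, dtype=float)
    ratios = np.median(frames[:, :, 1:] / frames[:, :, :-1], axis=(0, 1))
    gains = np.concatenate(([1.0], np.cumprod(ratios)))
    return gains / gains.mean()
```

Dividing each frame by the estimated gains then removes the fixed-pattern (column) nonuniformity; the accuracy grows with the number of frames, consistent with the frame number being the method's only variable.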

  17. Continental-scale Validation of MODIS-based and LEDAPS Landsat ETM+ Atmospheric Correction Methods

    NASA Technical Reports Server (NTRS)

    Ju, Junchang; Roy, David P.; Vermote, Eric; Masek, Jeffrey; Kovalskyy, Valeriy

    2012-01-01

    The potential of Landsat data processing to provide systematic continental scale products has been demonstrated by several projects including the NASA Web-enabled Landsat Data (WELD) project. The recent free availability of Landsat data increases the need for robust and efficient atmospheric correction algorithms applicable to large volume Landsat data sets. This paper compares the accuracy of two Landsat atmospheric correction methods: a MODIS-based method and the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) method. Both methods are based on the 6SV radiative transfer code but have different atmospheric characterization approaches. The MODIS-based method uses the MODIS Terra derived dynamic aerosol type, aerosol optical thickness, and water vapor to atmospherically correct ETM+ acquisitions in each coincident orbit. The LEDAPS method uses aerosol characterizations derived independently from each Landsat acquisition, assumes a fixed continental aerosol type, and uses ancillary water vapor. Validation results are presented comparing ETM+ atmospherically corrected data generated using these two methods with AERONET corrected ETM+ data for 95 subsets of 10 km × 10 km at 30 m resolution (a total of nearly 8 million 30 m pixels) located across the conterminous United States. The results indicate that the MODIS-based method has better accuracy than the LEDAPS method for the ETM+ red and longer wavelength bands.

  18. Development of a population pharmacokinetic model for carbamazepine based on sparse therapeutic monitoring data from pediatric patients with epilepsy.

    PubMed

    Carlsson, Kristin Cecilie; Hoem, Nils Ove; Glauser, Tracy; Vinks, Alexander A

    2005-05-01

    Population models can be important extensions of therapeutic drug monitoring (TDM), as they allow estimation of individual pharmacokinetic parameters based on a small number of measured drug concentrations. This study used a Bayesian approach to explore the utility of routinely collected and sparse TDM data (1 sample per patient) for carbamazepine (CBZ) monotherapy in developing a population pharmacokinetic (PPK) model for CBZ in pediatric patients that would allow prediction of CBZ concentrations for both immediate- and controlled-release formulations. Patient and TDM data were obtained from a pediatric neurology outpatient database. Data were analyzed using an iterative 2-stage Bayesian algorithm and a nonparametric adaptive grid algorithm. Models were compared by final log likelihood, mean error (ME) as a measure of bias, and root mean squared error (RMSE) as a measure of precision. Fifty-seven entries with data on CBZ monotherapy were identified from the database and used in the analysis (36 from males, 21 from females; mean [SD] age, 9.1 [4.4] years [range, 2-21 years]). Preliminary models estimating clearance (Cl) or the elimination rate constant (K(el)) gave good prediction of serum concentrations compared with measured serum concentrations, but estimates of Cl and K(el) were highly correlated with estimates of volume of distribution (V(d)). Different covariate models were then tested. The selected model had zero-order input and had age and body weight as covariates. Cl (L/h) was calculated as K(el) . V(d), where K(el) = [K(i) - (K(s) . age)] and V(d) = [V(i) + (V(s) . body weight)]. Median parameter estimates were V(i) (intercept) = 11.5 L (fixed); V(s) (slope) = 0.3957 L/kg (range, 0.01200-1.5730); K(i) (intercept) = 0.173 h(-1) (fixed); and K(s) (slope) = 0.004487 h(-1) . y(-1) (range, 0.0001800-0.02969). The fit was good for estimates of steady-state serum concentrations based on prior values (population median estimates) (R = 0.468; R(2) = 0.219) but

  19. Empirical Correction to the Likelihood Ratio Statistic for Structural Equation Modeling with Many Variables.

    PubMed

    Yuan, Ke-Hai; Tian, Yubin; Yanagihara, Hirokazu

    2015-06-01

    Survey data typically contain many variables. Structural equation modeling (SEM) is commonly used in analyzing such data. The most widely used statistic for evaluating the adequacy of an SEM model is T_ML, a slight modification to the likelihood ratio statistic. Under the normality assumption, T_ML approximately follows a chi-square distribution when the number of observations (N) is large and the number of items or variables (p) is small. However, in practice, p can be rather large while N is always limited due to not having enough participants. Even with a relatively large N, empirical results show that T_ML rejects the correct model too often when p is not small. Various corrections to T_ML have been proposed, but they are mostly heuristic. Following the principle of the Bartlett correction, this paper proposes an empirical approach to correcting T_ML so that the mean of the resulting statistic approximately equals the degrees of freedom of the nominal chi-square distribution. Results show that the empirically corrected statistics follow the nominal chi-square distribution much more closely than previously proposed corrections to T_ML, and they control type I errors reasonably well whenever N ≥ max(50, 2p). The formulations of the empirically corrected statistics are further used to predict the type I errors of T_ML reported in the literature, and they perform well.
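
The principle of the correction can be illustrated directly: rescale the statistic so its mean matches the nominal degrees of freedom. The paper derives the correction factor empirically from model characteristics; the sketch below simply computes it from a sample of simulated statistics:

```python
import numpy as np

def empirical_bartlett_correction(t_stats, df):
    """Rescale a sample of likelihood-ratio statistics so that the mean of
    the corrected statistic equals the degrees of freedom of the nominal
    chi-square distribution -- the Bartlett-correction principle behind the
    paper's empirical approach (here the factor is taken directly from the
    sample rather than predicted from N and p)."""
    t_stats = np.asarray(t_stats, dtype=float)
    c = df / t_stats.mean()  # empirical correction factor
    return c * t_stats
```

By construction, the corrected statistics have mean equal to df, so their first moment matches the nominal chi-square reference distribution.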

  20. Similarity-based prediction for Anatomical Therapeutic Chemical classification of drugs by integrating multiple data sources.

    PubMed

    Liu, Zhongyang; Guo, Feifei; Gu, Jiangyong; Wang, Yong; Li, Yang; Wang, Dan; Lu, Liang; Li, Dong; He, Fuchu

    2015-06-01

    The Anatomical Therapeutic Chemical (ATC) classification system, widely applied in almost all drug utilization studies, is currently the most widely recognized classification system for drugs. At present, new drug entries are added to the system only on users' requests, which leads to seriously incomplete drug coverage, and bioinformatics prediction can help during this process. Here we propose a novel prediction model of drug-ATC code associations, using logistic regression to integrate multiple heterogeneous data sources including chemical structures, target proteins, gene expression, side effects and chemical-chemical associations. The model performs well not only in predicting ATC codes of unclassified drugs but also in predicting new ATC codes of classified drugs, as assessed by cross-validation and independent test sets, and it outperforms previous methods. Further, to facilitate its use, the model has been developed into a user-friendly web service, SPACE (Similarity-based Predictor of ATC CodE), which for each submitted compound gives candidate ATC codes (ranked by decreasing probability score predicted by the model) together with the corresponding supporting evidence. This work not only contributes to understanding drugs' therapeutic, pharmacological and chemical properties, but also provides clues for drug repositioning and side-effect discovery. In addition, the construction of the prediction model provides a general framework for similarity-based data integration that is suitable for other drug-related studies such as target and side-effect prediction. The web service SPACE is available at http://www.bprc.ac.cn/space. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Towards Compensation Correctness in Interactive Systems

    NASA Astrophysics Data System (ADS)

    Vaz, Cátia; Ferreira, Carla

    One fundamental idea of service-oriented computing is that applications should be developed by composing already available services. Due to the long-running nature of service interactions, a main challenge in service composition is ensuring the correctness of failure recovery. In this paper, we use a process calculus suitable for modelling long-running transactions with a recovery mechanism based on compensations. Within this setting, we discuss and formally state correctness criteria for compositions of compensable processes, assuming that each process is correct with respect to failure recovery. Under our theory, we formally interpret self-healing compositions, which can detect and recover from failures, as correct compositions of compensable processes.

  2. Feature-based pairwise retinal image registration by radial distortion correction

    NASA Astrophysics Data System (ADS)

    Lee, Sangyeol; Abràmoff, Michael D.; Reinhardt, Joseph M.

    2007-03-01

    Fundus camera imaging is widely used to document disorders such as diabetic retinopathy and macular degeneration. Multiple retinal images can be combined through a procedure known as mosaicing to form an image with a larger field of view. Mosaicing typically requires multiple pairwise registrations of partially overlapped images. We describe a new method for pairwise retinal image registration. The proposed method is unique in that the radial distortion due to image acquisition is corrected prior to the geometric transformation. Vessel lines are detected using the Hessian operator and are used as input features to the registration. Since the overlapping region is typically small in a retinal image pair, only a few correspondences are available, limiting the applicable model to an affine transform at best. To recover the distortion due to the curved surface of the retina and the lens optics, a combined approach of an affine model with a radial distortion correction is proposed. The parameters of the image acquisition and radial distortion models are estimated during an optimization step that uses Powell's method driven by the vessel line distance. Experimental results using 20 pairs of green-channel images acquired from three subjects with a fundus camera confirmed that the affine model with distortion correction could register retinal image pairs to within 1.88 ± 0.35 pixels accuracy (mean ± standard deviation) as assessed by vessel line error, which is 17% better than the affine-only approach. Because the proposed method needs only two correspondences, it can achieve good registration accuracy even in the case of small overlap between retinal image pairs.
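
A sketch of a combined affine-plus-radial-distortion mapping of the kind described; the single-coefficient distortion model and the parameterization are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

def affine_radial_map(pts, A, t, k, center):
    """Map 2-D points by an affine transform (A, t) followed by a
    first-order radial distortion about `center`:
        p' = c + (1 + k * r^2) * (A p + t - c),   r = |A p + t - c|.
    A single-coefficient model: an illustrative simplification of the
    combined affine / radial-distortion approach in the abstract."""
    pts = np.asarray(pts, dtype=float)
    center = np.asarray(center, dtype=float)
    mapped = pts @ np.asarray(A, dtype=float).T + np.asarray(t, dtype=float)
    d = mapped - center
    r2 = (d ** 2).sum(axis=1, keepdims=True)  # squared radius per point
    return center + (1.0 + k * r2) * d
```

In a registration setting, A, t and k would be tuned by an optimizer (the paper uses Powell's method) to minimize the distance between mapped vessel lines.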

  3. Genetic deletion of keratin 8 corrects the altered bone formation and osteopenia in a mouse model of cystic fibrosis.

    PubMed

    Le Henaff, Carole; Faria Da Cunha, Mélanie; Hatton, Aurélie; Tondelier, Danielle; Marty, Caroline; Collet, Corinne; Zarka, Mylène; Geoffroy, Valérie; Zatloukal, Kurt; Laplantine, Emmanuel; Edelman, Aleksander; Sermet-Gaudelus, Isabelle; Marie, Pierre J

    2016-04-01

    Patients with cystic fibrosis (CF) display low bone mass and alterations in bone formation. Mice carrying the F508del genetic mutation in the cystic fibrosis conductance regulator (Cftr) gene display reduced bone formation and decreased bone mass. However, the underlying molecular mechanisms leading to these skeletal defects are unknown, which precludes the development of an efficient anti-osteoporotic therapeutic strategy. Here we report a key role for the intermediate filament protein keratin 8 (Krt8), in the osteoblast dysfunctions in F508del-Cftr mice. We found that murine and human osteoblasts express Cftr and Krt8 at low levels. Genetic studies showed that Krt8 deletion (Krt8(-/-)) in F508del-Cftr mice increased the levels of circulating markers of bone formation, corrected the expression of osteoblast phenotypic genes, promoted trabecular bone formation and improved bone mass and microarchitecture. Mechanistically, Krt8 deletion in F508del-Cftr mice corrected overactive NF-κB signaling and decreased Wnt-β-catenin signaling induced by the F508del-Cftr mutation in osteoblasts. In vitro, treatment with compound 407, which specifically disrupts the Krt8-F508del-Cftr interaction in epithelial cells, corrected the abnormal NF-κB and Wnt-β-catenin signaling and the altered phenotypic gene expression in F508del-Cftr osteoblasts. In vivo, short-term treatment with 407 corrected the altered Wnt-β-catenin signaling and bone formation in F508del-Cftr mice. Collectively, the results show that genetic or pharmacologic targeting of Krt8 leads to correction of osteoblast dysfunctions, altered bone formation and osteopenia in F508del-Cftr mice, providing a therapeutic strategy targeting the Krt8-F508del-CFTR interaction to correct the abnormal bone formation and bone loss in cystic fibrosis. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Accuracy of radiotherapy dose calculations based on cone-beam CT: comparison of deformable registration and image correction based methods

    NASA Astrophysics Data System (ADS)

    Marchant, T. E.; Joshi, K. D.; Moore, C. J.

    2018-03-01

    Radiotherapy dose calculations based on cone-beam CT (CBCT) images can be inaccurate due to unreliable Hounsfield units (HU) in the CBCT. Deformable image registration of planning CT images to CBCT, and direct correction of CBCT image values are two methods proposed to allow heterogeneity corrected dose calculations based on CBCT. In this paper we compare the accuracy and robustness of these two approaches. CBCT images for 44 patients were used including pelvis, lung and head & neck sites. CBCT HU were corrected using a ‘shading correction’ algorithm and via deformable registration of planning CT to CBCT using either Elastix or Niftyreg. Radiotherapy dose distributions were re-calculated with heterogeneity correction based on the corrected CBCT and several relevant dose metrics for target and OAR volumes were calculated. Accuracy of CBCT based dose metrics was determined using an ‘override ratio’ method where the ratio of the dose metric to that calculated on a bulk-density assigned version of the same image is assumed to be constant for each patient, allowing comparison to the patient’s planning CT as a gold standard. Similar performance is achieved by shading corrected CBCT and both deformable registration algorithms, with mean and standard deviation of dose metric error less than 1% for all sites studied. For lung images, use of deformed CT leads to slightly larger standard deviation of dose metric error than shading corrected CBCT with more dose metric errors greater than 2% observed (7% versus 1%).

  5. Optimization-based mesh correction with volume and convexity constraints

    DOE PAGES

    D'Elia, Marta; Ridzal, Denis; Peterson, Kara J.; ...

    2016-02-24

    In this study, we consider the problem of finding a mesh such that 1) it is the closest, with respect to a suitable metric, to a given source mesh having the same connectivity, and 2) the volumes of its cells match a set of prescribed positive values that are not necessarily equal to the cell volumes in the source mesh. This volume correction problem arises in important simulation contexts, such as satisfying a discrete geometric conservation law and solving transport equations by incremental remapping or similar semi-Lagrangian transport schemes. In this paper we formulate volume correction as a constrained optimization problem in which the distance to the source mesh defines an optimization objective, while the prescribed cell volumes, mesh validity and/or cell convexity specify the constraints. We solve this problem numerically using a sequential quadratic programming (SQP) method whose performance scales with the mesh size. To achieve scalable performance we develop a specialized multigrid-based preconditioner for optimality systems that arise in the application of the SQP method to the volume correction problem. Numerical examples illustrate the importance of volume correction, and showcase the accuracy, robustness and scalability of our approach.
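    The constrained-optimization formulation above can be illustrated on a toy one-dimensional "mesh"; this sketch uses SciPy's generic SLSQP solver as a stand-in for the paper's specialized SQP method, and all node positions and target volumes are invented.

    ```python
    # Minimal sketch: find node positions closest to a source mesh whose
    # cell "volumes" (node spacings in 1-D) match prescribed targets.
    import numpy as np
    from scipy.optimize import minimize

    src = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # source node positions
    target_vols = np.array([0.9, 1.1, 1.0, 1.0])      # prescribed cell volumes

    def objective(x):
        # Distance to the source mesh (same connectivity, same node count)
        return np.sum((x - src) ** 2)

    def volume_constraint(x):
        # In 1-D a cell volume is the node spacing; it must match the target
        return np.diff(x) - target_vols

    res = minimize(objective, src, method="SLSQP",
                   constraints=[{"type": "eq", "fun": volume_constraint}])
    print(np.round(np.diff(res.x), 6))  # corrected cell volumes
    ```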

  6. Therapeutic action of ghrelin in a mouse model of colitis.

    PubMed

    Gonzalez-Rey, Elena; Chorny, Alejo; Delgado, Mario

    2006-05-01

    Ghrelin is a novel growth hormone-releasing peptide with potential endogenous anti-inflammatory activities ameliorating some pathologic inflammatory conditions. Crohn's disease is a chronic debilitating disease characterized by severe T helper cell (Th)1-driven inflammation of the colon. The aim of this study was to investigate the therapeutic effect of ghrelin in a murine model of colitis. We examined the anti-inflammatory action of ghrelin in the colitis induced by intracolonic administration of trinitrobenzene sulfonic acid. Diverse clinical signs of the disease were evaluated, including weight loss, diarrhea, colitis, and histopathology. We also investigated the mechanisms involved in the potential therapeutic effect of ghrelin, such as inflammatory cytokines and chemokines, Th1-type response, and regulatory factors. Ghrelin significantly ameliorated the clinical and histopathologic severity of the trinitrobenzene sulfonic acid-induced colitis, abrogating body weight loss, diarrhea, and inflammation, and increasing survival. The therapeutic effect was associated with down-regulation of both the inflammatory and the Th1-driven autoimmune response through the regulation of a wide spectrum of inflammatory mediators. In addition, a partial involvement of interleukin-10/transforming growth factor-beta1-secreting regulatory T cells in this therapeutic effect was demonstrated. Importantly, the ghrelin treatment was therapeutically effective in established colitis and avoided the recurrence of the disease. Our data demonstrate novel anti-inflammatory actions for ghrelin in the gastrointestinal tract, ie, the capacity to deactivate the intestinal inflammatory response and to restore mucosal immune tolerance at multiple levels. Consequently, ghrelin administration represents a novel possible therapeutic approach for the treatment of Crohn's disease and other Th1-mediated inflammatory diseases, such as rheumatoid arthritis and multiple sclerosis.

  7. NTCP modelling of lung toxicity after SBRT comparing the universal survival curve and the linear quadratic model for fractionation correction.

    PubMed

    Wennberg, Berit M; Baumann, Pia; Gagliardi, Giovanna; Nyman, Jan; Drugge, Ninni; Hoyer, Morten; Traberg, Anders; Nilsson, Kristina; Morhed, Elisabeth; Ekberg, Lars; Wittgren, Lena; Lund, Jo-Åsmund; Levin, Nina; Sederholm, Christer; Lewensohn, Rolf; Lax, Ingmar

    2011-05-01

    In SBRT of lung tumours no established relationship between dose-volume parameters and the incidence of lung toxicity has been found. The aim of this study is to compare the LQ model and the universal survival curve (USC) for calculating biologically equivalent doses in SBRT, to see whether this improves knowledge of this relationship. Toxicity data on radiation pneumonitis grade 2 or more (RP2+) from 57 patients were used; 10.5% were diagnosed with RP2+. The lung DVHs were corrected for fractionation (LQ and USC) and analysed with the Lyman-Kutcher-Burman (LKB) model. In the LQ correction α/β = 3 Gy was used, and the USC parameters were: α/β = 3 Gy, D(0) = 1.0 Gy, [Formula: see text] = 10, α = 0.206 Gy(-1) and d(T) = 5.8 Gy. In order to understand the relative contribution of different dose levels to the calculated NTCP, the concept of fractional NTCP was used. This may give insight into whether "high doses to small volumes" or "low doses to large volumes" matter most for lung toxicity. NTCP analysis with the LKB model using parameters m = 0.4, D(50) = 30 Gy gave a volume dependence parameter (n) of 0.87 with LQ correction and 0.71 with USC correction; using parameters m = 0.3, D(50) = 20 Gy, n = 0.93 with LQ correction and n = 0.83 with USC correction. In SBRT of lung tumours, NTCP modelling of lung toxicity comparing models (LQ, USC) for fractionation correction shows that low doses contribute less and high doses more to the NTCP when the USC model is used. Comparing NTCP modelling of SBRT data with data from breast cancer, lung cancer and whole-lung irradiation implies that the response of the lung is treatment specific. More data are, however, needed for more reliable modelling.
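    The LKB NTCP evaluation referenced above can be sketched as follows. Only the parameter set (m = 0.4, D50 = 30 Gy, n = 0.87) comes from the abstract; the toy DVH and helper names are illustrative assumptions.

    ```python
    # Hedged sketch of the Lyman-Kutcher-Burman (LKB) NTCP calculation:
    # a generalized EUD is computed from the DVH and mapped through a
    # cumulative normal with parameters m and D50.
    import math

    def geud(dvh, n):
        """Generalized EUD from a differential DVH [(dose_Gy, rel_volume), ...]."""
        return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

    def lkb_ntcp(dvh, n, m, td50):
        t = (geud(dvh, n) - td50) / (m * td50)
        return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

    # Toy lung DVH with one parameter set quoted above (m=0.4, D50=30 Gy, n=0.87)
    dvh = [(5.0, 0.5), (20.0, 0.3), (45.0, 0.2)]
    print(f"NTCP = {lkb_ntcp(dvh, n=0.87, m=0.4, td50=30.0):.3f}")
    ```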

  8. The potential of prison-based democratic therapeutic communities.

    PubMed

    Bennett, Jamie; Shuker, Richard

    2017-03-13

    Purpose: The purpose of this paper is to describe the work of HMP Grendon, the only prison in the UK to operate entirely as a series of democratic therapeutic communities, and to summarise the research on its effectiveness. Design/methodology/approach: The paper is both descriptive, providing an overview of the work of a prison-based therapeutic community, and offers a literature review regarding evidence of effectiveness. Findings: The work of HMP Grendon has a wide range of positive benefits including reduced levels of disruption in prison, reduced self-harm, improved well-being, an environment that is experienced as more humane and reduced levels of reoffending. Originality/value: The work of HMP Grendon offers a well-established and evidenced approach to managing men who have committed serious violent and sexually violent offences. It also promotes and embodies a progressive approach to managing prisons rooted in the welfare tradition.

  9. Investigating Supervisory Relationships and Therapeutic Alliances Using Structural Equation Modeling

    ERIC Educational Resources Information Center

    DePue, Mary Kristina; Lambie, Glenn W.; Liu, Ren; Gonzalez, Jessica

    2016-01-01

    The authors used structural equation modeling to examine the contribution of supervisees' supervisory relationship levels to therapeutic alliance (TA) scores with their clients in practicum. Results showed that supervisory relationship scores positively contributed to the TA. Client and counselor ratings of the TA also differed.

  10. Correction of scatter in megavoltage cone-beam CT

    NASA Astrophysics Data System (ADS)

    Spies, L.; Ebert, M.; Groh, B. A.; Hesse, B. M.; Bortfeld, T.

    2001-03-01

    The role of scatter in a cone-beam computed tomography system using the therapeutic beam of a medical linear accelerator and a commercial electronic portal imaging device (EPID) is investigated. A scatter correction method is presented which is based on a superposition of Monte Carlo generated scatter kernels. The kernels are adapted to both the spectral response of the EPID and the dimensions of the phantom being scanned. The method is part of a calibration procedure which converts the measured transmission data acquired for each projection angle into water-equivalent thicknesses. Tomographic reconstruction of the projections then yields an estimate of the electron density distribution of the phantom. It is found that scatter produces cupping artefacts in the reconstructed tomograms. Furthermore, reconstructed electron densities deviate greatly (by about 30%) from their expected values. The scatter correction method removes the cupping artefacts and decreases the deviations from 30% down to about 8%.
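    The kernel-superposition idea above can be sketched in one dimension; the Gaussian kernel below is a stand-in assumption for the Monte Carlo generated, EPID- and phantom-adapted kernels of the paper, and the iteration scheme is one common way to invert the "measured = primary + kernel ⊗ primary" model.

    ```python
    # Minimal 1-D sketch of scatter correction by kernel superposition:
    # iterate primary <- measured - kernel * primary until it converges.
    import numpy as np

    def gaussian_kernel(width, sigma, amplitude):
        x = np.arange(-width, width + 1, dtype=float)
        return amplitude * np.exp(-0.5 * (x / sigma) ** 2)

    def correct_scatter(measured, kernel, n_iter=10):
        primary = measured.copy()
        for _ in range(n_iter):
            scatter = np.convolve(primary, kernel, mode="same")
            primary = measured - scatter  # remove current scatter estimate
        return primary

    signal = np.ones(64)                     # flat "true" transmission profile
    kernel = gaussian_kernel(8, 3.0, 0.01)   # low-amplitude scatter kernel
    measured = signal + np.convolve(signal, kernel, mode="same")
    corrected = correct_scatter(measured, kernel)
    ```

    The fixed point of the iteration satisfies (I + K)p = measured, which is exactly the forward model, so for a kernel of small total weight the iteration recovers the primary signal.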

  11. SU-F-T-69: Correction Model of NIPAM Gel and Presage for Electron and Proton PDD Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, C; Lin, C; Tu, P

    Purpose: The current standard equipment for proton PDD measurement is the multilayer parallel ion chamber, which is expensive and complex to operate. NIPAM-gel and Presage are alternative options for PDD measurement, but because their stopping powers differ from water, their results need to be corrected. This study aims to create a correction model for NIPAM-gel and Presage PDD measurement. Methods: Standard water-based PDD profiles for 6 MeV electrons, 12 MeV electrons, and 90 MeV protons were acquired. The electron PDD profile with 1 cm of NIPAM-gel added on top of the water was measured, as was the electron PDD profile with an extra 1 cm of solid water (PTW RW3). The distance shifts among the standard, NIPAM-gel, and solid-water PDDs at R50% were compared and a water-equivalent-thickness correction factor (WET) was calculated. The process was repeated to obtain WETs for electrons with Presage, protons with NIPAM-gel, and protons with Presage. PDD profiles of electrons and protons measured through NIPAM-gel and Presage columns were corrected with the corresponding WET, and the corrected profiles were compared with the standard profiles. Results: The WET for 12 MeV electrons was 1.135 with NIPAM-gel and 1.034 with Presage. After correction, the PDD profiles matched the standard profile well in the fall-off region; the differences at R50% were 0.26 mm shallower and 0.39 mm deeper, respectively. The same WETs were used to correct the 6 MeV electron profiles, demonstrating energy independence of the electron WET; the differences at R50% were 0.17 mm deeper for NIPAM-gel and 0.54 mm deeper for Presage. The WET for 90 MeV protons with NIPAM-gel was 1.056, with a difference at R50% of 0.37 mm deeper. A quenching effect at the Bragg peak was revealed, with the dose there underestimated by 27%. Conclusion: This correction model can correct PDD profiles to within a 1 mm depth error. With it, NIPAM-gel and Presage become practical for PDD profile measurement.
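    The R50%-shift calculation behind the WET factor can be sketched as follows; the sigmoid fall-off profiles below are synthetic stand-ins for measured PDDs, and the resulting factor is purely illustrative.

    ```python
    # Hedged sketch of the water-equivalent-thickness (WET) factor: locate
    # R50 on each fall-off curve by interpolation and divide the depth
    # shift by the physical slab thickness.
    import numpy as np

    def r50(depth, pdd):
        """Depth at which the distal PDD falls to 50%, by linear interpolation."""
        distal = depth >= depth[np.argmax(pdd)]
        d, p = depth[distal], pdd[distal]
        return float(np.interp(50.0, p[::-1], d[::-1]))  # p decreases distally

    # Synthetic fall-off profiles: a 1 cm slab shifts R50 proximally
    depth = np.linspace(0.0, 8.0, 801)                      # cm
    pdd_std = 100.0 / (1.0 + np.exp((depth - 5.0) / 0.3))   # R50 at 5.0 cm
    pdd_slab = 100.0 / (1.0 + np.exp((depth - 3.9) / 0.3))  # R50 at 3.9 cm

    slab_thickness = 1.0                                    # cm of gel/Presage
    wet = (r50(depth, pdd_std) - r50(depth, pdd_slab)) / slab_thickness
    print(f"WET factor = {wet:.3f}")
    ```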

  12. Marker Configuration Model-Based Roentgen Fluoroscopic Analysis.

    PubMed

    Garling, Eric H; Kaptein, Bart L; Geleijns, Koos; Nelissen, Rob G H H; Valstar, Edward R

    2005-04-01

    It remains unknown if and how the polyethylene bearing in mobile bearing knees moves during dynamic activities with respect to the tibial base plate. Marker Configuration Model-Based Roentgen Fluoroscopic Analysis (MCM-based RFA) uses a marker configuration model of inserted tantalum markers in order to accurately estimate the pose of an implant or bone using single plane Roentgen images or fluoroscopic images. The goal of this study is to assess the accuracy of MCM-Based RFA in a standard fluoroscopic set-up using phantom experiments and to determine the error propagation with computer simulations. The experimental set-up of the phantom study was calibrated using a calibration box equipped with 600 tantalum markers, which corrected for image distortion and determined the focus position. In the computer simulation study the influence of image distortion, MC-model accuracy, focus position, the relative distance between MC-models and MC-model configuration on the accuracy of MCM-Based RFA was assessed. The phantom study established that the in-plane accuracy of MCM-Based RFA is 0.1 mm and the out-of-plane accuracy is 0.9 mm. The rotational accuracy is 0.1 degrees. A ninth-order polynomial model was used to correct for image distortion. MCM-Based RFA was estimated to have, in a worst-case scenario, an in vivo translational accuracy of 0.14 mm (x-axis), 0.17 mm (y-axis) and 1.9 mm (z-axis), and a rotational accuracy of 0.3 degrees. When using fluoroscopy to study kinematics, image distortion and the accuracy of models are important factors, which influence the accuracy of the measurements. MCM-Based RFA has the potential to be an accurate, clinically useful tool for studying kinematics after total joint replacement using standard equipment.

  13. Affinity approaches in RNAi-based therapeutics purification.

    PubMed

    Pereira, Patrícia; Queiroz, João A; Figueiras, Ana; Sousa, Fani

    2016-05-15

    The recent investigation of RNA interference (RNAi) related mechanisms and applications has led to an increased awareness of the importance of RNA in biology. Nowadays, RNAi-based technology has emerged as a potentially powerful tool for silencing gene expression, being exploited to develop new therapeutics for treating a vast number of human disease conditions, as it is expected that this technology can be translated into clinical applications in the near future. This approach makes use of a large number of small (namely short interfering RNAs, microRNAs and PIWI-interacting RNAs) and long non-coding RNAs (ncRNAs), which are likely to have a crucial role as the next generation of therapeutics. The commercial and biomedical interest in these RNAi-based therapy applications has fostered the need to develop innovative procedures to easily and efficiently purify RNA, aiming to obtain the final product with a high degree of purity, good quality and biological activity. Recently, affinity chromatography has been applied to ncRNA purification, in view of its high specificity. Therefore, this article intends to review the biogenesis pathways of regulatory ncRNAs and also to discuss the most significant and recent developments as well as applications of affinity chromatography in the challenging task of purifying ncRNAs. In addition, the importance of affinity chromatography in ncRNA purification is addressed and prospects for what is forthcoming are presented. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    PubMed

    Chen, Yunliang; Dai, Liankui

    2018-05-01

    Raman spectra usually suffer from baseline drift caused by fluorescence or other reasons. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method can adaptively determine the structuring element first and then gradually remove the spectral peaks during iteration to get an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible for handling different kinds of baselines in various practical situations. The comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method can hopefully be applied to the baseline correction of other analytical instrument signals, such as IR spectra and chromatograms.
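    The core idea, removing narrow peaks with morphological opening while preserving the slowly varying baseline, can be sketched as follows. This is a simplified stand-in: the paper's adaptive structuring-element selection is replaced by a fixed window, and the synthetic spectrum is invented.

    ```python
    # Minimal sketch of baseline estimation by iterative morphological
    # opening: peaks narrower than the structuring element are removed,
    # the smooth drift survives, and subtracting it flattens the baseline.
    import numpy as np
    from scipy.ndimage import grey_opening

    def estimate_baseline(y, window=51, n_iter=5):
        baseline = y.copy()
        for _ in range(n_iter):
            opened = grey_opening(baseline, size=window)
            baseline = np.minimum(baseline, opened)  # never rise above the data
        return baseline

    # Synthetic Raman-like spectrum: broad linear drift plus two narrow peaks
    x = np.linspace(0, 1, 500)
    drift = 2.0 + 1.5 * x
    peaks = 5.0 * np.exp(-0.5 * ((x - 0.3) / 0.01) ** 2) \
          + 3.0 * np.exp(-0.5 * ((x - 0.7) / 0.01) ** 2)
    spectrum = drift + peaks
    corrected = spectrum - estimate_baseline(spectrum)
    ```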

  15. Predicting the Uncertain Future of Aptamer-Based Diagnostics and Therapeutics.

    PubMed

    Bruno, John G

    2015-04-16

    Despite the great promise of nucleic acid aptamers in the areas of diagnostics and therapeutics for their facile in vitro development, lack of immunogenicity and other desirable properties, few truly successful aptamer-based products exist in the clinical or other markets. Core reasons for these commercial deficiencies probably stem from industrial commitment to antibodies including a huge financial investment in humanized monoclonal antibodies and a general ignorance about aptamers and their performance among the research and development community. Given the early failures of some strong commercial efforts to gain government approval and bring aptamer-based products to market, it may seem that aptamers are doomed to take a backseat to antibodies forever. However, the key advantages of aptamers over antibodies coupled with niche market needs that only aptamers can fill and more recent published data still point to a bright commercial future for aptamers in areas such as infectious disease and cancer diagnostics and therapeutics. As more researchers and entrepreneurs become familiar with aptamers, it seems inevitable that aptamers will at least be considered for expanded roles in diagnostics and therapeutics. This review also examines new aptamer modifications and attempts to predict new aptamer applications that could revolutionize biomedical technology in the future and lead to marketed products.

  16. Mobility-based correction for accurate determination of binding constants by capillary electrophoresis-frontal analysis.

    PubMed

    Qian, Cheng; Kovalchik, Kevin A; MacLennan, Matthew S; Huang, Xiaohua; Chen, David D Y

    2017-06-01

    Capillary electrophoresis frontal analysis (CE-FA) can be used to determine the binding affinity of molecular interactions. However, its current data processing method imposes specific requirements on the mobilities of the binding pair in order to obtain accurate binding constants. This work shows that significant errors result when the mobilities of the interacting species do not meet these requirements, so the applicability of CE-FA in many real-world applications becomes questionable. An electrophoretic mobility-based correction method is developed in this work based on the flux of each species. A simulation program and a pair of model compounds are used to verify the new equations and evaluate the effectiveness of this method. Ibuprofen and hydroxypropyl-β-cyclodextrin are used to demonstrate the differences in the binding constants obtained by CE-FA when different calculation methods are used, and the results are compared with those obtained by affinity capillary electrophoresis (ACE). The results suggest that CE-FA, with the mobility-based correction method, can be a generally applicable method for a much wider range of applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Evaluation of the impact of metal artifacts in CT-based attenuation correction of positron emission tomography scans

    NASA Astrophysics Data System (ADS)

    Wu, Jay; Shih, Cheng-Ting; Chang, Shu-Jun; Huang, Tzung-Chi; Chen, Chuan-Lin; Wu, Tung Hsin

    2011-08-01

    The quantitative ability of PET/CT allows its widespread use in clinical research and cancer staging. However, metal artifacts induced by high-density metal objects degrade the quality of CT images. These artifacts also propagate to the corresponding PET image and cause a false increase of 18F-FDG uptake near the metal implants when CT-based attenuation correction (AC) is performed. In this study, we applied a model-based metal artifact reduction (MAR) algorithm to reduce the dark and bright streaks in the CT image and compared the differences between PET images with the general CT-based AC (G-AC) and the MAR-corrected-CT AC (MAR-AC). Results showed that the MAR algorithm effectively reduced the metal artifacts in the CT images of the ACR flangeless phantom and two clinical cases. The MAR-AC also removed the false-positive hot spot near the metal implants in the PET images. We conclude that MAR-AC could be applied in clinical practice to improve the quantitative accuracy of PET images. Additionally, further use of PET/CT fusion images with metal artifact correction could be more valuable for diagnosis.

  18. Ionospheric Correction Based on Ingestion of Global Ionospheric Maps into the NeQuick 2 Model

    PubMed Central

    Yu, Xiao; She, Chengli; Zhen, Weimin; Bruno, Nava; Liu, Dun; Yue, Xinan; Ou, Ming; Xu, Jisheng

    2015-01-01

    The global ionospheric maps (GIMs), generated by the Jet Propulsion Laboratory (JPL) and the Center for Orbit Determination in Europe (CODE) over a period of more than 13 years, have been adopted as the primary source of data to provide global ionospheric corrections for possible single-frequency positioning applications. The investigation aims to assess the performance of the new NeQuick model, NeQuick 2, in predicting global total electron content (TEC) by ingesting the GIM data from the previous day(s). The results show good performance of the GIM-driven NeQuick model, with on average 86% of vertical TEC errors less than 10 TECU when the global daily effective ionization indices (Az) versus modified dip latitude (MODIP) are constructed as a second-order polynomial. The performance of the GIM-driven NeQuick model varies with solar activity and is better during low-solar-activity years. The accuracy of TEC prediction can be improved further by using a four-coefficient expression of Az versus MODIP. As more measurements from earlier days are involved in the Az optimization procedure, the accuracy may decrease. The results also reveal that more efforts are needed to improve the capability of the NeQuick 2 model to represent the ionosphere in the equatorial and high-latitude regions. PMID:25815369
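    The data-ingestion step above, expressing the daily effective ionization index Az as a second-order polynomial of MODIP, can be sketched with an ordinary least-squares fit; the (MODIP, Az) samples below are invented for illustration.

    ```python
    # Hedged sketch: fit Az(mu) = a0 + a1*mu + a2*mu^2 over MODIP mu,
    # the driver function form described above for NeQuick 2.
    import numpy as np

    modip = np.array([-60.0, -40.0, -20.0, 0.0, 20.0, 40.0, 60.0])  # degrees
    az = np.array([70.0, 95.0, 120.0, 135.0, 118.0, 93.0, 72.0])    # index values

    a2, a1, a0 = np.polyfit(modip, az, deg=2)

    def az_of_modip(mu):
        """Effective ionization index driving the model at this MODIP."""
        return a0 + a1 * mu + a2 * mu * mu

    print(f"Az at MODIP 0 ≈ {az_of_modip(0.0):.1f}")
    ```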

  19. Correcting for Measurement Error in Time-Varying Covariates in Marginal Structural Models.

    PubMed

    Kyle, Ryan P; Moodie, Erica E M; Klein, Marina B; Abrahamowicz, Michał

    2016-08-01

    Unbiased estimation of causal parameters from marginal structural models (MSMs) requires a fundamental assumption of no unmeasured confounding. Unfortunately, the time-varying covariates used to obtain inverse probability weights are often error-prone. Although substantial measurement error in important confounders is known to undermine control of confounders in conventional unweighted regression models, this issue has received comparatively limited attention in the MSM literature. Here we propose a novel application of the simulation-extrapolation (SIMEX) procedure to address measurement error in time-varying covariates, and we compare 2 approaches. The direct approach to SIMEX-based correction targets outcome model parameters, while the indirect approach corrects the weights estimated using the exposure model. We assess the performance of the proposed methods in simulations under different clinically plausible assumptions. The simulations demonstrate that measurement errors in time-dependent covariates may induce substantial bias in MSM estimators of causal effects of time-varying exposures, and that both proposed SIMEX approaches yield practically unbiased estimates in scenarios featuring low-to-moderate degrees of error. We illustrate the proposed approach in a simple analysis of the relationship between sustained virological response and liver fibrosis progression among persons infected with hepatitis C virus, while accounting for measurement error in γ-glutamyltransferase, using data collected in the Canadian Co-infection Cohort Study from 2003 to 2014. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
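    The SIMEX idea underlying both proposed approaches can be shown on a toy regression: deliberately add increasing amounts of measurement error, watch the estimate degrade, and extrapolate the trend back to the "no error" point λ = −1. The data, error variance, and simple slope estimator below are synthetic assumptions, not the MSM analysis of the paper.

    ```python
    # Hedged sketch of simulation-extrapolation (SIMEX) for an error-prone
    # covariate: attenuation grows with added noise variance lambda*sigma_u^2,
    # and a quadratic extrapolation to lambda = -1 approximately recovers
    # the error-free slope.
    import numpy as np

    rng = np.random.default_rng(0)
    n, sigma_u = 5000, 0.5
    x_true = rng.normal(size=n)
    y = 2.0 * x_true + rng.normal(scale=0.2, size=n)     # true slope = 2.0
    x_obs = x_true + rng.normal(scale=sigma_u, size=n)   # error-prone covariate

    def slope(x, y):
        return np.cov(x, y)[0, 1] / np.var(x)

    lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    est = []
    for lam in lambdas:
        sims = [slope(x_obs + rng.normal(scale=np.sqrt(lam) * sigma_u, size=n), y)
                for _ in range(50)]
        est.append(np.mean(sims))

    # Quadratic extrapolation of the estimates to lambda = -1 (error removed)
    coef = np.polyfit(lambdas, est, deg=2)
    simex_slope = np.polyval(coef, -1.0)
    print(f"naive {est[0]:.2f} -> SIMEX {simex_slope:.2f} (true 2.0)")
    ```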

  20. Biomechanically based simulation of brain deformations for intraoperative image correction: coupling of elastic and fluid models

    NASA Astrophysics Data System (ADS)

    Hagemann, Alexander; Rohr, Karl; Stiehl, H. Siegfried

    2000-06-01

    In order to improve the accuracy of image-guided neurosurgery, different biomechanical models have been developed to correct preoperative images w.r.t. intraoperative changes like brain shift or tumor resection. All existing biomechanical models simulate different anatomical structures by using either appropriate boundary conditions or spatially varying material parameter values, while assuming the same physical model for all anatomical structures. In general, this leads to physically implausible results, especially in the case of adjacent elastic and fluid structures. Therefore, we propose a new approach which allows different physical models to be coupled. In our case, we simulate rigid, elastic, and fluid regions by using the appropriate physical description for each material, namely either the Navier equation or the Stokes equation. To solve the resulting differential equations, we derive a linear matrix system for each region by applying the finite element method (FEM). Thereafter, the linear matrix systems are linked together, ending up with one overall linear matrix system. Our approach has been tested on synthetic as well as tomographic images. Experiments show that the integrated treatment of rigid, elastic, and fluid regions significantly improves the prediction results in comparison to a purely linear elastic model.

  1. Quantification of Hepatic Steatosis with T1-independent, T2*-corrected MR Imaging with Spectral Modeling of Fat: Blinded Comparison with MR Spectroscopy

    PubMed Central

    Hines, Catherine D. G.; Hamilton, Gavin; Sirlin, Claude B.; McKenzie, Charles A.; Yu, Huanzhou; Brittain, Jean H.; Reeder, Scott B.

    2011-01-01

    Purpose: To prospectively compare an investigational version of a complex-based chemical shift–based fat fraction magnetic resonance (MR) imaging method with MR spectroscopy for the quantification of hepatic steatosis. Materials and Methods: This study was approved by the institutional review board and was HIPAA compliant. Written informed consent was obtained before all studies. Fifty-five patients (31 women, 24 men; age range, 24–71 years) were prospectively imaged at 1.5 T with quantitative MR imaging and single-voxel MR spectroscopy, each within a single breath hold. The effects of T2* correction, spectral modeling of fat, and magnitude fitting for eddy current correction on fat quantification with MR imaging were investigated by reconstructing fat fraction images from the same source data with different combinations of error correction. Single-voxel T2-corrected MR spectroscopy was used to measure fat fraction and served as the reference standard. All MR spectroscopy data were postprocessed at a separate institution by an MR physicist who was blinded to MR imaging results. Fat fractions measured with MR imaging and MR spectroscopy were compared statistically to determine the correlation (r2), with the slope and intercept as measures of agreement between the MR imaging and MR spectroscopy fat fraction measurements; to determine whether MR imaging can help quantify fat; and to examine the importance of T2* correction, spectral modeling of fat, and eddy current correction. Two-sided t tests (significance level, P = .05) were used to determine whether estimated slopes and intercepts were significantly different from 1.0 and 0.0, respectively. Sensitivity and specificity for the classification of clinically significant steatosis were evaluated. Results: Overall, there was excellent correlation between MR imaging and MR spectroscopy for all reconstruction combinations. However, agreement was only achieved when T2* correction, spectral modeling of fat, and magnitude

  2. Graphene-based platforms for cancer therapeutics.

    PubMed

    Patel, Sunny C; Lee, Stephen; Lalwani, Gaurav; Suhrland, Cassandra; Chowdhury, Sayan Mullick; Sitharaman, Balaji

    2016-01-01

    Graphene is a multifunctional carbon nanomaterial and could be utilized to develop platform technologies for cancer therapies. Its surface can be covalently and noncovalently functionalized with anticancer drugs and functional groups that target cancer cells and tissue to improve treatment efficacies. Furthermore, its physicochemical properties can be harnessed to facilitate stimulus responsive therapeutics and drug delivery. This review article summarizes the recent literature specifically focused on development of graphene technologies to treat cancer. We will focus on advances at the interface of graphene based drug/gene delivery, photothermal/photodynamic therapy and combinations of these techniques. We also discuss the current understanding in cytocompatibility and biocompatibility issues related to graphene formulations and their implications pertinent to clinical cancer management.

  3. Improving Antibody-Based Cancer Therapeutics Through Glycan Engineering.

    PubMed

    Yu, Xiaojie; Marshall, Michael J E; Cragg, Mark S; Crispin, Max

    2017-06-01

    Antibody-based therapeutics have emerged as a major tool in cancer treatment. Guided by the superb specificity of the antibody variable domain, they allow the precise targeting of tumour markers. Recently, eliciting cellular effector functions, mediated by the Fc domain, has gained traction as a means by which to generate more potent antibody therapeutics. Extensive mutagenesis studies of the Fc protein backbone have enabled the generation of Fc variants that more optimally engage the Fcγ receptors known to mediate cellular effector functions such as antibody-dependent cellular cytotoxicity (ADCC) and cellular phagocytosis. In addition to the protein backbone, the homodimeric Fc domain contains two opposing N-linked glycans, which represent a further point of potential immunomodulation, independent of the Fc protein backbone. For example, a lack of the core fucose usually attached to the IgG Fc glycan leads to enhanced ADCC activity, whereas a high level of terminal sialylation is associated with reduced inflammation. Significant growth in knowledge of Fc glycosylation over the last decade, combined with advancement in genetic engineering, has empowered glyco-engineering to fine-tune antibody therapeutics. This has culminated in the approval of two glyco-engineered antibodies for cancer therapy: the anti-CCR4 mogamulizumab approved in 2012 and the anti-CD20 obinutuzumab in 2013. We discuss here the technological platforms for antibody glyco-engineering and review the current clinical landscape of glyco-engineered antibodies.

  4. Pharmacokinetic parameters explain the therapeutic activity of antimicrobial agents in a silkworm infection model.

    PubMed

    Paudel, Atmika; Panthee, Suresh; Urai, Makoto; Hamamoto, Hiroshi; Ohwada, Tomohiko; Sekimizu, Kazuhisa

    2018-01-25

    Poor pharmacokinetic parameters are a major reason for the lack of therapeutic activity of some drug candidates. Determining the pharmacokinetic parameters of drug candidates at an early stage of development requires an inexpensive animal model with few associated ethical issues. In this study, we used the silkworm infection model to perform structure-activity relationship studies of an antimicrobial agent, GPI0039, a novel nitrofuran dichloro-benzyl ester, and successfully identified compound 5, a nitrothiophene dichloro-benzyl ester, as a potent antimicrobial agent with superior therapeutic activity in the silkworm infection model. Further, we compared the pharmacokinetic parameters of compound 5 with a nitrothiophene benzyl ester lacking chlorine, compound 7, that exerted similar antimicrobial activity but had less therapeutic activity in silkworms, and examined the metabolism of these antimicrobial agents in human liver fractions in vitro. Compound 5 had appropriate pharmacokinetic parameters, such as an adequate half-life, slow clearance, large area under the curve, low volume of distribution, and long mean residence time, compared with compound 7, and was slowly metabolized by human liver fractions. These findings suggest that the therapeutic effectiveness of an antimicrobial agent in the silkworms reflects appropriate pharmacokinetic properties.

  5. Temporal high-pass non-uniformity correction algorithm based on grayscale mapping and hardware implementation

    NASA Astrophysics Data System (ADS)

    Jin, Minglei; Jin, Weiqi; Li, Yiyang; Li, Shuo

    2015-08-01

    In this paper, we propose a novel scene-based non-uniformity correction algorithm for infrared image processing: the temporal high-pass non-uniformity correction algorithm based on grayscale mapping (THP and GM). The main sources of non-uniformity are: (1) detector fabrication inaccuracies; (2) non-linearity and variations in the read-out electronics; and (3) optical path effects. Non-uniformity is reduced by non-uniformity correction (NUC) algorithms, which are commonly divided into calibration-based non-uniformity correction (CBNUC) algorithms and scene-based non-uniformity correction (SBNUC) algorithms. Because non-uniformity drifts over time, CBNUC algorithms must be repeated by inserting a uniform radiation source into the view, which SBNUC algorithms do not require; SBNUC algorithms have therefore become an essential part of infrared imaging systems. The poor robustness of many SBNUC algorithms often leads to two defects, artifacts and over-correction; meanwhile, their complicated calculations and large storage requirements make hardware implementation difficult, especially on a Field Programmable Gate Array (FPGA) platform. The THP and GM algorithm proposed in this paper can eliminate the non-uniformity without causing these defects. The FPGA-only hardware implementation of the algorithm has two advantages: (1) low resource consumption and (2) a small hardware delay of less than 20 lines. It can be transplanted to a variety of infrared detectors equipped with an FPGA image-processing module, and it reduces both stripe non-uniformity and ripple non-uniformity.
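
    The temporal high-pass core of such algorithms can be sketched in a few lines: a per-pixel recursive low-pass filter estimates the slowly drifting fixed-pattern offset, which is then subtracted from each frame. This is a generic temporal high-pass sketch only; the grayscale-mapping stage of the THP and GM algorithm is not reproduced here, and all names and numbers are illustrative.

```python
import numpy as np

def thp_nuc(frames, time_constant=16):
    """Generic temporal high-pass NUC: subtract a per-pixel recursive
    low-pass estimate of the fixed-pattern offset from each frame."""
    alpha = 1.0 / time_constant               # low-pass update weight
    lowpass = frames[0].astype(np.float64)    # per-pixel offset estimate
    corrected = []
    for frame in frames:
        lowpass = alpha * frame + (1.0 - alpha) * lowpass
        corrected.append(frame - lowpass)     # high-pass residual
    return np.array(corrected)

# Synthetic demo: temporal noise plus a static column-stripe non-uniformity
rng = np.random.default_rng(0)
stripes = np.tile(rng.normal(0.0, 5.0, (1, 64)), (64, 1))
frames = [rng.normal(100.0, 1.0, (64, 64)) + stripes for _ in range(200)]
out = thp_nuc(frames)
```

    After the filter converges, the column-stripe pattern in the corrected frames is strongly attenuated relative to the raw frames; the known weakness of the plain filter, ghosting on static scenes, is what the grayscale-mapping stage addresses in the paper.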

  6. A cell-based assay for aggregation inhibitors as therapeutics of polyglutamine-repeat disease and validation in Drosophila

    NASA Astrophysics Data System (ADS)

    Apostol, Barbara L.; Kazantsev, Alexsey; Raffioni, Simona; Illes, Katalin; Pallos, Judit; Bodai, Laszlo; Slepko, Natalia; Bear, James E.; Gertler, Frank B.; Hersch, Steven; Housman, David E.; Marsh, J. Lawrence; Michels Thompson, Leslie

    2003-05-01

    The formation of polyglutamine-containing aggregates and inclusions are hallmarks of pathogenesis in Huntington's disease that can be recapitulated in model systems. Although the contribution of inclusions to pathogenesis is unclear, cell-based assays can be used to screen for chemical compounds that affect aggregation and may provide therapeutic benefit. We have developed inducible PC12 cell-culture models to screen for loss of visible aggregates. To test the validity of this approach, compounds that inhibit aggregation in the PC12 cell-based screen were tested in a Drosophila model of polyglutamine-repeat disease. The disruption of aggregation in PC12 cells strongly correlates with suppression of neuronal degeneration in Drosophila. Thus, the engineered PC12 cells coupled with the Drosophila model provide a rapid and effective method to screen and validate compounds.

  7. Correction of self-reported BMI based on objective measurements: a Belgian experience.

    PubMed

    Drieskens, S; Demarest, S; Bel, S; De Ridder, K; Tafforeau, J

    2018-01-01

    Based on successive Health Interview Surveys (HIS), it has been demonstrated that in Belgium, too, obesity, measured by means of a self-reported body mass index (BMI, in kg/m²), is a growing public health problem that needs to be monitored as accurately as possible. Studies have shown that a self-reported BMI can be biased. Consequently, if the aim is to rely on a self-reported BMI, adjustment is recommended. Data on measured and self-reported BMI derived from the Belgian Food Consumption Survey (FCS) 2014 offer the opportunity to do so. The HIS and FCS are cross-sectional surveys based on representative population samples. This study focused on adults aged 18-64 years (sample HIS = 6545 and FCS = 1213). Measured and self-reported BMI collected in the FCS were used to assess possible misreporting. Using FCS data, correction factors (measured BMI/self-reported BMI) were calculated as a function of a combination of background variables (region, gender, educational level and age group). Individual self-reported BMI values from the HIS 2013 were then multiplied by the corresponding correction factors to produce a corrected BMI classification. When compared with the measured BMI, the self-reported BMI in the FCS was underestimated (by a mean of 0.97 kg/m²); 28% of obese people underestimated their BMI. After applying the correction factors, the prevalence of obesity based on HIS data increased significantly (from 13% with the original HIS data to 17% with the corrected HIS data) and approximated the measured prevalence derived from the FCS data. Since self-reported BMI is underestimated, it is recommended to adjust it to obtain the accurate estimates that are important for decision making.
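
    The correction-factor step reduces to a simple per-stratum multiplication. A hedged sketch with invented factor values (the real factors vary by region, gender, educational level and age group and are not given in this abstract; standard WHO BMI cutoffs are assumed):

```python
# Hypothetical stratum-specific correction factors (measured/self-reported BMI);
# the values and the (gender, age_group) keying are invented for illustration.
correction_factors = {
    ("male", "18-34"): 1.03,
    ("male", "35-64"): 1.04,
    ("female", "18-34"): 1.02,
    ("female", "35-64"): 1.05,
}

def corrected_bmi(self_reported_bmi, gender, age_group):
    """Apply the stratum's correction factor to a self-reported BMI."""
    return self_reported_bmi * correction_factors[(gender, age_group)]

def bmi_class(bmi):
    """Standard WHO BMI classification."""
    if bmi >= 30:
        return "obese"
    if bmi >= 25:
        return "overweight"
    return "normal/underweight"

bmi = corrected_bmi(29.2, "female", "35-64")   # 29.2 * 1.05 = 30.66
```

    The example shows how the correction moves borderline respondents across the obesity cutoff, which is exactly how the HIS obesity prevalence rises from 13% to 17% after adjustment.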

  8. Inferential Procedures for Correlation Coefficients Corrected for Attenuation.

    ERIC Educational Resources Information Center

    Hakstian, A. Ralph; And Others

    1988-01-01

    A model and computation procedure based on classical test score theory are presented for determination of a correlation coefficient corrected for attenuation due to unreliability. Delta and Monte Carlo method applications are discussed. A power analysis revealed no serious loss in efficiency resulting from correction for attenuation. (TJH)
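
    The classical disattenuation formula underlying such models divides the observed correlation by the geometric mean of the two reliabilities. A minimal numerical sketch (the values are invented):

```python
import math

def corrected_correlation(r_xy, r_xx, r_yy):
    """Spearman's correction for attenuation: estimate the true-score
    correlation from the observed correlation r_xy and the reliabilities
    r_xx, r_yy of the two measures."""
    return r_xy / math.sqrt(r_xx * r_yy)

# Observed r = 0.42 with reliabilities 0.80 and 0.70
r_corrected = corrected_correlation(0.42, 0.80, 0.70)
```

    Because measurement error only attenuates, the corrected coefficient is always at least as large as the observed one; inferential procedures such as those in the paper are needed because the correction also inflates sampling variability.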

  9. Model-Based Angular Scan Error Correction of an Electrothermally-Actuated MEMS Mirror

    PubMed Central

    Zhang, Hao; Xu, Dacheng; Zhang, Xiaoyang; Chen, Qiao; Xie, Huikai; Li, Suiqiong

    2015-01-01

    In this paper, the actuation behavior of a two-axis electrothermal MEMS (Microelectromechanical Systems) mirror typically used in miniature optical scanning probes and optical switches is investigated. The MEMS mirror consists of four thermal bimorph actuators symmetrically located at the four sides of a central mirror plate. Experiments show that an actuation characteristics difference of as much as 4.0% exists among the four actuators due to process variations, which leads to an average angular scan error of 0.03°. A mathematical model between the actuator input voltage and the mirror-plate position has been developed to predict the actuation behavior of the mirror. It is a four-input, four-output model that takes into account the thermal-mechanical coupling and the differences among the four actuators; the vertical positions of the ends of the four actuators are also monitored. Based on this model, an open-loop control method is established to achieve accurate angular scanning. This model-based open loop control has been experimentally verified and is useful for the accurate control of the mirror. With this control method, the precise actuation of the mirror solely depends on the model prediction and does not need the real-time mirror position monitoring and feedback, greatly simplifying the MEMS control system. PMID:26690432
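
    The open-loop idea can be illustrated with a linearized stand-in for the paper's model: if a matrix K maps the four actuator voltages to the four actuator-tip positions, with unequal gains mimicking the ~4% process variation, then accurate scanning reduces to solving K v = z for the required voltages. The matrix entries below are invented; the actual model is electrothermal and nonlinear.

```python
import numpy as np

# Hypothetical linearized actuation model: diagonal gains differ by a few
# percent (process variation) and off-diagonal terms model thermal coupling.
K = np.diag([1.00, 1.04, 0.98, 1.02]) + 0.05 * np.ones((4, 4))

def open_loop_voltages(target_positions):
    """Model-based open-loop control: invert the actuation model to find
    the voltages realizing the desired actuator-tip positions."""
    return np.linalg.solve(K, target_positions)

z_desired = np.array([1.0, 1.0, 1.0, 1.0])   # equal tip heights: no tilt error
v = open_loop_voltages(z_desired)
```

    Because the gains differ, the computed voltages differ among the four actuators; this is how a model-based scheme compensates for process variation without real-time position feedback.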

  10. In vivo gene correction with targeted sequence substitution through microhomology-mediated end joining.

    PubMed

    Shin, Jeong Hong; Jung, Soobin; Ramakrishna, Suresh; Kim, Hyongbum Henry; Lee, Junwon

    2018-07-07

    Genome editing technology using programmable nucleases has rapidly evolved in recent years. The primary mechanism to achieve precise integration of a transgene is mainly based on homology-directed repair (HDR). However, an HDR-based genome-editing approach is less efficient than non-homologous end-joining (NHEJ). Recently, a microhomology-mediated end-joining (MMEJ)-based transgene integration approach was developed, showing feasibility both in vitro and in vivo. We expanded this method to achieve targeted sequence substitution (TSS) of mutated sequences with normal sequences using double-guide RNAs (gRNAs), and a donor template flanking the microhomologies and target sequence of the gRNAs in vitro and in vivo. Our method could realize more efficient sequence substitution than the HDR-based method in vitro using a reporter cell line, and led to the survival of a hereditary tyrosinemia mouse model in vivo. The proposed MMEJ-based TSS approach could provide a novel therapeutic strategy, in addition to HDR, to achieve gene correction from a mutated sequence to a normal sequence. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Statistical bias correction modelling for seasonal rainfall forecast for the case of Bali island

    NASA Astrophysics Data System (ADS)

    Lealdi, D.; Nurdiati, S.; Sopaheluwakan, A.

    2018-04-01

    Rainfall is an element of climate that strongly influences the agricultural sector. Rainfall pattern and distribution largely determine the sustainability of agricultural activities, so information on rainfall is very useful to the agricultural sector and to farmers anticipating possible extreme events, which often cause failures of agricultural production. This research aims to identify the biases in seasonal rainfall forecast products from ECMWF (European Centre for Medium-Range Weather Forecasts) and to build a transfer function that corrects the distribution biases, yielding a new prediction model, using a quantile mapping approach. We apply this approach to the case of Bali Island and find that correcting the model's systematic biases gives better results: the corrected prediction model outperforms the raw forecast. In general, the bias correction performs better during the rainy season than during the dry season.
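
    Empirical quantile mapping, the bias-correction approach named above, can be sketched as follows: each forecast value is assigned its quantile in the model climatology and replaced by the corresponding quantile of the observed climatology. The distributions below are synthetic stand-ins, not ECMWF or Bali data.

```python
import numpy as np

def quantile_map(forecast, obs_sample, model_sample):
    """Empirical quantile mapping: look up each forecast value's quantile
    in the model climatology, then read off that quantile of the observed
    climatology."""
    model_sorted = np.sort(model_sample)
    obs_sorted = np.sort(obs_sample)
    q = np.searchsorted(model_sorted, forecast) / len(model_sorted)
    return np.quantile(obs_sorted, np.clip(q, 0.0, 1.0))

# Synthetic demo: a model climatology that is systematically 20% too wet
rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 5.0, 5000)           # "observed" rainfall
model = 1.2 * rng.gamma(2.0, 5.0, 5000)   # biased "model" rainfall
corrected = quantile_map(model, obs, model)
```

    Mapping the biased sample through its own climatology onto the observed one removes the systematic wet bias while preserving each value's rank, which is why the method corrects the distribution rather than just the mean.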

  12. The sustainability of community-based therapeutic care (CTC) in nonemergency contexts.

    PubMed

    Gatchell, Valerie; Forsythe, Vivienne; Thomas, Paul-Rees

    2006-09-01

    Concern Worldwide is an international humanitarian nongovernmental organization that piloted and is now implementing and researching community-based therapeutic care (CTC) approaches to managing acute malnutrition. Experience in several countries suggests that there are key issues to be addressed at the international, national, regional, and community levels for community-based treatment of acute malnutrition to be sustainable. At the national level there must be demonstrated commitment to a clear health policy and strategy to address outpatient treatment of acute malnutrition. In addition, locally available, affordable ready-to-use therapeutic food (RUTF) must be accessible. At the regional level a functional health system and appropriate capacity for service provision are required. Integration of outpatient services should be viewed as a process with different levels of inputs at different phases depending on the capacity of the Ministry of Health (MOH). There is a need for indicators to facilitate scale-up and scale-back for future emergency response. Strong community participation and active screening linked to health service provision at the local level are paramount for sustainable assessment and referral of severe acute malnutrition. Key future challenges to sustaining community-based therapeutic care include the development of locally produced RUTF, the development of international standards on local RUTF production, the integration of outpatient treatment protocols into international health and nutrition guidelines, and further operational research into integration of community-based treatment of severe acute malnutrition into health systems in nonemergency contexts.

  13. Systems biology approach to developing S(2)RM-based "systems therapeutics" and naturally induced pluripotent stem cells.

    PubMed

    Maguire, Greg; Friedman, Peter

    2015-05-26

    The degree to, and the mechanisms through, which stem cells are able to build, maintain, and heal the body have only recently begun to be understood. Much of the stem cell's power resides in the release of a multitude of molecules, called stem cell released molecules (SRM). A fundamentally new type of therapeutic, namely "systems therapeutic", can be realized by reverse engineering the mechanisms of the SRM processes. Recent data demonstrates that the composition of the SRM is different for each type of stem cell, as well as for different states of each cell type. Although systems biology has been successfully used to analyze multiple pathways, the approach is often used to develop a small molecule interacting at only one pathway in the system. A new model is emerging in biology where systems biology is used to develop a new technology acting at multiple pathways called "systems therapeutics". A natural set of healing pathways in the human that uses SRM is instructive and of practical use in developing systems therapeutics. Endogenous SRM processes in the human body use a combination of SRM from two or more stem cell types, designated as S(2)RM, doing so under various state dependent conditions for each cell type. Here we describe our approach in using state-dependent SRM from two or more stem cell types, S(2)RM technology, to develop a new class of therapeutics called "systems therapeutics." Given the ubiquitous and powerful nature of innate S(2)RM-based healing in the human body, this "systems therapeutic" approach using S(2)RM technology will be important for the development of anti-cancer therapeutics, antimicrobials, wound care products and procedures, and a number of other therapeutics for many indications.

  14. Continental-Scale Validation of Modis-Based and LEDAPS Landsat ETM + Atmospheric Correction Methods

    NASA Technical Reports Server (NTRS)

    Ju, Junchang; Roy, David P.; Vermote, Eric; Masek, Jeffrey; Kovalskyy, Valeriy

    2012-01-01

    The potential of Landsat data processing to provide systematic continental scale products has been demonstrated by several projects including the NASA Web-enabled Landsat Data (WELD) project. The recent free availability of Landsat data increases the need for robust and efficient atmospheric correction algorithms applicable to large volume Landsat data sets. This paper compares the accuracy of two Landsat atmospheric correction methods: a MODIS-based method and the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) method. Both methods are based on the 6SV radiative transfer code but have different atmospheric characterization approaches. The MODIS-based method uses the MODIS Terra derived dynamic aerosol type, aerosol optical thickness, and water vapor to atmospherically correct ETM+ acquisitions in each coincident orbit. The LEDAPS method uses aerosol characterizations derived independently from each Landsat acquisition and assumes a fixed continental aerosol type and uses ancillary water vapor. Validation results are presented comparing ETM+ atmospherically corrected data generated using these two methods with AERONET corrected ETM+ data for 95 10 km × 10 km 30 m subsets, a total of nearly 8 million 30 m pixels, located across the conterminous United States. The results indicate that the MODIS-based method has better accuracy than the LEDAPS method for the ETM+ red and longer wavelength bands.

  15. Publisher Correction: Oncolytic viruses as engineering platforms for combination immunotherapy.

    PubMed

    Twumasi-Boateng, Kwame; Pettigrew, Jessica L; Kwok, Y Y Eunice; Bell, John C; Nelson, Brad H

    2018-05-04

    In the online html version of this article, the affiliations for Jessica L. Pettigrew and John C. Bell were not correct. Jessica L. Pettigrew is at the Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada and John C. Bell is at the Center for Innovative Cancer Therapeutics, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada. This is correct in the print and PDF versions of the article and has been corrected in the html version.

  16. 3-D Magnetotelluric Forward Modeling And Inversion Incorporating Topography By Using Vector Finite-Element Method Combined With Divergence Corrections Based On The Magnetic Field (VFEH++)

    NASA Astrophysics Data System (ADS)

    Shi, X.; Utada, H.; Jiaying, W.

    2009-12-01

    The vector finite-element method combined with divergence corrections based on the magnetic field H, referred to as the VFEH++ method, is developed to simulate the magnetotelluric (MT) responses of 3-D conductivity models. The advantages of the new VFEH++ method are the use of edge elements to eliminate vector parasites and the use of divergence corrections to explicitly guarantee divergence-free conditions in the whole modeling domain. 3-D MT topographic responses are modeled using the new VFEH++ method and compared with those calculated by other numerical methods. The results show that MT responses can be modeled with high accuracy using the VFEH++ method. The VFEH++ algorithm is also employed for 3-D MT data inversion incorporating topography. The 3-D MT inverse problem is formulated as a minimization problem for a regularized misfit function. To avoid the huge memory requirement and very long computation time of the Jacobian sensitivity matrix in a Gauss-Newton method, we employ the conjugate gradient (CG) approach to solve the inversion equation. In each iteration of the CG algorithm, the costly computation is the product of the Jacobian sensitivity matrix with a model vector x, or of its transpose with a data vector y, each of which can be transformed into a pseudo-forward modeling run. This avoids explicit calculation and storage of the full Jacobian matrix, leading to considerable savings in the memory required by the inversion program on a PC. The performance of the CG algorithm is illustrated by several typical 3-D models with flat and topographic earth surfaces. The results show that the VFEH++ and CG algorithms can be effectively employed for 3-D MT field data inversion.
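
    The matrix-free trick described above can be sketched independently of the MT physics: conjugate gradients on the regularized normal equations needs only the products Jx (one forward modeling) and Jᵀy (one adjoint modeling) per iteration, never J itself. In this sketch a small dense matrix stands in for the Jacobian; in the real code each product would be a PDE solve.

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.normal(size=(60, 40))       # stand-in for the never-formed Jacobian
d = rng.normal(size=60)             # data vector
lam = 0.1                           # regularization weight

def Jx(x):  return J @ x            # stands in for one forward modeling
def JTy(y): return J.T @ y          # stands in for one adjoint modeling

def cg_normal_eq(b, n_iter=200, tol=1e-8):
    """CG on (J^T J + lam*I) x = b using only Jx and JTy products."""
    x = np.zeros_like(b)
    r = b - (JTy(Jx(x)) + lam * x)  # initial residual
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Ap = JTy(Jx(p)) + lam * p   # two modeling runs per iteration
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

m = cg_normal_eq(JTy(d))            # regularized least-squares model estimate
```

    Storage is O(model size) instead of O(data size × model size) for the explicit Jacobian, which is the memory saving the abstract refers to.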

  17. Can faith-based correctional programs work? An outcome evaluation of the innerchange freedom initiative in Minnesota.

    PubMed

    Duwe, Grant; King, Michelle

    2013-07-01

    This study evaluated the effectiveness of the InnerChange Freedom Initiative (InnerChange), a faith-based prisoner reentry program, by examining recidivism outcomes among 732 offenders released from Minnesota prisons between 2003 and 2009. Results from the Cox regression analyses revealed that participating in InnerChange significantly reduced reoffending (rearrest, reconviction, and new offense reincarceration), although it did not have a significant impact on reincarceration for a technical violation revocation. The findings further suggest that the beneficial recidivism outcomes for InnerChange participants may have been due, in part, to the continuum of mentoring support some offenders received in the institution and the community. The results imply that faith-based correctional programs can reduce recidivism, but only if they apply evidence-based practices that focus on providing a behavioral intervention within a therapeutic community, addressing the criminogenic needs of participants and delivering a continuum of care from the institution to the community. Given that InnerChange relies heavily on volunteers and program costs are privately funded, the program exacts no additional costs to the State of Minnesota. Yet, because InnerChange lowers recidivism, which includes reduced reincarceration and victimization costs, the program may be especially advantageous from a cost-benefit perspective.

  18. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of a program to transform natural language specifications into formal notation; specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, specification languages and their associated tools have a high learning curve, and increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulating correctness properties for system models can be a difficult problem. This is relevant to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technology transfer potential, and the next steps.
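
    As a concrete illustration of the target notation (the identifiers are invented, not from the presentation): the common natural-language "response" pattern maps onto a fixed LTL template.

```python
def response_property(trigger, response):
    """'Whenever <trigger> occurs, <response> eventually follows' as LTL,
    where G means 'always' and F means 'eventually'."""
    return f"G({trigger} -> F {response})"

prop = response_property("cmd_received", "ack_sent")
# prop == "G(cmd_received -> F ack_sent)"
```

    Automating the step from the English sentence to the template instance is exactly the kind of translation the program aims at; a model checker can then verify the generated property against a system model.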

  19. A velocity-correction projection method based immersed boundary method for incompressible flows

    NASA Astrophysics Data System (ADS)

    Cai, Shanggui

    2014-11-01

    In the present work we propose a novel direct forcing immersed boundary method based on the velocity-correction projection method of [J.L. Guermond, J. Shen, Velocity-correction projection methods for incompressible flows, SIAM J. Numer. Anal., 41 (1) (2003) 112]. The principal idea of the immersed boundary method is to correct the velocity in the vicinity of the immersed object by using an artificial force to mimic the presence of the physical boundaries. Therefore, the velocity-correction projection method is preferred to its pressure-correction counterpart in the present work. Since the velocity-correction projection method is considered a dual class of the pressure-correction method, the proposed method can also be interpreted as follows: first the pressure is predicted by treating the viscous term explicitly, without consideration of the immersed boundary, and the solenoidal velocity is used to determine the volume force on the Lagrangian points; then the no-slip boundary condition is enforced by correcting the velocity with the implicit viscous term. To demonstrate the efficiency and accuracy of the proposed method, several numerical simulations are performed and compared with results in the literature. Supported by the China Scholarship Council.

  20. MRI-Based Nonrigid Motion Correction in Simultaneous PET/MRI

    PubMed Central

    Chun, Se Young; Reese, Timothy G.; Ouyang, Jinsong; Guerin, Bastien; Catana, Ciprian; Zhu, Xuping; Alpert, Nathaniel M.; El Fakhri, Georges

    2014-01-01

    Respiratory and cardiac motion is the most serious limitation to whole-body PET, resulting in spatial resolution close to 1 cm. Furthermore, motion-induced inconsistencies in the attenuation measurements often lead to significant artifacts in the reconstructed images. Gating can remove motion artifacts at the cost of increased noise. This paper presents an approach to respiratory motion correction using simultaneous PET/MRI, demonstrates initial results in phantoms, rabbits, and nonhuman primates, and discusses the prospects for clinical application. Methods: Studies with a deformable phantom, a free-breathing primate, and rabbits implanted with radioactive beads were performed with simultaneous PET/MRI. Motion fields were estimated from concurrently acquired tagged MR images using 2 B-spline nonrigid image registration methods and incorporated into a PET list-mode ordered-subsets expectation maximization algorithm. Using the measured motion fields to transform both the emission data and the attenuation data, we could use all the coincidence data to reconstruct any phase of the respiratory cycle. We compared the resulting signal-to-noise ratio (SNR) and the channelized Hotelling observer (CHO) detection SNR in the motion-corrected reconstruction with the results obtained from standard gating and uncorrected studies. Results: Motion correction virtually eliminated motion blur without reducing SNR, yielding images with SNR comparable to those obtained by gating with 5–8 times longer acquisitions in all studies. The CHO study in dynamic phantoms demonstrated a significant improvement (166%–276%) in lesion detection SNR with MRI-based motion correction as compared with gating (P < 0.001). This improvement was 43%–92% for large motion compared with lesion detection without motion correction (P < 0.001). CHO SNR in the rabbit studies confirmed these results. Conclusion: Tagged MRI motion correction in simultaneous PET/MRI significantly improves lesion detection

  1. Polyhedral shape model for terrain correction of gravity and gravity gradient data based on an adaptive mesh

    NASA Astrophysics Data System (ADS)

    Guo, Zhikui; Chen, Chao; Tao, Chunhui

    2016-04-01

    Since 2007, four China Da yang cruises (CDCs) have been carried out to investigate polymetallic sulfides on the southwest Indian ridge (SWIR), acquiring both gravity data and bathymetry data on the corresponding survey lines (Tao et al., 2014). Sandwell et al. (2014) published a new global marine gravity model including free-air gravity data and its first-order vertical gradient (Vzz). Gravity data and their gradients can be used to extract unknown density structure information (e.g. crustal thickness) beneath the surface of the earth, but they contain the effect of all mass below the observation point. Therefore, accurately computing the gravity and gradient effect of known density structure (e.g. terrain) is a key issue. Using the bathymetry data or the ETOPO1 model (http://www.ngdc.noaa.gov/mgg/global/global.html) at full resolution to calculate the terrain effect would require too much computation time. We expect to develop an effective method that takes less time but still yields the desired accuracy. In this study, a constant-density polyhedral model is used to calculate the gravity field and its vertical gradient, based on the work of Tsoulis (2012). According to the attenuation of the gravity field with distance and the variance of the bathymetry, we present adaptive mesh refinement and coarsening strategies to merge both global topography data and multi-beam bathymetry data. The local mesh size depends on user-defined accuracy and terrain variation (Davis et al., 2011). To depict the terrain better, triangular and rectangular surface elements are used in the fine and coarse meshes, respectively. This strategy can also be applied in spherical coordinates at regional and global scales. Finally, we applied this method to calculate the Bouguer gravity anomaly (BGA), the mantle Bouguer anomaly (MBA) and their vertical gradients in the SWIR, and compared the results with previous results in the literature. Both synthetic model
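
    The adaptive-mesh rationale, that distant terrain can be represented coarsely with negligible error, can be checked with a toy point-mass stand-in for the constant-density polyhedra (all geometry and mass values below are invented for illustration):

```python
import numpy as np

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

def gz_point_masses(obs, cells, masses):
    """Vertical gravity at `obs` from point-mass cells (toy stand-in for
    polyhedral terrain elements)."""
    d = cells - obs
    r = np.linalg.norm(d, axis=1)
    return G * np.sum(masses * d[:, 2] / r**3)

obs = np.array([0.0, 0.0, 0.0])

# Eight fine cells roughly 10 km away, each of mass 1e9 kg
fine = np.array([[10000.0 + 100 * i, 100 * j, -1000.0 - 100 * k]
                 for i in range(2) for j in range(2) for k in range(2)])
m_fine = np.full(8, 1e9)
g_fine = gz_point_masses(obs, fine, m_fine)

# The same mass merged into one coarse cell at the centroid
coarse = fine.mean(axis=0, keepdims=True)
g_coarse = gz_point_masses(obs, coarse, np.array([8e9]))
rel_err = abs(g_coarse - g_fine) / abs(g_fine)
```

    The relative error is on the order of (cell spread / distance)², which is why far-field coarsening preserves accuracy while cutting the cell count, the core of the adaptive refinement/coarsening strategy.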

  2. Recent Trends in Nanotechnology-Based Drugs and Formulations for Targeted Therapeutic Delivery.

    PubMed

    Iqbal, Hafiz M N; Rodriguez, Angel M V; Khandia, Rekha; Munjal, Ashok; Dhama, Kuldeep

    2017-01-01

    In the recent past, a wide spectrum of nanotechnology-based drugs and drug-loaded devices and systems has been engineered and investigated with great interest. The key objective is to improve patients' quality of life safely by avoiding or limiting drug abuse and the severe adverse effects of some traditional therapies in current practice. Various methodological approaches, including in vitro, in vivo, and ex vivo techniques, have been exploited so far. Among them, nanoparticle-based therapeutic agents are of supreme interest for enhanced and efficient delivery in the current biomedical sector. The development of new types of novel, effective and highly reliable therapeutic drug delivery systems (DDS) for multipurpose applications is essential and a core demand for tackling many human health-related diseases. In this context, several advanced nanotechnology-based DDS have been engineered with novel characteristics for biomedical, pharmaceutical and cosmeceutical applications, including, but not limited to, enhanced bioactivity, bioavailability, drug efficacy, targeted delivery, and greater therapeutic safety, with the added advantage of overcoming the shortcomings of traditional drug formulations and designs. This review focuses on recent trends and advances in nanotechnology-based drugs and formulations designed for targeted therapeutic delivery. Information from recent patents covering various nanotechnology-based approaches for several applications is also reviewed and summarized or illustrated diagrammatically for better understanding. Drug-loaded nanoparticles are versatile, multifunctional candidates for potential applications in the biomedical and tissue-engineering sectors. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  3. Detect, correct, retract: How to manage incorrect structural models.

    PubMed

    Wlodawer, Alexander; Dauter, Zbigniew; Porebski, Przemyslaw J; Minor, Wladek; Stanfield, Robyn; Jaskolski, Mariusz; Pozharski, Edwin; Weichenberger, Christian X; Rupp, Bernhard

    2018-02-01

    The massive technical and computational progress of biomolecular crystallography has generated some adverse side effects. Most crystal structure models, produced by crystallographers or well-trained structural biologists, constitute useful sources of information, but occasional extreme outliers remind us that the process of structure determination is not fail-safe. The occurrence of severe errors or gross misinterpretations raises fundamental questions: Why do such aberrations emerge in the first place? How did they evade the sophisticated validation procedures which often produce clear and dire warnings, and why were severe errors not noticed by the depositors themselves, their supervisors, referees and editors? Once detected, what can be done to either correct, improve or eliminate such models? How do incorrect models affect the underlying claims or biomedical hypotheses they were intended, but failed, to support? What is the long-range effect of the propagation of such errors? And finally, what mechanisms can be envisioned to restore the validity of the scientific record and, if necessary, retract publications that are clearly invalidated by the lack of experimental evidence? We suggest that cognitive bias and flawed epistemology are likely at the root of the problem. By using examples from the published literature and from public repositories such as the Protein Data Bank, we provide case summaries to guide correction or improvement of structural models. When strong claims are unsustainable because of a deficient crystallographic model, removal of such a model and even retraction of the affected publication are necessary to restore the integrity of the scientific record. © 2017 Federation of European Biochemical Societies.

  4. Confirming the RNAi-mediated mechanism of action of siRNA-based cancer therapeutics in mice.

    PubMed

    Judge, Adam D; Robbins, Marjorie; Tavakoli, Iran; Levi, Jasna; Hu, Lina; Fronda, Anna; Ambegia, Ellen; McClintock, Kevin; MacLachlan, Ian

    2009-03-01

    siRNAs that specifically silence the expression of cancer-related genes offer a therapeutic approach in oncology. However, it remains critical to determine the true mechanism of their therapeutic effects. Here, we describe the preclinical development of chemically modified siRNA targeting the essential cell-cycle proteins polo-like kinase 1 (PLK1) and kinesin spindle protein (KSP) in mice. siRNA formulated in stable nucleic acid lipid particles (SNALP) displayed potent antitumor efficacy in both hepatic and subcutaneous tumor models. This was correlated with target gene silencing following a single intravenous administration that was sufficient to cause extensive mitotic disruption and tumor cell apoptosis. Our siRNA formulations induced no measurable immune response, minimizing the potential for nonspecific effects. Additionally, RNAi-specific mRNA cleavage products were found in tumor cells, and their presence correlated with the duration of target mRNA silencing. Histological biomarkers confirmed that RNAi-mediated gene silencing effectively inhibited the target's biological activity. This report supports an RNAi-mediated mechanism of action for siRNA antitumor effects, suggesting a new methodology for targeting other key genes in cancer development with siRNA-based therapeutics.

  6. Ultrasound fusion image error correction using subject-specific liver motion model and automatic image registration.

    PubMed

    Yang, Minglei; Ding, Hui; Zhu, Lei; Wang, Guangzhi

    2016-12-01

    Ultrasound fusion imaging is an emerging tool that benefits a variety of clinical applications, such as image-guided diagnosis and treatment of hepatocellular carcinoma and unresectable liver metastases. However, respiratory liver motion-induced misalignment of multimodal images (i.e., fusion error) compromises the effectiveness and practicability of this method. The purpose of this paper is to develop a subject-specific liver motion model and an automatic registration-based method to correct the fusion error. An online-built subject-specific motion model and an automatic image registration method for 2D ultrasound-3D magnetic resonance (MR) images were combined to compensate for the respiratory liver motion. The key steps included: 1) Build a subject-specific liver motion model for the current subject online and perform the initial registration of pre-acquired 3D MR and intra-operative ultrasound images; 2) During fusion imaging, compensate for liver motion first using the motion model, and then use an automatic registration method to further correct the respiratory fusion error. Evaluation experiments were conducted on a liver phantom and five subjects. In the phantom study, the fusion error (superior-inferior axis) was reduced from 13.90±2.38mm to 4.26±0.78mm by using the motion model only. The fusion error further decreased to 0.63±0.53mm by using the registration method. The registration method also decreased the rotation error from 7.06±0.21° to 1.18±0.66°. In the clinical study, the fusion error was reduced from 12.90±9.58mm to 6.12±2.90mm by using the motion model alone. Moreover, the fusion error decreased to 1.96±0.33mm by using the registration method. The proposed method can effectively correct the respiration-induced fusion error to improve the fusion image quality. It can also reduce the dependency of error correction on the initial registration of ultrasound and MR images. Overall, the proposed method can improve the clinical practicability of ultrasound fusion imaging.

  7. Reporter Assay for Endo/Lysosomal Escape of Toxin-Based Therapeutics

    PubMed Central

    Gilabert-Oriol, Roger; Thakur, Mayank; von Mallinckrodt, Benedicta; Bhargava, Cheenu; Wiesner, Burkhard; Eichhorst, Jenny; Melzig, Matthias F.; Fuchs, Hendrik; Weng, Alexander

    2014-01-01

    Protein-based therapeutics with cytosolic targets are capable of exhibiting their therapeutic effect once they have escaped from the endosomes or lysosomes. In this study, the reporters—horseradish peroxidase (HRP), Alexa Fluor 488 (Alexa) and ricin A-chain (RTA)—were investigated for their capacity to monitor the endo/lysosomal escape of the ribosome-inactivating protein, saporin. The conjugates—saporin-HRP, Alexasaporin and saporin-KQ-RTA—were constructed, and the endo/lysosomal escape of these conjugates alone (lack of endo/lysosomal release) or in combination with certain structurally-specific triterpenoidal saponins (efficient endo/lysosomal escape) was characterized. HRP failed in reporting the endo/lysosomal escape of saporin. Contrastingly, Alexa Fluor 488 successfully allowed the report of the process at a toxin concentration of 1000 nM. In addition, single endo/lysosome analysis facilitated the determination of the amount of Alexasaporin released from each vesicle. RTA was also successful in reporting the endo/lysosomal escape of the enzymatically inactive mutant, saporin-KQ, but in this case, the sensitivity of the method reached a toxin concentration of 10 nM. In conclusion, the simultaneous usage of Alexa Fluor 488 and RTA as reporters may provide the possibility of monitoring the endo/lysosomal escape of protein-based therapeutics in the concentration range of 10–1000 nM. PMID:24859158

  8. A DSP-based neural network non-uniformity correction algorithm for IRFPA

    NASA Astrophysics Data System (ADS)

    Liu, Chong-liang; Jin, Wei-qi; Cao, Yang; Liu, Xiu

    2009-07-01

    An effective neural network non-uniformity correction (NUC) algorithm based on DSP is proposed in this paper. The non-uniform response of infrared focal plane array (IRFPA) detectors produces corrupted images with fixed-pattern noise (FPN). We introduce and analyze the artificial neural network scene-based non-uniformity correction (SBNUC) algorithm, and describe the design of a DSP-based NUC development platform for IRFPA. The hardware platform has low power consumption, with a 32-bit fixed-point DSP (TMS320DM643) as the kernel processor. The dependability and expansibility of the software are improved by the DSP/BIOS real-time operating system and Reference Framework 5. To achieve real-time performance, the calibration-parameter update runs at a lower task priority than video input and output in DSP/BIOS, so that updating the calibration parameters does not affect the video streams. The work flow of the system and the strategy for real-time operation are introduced. Experiments on real infrared imaging sequences demonstrate that the algorithm requires only a few frames to obtain high-quality corrections. It is computationally efficient and suitable for all kinds of non-uniformity.
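
    The abstract names the neural-network SBNUC algorithm without giving its equations. As a rough illustration, the classic Scribner-style update that this family of algorithms builds on can be sketched as follows (the function name, learning rate, and 3x3 window are illustrative assumptions, not details from the paper):

```python
import numpy as np

def sbnuc_lms(frames, lr=0.05):
    """Scribner-style neural-network scene-based NUC (illustrative sketch).

    Each pixel is corrected as y = g*x + o; the per-pixel gain g and
    offset o are adapted by a least-mean-squares step that drives the
    corrected value toward the 3x3 spatial mean of the corrected frame
    (the "desired" neuron response).
    """
    h, w = frames[0].shape
    g, o = np.ones((h, w)), np.zeros((h, w))
    corrected = []
    for x in frames:
        y = g * x + o
        pad = np.pad(y, 1, mode="edge")  # 3x3 local mean of the output
        d = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
        e = y - d          # LMS error: deviation from local smoothness
        g -= lr * e * x    # gradient step on the gain...
        o -= lr * e        # ...and on the offset
        corrected.append(y)
    return corrected, g, o
```

    With a slowly varying scene, the per-pixel gains and offsets gradually absorb the fixed-pattern noise while the scene content passes through.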

  9. Scene-based nonuniformity corrections for optical and SWIR pushbroom sensors.

    PubMed

    Leathers, Robert; Downes, Trijntje; Priest, Richard

    2005-06-27

    We propose and evaluate several scene-based methods for computing nonuniformity corrections for visible or near-infrared pushbroom sensors. These methods can be used to compute new nonuniformity correction values or to repair or refine existing radiometric calibrations. For a given data set, the preferred method depends on the quality of the data, the type of scenes being imaged, and the existence and quality of a laboratory calibration. We demonstrate our methods with data from several different sensor systems and provide a generalized approach to be taken for any new data set.

  10. Image-based spectral distortion correction for photon-counting x-ray detectors

    PubMed Central

    Ding, Huanjun; Molloi, Sabee

    2012-01-01

    Purpose: To investigate the feasibility of using an image-based method to correct for distortions induced by various artifacts in the x-ray spectrum recorded with photon-counting detectors for their application in breast computed tomography (CT). Methods: The polyenergetic incident spectrum was simulated with the tungsten anode spectral model using the interpolating polynomials (TASMIP) code and carefully calibrated to match the x-ray tube in this study. Experiments were performed on a Cadmium-Zinc-Telluride (CZT) photon-counting detector with five energy thresholds. Energy bins were adjusted to evenly distribute the recorded counts above the noise floor. BR12 phantoms of various thicknesses were used for calibration. A nonlinear function was selected to fit the count correlation between the simulated and the measured spectra in the calibration process. To evaluate the proposed spectral distortion correction method, an empirical fitting derived from the calibration process was applied on the raw images recorded for polymethyl methacrylate (PMMA) phantoms of 8.7, 48.8, and 100.0 mm. Both the corrected counts and the effective attenuation coefficient were compared to the simulated values for each of the five energy bins. The feasibility of applying the proposed method to quantitative material decomposition was tested using a dual-energy imaging technique with a three-material phantom that consisted of water, lipid, and protein. The performance of the spectral distortion correction method was quantified using the relative root-mean-square (RMS) error with respect to the expected values from simulations or areal analysis of the decomposition phantom. Results: The implementation of the proposed method reduced the relative RMS error of the output counts in the five energy bins with respect to the simulated incident counts from 23.0%, 33.0%, and 54.0% to 1.2%, 1.8%, and 7.7% for 8.7, 48.8, and 100.0 mm PMMA phantoms, respectively. 
The accuracy of the effective attenuation
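
    The relative root-mean-square error used as the figure of merit above is simple to state explicitly. A minimal sketch follows; the exact convention is not spelled out in the abstract, so the RMS of the relative residuals is assumed here:

```python
import numpy as np

def relative_rms_error(measured, expected):
    """RMS of the relative residuals (one common convention for the
    'relative RMS error'; the abstract does not give its exact formula)."""
    measured = np.asarray(measured, dtype=float)
    expected = np.asarray(expected, dtype=float)
    return float(np.sqrt(np.mean(((measured - expected) / expected) ** 2)))
```

    For example, a single measurement of 110 against an expected 100 gives a relative RMS error of 10%.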

  11. General rigid motion correction for computed tomography imaging based on locally linear embedding

    NASA Astrophysics Data System (ADS)

    Chen, Mianyi; He, Peng; Feng, Peng; Liu, Baodong; Yang, Qingsong; Wei, Biao; Wang, Ge

    2018-02-01

    Patient motion can degrade the quality of computed tomography images, which are typically acquired in cone-beam geometry. Rigid patient motion is characterized by six geometric parameters and is more challenging to correct than in fan-beam geometry. We extend our previous rigid patient motion correction method based on the principle of locally linear embedding (LLE) from fan-beam to cone-beam geometry and accelerate the computational procedure with the graphics processing unit (GPU)-based All Scale Tomographic Reconstruction Antwerp (ASTRA) toolbox. The major merit of our method is that it requires neither fiducial markers nor motion-tracking devices. The numerical and experimental studies show that LLE-based patient motion correction is capable of calibrating the six parameters of the patient motion simultaneously, significantly reducing patient motion artifacts.

  12. Stripe nonuniformity correction for infrared imaging system based on single image optimization

    NASA Astrophysics Data System (ADS)

    Hua, Weiping; Zhao, Jufeng; Cui, Guangmang; Gong, Xiaoli; Ge, Peng; Zhang, Jiang; Xu, Zhihai

    2018-06-01

    Infrared imaging is often disturbed by stripe nonuniformity noise. Scene-based correction methods can effectively reduce the impact of stripe noise. In this paper, a stripe nonuniformity correction method based on a differential constraint is proposed. First, the gray-level distribution of the stripe nonuniformity is analyzed, and a penalty function is constructed from the difference between horizontal and vertical gradients. With a weight function, the penalty function is optimized to obtain the corrected image. Compared with other single-frame approaches, experiments show that the proposed method performs better in both subjective and objective analysis and does less damage to edges and details. Meanwhile, the proposed method runs faster. We also discuss the differences between the proposed idea and multi-frame methods. Finally, the method has been successfully applied in a hardware system.
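
    The paper's weighted penalty optimization is not given in closed form in the abstract. A much simpler single-image destriper in the same spirit, treating stripes as per-column offsets whose horizontal gradients should vanish, might look like the sketch below; the median-of-differences estimator and the smoothing window are assumptions for illustration:

```python
import numpy as np

def destripe_columns(img, win=9):
    """Single-image column destriping (simplified stand-in for the
    paper's gradient-difference penalty optimization).

    The per-column stripe profile is estimated from the cumulative
    median of horizontal differences (the median is robust to scene
    edges), and only its high-frequency part is subtracted so that
    smooth scene shading survives.
    """
    d = np.median(np.diff(img, axis=1), axis=0)      # column-to-column jumps
    offset = np.concatenate([[0.0], np.cumsum(d)])   # integrated stripe profile
    pad = np.pad(offset, win // 2, mode="edge")
    smooth = np.convolve(pad, np.ones(win) / win, mode="valid")
    return img - (offset - smooth)[None, :]          # remove high-freq stripes only
```

    Removing only the high-frequency part of the estimated column profile is what keeps genuine low-frequency scene gradients intact.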

  13. How well can charge transfer inefficiency be corrected? A parameter sensitivity study for iterative correction

    NASA Astrophysics Data System (ADS)

    Israel, Holger; Massey, Richard; Prod'homme, Thibaut; Cropper, Mark; Cordes, Oliver; Gow, Jason; Kohley, Ralf; Marggraf, Ole; Niemi, Sami; Rhodes, Jason; Short, Alex; Verhoeve, Peter

    2015-10-01

    Radiation damage to space-based charge-coupled device detectors creates defects which result in an increasing charge transfer inefficiency (CTI) that causes spurious image trailing. Most of the trailing can be corrected during post-processing, by modelling the charge trapping and moving electrons back to where they belong. However, such correction is not perfect, and damage continues to accumulate in orbit. To aid future development, we quantify the limitations of current approaches, and determine where imperfect knowledge of model parameters most degrades measurements of photometry and morphology. As a concrete application, we simulate 1.5 × 10⁹ `worst-case' galaxy and 1.5 × 10⁸ star images to test the performance of the Euclid visual instrument detectors. There are two separable challenges. Even if the model used to correct CTI is exactly the same as that used to add CTI, only 99.68 per cent of spurious ellipticity is corrected in our setup, because readout noise is not subject to CTI but gets overcorrected during correction. Secondly, assuming the first issue to be solved, the charge trap density will need to be known to within Δρ/ρ = (0.0272 ± 0.0005) per cent and the characteristic release time of the dominant species to within Δτ/τ = (0.0400 ± 0.0004) per cent. This work presents the next level of definition of in-orbit CTI calibration procedures for Euclid.
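
    The post-processing idea described above (model the trapping, then move electrons back) is commonly implemented as an iterative inversion of the forward trailing model. A toy sketch with a one-parameter exponential trail follows; the trail model here is purely illustrative, whereas real pipelines use multi-species trap models:

```python
import numpy as np

def add_cti(col, frac=0.1, tau=2.0, ntrail=5):
    """Toy forward trailing model for one CCD column (readout direction
    = increasing index): each pixel loses a fraction of its charge into
    an exponential trail over the following pixels."""
    col = np.asarray(col, dtype=float)
    out = col.copy()
    kernel = np.exp(-np.arange(1, ntrail + 1) / tau)
    kernel = frac * kernel / kernel.sum()
    for i in range(len(col)):
        trailed = col[i] * kernel                # charge captured from this pixel
        out[i] -= trailed.sum()
        end = min(len(col), i + 1 + ntrail)
        out[i + 1:end] += trailed[:end - i - 1]  # released into the trail
    return out

def correct_cti(observed, niter=5, **model):
    """Iterative correction: find the image that, when re-trailed by the
    forward model, reproduces the observation (fixed-point update)."""
    observed = np.asarray(observed, dtype=float)
    trial = observed.copy()
    for _ in range(niter):
        trial += observed - add_cti(trial, **model)
    return trial
```

    Because the trailing operator is close to the identity, the fixed-point iteration contracts quickly; note that, as the abstract stresses, readout noise added after trailing would still be "overcorrected" by this inversion.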

  14. Therapeutics incorporating blood constituents.

    PubMed

    Charoenphol, Phapanin; Oswalt, Katie; Bishop, Corey J

    2018-04-05

    Blood deficiency and dysfunctionality can result in adverse events, which can primarily be treated by transfusion of blood or the re-introduction of properly functioning sub-components. Blood constituents can be engineered on the sub-cellular (i.e., DNA recombinant technology) and cellular level (i.e., cellular hitchhiking for drug delivery) to supplement and enhance therapeutic efficacy, in addition to rectifying dysfunctional mechanisms (i.e., clotting). Herein, we report the progress of blood-based therapeutics, with an emphasis on recent applications of blood transfusion, blood cell-based therapies and biomimetic carriers. Clinically translated technologies and commercial products of blood-based therapeutics are subsequently highlighted, and perspectives on challenges and future prospects are discussed. Blood-based therapeutics is a burgeoning field and has advanced considerably in recent years. Blood and its constituents, with and without modification (i.e., combinatorial), have been utilized in a broad spectrum of pre-clinical and clinically-translated treatments. This review article summarizes the most up-to-date progress of blood-based therapeutics in the following contexts: synthetic blood substitutes, acellular/non-recombinant therapies, cell-based therapies, and therapeutic sub-components. The article subsequently discusses clinically-translated technologies and future prospects thereof. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  15. Graphene-based platforms for cancer therapeutics

    PubMed Central

    Patel, Sunny C; Lee, Stephen; Lalwani, Gaurav; Suhrland, Cassandra; Chowdhury, Sayan Mullick; Sitharaman, Balaji

    2016-01-01

    Graphene is a multifunctional carbon nanomaterial and could be utilized to develop platform technologies for cancer therapies. Its surface can be covalently and noncovalently functionalized with anticancer drugs and with functional groups that target cancer cells and tissue to improve treatment efficacies. Furthermore, its physicochemical properties can be harnessed to facilitate stimulus-responsive therapeutics and drug delivery. This review article summarizes the recent literature specifically focused on the development of graphene technologies to treat cancer. We focus on advances at the interface of graphene-based drug/gene delivery, photothermal/photodynamic therapy and combinations of these techniques. We also discuss the current understanding of cytocompatibility and biocompatibility issues related to graphene formulations and their implications pertinent to clinical cancer management. PMID:26769305

  16. Individualized correction of insulin measurement in hemolyzed serum samples.

    PubMed

    Wu, Zhi-Qi; Lu, Ju; Chen, Huanhuan; Chen, Wensen; Xu, Hua-Guo

    2017-06-01

    Insulin measurement plays a key role in the investigation of patients with hypoglycemia, subtype classification of diabetes mellitus, insulin resistance, and impaired beta cell function. However, even slight hemolysis can negatively affect insulin measurement due to RBC insulin-degrading enzyme (IDE). Here, we derived and validated an individualized correction equation in an attempt to eliminate the effects of hemolysis on insulin measurement. The effects of hemolysis on insulin measurement were studied by adding lysed self-RBCs to serum. A correction equation was derived, accounting for both the percentage and the exposure time of hemolysis. The performance of this individualized correction was evaluated in intentionally hemolyzed samples. Insulin concentration decreased with increasing percentage and exposure time of hemolysis. Based on the effects of hemolysis on insulin measurement in 17 donors (baseline insulin concentrations ranged from 156 to 2119 pmol/L), the individualized hemolysis correction equation was derived: INS_corr = INS_meas / (0.705·lg(Hb_plasma/Hb_serum) - 0.001·Time - 0.612). This equation can revert the insulin concentrations of the intentionally hemolyzed samples to values that were statistically not different from the corresponding baseline insulin concentrations (p = 0.1564). Hemolysis can lead to negative interference in insulin measurement; with the individualized hemolysis correction equation, we can correct and report reliable serum insulin results for a wide range of degrees of sample hemolysis. This correction would increase diagnostic accuracy, reduce inappropriate therapeutic decisions, and improve patient satisfaction with care.
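
    The correction equation can be applied directly. In the sketch below, lg is taken as log10 of the plasma-to-serum free-hemoglobin ratio and Time as the hemolysis exposure time; this grouping is reconstructed from the abstract's typography and should be checked against the original paper:

```python
import math

def correct_insulin(ins_meas, hb_plasma, hb_serum, time_h):
    """Individualized hemolysis correction for a measured insulin value:

        INS_corr = INS_meas / (0.705*lg(Hb_plasma/Hb_serum)
                               - 0.001*Time - 0.612)

    Assumptions: lg is log10 of the plasma-to-serum free-hemoglobin
    ratio, and Time is the hemolysis exposure time in the units used
    when the equation was fitted.
    """
    denom = (0.705 * math.log10(hb_plasma / hb_serum)
             - 0.001 * time_h - 0.612)
    if denom <= 0:
        raise ValueError("correction factor non-positive: outside model range")
    return ins_meas / denom
```

    Guarding against a non-positive denominator matters because the fitted equation is only valid over the range of hemolysis the donors' samples covered.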

  17. Emerging Mitochondrial Therapeutic Targets in Optic Neuropathies.

    PubMed

    Lopez Sanchez, M I G; Crowston, J G; Mackey, D A; Trounce, I A

    2016-09-01

    Optic neuropathies are an important cause of blindness worldwide. The study of the most common inherited mitochondrial optic neuropathies, Leber hereditary optic neuropathy (LHON) and autosomal dominant optic atrophy (ADOA) has highlighted a fundamental role for mitochondrial function in the survival of the affected neuron-the retinal ganglion cell. A picture is now emerging that links mitochondrial dysfunction to optic nerve disease and other neurodegenerative processes. Insights gained from the peculiar susceptibility of retinal ganglion cells to mitochondrial dysfunction are likely to inform therapeutic development for glaucoma and other common neurodegenerative diseases of aging. Despite it being a fast-evolving field of research, a lack of access to human ocular tissues and limited animal models of mitochondrial disease have prevented direct retinal ganglion cell experimentation and delayed the development of efficient therapeutic strategies to prevent vision loss. Currently, there are no approved treatments for mitochondrial disease, including optic neuropathies caused by primary or secondary mitochondrial dysfunction. Recent advances in eye research have provided important insights into the molecular mechanisms that mediate pathogenesis, and new therapeutic strategies including gene correction approaches are currently being investigated. Here, we review the general principles of mitochondrial biology relevant to retinal ganglion cell function and provide an overview of the major optic neuropathies with mitochondrial involvement, LHON and ADOA, whilst highlighting the emerging link between mitochondrial dysfunction and glaucoma. The pharmacological strategies currently being trialed to improve mitochondrial dysfunction in these optic neuropathies are discussed in addition to emerging therapeutic approaches to preserve retinal ganglion cell function. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Speciation in Metal Toxicity and Metal-Based Therapeutics

    PubMed Central

    Templeton, Douglas M.

    2015-01-01

    Metallic elements, ions and compounds produce varying degrees of toxicity in organisms with which they come into contact. Metal speciation is critical to understanding these adverse effects; the adjectives “heavy” and “toxic” are not helpful in describing the biological properties of individual elements, but detailed chemical structures are. As a broad generalization, the metallic form of an element is inert, and the ionic salts are the species that show more significant bioavailability. Yet the salts and other chelates of a metal ion can give rise to quite different toxicities, as exemplified by a range of carcinogenic potential for various nickel species. Another important distinction comes when a metallic element is organified, increasing its lipophilicity and hence its ability to penetrate the blood brain barrier, as is seen, for example, with organic mercury and tin species. Some metallic elements, such as gold and platinum, are themselves useful therapeutic agents in some forms, while other species of the same element can be toxic, thus focusing attention on species interconversions in evaluating metal-based drugs. The therapeutic use of metal-chelating agents introduces new species of the target metal in vivo, and this can affect not only its desired detoxification, but also introduce a potential for further mechanisms of toxicity. Examples of therapeutic iron chelator species are discussed in this context, as well as the more recent aspects of development of chelation therapy for uranium exposure. PMID:29056656

  19. Cell-based therapeutic strategies for multiple sclerosis

    PubMed Central

    Scolding, Neil J; Pasquini, Marcelo; Reingold, Stephen C; Cohen, Jeffrey A; Atkins, Harold; Banwell, Brenda; Bar-Or, Amit; Bebo, Bruce; Bowen, James; Burt, Richard; Calabresi, Peter; Cohen, Jeffrey; Comi, Giancarlo; Connick, Peter; Cross, Anne; Cutter, Gary; Derfuss, Tobias; Ffrench-Constant, Charles; Freedman, Mark; Galipeau, Jacques; Goldman, Myla; Goldman, Steven; Goodman, Andrew; Green, Ari; Griffith, Linda; Hartung, Hans-Peter; Hemmer, Bernhard; Hyun, Insoo; Iacobaeus, Ellen; Inglese, Matilde; Jubelt, Burk; Karussis, Dimitrios; Küry, Patrick; Landsman, Douglas; Laule, Cornelia; Liblau, Roland; Mancardi, Giovanni; Ann Marrie, Ruth; Miller, Aaron; Miller, Robert; Miller, David; Mowry, Ellen; Muraro, Paolo; Nash, Richard; Ontaneda, Daniel; Pasquini, Marcelo; Pelletier, Daniel; Peruzzotti-Jametti, Luca; Pluchino, Stefano; Racke, Michael; Reingold, Stephen; Rice, Claire; Ringdén, Olle; Rovira, Alex; Saccardi, Riccardo; Sadiq, Saud; Sarantopoulos, Stefanie; Savitz, Sean; Scolding, Neil; Soelberg Sorensen, Per; Pia Sormani, Maria; Stuve, Olaf; Tesar, Paul; Thompson, Alan; Trojano, Maria; Uccelli, Antonio; Uitdehaag, Bernard; Utz, Ursula; Vukusic, Sandra; Waubant, Emmanuelle; Wilkins, Alastair

    2017-01-01

    Abstract The availability of multiple disease-modifying medications with regulatory approval to treat multiple sclerosis illustrates the substantial progress made in therapy of the disease. However, all are only partially effective in preventing inflammatory tissue damage in the central nervous system and none directly promotes repair. Cell-based therapies, including immunoablation followed by autologous haematopoietic stem cell transplantation, mesenchymal and related stem cell transplantation, pharmacologic manipulation of endogenous stem cells to enhance their reparative capabilities, and transplantation of oligodendrocyte progenitor cells, have generated substantial interest as novel therapeutic strategies for immune modulation, neuroprotection, or repair of the damaged central nervous system in multiple sclerosis. Each approach has potential advantages but also safety concerns and unresolved questions. Moreover, clinical trials of cell-based therapies present several unique methodological and ethical issues. We summarize here the status of cell-based therapies to treat multiple sclerosis and make consensus recommendations for future research and clinical trials. PMID:29053779

  20. ECHO: A reference-free short-read error correction algorithm

    PubMed Central

    Kao, Wei-Chun; Chan, Andrew H.; Song, Yun S.

    2011-01-01

    Developing accurate, scalable algorithms to improve data quality is an important computational challenge associated with recent advances in high-throughput sequencing technology. In this study, a novel error-correction algorithm, called ECHO, is introduced for correcting base-call errors in short reads, without the need of a reference genome. Unlike most previous methods, ECHO does not require the user to specify parameters whose optimal values are typically unknown a priori. ECHO automatically sets the parameters in the assumed model and estimates error characteristics specific to each sequencing run, while maintaining a running time that is within the range of practical use. ECHO is based on a probabilistic model and is able to assign a quality score to each corrected base. Furthermore, it explicitly models heterozygosity in diploid genomes and provides a reference-free method for detecting bases that originated from heterozygous sites. On both real and simulated data, ECHO is able to improve the accuracy of previous error-correction methods by severalfold to an order of magnitude, depending on the sequence coverage depth and the position in the read. The improvement is most pronounced toward the end of the read, where previous methods become noticeably less effective. Using a whole-genome yeast data set, it is demonstrated here that ECHO is capable of coping with nonuniform coverage. Also, it is shown that using ECHO to perform error correction as a preprocessing step considerably facilitates de novo assembly, particularly in the case of low-to-moderate sequence coverage depth. PMID:21482625
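
    ECHO itself is a probabilistic, read-overlap method. For flavor, the simplest family of reference-free correctors that it improves upon, k-mer-spectrum correction, can be sketched in a few lines (this is emphatically not ECHO's algorithm, just the baseline idea it outperforms):

```python
from collections import Counter

def kmer_counts(reads, k):
    """Count every k-mer across all reads."""
    counts = Counter()
    for r in reads:
        for i in range(len(r) - k + 1):
            counts[r[i:i + k]] += 1
    return counts

def correct_read(read, counts, k, min_count=3):
    """Greedy k-mer-spectrum correction: at each position covered by an
    untrusted k-mer, keep the substitution (if any) that makes every
    overlapping k-mer trusted."""
    read, n = list(read), len(read)

    def window(i):  # all k-mers overlapping position i
        return ["".join(read[j:j + k])
                for j in range(max(0, i - k + 1), min(i, n - k) + 1)]

    for i in range(n):
        if all(counts[s] >= min_count for s in window(i)):
            continue
        orig = read[i]
        for b in "ACGT":
            if b == orig:
                continue
            read[i] = b
            if all(counts[s] >= min_count for s in window(i)):
                break
            read[i] = orig
    return "".join(read)
```

    K-mers seen at least min_count times are "trusted"; a base is changed only if a single substitution makes every k-mer covering it trusted, which is what makes the scheme reference-free.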

  1. 3D Printing to Model Surgical Repair of Complex Congenitally Corrected Transposition of the Great Arteries.

    PubMed

    Sahayaraj, R Anto; Ramanan, Sowmya; Subramanyan, Raghavan; Cherian, Kotturathu Mammen

    2017-01-01

    We report the use of three-dimensional (3D) modeling to plan surgery for physiologic repair of congenitally corrected transposition of the great arteries with pulmonary atresia, dextrocardia, and complex intracardiac anatomy. Based on measurements made from the 3D printed model of the actual patient's anatomy, we anticipated using a composite valved conduit (Dacron tube graft, decellularized bovine jugular vein, and aortic homograft) to establish left ventricle-to-pulmonary artery continuity with relief of stenosis involving the pulmonary artery confluence and bilateral branch pulmonary arteries.

  2. Engineering of Fc Fragments with Optimized Physicochemical Properties Implying Improvement of Clinical Potentials for Fc-Based Therapeutics.

    PubMed

    Yang, Chunpeng; Gao, Xinyu; Gong, Rui

    2017-01-01

    Therapeutic monoclonal antibodies and Fc-fusion proteins are successfully used in the treatment of various diseases, mainly cancer, immune disease, and viral infection; together these constitute the Fc-based therapeutics. In recent years, engineered Fc-derived antibody domains have also shown potential for Fc-based therapeutics. To increase the druggability of Fc-based therapeutic candidates, many efforts have been made to optimize the physicochemical properties and the functions mediated by the Fc fragment. The desired result is to simultaneously obtain Fc variants with improved physicochemical properties in vitro and the capacity to mediate appropriate functions in vivo. However, changes in the physicochemical properties of Fc may alter Fc-mediated functions and vice versa, which can lead to undesired outcomes for further development of Fc-based therapeutics. Therefore, whether modified Fc fragments are suitable for achieving the expected clinical results needs to be seriously considered. This question is now attracting attention and must be resolved to enable better translation of laboratory results into clinical applications. In this review, we summarize different strategies for engineering the physicochemical properties of Fc, and preliminarily elucidate the relationships between Fc modifications in vitro and their subsequent therapeutic influence in vivo.

  3. Insights on Localized and Systemic Delivery of Redox-Based Therapeutics

    PubMed Central

    Batrakova, Elena V.; Mota, Roberto

    2018-01-01

    Reactive oxygen and nitrogen species are indispensable in cellular physiology and signaling. Overproduction of these reactive species, or failure to maintain their levels within the physiological range, results in cellular redox dysfunction, often termed cellular oxidative stress. Redox dysfunction in turn is at the molecular basis of disease etiology and progression. Accordingly, antioxidant intervention to restore redox homeostasis has been pursued as a therapeutic strategy for cardiovascular disease, cancer, and neurodegenerative disorders, among many others. Despite preliminary success in cellular and animal models, redox-based interventions have been virtually ineffective in clinical trials. We propose that the fundamental reason for their failure is a flawed delivery approach: systemic delivery for a geographically local disease limits the effectiveness of the antioxidant. We take a critical look at the literature and evaluate successful and unsuccessful approaches to translating redox intervention to the clinical arena, including dose, patient selection, and delivery approach. We argue that when interpreting a failed antioxidant-based clinical trial, it is crucial to take these variables into account and, importantly, whether the drug had an effect on redox status. Finally, we propose that local and targeted delivery hold promise for translating redox-based therapies from the bench to the bedside. PMID:29636836

  4. The importance of topographically corrected null models for analyzing ecological point processes.

    PubMed

    McDowall, Philip; Lynch, Heather J

    2017-07-01

    Analyses of point process patterns and related techniques (e.g., MaxEnt) make use of the expected number of occurrences per unit area and second-order statistics based on the distance between occurrences. Ecologists working with point process data often assume that points exist on a two-dimensional x-y plane or within a three-dimensional volume, when in fact many observed point patterns are generated on a two-dimensional surface existing within three-dimensional space. For many surfaces, however, such as the topography of landscapes, the projection from the surface to the x-y plane preserves neither area nor distance. As such, when these point patterns are implicitly projected to and analyzed in the x-y plane, our expectations of the point pattern's statistical properties may not be met. When used in hypothesis testing, we find that the failure to account for the topography of the generating surface may bias statistical tests that incorrectly identify clustering and, furthermore, may bias coefficients in inhomogeneous point process models that incorporate slope as a covariate. We demonstrate the circumstances under which this bias is significant, and present simple methods that allow point processes to be simulated with corrections for topography. These point patterns can then be used to generate "topographically corrected" null models against which observed point processes can be compared. © 2017 by the Ecological Society of America.
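
    The projection effect at the heart of the argument is easy to quantify: a planar facet with slope θ has true area 1/cos θ times its x-y projection, so a process that is homogeneous per unit surface area appears inhomogeneous in the plane. A minimal sketch of that correction factor (the helper names are illustrative, not the authors' code):

```python
import numpy as np

def surface_area_factor(slope_deg):
    """Ratio of true surface area to its x-y projection for a planar
    facet with the given slope angle: 1 / cos(slope)."""
    return 1.0 / np.cos(np.radians(slope_deg))

def corrected_intensity(lam_per_surface_area, slope_deg):
    """Expected points per unit *projected* area for a point process
    that is homogeneous per unit *surface* area on sloped terrain."""
    return lam_per_surface_area * surface_area_factor(slope_deg)
```

    Applying this factor cell-by-cell to a slope raster yields the inhomogeneous intensity against which "topographically corrected" null models can be simulated.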

  5. Integrated Patient-Derived Models Delineate Individualized Therapeutic Vulnerabilities of Pancreatic Cancer.

    PubMed

    Witkiewicz, Agnieszka K; Balaji, Uthra; Eslinger, Cody; McMillan, Elizabeth; Conway, William; Posner, Bruce; Mills, Gordon B; O'Reilly, Eileen M; Knudsen, Erik S

    2016-08-16

    Pancreatic ductal adenocarcinoma (PDAC) harbors the worst prognosis of any common solid tumor, and multiple failed clinical trials indicate therapeutic recalcitrance. Here, we use exome sequencing of patient tumors and find multiple conserved genetic alterations. However, the majority of tumors exhibit no clearly defined therapeutic target. High-throughput drug screens using patient-derived cell lines found rare examples of sensitivity to monotherapy, with most models requiring combination therapy. Using PDX models, we confirmed the effectiveness and selectivity of the identified treatment responses. Out of more than 500 single and combination drug regimens tested, no single treatment was effective for the majority of PDAC tumors, and each case had unique sensitivity profiles that could not be predicted using genetic analyses. These data indicate a shortcoming of reliance on genetic analysis to predict efficacy of currently available agents against PDAC and suggest that sensitivity profiling of patient-derived models could inform personalized therapy design for PDAC. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  6. Measurement correction method for force sensor used in dynamic pressure calibration based on artificial neural network optimized by genetic algorithm

    NASA Astrophysics Data System (ADS)

    Gu, Tingwei; Kong, Deren; Shang, Fei; Chen, Jing

    2017-12-01

    We present an optimization algorithm to obtain low-uncertainty dynamic pressure measurements from a force-transducer-based device. In this paper, the advantages and disadvantages of the methods commonly used to measure propellant powder gas pressure, the applicable scope of dynamic pressure calibration devices, and the shortcomings of the traditional comparison calibration method based on the drop-weight device are first analysed in detail. Then, a dynamic calibration method for measuring pressure using a force sensor based on a drop-weight device is introduced. This method can effectively save time when many pressure sensors are calibrated simultaneously and extend the life of expensive reference sensors. However, the force sensor is installed between the drop-weight and the hammerhead by transition pieces through bolt fastening, which causes adverse effects such as additional pretightening and inertia forces. To address these effects, the influence mechanisms of the pretightening force, the inertia force and other factors on the force measurement are theoretically analysed. A measurement correction method for the force measurement is then proposed, based on an artificial neural network optimized by a genetic algorithm. The training and testing data sets are obtained from calibration tests, and the selection criteria for the key parameters of the correction model are discussed. The evaluation results for the test data show that the correction model can effectively improve the force measurement accuracy of the force sensor. Compared with the traditional high-accuracy comparison calibration method, the percentage difference of the impact-force-based measurement is less than 0.6% and the relative uncertainty of the corrected force value is 1.95%, which can meet the requirements of engineering applications.
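    The paper couples a genetic algorithm with an artificial neural network. As a hedged illustration of the optimization loop alone, the toy sketch below uses a GA (truncation selection, arithmetic crossover, Gaussian mutation) to fit a simple linear correction rather than ANN weights; all names and parameter values are hypothetical.

```python
import random

def ga_fit_correction(f_meas, f_ref, pop=40, gens=60, seed=1):
    """Toy genetic algorithm searching coefficients (a, b) of a linear
    correction F_corr = a*F_meas + b that minimizes squared error against
    reference-sensor values. (The paper optimizes ANN weights this way.)"""
    rng = random.Random(seed)

    def loss(ind):
        a, b = ind
        return sum((a * x + b - y) ** 2 for x, y in zip(f_meas, f_ref))

    population = [(rng.uniform(0, 2), rng.uniform(-1, 1)) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=loss)
        parents = population[: pop // 2]           # truncation selection (elitist)
        children = []
        while len(children) < pop - len(parents):
            p1, p2 = rng.sample(parents, 2)
            w = rng.random()                        # arithmetic crossover
            child = tuple(w * u + (1 - w) * v for u, v in zip(p1, p2))
            if rng.random() < 0.3:                  # Gaussian mutation
                child = tuple(g + rng.gauss(0, 0.05) for g in child)
            children.append(child)
        population = parents + children
    return min(population, key=loss)
```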

  7. A Model for Assessing the Liability of Seemingly Correct Software

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.

  8. Scene-based nonuniformity correction algorithm based on interframe registration.

    PubMed

    Zuo, Chao; Chen, Qian; Gu, Guohua; Sui, Xiubao

    2011-06-01

    In this paper, we present a simple and effective scene-based nonuniformity correction (NUC) method for infrared focal plane arrays based on interframe registration. This method estimates the global translation between two adjacent frames and minimizes the mean square error between the two properly registered images to make any two detectors with the same scene produce the same output value. In this way, the accumulation of the registration error can be avoided and the NUC can be achieved. The advantages of the proposed algorithm lie in its low computational complexity and storage requirements and ability to capture temporal drifts in the nonuniformity parameters. The performance of the proposed technique is thoroughly studied with infrared image sequences with simulated nonuniformity and infrared imagery with real nonuniformity. It shows a significantly fast and reliable fixed-pattern noise reduction and obtains an effective frame-by-frame adaptive estimation of each detector's gain and offset.
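    A minimal sketch of the registration-based idea, under strong simplifying assumptions (a 1-D detector array, a known one-pixel shift between frames, noise-free observations): co-registered samples of the same scene position are averaged into a scene estimate, and each detector's gain and offset are then fitted to its (estimate, observation) pairs. The function names are illustrative, not from the paper.

```python
def nuc_two_frame(frame0, frame1):
    """Two registered frames of the same 1-D scene, shifted by one detector:
    frame0[d] views scene position d, frame1[d] views position d+1.
    Average the co-located samples to estimate the scene, then fit each
    detector's gain/offset to its two (estimate, observation) pairs."""
    n = len(frame0)                      # number of detectors
    # Scene positions seen by two detectors: p = 1 .. n-1
    # (detector p in frame0 and detector p-1 in frame1)
    est = {p: 0.5 * (frame0[p] + frame1[p - 1]) for p in range(1, n)}
    gains, offsets = [None] * n, [None] * n
    for d in range(1, n - 1):            # detectors with two usable pairs
        s0, s1 = est[d], est[d + 1]      # scene estimates seen in frames 0, 1
        o0, o1 = frame0[d], frame1[d]
        g = (o1 - o0) / (s1 - s0)        # assumes the scene varies (s1 != s0)
        gains[d], offsets[d] = g, o0 - g * s0
    return gains, offsets

def correct(frame, gains, offsets):
    """Invert the fitted response; detectors without estimates pass through."""
    return [(frame[d] - offsets[d]) / gains[d]
            if gains[d] is not None else frame[d]
            for d in range(len(frame))]
```

    After correction, any two detectors that viewed the same scene value produce the same output, which is the property the algorithm exploits.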

  9. Accuracy Improvement of Multi-Axis Systems Based on Laser Correction of Volumetric Geometric Errors

    NASA Astrophysics Data System (ADS)

    Teleshevsky, V. I.; Sokolov, V. A.; Pimushkin, Ya I.

    2018-04-01

    The article describes a volumetric geometric error correction method for CNC-controlled multi-axis systems (machine tools, CMMs, etc.). Kalman's concept of "Control and Observation" is used. A versatile multi-function laser interferometer serves as the Observer, measuring the machine's error functions. A systematic error map of the machine's workspace is produced from these error-function measurements, and the map in turn yields an error correction strategy. The article proposes a new method of forming this strategy, based on the error distribution within the machine's workspace and a CNC-program postprocessor. The postprocessor provides minimal error values within the maximal workspace zone. The results are confirmed by error correction of precision CNC machine tools.

  10. An assessment of the factors affecting the commercialization of cell-based therapeutics: a systematic review protocol.

    PubMed

    Pettitt, David; Arshad, Zeeshaan; Davies, Benjamin; Smith, James; French, Anna; Cole, Doug; Bure, Kim; Dopson, Sue; DiGiusto, David; Karp, Jeff; Reeve, Brock; Barker, Richard; Holländer, Georg; Brindley, David

    2017-06-26

    Cellular-based therapies represent a platform technology within the rapidly expanding field of regenerative medicine and are distinct from conventional therapeutics, offering a unique approach to managing what were once considered untreatable diseases. Despite a significant increase in basic science activity within the cell therapy arena, alongside a growing portfolio of cell therapy trials and promising investment, the translation of cellular-based therapeutics from "bench to bedside" remains challenging, and the number of industry products available for widespread clinical use remains comparatively low. This systematic review will identify unique intrinsic and extrinsic barriers in the cell-based therapy domain. Eight electronic databases will be searched, specifically Medline, EMBASE (OvidSP), BIOSIS & Web of Science, Cochrane Library & HEED, EconLit (ProQuest), WHOLIS WHO Library Database, PAIS International (ProQuest), and Scopus. In addition, gray literature will be searched by manually reviewing relevant work. All identified articles will be subjected to review by two authors, who will decide whether each article passes our inclusion/exclusion criteria. Eligible papers will subsequently be reviewed, and key data extracted into a pre-designed data extraction scorecard. An assessment of the perceived impact of broad commercial barriers to the adoption of cell-based therapies will be conducted. These broad categories will include manufacturing, regulation and intellectual property, reimbursement, clinical trials, clinical adoption, ethics, and business models. This will inform further discussion in the review. There is no PROSPERO registration number. Through a systematic search and appraisal of the available literature, this review will identify key challenges in the commercialization pathway of cellular-based therapeutics and highlight significant barriers impeding successful clinical adoption. This will aid in creating an adaptable, acceptable, and

  11. Determining spherical lens correction for astronaut training underwater.

    PubMed

    Porter, Jason; Gibson, C Robert; Strauss, Samuel

    2011-09-01

    To develop a model that will accurately predict the distance spherical lens correction needed to be worn by National Aeronautics and Space Administration astronauts while training underwater. The replica space suit's helmet contains curved visors that induce refractive power when submersed in water. Anterior surface powers and thicknesses were measured for the helmet's protective and inside visors. The impact of each visor on the helmet's refractive power in water was analyzed using thick lens calculations and Zemax optical design software. Using geometrical optics approximations, a model was developed to determine the optimal distance spherical power needed to be worn underwater based on the helmet's total induced spherical power underwater and the astronaut's manifest spectacle plane correction in air. The validity of the model was tested using data from both eyes of 10 astronauts who trained underwater. The helmet's visors induced a total power of -2.737 D when placed underwater. The required underwater spherical correction (FW) was linearly related to the spectacle plane spherical correction in air (FAir): FW = FAir + 2.356 D. The mean magnitude of the difference between the actual correction worn underwater and the calculated underwater correction was 0.20 ± 0.11 D. The actual and calculated values were highly correlated (r = 0.971) with 70% of eyes having a difference in magnitude of <0.25 D between values. We devised a model to calculate the spherical spectacle lens correction needed to be worn underwater by National Aeronautics and Space Administration astronauts. The model accurately predicts the actual values worn underwater and can be applied (more generally) to determine a suitable spectacle lens correction to be worn behind other types of masks when submerged underwater.
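    The paper's linear relation, FW = FAir + 2.356 D, reduces to a one-line calculation. The sketch below encodes it together with the reported total visor power; the constant name is illustrative.

```python
HELMET_POWER_D = -2.737  # total visor-induced power underwater (diopters), per the paper

def underwater_correction(f_air_d):
    """Spherical lens power (diopters) to wear underwater, from the paper's
    linear relation F_W = F_Air + 2.356 D, where F_Air is the astronaut's
    manifest spectacle-plane correction in air."""
    return f_air_d + 2.356
```

    For example, a -3.00 D spectacle wearer would need about -0.64 D underwater, within the paper's reported mean error of 0.20 D.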

  12. Oral Immunization with a Multivalent Epitope-Based Vaccine, Based on NAP, Urease, HSP60, and HpaA, Provides Therapeutic Effect on H. pylori Infection in Mongolian gerbils.

    PubMed

    Guo, Le; Yang, Hua; Tang, Feng; Yin, Runting; Liu, Hongpeng; Gong, Xiaojuan; Wei, Jun; Zhang, Ying; Xu, Guangxian; Liu, Kunmei

    2017-01-01

    Epitope-based vaccine is a promising strategy for therapeutic vaccination against Helicobacter pylori (H. pylori) infection. A multivalent subunit vaccine containing various antigens from H. pylori is superior to a univalent subunit vaccine. However, whether a multivalent epitope-based vaccine is superior to a univalent epitope-based vaccine in therapeutic vaccination against H. pylori remains unclear. In this study, a multivalent epitope-based vaccine named CWAE against H. pylori urease, neutrophil-activating protein (NAP), heat shock protein 60 (HSP60) and H. pylori adhesin A (HpaA) was constructed based on mucosal adjuvant cholera toxin B subunit (CTB), Th1-type adjuvant NAP, multiple copies of selected B and Th cell epitopes (UreA27-53, UreA183-203, HpaA132-141, and HSP60189-203), and also the epitope-rich regions of urease B subunit (UreB158-251 and UreB321-385) predicted by bioinformatics. Immunological properties of the CWAE vaccine were characterized in a BALB/c mouse model. Its therapeutic effect was evaluated in an H. pylori-infected Mongolian gerbil model by comparison with a univalent epitope-based vaccine, CTB-UE, against H. pylori urease that was constructed in our previous studies. Both CWAE and CTB-UE could induce similar levels of specific antibodies against H. pylori urease, and had a similar inhibition effect on H. pylori urease activity. However, only CWAE could induce high levels of specific antibodies to NAP, HSP60, HpaA, and also the synthetic peptide epitopes (UreB158-172, UreB181-195, UreB211-225, UreB349-363, HpaA132-141, and HSP60189-203). In addition, oral therapeutic immunization with CWAE significantly reduced the number of H. pylori colonies in the stomach of Mongolian gerbils, compared with oral immunization using CTB-UE or H. pylori urease. The protection of CWAE was associated with higher levels of mixed CD4+ T cell (Th cell) response, IgG, and secretory IgA (sIgA) antibodies to H. pylori. These results indicate

  13. Oral Immunization with a Multivalent Epitope-Based Vaccine, Based on NAP, Urease, HSP60, and HpaA, Provides Therapeutic Effect on H. pylori Infection in Mongolian gerbils

    PubMed Central

    Guo, Le; Yang, Hua; Tang, Feng; Yin, Runting; Liu, Hongpeng; Gong, Xiaojuan; Wei, Jun; Zhang, Ying; Xu, Guangxian; Liu, Kunmei

    2017-01-01

    Epitope-based vaccine is a promising strategy for therapeutic vaccination against Helicobacter pylori (H. pylori) infection. A multivalent subunit vaccine containing various antigens from H. pylori is superior to a univalent subunit vaccine. However, whether a multivalent epitope-based vaccine is superior to a univalent epitope-based vaccine in therapeutic vaccination against H. pylori, remains unclear. In this study, a multivalent epitope-based vaccine named CWAE against H. pylori urease, neutrophil-activating protein (NAP), heat shock protein 60 (HSP60) and H. pylori adhesin A (HpaA) was constructed based on mucosal adjuvant cholera toxin B subunit (CTB), Th1-type adjuvant NAP, multiple copies of selected B and Th cell epitopes (UreA27–53, UreA183–203, HpaA132–141, and HSP60189–203), and also the epitope-rich regions of urease B subunit (UreB158–251 and UreB321–385) predicted by bioinformatics. Immunological properties of CWAE vaccine were characterized in BALB/c mice model. Its therapeutic effect was evaluated in H. pylori-infected Mongolian gerbil model by comparing with a univalent epitope-based vaccine CTB-UE against H. pylori urease that was constructed in our previous studies. Both CWAE and CTB-UE could induce similar levels of specific antibodies against H. pylori urease, and had similar inhibition effect of H. pylori urease activity. However, only CWAE could induce high levels of specific antibodies to NAP, HSP60, HpaA, and also the synthetic peptides epitopes (UreB158–172, UreB181–195, UreB211–225, UreB349–363, HpaA132–141, and HSP60189–203). In addition, oral therapeutic immunization with CWAE significantly reduced the number of H. pylori colonies in the stomach of Mongolian gerbils, compared with oral immunization using CTB-UE or H. pylori urease. The protection of CWAE was associated with higher levels of mixed CD4+ T cell (Th cell) response, IgG, and secretory IgA (sIgA) antibodies to H. pylori. These results indicate that a

  14. Therapeutic target discovery using Boolean network attractors: improvements of kali

    PubMed Central

    Guziolowski, Carito

    2018-01-01

    In a previous article, an algorithm for identifying therapeutic targets in Boolean networks modelling pathological mechanisms was introduced. In the present article, the improvements made on this algorithm, named kali, are described. These improvements are (i) the possibility to work on asynchronous Boolean networks, (ii) a finer assessment of therapeutic targets and (iii) the possibility to use multivalued logic. kali assumes that the attractors of a dynamical system, such as a Boolean network, are associated with the phenotypes of the modelled biological system. Given a logic-based model of pathological mechanisms, kali searches for therapeutic targets able to reduce the reachability of the attractors associated with pathological phenotypes, thus reducing their likelihood. kali is illustrated on an example network and used on a biological case study. The case study is a published logic-based model of bladder tumorigenesis, from which kali returns consistent results. However, like any computational tool, kali can predict but cannot replace human expertise: it is a supporting tool for coping with the complexity of biological systems in the field of drug discovery. PMID:29515890
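    For a small synchronous Boolean network, the attractors that such tools reason about can be enumerated exhaustively, since every trajectory in the finite state space eventually enters a cycle. The sketch below is a generic illustration of that idea, not kali's implementation.

```python
from itertools import product

def attractors(update, n):
    """Enumerate attractors of a synchronous Boolean network with n nodes.
    `update` maps a state tuple to the next state tuple. From every initial
    state, iterate until a state repeats, then collect the cycle reached."""
    found = set()
    for state in product((0, 1), repeat=n):
        seen = set()
        while state not in seen:
            seen.add(state)
            state = update(state)
        # `state` was revisited, so it lies on the cycle: walk the cycle once
        cycle, s = [], state
        while True:
            cycle.append(s)
            s = update(s)
            if s == state:
                break
        found.add(frozenset(cycle))   # canonical, order-free representation
    return found
```

    A fixed point shows up as a one-state attractor; a limit cycle as a larger set. Target identification then amounts to checking which interventions remove or shrink the basins of the pathological attractors.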

  15. An Inherent-Optical-Property-Centered Approach to Correct the Angular Effects in Water-Leaving Radiance

    DTIC Science & Technology

    2011-07-01

    10%. These results demonstrate that the IOP-based BRDF correction scheme (which is composed of the R„ model along with the IOP retrieval...distribution was averaged over 10 min 5. Validation of the IOP-Based BRDF Correction Scheme The IOP-based BRDF correction scheme is applied to both...oceanic and coastal waters were very consistent qualitatively and quantitatively and thus validate the IOP-based BRDF correction system, at least

  16. The accuracy of climate models' simulated season lengths and the effectiveness of grid scale correction factors

    DOE PAGES

    Winterhalter, Wade E.

    2011-09-01

    Global climate change is expected to impact biological populations through a variety of mechanisms including increases in the length of their growing season. Climate models are useful tools for predicting how season length might change in the future. However, the accuracy of these models tends to be rather low at regional geographic scales. Here, I determined the ability of several atmosphere and ocean general circulating models (AOGCMs) to accurately simulate historical season lengths for a temperate ectotherm across the continental United States. I also evaluated the effectiveness of regional-scale correction factors to improve the accuracy of these models. I found that both the accuracy of simulated season lengths and the effectiveness of the correction factors to improve the model's accuracy varied geographically and across models. These results suggest that regional specific correction factors do not always adequately remove potential discrepancies between simulated and historically observed environmental parameters. As such, an explicit evaluation of the correction factors' effectiveness should be included in future studies of global climate change's impact on biological populations.

  17. ITG: A New Global GNSS Tropospheric Correction Model

    PubMed Central

    Yao, Yibin; Xu, Chaoqian; Shi, Junbo; Cao, Na; Zhang, Bao; Yang, Junjian

    2015-01-01

    Tropospheric correction models are receiving increasing attention, as they play a crucial role in Global Navigation Satellite System (GNSS) positioning. The most commonly used models to date include the GPT2 series and TropGrid2. In this study, we analyzed the advantages and disadvantages of existing models and developed a new model called the Improved Tropospheric Grid (ITG). ITG considers annual, semi-annual and diurnal variations, and includes multiple tropospheric parameters. The amplitude and initial phase of the diurnal variation are estimated as a periodic function. ITG provides temperature, pressure, the weighted mean temperature (Tm) and Zenith Wet Delay (ZWD). We conducted a performance comparison among the proposed ITG model and previous ones, in terms of meteorological measurements from 698 observation stations, Zenith Total Delay (ZTD) products from 280 International GNSS Service (IGS) stations and Tm from Global Geodetic Observing System (GGOS) products. Results indicate that ITG offers the best performance on the whole. PMID:26196963

  18. PSF mapping-based correction of eddy-current-induced distortions in diffusion-weighted echo-planar imaging.

    PubMed

    In, Myung-Ho; Posnansky, Oleg; Speck, Oliver

    2016-05-01

    To accurately correct diffusion-encoding direction-dependent eddy-current-induced geometric distortions in diffusion-weighted echo-planar imaging (DW-EPI) and to minimize the calibration time at 7 Tesla (T). A point spread function (PSF) mapping based eddy-current calibration method is newly presented to determine eddy-current-induced geometric distortions even including nonlinear eddy-current effects within the readout acquisition window. To evaluate the temporal stability of eddy-current maps, calibration was performed four times within 3 months. Furthermore, spatial variations of measured eddy-current maps versus their linear superposition were investigated to enable correction in DW-EPIs with arbitrary diffusion directions without direct calibration. For comparison, an image-based eddy-current correction method was additionally applied. Finally, this method was combined with a PSF-based susceptibility-induced distortion correction approach proposed previously to correct both susceptibility and eddy-current-induced distortions in DW-EPIs. Very fast eddy-current calibration in a three-dimensional volume is possible with the proposed method. The measured eddy-current maps are very stable over time and very similar maps can be obtained by linear superposition of principal-axes eddy-current maps. High resolution in vivo brain results demonstrate that the proposed method allows more efficient eddy-current correction than the image-based method. The combination of both PSF-based approaches allows distortion-free images, which permit reliable analysis in diffusion tensor imaging applications at 7T. © 2015 Wiley Periodicals, Inc.

  19. Spectral matching research for light-emitting diode-based neonatal jaundice therapeutic device light source

    NASA Astrophysics Data System (ADS)

    Gan, Ruting; Guo, Zhenning; Lin, Jieben

    2015-09-01

    To decrease the risk of bilirubin encephalopathy and minimize the need for exchange transfusions, we report a novel design for the light source of a light-emitting diode (LED)-based neonatal jaundice therapeutic device (NJTD). The in vivo bilirubin absorption spectrum was taken as the target. Based on spectral constructing theory, we used commercially available LEDs with different peak wavelengths and full widths at half maximum as matching light sources. A simple genetic algorithm was proposed as the spectral matching method. The required number of LEDs at each peak wavelength was calculated, and a sample model of the device's light source was then fabricated to confirm the spectral matching technology. The corresponding spectrum was measured and the effect analyzed. The results showed that the fitted spectrum was very similar to the target spectrum, with a 98.86 % matching degree, and the actual device model has a spectrum close to the target, with a 96.02 % matching degree. With its high fitting degree and efficiency, this matching algorithm is well suited to light-source matching for LED-based spectral distributions, and the in vivo bilirubin absorption spectrum is a promising candidate for the target spectrum of a new LED-based NJTD light source.

  20. A method of measuring and correcting tilt of anti-vibration wind turbines based on screening algorithm

    NASA Astrophysics Data System (ADS)

    Xiao, Zhongxiu

    2018-04-01

    A method of measuring and correcting the tilt of anti-vibration wind turbines based on a screening algorithm is proposed in this paper. First, we design a device whose core is the ADXL203 acceleration sensor; the inclination is measured by installing the device on the tower of the wind turbine as well as in the nacelle. Next, a Kalman filter is used to filter the signal effectively by establishing a state-space model for signal and noise, and MATLAB is used for simulation. Considering the impact of tower and nacelle vibration on the collected data, the original data and the filtered data are classified and stored by the screening algorithm, and the filtered data are then filtered again to make the output more accurate. Finally, installation errors are eliminated algorithmically to achieve the tilt correction. A device based on this method has the advantages of high precision, low cost and vibration resistance, and has a wide range of application and promotion value.
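    The Kalman filtering step can be sketched for the scalar case: a constant-inclination state model with process variance q and measurement variance r. The parameter values below are illustrative, not taken from the paper.

```python
def kalman_1d(zs, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a slowly varying inclination angle:
    state model x_k = x_{k-1} + w (process variance q), measurement
    z_k = x_k + v (measurement variance r). Returns filtered estimates."""
    x, p = x0, p0
    out = []
    for z in zs:
        p += q                 # predict: state variance grows by q
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # update with the innovation
        p *= (1 - k)           # posterior variance
        out.append(x)
    return out
```

    With vibration acting as measurement noise, the filtered output settles near the true static tilt while the raw samples keep oscillating.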

  1. Community-Based Correctional Education

    ERIC Educational Resources Information Center

    Office of Vocational and Adult Education, US Department of Education, 2011

    2011-01-01

    Although it is known that many persons under community supervision need and eventually want correctional education programs, little is known about the providers and characteristics of these educational programs. This report provides an overview of initiatives at the national and state levels supporting new approaches to community supervision and…

  2. Gummy smile: clinical parameters useful for diagnosis and therapeutical approach.

    PubMed

    Monaco, Annalisa; Streni, Oriana; Marci, Maria Chiara; Marzo, Giuseppe; Gatto, Roberto; Giannoni, Mario

    2004-01-01

    In the analysis of the characteristics of a pleasant smile, a gummy smile has negative components that most affect the esthetics of non-verbal communication. For this purpose, a classification based upon etiopathogenetic criteria is proposed as a useful guide to the therapeutic approach. The nature of a high smile line can be: dento-gingival, connected to an abnormal dental eruption, which is revealed by a short clinical crown; muscular, caused by hyperactivity of the elevator muscle of the upper lip; dento-alveolar (skeletal), due to an excessive protuberance or vertical growth of the jawbone (maxilla); or, lastly, mixed, in the presence of more than one of the above factors. The diagnosis of gummy smile must be early and based, with reference to specific parameters, upon a careful analysis of the etiopathogenetic factors and the degree of severity of the alteration. A correct treatment plan must contemplate the possibility of an orthognathodontic, orthopedic and/or surgical therapeutic resolution, considering the severity and complexity of the gum exposure (high smile line) in connection with the age of the subject.

  3. Engineering of Fc Fragments with Optimized Physicochemical Properties Implying Improvement of Clinical Potentials for Fc-Based Therapeutics

    PubMed Central

    Yang, Chunpeng; Gao, Xinyu; Gong, Rui

    2018-01-01

    Therapeutic monoclonal antibodies and Fc-fusion proteins, which together constitute the Fc-based therapeutics, are successfully used in the treatment of various diseases, mainly including cancer, immune disease, and viral infection. In recent years, engineered Fc-derived antibody domains have also shown potential for Fc-based therapeutics. To increase the druggability of Fc-based therapeutic candidates, many efforts have been made to optimize the physicochemical properties and functions mediated by the Fc fragment. The desired result is to simultaneously obtain Fc variants with improved physicochemical properties in vitro and the capacity to mediate appropriate functions in vivo. However, changes in the physicochemical properties of Fc may result in alteration of Fc-mediated functions and vice versa, which leads to undesired outcomes for further development of Fc-based therapeutics. Therefore, whether modified Fc fragments are suitable for achieving the expected clinical results needs to be seriously considered. This question is now being recognized and must be worked out to better translate laboratory results into clinical applications. In this review, we summarize different strategies for engineering the physicochemical properties of Fc, and preliminarily elucidate the relationships between modified Fc in vitro and the subsequent therapeutic influence in vivo. PMID:29375551

  4. Multipole correction of atomic monopole models of molecular charge distribution. I. Peptides

    NASA Technical Reports Server (NTRS)

    Sokalski, W. A.; Keller, D. A.; Ornstein, R. L.; Rein, R.

    1993-01-01

    The defects in atomic monopole models of molecular charge distribution have been analyzed for several model-blocked peptides and compared with accurate quantum chemical values. The results indicate that the angular characteristics of the molecular electrostatic potential around functional groups capable of forming hydrogen bonds can be considerably distorted within various models relying upon isotropic atomic charges only. It is shown that these defects can be corrected by augmenting the atomic point charge models by cumulative atomic multipole moments (CAMMs). Alternatively, sets of off-center atomic point charges could be automatically derived from respective multipoles, providing approximately equivalent corrections. For the first time, correlated atomic multipoles have been calculated for N-acetyl, N'-methylamide-blocked derivatives of glycine, alanine, cysteine, threonine, leucine, lysine, and serine using the MP2 method. The role of the correlation effects in the peptide molecular charge distribution are discussed.

  5. Improvement of Klobuchar model for GNSS single-frequency ionospheric delay corrections

    NASA Astrophysics Data System (ADS)

    Wang, Ningbo; Yuan, Yunbin; Li, Zishen; Huo, Xingliang

    2016-04-01

    Broadcast ionospheric model is currently an effective approach to mitigate the ionospheric time delay for real-time Global Navigation Satellite System (GNSS) single-frequency users. Klobuchar coefficients transmitted in Global Positioning System (GPS) navigation message have been widely used in various GNSS positioning and navigation applications; however, this model can only reduce the ionospheric error by approximately 50% in mid-latitudes. With the emerging BeiDou and Galileo, as well as the modernization of GPS and GLONASS, more precise ionospheric correction models or algorithms are required by GNSS single-frequency users. Numerical analysis of the initial phase and nighttime term in Klobuchar algorithm demonstrates that more parameters should be introduced to better describe the variation of nighttime ionospheric total electron content (TEC). In view of this, several schemes are proposed for the improvement of Klobuchar algorithm. Performance of these improved Klobuchar-like models are validated over the continental and oceanic regions during high (2002) and low (2006) levels of solar activities, respectively. Over the continental region, GPS TEC generated from 35 International GNSS Service (IGS) and the Crust Movement Observation Network of China (CMONOC) stations are used as references. Over the oceanic region, TEC data from TOPEX/Poseidon and JASON-1 altimeters are used for comparison. A ten-parameter Klobuchar-like model, which describes the nighttime term as a linear function of geomagnetic latitude, is finally proposed for GNSS single-frequency ionospheric corrections. Compared to GPS TEC, while GPS broadcast model can correct for 55.0% and 49.5% of the ionospheric delay for the year 2002 and 2006, respectively, the proposed ten-parameter Klobuchar-like model can reduce the ionospheric error by 68.4% and 64.7% for the same period. Compared to TOPEX/Poseidon and JASON-1 TEC, the improved ten-parameter Klobuchar-like model can mitigate the ionospheric
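    The paper's central proposal, a nighttime term that varies with geomagnetic latitude, can be sketched on top of a simplified Klobuchar-style vertical delay (a daytime half-cosine bump over a constant night floor). All coefficient values below are hypothetical, chosen only to illustrate the model's shape.

```python
import math

def klobuchar_like_delay(t_sec, amp, period, night_a=5.0e-9, night_b=0.0,
                         geomag_lat_deg=0.0, phase=50400.0):
    """Illustrative Klobuchar-style vertical ionospheric delay (seconds).
    Daytime: half-cosine bump of amplitude `amp` (s) and period `period` (s)
    centred on local 14:00 (`phase` = 50400 s). Nighttime: constant floor,
    here made latitude-dependent (night_a + night_b * |geomag_lat|) in the
    spirit of the ten-parameter model; coefficient values are hypothetical."""
    night = night_a + night_b * abs(geomag_lat_deg)
    x = 2.0 * math.pi * (t_sec - phase) / period
    if abs(x) < math.pi / 2:     # daytime half-cycle of the cosine
        return night + amp * math.cos(x)
    return night                 # nighttime floor
```

    In the GPS broadcast model the amplitude and period are themselves polynomials in geomagnetic latitude; the sketch fixes them to keep the latitude-dependent night term in focus.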

  6. Effect of therapeutic touch on brain activation of preterm infants in response to sensory punctate stimulus: a near-infrared spectroscopy-based study.

    PubMed

    Honda, Noritsugu; Ohgi, Shohei; Wada, Norihisa; Loo, Kek Khee; Higashimoto, Yuji; Fukuda, Kanji

    2013-05-01

    The purpose of this study was to determine whether therapeutic touch in preterm infants can ameliorate their response to a sensory punctate stimulus, in terms of brain activation measured by near-infrared spectroscopy. The study included 10 preterm infants at 34-40 weeks' corrected age. Oxyhaemoglobin (Oxy-Hb) concentration, heart rate (HR), arterial oxygen saturation (SaO2) and body movements were recorded during low-intensity sensory punctate stimulation for 1 s, with and without therapeutic touch by a neonatal development specialist nurse. Each stimulation was followed by a resting phase of 30 s. All measurements were performed with the infants asleep in the prone position. Sensory punctate stimulus exposure significantly increased the Oxy-Hb concentration but did not affect HR, SaO2 or body movements. The infants receiving therapeutic touch had significantly decreased Oxy-Hb concentrations over time. Therapeutic touch in preterm infants can thus ameliorate their brain-activation response to a sensory punctate stimulus, as indicated by cerebral oxygenation. Therefore, therapeutic touch may have a protective effect on the autoregulation of cerebral blood flow during sensory punctate stimulation in neonates.

  7. Scene-based nonuniformity correction with video sequences and registration.

    PubMed

    Hardie, R C; Hayat, M M; Armstrong, E; Yasuda, B

    2000-03-10

    We describe a new, to our knowledge, scene-based nonuniformity correction algorithm for array detectors. The algorithm relies on the ability to register a sequence of observed frames in the presence of the fixed-pattern noise caused by pixel-to-pixel nonuniformity. In low-to-moderate levels of nonuniformity, sufficiently accurate registration may be possible with standard scene-based registration techniques. If the registration is accurate, and motion exists between the frames, then groups of independent detectors can be identified that observe the same irradiance (or true scene value). These detector outputs are averaged to generate estimates of the true scene values. With these scene estimates, and the corresponding observed values through a given detector, a curve-fitting procedure is used to estimate the individual detector response parameters. These can then be used to correct for detector nonuniformity. The strength of the algorithm lies in its simplicity and low computational complexity. Experimental results, to illustrate the performance of the algorithm, include the use of visible-range imagery with simulated nonuniformity and infrared imagery with real nonuniformity.
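    The per-detector curve-fitting step can be illustrated with the simplest response model, a linear gain and offset. Below is a minimal sketch assuming registration has already supplied true-scene estimates for one detector; all values are synthetic, not from the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# One detector with a linear response y = gain*z + offset (its share of the
# fixed-pattern nonuniformity), observing true scene values z that frame
# registration would supply from overlapping, moving frames.
true_scene = rng.uniform(10.0, 100.0, size=200)
gain_true, offset_true = 1.3, -4.0
observed = gain_true * true_scene + offset_true + rng.normal(0.0, 0.5, 200)

# Curve-fitting step: least-squares estimate of the response parameters
gain_est, offset_est = np.polyfit(true_scene, observed, 1)

# Correction step: invert the fitted response to recover scene values
corrected = (observed - offset_est) / gain_est
```

    Repeating this fit independently for every pixel removes the pixel-to-pixel nonuniformity, which is why the method stays computationally cheap.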

  8. Travel cost demand model based river recreation benefit estimates with on-site and household surveys: Comparative results and a correction procedure

    NASA Astrophysics Data System (ADS)

    Loomis, John

    2003-04-01

    Past recreation studies have noted that on-site or visitor intercept surveys are subject to over-sampling of avid users (i.e., endogenous stratification) and have offered econometric solutions to correct for this. However, past papers do not estimate the empirical magnitude of the bias in benefit estimates with a real data set, nor do they compare the corrected estimates to benefit estimates derived from a population sample. This paper empirically examines the magnitude of the recreation benefits per trip bias by comparing estimates from an on-site river visitor intercept survey to a household survey. The difference in average benefits is quite large, with the on-site visitor survey yielding $24 per day trip, while the household survey yields $9.67 per day trip. A simple econometric correction for endogenous stratification in our count data model lowers the benefit estimate to $9.60 per day trip, a mean value nearly identical and not statistically different from the household survey estimate.
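    For a Poisson trip-demand model, a common form of this correction (due to Shaw) exploits the fact that the on-site, size-biased version of a Poisson(lambda) count is distributed as 1 + Poisson(lambda), so the model is simply fit to trips minus one. A small synthetic illustration of the avidity bias and this correction; the numbers are illustrative, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(42)
lam = 3.0  # true mean trips per person in the population

# Population of potential visitors and their annual trip counts
pop_trips = rng.poisson(lam, size=200_000)
visitors = pop_trips[pop_trips > 0]

# On-site intercept: a visitor's chance of being sampled is proportional
# to how often they visit (endogenous stratification / avidity bias)
probs = visitors / visitors.sum()
onsite = rng.choice(visitors, size=50_000, p=probs)

mean_pop = pop_trips.mean()          # close to lam
mean_onsite = onsite.mean()          # inflated to about lam + 1
mean_corrected = (onsite - 1).mean() # the shift correction recovers lam
```

    The same logic is what a truncated, endogenously stratified count-data likelihood implements formally.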

  9. A Web-Based Therapeutic Workplace for the Treatment of Drug Addiction and Chronic Unemployment

    ERIC Educational Resources Information Center

    Silverman, Kenneth; Wong, Conrad J.; Grabinski, Michael J.; Hampton, Jacqueline; Sylvest, Christine E.; Dillon, Erin M.; Wentland, R. Daniel

    2005-01-01

    This article describes a Web-based therapeutic workplace intervention designed to promote heroin and cocaine abstinence and train and employ participants as data entry operators. Patients are paid to participate in training and then to perform data entry jobs in a therapeutic workplace business. Salary is linked to abstinence by requiring patients…

  10. Modeling of Adaptive Optics-Based Free-Space Communications Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilks, S C; Morris, J R; Brase, J M

    2002-08-06

    We introduce a wave-optics-based simulation code for air-optic laser communication links that includes a detailed model of an adaptive optics compensation system. We present results from this model in which the phase of a communications laser beam is corrected after it propagates through a turbulent atmosphere. The phase of the received laser beam is measured using a Shack-Hartmann wavefront sensor, and the correction is applied with a MEMS mirror. Strehl improvement and the power coupled into the receiving fiber are presented for both 1-km horizontal and 28-km slant paths.

  11. Optical conductivity calculation of a k.p model semiconductor GaAs incorporating first-order electron-hole vertex correction

    NASA Astrophysics Data System (ADS)

    Nurhuda, Maryam; Aziz Majidi, Muhammad

    2018-04-01

    Excitons in semiconducting materials are of interest for potential applications. Experimental results show that excitonic signals appear in the optical absorption spectra of narrow-gap semiconductors such as gallium arsenide (GaAs). On the theoretical side, calculations of optical spectra based purely on Density Functional Theory (DFT), without taking electron-hole (e-h) interactions into account, do not produce any excitonic signal. Existing DFT-based algorithms that include a full vertex correction through the Bethe-Salpeter equation can reveal an excitonic signal, but they do not provide a way to analyze it further. Motivated to provide a way to isolate the excitonic effect in the optical response theoretically, we develop a method of calculation for the optical conductivity of the narrow band-gap semiconductor GaAs within the 8-band k.p model that includes electron-hole interactions through a first-order electron-hole vertex correction. Our calculation confirms that the first-order e-h vertex correction reveals an excitonic signal around 1.5 eV (the band gap edge), consistent with the experimental data.

  12. Bias correction in the realized stochastic volatility model for daily volatility on the Tokyo Stock Exchange

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2018-06-01

    The realized stochastic volatility model has been introduced to estimate more accurate volatility by using both daily returns and realized volatility. The main advantage of the model is that no special bias-correction factor for the realized volatility is required a priori. Instead, the model introduces a bias-correction parameter responsible for the bias hidden in realized volatility. We empirically investigate the bias-correction parameter for realized volatilities calculated at various sampling frequencies for six stocks on the Tokyo Stock Exchange, and then show that the dynamic behavior of the bias-correction parameter as a function of sampling frequency is qualitatively similar to that of the Hansen-Lunde bias-correction factor although their values are substantially different. Under the stochastic diffusion assumption of the return dynamics, we investigate the accuracy of estimated volatilities by examining the standardized returns. We find that while the moments of the standardized returns from low-frequency realized volatilities are consistent with the expectation from the Gaussian variables, the deviation from the expectation becomes considerably large at high frequencies. This indicates that the realized stochastic volatility model itself cannot completely remove bias at high frequencies.
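    The frequency dependence of the bias can be illustrated with a standard microstructure-noise simulation: realized variance computed from noisy high-frequency returns is inflated by a term that grows with the number of sampling intervals. A minimal sketch, assuming a random-walk efficient price observed with i.i.d. noise; parameters are illustrative and this is not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# One trading day of an efficient log-price (random walk) observed with
# i.i.d. microstructure noise
n = int(6.5 * 3600)                      # one observation per second
sigma_day = 0.01                         # daily return volatility
efficient = np.cumsum(rng.normal(0.0, sigma_day / np.sqrt(n), n))
observed = efficient + rng.normal(0.0, 5e-4, n)  # noisy observed log-price

def realized_variance(log_price, step):
    """Sum of squared returns sampled every `step` observations."""
    r = np.diff(log_price[::step])
    return float((r ** 2).sum())

rv_1s = realized_variance(observed, 1)      # noise-dominated, biased upward
rv_5min = realized_variance(observed, 300)  # much closer to sigma_day**2
```

    The noise contributes roughly 2 x (number of intervals) x (noise variance) to the realized variance, which is why a bias-correction parameter estimated at high sampling frequencies must absorb a much larger distortion.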

  13. Streamflow Bias Correction for Climate Change Impact Studies: Harmless Correction or Wrecking Ball?

    NASA Astrophysics Data System (ADS)

    Nijssen, B.; Chegwidden, O.

    2017-12-01

    Projections of the hydrologic impacts of climate change rely on a modeling chain that includes estimates of future greenhouse gas emissions, global climate models, and hydrologic models. The resulting streamflow time series are used in turn as input to impact studies. While these flows can sometimes be used directly in these impact studies, many applications require additional post-processing to remove model errors. Water resources models and regulation studies are a prime example of this type of application. These models rely on specific flows and reservoir levels to trigger reservoir releases and diversions and do not function well if the unregulated streamflow inputs are significantly biased in time and/or amount. This post-processing step is typically referred to as bias-correction, even though this step corrects not just the mean but the entire distribution of flows. Various quantile-mapping approaches have been developed that adjust the modeled flows to match a reference distribution for some historic period. Simulations of future flows are then post-processed using this same mapping to remove hydrologic model errors. These streamflow bias-correction methods have received far less scrutiny than the downscaling and bias-correction methods that are used for climate model output, mostly because they are less widely used. However, some of these methods introduce large artifacts in the resulting flow series, in some cases severely distorting the climate change signal that is present in future flows. In this presentation, we discuss our experience with streamflow bias-correction methods as part of a climate change impact study in the Columbia River basin in the Pacific Northwest region of the United States. To support this discussion, we present a novel way to assess whether a streamflow bias-correction method is merely a harmless correction or is more akin to taking a wrecking ball to the climate change signal.
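    For reference, the quantile-mapping machinery under discussion is simple to state. Below is a minimal empirical quantile-mapping sketch on synthetic flows; it shows only the mechanics, not the artifacts the presentation critiques.

```python
import numpy as np

def quantile_map(flows, model_hist, obs_hist):
    """Empirical quantile mapping: look up each flow's quantile in the
    model's historical distribution, then return the observed flow at
    that same quantile."""
    model_sorted = np.sort(model_hist)
    q = np.searchsorted(model_sorted, flows, side="right") / len(model_sorted)
    return np.quantile(np.sort(obs_hist), np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(7)
obs_hist = rng.gamma(2.0, 500.0, 5000)                 # "observed" flows
model_hist = 0.7 * rng.gamma(2.0, 500.0, 5000) + 50.0  # biased model flows

# Applying the map to the historical model flows reproduces the observed
# distribution; future flows would be passed through this same mapping.
corrected = quantile_map(model_hist, model_hist, obs_hist)
```

    The concern raised in the presentation is precisely that applying a mapping calibrated on the historical period to future flows can distort the climate change signal embedded in them.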

  14. A parametric approach for simultaneous bias correction and high-resolution downscaling of climate model rainfall

    NASA Astrophysics Data System (ADS)

    Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto; Marrocu, Marino

    2017-03-01

    Distribution mapping has been identified as the most efficient approach to bias-correct climate model rainfall, while reproducing its statistics at spatial and temporal resolutions suitable to run hydrologic models. Yet its implementation based on empirical distributions derived from control samples (referred to as nonparametric distribution mapping) makes the method's performance sensitive to sample length variations, the presence of outliers, and the spatial resolution of climate model results, and may lead to biases, especially in extreme rainfall estimation. To address these shortcomings, we propose a methodology for simultaneous bias correction and high-resolution downscaling of climate model rainfall products that uses: (a) a two-component theoretical distribution model (i.e., a generalized Pareto (GP) model for rainfall intensities above a specified threshold u*, and an exponential model for lower rainrates), and (b) proper interpolation of the corresponding distribution parameters on a user-defined high-resolution grid, using kriging for uncertain data. We assess the performance of the suggested parametric approach relative to the nonparametric one, using daily raingauge measurements from a dense network on the island of Sardinia (Italy), and rainfall data from four GCM/RCM model chains of the ENSEMBLES project. The obtained results shed light on the competitive advantages of the parametric approach, which proves more accurate and considerably less sensitive to the characteristics of the calibration period, independent of the GCM/RCM combination used. This is especially the case for extreme rainfall estimation, where the GP assumption allows for more accurate and robust estimates, also beyond the range of the available data.
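    The two-component distribution model can be sketched as follows: an exponential body below a threshold u* and a generalized Pareto tail above it. The snippet below uses simple method-of-moments estimators for the GP parameters on synthetic data; the paper's actual fitting procedure and kriging interpolation are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)
rain = rng.exponential(8.0, 20_000)     # synthetic wet-day rainfall (mm)
u_star = np.quantile(rain, 0.95)        # threshold between body and tail

body = rain[rain <= u_star]
excess = rain[rain > u_star] - u_star   # excesses over the threshold

# Exponential body: the maximum-likelihood rate is 1/mean
lam_hat = 1.0 / body.mean()

# Generalized Pareto tail: method-of-moments estimates of shape and scale
m, v = excess.mean(), excess.var()
xi_hat = 0.5 * (1.0 - m ** 2 / v)
sigma_hat = 0.5 * m * (m ** 2 / v + 1.0)
```

    For exponential synthetic data the excesses are again exponential, so the estimated GP shape should come out near 0 and the scale near the 8 mm used to generate the sample.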

  15. Retinal image contrast obtained by a model eye with combined correction of chromatic and spherical aberrations

    PubMed Central

    Ohnuma, Kazuhiko; Kayanuma, Hiroyuki; Lawu, Tjundewo; Negishi, Kazuno; Yamaguchi, Takefumi; Noda, Toru

    2011-01-01

    Correcting spherical and chromatic aberrations in vitro in human eyes provides substantial improvements in visual acuity and contrast sensitivity. We found the same improvement in retinal images using a model eye with and without correction of longitudinal chromatic aberrations (LCAs) and spherical aberrations (SAs). The model eye included an intraocular lens (IOL) and an artificial cornea with human ocular LCAs and average human SAs. The optotypes were illuminated using a D65 light source, and the images were obtained using a two-dimensional luminance colorimeter. The contrast improvement from the SA correction was greater than that from the LCA correction, indicating the benefit of an aspheric achromatic IOL. PMID:21698008

  16. Characterizing bias correction uncertainty in wheat yield predictions

    NASA Astrophysics Data System (ADS)

    Ortiz, Andrea Monica; Jones, Julie; Freckleton, Robert; Scaife, Adam

    2017-04-01

    uncertainty that result from different climate model simulation input and bias correction methods. We simulate wheat yields using a General Linear Model that includes the effects of seasonal maximum temperatures and precipitation, since wheat is sensitive to heat stress during important developmental stages. We use the same statistical model to predict future wheat yields using the recently available bias-corrected simulations of EURO-CORDEX-Adjust. While statistical models are often criticized for their lack of complexity, an advantage is that we are here able to consider only the effect of the choice of climate model, resolution or bias correction method on yield. Initial results using both past and future bias-corrected climate simulations with a process-based model will also be presented. Through these methods, we make recommendations in preparing climate model output for crop models.

  17. LINEAR LATTICE AND TRAJECTORY RECONSTRUCTION AND CORRECTION AT FAST LINEAR ACCELERATOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, A.; Edstrom, D.; Halavanau, A.

    2017-07-16

    The low energy part of the FAST linear accelerator based on 1.3 GHz superconducting RF cavities was successfully commissioned [1]. During commissioning, beam-based, model-dependent methods were used to correct the linear lattice and trajectory. The lattice correction algorithm is based on analysis of beam shapes from profile monitors and of trajectory responses to dipole correctors. Trajectory responses to field gradient variations in quadrupoles and phase variations in superconducting RF cavities were used to correct bunch offsets in quadrupoles and accelerating cavities relative to their magnetic axes. Details of the methods used and the experimental results are presented.

  18. A School-Based Therapeutic/Educational Program for Severely Disturbed Latency Aged Children.

    ERIC Educational Resources Information Center

    Shelby, Madge E.

    Through joint efforts of mental health and education professionals, a school-based therapeutic/educational program for seriously emotionally disturbed children, some of whom had additional identified disabilities such as mental retardation, was initiated with 15 students (ages 8-13). The educational component of the program was based on an…

  19. Successful arrest of photoreceptor and vision loss expands the therapeutic window of retinal gene therapy to later stages of disease

    PubMed Central

    Beltran, William A.; Cideciyan, Artur V.; Iwabe, Simone; Swider, Malgorzata; Kosyk, Mychajlo S.; McDaid, Kendra; Martynyuk, Inna; Ying, Gui-Shuang; Shaffer, James; Deng, Wen-Tao; Boye, Sanford L.; Lewin, Alfred S.; Hauswirth, William W.; Jacobson, Samuel G.; Aguirre, Gustavo D.

    2015-01-01

    Inherited retinal degenerations cause progressive loss of photoreceptor neurons with eventual blindness. Corrective or neuroprotective gene therapies under development could be delivered at a predegeneration stage to prevent the onset of disease, as well as at intermediate-degeneration stages to slow the rate of progression. Most preclinical gene therapy successes to date have been as predegeneration interventions. In many animal models, as well as in human studies, to date, retinal gene therapy administered well after the onset of degeneration was not able to modify the rate of progression even when successfully reversing dysfunction. We evaluated consequences of gene therapy delivered at intermediate stages of disease in a canine model of X-linked retinitis pigmentosa (XLRP) caused by a mutation in the Retinitis Pigmentosa GTPase Regulator (RPGR) gene. Spatiotemporal natural history of disease was defined and therapeutic dose selected based on predegeneration results. Then interventions were timed at earlier and later phases of intermediate-stage disease, and photoreceptor degeneration monitored with noninvasive imaging, electrophysiological function, and visual behavior for more than 2 y. All parameters showed substantial and significant arrest of the progressive time course of disease with treatment, which resulted in long-term improved retinal function and visual behavior compared with control eyes. Histology confirmed that the human RPGR transgene was stably expressed in photoreceptors and associated with improved structural preservation of rods, cones, and ON bipolar cells together with correction of opsin mislocalization. These findings in a clinically relevant large animal model demonstrate the long-term efficacy of RPGR gene augmentation and substantially broaden the therapeutic window for intervention in patients with RPGR-XLRP. PMID:26460017

  20. Matching mice to malignancy: molecular subgroups and models of medulloblastoma

    PubMed Central

    Lau, Jasmine; Schmidt, Christin; Markant, Shirley L.; Taylor, Michael D.; Wechsler-Reya, Robert J.

    2012-01-01

    Introduction Medulloblastoma, the largest group of embryonal brain tumors, has historically been classified into five variants based on histopathology. More recently, epigenetic and transcriptional analyses of primary tumors have sub-classified medulloblastoma into four to six subgroups, most of which are incongruous with histopathological classification. Discussion Improved stratification is required for prognosis and development of targeted treatment strategies, to maximize cure and minimize adverse effects. Several mouse models of medulloblastoma have contributed both to an improved understanding of progression and to developmental therapeutics. In this review, we summarize the classification of human medulloblastoma subtypes based on histopathology and molecular features. We describe existing genetically engineered mouse models, compare these to human disease, and discuss the utility of mouse models for developmental therapeutics. Just as accurate knowledge of the correct molecular subtype of medulloblastoma is critical to the development of targeted therapy in patients, we propose that accurate modeling of each subtype of medulloblastoma in mice will be necessary for preclinical evaluation and optimization of those targeted therapies. PMID:22315164

  1. A new approach for beam hardening correction based on the local spectrum distributions

    NASA Astrophysics Data System (ADS)

    Rasoulpour, Naser; Kamali-Asl, Alireza; Hemmati, Hamidreza

    2015-09-01

    The energy dependence of material absorption and the polychromatic nature of x-ray beams in Computed Tomography (CT) cause a phenomenon called "beam hardening". The purpose of this study is to provide a novel approach for Beam Hardening (BH) correction. This approach is based on the linear attenuation coefficients of Local Spectrum Distributions (LSDs) at various depths of a phantom. The proposed method includes two steps. Firstly, the hardened spectra at various depths of the phantom (the LSDs) are estimated with the Expectation Maximization (EM) algorithm for arbitrary thickness intervals of known materials in the phantom. The performance of the LSD estimation technique is evaluated by applying random Gaussian noise to the transmission data. Then the linear attenuation coefficients at the mean energies of the LSDs are obtained. Secondly, a correction function based on the calculated attenuation coefficients is derived in order to correct the polychromatic raw data. Because the correction function converts polychromatic data to monochromatic data, the BH effect in the proposed reconstruction is reduced compared with polychromatic reconstruction. The proposed approach was assessed in phantoms involving at most two materials, and the correction function was extended for use in phantoms with more than two materials. The relative mean-energy difference in the LSD estimates based on noise-free transmission data was less than 1.5%, and it remained acceptable when random Gaussian noise was applied to the transmission data. The cupping artifact was effectively reduced, and the proposed reconstruction profile is more uniform than the polychromatic reconstruction profile.
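    The linearization idea behind such correction functions can be shown in a two-energy toy model: the polychromatic projection flattens with thickness while the monochromatic one stays linear, and a fitted polynomial maps one onto the other. A minimal sketch; the spectrum and coefficients are illustrative, not the paper's EM-estimated LSDs.

```python
import numpy as np

# Two-energy toy beam: the measured polychromatic projection
# p_poly = -ln(sum_i w_i * exp(-mu_i * t)) flattens with thickness t
# (beam hardening), while the monochromatic projection is linear in t.
mu = np.array([0.30, 0.15])      # attenuation at the two energies (1/cm)
w = np.array([0.5, 0.5])         # spectral weights
mu_eff = w @ mu                  # reference monochromatic coefficient

t = np.linspace(0.0, 20.0, 100)  # material thickness (cm)
p_poly = -np.log(np.exp(-np.outer(t, mu)) @ w)
p_mono = mu_eff * t

# Correction function: polynomial mapping polychromatic -> monochromatic
coeffs = np.polyfit(p_poly, p_mono, 3)
p_corrected = np.polyval(coeffs, p_poly)
```

    Reconstructing from `p_corrected` instead of `p_poly` is what suppresses the cupping artifact in a single-material object.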

  2. Validation of a T1 and T2* leakage correction method based on multi-echo DSC-MRI using MION as a reference standard

    PubMed Central

    Stokes, Ashley M.; Semmineh, Natenael; Quarles, C. Chad

    2015-01-01

    Purpose A combined biophysical- and pharmacokinetic-based method is proposed to separate, quantify, and correct for both T1 and T2* leakage effects using dual-echo DSC acquisitions to provide more accurate hemodynamic measures, as validated by a reference intravascular contrast agent (CA). Methods Dual-echo DSC-MRI data were acquired in two rodent glioma models. The T1 leakage effects were removed and also quantified in order to subsequently correct for the remaining T2* leakage effects. Pharmacokinetic, biophysical, and combined biophysical and pharmacokinetic models were used to obtain corrected cerebral blood volume (CBV) and cerebral blood flow (CBF), and these were compared with CBV and CBF from an intravascular CA. Results T1-corrected CBV was significantly overestimated compared to MION CBV, while T1+T2*-correction yielded CBV values closer to the reference values. The pharmacokinetic and simplified biophysical methods showed similar results and underestimated CBV in tumors exhibiting strong T2* leakage effects. The combined method was effective for correcting T1 and T2* leakage effects across tumor types. Conclusions Correcting for both T1 and T2* leakage effects yielded more accurate measures of CBV. The combined correction method yields more reliable CBV measures than either correction method alone, but for certain brain tumor types (e.g., gliomas) the simplified biophysical method may provide a robust and computationally efficient alternative. PMID:26362714
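    The reason dual-echo acquisitions can remove T1 leakage effects is that R2* computed from the ratio of the two echoes cancels any common T1-driven scaling of the signal. A minimal sketch of this standard dual-echo relation; the echo times and signal values are illustrative, not the study's acquisition parameters.

```python
import numpy as np

TE1, TE2 = 0.007, 0.031        # echo times in seconds (illustrative)

def r2star(S1, S2):
    """T1-insensitive R2* from dual-echo signals: any T1-driven scaling
    common to both echoes cancels in the ratio S1/S2."""
    return np.log(S1 / S2) / (TE2 - TE1)

# Synthetic voxel: same true R2*, with and without a T1 leakage factor
R2s_true = 40.0                # 1/s
S0 = 1000.0
S1 = S0 * np.exp(-TE1 * R2s_true)
S2 = S0 * np.exp(-TE2 * R2s_true)
f = 1.3                        # T1 enhancement from contrast-agent leakage

r_clean = r2star(S1, S2)           # recovers R2s_true
r_leaky = r2star(f * S1, f * S2)   # identical: the T1 factor cancels
```

    The residual T2* leakage effects, by contrast, alter R2* itself, which is why the additional biophysical or pharmacokinetic correction described above is still needed.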

  3. Determining spherical lens correction for astronaut training underwater

    PubMed Central

    Porter, Jason; Gibson, C. Robert; Strauss, Samuel

    2013-01-01

    Purpose To develop a model that will accurately predict the distance spherical lens correction needed to be worn by National Aeronautics and Space Administration (NASA) astronauts while training underwater. The replica space suit’s helmet contains curved visors that induce refractive power when submersed in water. Methods Anterior surface powers and thicknesses were measured for the helmet’s protective and inside visors. The impact of each visor on the helmet’s refractive power in water was analyzed using thick lens calculations and Zemax optical design software. Using geometrical optics approximations, a model was developed to determine the optimal distance spherical power needed to be worn underwater based on the helmet’s total induced spherical power underwater and the astronaut’s manifest spectacle plane correction in air. The validity of the model was tested using data from both eyes of 10 astronauts who trained underwater. Results The helmet visors induced a total power of −2.737 D when placed underwater. The required underwater spherical correction (FW) was linearly related to the spectacle plane spherical correction in air (FAir): FW = FAir + 2.356 D. The mean magnitude of the difference between the actual correction worn underwater and the calculated underwater correction was 0.20 ± 0.11 D. The actual and calculated values were highly correlated (R = 0.971) with 70% of eyes having a difference in magnitude of < 0.25 D between values. Conclusions We devised a model to calculate the spherical spectacle lens correction needed to be worn underwater by National Aeronautics and Space Administration astronauts. The model accurately predicts the actual values worn underwater and can be applied (more generally) to determine a suitable spectacle lens correction to be worn behind other types of masks when submerged underwater. PMID:21623249
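    The paper's linear model is trivial to apply. A one-line helper implementing the reported relation FW = FAir + 2.356 D:

```python
def underwater_correction(f_air_diopters):
    """Distance spherical correction (diopters) to wear underwater,
    per the paper's fitted linear model F_W = F_Air + 2.356 D."""
    return f_air_diopters + 2.356
```

    For example, an astronaut wearing -3.00 D in air would wear about -0.64 D while training underwater.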

  4. Role of T-cell epitope-based vaccine in prophylactic and therapeutic applications

    PubMed Central

    Testa, James S; Philip, Ramila

    2013-01-01

    Prophylactic and therapeutic vaccines against viral infections have advanced in recent years from attenuated live vaccines to subunit-based vaccines. An ideal prophylactic vaccine should mimic the natural immunity induced by an infection, in that it should generate long-lasting adaptive immunity. To complement subunit vaccines, which primarily target an antibody response, different methodologies are being investigated to develop vaccines capable of driving cellular immunity. T-cell epitope discovery is central to this concept. In this review, the significance of T-cell epitope-based vaccines for prophylactic and therapeutic applications is discussed. Additionally, methodologies for the discovery of T-cell epitopes, as well as recent developments in the clinical testing of these vaccines for various viral infections, are explained. PMID:23630544

  5. Investigation of model-based physical design restrictions (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl

    2005-05-01

    As lithography and other patterning processes become more complex and more non-linear with each generation, the task of defining physical design rules necessarily increases in complexity as well. The goal of the physical design rules is to define the boundary between the physical layout structures which will yield well and those which will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However, the rapid increase in design rule complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities for improving physical design restrictions by implementing model-based physical design methods are evident. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies of semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low-k1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.

  6. Model-Based Individualized Treatment of Chemotherapeutics: Bayesian Population Modeling and Dose Optimization

    PubMed Central

    Jayachandran, Devaraj; Laínez-Aguirre, José; Rundell, Ann; Vik, Terry; Hannemann, Robert; Reklaitis, Gintaras; Ramkrishna, Doraiswami

    2015-01-01

    6-Mercaptopurine (6-MP) is one of the key drugs in the treatment of many pediatric cancers, autoimmune diseases and inflammatory bowel disease. 6-MP is a prodrug, converted to an active metabolite, 6-thioguanine nucleotide (6-TGN), through an enzymatic reaction involving thiopurine methyltransferase (TPMT). Pharmacogenomic variation observed in the TPMT enzyme produces significant variation in drug response among the patient population. Despite 6-MP's widespread use and the observed variation in treatment response, efforts at quantitative optimization of dose regimens for individual patients are limited. In addition, research efforts devoted to predicting clinical responses from pharmacogenomics are proving far from ideal. In this work, we present a Bayesian population modeling approach to develop a pharmacological model for 6-MP metabolism in humans. Given the scarcity of data in clinical settings, a model reduction approach based on global sensitivity analysis is used to minimize the parameter space. For accurate estimation of the sensitive parameters, robust optimal experimental design based on the D-optimality criterion is exploited. With the patient-specific model, a model predictive control algorithm is used to optimize dose scheduling with the objective of maintaining the 6-TGN concentration within its therapeutic window. More importantly, for the first time, we show how the incorporation of information from different levels of the biological chain of response (i.e., gene expression, enzyme phenotype, drug phenotype) plays a critical role in determining the uncertainty in predicting the therapeutic target. The model and the control approach can be utilized in the clinical setting to individualize 6-MP dosing based on the patient's ability to metabolize the drug, instead of the traditional standard-dose-for-all approach. PMID:26226448

  7. Semiparametric modeling: Correcting low-dimensional model error in parametric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Tyrus, E-mail: thb11@psu.edu; Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, 503 Walker Building, University Park, PA 16802-5013

    2016-03-01

    In this paper, a semiparametric modeling approach is introduced as a paradigm for addressing model error arising from unresolved physical phenomena. Our approach compensates for model error by learning an auxiliary dynamical model for the unknown parameters. Practically, the proposed approach consists of the following steps. Given a physics-based model and a noisy data set of historical observations, a Bayesian filtering algorithm is used to extract a time-series of the parameter values. Subsequently, the diffusion forecast algorithm is applied to the retrieved time-series in order to construct the auxiliary model for the time evolving parameters. The semiparametric forecasting algorithm consists of integrating the existing physics-based model with an ensemble of parameters sampled from the probability density function of the diffusion forecast. To specify initial conditions for the diffusion forecast, a Bayesian semiparametric filtering method that extends the Kalman-based filtering framework is introduced. In difficult test examples, which introduce chaotically and stochastically evolving hidden parameters into the Lorenz-96 model, we show that our approach can effectively compensate for model error, with forecasting skill comparable to that of the perfect model.

  8. Therapeutic enhancement: nursing intervention category for patients diagnosed with Readiness for Therapeutic Regimen Management.

    PubMed

    Kelly, Cynthia W

    2008-04-01

    To present a new nursing intervention category called therapeutic enhancement. Fewer than half of North Americans follow their physician's recommendations for diet and exercise, even when these are crucial to their health or recovery. It is imperative that nurses consider new ways to promote healthy behaviours. Therapeutic enhancement is intended to provide such a fresh approach. Traditional intervention techniques focusing on education, contracts, social support and more frequent interaction with physicians appear not to be effective when used alone. Successful strategies have been multidisciplinary and have included interventions by professional nurses who help patients understand their disease and disease process and develop disease-management and self-management skills. Therapeutic enhancement incorporates the Stages of Change Theory, Commitment to Health Theory, Motivational Interviewing techniques and instrumentation specifically designed for process evaluation of health-promoting interventions. This is a critical review of approaches that, heretofore, have not been synthesised in a single published article. Based on the commonly used Stages of Change model, therapeutic enhancement is useful for patients who are at the action stage of change. Using therapeutic enhancement as well as therapeutic strategies identified in Stages of Change Theory, such as contingency management, helping relationships, counterconditioning, stimulus control and Motivational Interviewing techniques, nursing professionals can significantly increase the chances of patients moving from the action to the maintenance stage of change for a specific health behaviour. Using the nursing intervention category, therapeutic enhancement can increase caregivers' success in helping patients maintain healthy behaviours.

  9. Cell-based therapeutic strategies for multiple sclerosis.

    PubMed

    Scolding, Neil J; Pasquini, Marcelo; Reingold, Stephen C; Cohen, Jeffrey A

    2017-11-01

    The availability of multiple disease-modifying medications with regulatory approval to treat multiple sclerosis illustrates the substantial progress made in therapy of the disease. However, all are only partially effective in preventing inflammatory tissue damage in the central nervous system and none directly promotes repair. Cell-based therapies, including immunoablation followed by autologous haematopoietic stem cell transplantation, mesenchymal and related stem cell transplantation, pharmacologic manipulation of endogenous stem cells to enhance their reparative capabilities, and transplantation of oligodendrocyte progenitor cells, have generated substantial interest as novel therapeutic strategies for immune modulation, neuroprotection, or repair of the damaged central nervous system in multiple sclerosis. Each approach has potential advantages but also safety concerns and unresolved questions. Moreover, clinical trials of cell-based therapies present several unique methodological and ethical issues. We summarize here the status of cell-based therapies to treat multiple sclerosis and make consensus recommendations for future research and clinical trials. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain.

  10. A positional misalignment correction method for Fourier ptychographic microscopy based on simulated annealing

    NASA Astrophysics Data System (ADS)

    Sun, Jiasong; Zhang, Yuzhen; Chen, Qian; Zuo, Chao

    2017-02-01

    Fourier ptychographic microscopy (FPM) is a newly developed super-resolution technique, which employs angularly varying illuminations and a phase retrieval algorithm to surpass the diffraction limit of a low numerical aperture (NA) objective lens. In current FPM imaging platforms, accurate knowledge of the LED matrix's position is critical to achieve good recovery quality. Furthermore, given the wide field-of-view (FOV) in FPM, different regions of the FOV have different sensitivity to LED positional misalignment. In this work, we introduce an iterative method to correct position errors based on the simulated annealing (SA) algorithm. To improve the efficiency of this correction process, a large number of iterations on several low-illumination-NA images are first run to estimate initial values of the global positional misalignment model through non-linear regression. Simulation and experimental results are presented to evaluate the performance of the proposed method, and it is demonstrated that this method can both improve the quality of the recovered object image and relax the LED elements' position accuracy requirement when aligning FPM imaging platforms.
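    The SA search itself is generic and can be sketched independently of the FPM reconstruction; below, a toy quadratic cost stands in for the recovery error that the method would evaluate, and the misalignment parameters, step sizes, and cooling schedule are all invented:

```python
import math, random

random.seed(0)

def sa_correct(cost, x0, steps=3000, t0=1.0, cooling=0.997, step=0.5):
    """Simulated-annealing search over global misalignment parameters
    (here just an LED-array shift dx, dy); `cost` stands in for the
    reconstruction error returned by the phase retrieval algorithm."""
    x, e = list(x0), cost(x0)
    best_x, best_e, t = x, e, t0
    for _ in range(steps):
        cand = [xi + random.uniform(-step, step) for xi in x]
        ec = cost(cand)
        # Accept improvements always; accept worse moves with Boltzmann prob.
        if ec < e or random.random() < math.exp(-(ec - e) / t):
            x, e = cand, ec
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling
    return best_x, best_e

# Toy stand-in cost with the "true" misalignment at dx=1.3, dy=-0.7.
toy = lambda p: (p[0] - 1.3) ** 2 + (p[1] + 0.7) ** 2
sol, err = sa_correct(toy, [0.0, 0.0])
print(err < toy([0.0, 0.0]))
```

    In the actual method, evaluating `cost` requires a full FPM recovery, which is why the paper warm-starts the search with a regression-based estimate of the global misalignment model.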

  11. Scene-based nonuniformity correction technique for infrared focal-plane arrays.

    PubMed

    Liu, Yong-Jin; Zhu, Hong; Zhao, Yi-Gong

    2009-04-20

    A scene-based nonuniformity correction algorithm is presented to compensate for the gain and bias nonuniformity in infrared focal-plane array sensors; the algorithm can be separated into three parts. First, an interframe-prediction method is used to estimate the true scene, since nonuniformity correction is a typical blind-estimation problem in which both scene values and detector parameters are unavailable. Second, the estimated scene, along with its corresponding observed data obtained by the detectors, is employed to update the gain and the bias by means of a line-fitting technique. Finally, with these nonuniformity parameters, the compensated output of each detector is obtained by computing a very simple formula. The advantages of the proposed algorithm lie in its low computational complexity, its modest storage requirements, and its ability to capture temporal drifts in the nonuniformity parameters. The performance of every module is demonstrated with simulated and real infrared image sequences. Experimental results indicate that the proposed algorithm exhibits a superior correction effect.
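    The line-fitting and correction steps can be sketched on synthetic data. Here the true scenes stand in for the interframe-predicted scene estimate (step one of the algorithm), and the array size and parameter spreads are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 16
K = 50   # number of frames

# Fixed per-detector gain g and bias b (the nonuniformity to estimate).
g = 1.0 + 0.05 * rng.standard_normal((H, W))
b = 3.0 * rng.standard_normal((H, W))

# Scenes and observations y = g*x + b. The true scenes stand in for the
# interframe-predicted scene estimates used by the algorithm.
X = rng.uniform(20.0, 100.0, (K, H, W))
Y = g * X + b

# Per-detector line fit (least squares across frames): y ~ g_hat*x + b_hat.
xm, ym = X.mean(0), Y.mean(0)
g_hat = ((X - xm) * (Y - ym)).sum(0) / ((X - xm) ** 2).sum(0)
b_hat = ym - g_hat * xm

# The "very simple formula": invert the line to recover the scene.
x_hat = (Y[0] - b_hat) / g_hat
print(np.allclose(x_hat, X[0]))
```

    With a noise-free simulation the fit is exact; in practice the scene estimate is imperfect, which is why the parameters are updated recursively over many frames.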

  12. [Beat therapeutic inertia in dyslipidemic patient management: A challenge in daily clinical practice] [corrected].

    PubMed

    Morales, Clotilde; Mauri, Marta; Vila, Lluís

    2014-01-01

    In patients with dyslipidemia, there is a need to reach the therapeutic goals in order to obtain the maximum benefit in reducing the risk of cardiovascular events, especially myocardial infarction. Even with guidelines and powerful hypolipidemic drugs available, the goals for low-density lipoprotein cholesterol (LDL-c) are often not reached, especially in patients with high cardiovascular risk. One of the causes is therapeutic inertia. There are tools to plan the treatment and make decisions easier. One of the challenges in everyday clinical practice is to know the percentage reduction in LDL-c needed. Moreover, it is hard to know which treatment to use, both at the start of therapy and when the desired objective is not reached. This article proposes a practical method that can help solve these questions. Copyright © 2013 Sociedad Española de Arteriosclerosis. Published by Elsevier España. All rights reserved.
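    The first quantity the abstract mentions, the percentage LDL-c reduction needed to reach goal, is simple arithmetic; the 130/70 mg/dL figures below are only an example, not values from the article:

```python
def ldl_reduction_needed(current_ldl, target_ldl):
    """Percentage reduction in LDL-c needed to reach the therapeutic goal."""
    return max(0.0, 100.0 * (current_ldl - target_ldl) / current_ldl)

# e.g. a patient at 130 mg/dL with a 70 mg/dL goal
print(round(ldl_reduction_needed(130, 70), 1))  # → 46.2
```

    Knowing this percentage lets the clinician match it against the expected potency of each lipid-lowering regimen when choosing or intensifying treatment.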

  13. CRISPR/Cas9-mediated correction of human genetic disease.

    PubMed

    Men, Ke; Duan, Xingmei; He, Zhiyao; Yang, Yang; Yao, Shaohua; Wei, Yuquan

    2017-05-01

    The clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated (Cas) protein 9 system (CRISPR/Cas9) provides a powerful tool for targeted genetic editing. Directed by programmable sequence-specific RNAs, this system introduces cleavage and double-stranded breaks at target sites precisely. Compared to previously developed targeted nucleases, the CRISPR/Cas9 system demonstrates several promising advantages, including simplicity, high specificity, and efficiency. Several broad genome-editing studies with the CRISPR/Cas9 system in different species in vivo and ex vivo have indicated its strong potential, raising hopes for therapeutic genome editing in clinical settings. Taking advantage of non-homologous end-joining (NHEJ) and homology directed repair (HDR)-mediated DNA repair, several studies have recently reported the use of CRISPR/Cas9 to successfully correct disease-causing alleles ranging from single base mutations to large insertions. In this review, we summarize and discuss recent preclinical studies involving the CRISPR/Cas9-mediated correction of human genetic diseases.

  14. A novel mouse model identifies cooperating mutations and therapeutic targets critical for chronic myeloid leukemia progression

    PubMed Central

    Giotopoulos, George; van der Weyden, Louise; Osaki, Hikari; Rust, Alistair G.; Gallipoli, Paolo; Meduri, Eshwar; Horton, Sarah J.; Chan, Wai-In; Foster, Donna; Prinjha, Rab K.; Pimanda, John E.; Tenen, Daniel G.; Vassiliou, George S.; Koschmieder, Steffen; Adams, David J.

    2015-01-01

    The introduction of highly selective ABL-tyrosine kinase inhibitors (TKIs) has revolutionized therapy for chronic myeloid leukemia (CML). However, TKIs are only efficacious in the chronic phase of the disease and effective therapies for TKI-refractory CML, or after progression to blast crisis (BC), are lacking. Whereas the chronic phase of CML is dependent on BCR-ABL, additional mutations are required for progression to BC. However, the identity of these mutations and the pathways they affect are poorly understood, hampering our ability to identify therapeutic targets and improve outcomes. Here, we describe a novel mouse model that allows identification of mechanisms of BC progression in an unbiased and tractable manner, using transposon-based insertional mutagenesis on the background of chronic phase CML. Our BC model is the first to faithfully recapitulate the phenotype, cellular and molecular biology of human CML progression. We report a heterogeneous and unique pattern of insertions identifying known and novel candidate genes and demonstrate that these pathways drive disease progression and provide potential targets for novel therapeutic strategies. Our model greatly informs the biology of CML progression and provides a potent resource for the development of candidate therapies to improve the dismal outcomes in this highly aggressive disease. PMID:26304963

  15. A model-free method for mass spectrometer response correction. [for oxygen consumption and cardiac output calculation]

    NASA Technical Reports Server (NTRS)

    Shykoff, Barbara E.; Swanson, Harvey T.

    1987-01-01

    A new method for correction of mass spectrometer output signals is described. Response-time distortion is reduced independently of any model of mass spectrometer behavior. The delay of the system is found first from the cross-correlation function of a step change and its response. A two-sided time-domain digital correction filter (deconvolution filter) is generated next from the same step response data using a regression procedure. Other data are corrected using the filter and delay. The mean squared error between a step response and a step is reduced considerably more after the use of a deconvolution filter than after the application of a second-order model correction. O2 consumption and CO2 production values calculated from data corrupted by a simulated dynamic process return to near the uncorrupted values after correction. Although a clean step response or the ensemble average of several responses contaminated with noise is needed for the generation of the filter, random noise of magnitude not above 0.5 percent added to the response to be corrected does not impair the correction severely.
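    The regression step can be sketched as fitting a two-sided FIR deconvolution filter so that filtering the measured step response reproduces an ideal step. The synthetic first-order-lag response, filter length, and tap positions below are illustrative choices, not the paper's:

```python
import numpy as np

n, tau, delay = 200, 8.0, 5
t = np.arange(n)
step = np.ones(n)                     # ideal step
resp = np.zeros(n)                    # measured step response: delay + lag
resp[delay:] = 1 - np.exp(-t[: n - delay] / tau)

# Two-sided (acausal) FIR deconvolution filter fitted by least squares so
# that filtering the measured response reproduces the ideal step; the
# two-sided form absorbs the transport delay that the paper estimates by
# cross-correlation.
L, c = 21, 10                         # filter length and center tap
M = np.stack([np.roll(resp, c - j) for j in range(L)], axis=1)
h, *_ = np.linalg.lstsq(M, step, rcond=None)

corrected = M @ h
mse_before = np.mean((resp - step) ** 2)
mse_after = np.mean((corrected - step) ** 2)
print(mse_after <= mse_before)
```

    Once fitted, the same filter and delay are applied to all subsequent gas-fraction data, which is what makes the correction model-free.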

  16. Comparing paediatric intravenous phenytoin doses using physiologically based pharmacokinetic (PBPK) modelling software.

    PubMed

    Batchelor, Hannah; Appleton, Richard; Hawcutt, Daniel B

    2015-12-01

    To use a physiologically based pharmacokinetic (PBPK) modelling system to predict the serum levels achieved by two different intravenous loading doses of phenytoin. A phenytoin pharmacokinetic model was used in the Simcyp population-based ADME simulator, simulating 100 children aged 2-10 years receiving intravenous phenytoin (18 and 20 mg/kg). Visual checks were used to evaluate the predictive performance of the candidate model. With an 18 mg/kg loading dose, blood levels were sub-therapeutic in 22/100 (concentration at 2 h post infusion (C2h) <10 μg/mL), therapeutic in 62/100 (C2h 10-20 μg/mL), and supra-therapeutic in 16/100 (C2h >20 μg/mL). With 20 mg/kg, the percentages were 15, 59, and 26, respectively. Increasing from 18 to 20 mg/kg increased the mean C2h from 16.0 to 17.9 μg/mL, and the mean AUC from 145 to 162 μg/mL/h. A C2h >30 μg/mL was predicted in 4% and 8% of children at the 18 and 20 mg/kg doses, respectively, with 3% predicted to have a C2h >40 μg/mL following either dose. For maintenance dosing, a first dose of 2.5 or 5 mg/kg (intravenous) given at 12 h (after either 18 or 20 mg/kg loading) gives the highest percentage of 10-20 μg/mL serum concentrations. For sub-therapeutic concentrations following intravenous loading (20 mg/kg), a first maintenance dose (intravenous) of 10 mg/kg will achieve therapeutic concentrations in 93%. Use of PBPK modelling suggests that children receiving the 20 mg/kg intravenous loading dose are at slightly increased risk of supra-therapeutic blood levels. Ideally, therapeutic drug monitoring is required to monitor serum concentrations, although the dosing regimen suggested by the BNFc appears appropriate. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
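    The population-simulation workflow (simulate n virtual patients, then classify C2h into sub-/therapeutic/supra-therapeutic bands) can be sketched with a crude one-compartment stand-in; the distribution volume, elimination constant, and variability terms are invented placeholders, not Simcyp's PBPK model:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_c2h(dose_mg_per_kg, n=100):
    """Crude one-compartment stand-in for the population simulation:
    C2h ~ (dose / Vd) * exp(-k * 2 h), with lognormal between-subject
    variability. All parameter values are illustrative."""
    vd = 1.1 * np.exp(0.25 * rng.standard_normal(n))  # L/kg
    k = 0.03 * np.exp(0.30 * rng.standard_normal(n))  # 1/h
    return dose_mg_per_kg / vd * np.exp(-k * 2.0)

c2h = simulate_c2h(18.0)
bands = {"sub": (c2h < 10).sum(),
         "therapeutic": ((c2h >= 10) & (c2h <= 20)).sum(),
         "supra": (c2h > 20).sum()}
print(sum(bands.values()))  # → 100 (each child falls in exactly one band)
```

    The study's percentages come from exactly this kind of banding of the simulated C2h distribution, but using a full PBPK model rather than a one-compartment approximation.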

  17. Clearing the waters: Evaluating the need for site-specific field fluorescence corrections based on turbidity measurements

    USGS Publications Warehouse

    Saraceno, John F.; Shanley, James B.; Downing, Bryan D.; Pellerin, Brian A.

    2017-01-01

    In situ fluorescent dissolved organic matter (fDOM) measurements have gained increasing popularity as a proxy for dissolved organic carbon (DOC) concentrations in streams. One challenge to accurate fDOM measurements in many streams is light attenuation due to suspended particles. Downing et al. (2012) evaluated the need for corrections to compensate for particle interference on fDOM measurements using a single sediment standard in a laboratory study. The application of those results to a large river improved unfiltered field fDOM accuracy. We tested the same correction equation in a headwater tropical stream and found that it overcompensated fDOM when turbidity exceeded ∼300 formazin nephelometric units (FNU). Therefore, we developed a site-specific, field-based fDOM correction equation through paired in situ fDOM measurements of filtered and unfiltered streamwater. The site-specific correction increased fDOM accuracy up to a turbidity as high as 700 FNU, the maximum observed in this study. The difference in performance between the laboratory-based correction equation of Downing et al. (2012) and our site-specific, field-based correction equation likely arises from differences in particle size distribution between the sediment standard used in the lab (silt) and that observed in our study (fine to medium sand), particularly during high flows. Therefore, a particle interference correction equation based on a single sediment type may not be ideal when field sediment size is significantly different. Given that field fDOM corrections for particle interference under turbid conditions are a critical component in generating accurate DOC estimates, we describe a way to develop site-specific corrections.
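    The site-specific calibration rests on paired filtered/unfiltered fDOM measurements. A minimal sketch, assuming an exponential particle-attenuation form and invented coefficients (the paper's actual correction equation may differ):

```python
import numpy as np

rng = np.random.default_rng(4)

# Paired observations: the filtered sensor approximates true fDOM, while
# the unfiltered sensor is attenuated by suspended particles. The
# exponential attenuation model and all values here are assumptions.
turb = rng.uniform(0.0, 700.0, 120)        # turbidity (FNU)
f_true = rng.uniform(20.0, 80.0, 120)      # filtered fDOM (QSU)
a_true = 0.0015
f_obs = f_true * np.exp(-a_true * turb)    # unfiltered, particle-attenuated

# Site-specific fit: log(f_obs / f_true) = -a * turb (regression through 0)
a_hat = -np.sum(np.log(f_obs / f_true) * turb) / np.sum(turb ** 2)

# Apply the fitted correction to the unfiltered record.
corrected = f_obs * np.exp(a_hat * turb)
print(np.allclose(corrected, f_true))
```

    The key point from the study is that the coefficient `a_hat` is site-specific: it depends on the local particle size distribution, so a value fitted to lab silt standards can overcorrect where the field sediment is sand.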

  18. Specification Search for Identifying the Correct Mean Trajectory in Polynomial Latent Growth Models

    ERIC Educational Resources Information Center

    Kim, Minjung; Kwok, Oi-Man; Yoon, Myeongsun; Willson, Victor; Lai, Mark H. C.

    2016-01-01

    This study investigated the optimal strategy for model specification search under the latent growth modeling (LGM) framework, specifically on searching for the correct polynomial mean or average growth model when there is no a priori hypothesized model in the absence of theory. In this simulation study, the effectiveness of different starting…

  19. MRI-Based Computed Tomography Metal Artifact Correction Method for Improving Proton Range Calculation Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Peter C.; Schreibmann, Eduard; Roper, Justin

    2015-03-15

    Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensity of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On respective simulated brain and head and neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.
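    The intensity-to-HU prediction step can be sketched with a simple regression on paired samples from the artifact-free slice; the linear form, noise level, and value ranges below are illustrative stand-ins for the paper's "comprehensive analysis":

```python
import numpy as np

rng = np.random.default_rng(5)

# Paired samples from a co-registered, artifact-free slice: MRI intensity
# vs CT Hounsfield units. The linear relationship is an assumption made
# only for this sketch.
mri_ref = rng.uniform(0.0, 1.0, 500)
hu_ref = 900.0 * mri_ref - 200.0 + 5.0 * rng.standard_normal(500)

# Learn the intensity -> HU mapping from the artifact-free slice.
slope, intercept = np.polyfit(mri_ref, hu_ref, 1)

# Replace corrupted HU values in the artifact slice using its MRI.
mri_artifact = rng.uniform(0.0, 1.0, 100)
hu_predicted = slope * mri_artifact + intercept
print(abs(slope - 900.0) < 20.0)
```

    The corrected HU map is then what feeds the proton range (stopping power) calculation, which is why reducing HU error translates directly into reduced range error.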

  20. [Predictors of the therapeutic discharge in patients with dual pathology admitted to a therapeutic community with a psychiatric unit].

    PubMed

    Madoz-Gúrpide, Agustín; García Vicent, Vicente; Luque Fuentes, Encarnación; Ochoa Mangado, Enriqueta

    2013-01-01

    This study aims to analyze the variables on which therapeutic discharge depends in patients with a severe dual diagnosis admitted to a professional therapeutic community where their pathology is treated. The sample comprised 325 patients admitted between June 2000 and June 2009 to the therapeutic community. This is a retrospective, cross-sectional study with no control group, based on the detailed analysis of the information collected in a semi-structured clinical interview model designed in the therapeutic community. Of the individuals included in the sample, 29.5% were therapeutically discharged. Of all the variables introduced in this analysis, the most significant were gender, age at the beginning of treatment, education level, opiate dependence, polydrug abuse, and the presence of psychotic disorders and borderline personality disorder. In our study, gender determines the type of discharge, with therapeutic discharge being more frequent among women. A higher educational level also implies a better prognosis, with a higher rate of therapeutic discharge among individuals with higher education. A later age at the beginning of treatment reduces the likelihood of therapeutic discharge. Likewise, polydrug abuse and a diagnosis of psychotic disorder or borderline personality disorder are associated with a lower rate of therapeutic discharge. Recognizing these characteristics will allow the early identification of those patients most at risk of dropping out of treatment prematurely, and allow clinicians to try to prevent it by increasing the therapeutic intensity.

  1. Correction of stopping power and LET quenching for radiophotoluminescent glass dosimetry in a therapeutic proton beam

    NASA Astrophysics Data System (ADS)

    Chang, Weishan; Koba, Yusuke; Katayose, Tetsurou; Yasui, Keisuke; Omachi, Chihiro; Hariu, Masatsugu; Saitoh, Hidetoshi

    2017-12-01

    To measure the absorbed dose to water D_w in proton beams using a radiophotoluminescent glass dosimeter (RGD), a method is proposed that corrects for the change in the mass stopping power ratio (SPR) and the linear energy transfer (LET) dependence of the radiophotoluminescent efficiency ε_LET. The calibration coefficient in terms of D_w for RGDs (GD-302M, Asahi Techno Glass) was obtained using a 60Co γ-ray. The SPR of water to the RGD was calculated by Monte Carlo simulation, and ε_LET was investigated experimentally using a 70 MeV proton beam. For clinical usage, the residual range R_res was used as a quality index to determine the beam-quality correction factor k_Q,Q0 and the LET-quenching correction factor k_LET of the RGD. The proposed method was evaluated by measuring D_w at different depths in a 200 MeV proton beam. For both non-modulated and modulated proton beams, k_Q,Q0 decreases rapidly where R_res is less than 4 cm. The difference in k_Q,Q0 between a non-modulated and a modulated proton beam is less than 0.5% over the R_res range from 0 cm to 22 cm. ε_LET decreases rapidly over the LET range from 1 to 2 keV/μm. In the evaluation experiments, D_w measured using RGDs showed good agreement with that obtained using an ionization chamber, and the relative difference was within 3% where R_res was larger than 1 cm. The uncertainty budget for D_w in a proton beam was estimated to investigate the potential of RGD postal dosimetry in proton therapy. These results demonstrate the feasibility of RGD dosimetry in a therapeutic proton beam and the general versatility of the proposed method. In conclusion, the proposed methodology for RGDs in proton dosimetry is applicable where R_res > 1 cm, and the RGD is feasible as a postal audit dosimeter for proton therapy.
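    The dose reconstruction described above reduces to the product of the 60Co cross-calibration coefficient and the two correction factors; a minimal sketch, where the numerical values are invented placeholders rather than measured factors:

```python
def rgd_dose_to_water(reading, n_dw_co60, k_q_q0, k_let):
    """D_w from an RGD reading: the 60Co calibration coefficient times the
    beam-quality correction k_Q,Q0 and the LET-quenching correction k_LET."""
    return reading * n_dw_co60 * k_q_q0 * k_let

# Placeholder values only: a raw reading with unit-scale correction factors.
dw = rgd_dose_to_water(1000.0, 2.0e-3, 0.98, 0.95)
print(round(dw, 3))  # → 1.862
```

    In the proposed clinical workflow, both correction factors are looked up as functions of the residual range R_res, which is what makes the method practical for postal audits.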

  2. An Enhanced MWR-Based Wet Tropospheric Correction for Sentinel-3: Inheritance from Past ESA Altimetry Missions

    NASA Astrophysics Data System (ADS)

    Lazaro, Clara; Fernandes, Joanna M.

    2015-12-01

    The GNSS-derived Path Delay (GPD) and the Data Combination (DComb) algorithms were developed by the University of Porto (U.Porto), in the scope of different projects funded by ESA, to compute a continuous and improved wet tropospheric correction (WTC) for use in satellite altimetry. Both algorithms are mission independent and are based on a linear space-time objective analysis procedure that combines various wet path delay data sources. A new algorithm that takes the best of each aforementioned algorithm (GNSS-derived Path Delay Plus, GPD+) has been developed at U.Porto in the scope of the SL_cci project, where the use of datasets that are consistent and stable in time is of major importance. The algorithm has been applied to the eight main altimetric missions (TOPEX/Poseidon, Jason-1, Jason-2, ERS-1, ERS-2, Envisat, CryoSat-2 and SARAL). The upcoming Sentinel-3 carries a two-channel on-board radiometer similar to those deployed on ERS-1/2 and Envisat. Consequently, fine-tuning the GPD+ algorithm on these missions' datasets shall enrich it, increasing its capability to deal quickly with Sentinel-3 data. Foreseeing that the computation of an improved MWR-based WTC for use with Sentinel-3 data will be required, this study focuses on the results obtained for the ERS-1/2 and Envisat missions, which are expected to give insight into the computation of this correction for the upcoming ESA altimetric mission. The various WTC corrections available for each mission (in general, the original correction derived from the on-board MWR, the model correction and the one derived from GPD+) are inter-compared either directly or using various sea level anomaly variance statistical analyses. Results show that the GPD+ algorithm is efficient in generating global and continuous datasets, corrected for land and ice contamination and for spurious measurements of instrumental origin, with significant impacts on all ESA missions.

  3. A Mis-recognized Medical Vocabulary Correction System for Speech-based Electronic Medical Record

    PubMed Central

    Seo, Hwa Jeong; Kim, Ju Han; Sakabe, Nagamasa

    2002-01-01

    Speech recognition as an input tool for electronic medical records (EMR) enables efficient data entry at the point of care. However, the recognition accuracy for medical vocabulary is much poorer than that for doctor-patient dialogue. We developed a mis-recognized medical vocabulary correction system based on syllable-by-syllable comparison of speech text against a medical vocabulary database. Using a specialty medical vocabulary, the algorithm detects and corrects mis-recognized medical vocabulary in narrative text. Our preliminary evaluation showed 94% accuracy in mis-recognized medical vocabulary correction.
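    A minimal string-level sketch of the detect-and-correct step, using Python's difflib similarity matching as a stand-in for the paper's syllable-by-syllable comparison; the vocabulary list and cutoff are invented:

```python
import difflib

# Toy specialty vocabulary; a real system would use a large medical lexicon
# and compare syllable-by-syllable rather than character-by-character.
VOCAB = ["myocardial infarction", "hypertension", "pneumothorax",
         "cholecystectomy", "tachycardia"]

def correct_term(phrase, cutoff=0.7):
    """Replace a mis-recognized phrase with its closest vocabulary entry
    when the similarity clears the cutoff; otherwise keep it unchanged."""
    match = difflib.get_close_matches(phrase, VOCAB, n=1, cutoff=cutoff)
    return match[0] if match else phrase

print(correct_term("myocardiel infraction"))  # → myocardial infarction
```

    The cutoff plays the same role as the system's detection threshold: too low and ordinary dialogue gets rewritten into medical terms, too high and genuine mis-recognitions slip through.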

  4. An Adaptive Deghosting Method in Neural Network-Based Infrared Detectors Nonuniformity Correction.

    PubMed

    Li, Yiyang; Jin, Weiqi; Zhu, Jin; Zhang, Xu; Li, Shuo

    2018-01-13

    The problems of neural network-based nonuniformity correction algorithms for infrared focal plane arrays mainly concern slow convergence speed and ghosting artifacts. In general, the more stringent the inhibition of ghosting, the slower the convergence speed. The factors that affect these two problems are the estimated desired image and the learning rate. In this paper, we propose a learning rate rule that combines adaptive threshold edge detection and a temporal gate. Through a noise estimation algorithm, the adaptive spatial threshold is related to the residual nonuniformity noise in the corrected image. The proposed learning rate is used to effectively and stably suppress ghosting artifacts without slowing down the convergence speed. The performance of the proposed technique was thoroughly studied with infrared image sequences containing both simulated and real nonuniformity. The results show that the deghosting performance of the proposed method is superior to that of other neural network-based nonuniformity correction algorithms and that the convergence speed is equivalent to that of the tested deghosting methods.

  5. Image Guidance in Stem Cell Therapeutics: Unfolding the Blindfold.

    PubMed

    Bukhari, Amirali B; Dutta, Shruti; De, Abhijit

    2015-01-01

    Stem cell therapeutics is the future of regenerative medicine in the modern world. Many studies have been initiated with the hope of translating the outcomes into treatments for several disease conditions, ranging from heart and neuronal disease to malignancies as grave as cancer. Stem cell therapeutics undoubtedly holds great promise on the front of regenerative medicine; however, the correct distribution and homing of these stem cells to the host site could not be observed until recent advances in the discipline of molecular imaging. Herein, we discuss the various imaging guidance approaches applied to determine the proper delivery of the various types of stem cells used as therapeutics for various maladies. Additionally, we scrutinize the use of several indirect labeling mechanisms for efficient tagging of the reporter entity for image guidance. Further, the promise of improving patient healthcare has led to the initiation of several clinical trials worldwide. However, in a number of cases, the benefits come at a price heavy enough to pose a serious health risk, one such being the formation of teratomas. Thus, numerous challenges and methodological obstacles must be overcome before their full clinical impact can be realized. Therefore, we also discuss several clinical trials that have taken into consideration the various image-guided protocols to monitor correct delivery and understand the distribution of therapeutic stem cells in real time.

  6. A new statistical time-dependent model of earthquake occurrence: failure processes driven by a self-correcting model

    NASA Astrophysics Data System (ADS)

    Rotondi, Renata; Varini, Elisa

    2016-04-01

    The long-term recurrence of strong earthquakes is often modelled by the stationary Poisson process for the sake of simplicity, although renewal and self-correcting point processes (with non-decreasing hazard functions) are more appropriate. Short-term models mainly fit earthquake clusters due to the tendency of an earthquake to trigger other earthquakes; in this case, self-exciting point processes with non-increasing hazard are especially suitable. In order to provide a unified framework for analyzing earthquake catalogs, Schoenberg and Bolt proposed the SELC (Short-term Exciting Long-term Correcting) model (BSSA, 2000) and Varini employed a state-space model for estimating the different phases of a seismic cycle (PhD Thesis, 2005). Both attempts are combinations of long- and short-term models, but results are not completely satisfactory, due to the different scales at which these models appear to operate. In this study, we split a seismic sequence into two groups: the leader events, whose magnitude exceeds a threshold magnitude, and the remaining ones, considered subordinate events. The leader events are assumed to follow a well-known self-correcting point process named the stress release model (Vere-Jones, J. Phys. Earth, 1978; Bebbington & Harte, GJI, 2003; Varini & Rotondi, Env. Ecol. Stat., 2015). In the interval between two subsequent leader events, subordinate events are expected to cluster at the beginning (aftershocks) and at the end (foreshocks) of that interval; hence, they are modeled by a failure process that allows a bathtub-shaped hazard function. In particular, we have examined the generalized Weibull distributions, a large family that contains distributions with different bathtub-shaped hazards as well as the standard Weibull distribution (Lai, Springer, 2014). The model is fitted to a dataset of Italian historical earthquakes and the results of Bayesian inference are shown.

  7. Improved atmospheric correction and chlorophyll-a remote sensing models for turbid waters in a dusty environment

    NASA Astrophysics Data System (ADS)

    Al Shehhi, Maryam R.; Gherboudj, Imen; Zhao, Jun; Ghedira, Hosni

    2017-11-01

    This study presents a comprehensive assessment of the performance of the commonly used atmospheric correction models (NIR, SWIR, NIR-SWIR and FM) and ocean color products (OC3 and OC2) derived from MODIS images over the Arabian Gulf, Sea of Oman, and Arabian Sea. The considered atmospheric correction models have been used to derive MODIS normalized water-leaving radiances (nLw), which are compared to in situ water nLw(λ) data collected at different locations by the Masdar Institute, United Arab Emirates, and from the AERONET-OC (the ocean color component of the Aerosol Robotic Network) database. From this comparison, the NIR model has been found to be the best performing of the considered atmospheric correction models, though it in turn shows disparity, especially at short wavelengths (400-500 nm), under high aerosol optical depth conditions (AOT(869) > 0.3) and over turbid waters. To reduce the error induced by these factors, a modified model taking into consideration the atmospheric and water turbidity conditions has been proposed. A turbidity index was used to identify turbid water and a threshold of AOT(869) = 0.3 was used to identify a dusty atmosphere. Despite improved results in the MODIS nLw(λ) using the proposed approach, the Chl-a models (OC3 and OC2) show low performance when compared to the in situ Chl-a measurements collected during several field campaigns organized by local, regional and international organizations. This discrepancy might be caused by the improper parametrization of these models and/or the improper selection of bands. Thus, an adaptive power fit algorithm (R2 = 0.95) has been proposed to improve the estimation of Chl-a concentration from 0.07 to 10 mg/m3 by using a new blue/red MODIS band ratio of (443,488)/645 instead of the default band ratio used for OC3, (443,488)/547. The selection of this new band ratio (443,488)/645 has been based on using band 645 nm, which has been found to represent both water turbidity and algal
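    The power-fit form of such a band-ratio retrieval can be sketched as follows; the coefficients A and B are invented placeholders, not the fitted values from the study:

```python
# Hypothetical power-law retrieval Chl-a = A * R**B on the proposed
# blue/red ratio R = nLw(443,488)/nLw(645). A and B are placeholders;
# the study fits its own coefficients (R2 = 0.95).
A, B = 0.5, -1.8

def chl_a(nlw_blue, nlw_red):
    """Estimate Chl-a (mg/m3) from a blue/red MODIS band ratio."""
    return A * (nlw_blue / nlw_red) ** B

print(chl_a(1.0, 1.0))  # → 0.5
```

    The design choice here mirrors the abstract's argument: moving the denominator to the red band (645 nm) makes the ratio sensitive to turbidity as well as pigment absorption, which matters in these sediment-laden waters.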

  8. The controversial origin of pericytes during angiogenesis - Implications for cell-based therapeutic angiogenesis and cell-based therapies.

    PubMed

    Blocki, Anna; Beyer, Sebastian; Jung, Friedrich; Raghunath, Michael

    2018-01-01

    Pericytes reside within the basement membrane of small vessels and are often in direct cellular contact with endothelial cells, fulfilling important functions during blood vessel formation and homeostasis. Recently, these pericytes have also been identified as mesenchymal stem cells. Mesenchymal stem cells, and especially their specialized subpopulation of pericytes, represent promising candidates for therapeutic angiogenesis applications, and have already been widely applied in pre-clinical and clinical trials. However, cell-based therapies of ischemic diseases (especially of myocardial infarction) have not resulted in significant long-term improvement. Interestingly, pericytes of hematopoietic origin were observed in embryonic skin, and a pericyte sub-population expressing leukocyte and monocyte markers has been described during adult angiogenesis in vivo. Since mesenchymal stem cells do not express hematopoietic markers, the latter cell type might represent an alternative pericyte population relevant to angiogenesis. Therefore, we sourced blood-derived angiogenic cells (BDACs) from monocytes that closely resembled hematopoietic pericytes, which had only been observed in vivo thus far. BDACs displayed many pericytic features and exhibited enhanced revascularization and functional tissue regeneration in a pre-clinical model of critical limb ischemia. Comparison between BDACs and mesenchymal pericytes indicated that BDACs (while resembling hematopoietic pericytes) enhanced early stages of angiogenesis, such as endothelial cell sprouting. In contrast, mesenchymal pericytes were responsible for blood vessel maturation and homeostasis, while reducing endothelial sprouting. Since the formation of new blood vessels is crucial during therapeutic angiogenesis or during integration of implants into the host tissue, hematopoietic pericytes (and therefore BDACs) might offer an advantageous addition or even an alternative for cell-based therapies.

  9. Image-based modeling of tumor shrinkage in head and neck radiation therapy

    PubMed Central

    Chao, Ming; Xie, Yaoqin; Moros, Eduardo G.; Le, Quynh-Thu; Xing, Lei

    2010-01-01

    Purpose: Understanding the kinetics of tumor growth/shrinkage represents a critical step in quantitative assessment of therapeutics and realization of adaptive radiation therapy. This article presents a novel framework for image-based modeling of tumor change and demonstrates its performance with synthetic images and clinical cases. Methods: Due to significant tumor tissue content changes, similarity-based models are not suitable for describing the process of tumor volume changes. Under the hypothesis that tissue features in a tumor volume or at the boundary region are partially preserved, the kinetic change was modeled in two steps: (1) Autodetection of homologous tissue features shared by two input images using the scale-invariant feature transform (SIFT) method; and (2) establishment of a voxel-to-voxel correspondence between the images for the remaining spatial points by interpolation. The correctness of the tissue feature correspondence was assured by a bidirectional association procedure, where SIFT features were mapped from template to target images and in reverse. A series of digital phantom experiments and five head and neck clinical cases were used to assess the performance of the proposed technique. Results: The proposed technique can faithfully identify the known changes introduced when constructing the digital phantoms. The subsequent feature-guided thin plate spline calculation reproduced the “ground truth” with accuracy better than 1.5 mm. For the clinical cases, the new algorithm worked reliably for a volume change as large as 30%. Conclusions: An image-based tumor kinetic algorithm was developed to model the tumor response to radiation therapy. The technique provides a practical framework for future application in adaptive radiation therapy. PMID:20527569
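
    The bidirectional association step can be illustrated with a toy mutual-nearest-neighbour filter over feature descriptors (a generic sketch; the paper's actual SIFT pipeline and thin-plate-spline interpolation are not reproduced here):

```python
import numpy as np

def mutual_matches(desc_a, desc_b):
    """Keep only bidirectionally consistent feature matches.

    desc_a, desc_b: (Na, d) and (Nb, d) descriptor arrays (e.g. from SIFT).
    A pair (i, j) is retained only if j is the nearest neighbour of i in B
    AND i is the nearest neighbour of j in A, mirroring a template->target
    and target->template association check.
    """
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    fwd = d.argmin(axis=1)   # best match in B for each feature in A
    bwd = d.argmin(axis=0)   # best match in A for each feature in B
    return [(i, j) for i, j in enumerate(fwd) if bwd[j] == i]

# Tiny hypothetical descriptors: each feature in A has a close partner in B
a = np.array([[0.0, 0.0], [1.0, 1.0]])
b = np.array([[1.1, 0.9], [0.1, 0.0]])
print(mutual_matches(a, b))  # [(0, 1), (1, 0)]
```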

  10. Evaluation of simulation-based scatter correction for 3-D PET cardiac imaging

    NASA Astrophysics Data System (ADS)

    Watson, C. C.; Newport, D.; Casey, M. E.; deKemp, R. A.; Beanlands, R. S.; Schmand, M.

    1997-02-01

    Quantitative imaging of the human thorax poses one of the most difficult challenges for three-dimensional (3-D) (septaless) positron emission tomography (PET), due to the strong attenuation of the annihilation radiation and the large contribution of scattered photons to the data. In [18F]fluorodeoxyglucose (FDG) studies of the heart with the patient's arms in the field of view, the contribution of scattered events can exceed 50% of the total detected coincidences. Accurate correction for this scatter component is necessary for meaningful quantitative image analysis and tracer kinetic modeling. For this reason, the authors have implemented a single-scatter simulation technique for scatter correction in positron volume imaging. Here, they describe this algorithm and present scatter correction results from human and chest phantom studies.

  11. PDF-based heterogeneous multiscale filtration model.

    PubMed

    Gong, Jian; Rutland, Christopher J

    2015-04-21

    Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
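
    The core idea, replacing a single mean collector with a PDF-weighted sum of per-collector contributions, can be sketched as follows (the single-collector efficiency model and all numbers are illustrative placeholders, not the HMF model's actual unit-collector theory):

```python
import numpy as np

def hmf_efficiency(pore_diams, pdf_weights, eta_single):
    """Illustrative heterogeneous multiscale filtration estimate.

    Instead of one mean collector size, the porous wall is described by a
    discrete pore-size PDF; total efficiency is the PDF-weighted sum of the
    contributions of individual collectors. eta_single(d) can be any
    single-collector efficiency model (a placeholder is used below).
    """
    w = pdf_weights / pdf_weights.sum()       # normalize the discrete PDF
    return float(np.sum(w * eta_single(pore_diams)))

# Hypothetical single-collector model: smaller pores capture better
eta = lambda d: 1.0 - np.exp(-1.0 / d)
d = np.array([5.0, 10.0, 20.0])               # pore diameters (um), illustrative
p = np.array([0.2, 0.5, 0.3])                 # pore-size PDF weights
print(hmf_efficiency(d, p, eta))
```

    When the PDF collapses to a single pore size, the sum reduces to the classic mean-collector result, which is the limiting behaviour the sensitivity analysis in the abstract describes.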

  12. A modified adjoint-based grid adaptation and error correction method for unstructured grid

    NASA Astrophysics Data System (ADS)

    Cui, Pengcheng; Li, Bin; Tang, Jing; Chen, Jiangtao; Deng, Youqi

    2018-05-01

    Grid adaptation is an important strategy to improve the accuracy of output functions (e.g. drag, lift, etc.) in computational fluid dynamics (CFD) analysis and design applications. This paper presents a modified robust grid adaptation and error correction method for reducing simulation errors in integral outputs. The procedure is based on discrete adjoint optimization theory, in which the estimated global error of the output functions can be directly related to the local residual error. According to this relationship, the local residual error contribution can be used as an indicator in a grid adaptation strategy designed to generate refined grids for accurately estimating the output functions. This grid adaptation and error correction method is applied to subsonic and supersonic simulations around three-dimensional configurations. Numerical results demonstrate that the grid regions sensitive to the output functions are detected and refined after grid adaptation, and the accuracy of the output functions is clearly improved after error correction. The proposed grid adaptation and error correction method compares very favorably, in terms of output accuracy and computational efficiency, with traditional feature-based grid adaptation.
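
    The adjoint-weighted residual indicator at the heart of such methods can be sketched generically (array names and values are illustrative; this is not the paper's discrete implementation):

```python
import numpy as np

def adjoint_error_indicator(residual, adjoint):
    """Adjoint-weighted residual: per-cell contribution to the output error.

    delta_J ~ -psi^T R(u_H): each cell's local residual R, weighted by the
    discrete adjoint psi of the output functional, estimates that cell's
    contribution to the global output error. The summed contribution gives
    the error correction; the magnitudes drive refinement.
    """
    contrib = -adjoint * residual       # signed per-cell error estimate
    indicator = np.abs(contrib)         # refinement indicator
    return contrib.sum(), indicator

# Toy per-cell residuals and adjoint weights
res = np.array([1e-3, -2e-3, 5e-4])
psi = np.array([2.0, 0.5, 4.0])
err, ind = adjoint_error_indicator(res, psi)
print(err, ind.argmax())    # estimated output error; cell to refine first
```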

  13. Geological Corrections in Gravimetry

    NASA Astrophysics Data System (ADS)

    Mikuška, J.; Marušiak, I.

    2015-12-01

    Applying corrections for the known geology to gravity data can be traced back to the first quarter of the 20th century. Later on, mostly in areas with sedimentary cover, at local and regional scales, the correction known as gravity stripping has been in use since the mid 1960s, provided that there was enough geological information. Stripping at regional to global scales became possible after the release of the CRUST 2.0 and later CRUST 1.0 models in the years 2000 and 2013, respectively. Especially the latter model provides quite a new view of the relevant geometries and of the topographic and crustal densities, as well as of the crust/mantle density contrast. Thus, the isostatic corrections, which have often been used in the past, can now be replaced by procedures working with independent information interpreted primarily from seismic studies. We have developed software for performing geological corrections in the space domain, based on a priori geometry and density grids which can be of either rectangular or spherical/ellipsoidal types, with cells in the shapes of rectangles, tesseroids or triangles. It enables us to calculate the required gravitational effects not only in the form of surface maps or profiles but, for instance, also along vertical lines, which can shed some additional light on the nature of the geological correction. The software can work at a variety of scales and considers the input information to an optional distance from the calculation point, up to the antipodes. Our main objective is to treat the geological correction as an alternative to accounting for the topography with varying densities, since the bottoms of the topographic masses, namely the geoid or ellipsoid, generally do not represent geological boundaries. We would also like to call attention to the possible distortions of the corrected gravity anomalies. This work was supported by the Slovak Research and Development Agency under the contract APVV-0827-12.

  14. Accelerating EPI distortion correction by utilizing a modern GPU-based parallel computation.

    PubMed

    Yang, Yao-Hao; Huang, Teng-Yi; Wang, Fu-Nien; Chuang, Tzu-Chao; Chen, Nan-Kuei

    2013-04-01

    The combination of phase demodulation and field mapping is a practical method to correct echo planar imaging (EPI) geometric distortion. However, since phase dispersion accumulates in each phase-encoding step, the calculation complexity of phase demodulation is Ny-fold higher than that of conventional image reconstructions. Thus, correcting EPI images via phase demodulation is generally a time-consuming task. Parallel computing employing general-purpose calculations on graphics processing units (GPU) can accelerate scientific computing if the algorithm is parallelized. This study proposes a method that incorporates the GPU-based technique into the phase demodulation calculations to reduce computation time. The proposed parallel algorithm was applied to a PROPELLER-EPI diffusion tensor data set. The GPU-based phase demodulation method correctly reduced the EPI distortion and accelerated the computation. The total reconstruction time of the 16-slice PROPELLER-EPI diffusion tensor images with matrix size of 128 × 128 was reduced from 1,754 seconds to 101 seconds by utilizing the parallelized 4-GPU program. GPU computing is a promising method to accelerate EPI geometric correction. The resulting reduction in the computation time of phase demodulation should accelerate postprocessing for studies performed with EPI, and should help make the PROPELLER-EPI technique practical for clinical use. Copyright © 2011 by the American Society of Neuroimaging.
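
    A naive (CPU, unparallelized) version of field-map-based phase demodulation, in the conjugate-phase style, illustrates why the cost grows Ny-fold: every phase-encoding line needs its own demodulation. Sign conventions and variable names below are assumptions, not the authors' exact formulation:

```python
import numpy as np

def phase_demodulate(kspace, fieldmap_hz, esp):
    """Naive EPI geometric-distortion correction by phase demodulation.

    For each phase-encoding line ky, the off-resonance phase accumulated by
    time t_ky = ky * echo spacing is removed before summing the partial
    images, hence the Ny-fold cost compared with a single inverse FFT.
    kspace: (Ny, Nx) complex array; fieldmap_hz: (Ny, Nx) off-resonance
    map in Hz; esp: echo spacing in seconds.
    """
    ny, nx = kspace.shape
    img = np.zeros((ny, nx), dtype=complex)
    for ky in range(ny):                        # one demodulation per ky line
        phase = np.exp(2j * np.pi * fieldmap_hz * ky * esp)
        line = np.zeros_like(kspace)
        line[ky] = kspace[ky]
        img += np.fft.ifft2(line) * phase       # demodulate this line's share
    return img

k = np.fft.fft2(np.eye(4))                      # toy k-space
img = phase_demodulate(k, np.zeros((4, 4)), 5e-4)   # zero field map: plain IFFT
```

    With a zero field map the loop collapses, by linearity, to an ordinary inverse FFT; the per-line structure is exactly what makes the method embarrassingly parallel on a GPU.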

  15. Aligning observed and modelled behaviour based on workflow decomposition

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are mostly supported by information systems, the availability of event logs generated by these systems, as well as the requirement for appropriate process models, is increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the amount of event logs. Therefore, a new process mining technique is proposed in this paper, based on a workflow decomposition method. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is then investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
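
    Conformance checking of a log trace against a workflow net can be illustrated with a toy token-replay fitness score (a deliberately simplified stand-in; the paper's state-equation-based alignment is more involved):

```python
from collections import Counter

def token_replay(trace, net, start="p0", end="pN"):
    """Toy token-replay conformance check for a workflow net.

    net maps each transition to (input places, output places). Tokens that
    must be created artificially ("missing") or are left over ("remaining")
    measure the misalignment between the log trace and the model.
    """
    marking = Counter({start: 1})
    produced = consumed = 1            # count the initial and final token
    missing = 0
    for t in trace:
        ins, outs = net[t]
        for p in ins:
            if marking[p] == 0:
                missing += 1           # token had to be invented
            else:
                marking[p] -= 1
            consumed += 1
        for p in outs:
            marking[p] += 1
            produced += 1
    marking[end] -= 1                  # consume the final token
    remaining = sum(v for v in marking.values() if v > 0)
    return 0.5 * (1 - missing / consumed) + 0.5 * (1 - remaining / produced)

# Hypothetical two-transition workflow net: p0 -a-> p1 -b-> pN
net = {"a": (["p0"], ["p1"]), "b": (["p1"], ["pN"])}
print(token_replay(["a", "b"], net))   # 1.0 for a perfectly fitting trace
```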

  16. Emergence of spacetime dynamics in entropy corrected and braneworld models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheykhi, A.; Dehghani, M.H.; Hosseini, S.E., E-mail: asheykhi@shirazu.ac.ir, E-mail: mhd@shirazu.ac.ir, E-mail: elahehhosseini90@gmail.com

    2013-04-01

    A very interesting new proposal on the origin of the cosmic expansion was recently suggested by Padmanabhan [arXiv:1206.4916]. He argued that the difference between the surface degrees of freedom and the bulk degrees of freedom in a region of space drives the accelerated expansion of the universe, as well as the standard Friedmann equation, through the relation ΔV = Δt(N_sur − N_bulk). In this paper, we first present the general expression for the number of degrees of freedom on the holographic surface, N_sur, using the general entropy-corrected formula S = A/(4L_p^2) + s(A). Then, as two examples, by applying Padmanabhan's idea we extract the corresponding Friedmann equations in the presence of power-law and logarithmic correction terms in the entropy. We also extend the study to RS II and DGP braneworld models and successfully derive the correct form of the Friedmann equations in these theories. Our study further supports the viability of Padmanabhan's proposal.
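
    For orientation, in the uncorrected case s(A) = 0, with units ℏ = c = 1 (the abstract's form of the relation corresponds to setting L_p = 1), the substitution runs as follows; this is a standard sketch of Padmanabhan's argument, not the paper's entropy-corrected derivation:

```latex
% Holographic counts for the uncorrected entropy S = A / (4 L_p^2):
N_{\mathrm{sur}} = \frac{4\pi}{L_p^{2} H^{2}}, \qquad
N_{\mathrm{bulk}} = -\frac{2(\rho + 3p)V}{T}, \qquad
T = \frac{H}{2\pi}, \qquad V = \frac{4\pi}{3H^{3}}
% Substituting into dV/dt = L_p^{2}\,(N_{\mathrm{sur}} - N_{\mathrm{bulk}}):
\frac{\ddot{a}}{a} = \dot{H} + H^{2} = -\frac{4\pi L_p^{2}}{3}\,(\rho + 3p)
```

    which is the standard acceleration equation; the entropy-corrected and braneworld cases modify N_sur and hence the resulting Friedmann equations.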

  17. Magnetic responsive cell based strategies for diagnostic and therapeutics.

    PubMed

    Gonçalves, Ana I; Miranda, Margarida S; Rodrigues, Márcia T; Reis, Rui Luis; Gomes, Manuela

    2018-05-24

    The potential of magnetically assisted strategies within the remit of cell-based therapies is increasing and creates new opportunities in biomedical platforms and in the field of tissue engineering and regenerative medicine (TERM). Among the magnetic elements used to build magnetically responsive strategies, superparamagnetic iron oxide nanoparticles (SPIONs) represent tunable and precise tools whose properties can be modelled for detection, diagnosis, targeting and therapy purposes. The most investigated clinical role of SPIONs is as contrast imaging agents for tracking and monitoring cells and tissues. Nevertheless, magnetic detection also includes biomarker mapping, cell labelling and cell/drug targeting to monitor cell events and anticipate the disruption of homeostatic conditions and the progression of disease. Additionally, isolation and screening techniques for cell subsets in heterogeneous populations, or for proteins of interest, have been explored in a magnetic sorting context. More recently, SPION-based technologies have been applied to stimulate cell differentiation and mechanotransduction processes and to transport genetic or drug cargo, to study biological mechanisms and contribute to improved therapies. Magnetically based strategies contribute significantly to magnetic tissue engineering (magTE), in which magnetically responsive actuators built from magnetically labelled cells or magnetically functionalized systems can be remotely controlled and spatially manipulated upon the actuation of an external magnetic field for the delivery or targeting of TE solutions. SPION functionalities combined with magnetic responsiveness in multifactorial magnetically assisted platforms can revolutionize diagnosis and therapeutics, providing new diagnostic and theranostic tools, encouraging regenerative medicine approaches and holding potential for more effective therapies. This review will address the contribution of SPIONs based technologies as

  18. A corrected formulation for marginal inference derived from two-part mixed models for longitudinal semi-continuous data.

    PubMed

    Tom, Brian Dm; Su, Li; Farewell, Vernon T

    2016-10-01

    For semi-continuous data which are a mixture of true zeros and continuously distributed positive values, the use of two-part mixed models provides a convenient modelling framework. However, deriving population-averaged (marginal) effects from such models is not always straightforward. Su et al. presented a model that provided convenient estimation of marginal effects for the logistic component of the two-part model but the specification of marginal effects for the continuous part of the model presented in that paper was based on an incorrect formulation. We present a corrected formulation and additionally explore the use of the two-part model for inferences on the overall marginal mean, which may be of more practical relevance in our application and more generally. © The Author(s) 2013.
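
    The corrected marginal mean for a two-part model with a log-normal continuous part can be sketched as follows (fixed effects only; the random-effects structure of the actual mixed model is omitted, and all names and numbers are illustrative):

```python
import numpy as np

def marginal_mean(x, beta_logit, beta_ln, sigma2):
    """Overall marginal mean of a semi-continuous outcome.

    Two-part model: a logistic part for P(Y > 0) and a log-normal model for
    Y | Y > 0. The marginal mean is their product, including the usual
    exp(sigma^2 / 2) log-normal back-transformation term; omitting such a
    term is the kind of formulation error a corrected specification fixes.
    """
    p_pos = 1.0 / (1.0 + np.exp(-x @ beta_logit))   # P(Y > 0)
    mean_pos = np.exp(x @ beta_ln + sigma2 / 2.0)   # E[Y | Y > 0]
    return p_pos * mean_pos

x = np.array([1.0, 0.5])            # intercept + one covariate (illustrative)
print(marginal_mean(x, np.array([0.2, 0.4]), np.array([1.0, -0.3]), 0.25))
```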

  19. A corrected formulation for marginal inference derived from two-part mixed models for longitudinal semi-continuous data

    PubMed Central

    Su, Li; Farewell, Vernon T

    2013-01-01

    For semi-continuous data which are a mixture of true zeros and continuously distributed positive values, the use of two-part mixed models provides a convenient modelling framework. However, deriving population-averaged (marginal) effects from such models is not always straightforward. Su et al. presented a model that provided convenient estimation of marginal effects for the logistic component of the two-part model but the specification of marginal effects for the continuous part of the model presented in that paper was based on an incorrect formulation. We present a corrected formulation and additionally explore the use of the two-part model for inferences on the overall marginal mean, which may be of more practical relevance in our application and more generally. PMID:24201470

  20. Correcting the Alar Base Retraction in Crooked Nose by Dissection of Levator Alaque Nasi Muscle.

    PubMed

    Taş, Süleyman

    2016-10-01

    Nasal base retraction results from cephalic malposition of the alar base in the vertical plane, causing disharmonies in the alar base. In the literature, there are some excisional procedures to correct this deformity, but they may result in nostril distortion, stenosis, or upper lip elevation. Here, a new technique is reported for the correction of nasal base retraction in the crooked nose by manipulating the levator labii alaeque nasi muscle. Sixteen patients, 6 women and 10 men ranging in age from 21 to 42 years, who had alar retraction with a crooked nose, were operated on, with a follow-up period of 12 months. Preoperative and postoperative frontal, profile, base, and oblique base views were taken in a standard manner and analyzed with Image software. Comparison of preoperative and postoperative photographs demonstrated that nasal base retractions were corrected in all cases without distortion or recurrence. Nasal obstruction was reduced after surgery, and self-evaluation of nasal patency scores significantly increased in all patients (P < 0.001). Functional and aesthetic outcomes were satisfactory for the surgeons and the patients. Careful analysis to identify the deformity and proper selection of the technique will ensure a pleasing outcome. The new techniques presented for the correction of nasal base retraction and prevention of the recurrence of the dorsal deviation will help rhinoplasty surgeons obtain pleasing outcomes.

  1. SU-E-T-226: Correction of a Standard Model-Based Dose Calculator Using Measurement Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, M; Jiang, S; Lu, W

    Purpose: To propose a hybrid method that combines the advantages of model-based and measurement-based methods for independent dose calculation. Model-based dose calculation, such as collapsed-cone convolution/superposition (CCCS) or the Monte Carlo method, models dose deposition in the patient body accurately; however, due to lack of detailed knowledge about the linear accelerator (LINAC) head, commissioning for an arbitrary machine is tedious and challenging in case of hardware changes. On the contrary, the measurement-based method characterizes the beam properties accurately but lacks the capability of modeling dose deposition in heterogeneous media. Methods: We used a standard CCCS calculator, commissioned with published data, as the standard model calculator. For a given machine, water phantom measurements were acquired. A set of dose distributions was also calculated using the CCCS for the same setups. The differences between the measurements and the CCCS results were tabulated and used as the commissioning data for a measurement-based calculator, here a direct-ray-tracing calculator (ΔDRT). The proposed independent dose calculation consists of the following steps: (1) calculate D_model using the CCCS; (2) calculate D_ΔDRT using the ΔDRT; (3) combine them as D = D_model + D_ΔDRT. Results: The hybrid dose calculation was tested on digital phantoms and patient CT data for standard fields and an IMRT plan. The results were compared to dose calculated by the treatment planning system (TPS). The agreement of the hybrid method and the TPS was within 3%, 3 mm for over 98% of the volume in the phantom studies and lung patients. Conclusion: The proposed hybrid method uses the same commissioning data as the measurement-based method and can be easily extended to any non-standard LINAC. The results met the accuracy, independence, and simple-commissioning criteria for an independent dose calculator.
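
    The three steps can be sketched schematically (all names, numbers, and the ray-tracing stand-in are illustrative placeholders, not the authors' implementation):

```python
import numpy as np

def hybrid_dose(d_model, meas_table, model_table, ray_trace):
    """Hybrid independent dose check, following the abstract's three steps.

    (1) d_model: dose from the standard model-based engine (e.g. CCCS).
    (2) The measurement-based correction engine is commissioned on the
        difference between water-phantom measurements and the model's own
        prediction of the same setups (meas_table - model_table).
    (3) Final dose = model dose + ray-traced correction. ray_trace is a
        placeholder for the direct-ray-tracing (Delta-DRT) calculation.
    """
    delta_commission = meas_table - model_table   # commissioning data
    d_delta = ray_trace(delta_commission)         # Delta-DRT correction dose
    return d_model + d_delta

# Toy numbers: the model underestimates machine output by 2%
d_m = np.array([1.00, 0.80])
corr = hybrid_dose(d_m, np.array([1.02]), np.array([1.00]),
                   lambda delta: d_m * delta[0])
print(corr)
```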

  2. Cannabinoid Receptor 2 Participates in Amyloid-β Processing in a Mouse Model of Alzheimer's Disease but Plays a Minor Role in the Therapeutic Properties of a Cannabis-Based Medicine.

    PubMed

    Aso, Ester; Andrés-Benito, Pol; Carmona, Margarita; Maldonado, Rafael; Ferrer, Isidre

    2016-01-01

    The endogenous cannabinoid system represents a promising therapeutic target to modify neurodegenerative pathways linked to Alzheimer's disease (AD). The aim of the present study was to evaluate the specific contribution of the CB2 receptor to the progression of AD-like pathology and its role in the positive effect of a cannabis-based medicine (1:1 combination of Δ9-tetrahydrocannabinol and cannabidiol) previously demonstrated to be beneficial in the AβPP/PS1 transgenic model of the disease. A new mouse strain was generated by crossing AβPP/PS1 transgenic mice with CB2 knockout mice. Results show that lack of CB2 exacerbates cortical Aβ deposition and increases the levels of soluble Aβ40. However, CB2 receptor deficiency does not affect the viability of AβPP/PS1 mice, does not accelerate their memory impairment, does not modify tau hyperphosphorylation in dystrophic neurites associated with Aβ plaques, and does not attenuate the positive cognitive effect induced by the cannabis-based medicine in these animals. These findings suggest a minor role for the CB2 receptor in the therapeutic effect of the cannabis-based medicine in AβPP/PS1 mice, but also constitute evidence of a link between the CB2 receptor and Aβ processing.

  3. Anti-infective therapeutics from the Lepidopteran model host Galleria mellonella.

    PubMed

    Vilcinskas, Andreas

    2011-01-01

    The larvae of the greater wax moth Galleria mellonella are increasingly used both as surrogate alternative model hosts for human pathogens and as a whole-animal high-throughput system for in vivo testing of antibiotics or mutant libraries of pathogens. In addition, a broad spectrum of antimicrobial peptides and proteins has been identified in this insect during the past decade, among which some appear to be specific for Lepidoptera. Its arsenal of immunity-related effector molecules encompasses peptides and proteins exhibiting potent activity against bacteria, fungi or both, whose potential as new anti-infective therapeutics is presently being explored. Of particular interest is the insect metalloproteinase inhibitor (IMPI), which has been discovered in G. mellonella. The IMPI exhibits a specific and potent activity against thermolysin-like microbial metalloproteinases, including a number of prominent virulence and/or pathogenic factors of human pathogens which are responsible for severe symptoms such as septicemia, hemorrhagic tissue bleeding, necrosis and enhancement of vascular permeability. The IMPI and antimicrobial peptides from G. mellonella may provide promising templates for the rational design of new drugs, since evidence is available that the combination of antibiotics with inhibitors of pathogen-associated proteolytic enzymes yields synergistic therapeutic effects. The potential and limitations of insect-derived gene-encoded antimicrobial compounds as anti-infective therapeutics are discussed.

  4. Mapping ASTI patient’s therapeutic-data model to virtual Medical Record: can VMR represent therapeutic data elements used by ASTI in clinical guideline implementations?

    PubMed Central

    Ebrahiminia, Vahid; Yasini, Mobin; Lamy, Jean Baptiste

    2013-01-01

    Lack of interoperability between health information systems is a major obstacle in implementing clinical decision support systems (CDSS) and their widespread dissemination. The Virtual Medical Record (vMR) proposed by HL7 is a common data model for representing the clinical information inputs and outputs used by CDSS and local clinical systems. A CDSS called ASTI uses a similar model to represent the clinical data and therapeutic history of a patient. In order to evaluate the compatibility of ASTI with the vMR, we mapped the ASTI model of representing a patient's therapeutic data to the vMR. We compared the data elements and associated terminologies used in ASTI and the vMR, and we evaluated the semantic fidelity between the models. Only one data element, the qualitative description of drug dosage, did not match the vMR model; however, it can be calculated in the execution engine. The semantic fidelity was satisfactorily preserved in 12 of the 17 elements mapped between the models. This model of ASTI thus seems compatible with the vMR. Further work is necessary to evaluate the compatibility of the clinical data model of ASTI with the vMR and the use of the vMR in implementing practice guidelines. PMID:24551344

  5. Inhibitor-Based Therapeutics for Treatment of Viral Hepatitis.

    PubMed

    Dey, Debajit; Banerjee, Manidipa

    2016-09-28

    Viral hepatitis remains a significant worldwide threat, in spite of the availability of several successful therapeutic and vaccination strategies. Complications associated with acute and chronic infections, such as liver failure, cirrhosis and hepatocellular carcinoma, are the cause of considerable morbidity and mortality. Given the significant burden on the healthcare system caused by viral hepatitis, it is essential that novel, more effective therapeutics be developed. The present review attempts to summarize the current treatments against viral hepatitis, and provides an outline for upcoming, promising new therapeutics. Development of novel therapeutics requires an understanding of the viral life cycles and viral effectors in molecular detail. As such, this review also discusses virally-encoded effectors, found to be essential for virus survival and replication in the host milieu, which may be utilized as potential candidates for development of alternative therapies in the future.

  6. Bottleneck limitations for microRNA-based therapeutics from bench to the bedside.

    PubMed

    Chen, Yan; Zhao, Hongliang; Tan, Zhijun; Zhang, Cuiping; Fu, Xiaobing

    2015-03-01

    MicroRNAs are endogenous non-coding small RNAs that repress the expression of a broad array of target genes. Research into the role and underlying molecular events of microRNAs in disease processes, and into the potential of microRNAs as drug targets, has expanded rapidly. Significant advances have been made in identifying the associations of microRNAs with cancers, viral infections, immune diseases, cardiovascular diseases, wound healing, biological development and other areas of medicine. However, because of intense competition and financial risks, there is a series of stringent criteria and conditions that must be met before microRNA-based therapeutics can be pursued as new drug candidates. In this review, we specifically emphasize the obstacles in moving microRNA-based therapeutics from the bench to the bedside, including common barriers in basic research and limitations in the move to the clinic with respect to vector delivery, off-target effects, toxicity mitigation, immunological activation and dosage determination, which should be overcome before microRNA-based therapeutics take their place in the clinic.

  7. Bias correction of temperature produced by the Community Climate System Model using Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Moghim, S.; Hsu, K.; Bras, R. L.

    2013-12-01

    General Circulation Models (GCMs) are used to predict circulation and energy transfers between the atmosphere and the land. It is known that these models produce biased results that affect their use. This work proposes a new method for bias correction: the equidistant cumulative distribution function-artificial neural network (EDCDFANN) procedure. The method uses artificial neural networks (ANNs) as a surrogate model to estimate bias-corrected temperature, given an identification of the system derived from GCM output variables. A two-layer feed-forward neural network is trained with observations during a historical period, and the adjusted network can then be used to predict bias-corrected temperature for future periods. To capture extreme values, this method is combined with the equidistant CDF matching method (EDCDF, Li et al. 2010). The proposed method is tested on Community Climate System Model (CCSM3) outputs using air and skin temperature, specific humidity, and shortwave and longwave radiation as inputs to the ANN. The method decreases the mean square error and increases the spatial correlation between the modeled temperature and the observed one. The results indicate that the EDCDFANN has the potential to remove the biases of the model outputs.
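
    The EDCDF component alone (without the ANN surrogate) can be sketched with a simple empirical quantile mapping; this is a rough illustration of the CDF-matching idea, not the paper's EDCDFANN procedure:

```python
import numpy as np

def edcdf_correct(model_future, model_hist, obs_hist):
    """Simplified equidistant CDF matching (EDCDF) bias correction.

    For each future model value, take its empirical quantile q within the
    future series, then add the observed-minus-modelled difference at that
    same quantile in the historical period:
        x_adj = x + [F_obs^-1(q) - F_mod^-1(q)].
    """
    n = len(model_future)
    q = (np.argsort(np.argsort(model_future)) + 0.5) / n   # empirical quantiles
    obs_q = np.quantile(obs_hist, q)       # observed value at quantile q
    mod_q = np.quantile(model_hist, q)     # modelled value at quantile q
    return model_future + (obs_q - mod_q)

# Toy temperatures (K offsets): the model runs 2 K cold at every quantile
mf = np.array([21.0, 24.0, 27.0])   # model, future period
mh = np.array([20.0, 23.0, 26.0])   # model, historical period
oh = np.array([22.0, 25.0, 28.0])   # observations, historical period
print(edcdf_correct(mf, mh, oh))    # each value shifted up by the 2 K bias
```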

  8. Cell-based therapeutics from an economic perspective: primed for a commercial success or a research sinkhole?

    PubMed

    McAllister, Todd N; Dusserre, Nathalie; Maruszewski, Marcin; L'heureux, Nicolas

    2008-11-01

    Despite widespread hype and significant investment through the late 1980s and 1990s, cell-based therapeutics have largely failed from both a clinical and financial perspective. While the early pioneers were able to create clinically efficacious products, small margins coupled with small initial indications made it impossible to produce a reasonable return on the huge initial investments that had been made to support widespread research activities. Even as US FDA clearance opened up larger markets, investor interest waned, and the crown jewels of cell-based therapeutics went bankrupt or were rescued by corporate bailout. Despite the hard lessons learned from these pioneering companies, many of today's regenerative medicine companies are supporting nearly identical strategies. It remains to be seen whether or not our proposed tenets for investment and commercialization strategy yield an economic success or whether the original model can produce a return on investment sufficient to justify the large up-front investments. Irrespective of which approach yields a success, it is critically important that more of the second-generation products establish profitability if the field is to enjoy continued investment from both public and private sectors.

  9. Robust Approach for Nonuniformity Correction in Infrared Focal Plane Array.

    PubMed

    Boutemedjet, Ayoub; Deng, Chenwei; Zhao, Baojun

    2016-11-10

    In this paper, we propose a new scene-based nonuniformity correction technique for infrared focal plane arrays. Our work builds on two well-known scene-based methods: an adaptive method and an interframe registration-based method that exploits a pure-translation motion model between frames. Each approach has benefits and drawbacks that make it highly effective under certain conditions and poorly suited to others. We therefore developed a method that is robust to conditions that may slow or disturb the correction process, by introducing a decision criterion that switches to the most effective technique to ensure fast and reliable correction. In addition, problems such as bad pixels and ghosting artifacts are addressed to enhance the overall quality of the correction. The performance of the proposed technique is investigated and compared to the two state-of-the-art techniques cited above.
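
    The adaptive branch of such a scheme is typically an LMS-style recursion. A minimal sketch, assuming a Scribner-style adaptive NUC in which each pixel's gain and offset are nudged so the corrected pixel tracks a spatially smoothed (locally uniform) version of the corrected frame; the decision criterion and registration-based branch of the paper are not shown.

```python
import numpy as np

def lms_nuc(frames, eta=1e-5):
    """Adaptive (LMS) scene-based nonuniformity correction sketch.

    Corrected output is y = g*x + o per pixel; gain g and offset o are
    updated toward a 3x3 box-blurred "desired" image of each frame.
    """
    g = np.ones_like(frames[0], dtype=float)
    o = np.zeros_like(frames[0], dtype=float)
    for x in frames:
        y = g * x + o
        # 3x3 box blur of the corrected frame as the desired estimate
        yp = np.pad(y, 1, mode='edge')
        d = sum(yp[i:i + y.shape[0], j:j + y.shape[1]]
                for i in range(3) for j in range(3)) / 9.0
        e = d - y                 # error toward local uniformity
        g += eta * e * x          # LMS gain update
        o += eta * e              # LMS offset update
    return g, o
```

    On a sequence dominated by fixed-pattern noise, the recursion suppresses the high-frequency part of the pattern; a low-frequency residual (the blurred pattern) remains, which is one reason hybrid schemes like the one above are attractive.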

  10. Robust Approach for Nonuniformity Correction in Infrared Focal Plane Array

    PubMed Central

    Boutemedjet, Ayoub; Deng, Chenwei; Zhao, Baojun

    2016-01-01

    In this paper, we propose a new scene-based nonuniformity correction technique for infrared focal plane arrays. Our work builds on two well-known scene-based methods: an adaptive method and an interframe registration-based method that exploits a pure-translation motion model between frames. Each approach has benefits and drawbacks that make it highly effective under certain conditions and poorly suited to others. We therefore developed a method that is robust to conditions that may slow or disturb the correction process, by introducing a decision criterion that switches to the most effective technique to ensure fast and reliable correction. In addition, problems such as bad pixels and ghosting artifacts are addressed to enhance the overall quality of the correction. The performance of the proposed technique is investigated and compared to the two state-of-the-art techniques cited above. PMID:27834893

  11. Full-wave acoustic and thermal modeling of transcranial ultrasound propagation and investigation of skull-induced aberration correction techniques: a feasibility study.

    PubMed

    Kyriakou, Adamos; Neufeld, Esra; Werner, Beat; Székely, Gábor; Kuster, Niels

    2015-01-01

    Transcranial focused ultrasound (tcFUS) is an attractive noninvasive modality for neurosurgical interventions. The presence of the skull, however, compromises the efficiency of tcFUS therapy, as its heterogeneous nature and acoustic characteristics induce significant distortion of the acoustic energy deposition, focal shifts, and thermal gain decrease. Phased-array transducers allow for partial compensation of skull-induced aberrations by application of precalculated phase and amplitude corrections. An integrated numerical framework allowing for 3D full-wave, nonlinear acoustic and thermal simulations has been developed and applied to tcFUS. Simulations were performed to investigate the impact of skull aberrations, the possibility of extending the treatment envelope, and adverse secondary effects. The simulated setup comprised an idealized model of the ExAblate Neuro and a detailed MR-based anatomical head model. Four different approaches were employed to calculate aberration corrections (analytical calculation of the aberration corrections disregarding tissue heterogeneities; a semi-analytical ray-tracing approach compensating for the presence of the skull; and two simulation-based time-reversal approaches, with and without pressure amplitude corrections, which account for the entire anatomy). The impact of these approaches on the pressure and temperature distributions was evaluated for 22 brain targets. While the (semi-)analytical approaches failed to induce high pressure or ablative temperatures in any targets except those in the close vicinity of the geometric focus, the simulation-based approaches indicate the possibility of considerably extending the treatment envelope (including targets below the transducer level and locations several centimeters off the geometric focus), generation of sharper foci, and increased targeting accuracy. While the prediction of achievable aberration correction appears to be unaffected by the detailed bone-structure, proper consideration of

  12. Neuropeptide Y, resilience, and PTSD therapeutics.

    PubMed

    Kautz, Marin; Charney, Dennis S; Murrough, James W

    2017-05-10

    Resilience to traumatic stress is a complex psychobiological process that protects individuals from developing posttraumatic stress disorder (PTSD) or other untoward consequences of exposure to extreme stress, including depression. Progress in translational research points toward the neuropeptide Y (NPY) system - among others - as a key mediator of stress response and as a potential therapeutic focus for PTSD. Substantial preclinical evidence supports the role of NPY in the modulation of stress response and in the regulation of anxiety in animal models. Clinical studies testing the safety and efficacy of modulating the NPY system in humans, however, have lagged behind. In the current article, we review the evidence base for targeting the NPY system as a therapeutic approach in PTSD, and consider impediments and potential solutions to therapeutic development. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  13. Evaluation and automatic correction of metal-implant-induced artifacts in MR-based attenuation correction in whole-body PET/MR imaging

    NASA Astrophysics Data System (ADS)

    Schramm, G.; Maus, J.; Hofheinz, F.; Petr, J.; Lougovski, A.; Beuthien-Baumann, B.; Platzek, I.; van den Hoff, J.

    2014-06-01

    The aim of this paper is to describe a new automatic method for compensation of metal-implant-induced segmentation errors in MR-based attenuation maps (MRMaps) and to evaluate the quantitative influence of those artifacts on the reconstructed PET activity concentration. The developed method uses a PET-based delineation of the patient contour to compensate metal-implant-caused signal voids in the MR scan that is segmented for PET attenuation correction. PET emission data of 13 patients with metal implants examined in a Philips Ingenuity PET/MR were reconstructed with the vendor-provided method for attenuation correction (MRMaporig, PETorig) and additionally with a method for attenuation correction (MRMapcor, PETcor) developed by our group. MRMaps produced by both methods were visually inspected for segmentation errors. The segmentation errors in MRMaporig were classified into four classes (L1 and L2 artifacts inside the lung and B1 and B2 artifacts inside the remaining body depending on the assigned attenuation coefficients). The average relative SUV differences (ε_rel^av) between PETorig and PETcor of all regions showing wrong attenuation coefficients in MRMaporig were calculated. Additionally, relative SUVmean differences (ε_rel) of tracer accumulations in hot focal structures inside or in the vicinity of these regions were evaluated. MRMaporig showed erroneous attenuation coefficients inside the regions affected by metal artifacts and inside the patients' lung in all 13 cases. In MRMapcor, all regions with metal artifacts, except for the sternum, were filled with the soft-tissue attenuation coefficient and the lung was correctly segmented in all patients. MRMapcor only showed small residual segmentation errors in eight patients. ε_rel^av (mean ± standard deviation) were: (−56 ± 3)% for B1, (−43 ± 4)% for B2, (21 ± 18)% for L1, (120 ± 47)% for L2 regions. ε_rel (mean ± standard deviation) of hot focal structures were
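
    The relative difference metric used above is straightforward to compute. A minimal sketch, assuming the convention ε_rel = (SUVorig − SUVcor)/SUVcor expressed in percent (the exact normalization used in the paper is an assumption here):

```python
def rel_suv_diff(suv_orig, suv_cor):
    """Relative SUV difference in percent between the original
    (artifact-affected) and corrected reconstructions:
    eps_rel = (SUVorig - SUVcor) / SUVcor * 100."""
    return 100.0 * (suv_orig - suv_cor) / suv_cor
```

    Under this convention, a B1-type region where the corrected SUV is more than twice the original yields a value near −56%, matching the sign of the reported body-artifact classes.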

  14. Rodent Models of Experimental Endometriosis: Identifying Mechanisms of Disease and Therapeutic Targets

    PubMed Central

    Bruner-Tran, Kaylon L.; Mokshagundam, Shilpa; Herington, Jennifer L.; Ding, Tianbing; Osteen, Kevin G.

    2018-01-01

    Background: Although it has been more than a century since endometriosis was initially described in the literature, understanding the etiology and natural history of the disease has been challenging. However, the broad utility of murine and rat models of experimental endometriosis has enabled the elucidation of a number of potentially targetable processes which may otherwise promote this disease. Objective: To review a variety of studies utilizing rodent models of endometriosis to illustrate their utility in examining mechanisms associated with development and progression of this disease. Results: Use of rodent models of endometriosis has provided a much broader understanding of the risk factors for the initial development of endometriosis, the cellular pathology of the disease and the identification of potential therapeutic targets. Conclusion: Although there are limitations with any animal model, the variety of experimental endometriosis models that have been developed has enabled investigation into numerous aspects of this disease. Thanks to these models, our understanding of the early processes of disease development, the role of steroid responsiveness, inflammatory processes and the peritoneal environment has been advanced. More recent models have begun to shed light on how epigenetic alterations contribute to the molecular basis of this disease as well as the multiple comorbidities which plague many patients. Continued developments of animal models which aid in unraveling the mechanisms of endometriosis development provide the best opportunity to identify therapeutic strategies to prevent or regress this enigmatic disease.

  15. Fast correction approach for wavefront sensorless adaptive optics based on a linear phase diversity technique.

    PubMed

    Yue, Dan; Nie, Haitao; Li, Ye; Ying, Changsheng

    2018-03-01

    Wavefront sensorless (WFSless) adaptive optics (AO) systems have been widely studied in recent years. To reach optimum results, such systems require an efficient correction method. This paper presents a fast wavefront correction approach for a WFSless AO system based mainly on the linear phase diversity (PD) technique. The fast closed-loop control algorithm is built on the linear relationship between the drive voltage of the deformable mirror (DM) and the far-field images of the system, which is obtained through the linear PD algorithm combined with the influence function of the DM. A large number of phase screens under different turbulence strengths are simulated to test the performance of the proposed method. The numerical simulation results show that the method has a fast convergence rate and strong correction capability: a few correction iterations achieve good results and effectively improve the imaging quality of the system, while requiring fewer CCD measurements.
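
    The closed-loop structure implied above can be sketched generically. This is a minimal illustration assuming a linear map between DM voltage changes and an image-derived measurement vector; the `measure` callable is a hypothetical interface standing in for the linear-PD image processing, and the pseudo-inverse controller is a common choice, not necessarily the paper's exact control law.

```python
import numpy as np

def closed_loop_correction(measure, M, v0, n_iter=20, gain=0.8):
    """Generic WFSless AO closed loop under a linear model y ~ M @ (v - v*).

    measure(v) -> measurement vector for DM voltages v (assumed interface);
    M          -> calibrated linear response matrix (measurements x actuators);
    each iteration drives the measurement toward zero via pinv(M).
    """
    v = v0.astype(float).copy()
    M_pinv = np.linalg.pinv(M)
    for _ in range(n_iter):
        y = measure(v)
        v -= gain * (M_pinv @ y)   # proportional correction step
    return v
```

    With a full-column-rank response matrix, the residual error contracts by a factor of (1 − gain) per iteration, which is why only a few iterations are needed in the linear regime.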

  16. A correction for Dupuit-Forchheimer interface flow models of seawater intrusion in unconfined coastal aquifers

    NASA Astrophysics Data System (ADS)

    Koussis, Antonis D.; Mazi, Katerina; Riou, Fabien; Destouni, Georgia

    2015-06-01

    Interface flow models that use the Dupuit-Forchheimer (DF) approximation for assessing the freshwater lens and the seawater intrusion in coastal aquifers lack representation of the gap through which fresh groundwater discharges to the sea. In these models, the interface outcrops unrealistically at the same point as the free surface, is too shallow and intersects the aquifer base too far inland, thus overestimating the extent of the intruding seawater front. To correct this shortcoming of DF-type interface solutions for unconfined aquifers, we here adapt the outflow gap estimate of an analytical 2-D interface solution for infinitely thick aquifers to fit the 50%-salinity contour of variable-density solutions for finite-depth aquifers. We further improve the accuracy of the interface toe location predicted with depth-integrated DF interface solutions by ∼20% (relative to the 50%-salinity contour of variable-density solutions) by combining the outflow-gap adjusted aquifer depth at the sea with a transverse-dispersion adjusted density ratio (Pool and Carrera, 2011), appropriately modified for unconfined flow. The effectiveness of the combined correction is exemplified for two regional Mediterranean aquifers, the Israel Coastal and Nile Delta aquifers.
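
    The uncorrected DF baseline being adjusted here can be made concrete. A minimal sketch, assuming the classic Strack-type sharp-interface toe formula for an unconfined coastal aquifer and the Pool-and-Carrera form of the dispersion-adjusted density ratio; both are textbook expressions, not the paper's full combined correction.

```python
def toe_position(K, q, d, delta):
    """Sharp-interface Dupuit-Forchheimer toe distance from the shore
    (Strack-type solution) for an unconfined coastal aquifer:
        x_toe = K * delta * (1 + delta) * d**2 / (2 * q)
    K: hydraulic conductivity, q: freshwater discharge per unit coastline,
    d: aquifer depth below sea level, delta = (rho_s - rho_f) / rho_f."""
    return K * delta * (1.0 + delta) * d**2 / (2.0 * q)

def corrected_delta(delta, alpha_T, b):
    """Transverse-dispersion-adjusted density ratio of Pool and Carrera
    (2011): delta' = delta * (1 - (alpha_T / b)**(1/6)), with alpha_T the
    transverse dispersivity and b the aquifer thickness."""
    return delta * (1.0 - (alpha_T / b) ** (1.0 / 6.0))
```

    Because delta' < delta, using the corrected ratio pulls the predicted toe seaward, counteracting the overestimation of intrusion noted in the abstract.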

  17. Stochastic theory of large-scale enzyme-reaction networks: Finite copy number corrections to rate equation models

    NASA Astrophysics Data System (ADS)

    Thomas, Philipp; Straube, Arthur V.; Grima, Ramon

    2010-11-01

    Chemical reactions inside cells occur in compartment volumes in the range of atto- to femtoliters. Physiological concentrations realized in such small volumes imply low copy numbers of interacting molecules with the consequence of considerable fluctuations in the concentrations. In contrast, rate equation models are based on the implicit assumption of infinitely large numbers of interacting molecules, or equivalently, that reactions occur in infinite volumes at constant macroscopic concentrations. In this article we compute the finite-volume corrections (or equivalently the finite copy number corrections) to the solutions of the rate equations for chemical reaction networks composed of arbitrarily large numbers of enzyme-catalyzed reactions which are confined inside a small subcellular compartment. This is achieved by applying a mesoscopic version of the quasisteady-state assumption to the exact Fokker-Planck equation associated with the Poisson representation of the chemical master equation. The procedure yields impressively simple and compact expressions for the finite-volume corrections. We prove that the predictions of the rate equations will always underestimate the actual steady-state substrate concentrations for an enzyme-reaction network confined in a small volume. In particular we show that the finite-volume corrections increase with decreasing subcellular volume, decreasing Michaelis-Menten constants, and increasing enzyme saturation. The magnitude of the corrections depends sensitively on the topology of the network. The predictions of the theory are shown to be in excellent agreement with stochastic simulations for two types of networks typically associated with protein methylation and metabolism.
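
    The comparison driving this analysis can be reproduced with a standard Gillespie simulation. A minimal sketch of an open enzyme reaction (substrate input plus Michaelis-Menten degradation): the rate-equation steady state is computed exactly, while the stochastic mean must come from simulation. The rate constants are illustrative, and this uses a plain stochastic simulation algorithm, not the paper's Poisson-representation theory.

```python
import random

def ssa_enzyme(k_in=5.0, k1=1.0, km1=1.0, k2=1.0, E0=10,
               t_end=500.0, seed=1):
    """Gillespie SSA for:  0 -> S (k_in),  E + S <-> C (k1, km1),
    C -> E + P (k2), with E0 total enzymes in a small compartment.
    Returns the time-averaged substrate copy number after burn-in."""
    random.seed(seed)
    S, C = 0, 0
    t, t_burn = 0.0, 0.2 * t_end
    acc, acc_t = 0.0, 0.0
    while t < t_end:
        E = E0 - C
        rates = [k_in, k1 * E * S, km1 * C, k2 * C]
        total = sum(rates)
        dt = random.expovariate(total)
        if t > t_burn:                       # time-weighted average of S
            w = min(dt, t_end - t)
            acc += S * w
            acc_t += w
        t += dt
        r = random.random() * total          # pick the next reaction
        if r < rates[0]:
            S += 1
        elif r < rates[0] + rates[1]:
            S -= 1; C += 1
        elif r < rates[0] + rates[1] + rates[2]:
            S += 1; C -= 1
        else:
            C -= 1
    return acc / acc_t

def rate_equation_S(k_in=5.0, k1=1.0, km1=1.0, k2=1.0, E0=10):
    """Deterministic steady-state substrate level under the
    Michaelis-Menten quasi-steady-state approximation
    (valid for k_in < vmax)."""
    vmax, Km = k2 * E0, (km1 + k2) / k1
    return Km * k_in / (vmax - k_in)
```

    Per the theory reviewed above, the SSA mean in a small compartment is expected to lie above the rate-equation value, with the gap growing as the volume (here, the copy numbers) shrinks.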

  18. Pedagogical Knowledge Base Underlying EFL Teachers' Provision of Oral Corrective Feedback in Grammar Instruction

    ERIC Educational Resources Information Center

    Atai, Mahmood Reza; Shafiee, Zahra

    2017-01-01

    The present study investigated the pedagogical knowledge base underlying EFL teachers' provision of oral corrective feedback in grammar instruction. More specifically, we explored the consistent thought patterns guiding the decisions of three Iranian teachers regarding oral corrective feedback on grammatical errors. We also examined the potential…

  19. Ellipsoidal terrain correction based on multi-cylindrical equal-area map projection of the reference ellipsoid

    NASA Astrophysics Data System (ADS)

    Ardalan, A. A.; Safari, A.

    2004-09-01

    An operational algorithm for computation of terrain correction (or local gravity field modeling) based on application of closed-form solution of the Newton integral in terms of Cartesian coordinates in multi-cylindrical equal-area map projection of the reference ellipsoid is presented. Multi-cylindrical equal-area map projection of the reference ellipsoid has been derived and is described in detail for the first time. Ellipsoidal mass elements with various sizes on the surface of the reference ellipsoid are selected and the gravitational potential and vector of gravitational intensity (i.e. gravitational acceleration) of the mass elements are computed via numerical solution of the Newton integral in terms of geodetic coordinates {λ,ϕ,h}. Four base-edge points of the ellipsoidal mass elements are transformed into a multi-cylindrical equal-area map projection surface to build Cartesian mass elements by associating the height of the corresponding ellipsoidal mass elements to the transformed area elements. Using the closed-form solution of the Newton integral in terms of Cartesian coordinates, the gravitational potential and vector of gravitational intensity of the transformed Cartesian mass elements are computed and compared with those of the numerical solution of the Newton integral for the ellipsoidal mass elements in terms of geodetic coordinates. Numerical tests indicate that the difference between the two computations, i.e. numerical solution of the Newton integral for ellipsoidal mass elements in terms of geodetic coordinates and closed-form solution of the Newton integral in terms of Cartesian coordinates, in a multi-cylindrical equal-area map projection, is less than 1.6×10⁻⁸ m²/s² for a mass element with a cross-section area of 10×10 m and a height of 10,000 m. For a mass element with a cross-section area of 1×1 km and a height of 10,000 m the difference is less than 1.5×10⁻⁴ m²/s². Since 1.5×10⁻⁴ m²/s² is equivalent to 1.5×10⁻⁵ m in the vertical
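
    The equal-area property that makes the projected Cartesian mass elements area-faithful can be illustrated with the simplest member of this projection family. A minimal sketch of the Lambert cylindrical equal-area projection in spherical form; the paper's multi-cylindrical projection of the reference ellipsoid is more elaborate, and this single-cylinder spherical version only demonstrates the underlying area preservation.

```python
import math

R = 6371000.0  # mean Earth radius [m]; spherical stand-in for the ellipsoid

def cea_forward(lam, phi, phi_s=0.0):
    """Lambert cylindrical equal-area projection (spherical form) with
    standard parallel phi_s.  The map-plane area element equals the
    spherical surface element R**2 * cos(phi) * dlam * dphi, since
    (dx/dlam) * (dy/dphi) = R*cos(phi_s) * R*cos(phi)/cos(phi_s)."""
    x = R * lam * math.cos(phi_s)
    y = R * math.sin(phi) / math.cos(phi_s)
    return x, y
```

    Verifying the Jacobian numerically at any latitude confirms that areas are preserved, which is what allows heights to be attached to the projected base elements without distorting the mass budget.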

  20. A Split-Luciferase-Based Trimer Formation Assay as a High-throughput Screening Platform for Therapeutics in Alport Syndrome.

    PubMed

    Omachi, Kohei; Kamura, Misato; Teramoto, Keisuke; Kojima, Haruka; Yokota, Tsubasa; Kaseda, Shota; Kuwazuru, Jun; Fukuda, Ryosuke; Koyama, Kosuke; Matsuyama, Shingo; Motomura, Keishi; Shuto, Tsuyoshi; Suico, Mary Ann; Kai, Hirofumi

    2018-05-17

    Alport syndrome is a hereditary glomerular disease caused by mutation in type IV collagen α3-α5 chains (α3-α5(IV)), which disrupts trimerization, leading to glomerular basement membrane degeneration. Correcting the trimerization of the α3/α4/α5 chains is a feasible therapeutic approach, but is hindered by the lack of information on the regulation of intracellular α(IV) chains and the absence of high-throughput screening (HTS) platforms to assess α345(IV) trimer formation. Here, we developed sets of split NanoLuc-fusion α345(IV) proteins to monitor α345(IV) trimerization of wild-type and clinically associated mutant α5(IV). The α345(IV) trimer assay, which satisfied the acceptance criteria for HTS, enabled the characterization of intracellular- and secretion-dependent defects of mutant α5(IV). Small interfering RNA-based and chemical screening targeting the ER identified several chemical chaperones that have the potential to promote α345(IV) trimer formation. This split luciferase-based trimer formation assay is a functional HTS platform that demonstrates the feasibility of targeting α345(IV) trimers to treat Alport syndrome. Copyright © 2018 Elsevier Ltd. All rights reserved.